Author: Newton, Sarah

Upcoming RMME/STAT Colloquium (12/6): Kristen Olson, “Recent Lessons on Mixing Mail and Web Data Collection Modes in Household Surveys”

RMME/STAT Joint Colloquium

Recent Lessons on Mixing Mail and Web Data Collection Modes in Household Surveys

Dr. Kristen Olson

University of Nebraska-Lincoln

Friday, December 6, at 11 AM ET

https://tinyurl.com/rmme-Olson

Survey designs are increasingly incorporating self-administered modes, namely mail and web, to counter decreasing response rates and increasing costs in interviewer-administered surveys. In this talk, I will cover recent research on mixing modes of self-administered data collection in household surveys, with a focus on designing for survey participation. I will explore theory on why sample members may select one self-administered mode over another, how modes affect response rates, representation, and costs, and design decisions related to recruitment materials for mixed-mode surveys.

Upcoming RMME/STAT Colloquium (11/15): Pascale Deboeck, “Effects Across Time to Modeling States of a System: A Dynamical Systems Perspective on Modeling Social Science Data”

RMME/STAT Joint Colloquium

Effects Across Time to Modeling States of a System: A Dynamical Systems Perspective on Modeling Social Science Data

Dr. Pascale Deboeck

University of Utah

Friday, November 15, at 11 AM ET

https://tinyurl.com/rmme-Deboeck

Modeling of repeated observations across time generally falls into one of two categories. The first models observations as they change across time, that is, as a function of time. The second models observations as they relate to prior observations; many common models in the social sciences fall into this second category, including autoregressive models, cross-lagged panel models, and latent difference score models. A less common approach in this latter category is differential equation modeling. Differential equation models express relations between the state of observations and how they are changing, and they offer the opportunity to imagine change relations that may be commonly overlooked when modeling repeated measures. This talk will introduce the application of differential equation modeling to intensive longitudinal data. Specific examples will include estimating derivatives from noisy data and the possibility of testing novel models of intraindividual dynamics.
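
For readers new to this approach, here is a minimal, illustrative Python sketch (not drawn from the talk): it simulates a noisy damped-oscillator series, estimates first and second derivatives with a Savitzky-Golay filter, and recovers the dynamics by regressing the estimated second derivative on the level and first derivative. All parameter values and settings below are assumptions chosen purely for illustration.

import numpy as np
from scipy.signal import savgol_filter

rng = np.random.default_rng(0)

# Simulate one person's intensive longitudinal series from a damped linear
# oscillator, x''(t) = eta * x(t) + zeta * x'(t), observed with noise.
dt = 0.1
t = np.arange(0, 30, dt)
eta_true, zeta_true = -1.0, -0.2                      # illustrative parameters
A = np.array([[0.0, 1.0], [eta_true, zeta_true]])
state = np.zeros((len(t), 2))
state[0] = [2.0, 0.0]
for i in range(1, len(t)):                            # simple Euler integration
    state[i] = state[i - 1] + dt * (A @ state[i - 1])
y = state[:, 0] + rng.normal(scale=0.05, size=len(t))  # noisy observed level

# Estimate the level and its first and second derivatives from the noisy series.
level = savgol_filter(y, window_length=11, polyorder=3, deriv=0)
d1 = savgol_filter(y, window_length=11, polyorder=3, deriv=1, delta=dt)
d2 = savgol_filter(y, window_length=11, polyorder=3, deriv=2, delta=dt)

# Fit the differential equation model x'' = eta * x + zeta * x' by least squares,
# linking the state of the observations to how they are changing.
X = np.column_stack([level, d1])
eta_hat, zeta_hat = np.linalg.lstsq(X, d2, rcond=None)[0]
print(f"eta ~ {eta_hat:.2f} (true {eta_true}); zeta ~ {zeta_hat:.2f} (true {zeta_true})")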

RMME Community Members Present at AEA 2024

Multiple Research Methods, Measurement, & Evaluation (RMME) community members are presenting at the 2024 annual meeting of the American Evaluation Association (AEA) this week. Remember to check out these excellent presentations; we hope to see you there!

Wednesday, October 23
4:15 PM to 5:15 PM
Location: B117-119
Session Title: (Panel + AEA Sponsored Sessions)
Presentation Title: Amplifying and Empowering Voices through Evaluation Journals
Chair: Bianca Montrosse-Moorhead*
Presenters: Florence Etta/Laura Peck/Nicole Bowman (Lunaape/Mohican)/Ayesha Boyce/Thomas Archibald/Daniel Balsaobre-Lorente/Michael A. Harnar/Sarah Mason/Hanh Cao Yu

Thursday, October 24
10:15 AM to 11:15 AM
Location: G131-132
Session Title: (Panel + Theories of Evaluation)
Presentation Title: What Happens When You Give Evaluators a Box of Crayons? Reflections on Visualizations of Evaluation Theory
Chair: Sebastian Lemire
Presenters: Christina Christie/Tarek Azzam/Bianca Montrosse-Moorhead*
Discussant: Melvin M. Mark

6:00 PM to 7:30 PM
Location: N/A
Session Title: Poster Reception #2 (Poster + Organizational Learning and Evaluation Capacity Building)
Presentation Title: 257 – Amplifying the voices of new Evaluation Capacity Builders: Lessons learned from transferring new skills and ideas into the workplace
Chair: Leslie A. Fierro*
Presenters: Emily Frost/Lashonda Williams/Renee Boyer/Jordyn J. Livingston
Discussants: Heather D. Codd/Renee Boyer

Friday, October 25
11:30 AM to 12:30 PM
Location: C120-122
Session Title: (Multipaper + Health Evaluation)
Presentation Title: The Updated CDC Program Evaluation Framework: Revised to Meet Current and Future Public Health and Evaluation Needs
Presenters: Daniel P. Kidder/Heather Salvaggio/Elena Luna
Discussant: Leslie A. Fierro*
Authors: Daniel P. Kidder/Cassandra Davis/Destiny Bruce/Michael Kerzner

12:45 PM to 1:45 PM
Location: Exhibit Hall A
Session Title: Birds of a feather (STEM Education and Training)
Presentation Title: 62 – Amplifying the Voices of Multigenerational and Multidisciplinary STEM Evaluation Journeys – A Roundtable Bridge of Resources and Mentorship
Presenters: Kevin Keane/Clara M. Pelfrey/Breonte Guy/Leslie A. Fierro*/Carlos Romero/Alexandra Budenz/Robert Garcia/Rechelle Paranal
Facilitators: Rajiv R. Ramdat/Ciara C. Knight

2:30 PM to 3:30 PM
Location: Portland Ballroom 254
Session Title: (Roundtable + Research on Evaluation)
Presentation Title: What does it mean to be an evaluator?
Presenters: Doris Espelien/Dana J. Linnell/Bianca Montrosse-Moorhead*/Aileen M. Reid/Cherie M. Avent/Grettel Arias Orozco/Onyinyechukwu Onwuka du Bruyn
Authors: Doris Espelien/Dana J. Linnell/Bianca Montrosse-Moorhead*/Aileen M. Reid/Cherie M. Avent/Grettel Arias Orozco/Onyinyechukwu Onwuka du Bruyn

3:45 PM to 4:45 PM PST
Location: B113-114
Session Title: (Think Tank + Use and Influence of Evaluation)
Presentation Title: The Critical Role of Equity and Inclusion in Defining Rigor: What It Takes to Reimagine Peer Review Journals
Chair: Hanh Cao Yu
Presenters: Rodney Hopson/Bianca Montrosse-Moorhead*

4:15 PM to 4:30 PM
Location: D133-134
Session Title: Building Evaluator Capacity and Competencies (Paper + Graduate Students and New Evaluators and Research on Evaluation)
Presentation Title: Doctoral Student Experiences with Research on Evaluation: Insights and Opportunities from a Collaborative Autoethnography
Presenters: Amanda Sutter*/Valerie Marshall/Rachael Kenney/Allison Prieur/Kari Ross Nelson/Christine Liboon

 

*RMME Community member

Upcoming RMME/STAT Colloquium (10/11): Sandip Sinharay, “Assessment of Fit of Item Response Theory Models: Full-information and Limited-information Methods, Item and Person Fit Analysis, and Beyond”

RMME/STAT Joint Colloquium

Assessment of Fit of Item Response Theory Models: Full-information and Limited-information Methods, Item and Person Fit Analysis, and Beyond

Dr. Sandip Sinharay

Educational Testing Service (ETS) Research Institute

Friday, October 11, at 11:15 AM ET

AUST 110

https://tinyurl.com/rmme-Sinharay

Item response theory (IRT) is one of the central methodological pillars supporting many large and high-profile assessment programs globally. IRT analysis is essentially a type of discrete multivariate analysis and is performed using IRT models, which are latent variable models for discrete data. However, IRT models involve multiple assumptions, such as conditional independence and monotonicity, and the results obtained from them may not be accurate if one or more of these assumptions are not met, that is, if there is IRT model misfit. This presentation will include a comprehensive review of the literature on the assessment of fit of IRT models. The presenter will discuss various approaches and concepts regarding IRT model fit, including full-information and limited-information methods, residual analysis, item- and person-fit analysis, Bayesian methods, analysis of differential item functioning, and assessment of the practical significance of misfit. A real data example will be used to illustrate some of the approaches. One goal of the presentation is to stimulate discussion with audience members regarding IRT model-fit assessment.
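
For readers who would like a concrete example of person-fit analysis, here is a minimal, illustrative Python sketch (not the presenter's method): it computes the classical lz person-fit statistic for a two-parameter logistic (2PL) IRT model, one of the person-fit indices covered in this literature. The item parameters, ability value, and responses are simulated, and the examinee's ability is treated as known, purely as simplifying assumptions for illustration.

import numpy as np

rng = np.random.default_rng(1)

def p_2pl(theta, a, b):
    # Probability of a correct response under the 2PL model.
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

def lz_statistic(responses, theta, a, b):
    # Standardized log-likelihood person-fit statistic (lz).
    p = p_2pl(theta, a, b)
    loglik = np.sum(responses * np.log(p) + (1 - responses) * np.log(1 - p))
    expected = np.sum(p * np.log(p) + (1 - p) * np.log(1 - p))
    variance = np.sum(p * (1 - p) * np.log(p / (1 - p)) ** 2)
    return (loglik - expected) / np.sqrt(variance)

# Simulated item parameters and one examinee's responses (illustrative values).
n_items = 40
a = rng.uniform(0.8, 2.0, n_items)   # discriminations
b = rng.normal(0.0, 1.0, n_items)    # difficulties
theta = 0.5                          # ability, treated as known here
responses = (rng.random(n_items) < p_2pl(theta, a, b)).astype(int)

# Large negative values of lz flag response patterns that misfit the model.
print(f"lz = {lz_statistic(responses, theta, a, b):.2f}")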

RMME Community Members Present at SREE 2024

Several Research Methods, Measurement, & Evaluation (RMME) community members are presenting at the 2024 annual meeting of the Society for Research on Educational Effectiveness (SREE) this week. Check out their excellent contributions to the conference below:

Thursday, September 19

9.00 am to 10.30 am
Location: Harborside Foyer (4, Baltimore Marriott Waterfront)
Session Title: 3G. – Exploring Predictive Validity and Disparities in Education: From Test Scores to Behavior Screeners
Presentation Title: Mapping the Research Base for Universal Behavior Screeners
Presenting Author: Kathleen Lane
Authors: Katie Pelton*

12.30 pm to 2.00 pm
Location: Kent AB (4, Baltimore Marriott Waterfront)
Session Title: 5F. – Artificial Intelligence and the Future of Educational Measurement and Evaluation
Presentation Title: Low-Cost Measurement with Large Language Models: An Application of Few-Shot Classification in Educational Evaluations
Author: Claudia Ventura*
Presenting Author: Kylie Anglin*

2.15 pm to 3.15 pm
Location: Kent AB (4, Baltimore Marriott Waterfront)
Session Title: Open Science Affinity Group Panel: How Do We Move Open Science Forward in Educational Research?
Panelists: Kara Finnigan, Elizabeth Tipton, Sean Grant
Moderator: D. Betsy McCoach*

Friday, September 20

9.00 am to 10.30 am
Location: Harborside Foyer (4, Baltimore Marriott Waterfront)
Session Title: 7H. – Advancing Measurement and Assessment in Special Education
Presentation Title: Measurement Invariance and Predictive Validity of a Free-Access Universal Screening Tool: The SRSS-IE
Presenting Author: Kathleen Lane
Authors: Katie Pelton*

1.15 pm to 2.15 pm
Location: Harborside Foyer (4, Baltimore Marriott Waterfront)
Session Title: 8D. Organization of Schools and Systems
Presentation Title: Evaluation of a Brief Intervention: Achievement Gaps and Reliability
Authors: Joselyn Perez*, D. Betsy McCoach*

1.15 pm to 2.15 pm
Location: Kent AB (4, Baltimore Marriott Waterfront)
Session Title: 8F. Research Methods
Presentation Title: Deciphering Hyperparameter Choices in Machine Learning for Propensity Score Estimation: A Systematic Review of GBM and Random Forest Methods
Authors: Huibin Zhang, Walter Leite, Zachary Collier*, Kamal Chawla, Lingchen Kong, YongSeok Lee, Jia Quan
Presenting Author: Huibin Zhang

1.15 pm to 2.15 pm
Location: Harborside Foyer (4, Baltimore Marriott Waterfront)
Session Title: 8F. Research Methods
Presentation Title: Exploring Estimates of Multilevel Reliability for School Based Behavioral Measures
Authors: Katie Pelton*, D. Betsy McCoach*

2.15 pm to 3.45 pm
Location: Kent AB (4, Baltimore Marriott Waterfront)
Session Title: 9E. Advancing Educational Outcomes through Machine Learning and Predictive Analytics
Chair: Kylie Anglin*

*RMME Community member

RMME Programs Celebrates its Summer 2024 Grads!

UConn’s Research Methods, Measurement, & Evaluation (RMME) Programs Community is thrilled to recognize its Summer 2024 graduates of the RMME Master’s degree program and the Graduate Certificate in Program Evaluation! We are so proud of you and cannot wait to see the many ways you represent RMME Programs as our outstanding alumni. Congratulations from the entire Research Methods, Measurement, & Evaluation Community!