RMME Upcoming Events

Upcoming RMME/STAT Colloquium (2/7): Paul De Boeck, “Adventitious Error Everywhere”

RMME/STAT Joint Colloquium

Adventitious Error Everywhere

Dr. Paul De Boeck

The Ohio State University

Friday, February 7, at 11 AM ET

https://tinyurl.com/rmme-PDeBoeck

Adventitious error is a concept introduced by Wu and Browne (Psychometrika, 2015) to explain the imperfect goodness of fit of covariance structure models (CSMs, i.e., factor models and SEMs). The paper was published together with critical remarks from the reviewers. In my presentation, I will discuss, illustrate, and speculate about the potential of adventitious error beyond CSMs, (a) as a unitary framework to understand and deal with underestimated inferential uncertainty regarding relations between variables, heterogeneity in meta-analysis, violations of measurement invariance, and individual differences in the validity of tests, and (b) as a joint framework for reliability and validity. The presentation is partly based on De Boeck, DeKay, and Pek (Psychometrika, 2024, 89, 1055-1073).
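For readers new to the idea, the short sketch below (an illustration prepared for this post, not material from the talk) simulates adventitious error in the spirit of Wu and Browne (2015): the population covariance matrix is treated as a random draw from an inverse-Wishart distribution centered on the model-implied matrix, so even a correctly specified model fits only approximately. The one-factor model, the concentration values m, and all variable names are assumptions made for the example.

```python
# Illustrative sketch (not from the talk): adventitious error in the spirit of
# Wu & Browne (2015). A model-implied covariance matrix Sigma0 from a one-factor
# model is perturbed by drawing the "population" covariance matrix from an
# inverse-Wishart distribution whose mean is Sigma0; smaller m = more error.
import numpy as np
from scipy.stats import invwishart

rng = np.random.default_rng(1)

# Hypothetical one-factor model: loadings and unique variances (assumed values).
loadings = np.array([0.8, 0.7, 0.6, 0.5])
uniques = 1.0 - loadings**2
Sigma0 = np.outer(loadings, loadings) + np.diag(uniques)   # model-implied covariance
p = Sigma0.shape[0]

def adventitious_population(Sigma0, m, rng):
    """Draw a population covariance matrix whose expectation equals Sigma0.

    For an inverse-Wishart with df = m and scale Psi (m > p + 1), the mean is
    Psi / (m - p - 1), so Psi = (m - p - 1) * Sigma0 centers the draw on Sigma0.
    """
    scale = (m - p - 1) * Sigma0
    return invwishart(df=m, scale=scale).rvs(random_state=rng)

# Larger m -> draws concentrate around Sigma0 (less adventitious error).
for m in (20, 100, 1000):
    Sigma = adventitious_population(Sigma0, m, rng)
    rmse = np.sqrt(np.mean((Sigma - Sigma0) ** 2))
    print(f"m = {m:4d}: RMSE between perturbed and model-implied covariance = {rmse:.3f}")
```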

 


Upcoming RMME/STAT Colloquium (1/24): Walter Dempsey, “Challenges in Time-varying Causal Effect Moderation Analysis in Mobile Health”

RMME/STAT Joint Colloquium

Challenges in Time-varying Causal Effect Moderation Analysis in Mobile Health

Dr. Walter Dempsey

University of Michigan

Friday, January 24, at 11 AM ET

https://tinyurl.com/rmme-Dempsey

Twin revolutions in wearable technologies and smartphone-delivered digital health interventions have significantly expanded the accessibility and uptake of mobile health (mHealth) interventions in multiple domains of health sciences. Sequentially randomized experiments called micro-randomized trials (MRTs) have grown in popularity as a means to empirically evaluate the effectiveness of mHealth intervention components. MRTs have motivated a new class of causal estimands, termed “causal excursion effects,” which allow health scientists to answer important scientific questions about how intervention effectiveness may change over time or be moderated by individual characteristics, time-varying context, or past responses. In this talk, we present two new tools for causal effect moderation analysis. First, we consider a meta-learner perspective, where any supervised learning algorithm can be used to assist in the estimation of the causal excursion effect. We will present theoretical results and accompanying simulation experiments to demonstrate relative efficiency gains. The practical utility of the proposed methods is demonstrated by analyzing data from a multi-institution cohort of first-year medical residents in the United States. Second, we will consider effect moderation with tens or hundreds of potential moderators. In this setting, it becomes necessary to use the observed data to select a simpler model for effect moderation and then make valid statistical inference. We propose a two-stage procedure to solve this problem that leverages recent advances in post-selective inference using randomization. We will discuss the asymptotic validity of the conditional selective inference procedure and the importance of randomization. Simulation studies verify the asymptotic results. We end with an analysis of an MRT for promoting physical activity in cardiac rehabilitation to demonstrate the utility of the method.
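As a rough illustration of the kind of estimand involved (prepared for this post, not code from the talk), the sketch below simulates a toy micro-randomized trial and estimates a moderated causal excursion effect with a centered regression, in the spirit of the weighted-and-centered estimators this literature builds on; with a constant, known randomization probability the weights are constant and are omitted. The data-generating process, variable names, and parameter values are all illustrative assumptions.

```python
# Illustrative sketch (not from the talk): a toy micro-randomized trial and a
# centered-regression estimate of a moderated causal excursion effect.
# Treatment A_t is randomized with known probability p; centering A_t at p and
# regressing the proximal outcome on (A_t - p) and (A_t - p) * S_t targets the
# excursion effect beta0 + beta1 * S_t.
import numpy as np

rng = np.random.default_rng(0)
n, T = 200, 30            # participants, decision points
p_rand = 0.5              # known randomization probability

# Time-varying state (candidate moderator), randomized treatment, proximal outcome.
S = rng.normal(size=(n, T))
A = rng.binomial(1, p_rand, size=(n, T))
beta0_true, beta1_true = 0.3, 0.5
Y = 1.0 + 0.4 * S + (beta0_true + beta1_true * S) * A + rng.normal(scale=1.0, size=(n, T))

# Stack person-decision-point observations and build the design matrix.
s, a, y = S.ravel(), A.ravel(), Y.ravel()
a_c = a - p_rand                                   # centered treatment indicator
X = np.column_stack([np.ones_like(s), s, a_c, a_c * s])

coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(f"estimated excursion effect:  beta0 = {coef[2]:.2f}, beta1 = {coef[3]:.2f}")
print(f"true values:                 beta0 = {beta0_true:.2f}, beta1 = {beta1_true:.2f}")
```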

 


RMME to Host Exhibits Booth at STEM Super Saturday on January 11, 2025!

The Rhode Island Society for Technology in Education (RISTE), jointly with the Rhode Island Science Teachers Association (RISTA) and the Rhode Island Math Teachers Association (RIMTA), is hosting STEM Super Saturday! This annual event for educators will take place at the Rhode Island Nursing Education Center in Providence, RI, on January 11, 2025.

UConn’s RMME Programs are thrilled to participate in and support the event by hosting an exhibits booth for attendees! So, if you will be there, be sure to stop by for an opportunity to chat personally with Dr. Sarah D. Newton, the Associate Director of RMME Online Programs! We cannot wait to see you there!

Upcoming RMME/STAT Colloquium (12/6): Kristen Olson, “Recent Lessons on Mixing Mail and Web Data Collection Modes in Household Surveys”

RMME/STAT Joint Colloquium

Recent Lessons on Mixing Mail and Web Data Collection Modes in Household Surveys

Dr. Kristen Olson

University of Nebraska-Lincoln

Friday, December 6, at 11 AM ET

https://tinyurl.com/rmme-Olson

Survey designs are increasingly incorporating self-administered modes, namely mail and web, to counter decreasing response rates and increasing costs in interviewer-administered surveys. In this talk, I will cover recent research related to mixing modes of self-administered data collection in household surveys, with a focus on designing for survey participation. I will explore theory for why sample members may select one self-administered mode over another, how modes affect response rates, representation, and costs, and design decisions related to recruitment materials for mixed-mode surveys.


Upcoming RMME/STAT Colloquium (11/15): Pascale Deboeck, “Effects Across Time to Modeling States of a System: A Dynamical Systems Perspective on Modeling Social Science Data”

RMME/STAT Joint Colloquium

Effects Across Time to Modeling States of a System: A Dynamical Systems Perspective on Modeling Social Science Data

Dr. Pascale Deboeck

University of Utah

Friday, November 15, at 11 AM ET

https://tinyurl.com/rmme-Deboeck

Modeling of repeated observations across time generally falls into one of two categories. The first is modeling observations as they change across time, that is, as a function of time. The second is modeling observations as they relate to prior observations; many common models in the social sciences fall into this second category, including autoregressive models, cross-lagged panel models, and latent difference score models. A less common approach in this latter category is differential equation modeling. Differential equation models express relations between the state of observations and how that state is changing, and they offer the opportunity to imagine change relations that may be commonly overlooked when modeling repeated measures. This talk will introduce the application of differential equation modeling to intensive longitudinal data. Specific examples will include estimating derivatives from noisy data and the possibility of testing novel models of intraindividual dynamics.
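To make the last point concrete, here is a small sketch (prepared for this post, not necessarily the approach used in the talk) of one common way to estimate derivatives from noisy, equally spaced observations with a Savitzky-Golay filter and then fit a damped linear oscillator by regressing the estimated second derivative on the state and first derivative. The oscillator parameters, noise level, and smoothing settings are illustrative assumptions.

```python
# Illustrative sketch (not from the talk): estimate derivatives from noisy,
# equally spaced repeated measures and fit a damped linear oscillator,
#   d2x/dt2 = eta * x + zeta * dx/dt,
# by regressing the estimated second derivative on the estimated state and
# first derivative.
import numpy as np
from scipy.signal import savgol_filter

rng = np.random.default_rng(2)
dt = 0.1
t = np.arange(0, 60, dt)

# Simulate a damped oscillator (eta < 0, zeta < 0) plus measurement noise.
eta_true, zeta_true = -0.8, -0.1
x = np.empty_like(t)
v = np.empty_like(t)
x[0], v[0] = 2.0, 0.0
for i in range(1, len(t)):                 # simple Euler integration
    acc = eta_true * x[i - 1] + zeta_true * v[i - 1]
    v[i] = v[i - 1] + acc * dt
    x[i] = x[i - 1] + v[i - 1] * dt
obs = x + rng.normal(scale=0.15, size=x.shape)

# Smooth the series and estimate first and second derivatives locally.
win, order = 21, 3
x_hat = savgol_filter(obs, win, order, deriv=0, delta=dt)
dx_hat = savgol_filter(obs, win, order, deriv=1, delta=dt)
d2x_hat = savgol_filter(obs, win, order, deriv=2, delta=dt)

# Fit d2x = eta * x + zeta * dx by least squares.
X = np.column_stack([x_hat, dx_hat])
(eta_hat, zeta_hat), *_ = np.linalg.lstsq(X, d2x_hat, rcond=None)
print(f"eta:  true {eta_true:.2f}, estimated {eta_hat:.2f}")
print(f"zeta: true {zeta_true:.2f}, estimated {zeta_hat:.2f}")
```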


RMME Community Members Present at AEA 2024

Multiple Research Methods, Measurement, & Evaluation (RMME) community members are presenting at the 2024 annual meeting of the American Evaluation Association (AEA) this week. Remember to check out these excellent presentations; we hope to see you there!

Wednesday, October 23
4:15 PM to 5:15 PM
Location: B117-119
Session Title: (Panel + AEA Sponsored Sessions)
Presentation Title: Amplifying and Empowering Voices through Evaluation Journals
Chair: Bianca Montrosse-Moorhead*
Presenters: Florence Etta/Laura Peck/Nicole Bowman (Lunaape/Mohican)/Ayesha Boyce/Thomas Archibald/Daniel Balsaobre-Lorente/Michael A. Harnar/Sarah Mason/Hanh Cao Yu

Thursday, October 24
10:15 AM to 11:15 AM
Location: G131-132
Session Title: (Panel + Theories of Evaluation)
Presentation Title: What Happens When You Give Evaluators a Box of Crayons? Reflections on Visualizations of Evaluation Theory
Chair: Sebastian Lemire
Presenters: Christina Christie/Tarek Azzam/Bianca Montrosse-Moorhead*
Discussant: Melvin M. Mark

6:00 PM to 7:30 PM
Location: N/A
Session Title: Poster Reception #2 (Poster + Organizational Learning and Evaluation Capacity Building)
Presentation Title: 257 – Amplifying the voices of new Evaluation Capacity Builders: Lessons learned from transferring new skills and ideas into the workplace
Chair: Leslie A. Fierro*
Presenters: Emily Frost/Lashonda Williams/Renee Boyer/Jordyn J. Livingston
Discussants: Heather D. Codd/Renee Boyer

Friday, October 25
11:30 AM to 12:30 PM
Location: C120-122
Session Title: (Multipaper + Health Evaluation)
Presentation Title: The Updated CDC Program Evaluation Framework: Revised to Meet Current and Future Public Health and Evaluation Needs
Presenters: Daniel P. Kidder/Heather Salvaggio/Elena Luna
Discussant: Leslie A. Fierro*
Authors: Daniel P. Kidder/Cassandra Davis/Destiny Bruce/Michael Kerzner

12:45 PM to 1:45 PM
Location: Exhibit Hall A
Session Title: Birds of a feather (STEM Education and Training)
Presentation Title: 62 – Amplifying the Voices of Multigenerational and Multidisciplinary STEM Evaluation Journeys – A Roundtable Bridge of Resources and Mentorship
Presenters: Kevin Keane/Clara M. Pelfrey/Breonte Guy/Leslie A. Fierro*/Carlos Romero/Alexandra Budenz/Robert Garcia/Rechelle Paranal
Facilitators: Rajiv R. Ramdat/Ciara C. Knight

2:30 PM to 3:30 PM
Location: Portland Ballroom 254
Session Title: (Roundtable + Research on Evaluation)
Presentation Title: What does it mean to be an evaluator?
Presenters: Doris Espelien/Dana J. Linnell/Bianca Montrosse-Moorhead*/Aileen M. Reid/Cherie M. Avent/Grettel Arias Orozco/Onyinyechukwu Onwuka du Bruyn
Authors: Doris Espelien/Dana J. Linnell/Bianca Montrosse-Moorhead*/Aileen M. Reid/Cherie M. Avent/Grettel Arias Orozco/Onyinyechukwu Onwuka du Bruyn

3:45 PM to 4:45 PM PST
Location: B113-114
Session Title: (Think Tank + Use and Influence of Evaluation)
Presentation Title: The Critical Role of Equity and Inclusion in Defining Rigor: What It Takes to Reimagine Peer Review Journals
Chair: Hanh Cao Yu
Presenters: Rodney Hopson/Bianca Montrosse-Moorhead*

4:15 PM to 4:30 PM
Location: D133-134
Session Title: Building Evaluator Capacity and Competencies (Paper + Graduate Students and New Evaluators and Research on Evaluation)
Presentation Title: Doctoral Student Experiences with Research on Evaluation: Insights and Opportunities from a Collaborative Autoethnography
Presenters: Amanda Sutter*/Valerie Marshall/Rachael Kenney/Allison Prieur/Kari Ross Nelson/Christine Liboon

 

*RMME Community member

Upcoming RMME/STAT Colloquium (10/11): Sandip Sinharay, “Assessment of Fit of Item Response Theory Models: Full-information and Limited-information Methods, Item and Person Fit Analysis, and Beyond”

RMME/STAT Joint Colloquium

Assessment of Fit of Item Response Theory Models: Full-information and Limited-information Methods, Item and Person Fit Analysis, and Beyond

Dr. Sandip Sinharay

Educational Testing Service (ETS) Research Institute

Friday, October 11, at 11:15 AM ET

AUST 110

https://tinyurl.com/rmme-Sinharay

Item response theory (IRT) is one of the central methodological pillars supporting many large and high-profile assessment programs globally. IRT analysis is essentially a type of discrete multivariate analysis and is performed using IRT models, which are latent variable models for discrete data. However, IRT models involve multiple assumptions, such as conditional independence and monotonicity, and the results obtained from IRT models may not be accurate if one or more of these assumptions are not met, that is, if there is IRT model misfit. This presentation will include a comprehensive review of the literature on the assessment of fit of IRT models. The presenter will discuss various approaches and concepts regarding IRT model fit, including full-information and limited-information methods, residual analysis, item- and person-fit analysis, Bayesian methods, analysis of differential item functioning, and assessment of the practical significance of misfit. A real-data example will be used to illustrate some of the approaches. One goal of the presentation is to stimulate discussion among audience members regarding IRT model-fit assessment.
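As one small, concrete example of person-fit analysis (an illustration prepared for this post, not material from the talk), the sketch below computes the standardized log-likelihood person-fit statistic l_z under a 2PL model, assuming item parameters and an ability estimate are already available; the parameter values and respondent scenarios are made up for the example.

```python
# Illustrative sketch (not from the talk): the standardized log-likelihood
# person-fit statistic l_z under a 2PL IRT model, assuming known item
# parameters (a, b) and an ability estimate theta for each person.
# Large negative values flag response patterns that fit the model poorly.
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical 2PL item parameters (assumed values, 20 items).
a = rng.uniform(0.8, 2.0, size=20)     # discriminations
b = rng.normal(0.0, 1.0, size=20)      # difficulties

def lz(responses, theta, a, b):
    """Standardized log-likelihood person-fit statistic for one 0/1 response vector."""
    p = 1.0 / (1.0 + np.exp(-a * (theta - b)))          # 2PL success probabilities
    loglik = np.sum(responses * np.log(p) + (1 - responses) * np.log(1 - p))
    expected = np.sum(p * np.log(p) + (1 - p) * np.log(1 - p))
    variance = np.sum(p * (1 - p) * (np.log(p / (1 - p))) ** 2)
    return (loglik - expected) / np.sqrt(variance)

# A model-consistent respondent versus an aberrant (random) responder, theta = 0.
theta = 0.0
p = 1.0 / (1.0 + np.exp(-a * (theta - b)))
consistent = rng.binomial(1, p)                 # responses generated by the model
aberrant = rng.binomial(1, 0.5, size=a.size)    # responses unrelated to the model
print(f"l_z, model-consistent responses: {lz(consistent, theta, a, b): .2f}")
print(f"l_z, random responses:           {lz(aberrant, theta, a, b): .2f}")
```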


RMME Community Members Present at SREE 2024

Several Research Methods, Measurement, & Evaluation (RMME) community members are presenting at the 2024 annual meeting of the Society for Research on Educational Effectiveness (SREE) this week. Check out their excellent contributions to the conference below:

Thursday, September 19

9.00 am to 10.30 am
Location: Harborside Foyer (4, Baltimore Marriott Waterfront)
Session Title: 3G. – Exploring Predictive Validity and Disparities in Education: From Test Scores to Behavior Screeners
Presentation Title: Mapping the Research Base for Universal Behavior Screeners
Presenting Author: Kathleen Lane
Authors: Katie Pelton*

12.30 pm to 2.00 pm
Location: Kent AB (4, Baltimore Marriott Waterfront)
Session Title: 5F. – Artificial Intelligence and the Future of Educational Measurement and Evaluation
Presentation Title: Low-Cost Measurement with Large Language Models: An Application of Few-Shot Classification in Educational Evaluations
Author: Claudia Ventura*
Presenting Author: Kylie Anglin*

2.15 pm to 3.15 pm
Location: Kent AB (4, Baltimore Marriott Waterfront)
Session Title: Open Science Affinity Group Panel: How Do We Move Open Science Forward in Educational Research?
Panelists: Kara Finnigan, Elizabeth Tipton, Sean Grant
Moderator: D. Betsy McCoach*

Friday, September 20

9.00 am to 10.30 am
Location: Harborside Foyer (4, Baltimore Marriott Waterfront)
Session Title: 7H. – Advancing Measurement and Assessment in Special Education
Presentation Title: Measurement Invariance and Predictive Validity of a Free-Access Universal Screening Tool: The SRSS-IE
Presenting Author: Kathleen Lane
Authors: Katie Pelton*

1.15 pm to 2.15 pm
Location: Harborside Foyer (4, Baltimore Marriott Waterfront)
Session Title: 8D. Organization of Schools and Systems
Presentation Title: Evaluation of a Brief Intervention: Achievement Gaps and Reliability
Authors: Joselyn Perez*, D. Betsy McCoach*

1.15 pm to 2.15 pm
Location: Kent AB (4, Baltimore Marriott Waterfront)
Session Title: 8F. Research Methods
Presentation Title: Deciphering Hyperparameter Choices in Machine Learning for Propensity Score Estimation: A Systematic Review of GBM and Random Forest Methods
Authors: Huibin Zhang, Walter Leite, Zachary Collier*, Kamal Chawla, Lingchen Kong, YongSeok Lee, Jia Quan
Presenting Author: Huibin Zhang

1.15 pm to 2.15 pm
Location: Harborside Foyer (4, Baltimore Marriott Waterfront)
Session Title: 8F. Research Methods
Presentation Title: Exploring Estimates of Multilevel Reliability for School Based Behavioral Measures
Authors: Katie Pelton*, D. Betsy McCoach*

2.15 pm to 3.45 pm
Location: Kent AB (4, Baltimore Marriott Waterfront)
Session Title: 9E. Advancing Educational Outcomes through Machine Learning and Predictive Analytics
Chair: Kylie Anglin*

*RMME Community member