
RMME News & Updates

Upcoming RMME/STAT Colloquium (2/7): Paul De Boeck, “Adventitious Error Everywhere”

RMME/STAT Joint Colloquium

Adventitious Error Everywhere

Dr. Paul De Boeck

The Ohio State University

Friday, February 7, at 11 AM ET

https://tinyurl.com/rmme-PDeBoeck

Adventitious error is a concept introduced by Wu and Browne (Psychometrika, 2015) to explain the imperfect goodness of fit of covariance structure models (CSMs; i.e., factor models and SEM). The paper was published together with critical remarks from the reviewers. In my presentation I will discuss, illustrate, and speculate about the potential of adventitious error beyond CSMs: (a) as a unitary framework to understand and deal with underestimated inferential uncertainty regarding relations between variables, heterogeneity in meta-analysis, violations of measurement invariance, and individual differences in the validity of tests; and (b) as a joint framework for reliability and validity. The presentation is partly based on De Boeck, DeKay, and Pek (Psychometrika, 2024, 89, 1055-1073).
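To give a flavor of the idea ahead of the talk, here is a small numerical sketch (not from the talk itself; the one-factor loadings, the precision parameter `m`, and the use of a rescaled Wishart draw as a stand-in for Wu and Browne's inverse-Wishart formulation are all illustrative assumptions): when the population covariance matrix is itself a random draw centered on the model-implied matrix, the model misfits even "in the population."

```python
import numpy as np

rng = np.random.default_rng(0)
p = 4
lam = np.array([0.8, 0.7, 0.6, 0.5])                    # one-factor loadings (illustrative)
sigma_model = np.outer(lam, lam) + np.diag(1 - lam**2)  # model-implied covariance

# Adventitious error: the population covariance is itself random, centered on
# the model-implied matrix. Here a Wishart draw rescaled to have the right
# mean serves as a crude stand-in for the inverse-Wishart formulation.
m = 50                                # precision: larger m => less adventitious error
X = rng.multivariate_normal(np.zeros(p), sigma_model / m, size=m)
sigma_pop = X.T @ X                   # Wishart(m, sigma_model/m) draw, mean = sigma_model

def ml_discrepancy(S, Sigma):
    """ML discrepancy F(S, Sigma); zero iff S equals Sigma."""
    return (np.log(np.linalg.det(Sigma)) - np.log(np.linalg.det(S))
            + np.trace(S @ np.linalg.inv(Sigma)) - S.shape[0])

print(ml_discrepancy(sigma_pop, sigma_model))  # > 0: misfit at the population level
```

The discrepancy is strictly positive for any perturbed population matrix, which is the sense in which adventitious error makes exact fit unattainable even with infinite data.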

 


Upcoming RMME/STAT Colloquium (1/24): Walter Dempsey, “Challenges in Time-varying Causal Effect Moderation Analysis in Mobile Health”

RMME/STAT Joint Colloquium

Challenges in Time-varying Causal Effect Moderation Analysis in Mobile Health

Dr. Walter Dempsey

University of Michigan

Friday, January 24, at 11 AM ET

https://tinyurl.com/rmme-Dempsey

Twin revolutions in wearable technologies and smartphone-delivered digital health interventions have significantly expanded the accessibility and uptake of mobile health (mHealth) interventions in multiple domains of health sciences. Sequentially randomized experiments called micro-randomized trials (MRTs) have grown in popularity as a means to empirically evaluate the effectiveness of mHealth intervention components. MRTs have motivated a new class of causal estimands, termed "causal excursion effects," which allow health scientists to answer important scientific questions about how intervention effectiveness may change over time or be moderated by individual characteristics, time-varying context, or past responses. In this talk, we present two new tools for causal effect moderation analysis. First, we consider a meta-learner perspective, in which any supervised learning algorithm can be used to assist in the estimation of the causal excursion effect. We will present theoretical results and accompanying simulation experiments to demonstrate relative efficiency gains. Practical utility of the proposed methods is demonstrated by analyzing data from a multi-institution cohort of first-year medical residents in the United States. Second, we will consider effect moderation with tens or hundreds of potential moderators. In this setting, it becomes necessary to use the observed data to select a simpler model for effect moderation and then make valid statistical inference. We propose a two-stage procedure that leverages recent advances in post-selection inference using randomization. We will discuss the asymptotic validity of the conditional selective inference procedure and the importance of randomization. Simulation studies verify the asymptotic results. We end with an analysis of an MRT for promoting physical activity in cardiac rehabilitation to demonstrate the utility of the method.
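As a rough sketch of the estimand (not the speaker's method; the data-generating numbers and the constant, known randomization probability are illustrative assumptions), a causal excursion effect moderated by a time-varying covariate can be estimated in a simulated MRT by regressing the outcome on the centered treatment indicator, which is the core idea behind weighted-and-centered least squares:

```python
import numpy as np

rng = np.random.default_rng(1)
n, T = 200, 30        # participants, decision points
p_t = 0.5             # known, constant randomization probability

mod = rng.normal(size=(n, T))            # a time-varying moderator (illustrative)
A = rng.binomial(1, p_t, size=(n, T))    # micro-randomized treatment indicator
# True causal excursion effect: 0.5 + 0.3 * moderator (illustrative numbers).
Y = 1.0 + 0.2 * mod + (0.5 + 0.3 * mod) * A + rng.normal(size=(n, T))

# Centering the treatment indicator at the randomization probability makes the
# moderated-effect coefficients estimable by ordinary least squares.
Ac = (A - p_t).ravel()
X = np.column_stack([np.ones(n * T), mod.ravel(), Ac, Ac * mod.ravel()])
beta = np.linalg.lstsq(X, Y.ravel(), rcond=None)[0]
print(beta[2:])       # should be close to the true values 0.5 and 0.3
```

The talk's contributions go well beyond this toy setting (supervised-learning nuisance estimation, and valid inference after moderator selection), but the centered regression illustrates the estimand being targeted.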

 


RMME Programs Celebrate Their Fall 2024 Grads!!!

UConn’s Research Methods, Measurement, & Evaluation (RMME) Programs community is so proud of our Fall 2024 graduates of the RMME Master’s degree program and the Graduate Certificate in Program Evaluation! We are thrilled to celebrate this educational milestone with you all and look forward to watching you grow as outstanding RMME alumni and professionals! Congratulations, from the entire Research Methods, Measurement, & Evaluation community!

Congratulations to Our Fall 2024 RMME Programs Graduates!


RMME to Host Exhibits Booth at STEM Super Saturday on January 11, 2025!

The Rhode Island Society for Technology in Education (RISTE), the Rhode Island Science Teachers Association (RISTA), and the Rhode Island Math Teachers Association (RIMTA) are jointly hosting STEM Super Saturday! This annual event for educators will take place at the Rhode Island Nursing Education Center in Providence, RI, on January 11, 2025.

UConn’s RMME Programs are thrilled to participate in and support the event by hosting an exhibits booth for attendees! If you will be there, be sure to stop by for an opportunity to chat personally with Dr. Sarah D. Newton, Associate Director of RMME Online Programs! We cannot wait to see you there!

Upcoming RMME/STAT Colloquium (12/6): Kristen Olson, “Recent Lessons on Mixing Mail and Web Data Collection Modes in Household Surveys”

RMME/STAT Joint Colloquium

Recent Lessons on Mixing Mail and Web Data Collection Modes in Household Surveys

Dr. Kristen Olson

University of Nebraska-Lincoln

Friday, December 6, at 11 AM ET

https://tinyurl.com/rmme-Olson

Survey designs increasingly incorporate self-administered modes (namely mail and web) to counter decreasing response rates and increasing costs in interviewer-administered surveys. In this talk, I will cover recent research on mixing modes of self-administered data collection in household surveys, with a focus on designing for survey participation. I will explore theory for why sample members may select one self-administered mode over another, how modes affect response rates, representation, and costs, and design decisions related to recruitment materials for mixed-mode surveys.


Upcoming RMME/STAT Colloquium (11/15): Pascale Deboeck, “Effects Across Time to Modeling States of a System: A Dynamical Systems Perspective on Modeling Social Science Data”

RMME/STAT Joint Colloquium

Effects Across Time to Modeling States of a System: A Dynamical Systems Perspective on Modeling Social Science Data

Dr. Pascale Deboeck

University of Utah

Friday, November 15, at 11 AM ET

https://tinyurl.com/rmme-Deboeck

Modeling of repeated observations across time generally falls into one of two categories. The first is modeling observations as they change across time; that is, as a function of time. The second is modeling observations as they relate to prior observations; many common models in the social sciences fall into this second category, including autoregressive models, cross-lagged panel models, and latent difference score models. A less common approach in this latter category is differential equation modeling. Differential equation models express relations between the state of observations and how they are changing, and they offer the opportunity to imagine change relations that may be commonly overlooked when modeling repeated measures. This talk will introduce the application of differential equation modeling to intensive longitudinal data. Specific examples will include estimating derivatives from noisy data and the possibility of testing novel models of intraindividual dynamics.
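To make the idea concrete, here is a toy sketch (not from the talk; the damped linear oscillator, the noise level, and the window size are illustrative assumptions): derivatives are estimated from noisy observations with local quadratic fits, a simplified version of generalized local linear approximation, and the dynamics are then recovered by regressing the second derivative on the level and the first derivative.

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulate a damped linear oscillator  d2x = eta*x + zeta*dx
# (eta = -1.0, zeta = -0.2 are illustrative), then add measurement noise.
eta, zeta = -1.0, -0.2
dt_sim, n_sim = 0.001, 20000
x, dx = 1.0, 0.0
traj = []
for _ in range(n_sim):
    d2x = eta * x + zeta * dx
    x, dx = x + dt_sim * dx, dx + dt_sim * d2x
    traj.append(x)
obs = np.array(traj[::100])                      # sample every 0.1 time units
obs = obs + rng.normal(scale=0.02, size=obs.shape)

# Estimate x, dx, d2x at each time point from a local quadratic fit:
dt, half = 0.1, 5
t_win = dt * np.arange(-half, half + 1)
est = []
for i in range(half, len(obs) - half):
    c2, c1, c0 = np.polyfit(t_win, obs[i - half:i + half + 1], 2)
    est.append((c0, c1, 2 * c2))                 # level, first, second derivative
x_e, dx_e, d2x_e = np.array(est).T

# Recover the dynamics by regressing d2x on x and dx:
X = np.column_stack([x_e, dx_e])
eta_hat, zeta_hat = np.linalg.lstsq(X, d2x_e, rcond=None)[0]
print(eta_hat, zeta_hat)    # should be close to the true eta = -1.0, zeta = -0.2
```

Note the contrast with an autoregressive model: the parameters here describe how the system's state relates to its own change, rather than how one occasion predicts the next.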


RMME Community Members Present at AEA 2024

Multiple Research Methods, Measurement, & Evaluation (RMME) community members are presenting at the 2024 annual meeting of the American Evaluation Association (AEA) this week. Remember to check out these excellent presentations; we hope to see you there!

Wednesday, October 23
4:15 PM to 5:15 PM
Location: B117-119
Session Title: (Panel + AEA Sponsored Sessions)
Presentation Title: Amplifying and Empowering Voices through Evaluation Journals
Chair: Bianca Montrosse-Moorhead*
Presenters: Florence Etta/Laura Peck/Nicole Bowman (Lunaape/Mohican)/Ayesha Boyce/Thomas Archibald/Daniel Balsaobre-Lorente/Michael A. Harnar/Sarah Mason/Hanh Cao Yu

Thursday, October 24
10:15 AM to 11:15 AM
Location: G131-132
Session Title: (Panel + Theories of Evaluation)
Presentation Title: What Happens When You Give Evaluators a Box of Crayons? Reflections on Visualizations of Evaluation Theory
Chair: Sebastian Lemire
Presenters: Christina Christie/Tarek Azzam/Bianca Montrosse-Moorhead*
Discussant: Melvin M. Mark

6:00 PM to 7:30 PM
Location: N/A
Session Title: Poster Reception #2 (Poster + Organizational Learning and Evaluation Capacity Building)
Presentation Title: 257 – Amplifying the voices of new Evaluation Capacity Builders: Lessons learned from transferring new skills and ideas into the workplace
Chair: Leslie A. Fierro*
Presenters: Emily Frost/Lashonda Williams/Renee Boyer/Jordyn J. Livingston
Discussants: Heather D. Codd/Renee Boyer

Friday, October 25
11:30 AM to 12:30 PM
Location: C120-122
Session Title: (Multipaper + Health Evaluation)
Presentation Title: The Updated CDC Program Evaluation Framework: Revised to Meet Current and Future Public Health and Evaluation Needs
Presenters: Daniel P. Kidder/Heather Salvaggio/Elena Luna
Discussant: Leslie A. Fierro*
Authors: Daniel P. Kidder/Cassandra Davis/Destiny Bruce/Michael Kerzner

12:45 PM to 1:45 PM
Location: Exhibit Hall A
Session Title: Birds of a feather (STEM Education and Training)
Presentation Title: 62 – Amplifying the Voices of Multigenerational and Multidisciplinary STEM Evaluation Journeys – A Roundtable Bridge of Resources and Mentorship
Presenters: Kevin Keane/Clara M. Pelfrey/Breonte Guy/Leslie A. Fierro*/Carlos Romero/Alexandra Budenz/Robert Garcia/Rechelle Paranal
Facilitators: Rajiv R. Ramdat/Ciara C. Knight

2:30 PM to 3:30 PM
Location: Portland Ballroom 254
Session Title: (Roundtable + Research on Evaluation)
Presentation Title: What does it mean to be an evaluator?
Presenters: Doris Espelien/Dana J. Linnell/Bianca Montrosse-Moorhead*/Aileen M. Reid/Cherie M. Avent/Grettel Arias Orozco/Onyinyechukwu Onwuka du Bruyn
Authors: Doris Espelien/Dana J. Linnell/Bianca Montrosse-Moorhead*/Aileen M. Reid/Cherie M. Avent/Grettel Arias Orozco/Onyinyechukwu Onwuka du Bruyn

3:45 PM to 4:45 PM PST
Location: B113-114
Session Title: (Think Tank + Use and Influence of Evaluation)
Presentation Title: The Critical Role of Equity and Inclusion in Defining Rigor: What It Takes to Reimagine Peer Review Journals
Chair: Hanh Cao Yu
Presenters: Rodney Hopson/Bianca Montrosse-Moorhead*

4:15 PM to 4:30 PM
Location: D133-134
Session Title: Building Evaluator Capacity and Competencies (Paper + Graduate Students and New Evaluators and Research on Evaluation)
Presentation Title: Doctoral Student Experiences with Research on Evaluation: Insights and Opportunities from a Collaborative Autoethnography
Presenters: Amanda Sutter*/Valerie Marshall/Rachael Kenney/Allison Prieur/Kari Ross Nelson/Christine Liboon

 

*RMME Community member

Upcoming RMME/STAT Colloquium (10/11): Sandip Sinharay, “Assessment of Fit of Item Response Theory Models: Full-information and Limited-information Methods, Item and Person Fit Analysis, and Beyond”

RMME/STAT Joint Colloquium

Assessment of Fit of Item Response Theory Models: Full-information and Limited-information Methods, Item and Person Fit Analysis, and Beyond

Dr. Sandip Sinharay

Educational Testing Service (ETS) Research Institute

Friday, October 11, at 11:15 AM ET

AUST 110

https://tinyurl.com/rmme-Sinharay

Item response theory (IRT) is one of the central methodological pillars supporting many large and high-profile assessment programs globally. IRT analysis is essentially a type of discrete multivariate analysis and is performed using IRT models, which are latent variable models for discrete data. However, IRT models involve multiple assumptions, such as conditional independence and monotonicity, and the results obtained from IRT models may not be accurate if one or more of these assumptions are not met, that is, if there is IRT model misfit. This presentation will include a comprehensive review of the literature on the assessment of fit of IRT models. The presenter will discuss various approaches and concepts regarding IRT model fit, including full-information and limited-information methods, residual analysis, item- and person-fit analysis, Bayesian methods, analysis of differential item functioning, and assessment of the practical significance of misfit. A real data example will be used to illustrate some of the approaches. One goal of the presentation is to stimulate discussion among audience members regarding IRT model-fit assessment.
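As a small taste of person-fit analysis ahead of the talk (a sketch, not the speaker's material; the Rasch model, the item difficulties, and the response patterns are illustrative assumptions), the standardized log-likelihood statistic lz flags response patterns that are unlikely under the fitted model:

```python
import numpy as np

def rasch_p(theta, b):
    """Rasch probability of a correct response given ability theta, difficulty b."""
    return 1.0 / (1.0 + np.exp(-(theta - b)))

def lz_person_fit(u, theta, b):
    """Standardized log-likelihood person-fit statistic lz.
    Large negative values flag response patterns that misfit the model."""
    p = rasch_p(theta, b)
    q = 1.0 - p
    l0 = np.sum(u * np.log(p) + (1 - u) * np.log(q))       # observed log-likelihood
    mean = np.sum(p * np.log(p) + q * np.log(q))           # its expectation
    var = np.sum(p * q * np.log(p / q) ** 2)               # and variance
    return (l0 - mean) / np.sqrt(var)

b = np.linspace(-2, 2, 20)            # item difficulties (illustrative)
theta = 0.0
rng = np.random.default_rng(3)
u_ok = (rng.random(20) < rasch_p(theta, b)).astype(int)    # model-consistent pattern
u_bad = (b > 0).astype(int)           # aberrant: misses easy items, solves hard ones

print(lz_person_fit(u_ok, theta, b), lz_person_fit(u_bad, theta, b))
```

The model-consistent pattern yields an lz near zero, while the aberrant pattern yields a strongly negative value; the talk covers this family of person-fit methods alongside item-fit and full- and limited-information approaches.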
