Archived Posts

Dr. Bianca Montrosse-Moorhead To Give Evaluation Café Talk, 2/28

What are RMME faculty up to? Check out Dr. Bianca Montrosse-Moorhead’s Evaluation Café talk on February 28, 2024, at 12pm ET, to find out! Her featured presentation is entitled “Modernizing Evaluation’s Cartography, Architectural Blueprint, and Definition”. Register here to reserve your spot in this excellent session: http://tinyurl.com/EvalCafe2024-BMM

 

ABSTRACT: In this Evaluation Café presentation, Dr. Montrosse-Moorhead will preview the Country of the Mind map, the location of the Evaluation Building on that map, and a close-up of the building itself. None of these have been visualized before in published scholarship. Dr. Montrosse-Moorhead will also share the proposed amended definition of evaluation, why it is necessary, and the implications of adopting it for evaluation practice; for the instruments, methods, and techniques we use; and for evaluation’s theoretical and metatheoretical scholarship.

 

Check out Dr. Bianca Montrosse-Moorhead’s upcoming Evaluation Café talk on February 28, at 12pm ET!

Dr. Bianca Montrosse-Moorhead Serves on AI & Evaluation Panel, 2/21

On February 21, 2024, Dr. Bianca Montrosse-Moorhead (Associate Professor in the RMME Programs) joined MERL Tech’s Natural Language Processing Community of Practice (NLP-CoP) to discuss the impact of Artificial Intelligence (AI) on future evaluation practice. The panel also included other experts in the field: Sarah Mason, Izzy Thornton, Tarek Azzam, Sahiti Bhaskara, and Blake Beckmann. The session covered a wide variety of topics, ranging from the birth of AI to consulting challenges and AI competencies. Click here for more information on the talk and to register for part two of this discussion series!

RMME Programs Celebrates its Fall 2023 Grads!!!

UConn’s Research Methods, Measurement, & Evaluation (RMME) Programs are excited to celebrate our newest graduates from the RMME Master’s degree program and RMME’s Graduate Certificate in Program Evaluation! We cannot wait to see all of the many ways you will make us proud as new RMME graduates! Congratulations, all, from the Research Methods, Measurement, & Evaluation Community!

Celebrating Our Fall 2023 RMME Programs Graduates! Congratulations to all!


Upcoming RMME/STAT Colloquium (12/1): Irini Moustaki, “Some New Developments on Pairwise Likelihood Estimation & Testing in Latent Variable Models”

RMME/STAT Joint Colloquium

Some New Developments on Pairwise Likelihood Estimation & Testing in Latent Variable Models

Dr. Irini Moustaki
London School of Economics

Friday, December 1, at 11AM ET

https://tinyurl.com/rmme-Moustaki

Pairwise likelihood is a limited-information method used to estimate latent variable models, including factor analysis models for categorical data. It avoids evaluating high-dimensional integrals and is therefore computationally more efficient than full-information maximum likelihood. This talk will discuss two new developments in the estimation and testing of latent variable models for binary data under the pairwise likelihood framework. The first development concerns estimation and limited-information goodness-of-fit test statistics under complex sampling; the performance of the estimators and the proposed test statistics under simple random sampling and unequal probability sampling is evaluated using simulated data. The second development focuses on computational aspects of pairwise likelihood: despite its computational advantages, it can still be demanding for large-scale problems that involve many observed variables. We propose an approximation of the pairwise likelihood estimator derived from an optimization procedure relying on stochastic gradients. The stochastic gradients are constructed by subsampling the pairwise log-likelihood contributions, with the subsampling scheme controlling the per-iteration computational complexity. The stochastic estimator is shown to be asymptotically equivalent to the pairwise likelihood one; however, finite-sample performance can be improved by compounding the sampling variability of the data with the uncertainty introduced by the subsampling scheme. We demonstrate the performance of the proposed method using simulation studies and two real data applications.
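
To make the subsampling idea concrete, the sketch below is a minimal, purely illustrative Python example and not the speaker’s implementation: it simulates binary responses from a hypothetical one-factor probit model, evaluates pairwise log-likelihood contributions from bivariate normal cell probabilities, and takes stochastic gradient-ascent steps using a random subsample of item pairs at each iteration, with finite-difference gradients standing in for analytic ones.

import numpy as np
from scipy.stats import multivariate_normal, norm

rng = np.random.default_rng(0)

# Simulate binary responses from a one-factor probit model (illustrative only).
n, p = 500, 6
true_load = rng.uniform(0.4, 0.8, size=p)      # factor loadings
true_tau = rng.uniform(-0.5, 0.5, size=p)      # item thresholds
xi = rng.standard_normal(n)                    # latent factor scores
ystar = np.outer(xi, true_load) + rng.standard_normal((n, p)) * np.sqrt(1 - true_load**2)
Y = (ystar > true_tau).astype(int)             # observed binary items

def pair_loglik(params, Y, pairs):
    """Sum of pairwise log-likelihood contributions over the given item pairs."""
    p = Y.shape[1]
    load, tau = params[:p], params[p:]
    ll = 0.0
    for j, k in pairs:
        rho = load[j] * load[k]                                  # implied tetrachoric correlation
        p11 = multivariate_normal.cdf([-tau[j], -tau[k]], mean=[0.0, 0.0],
                                      cov=[[1.0, rho], [rho, 1.0]])
        p1 = norm.cdf(-tau[j])                                   # marginal P(y_j = 1)
        q1 = norm.cdf(-tau[k])                                   # marginal P(y_k = 1)
        probs = np.array([[1 - p1 - q1 + p11, q1 - p11],
                          [p1 - p11, p11]])                      # 2x2 cell probabilities
        for a in (0, 1):
            for b in (0, 1):
                n_ab = np.sum((Y[:, j] == a) & (Y[:, k] == b))   # observed cell count
                ll += n_ab * np.log(max(probs[a, b], 1e-12))
    return ll

def numeric_grad(f, x, eps=1e-5):
    """Central finite-difference gradient (stand-in for analytic derivatives)."""
    g = np.zeros_like(x)
    for i in range(len(x)):
        e = np.zeros_like(x)
        e[i] = eps
        g[i] = (f(x + e) - f(x - e)) / (2 * eps)
    return g

all_pairs = [(j, k) for j in range(p) for k in range(j + 1, p)]
params = np.concatenate([np.full(p, 0.5), np.zeros(p)])          # starting values

# Stochastic gradient ascent: each iteration uses only a random subsample of
# item pairs, which is what controls the per-iteration computational cost.
for it in range(200):
    idx = rng.choice(len(all_pairs), size=5, replace=False)
    batch = [all_pairs[i] for i in idx]
    grad = numeric_grad(lambda th: pair_loglik(th, Y, batch), params)
    params = params + 0.01 * grad / n                            # small, fixed step size
    params[:p] = np.clip(params[:p], 0.05, 0.95)                 # keep implied correlations valid

print("estimated loadings:", np.round(params[:p], 2))
print("true loadings:     ", np.round(true_load, 2))

A production implementation would use analytic gradients, a decaying step size, and the variance adjustments discussed in the talk; the sketch only shows how subsampling the pairwise contributions yields a stochastic gradient that is proportional, in expectation, to the full pairwise log-likelihood gradient.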

 


RMME Faculty and Students Present at NERA 2023

Dr. D. Betsy McCoach (RMME Faculty member, Discussant), Amanda Sutter (RMME PhD Student, Presenter), Marcus Harris (RMME PhD Student, Presenter), Claudia Ventura (RMME PhD Student, Presenter), Faeze Safari (RMME PhD Student, Presenter), & Kirsten Reyna (RMME PhD Student, Presenter) presented a symposium on “The Future of Educational Measurement” at NERA 2023. Congratulations on this excellent symposium, from the Research Methods, Measurement, & Evaluation Community!

 

RMME Faculty and Students Present at NERA 2023

 

Symposium Chair/Discussant: D. Betsy McCoach

Symposium Presenters: Amanda Sutter, Marcus Harris, Claudia Ventura, Faeze Safari, & Kirsten Reyna

Symposium Abstract: Emergent measurement scholars provide their perspectives on current issues and future directions in educational measurement. The four presentations focus on four critical areas: (1) equity and social justice, (2) the context in which we operate, (3) the rise of artificial intelligence, and (4) graduate training in educational measurement.

Symposium Presentations:

1) Context Matters
Amanda Sutter, Marcus Harris

2) Equity and Social Justice Issues and Values in Measurement
Claudia Ventura, Amanda Sutter

3) Exploring the Challenges and Potential of Artificial Intelligence in Educational Measurement
Faeze Safari, Kirsten Reyna

4) Transforming Graduate School Training: Advancing Measurement and Open Science
Marcus Harris, Kirsten Reyna