RMME News & Updates

Upcoming RMME/STAT Colloquium (1/29): P. Richard Hahn, “The Bayesian Causal Forest Model: Regularization, Confounding, and Heterogeneous Effects”

RMME/STAT Joint Colloquium:

The Bayesian Causal Forest Model: Regularization, Confounding, and Heterogeneous Effects

Richard Hahn
Arizona State University

January 29, 2021, at 12:00 EST

https://uconn-cmr.webex.com/uconn-cmr/j.php?MTID=mc19e545b14cc3a980ffc36760a5ce5f4

This talk will describe recent work on Bayesian supervised learning for conditional average treatment effects. Dr. Hahn will motivate the proposed Bayesian causal forest model in terms of how it fixes two specific flaws in previous approaches. First, the model allows direct regularization of the treatment effect function, providing lower-variance estimates of heterogeneous treatment effects. Second, including an estimate of the propensity score as a control variable in the model mitigates a phenomenon called “regularization-induced confounding,” which leads to substantial bias in previous approaches. Dr. Hahn will conclude with a detailed discussion of designing simulation studies to systematically investigate and validate machine learning models for causal inference.

Note: Dr. Hahn may also talk about this tutorial: https://math.la.asu.edu/~prhahn/xbcf_demo.html
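The second fix described in the abstract, entering an estimated propensity score as a control variable in the outcome model, can be sketched in a few lines. This is a minimal illustration of the idea only, not Dr. Hahn's Bayesian causal forest: it substitutes scikit-learn models as a stand-in for the Bayesian forest priors, and the simulated data and variable names are assumptions of the sketch.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 2000
X = rng.normal(size=(n, 3))

# Treatment assignment depends on X, so naive comparisons are confounded.
p_true = 1 / (1 + np.exp(-X[:, 0]))
Z = rng.binomial(1, p_true)

# Outcome with a heterogeneous treatment effect tau(x).
tau = 0.5 + 0.5 * (X[:, 1] > 0)
y = X[:, 0] + tau * Z + rng.normal(scale=0.5, size=n)

# Step 1: estimate the propensity score pi(x) = P(Z = 1 | x).
pi_hat = LogisticRegression().fit(X, Z).predict_proba(X)[:, 1]

# Step 2: include pi_hat as an extra control variable in the outcome model.
features = np.column_stack([X, pi_hat, Z])
outcome_model = RandomForestRegressor(n_estimators=200, random_state=0)
outcome_model.fit(features, y)

# Plug-in CATE estimate: predicted outcome with Z = 1 minus with Z = 0.
f1 = outcome_model.predict(np.column_stack([X, pi_hat, np.ones(n)]))
f0 = outcome_model.predict(np.column_stack([X, pi_hat, np.zeros(n)]))
cate = f1 - f0  # per-person treatment-effect estimates
```

The actual BCF model places separate Bayesian forest priors on the prognostic and treatment-effect functions; the two-step plug-in above only mirrors where the propensity estimate enters the model.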

Mark Your Calendar: New RMME/STAT Joint Colloquia Announced!!

As a continuation of the presentation series this fall, the University of Connecticut’s Research Methods, Measurement, & Evaluation (RMME) program and Statistics department will jointly sponsor several additional RMME/STAT colloquia, starting in January of 2021. So, mark your calendar now! And as always, be sure to check the RMME website for more information as these talks approach!

– 1/29/2021, 12:00–1:15pm EST: P. Richard Hahn, Arizona State University
– 2/26/2021, 12:00–1:15pm EST: Edward Ip, Wake Forest University
– 3/26/2021, 12:00–1:15pm EDT: David Dunson, Duke University
– 4/16/2021, 12:00–1:15pm EDT: Susan Paddock, NORC at the University of Chicago
– 4/23/2021, 2:00–3:15pm EDT: Jean-Paul Fox, University of Twente
– 5/21/2021, 12:00–1:15pm EDT: David Kaplan, University of Wisconsin–Madison
– 9/10/2021, 12:00–1:00pm EDT: Susan Murphy, Harvard University


Dr. Bianca Montrosse-Moorhead Gives Talk: “Working with Youth in Evaluation”

On December 4, 2020, Dr. Bianca Montrosse-Moorhead (RMME faculty member) gave a fantastic talk, “Working with Youth in Evaluation,” for EvalYouth Global, a global, multi-stakeholder network that supports and promotes young and emerging evaluators (YEEs) and youth-led accountability around the world. In this presentation, Dr. Montrosse-Moorhead spoke about the foundations, principles, and tensions in youth participatory evaluation work. Interested individuals can access a copy of her presentation at: https://www.researchgate.net/publication/346629709_Working_with_Youth_in_Evaluation.

Upcoming RMME/STAT Colloquium (12/18): Paul De Boeck, “Response Accuracy and Response Time in Cognitive Tests”

RMME/STAT Joint Colloquium:

Response Accuracy and Response Time in Cognitive Tests

Paul De Boeck
The Ohio State University

December 18, 2020, at 12:00 EST

https://uconn-cmr.webex.com/uconn-cmr/j.php?MTID=m1b6efd4435a2cd17535d693bd2ac1a14

How much a cognitive test score reflects ability and how much it reflects speed is an old and still unresolved issue, and the well-known speed-accuracy tradeoff does not make the question easier to answer. In this presentation, I will report results from my research investigating the problem. Briefly summarized, the findings are as follows. First, the correlation of ability and speed across persons depends on the test. Second, based on different kinds of modeling and different kinds of data, there appear to be remaining item-wise dependencies (i.e., conditional dependencies) between response accuracy and response time after controlling for the underlying latent variables. Third, these remaining dependencies depend on the difficulties of the test items and are also curvilinear. I will present an explanation for the findings, and a tentative, complex answer to the old question of what is being measured in a cognitive test.
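The notion of conditional dependence in the abstract can be illustrated with a toy simulation (my own construction, not Dr. De Boeck's models or data): if a residual component is shared by an item's accuracy and response time, the two remain correlated even after the person-level latent variables (ability and speed) are controlled for.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5000

# Person-level latent variables: ability and (correlated) speed.
ability = rng.normal(size=n)
speed = 0.5 * ability + rng.normal(scale=0.8, size=n)

# An item-level residual shared by accuracy and response time; this is what
# induces the conditional dependence in this toy example.
shared = rng.normal(scale=0.5, size=n)

# Accuracy follows a logistic model (item difficulty set to 0 for simplicity);
# log response time decreases with speed and picks up the shared residual.
p_correct = 1 / (1 + np.exp(-(ability + shared)))
accuracy = rng.binomial(1, p_correct)
log_rt = 1.0 - speed - 0.5 * shared + rng.normal(scale=0.3, size=n)

def residualize(y, x):
    """Remove the (linear) effect of x from y."""
    slope, intercept = np.polyfit(x, y, 1)
    return y - (slope * x + intercept)

# After controlling for ability and speed, a dependence remains between
# accuracy and response time because of the shared residual.
r_conditional = np.corrcoef(residualize(accuracy, ability),
                            residualize(log_rt, speed))[0, 1]
```

Here the shared residual enters accuracy positively and log response time negatively, so `r_conditional` comes out clearly negative; with no shared component it would be near zero.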

Pamela Peters, RMME Ph.D. Student, Earns Research Award

Pamela Peters recently took 3rd place for her paper entitled, “Development of the Assessment of Teachers’ Attitudes Toward Twice-Exceptionality,” at the National Association for Gifted Children’s Research and Evaluation Network Graduate Student Research Gala (Doctoral-level, Completed Research category). Congratulations on this accomplishment, Pam!

Upcoming RMME/STAT Colloquium (11/20): Bengt Muthen, “Recent Advances in Latent Variable Modeling”

Recent Advances in Latent Variable Modeling

Bengt Muthen

Friday, November 20, 2020

11:30am – 1:00pm

Abstract:  This talk gives an overview of some recent and ongoing latent variable research.  Borrowing ideas from multilevel factor analysis, longitudinal SEM in a single-level, wide format is formulated in a new way that finds a well-fitting model 45 years after the writing of the classic Wheaton, Muthen, Alwin, and Summers article.  This segues into a generalization of latent transition analysis using the multilevel notion of a random intercept while staying in a single-level, wide format.  Turning back to multilevel modeling, the talk considers time series analysis of intensive longitudinal data.  This is illustrated by intervention data on electricity consumption and a randomized intervention related to positive and negative affect where cycles play a major role.  Finally, the new feature in Mplus Version 8.5 of Bayesian analysis of count, nominal, and binary logit models is presented.


This session is jointly sponsored by the Statistics department and the Research Methods, Measurement, and Evaluation program as part of the Statistical Applications and Quantitative Research Methods colloquium series.

RMME’s 11/6 Application Deadline for Online Programs

Looking to enhance your skills in program evaluation, quantitative research, measurement, and/or data analysis? Know a colleague who wants to develop this in-demand skill set? Get prepared for the future with Research Methods, Measurement, & Evaluation (RMME) at the University of Connecticut!


With 100% Online or Campus-based options for our 12-credit Graduate Certificate in Program Evaluation and 30-credit RMME Master’s Degree, we offer:

– Flexibility for working professionals—Study anytime, anywhere;

– Courses designed and taught by expert RMME faculty;

– Opportunities for individualized course selection to facilitate your personal career goals;

– And more!


Furthermore, with advance planning, it is even possible to earn BOTH the Graduate Certificate in Program Evaluation and the RMME Master’s Degree WITH NO ADDITIONAL COURSEWORK (beyond the 30 credits required for the master’s degree).

For more information, please visit:

100% Online Program Evaluation Certificate

100% Online RMME Master’s Degree

Or email: methods@uconn.edu


Start your journey today—Spring 2021 application deadlines for both the Program Evaluation Certificate and RMME Master’s Degree programs are November 6, 2020, 11:59pm EST!