Month: December 2020

Mark Your Calendar: New RMME/STAT Joint Colloquia Announced!

As a continuation of the presentation series this fall, the University of Connecticut’s Research Methods, Measurement, & Evaluation (RMME) program and Statistics department will jointly sponsor several additional RMME/STAT colloquia, starting in January of 2021. So, mark your calendar now! And as always, be sure to check the RMME website for more information as these talks approach!

1/29/2021 | 12:00-1:15pm EST | P. Richard Hahn | Arizona State University
2/26/2021 | 12:00-1:15pm EST | Edward Ip | Wake Forest University
3/26/2021 | 12:00-1:15pm EDT | David Dunson | Duke University
4/16/2021 | 12:00-1:15pm EDT | Susan Paddock | NORC at the University of Chicago
4/23/2021 | 2:00-3:15pm EDT | Jean-Paul Fox | University of Twente
5/21/2021 | 12:00-1:15pm EDT | David Kaplan | University of Wisconsin-Madison
9/10/2021 | 12:00-1:00pm EDT | Susan Murphy | Harvard University


Dr. Bianca Montrosse-Moorhead Gives Talk: “Working with Youth in Evaluation”

On December 4, 2020, Dr. Bianca Montrosse-Moorhead (RMME faculty member) gave a fantastic talk entitled “Working with Youth in Evaluation” for EvalYouth Global, a global, multi-stakeholder network that supports and promotes young and emerging evaluators (YEEs) and youth-led accountability around the world. In this presentation, Dr. Montrosse-Moorhead spoke about the foundations, principles, and tensions in youth participatory evaluation work. Interested individuals can access a copy of her presentation at: https://www.researchgate.net/publication/346629709_Working_with_Youth_in_Evaluation.

Upcoming RMME/STAT Colloquium (12/18): Paul De Boeck, “Response Accuracy and Response Time in Cognitive Tests”

RMME/STAT Joint Colloquium:

Response Accuracy and Response Time in Cognitive Tests

Paul De Boeck
The Ohio State University

December 18th at 12:00pm EST

https://uconn-cmr.webex.com/uconn-cmr/j.php?MTID=m1b6efd4435a2cd17535d693bd2ac1a14

How much a cognitive test score reflects ability and how much it reflects speed is an old and still unresolved question. The well-known speed-accuracy tradeoff does not make the question any easier to answer. In this presentation I will report the results of the steps I have taken to investigate the problem. Briefly summarized, the findings are as follows. First, the correlation between ability and speed across persons depends on the test. Second, across different kinds of modeling and different kinds of data, item-wise dependencies (i.e., conditional dependencies) between response accuracy and response time seem to remain after controlling for the underlying latent variables. Third, these remaining dependencies depend on the difficulty of the test items and are curvilinear. I will present an explanation for the findings and a tentative, complex answer to the old question of what is being measured in a cognitive test.
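
For readers who would like a concrete feel for the kind of conditional dependence the abstract describes, the sketch below simulates a joint model for response accuracy and response time. It is only an illustrative toy, not Dr. De Boeck's models or data: the 2PL accuracy model, the lognormal-style response-time model, the assumed correlation between ability and speed, and the difficulty-linked dependence term are all choices made for this example.

```python
import numpy as np

# Illustrative simulation (not De Boeck's models): a 2PL accuracy model and a
# lognormal-style response-time model share correlated person parameters
# (ability, speed), plus an assumed item-level term that links accuracy and
# log response time even after the latent variables are accounted for.
rng = np.random.default_rng(0)
n_persons, n_items = 2000, 20

# Correlated latent ability (theta) and speed (tau); the 0.4 correlation is an assumption.
cov = [[1.0, 0.4], [0.4, 1.0]]
theta, tau = rng.multivariate_normal([0.0, 0.0], cov, size=n_persons).T

# Item parameters: discrimination a, difficulty b, time intensity beta.
a = rng.uniform(0.8, 1.6, n_items)
b = rng.normal(0.0, 1.0, n_items)
beta = rng.normal(0.5, 0.3, n_items)

# Accuracy: P(correct) = logistic(a_j * (theta_i - b_j)).
logit = a[None, :] * (theta[:, None] - b[None, :])
p_correct = 1.0 / (1.0 + np.exp(-logit))
correct = rng.binomial(1, p_correct)

# Log response time: beta_j - tau_i + noise, plus a conditional-dependence term
# whose size grows with item difficulty (a stand-in for the difficulty-related
# dependencies mentioned in the abstract; the linear form is an assumption).
delta = 0.15 * b
log_rt = (beta[None, :] - tau[:, None]
          + delta[None, :] * (correct - p_correct)
          + rng.normal(0.0, 0.3, (n_persons, n_items)))

# Check: per item, correlate accuracy and log-RT after regressing both on the
# true latent variables; nonzero residual correlations are the "remaining"
# item-wise dependencies.
X = np.column_stack([np.ones(n_persons), theta, tau])

def residualize(y):
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return y - X @ coef

resid_corr = [np.corrcoef(residualize(correct[:, j].astype(float)),
                          residualize(log_rt[:, j]))[0, 1]
              for j in range(n_items)]
print("residual accuracy-RT correlations by item:")
print(np.round(resid_corr, 3))
```

Plotting these residual correlations against item difficulty would show the difficulty-related pattern built into the simulation; the curvilinear shape described in the abstract is a finding of Dr. De Boeck's research, not something this toy reproduces.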