Archived Posts

Ashley Taconet, Program Evaluation Certificate Grad, Earns Award

Ashley Taconet, a graduate of RMME’s Graduate Certificate in Program Evaluation program and a current doctoral student in Neag’s Educational Psychology department, earned one of five scholarships awarded to graduate students in 2022 by the Council for Exceptional Children’s Division on Career Development and Transition. Ashley earned the award for her project, “Examining Independent Living Skills and Economic Hardship for Youth with Disabilities Using Data from the NLTS2012.” See this announcement for more details.

Congratulations on this stellar accomplishment, Ashley!

 

RMME Community Members Contribute to Construct Validation Article

Dr. Graham Rifenbark (RMME alumnus), Dr. H. Jane Rogers (retired RMME faculty member), Dr. Hariharan Swaminathan (retired RMME faculty member), Ashley Taconet (RMME Program Evaluation Certificate graduate), and Shannon Langdon (RMME Program Evaluation Certificate graduate) contributed to a recently published article led by Dr. Allison Lombardi, “Establishing Construct Validity of a Measure of Adolescent Perceptions of College and Career Readiness,” in the journal Career Development and Transition for Exceptional Individuals. Congratulations to all of the authors of this new paper!

 

Abstract:

The purpose of this study was to establish construct validity of a college and career readiness measure using a sample of youth with (n = 356) and without (n = 1,599) disabilities from five high schools across three U.S. states. We established content validity through expert item review, structural validity through initial field-testing, and convergent validity by correlating domain scores with school academic and behavioral data. A four-factor measurement model emerged representing the domains Ownership of Learning, Academic Engagement and Processes, Interpersonal Engagement, and Career Development. Domain scores were significantly correlated with achievement, college admission exam scores, and attendance. Implications for research and practice with an emphasis on transition service delivery via multi-tiered systems of support are discussed.
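For readers unfamiliar with the convergent-validity step, it reduces to a simple computation: correlate each survey domain score with an external school record. Below is a minimal sketch in Python of that kind of check, using simulated data; every variable name and number here is invented for illustration and none of it comes from the study.

```python
import numpy as np
import pandas as pd
from scipy.stats import pearsonr

rng = np.random.default_rng(42)
n = 500  # illustrative sample size, not the study's

# Simulate a latent "readiness" trait that drives both the survey
# domain scores and the school outcomes, so correlations are nonzero.
readiness = rng.normal(size=n)
df = pd.DataFrame({
    "ownership_of_learning": readiness + rng.normal(scale=1.0, size=n),
    "career_development":    readiness + rng.normal(scale=1.0, size=n),
    "gpa":                   readiness + rng.normal(scale=1.5, size=n),
    "attendance_rate":       readiness + rng.normal(scale=2.0, size=n),
})

# Convergent validity: each domain score should correlate with
# achievement and attendance, as the abstract reports.
for domain in ["ownership_of_learning", "career_development"]:
    for outcome in ["gpa", "attendance_rate"]:
        r, p = pearsonr(df[domain], df[outcome])
        print(f"{domain} vs. {outcome}: r = {r:.2f}, p = {p:.3g}")
```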

RMME Programs Celebrates Its Fall 2022 Grads!!!

We, here at UConn’s RMME Programs, are thrilled to celebrate our newest alumni from the

  • RMME Master’s degree program and
  • RMME’s Graduate Certificate in Program Evaluation program

We cannot wait to see all of the amazing things you will accomplish as you further your career with your well-deserved new credential(s). Congratulations, Shannon A., Shannon L., Ashley, Sierra, and Amelia!!! We are so proud of you!!

 

RMME Programs Celebrates Fall 2022 Graduates

 

 

Dr. D. Betsy McCoach and Pamela M. Peters Honored at NAGC 2022

Neag Researchers Earn Awards at NAGC 2022
Neag School of Education researchers earn awards at the 2022 annual meeting of the National Association for Gifted Children (left to right: Dr. Susan Dulong Langley, Dr. Del Siegle, Dr. D. Betsy McCoach, Pamela M. Peters). [Photo credit: Renzulli Center for Creativity, Gifted Education, and Talent Development; Facebook: https://www.facebook.com/uconngifted]

The RMME Community celebrates the Neag School of Education researchers who received awards at the 2022 annual meeting of the National Association for Gifted Children (NAGC).

RMME Professor Dr. D. Betsy McCoach was recognized as the 2022 NAGC Distinguished Scholar. In this capacity, she gave a featured presentation entitled “How Can We Answer the Most Fundamental Questions in Gifted Education?”

Dr. McCoach also received an award as a co-author of the Gifted Child Quarterly Paper of the Year.

In addition, Pam Peters (RMME doctoral student) earned a Carolyn Callahan Doctoral Student Award for her “exemplary work in research, publications, and educational service, as well as…potential for future scholarship.” [NAGC Press Release]

Congratulations to these two outstanding scholars and all of this year’s NAGC award winners!

SAVE THE DATE! Modern Modeling Methods Returns to UConn!

 

 

Mark your calendar! The Modern Modeling Methods (M3) conference returns to UConn after a lengthy pandemic-induced hiatus. From June 26 to 28, 2023, M3 will resume as an in-person conference on the Storrs campus. Keynote speakers and workshop presenters include Bengt Muthén, Tihomir Asparouhov, and Ellen Hamaker. Remember to check the M3 website regularly for more information and updates.

 

Upcoming RMME/STAT Colloquium (11/11): Dylan Small, “Testing an Elaborate Theory of a Causal Hypothesis”

RMME/STAT Joint Colloquium

Testing an Elaborate Theory of a Causal Hypothesis

Dr. Dylan Small
University of Pennsylvania

Friday, November 11, at 11:00AM ET

https://uconn-cmr.webex.com/uconn-cmr/j.php?MTID=m8da0e35b64c861fc97a21dd36fb29ded

When R.A. Fisher was asked what can be done in observational studies to clarify the step from association to causation, he replied, “Make your theories elaborate” — when constructing a causal hypothesis, envisage as many different consequences of its truth as possible, and plan observational studies to discover whether each of these consequences holds. William Cochran called “this multi-phasic attack…one of the most potent weapons in observational studies.” Statistical tests of the various pieces of the elaborate theory help to clarify how much the causal hypothesis is corroborated. In practice, the degree of corroboration has been assessed by verbally describing which of the several tests provides evidence for which of the several predictions. This verbal approach can miss quantitative patterns, so we developed a quantitative approach to making statistical inferences about how much of the elaborate theory is supported by the evidence. This is joint work with Bikram Karmakar.
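As a rough illustration of the multi-phasic idea (a toy sketch only, not Karmakar and Small’s actual method), the Python snippet below takes hypothetical p-values from tests of several predicted consequences of one causal hypothesis and summarizes corroboration in two crude ways: a Bonferroni-corrected count and Fisher’s combination test.

```python
import numpy as np
from scipy.stats import combine_pvalues

# Hypothetical p-values from tests of four predicted consequences of a
# single causal hypothesis (numbers invented for illustration).
p_values = np.array([0.003, 0.021, 0.048, 0.30])
k = len(p_values)

# Crude summary 1: how many predictions survive a Bonferroni
# correction at the overall 0.05 level?
n_corroborated = int(np.sum(p_values < 0.05 / k))
print(f"{n_corroborated} of {k} predictions significant after Bonferroni")

# Crude summary 2: Fisher's combination test of the global null
# hypothesis that none of the predicted consequences holds.
stat, p_global = combine_pvalues(p_values, method="fisher")
print(f"Fisher combined statistic = {stat:.2f}, global p = {p_global:.4f}")
```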

 


RMME Community Members Earn NAGC Awards

The National Association for Gifted Children (NAGC) recently announced its 2022 NAGC Award recipients, and RMME Programs has multiple reasons to celebrate!

 

  • Dr. D. Betsy McCoach (RMME Professor) earned NAGC’s Distinguished Scholar Award.
  • Pamela M. Peters (RMME Doctoral Student) earned NAGC’s Carolyn Callahan Doctoral Student Award.
  • Dr. D. Betsy McCoach (RMME Professor) co-authored the NAGC Gifted Child Quarterly Paper of the Year, written by Susan Dulong Langley, E. Jean Gubbins, D. Betsy McCoach, Karen Ottone-Cross, and Del Siegle.

 

Congratulations to Dr. McCoach, Pamela, and all of this year’s NAGC Award recipients!

 

RMME Faculty Member, Dr. Bianca Montrosse-Moorhead, Named Co-Editor-in-Chief of Leading Evaluation Journal

Congratulations to RMME faculty member Dr. Bianca Montrosse-Moorhead, who was recently named Co-Editor-in-Chief of the leading evaluation journal New Directions for Evaluation (see announcement here). The journal is one of two published by the American Evaluation Association, the premier professional organization for evaluators. Dr. Montrosse-Moorhead will serve in this role for the next three years alongside Dr. Sarah Mason of the University of Mississippi. Check out this UConn Today article for more. And congratulations again to the well-deserving Dr. Bianca Montrosse-Moorhead on this outstanding appointment!

 

Upcoming RMME/STAT Colloquium (10/7): Edsel A. Pena, “Searching for Truth through Data”

RMME/STAT Joint Colloquium

Searching for Truth through Data

Dr. Edsel A. Pena
University of South Carolina

Friday, October 7, at 11:15AM ET, AUST 108

https://uconn-cmr.webex.com/uconn-cmr/j.php?MTID=m9667e91caf1197b47fc45f50529388b9

This talk concerns the role of statistical thinking in the Search for Truth using data. This brings us to a discussion of p-values, a much-used tool in scientific research but also a controversial concept that has elicited much, sometimes heated, debate and discussion. In March 2016, the American Statistical Association (ASA) was compelled to release an official statement regarding p-values; a psychology journal has even gone to the extreme of banning the use of p-values in its articles; and in 2018, a special issue of The American Statistician was fully devoted to the issue. A main concern in the use of p-values is the introduction of a somewhat artificial threshold, usually 0.05, into decision-making, with implications for the reproducibility and replicability of reported scientific results. Some new perspectives on the use of p-values in the search for truth through data will be discussed, touching in particular on the representation of knowledge and its updating based on observations. Related to the issue of p-values, the following question arises: “When given the p-value, what does it provide in the context of the updated knowledge of the phenomenon under consideration, and what additional information should accompany it?” Also to be addressed is whether it is time to move away from hard thresholds such as 0.05 and whether we are on the verge of — to quote Wasserstein, Schirm, and Lazar (2019) — a “World Beyond P < 0.05.”
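To make the threshold concern concrete, here is a small simulation (our illustration, not material from the talk): under a true null hypothesis, p-values are uniformly distributed, so a hard 0.05 cutoff manufactures “discoveries” at a fixed 5% rate, while under a modest true effect many p-values land just on either side of 0.05, where the evidence differs only trivially.

```python
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(7)
n_sims, n = 10_000, 30  # arbitrary simulation settings

# Under a true null (equal means), p-values are uniform on [0, 1],
# so a hard 0.05 threshold rejects about 5% of the time.
p_null = np.array([
    ttest_ind(rng.normal(size=n), rng.normal(size=n)).pvalue
    for _ in range(n_sims)
])
print(f"Null rejection rate at 0.05: {np.mean(p_null < 0.05):.3f}")

# Under a modest true effect, many p-values cluster near the cutoff,
# where p = 0.049 and p = 0.051 carry essentially the same evidence.
p_alt = np.array([
    ttest_ind(rng.normal(loc=0.3, size=n), rng.normal(size=n)).pvalue
    for _ in range(n_sims)
])
near_cutoff = np.mean((p_alt > 0.01) & (p_alt < 0.10))
print(f"Share of p-values between 0.01 and 0.10: {near_cutoff:.3f}")
```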

 


Upcoming RMME/STAT Colloquium (9/9): Kosuke Imai, “Experimental Evaluation of Algorithm-Assisted Human Decision-Making: Application to Pretrial Public Safety Assessment”

RMME/STAT Joint Colloquium

Experimental Evaluation of Algorithm-Assisted Human Decision-Making: Application to Pretrial Public Safety Assessment

Dr. Kosuke Imai
Harvard University

Friday, September 9, at 11:00AM ET

https://uconn-cmr.webex.com/uconn-cmr/j.php?MTID=m486f7b13e6881ba895b350f338b0c90d

Despite an increasing reliance on fully automated algorithmic decision-making in our day-to-day lives, human beings still make highly consequential decisions. As frequently seen in business, healthcare, and public policy, recommendations produced by algorithms are provided to human decision-makers to guide their decisions. While there exists a fast-growing literature evaluating the bias and fairness of such algorithmic recommendations, an overlooked question is whether they help humans make better decisions. We develop a general statistical methodology for experimentally evaluating the causal impacts of algorithmic recommendations on human decisions. We also show how to examine whether algorithmic recommendations improve the fairness of human decisions and derive the optimal decision rules under various settings. We apply the proposed methodology to preliminary data from the first-ever randomized controlled trial evaluating the pretrial Public Safety Assessment (PSA) in the criminal justice system. A goal of the PSA is to help judges decide which arrested individuals should be released. On the basis of the preliminary data available, we find that providing the PSA to the judge has little overall impact on the judge’s decisions and subsequent arrestee behavior. Our analysis, however, yields some potentially suggestive evidence that the PSA may help avoid unnecessarily harsh decisions for female arrestees regardless of their risk levels, while it encourages the judge to make stricter decisions for male arrestees who are deemed to be risky. In terms of fairness, the PSA appears to increase an existing gender difference while having little effect on any racial differences in judges’ decisions. Finally, we find that the PSA’s recommendations might be unnecessarily severe unless the cost of a new crime is sufficiently high.
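The backbone of such an evaluation is a randomized comparison: cases are randomly assigned so that the judge either does or does not see the algorithm’s recommendation, and outcomes are compared across arms. The sketch below uses invented data and shows only that backbone (an intention-to-treat contrast); the paper’s actual methodology, covering fairness measures and optimal decision rules, is far more general.

```python
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(1)
n = 2_000  # hypothetical number of cases, not the trial's actual size

# Invented trial data: z = 1 if the judge saw the algorithmic
# recommendation (randomized), y = 1 if the arrestee was released.
z = rng.integers(0, 2, size=n)
y = rng.binomial(1, 0.60 + 0.02 * z)  # small built-in treatment effect

# Intention-to-treat estimate: difference in release rates between
# the recommendation and no-recommendation arms.
itt = y[z == 1].mean() - y[z == 0].mean()
test = ttest_ind(y[z == 1], y[z == 0])
print(f"ITT estimate = {itt:.3f}, p = {test.pvalue:.3f}")
```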

 
