Archived Posts

RMME Instructor, Dr. Brenna Butler, Presents at AEA 2023

Dr. Brenna Butler, RMME instructor, presents at the AEA 2023 Conference. Dr. Butler shared two presentations: “Getting down to the essentials: How to effectively measure need on a needs assessment survey using a gap analysis approach” and “How can one-on-one client interactions be measured as organization-wide impacts?” Congratulations on these fantastic presentations, from the Research Methods, Measurement, & Evaluation Community!

 

RMME Instructor, Dr. Brenna Butler, Presents at AEA 2023

 

Author and Presenter: Brenna Butler

Presentation Title: Getting down to the essentials: How to effectively measure need on a needs assessment survey using a gap analysis approach

Abstract: Measuring need through a needs assessment survey is often not easy, given the many different methods used to define what a “need” is. In this Ignite presentation, participants will learn about one concrete way of measuring participants’ needs through a gap analysis, which measures both participants’ present state and their desired state (Watkins et al., 2012). Participants will be provided examples of survey questions that measure these states in a valid manner, housed in the framework of a systems thinking approach, meaning that participants’ present and desired states are examined with personal, social, and societal influences in consideration (Arthur & McMahon, 2005). The strengths and weaknesses of measuring need through this approach compared to other approaches will be briefly described. Participants will leave this Ignite presentation with one contextual framework to apply to their own needs assessment surveys, along with a template of survey questions that can be structured to measure needs effectively.
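
To make the gap-analysis logic concrete, here is a minimal sketch in Python; the item names and ratings are hypothetical, not taken from the presentation. Each item is rated once for the present state and once for the desired state, and the need score is simply the gap between the two, with larger gaps flagging higher-priority needs (Watkins et al., 2012).

current = {
    "program evaluation skills": 2.4,   # hypothetical mean rating of the present state (1-5 scale)
    "grant writing support":     3.8,
    "volunteer recruitment":     2.1,
}
desired = {
    "program evaluation skills": 4.6,   # hypothetical mean rating of the desired state (1-5 scale)
    "grant writing support":     4.0,
    "volunteer recruitment":     4.5,
}

# Need = desired state minus present state; larger gaps indicate higher-priority needs.
gaps = {item: desired[item] - current[item] for item in current}
for item, gap in sorted(gaps.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{item:28s} gap = {gap:+.1f}")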

 

Presenters: Brenna Butler, Michael Hamel, Matthew Spindler, Malinda Suprise

Presentation Title: Hidden stories: How can one-on-one client interactions be measured as organization-wide impacts?

Abstract: In large organizations, such as Cooperative Extensions, impacts that derive from one-on-one interactions with clientele often go missing as “hidden stories” due to a lack of effective data-tracking measures within the organization. This session will describe the methodology used to create a survey tool that captures the outputs and outcomes of educator-clientele interactions at Penn State Extension, focusing on supporting data utility at multiple levels (i.e., educator, supervisor, and leadership council). The session will also describe how the online survey built a foundation for organization-wide data analytics through consistent and structured data input and storage processes, which will facilitate future data analysis for complex decision-making throughout the organization’s evaluative cycles of activities and programs. The importance of stakeholder involvement in the development of this survey as a form of boundary-making will be discussed in relation to maximizing the utility of data collection throughout the organization. Data collection is inherently a form of boundary-making that determines which elements of a situation should be included in the informational picture constructed of a context and which should be excluded (Schwandt, 2018). Boundary-making was used in this instrument development process to leverage the important role of stakeholder involvement in defining what information should be collected and curated (Archibald, 2020). Participants should leave this demonstration with the knowledge and tools to employ a similar methodology at their own organizations to track individual interactions using a structure that allows for data aggregation at an organizational level.

 

 

RMME PhD student, Amanda Sutter, Presents at AEA 2023

RMME PhD student, Amanda Sutter, discusses “The Story Behind an Evaluation Practice Survey: Insights from Cognitive Interviews” at the 2023 AEA Conference. Congratulations on this wonderful presentation, from the Research Methods, Measurement, & Evaluation Community!

 

RMME PhD student, Amanda Sutter, presents at AEA 2023

 

Author and Presenter: Amanda Sutter

Title: The Story Behind an Evaluation Practice Survey: Insights from Cognitive Interviews

Abstract: What do evaluators think about when they are asked questions about evaluation practice? What narratives are triggered by what words? This research study focused on understanding the thought processes and beliefs of evaluators as they reflect on their practice through cognitive interviews. Cognitive interviews are a powerful instrument design strategy that helps reveal the narratives and meanings behind survey responses. Building on last year’s field pilot of a new evaluation practice instrument, this research on evaluation study involved cognitive interviews with 20 evaluators recruited through purposive nonprobability sampling to gather diverse perspectives on the evaluation practice instrument. The semi-structured interview process followed the Willis Method, offering participants an easy process for exploring their responses in depth. This paper shares findings from the interviews and how the data will be used to improve the instrument, particularly to ensure that the phrasing of questions and subsequent interpretations align with the intended construct definitions and narrative of the instrument. The paper will be of interest to a variety of audiences. Practitioners and commissioners will have opportunities to think about their own surveys and how cognitive interviews may be a useful tool for evaluation studies. Other researchers will also benefit from these opportunities, and they will learn about a potential instrument that may be useful in their own scholarly work.

RMME Faculty, Dr. Bianca Montrosse-Moorhead, & RMME PhD Student, Amanda Sutter, Present at AEA 2023

RMME Faculty member, Dr. Bianca Montrosse-Moorhead, and RMME PhD student, Amanda Sutter, present their workshop, “Building Better Surveys” at the AEA 2023 Conference. Congratulations on this successful presentation, from the Research Methods, Measurement, & Evaluation Community!

 

 

 

Authors: Bianca Montrosse-Moorhead and Amanda Sutter

Presentation Title: Building Better Surveys

Abstract: This skill-building workshop is back by popular demand! Have you ever wondered what strategies you can use to improve your surveys? Or wanted to create or adapt an instrument that better captures difficult-to-measure concepts? Or wanted to make sure that your survey instrument can accurately shed light on the story you are trying to understand? This session will teach you an easy-to-implement mixed-method feedback process to build better surveys. The session will begin with an overview of the process, which includes both a review and an interview component. Throughout the session, presenters will focus on the value and use of the process, walk through examples from real-life evaluation studies, and share resources for further learning. Attendees will also have the opportunity to practice using this process through hands-on activities. Attendees will leave the session with templates they can adapt for their own use and guidance to help them feel prepared to take action. Throughout the session, presenters will call attention to ways in which equity is and can be centered in the mixed-method feedback process.

RMME Master’s alumna, Xueying Gao, Presents at AEA 2023

RMME Master’s alumna, Xueying Gao, presents her poster, “Validation of the VI-SPDAT (version 3) instrument: a confirmatory factor analysis” at the 2023 AEA conference. Congratulations on this fantastic poster presentation, from the Research Methods, Measurement, & Evaluation Community!

 

RMME Master’s alumna, Xueying Gao, presents at AEA 2023

 

Authors: Xueying Gao and Brad Richardson

Presenter: Xueying Gao

Abstract: The Vulnerability Index-Service Prioritization Decision Assistance Tool (VI-SPDAT version 3; Community Solutions & OrgCode Consulting, Inc., 2020) is the primary assessment tool used to scale individuals’ vulnerability (or self-sufficiency). The Iowa Treatment for Individuals Experiencing Homelessness (IA-TIEH) project was conducted to advance an informed, integrated program for homeless and at-risk adults who experience co-occurring disorders; in that project, the VI-SPDAT was administered to participants at the intake assessment and the 6-month reassessment to measure their level of vulnerability. The current study used the sample collected from 2020 to 2023 (N = 539) to examine the construct validity of the VI-SPDAT (version 3) with two factor models: a single-factor model and a hierarchical three-factor model. Results suggested that the single-factor model did not demonstrate adequate fit (CFI = 0.467, TLI = 0.445, RMSEA = 0.111, SRMR = 0.114), while the hierarchical CFA model demonstrated better, though still marginal, fit (CFI = 0.856, TLI = 0.848, RMSEA = 0.061, SRMR = 0.071), suggesting the instrument’s limitations in measuring individuals’ vulnerability and other outcome parameters in research and clinical practice. Some items were not associated with the global factor or a sub-factor. The VI-SPDAT has substantial weaknesses in its theoretical alignment, item performance, and psychometric properties. We recommend developing a new multidimensional scale of vulnerability using a rigorous measurement development protocol.
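
For readers less familiar with the fit indices reported above, the following sketch shows how CFI, TLI, and RMSEA are computed, using the standard formulas, from a fitted model’s chi-square and degrees of freedom, the baseline (independence) model’s chi-square and degrees of freedom, and the sample size. The chi-square values below are hypothetical placeholders for a poorly fitting single-factor model; only N = 539 comes from the abstract.

import math

def fit_indices(chi2_m, df_m, chi2_b, df_b, n):
    """CFI, TLI, and RMSEA from the fitted model's and baseline model's chi-square statistics."""
    d_m = max(chi2_m - df_m, 0.0)               # fitted model noncentrality
    d_b = max(chi2_b - df_b, 0.0)               # baseline (independence) model noncentrality
    cfi = 1.0 - d_m / max(d_b, d_m, 1e-12)
    tli = ((chi2_b / df_b) - (chi2_m / df_m)) / ((chi2_b / df_b) - 1.0)
    rmsea = math.sqrt(d_m / (df_m * (n - 1)))   # some software divides by n rather than n - 1
    return cfi, tli, rmsea

# Hypothetical chi-square values for a poorly fitting single-factor model; N = 539 as in the abstract.
cfi, tli, rmsea = fit_indices(chi2_m=1200.0, df_m=170, chi2_b=2100.0, df_b=190, n=539)
print(f"CFI = {cfi:.3f}, TLI = {tli:.3f}, RMSEA = {rmsea:.3f}")

SRMR is omitted from the sketch because it is computed from the residual correlation matrix rather than from chi-square statistics.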

Upcoming RMME/STAT Colloquium (11/3): Xinyuan Song, “Hidden Markov Models with an Unknown Number of Hidden States”

RMME/STAT Joint Colloquium

Hidden Markov Models with an Unknown Number of Hidden States

Dr. Xinyuan Song
The Chinese University of Hong Kong

Friday, November 3, at 10AM ET

https://tinyurl.com/rmme-Song

Hidden Markov models (HMMs) are valuable tools for analyzing longitudinal data due to their capability to describe dynamic heterogeneity. Conventional HMMs typically assume that the number of hidden states (i.e., the order of HMMs) is known or predetermined through criterion-based methods. This talk discusses double-penalized procedures for simultaneous order selection and parameter estimation for homogeneous and heterogeneous HMMs. We develop novel computing algorithms to address the challenges of updating the order. Furthermore, we establish the consistency of order and parameter estimators. Simulation studies show that the proposed procedures considerably outperform the commonly used criterion-based methods. An application to the Alzheimer’s Disease Neuroimaging Initiative study further confirms the utility of the proposed method.
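
For context, the sketch below illustrates the conventional criterion-based approach the abstract refers to, not Dr. Song’s double-penalized procedure: Gaussian HMMs of increasing order are fit to a toy simulated series and the order minimizing BIC is selected. It assumes the third-party hmmlearn package is available, and the data are simulated purely for illustration.

import numpy as np
from hmmlearn.hmm import GaussianHMM   # third-party package: hmmlearn

rng = np.random.default_rng(0)
# Toy series: 500 draws from three well-separated Gaussian components
# (independent draws rather than a true Markov chain, but enough to illustrate order selection).
means = np.array([-3.0, 0.0, 3.0])
states = rng.integers(0, 3, size=500)
X = rng.normal(means[states], 1.0).reshape(-1, 1)

def n_free_params(k, d):
    # Diagonal-covariance Gaussian HMM: initial probabilities (k - 1),
    # transition matrix k(k - 1), state means k*d, state variances k*d.
    return (k - 1) + k * (k - 1) + 2 * k * d

best_k, best_bic = None, np.inf
for k in range(1, 6):
    model = GaussianHMM(n_components=k, covariance_type="diag",
                        n_iter=200, random_state=0)
    model.fit(X)
    log_lik = model.score(X)                                   # total log-likelihood of the series
    bic = -2.0 * log_lik + n_free_params(k, X.shape[1]) * np.log(len(X))
    if bic < best_bic:
        best_k, best_bic = k, bic

print("Order selected by BIC:", best_k)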

 


Upcoming RMME/STAT Colloquium (10/13): Wes Bonifay, “Uncovering the Hidden Complexity of Statistical Models”

RMME/STAT Joint Colloquium

Uncovering the Hidden Complexity of Statistical Models

Dr. Wes Bonifay
University of Missouri

Friday, October 13, at 11AM ET

https://tinyurl.com/rmme-Bonifay

Model complexity is the ability of a statistical model to fit a wide range of data patterns. Complexity is routinely assessed by simply counting the number of freely estimated parameters in a given model. However, complexity is also affected by configural form, that is, by the particular arrangement of the variables in the model. Recent considerations of configural complexity have found that certain models have an inherent tendency to fit well to any possible data (sometimes achieving superior goodness-of-fit when compared to alternative models that contain a greater number of free parameters!). In this talk, Dr. Bonifay will present a method for evaluating configural complexity and demonstrate how more sophisticated considerations of complexity can improve applied research in the social sciences.
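
As a toy illustration of the configural-form point (this is not Dr. Bonifay’s method, and the two model forms below are illustrative choices, not taken from the talk): the sketch fits two models with the same number of free parameters, a straight line and a sine curve with a free frequency, to many pure-noise data sets and compares their typical goodness-of-fit, making the idea of an inherent tendency to fit any possible data tangible.

import numpy as np
from scipy.optimize import curve_fit

def linear(x, a, b):        # two free parameters, rigid configural form
    return a + b * x

def sine(x, a, b):          # two free parameters, far more flexible configural form
    return a * np.sin(b * x)

rng = np.random.default_rng(1)
x = np.linspace(0.0, 1.0, 20)

def best_r2(model, y, starts):
    """Best R^2 over several starting values (the sine fit is multimodal in its frequency)."""
    best = -np.inf
    for p0 in starts:
        try:
            params, _ = curve_fit(model, x, y, p0=p0, maxfev=5000)
        except RuntimeError:    # curve_fit may fail to converge from a given start
            continue
        resid = y - model(x, *params)
        best = max(best, 1.0 - np.sum(resid**2) / np.sum((y - y.mean())**2))
    return best

lin_r2, sin_r2 = [], []
for _ in range(200):
    y = rng.normal(size=x.size)     # pure noise: there is no structure to recover
    lin_r2.append(best_r2(linear, y, [(0.0, 0.0)]))
    sin_r2.append(best_r2(sine, y, [(1.0, f) for f in (5.0, 20.0, 60.0)]))

print(f"Mean R^2 on random data: linear = {np.mean(lin_r2):.2f}, sine = {np.mean(sin_r2):.2f}")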

 


Modern Modeling Methods: Conference Registration Ends 6/19

Register today for the 2023 Modern Modeling Methods Conference. Conference registration ends June 19!

 

2023 Modern Modeling Methods (M3) Conference
Dates: June 26 – June 28, 2023
Location: University of Connecticut’s Main Campus in Storrs, CT
Description: The Modern Modeling Methods (M3) Conference is an interdisciplinary conference designed to showcase the latest modeling methods and to present research related to these methodologies. Planned events include:

  • Monday, June 26: Full-day preconference workshop by Bengt Muthén, Tihomir Asparouhov, and Ellen Hamaker, “New Features in Mplus Version 8.9 and Forthcoming 8.10”
  • Tuesday (June 27) & Wednesday (June 28): Keynote presentations by Bengt Muthén and Ellen Hamaker; talks by Tihomir Asparouhov, Jay Magidson, Daniel McNeish, David A. Kenny, and many others.

See the M3 Preliminary Program for a full list of talks.

Visit our website: modeling.uconn.edu.

Register Here!

 

Register Now! Modern Modeling Methods Conference Registration Ends June 19!