Evaluation 2015: Exemplary Evaluations in a Multicultural World

Exemplary Program Evaluation—Insights from Three Studies of Federally-Funded Educational Programs

Session Number: 1970
Track: Program Theory and Theory-Driven Evaluation
Session Type: Multipaper
Session Chair: Michael Reynolds [NORC at the University of Chicago]
Discussant: Mary Morris Hyde [Corporation for National and Community Service]
Presenter 1: Carrie Markovitz [NORC at the University of Chicago]
Presenter 2: Jake Bartolone [NORC at the University of Chicago]
Presenter 3: Sarah-Kathryn McDonald [NORC at the University of Chicago]
Presentation 1 Additional Author: Marc W. Hernandez [Senior Research Scientist - NORC at the University of Chicago]
Presentation 2 Additional Author: Marie Halverson [Vice President, Education and Child Development - NORC at the University of Chicago]
Presentation 2 Additional Author: Tom Hoffer [Senior Fellow - NORC at the University of Chicago]
Time: Nov 13, 2015 (07:00 AM - 07:45 AM)
Room: Plaza B

Abstract 1 Title: Findings from the Outcome Evaluation of the Minnesota Reading Corps PreK Program
Presentation Abstract 1: The few existing evaluations of volunteer-only tutoring programs have shown variation in their effectiveness at improving children's literacy proficiency. Minnesota Reading Corps is a data-driven, statewide initiative to help every Minnesota child become a proficient reader by the end of third grade. The program engages a diverse group of volunteer AmeriCorps members with no prerequisite literacy or education backgrounds to provide literacy interventions to preschool (PreK) and Kindergarten through third grade (K-3) elementary school students. This study, sponsored by the Corporation for National and Community Service, used a matched-comparison quasi-experimental design (QED) to evaluate the PreK program's ability to affect preschool students' emergent literacy skills. The results showed that AmeriCorps members helped 4- and 5-year-old students meet or exceed spring targets for kindergarten readiness in all five assessed areas. Importantly, effect sizes for these findings were substantial (between .40 and .72).
Abstract 2 Title: Evaluation of the National Science Foundation’s (NSF) Graduate Research Fellowship Program (GRFP)
Presentation Abstract 2: The GRFP awards approximately 2,000 fellowships yearly to graduate students in research-based programs within STEM fields. NORC's evaluation of the GRFP focused on the program's impact on participants' educational decisions, career preparations, aspirations and progress, and professional productivity, as well as details of how the program is implemented at universities. The evaluation included approximately 10,000 web surveys of fellows and similarly ranked applicants who were not awarded the fellowship, analysis of existing national data sources covering similar populations, and site visits and phone interviews with 24 institutions hosting fellows. The evaluation demonstrated the program's impact on fellows' graduate school experiences and career outcomes, including greater likelihood of Ph.D. completion within 10 years, more papers published, more presentations given, and more grants awarded as a PI compared to their non-fellow peers. The population of fellows also included a larger proportion of women and underrepresented minorities (URMs) than the national STEM Ph.D. population, indicating broadened access.
Abstract 3 Title: Documenting and Assessing Returns on Investments in STEM Education Research — Modeling and Measuring Impacts of Research Use for Science Policy Making
Presentation Abstract 3: Considerable effort has been devoted in the past decade to advancing the science of science policy-making. Of particular interest in assessing the outcomes of major programs of investment is the extent to which impact evaluations can (and should) account for secondary and tertiary distal outcomes (including, e.g., the extent to which results of R&D programs are used, by whom, with what outcomes, over time). This paper reports results of an analysis of a selection of publicly available documents examined as part of a larger systematic literature review the author is conducting. A generic, adaptable logic model for assessing returns on (and informing the development of) R&D investment programs is proposed, developed by documenting and synthesizing the program logic models articulated (and implied) in descriptions and evaluations of federally funded STEM education research programs, with a particular emphasis on distal returns associated with utilization of research findings beyond the academy.
Audience Level: Advanced

Session Abstract: This session addresses issues of longstanding importance in program theory and theory-driven evaluation, including: How can the effectiveness (and potential) of programs be gauged when program design constrains the evaluator's ability to construct or reference informative counterfactuals? How can findings of an essentially summative nature be employed formatively for program improvement purposes? When should evaluation results influence decision-makers' judgments regarding program viability? Three presentations highlight issues for developers to address in designing and modeling programs, considering from the outset the evidence they are likely to be able to present over time to warrant continued investment in a program (or investment in similar ones). The session is designed to stimulate discussion of these and other issues addressed in AEA's Guiding Principles for Evaluators, with particular attention to the principles regarding systematic inquiry and general and public welfare (e.g., constructing program evaluations to generate evidence optimizing their broader impacts).
