Adapting Evaluation Designs to Real-World Changes and Challenges: US and International Examples

Session Number: 1899
Track: Human Services Evaluation
Session Type: Panel
Tags: child welfare, children's rights, cross-cultural evaluation, design thinking, developing countries, evaluation design, evaluation methods, international development, multi-site evaluation, TA evaluation, workforce development
Session Chair: Anne Chamberlain [Senior Research Associate - IMPAQ International]
Discussant: Valerie Jean Caracelli [Senior Social Science Analyst - U.S. Government Accountability Office]
Presenter 1: Shannon Renee Howard [Senior Research Analyst - IMPAQ International]
Presenter 2: Neha Nanda [Research Associate - IMPAQ International]
Presenter 3: Sonam Gupta [Research Associate - IMPAQ International]
Presenter 4: Chris Brandt [Vice President - IMPAQ International]
Presentation 1 Additional Author: Susan G Berkowitz [Director for Qualitative Research/Principal Research Associate - IMPAQ International]
Presentation 1 Additional Author: Shannon Renee Howard [Senior Research Analyst - IMPAQ International]
Presentation 2 Additional Author: Yang Chen [Research Associate - IMPAQ International]
Presentation 2 Additional Author: Nicholas Bill [Senior Research Associate - IMPAQ International]
Presentation 3 Additional Author: Maurice Kugler [Principal Research Scientist & Managing Director - IMPAQ International]
Presentation 3 Additional Author: Sandeep Shetty [Research Associate - IMPAQ International]
Presentation 4 Additional Author: Ryoko Yamaguchi [President - Plus Alpha Research & Consulting]
Time: Oct 26, 2016 (04:30 PM - 06:00 PM)
Room: M102

Abstract 1 Title: Adapting Evaluation Design to Project and Policy Changes: The Performance Evaluation of the Education Priorité Qualité (EPQ) Project in Senegal
Presentation Abstract 1:

USAID’s Education Priorité Qualité (EPQ) project (2010-2014) aimed to improve the quality of teaching and learning in middle schools in selected regions of Senegal through improved teacher preparation, creation of a Whole School Environment, strengthened basic skills in reading and math, and expanded opportunities for youth. When IMPAQ submitted its proposal, the EPQ was still ongoing and expected to continue; the design focused on evaluating project performance relative to long-term objectives and assessing the potential for scale-up. By the time of award, the EPQ had ended. The evaluation design changed to reflect a shift in emphasis toward extracting lessons learned that could inform future Senegalese educational policy. This presentation will discuss how IMPAQ evaluators adapted the original evaluation design by reframing selected evaluation questions and refining a rapid assessment data collection and analysis approach, which enabled the team to produce a useful final report with timely, policy-relevant recommendations.


Abstract 2 Title: Adapting a Quasi-Experimental Design to the Absence of Individual-Level Participant Data in an Evaluation of a Workforce Development Grant to the Ohio-Pennsylvania Interstate Region
Presentation Abstract 2:

In 2012, the United States Department of Labor (USDOL) awarded the Ohio-Pennsylvania Interstate Region $6 million to develop and enhance workforce support in manufacturing. The region aims to increase student enrollment in manufacturing programs, introduce nationally recognized credentials to improve employment and earnings outcomes, and increase employers’ productivity and competitiveness. Activities include holding industrial career fairs at high schools, making presentations to employers, and conducting online workshops. USDOL requires that the evaluation employ a quasi-experimental design, so in the absence of data on individual participants, IMPAQ has adapted by developing a variation on an “intent-to-treat” design. The treatment group comprises unemployed and underemployed individuals who live in the Ohio-Pennsylvania Interstate Region. The comparison group is made up of similar individuals from matched comparison counties, identified using a two-step procedure (see the sketch below). The presentation will describe the rationale for and implementation of this adaptive design.
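The abstract does not spell out the two-step procedure, so the sketch below is only a hypothetical illustration of how matched comparison counties might be selected from county-level aggregates: a coarse screening step followed by nearest-neighbor matching on standardized covariates. All county names, covariates, and thresholds are invented for the example and are not the evaluation's actual inputs.

```python
# Illustrative sketch only: the presentation's actual two-step procedure is not
# detailed in the abstract. This shows one common two-step approach to matching
# comparison counties on aggregate covariates (all values below are hypothetical).
import numpy as np
import pandas as pd

# Hypothetical county-level covariates (e.g., Census/BLS-style aggregates)
counties = pd.DataFrame({
    "county": ["T1", "T2", "C1", "C2", "C3", "C4"],
    "treated": [1, 1, 0, 0, 0, 0],
    "unemployment_rate": [7.2, 6.8, 7.0, 5.1, 7.5, 6.6],
    "pct_manufacturing": [18.0, 21.0, 19.5, 12.0, 20.2, 17.8],
    "median_earnings": [41_000, 39_500, 40_200, 48_000, 38_900, 42_300],
})
covs = ["unemployment_rate", "pct_manufacturing", "median_earnings"]

# Step 1: screen candidate comparison counties on a coarse criterion, e.g.,
# manufacturing share within a tolerance band around the treated counties' range.
treated = counties[counties["treated"] == 1]
pool = counties[(counties["treated"] == 0)
                & counties["pct_manufacturing"].between(
                    treated["pct_manufacturing"].min() - 3,
                    treated["pct_manufacturing"].max() + 3)]

# Step 2: nearest-neighbor match on standardized covariates (Euclidean distance),
# with replacement, for simplicity.
z = (counties[covs] - counties[covs].mean()) / counties[covs].std()
for _, t in treated.iterrows():
    d = np.linalg.norm(z.loc[pool.index] - z.loc[t.name], axis=1)
    match = pool.iloc[np.argmin(d)]
    print(f"{t['county']} matched to {match['county']}")
```

Matching here is with replacement and on only three covariates for brevity; the evaluation's actual procedure may differ in both the screening and the matching step.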


Abstract 3 Title: Adapting a Randomized Controlled Trial Design to Address Possible Spillover Effects in an Evaluation of a Child Labor Prevention Program in India
Presentation Abstract 3:

IMPAQ International is evaluating Bal Mitra Gram (BMG), the flagship program of Bachpan Bachao Andolan (BBA) in India, under a grant from the Bureau of International Labor Affairs. BBA was founded in 1980 to promote the rights and protection of children and to ensure that they have access to free, quality education. After IMPAQ was awarded the grant, we learned that treatment villages would need to be located in relative proximity to one another to minimize program delivery costs. This requirement posed a potential threat to our randomized controlled trial (RCT) design by increasing the likelihood of spillover from treatment villages to control villages. To mitigate this risk, we adapted the randomization procedure: we constructed a matrix of distances between all villages and applied a minimum distance criterion before random assignment (see the sketch below). This presentation will discuss the decision-making surrounding the redesign and how well the adaptation has been working.
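The abstract names the ingredients (a distance matrix and a minimum distance criterion applied before random assignment) but not the exact algorithm, so the following is a minimal sketch under assumptions: the village coordinates, the 3 km threshold, and the greedy thinning rule are all hypothetical. The real design also had to keep treatment villages relatively close to one another, a cost constraint this simple version does not model.

```python
# Illustrative sketch only: one simple way to apply a minimum distance
# criterion to a village list before random assignment (data are hypothetical).
import random
import numpy as np

rng = random.Random(42)  # fixed seed so the assignment is reproducible

# Hypothetical village locations (e.g., GPS points projected to km)
coords = np.array([[0.0, 0.0], [1.0, 0.5], [5.0, 5.0], [5.5, 4.5],
                   [10.0, 0.0], [10.5, 1.0], [0.5, 9.0], [6.0, 9.5]])
MIN_KM = 3.0  # assumed minimum separation between any two study villages

# Distance matrix: pairwise Euclidean distances between all villages
dist = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)

# Greedily retain villages so every retained pair is at least MIN_KM apart,
# reducing the chance that a control village sits close to a treatment village.
kept = []
for v in rng.sample(range(len(coords)), len(coords)):  # random scan order
    if all(dist[v, k] >= MIN_KM for k in kept):
        kept.append(v)

# Random assignment among the villages that survive the distance screen
rng.shuffle(kept)
treatment, control = kept[: len(kept) // 2], kept[len(kept) // 2:]
print("treatment:", sorted(treatment), "control:", sorted(control))
```

A variant closer to the stated constraint would enforce the minimum distance only between treatment-control pairs while allowing treatment villages to cluster; the greedy screen above is simply the most compact version of "apply the criterion before randomizing."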


Abstract 4 Title: Hidden in the Weeds: How SEED Program Evaluators Adapted Evaluation Designs to Address Challenges While Maintaining Rigorous What Works Clearinghouse (WWC) Standards
Presentation Abstract 4:

The U.S. Department of Education recently awarded IMPAQ International a contract to provide analytic technical assistance (TA) to Supporting Effective Educator Development (SEED) grantees. IMPAQ provides ongoing support to evaluators of 13 SEED grantees to ensure that evaluations meet rigorous What Works Clearinghouse (WWC) standards for scientifically based research. Although evaluation teams wrote detailed plans well in advance of award, virtually all experienced challenges, including:
• Site recruitment, particularly for random assignment studies
• Identifying appropriate comparison groups
• Distinguishing among the units of assignment, intervention, and analysis
• Obtaining access to existing data sources and reducing data collection burden
• Identifying appropriate measures

This presentation will describe these challenges and explain how evaluation teams addressed them for different programs and evaluation designs. The project director and analytic TA lead will also share lessons learned about how to support teams who must implement rigorous studies amidst stringent timelines and competing priorities.


Audience Level: All Audiences

Session Abstract: 

Evaluation designs must respond to changing conditions, policy directions, and emergent information needs while maintaining their relevance, integrity, and rigor. This panel will present examples of four very different evaluations of education and human services interventions, two in the United States and two in international settings (India and Senegal), in which IMPAQ International researchers successfully adapted the evaluation design to fit changed conditions. Each presentation will tell the story of one case, addressing such questions as: What happened to require or inspire a change in the design? What considerations and alternatives were weighed? What lessons have been learned about the benefits and pitfalls of these adaptations and the need for nimble future designs? The moderator will offer a general framework for the case examples, and the discussant will draw out common points across the cases as well as implications for other evaluations. Audience participation will be welcomed.