New directions in using experiments for international evaluations
Session Number: 3034
Track: International and Cross Cultural Evaluation
Session Type: Panel
Tags: International development
Session Chair: Tulika Narayan, Development Economist [Principal Associate]
Discussant: Arif Mamun [Associate Director of International Research - Mathematica Policy Research]
Presenter 1: Kate Hausdorff
Presenter 2: Abigail Conrad [Associate - Abt Associates]
Presenter 3: Austin Davis [Assistant Professor of Economics; Postdoctoral Associate - American University; Yale University]
Presentation 1 Additional Author: Judy Geyer
Presentation 1 Additional Author: Stephen H. Bell [Senior Fellow & Principal Scientist - Abt Associates Inc.]
Presentation 1 Additional Author: Tulika Narayan, Development Economist [Principal Associate]
Presentation 2 Additional Author: Tulika Narayan, Development Economist [Principal Associate]
Presentation 2 Additional Author: Melissa Marie Chiappetta [Director of the Center for International Evaluation - Abt Associates]
Presentation 2 Additional Author: Austin Nichols [Principal Associate - Abt Associates]
Presentation 3 Additional Author: Mushfiq Ahmed Mobarak
Time: Nov 10, 2017 (08:00 AM - 09:30 AM)
Room: Thurgood Marshall North
Abstract 1 Title: Challenges and opportunities in evaluating pull mechanisms
Presentation Abstract 1:
Donors are increasingly turning toward “payment by results” programs, which fund new initiatives by rewarding results through prizes. We describe an evaluation framework for these programs. The framework draws on our experience leading random assignment and quasi-experimental evaluations of AgResults, a $110 million multilateral initiative funded by Australia, Canada, the United Kingdom, the United States, and the Bill & Melinda Gates Foundation to test the use of “pull mechanisms” to incentivize private companies to develop and disseminate high-impact agricultural innovations that promote food security and benefit smallholder farmers. Based on interim process and impact analysis findings across the six distinct ongoing AgResults pilot projects, we offer early insights on the challenges of, and ideal approaches to, ensuring the success of randomized controlled trials in the context of pay-for-success programs.
Abstract 2 Title: Rapid Feedback Experiments to Inform Implementation
Presentation Abstract 2:
A growing critique of evaluations within the donor community is that findings arrive only after a program has ended, leaving no opportunity to improve the program being evaluated. Programs are increasingly eager to learn, within program cycles, how they can improve their effectiveness. This requires rapid evaluation techniques. In this paper we show how to apply proven evaluation methods to test the effectiveness of specific components of an activity against alternative intervention options. The approach rigorously tests two or more alternative intervention options for their success in achieving short-term outcomes. Testing occurs in rapid cycles, allowing timely feedback and course adjustment earlier than standard methods permit. Rapid feedback evaluations may draw on advanced statistical techniques to improve statistical power and reduce sample size requirements, as well as tools for rapid data collection. We present this paper in the context of our rapid experiments to identify the best-performing behavior change strategies for reducing the separation of children from their parents in poor communities in Cambodia. The data for these experiments come from online surveys conducted via Google, Facebook, and travel sites.
Abstract 3 Title: Can Experiments Inform Scale-up?
Presentation Abstract 3:
Results from a randomized controlled trial in Bangladesh suggest that a small monetary nudge – the fare for a round-trip ticket to town – has a lasting impact in encouraging family members to migrate to cities during the hungry season. The workers earned about $110 on average each day, which they spent on an extra meal. This paper discusses the challenges and opportunities in scaling up this simple idea in other countries. It describes how the authors adapted the program to other country contexts, with broader lessons for informing the scale-up of other successful programs. While the internal validity of experimental designs is very strong, external validity depends on many factors. The paper presents several approaches to addressing external validity, and the adaptations needed to inform global scale-up, drawing on the authors’ experience scaling up this idea in eastern India, Indonesia, Malawi, and Zambia.
Theme: Learning What Works and Why
Audience Level: All Audiences
Session Abstract (150 words):
As international donors continue to fund programs to improve livelihoods for the world’s poorest populations, the importance of rigorous evaluation has been increasingly recognized within the donor community. Informed by these evaluations, programs are implementing new ideas, such as using behavioral nudges to increase technology adoption. At the same time, programs are also innovating organically and seeking data-driven approaches to sharpen impact. This session will explore three new areas in international evaluation that advance this objective through an interactive panel of researchers leading work in these areas: a) challenges and opportunities in rigorously evaluating a new approach to implementing international development programs – prizes that reward only winners; b) the role of rapid-cycle experiments using program monitoring data to sharpen program implementation for greater impact; and c) challenges and opportunities in using the results of experiments to inform scale-up in different geographies.