Challenges to adopting innovations in Monitoring, Evaluation, Research and Learning (and potential solutions!)

Session Number: 1472
Track: International and Cross Cultural Evaluation
Session Type: Panel
Tags: Evaluation Challenges, Innovation
Session Chair: Sophia van der Bijl [Senior Impact Assessment Advisor - USAID Global Development Lab]
Discussant: Shannon Griswold [Sr. Scaling Advisor, Office of Evaluation and Impact Assessment, US Global Development Lab - US Agency for International Development]
Presenter 2: Luke Heinkel [Senior Program Officer - Results for Development]
Presenter 3: Danielle C. de Garcia, MPA, CPT [Program Director, Performance Evaluation, Innovation, and Learning - Social Impact, Inc.]
Time: Nov 11, 2017 (11:15 AM - 12:00 PM)
Room: PARK TWR STE 8206

Abstract 1 Title: Strategic Program for Analyzing Complexity and Evaluating Systems (SPACES): Challenge of explaining complex tools to generalists
Presentation Abstract 1:

The Strategic Program for Analyzing Complexity and Evaluating Systems (SPACES) consortium tackles evaluations from a systems approach. One common question when introducing our methods is "what is a systems approach?" Another challenge our consortium faces is demonstrating the linkage between our systems-based approach and locally-led approaches. A further common misconception is that our tools, which involve modeling, analyzing innovation potential, tracking indicators, and analyzing social networks, are best used only at the design phase.

In this panel we will discuss how we have demonstrated the value and relevance of a systems approach across a variety of programs and at all stages of dynamic monitoring, evaluation, and learning.

Abstract 2 Title: Rapid Feedback: Challenge of estimating cost and time requirements for unpredictable work
Presentation Abstract 2:

When introducing our innovative approach to evaluation, we are often asked how much it is going to cost and how long it will take to implement. Our Rapid Feedback method is an adaptive one, making it difficult to accurately answer these questions at the outset of an engagement. The evaluation design is created as a result of formative research and early testing—ideally before the formal engagement with a pilot program begins. Nevertheless, we need to be able to answer these very reasonable questions.

For this panel, we will discuss the challenge of communicating cost and time implications to technical and non-technical partners who are accustomed to traditional M&E methods. Our solution to this challenge has been to develop a simple budget and timeline tool that outlines high- and low-touch versions of our methods. We will use our project in Cambodia as an example of how this tool can be used.

Abstract 3 Title: Developmental Evaluation Pilot Activity (DEPA): Challenge of resistance to change
Presentation Abstract 3:

A lack of understanding of new approaches and what they entail is a key challenge for any innovative approach, and it is paramount within developmental evaluation (DE). In implementing DEs, additional challenges include finding partners willing to be the first to try something new, identifying evaluators with this unique skillset, and budgeting for unknowns while remaining adaptive and flexible. One additional challenge for DEPA is being an outsider/evaluator and an insider/implementing team member simultaneously - garnering trust and access while maintaining objectivity.

We will discuss a variety of solutions we have implemented to overcome some of these challenges, including the identification of similar cases and the development of illustrative budgets, templates, deliverables, and options memos. We will also discuss approaches we have taken to identify skilled evaluators, garner buy-in, and set common expectations.

Abstract 4 Title: Expanding the Reach of Impact Evaluation (ERIE): Challenge of communicating the value and use of longitudinal studies
Presentation Abstract 4:

Expanding the Reach of Impact Evaluation (ERIE) is an approach to conducting retrospective long-term impact evaluations of completed interventions. These evaluations leverage and build on existing program data either to assess whether the observed short-term impacts are sustained, or to investigate results that might only be expected to emerge over a long-term horizon. We use innovative data collection strategies and methods to identify appropriate counterfactuals, and generate lessons on how to plan for and conduct these long-term impact evaluations.

The ERIE consortium anticipates challenges in getting program staff to buy into evaluating a past program. If staff have moved on after the completion of program activities, it will be difficult to convince them that looking back at past projects has value and can inform both the strategy and programming of current activities.

Audience Level: All Audiences

Session Abstract (150 words): 

This session will include presentations on the challenges and solutions from four teams involved in Monitoring, Evaluation, Research and Learning Innovations (MERLIN) at USAID. The teams have all been trying to encourage offices within USAID to pilot alternatives to the traditional approaches currently in use. Things have not gone as smoothly as expected in terms of gaining support and buy-in. Each team has developed solutions to address these challenges, which they will share.