Session Number: 2345
Track: Use and Influence of Evaluation
Session Type: Panel
Session Chair: Eric Lundgren [Director of International Operations - IMPAQ International]
Discussant: Virginia Lamprecht [Evaluation Methodologist - USAID]
Presenter 1: Eric Lundgren [Director of International Operations - IMPAQ International]
Presenter 2: Laura Arntson [Oregon State University]
Presenter 3: Michelle Christine Adams-Matson [Technical Director/Practice Area Leader - Management Systems International]
Presenter 4: Ami P Henson [QED Group]
Presentation 2 Additional Author: Nancy Peek [Monitoring and Evaluation Specialist - Social Solutions International]
Presentation 2 Additional Author: Tonya Caprarola Giannoni [Chief of Party - Social Solutions International]
Time: Nov 08, 2017 (06:15 PM - 07:15 PM)
Room: Thurgood Marshall South
Abstract 1 Title: Utilization of Monitoring and Evaluation activities in USAID Missions
Presentation Abstract 1:
Social Impact has used its eight MEL Platform contracts around the world to begin systematically tracking use of the different M&E deliverables it provides to USAID Missions. This information is captured through both regular meetings between our teams and the Missions and formal surveys that we send to our Mission clients on a regular basis. To understand how best to support improvements in a Mission's learning systems, it is important to first understand how the Mission is using the information we currently provide. Social Impact will therefore present the findings of our utilization surveys and explain how that information has informed the way we support USAID Missions in improving their learning approach.
Abstract 2 Title: Learning what works: Using MEL Platforms for Evaluation Capacity, Quality, and Learning
Presentation Abstract 2:
The 2016 USAID MEL Platforms Assessment explored how these MEL support mechanisms are used by USAID missions to carry out MEL functions. This presentation summarizes the Platforms Assessment and the ways these types of mechanisms strengthen learning and enhance evaluation practices. The decentralized and competitive sourcing processes for acquiring MEL support services contribute to variation in Platform design and outcomes across the Agency. At the time of the 2016 assessment, 55 field-based Platform mechanisms were identified, and a subset of 30 Platforms from 22 USAID missions underwent in-depth data collection and analysis. Qualitative data were collected through 107 key informant and small group interviews, using a semi-structured question guide. The assessment identifies challenges of, and promising practices for, using MEL Platforms to extend USAID capacity to conduct quality evaluations that contribute to learning and decision making.
Abstract 3 Title: Integrating CLA into M&E Services
Presentation Abstract 3:
USAID’s most recent ADS 201 places more explicit emphasis on integrating collaboration, learning, and adapting (CLA) in USAID’s Program Cycle. Drawing on lessons learned from MSI’s M&E support platforms, we have identified a range of approaches for integrating CLA into the services provided, as well as the factors that are necessary for success.
Abstract 4 Title: MEL Platforms Primarily Focused on Learning
Presentation Abstract 4:
Across the spectrum of MEL Platform contracts, the focus can vary significantly, from traditional M&E activities at one end to learning activities at the other. In general, we are seeing Missions interpret and implement their learning requirements (particularly the collaborating, learning, and adapting (CLA) approach) in different ways. Taking Uganda as a case study, QED will discuss one project focused on learning and CLA: how it differs from traditional M&E platforms, the factors contributing to its success, and the progression of projects that use CLA language.
Theme: Learning About Evaluation Use and Users
Audience Level: All Audiences
Session Abstract (150 words):
The introduction of the USAID Evaluation Policy in January 2011, along with related changes to the USAID Automated Directives System on strategy, project design, and monitoring, increased mission workload and the need for technical expertise in monitoring, evaluation, and organizational learning. In response, USAID missions are increasingly contracting out monitoring, evaluation, and learning (MEL) services through institutional support contracts (referred to as MEL Platforms) in order to extend the reach and technical capacity of USAID missions and improve the quality and use of evaluations. In this panel, leaders from organizations with multiple MEL Platform contracts will discuss the challenges and lessons learned from implementing these contracts, with particular focus on USAID's approach to learning from monitoring and evaluation work. We will also present findings from a meta-evaluation of the MEL Platform contracts identifying promising practices for using platforms to conduct evaluations that contribute to learning.