Evaluation 2017: From Learning to Action


Collecting Program Information from the Ground Up: Strategies for Building Organizational Evidence

Session Number: 1872
Track: Organizational Learning & Evaluation Capacity Building
Session Type: Panel
Session Chair: Astrid Hendricks [Fellow - ICF]
Discussant: Mary Hyde [Corporation for National and Community Service]
Presenter 1: Astrid Hendricks [Fellow - ICF]
Presenter 2: Xiaodong Zhang [Fellow - ICF]
Presentation 1 Additional Author: Miriam Jacobson [Senior Technical Specialist - ICF]
Presentation 1 Additional Author: Craig Kinnear [Program & Budget Analyst - Corporation for National & Community Service]
Presentation 1 Additional Author: Anthony Nerino
Presentation 2 Additional Author: Michael Long [Principal - ICF International]
Presentation 2 Additional Author: Andrew MacDonald [Associate - ICF International]
Presentation 2 Additional Author: Lily Zandniapour [Research and Evaluation Manager - Corporation for National and Community Service]
Time: Nov 09, 2017 (04:30 PM - 05:15 PM)
Room: Madison B

Abstract 1 Title: Measuring Program Implementation and Impact Using Grantee Performance
Presentation Abstract 1:

VISTA is a program that focuses on building the organizational, administrative, and financial capacity of nonprofit and public agencies addressing the toughest issues facing low-income communities in education, health, and economic development. Like many funders, CNCS monitors the work of its grantees using progress-tracking tools such as performance and activity reports. A common dilemma is whether these tracking tools measure grantees’ fidelity to their program, and to what extent they capture how grantees achieve impact. In this presentation, we describe a collaborative evaluation process for measuring how the grantee strategies and outcomes described in VISTA’s management and tracking processes align, in order to determine the adequacy of the information being collected and the data’s potential to inform the program, measure impact, and inform policy. In addition, we present recommendations for developing monitoring systems that help funders learn from grantees while tracking their accomplishments.

Abstract 2 Title: Building Evidence on Service Solutions from the Ground Up
Presentation Abstract 2:

In this presentation, we describe experiences from CNCS’ efforts to build evidence around the AmeriCorps model. The presentation summarizes findings from two sources: a review of grantee applications, evaluation plans, and reports, which assessed evidence and evaluation practices across focus areas and target groups as well as common evaluation approaches and challenges; and focus groups and interviews with grantees and other stakeholders about their needs and perspectives on evidence and evaluation. It also outlines recommended strategies for synthesizing and growing the evidence base for AmeriCorps, including: 1) Collecting, analyzing, and sharing examples, grantee- and field-level approaches, and evidence; 2) Enhancing grantees’ capacity for building evidence; 3) Using a bundling approach to evaluate multiple grantees with similar interventions; 4) Helping CNCS build buy-in by improving messaging around evaluation and an evidence-building framework; and 5) Examining grantees’ theories of change to inform the development of focus-area or program-level theories of change.

Theme: Learning About Evaluation Use and Users
Audience Level: All Audiences

Session Abstract (150 words): 

One of the ongoing challenges grant-makers in government agencies and philanthropies face is how to build evidence from evaluations with grassroots service providers, which often lack the capacity to systematically capture, analyze, and use the data they compile. This panel will suggest ways funders can grow, gather, and use evidence about the organizational capacity and community impact of their grantees. These suggestions are rooted in two projects conducted for the Corporation for National and Community Service (CNCS), which funds thousands of nonprofits working in health, education, the environment, economic opportunity, and tribal communities. In these projects, the panelists gathered data about organizations, including their evaluation capacity and practices, programmatic context, evidence about their programs, and their use of that evidence. Informed by these data, the panelists developed strategies for facilitating learning and developing knowledge about programs. This panel will share findings about organizations’ use of evidence and practical strategies for enhancing organizational learning.
