Evaluation 2015: Exemplary Evaluations in a Multicultural World

Developing the implementation science for complex change.

Session Number: 2948
Track: Presidential Strand
Session Type: Panel
Tags: Collective Impact, community change, complexity, Developmental Evaluation, Implementation Evaluation, Place based Initiatives, systems change
Session Chair: David M Chavis [Community Science]
Presenter 1: Lisbeth Schorr [Center for the Study of Social Policy]
Presenter 2: Michael Quinn Patton [Utilization-Focused Evaluation]
Presenter 3: David M Chavis [Community Science]
Time: Nov 13, 2015 (08:00 AM - 09:30 AM)
Room: Plaza A

Abstract 1 Title: On Knowing the Unknown: Opportunities, Challenges, and Consequences of Trying to Learn from “What Works”
Presentation Abstract 1: Today, when so many of the most promising interventions are ever more complex, new forms of learning -- especially from implementation -- are essential. Everyone agrees that public and philanthropic resources should go to “what works” to achieve outcomes. The problem is how best to generate and assemble the knowledge that leads to achieving greater results, reliably and at scale.
Currently, a narrow set of methods is preferred over others. This presentation will explore how newer approaches to generating and applying evidence are producing the richer knowledge we now need about complex interventions: knowledge that avoids skewing action toward what is easy to measure, increases opportunities to achieve impact, and encourages innovation. The presentation will use the work of the Carnegie Foundation for the Advancement of Teaching as an example of using learning networks to engage the people who actually do the work, together with researchers, in developing the science of implementation and strengthening results.

Abstract 2 Title: Exemplary Embedded Evaluation: Integrating planning, design, implementation, evaluation, and learning to support innovation and change in complex dynamic systems
Presentation Abstract 2: Traditional models separate and sequence: first assess needs, then develop a plan, then design the intervention, then implement the intervention, then evaluate effectiveness, and, finally, extract lessons. Such a sequence epitomizes a closed-system project mentality toward program development and evaluation. In complex dynamic systems, situation analysis, design, implementation, evaluation, and learning happen simultaneously, interactively, interdependently, and dynamically. Two examples of how to engage in change this way will be shared: one domestic (USA) and one international. Complexity-based evaluation standards for rigor and relevance provide mutual reinforcement to generate useful and credible evidence that informs ongoing action, adaptation, and learning. This presentation will discuss why to conceptualize change in this way, how to implement it, and what results to expect from such embedded and integrated evaluation.
Abstract 3 Title: Rigor in the evaluation of complex change: Friend or Foe?
Presentation Abstract 3: Strategies that change complex systems such as communities or service systems are becoming increasingly important because of their potential to address core social determinants of well-being. Place-based initiatives (focused on people living within the systems of a geographic area) are among the most complex. Mounting research evidence shows that where people live is one of the strongest predictors of population well-being. Despite this potential impact, little is known about evaluating such initiatives in a way that yields improved effectiveness as well as replicability across different contexts. Traditional experimental designs have been neither useful nor effective in this regard. This presentation addresses some of the issues that must be resolved in order to effectively evaluate and understand the implementation of complex strategies. Cross-case study methodology will be presented as one example of a rigorous yet appropriate approach.


Audience Level: Advanced, All Audiences

Session Abstract: 

While the science of how to implement evidence-based programming for individual-level change has developed rapidly over the last several years, methodologies for developing evidence about community and other systems-change initiatives have lagged behind the growing support for these strategies among public and private funders. The current state of the art is either a process-oriented approach such as developmental evaluation or a patchwork approach that “borrows” methods and instruments from experimental designs and program evaluation. This session calls for a more systematic and appropriate approach to generating knowledge about how to implement these types of strategies, presented by leaders in the field of evaluating complex initiatives. Presenters will discuss what they consider the new standards and methodologies that can provide a rigorous and useful understanding of change in complex situations. Examples of appropriate methodologies will be provided, and facilitated discussion among panelists and the audience will be included.



