Evaluation 2016: Evaluation + Design


Collaborative Evaluations: Successes, Challenges, and Lessons Learned

Session Number: CPEE2
Track: Collaborative, Participatory & Empowerment Evaluation
Session Type: TIG Multipaper
Session Chair: Connie Walker-Egea [Senior Social and Behavioral Researcher - University of South Florida]
Presenter 1: Catherine Dunn Kostilnik, PhD [President - CCS, Inc.]
Presenter 2: Tiffany L. Young [Postdoctoral Research Associate - University of North Carolina at Chapel Hill]
Presenter 3: Paul St Roseman [Principal and Evaluation Researcher - DataUse Consulting Group]
Presenter 4: Monica Hargraves [Associate Director for Evaluation Partnerships - Cornell University]
Presenter 5: Jeffrey J. Milroy [Associate Director - Institute to Promote Athlete Health & Wellness]
Presentation 2 Additional Author: Tiffany L. Young [Postdoctoral Research Associate - University of North Carolina at Chapel Hill]
Presentation 2 Additional Author: Gaurav Dave, MD, DrPH, MPH [Assistant Professor - UNC-Chapel Hill]
Presentation 2 Additional Author: Zoe Enga [Evaluation and Data Coordinator - NC TraCS Institute]
Presentation 3 Additional Author: Paul St Roseman [Principal and Evaluation Researcher - DataUse Consulting Group]
Presentation 3 Additional Author: Erika Van Buren [Vice President of Evaluation and Learning - First Place for Youth]
Presentation 4 Additional Author: Monica Hargraves [Associate Director for Evaluation Partnerships - Cornell University]
Presentation 4 Additional Author: Cecilia Denning [Action Resources International]
Presentation 5 Additional Author: Jeffrey J. Milroy [Associate Director - Institute to Promote Athlete Health & Wellness]
Presentation 5 Additional Author: Muhsin Michael Orsini [Evaluation Scientist]
Presentation 5 Additional Author: David L. Wyrick [Associate Professor - University of North Carolina at Greensboro]
Time: Oct 26, 2016 (04:30 PM - 06:00 PM)
Room: International North C

Presentation Abstract 1:


This qualitative study examined the roles that collaborative organizations play in planning and measuring community change. The study supported the literature on the necessary and specific steps in which collaborative members engage to build the internal capacity to measure community outcomes. Community collaborative members determined how to operate across various organizational structures, how to manage a collaborative entity through membership and leadership, and how to measure the implementation of collaborative members’ action plans. The study examined the processes collaboratives used in the context of a three-part conceptual framework: theory of change, systems change, and utilization-focused evaluation. By utilizing their formal and informal organizational structures, collaborative members build the capacity to measure outcomes and use the findings for future planning.


Abstract 2 Title: Collaborative Stakeholder Engagement in Developing Health Promotion Strategies: The Hypertension Evidence Academy Action Learning Cohort Series
Presentation Abstract 2:

The disproportionate burden of hypertension (HTN) in eastern North Carolina is complex. Addressing HTN disparities requires rapid uptake of the latest innovations and identification of feasible solutions. Conferences that bring together evaluators, researchers, healthcare professionals, and community partners can encourage uptake of new innovations; however, there is limited knowledge about how to sustain participant engagement post-conference to identify new solutions. Following a one-day regional HTN conference, we used an action learning approach to engage stakeholders in eastern North Carolina in using information gained at the conference to create novel HTN solutions. This presentation will illustrate our collaborative solution-creation process, in which diverse stakeholders developed a toolkit focused on improving healthcare provider empathy toward HTN patients. The co-learning series involved a cycle of systems learning, strategic planning, and process evaluation activities. We will discuss stakeholders’ experiences and provide lessons learned and recommendations for program designs that incorporate action learning and stakeholder engagement.



Abstract 3 Title: The Alignment of Evaluation Capacity Building, Adult Learning and Leadership Development within a Conceptual Framework
Presentation Abstract 3:

This paper presents a conceptual framework that maps instances of adult learning and leadership development occurring throughout the formative, implementation, and use phases of a collaborative evaluation capacity building (ECB) process. The conceptual framework grounds capacity development outcomes in evaluation, learning, organization, and leadership theories. By grounding ECB outcomes in cross-discipline theories, it is possible to identify conceptually rich development pathways useful for: (a) diagnosing evaluation capacity, (b) monitoring capacity development during an evaluation’s implementation phase, and (c) considering the impact of ECB efforts on individual and organizational development. Collectively, the outcomes presented in this framework measure: (1) development (or changes) in the behavior and values of collaborative stakeholders toward evaluation; (2) how collaborative stakeholders negotiate their skill and expertise to determine what needs to be done within an evaluation study; and (3) how voice, agency, and the adaptive use of evaluation data, reports, and other deliverables develop throughout an evaluation study.

Abstract 4 Title: Visualizing Expertise, Complexity, and Design – Collaborative Pathway Modeling with Community Organizers
Presentation Abstract 4:

Community leaders have distinct sources of expertise and first-person knowledge that are essential for developing innovative, relevant, and effective approaches to community problems. Unfortunately, that distinct expertise and the complex solutions it produces can be difficult to communicate. The challenge for outside stakeholders and for the community experts is how to come to understand each other well enough to discuss program design and evaluation meaningfully. The method described here produces visual models that bring forward practitioner theories of change and program designs. The process focuses on surfacing practitioner expertise, honoring the complexity of the work practitioners are doing, making it comprehensible and accessible to outsiders, and articulating outcomes and theories of change. We present pathway models from two transformative social change programs: Dig Deep Farms in California, and Feeding Laramie Valley in Wyoming. Both have been part of the USDA-funded “Food Dignity” action research project on sustainable community food systems (http://fooddignity.org/).


Abstract 5 Title: A Collaborative Approach to Evaluating and Optimizing Behavioral Interventions Using the Multiphase Optimization Strategy (MOST)
Presentation Abstract 5:

MOST is a framework for developing and evaluating behavioral interventions that is inspired by engineering research and product development. This session will describe a collaborative approach to evaluating and optimizing behavioral interventions using MOST, in which intervention scientists collaborate with program evaluators to optimize an online program. This process was guided by the principles and methods of collaborative evaluation: the authors engaged stakeholders in the planning and implementation stages, were responsive to program needs, and ensured that decision-making responsibilities were dispersed across all members of the project team. Moreover, this session will describe how purposefully using a collaborative approach to evaluation enhances the utility of MOST. For example, a collaborative approach during the preparation phase of MOST empowers both programmers and evaluators to be actively engaged during future phases and encourages the inclusion of multiple stakeholders in decision making.


Audience Level: Beginner, Intermediate, Advanced, All Audiences

Session Abstract: 

Collaborative Evaluations: Successes, Challenges, and Lessons Learned
