Evaluation 2017: From Learning to Action


Strategies and Tools for Multi-Stakeholder Systems Change Initiatives

Session Number: UIE2
Track: Use and Influence of Evaluation
Session Type: TIG Multipaper
Session Chair: Jessica Shaw [Assistant Professor - Boston College]
Presenter 1: Lauren Nichol Gase [Senior Researcher - Spark Policy Institute]
Presenter 2: Jacqueline Pei [Associate Professor - University of Alberta]
Presenter 3: Susan H Chibnall [Caliber Associates]
Presenter 4: Barbara Szijarto [PhD Candidate - University of Ottawa]
Presenter 5: Mari Kemis [Director, Research Institute for Studies in Education - Iowa State University]
Presentation 1 Additional Author: Taylor S Schooley [Research Analyst - Los Angeles County Department of Public Health]
Presentation 1 Additional Author: Moira Inkelas [Associate Professor - UCLA Fielding School of Public Health]
Presentation 1 Additional Author: Eraka Bath [Associate Professor - UCLA]
Presentation 2 Additional Author: Cheryl Poth [Associate Professor - University of Alberta]
Presentation 2 Additional Author: Melissa Tremblay [Doctoral Student - University of Alberta]
Presentation 2 Additional Author: Btissam El Hassar [Doctoral Candidate in Measurement, Evaluation and Cognition - Centre for Research in Applied Measurement, University of Alberta]
Presentation 3 Additional Author: Michael Steketee [Westat]
Presentation 3 Additional Author: Jaymie Lorthridge [Senior Study Director - Westat]
Presentation 5 Additional Author: Brandi N Geisinger [Research and Evaluation Scientist - Iowa State University]
Presentation 5 Additional Author: Elena Yu Polush [Evaluator - Des Moines Public Schools]
Time: Nov 10, 2017 (06:30 PM - 07:15 PM)
Room: Roosevelt 4

Abstract 1 Title: Designing Evaluation to Improve Complex Systems: A Case Study from the Los Angeles Juvenile Court
Presentation Abstract 1: As many as 70% of youth who come into contact with the juvenile justice system have a diagnosable mental health concern; however, many enter a system that is ill-equipped to assist them. In 2016, the Los Angeles County (LAC) Department of Public Health partnered with the LAC Juvenile Court to assess the processes, programs, and other services being implemented to address the mental health needs of justice-involved youth, with the aim of identifying strategies to improve system functioning. Methods included (1) semi-structured interviews with organizational leaders to identify system processes, strengths, and challenges; (2) surveys of ground-level staff (probation officers, attorneys) to understand variation in practice and gaps in knowledge; and (3) abstraction of administrative data to understand service delivery and youth outcomes. This presentation will describe the participatory evaluation process, with a focus on strategies and tools for improving multi-stakeholder initiatives and systems.
Abstract 2 Title: Enhancing Use During a Systems Evaluation Through a Stakeholder-Centred Tool Development Process
Presentation Abstract 2: This presentation describes a stakeholder-centred tool development process intended to enhance use during an evaluation that adopts a systems perspective. The process emerged as a collaborative response to stakeholders’ need to improve services and interventions for Canadian individuals affected by Fetal Alcohol Spectrum Disorder (FASD) and to the evaluation team’s desire to strengthen validity evidence. The great variability in the roles of service providers and agencies across sectors (e.g., housing, social services, health), attributable to differences among the individuals and families affected by FASD, provides a unique context in which to pursue this research from a systems perspective. The dynamic features inherent to complex evaluations, along with the lack of empirically based effective FASD practices, highlight the need for context-relevant tools developed with stakeholder involvement. To that end, we describe the dynamic features and system components, the stakeholder-centred tool development process, and the anticipated validity and use outcomes for six evaluation instruments.
Presentation 2 Additional Author: Sabine Ricioppo [Doctoral Student in Measurement, Evaluation and Cognition - Centre for Research in Applied Measurement, University of Alberta]
Abstract 3 Title: Using Social Network Analysis to Engage Stakeholders and Learn About Systems Change
Presentation Abstract 3: Creating and strengthening cross-sector coordination is one of the systems change goals of the Conrad N. Hilton Foundation’s Foster Youth Strategic Initiative (FYSI). After a review of possible measures, the evaluation team selected social network analysis (SNA) to assess this goal because it allowed for a complete analysis of the FYSI network. Through quantitative and visual descriptions of coordination, using network metrics and graphs illustrating change over time, the evaluation team documented the collaboration of cross-sector grantees. The SNA also increased grantees’ interest in and contribution to the evaluation. Grantees completed two SNA surveys, and their feedback during the survey process led to improvements in data collection. Results from the second survey prompted an informative discussion about how the FYSI network had grown, how coordination had increased across regions, and what those changes meant for cross-sector work. Findings informed grantee and funder decision-making.
Abstract 4 Title: Making Space for Developmental Evaluation: Strategies from the Field
Presentation Abstract 4: Developmental evaluation specializes in a highly challenging niche – providing systematic evaluative input to social interventions, including social innovations, that are developing or adapting under complex conditions (Patton, 2011). Behind these initiatives are people collaborating across sectors and social groups who bring diverse needs and cultures of inquiry. How do developmental evaluators (DEs) nurture, navigate, and sustain an evaluation under these conditions? What does it take to foster adaptive learning? This presentation will report on a mixed methods study of DE that integrates multi-case analysis with concept mapping (e.g., Trochim, 2017) to explore the role of DEs in learning ‘systems’. Strategies from the field will be linked to practice challenges, such as making space for systematic inquiry while an initiative develops, cultivating a ‘common grammar’ for diverse stakeholders, and anchoring evaluative thinking in an initiative. The audience will be encouraged to discuss the broader implications of the DE experience for evaluation use.
Abstract 5 Title: Moving from an Activities-Based Evaluation to Topic-Focused Evaluation to Evaluation Research: A Case of Evaluating a Multiyear NSF Engineering Research Center
Presentation Abstract 5: Evaluation is human inquiry (Reason, 1994). The nature of its underlying process is participative – doing evaluation with rather than on people. As a human endeavor, the conduct of evaluation is reflexive, with recurring fundamental issues periodically resurfacing in new forms and demanding attention as the circumstances of our work and the challenges we face change (Smith, 2008, p. 3). Addressing those concerns and sharing experiences cumulatively facilitates the profession’s growth. In this paper we reflect on how evaluating a multiyear program has shaped our thinking and informed our choices about how to conduct the evaluation. We focus on how an evaluation ‘ages’ through its life cycle, progressing from formative and idea-building stages to summative strategies. Within the presented case study, we examine conceptions of evaluation rigor, credible evidence, and influence. This paper contributes to ongoing conversations about fundamental issues in evaluation pertaining to method and practice that shape the character of evaluation.
Audience Level: None

Session Abstract (150 words): Strategies and Tools for Multi-Stakeholder Systems Change Initiatives

