Evaluation 2017: From Learning to Action

Using the Interactive Systems Framework to Guide Evaluation in Community Capacity-Building Interventions

Session Number: 2261
Track: Community Psychology
Session Type: Multipaper
Tags: Abraham Wandersman, community-based organization, community psychology
Session Chair: Abraham Wandersman [Professor - University of South Carolina]
Discussant: Leonard Bickman [Vanderbilt University]
Presenter 1: Kassandra Ann Alia [Graduate Student - University of South Carolina]
Presenter 2: Brittany S Cook [Doctoral Student - University of South Carolina]
Presenter 3: Michelle Abraczinskas [Graduate Student - University of South Carolina]
Presentation 1 Additional Author: Jonathan P Scaccia [Community Psychologist and Evaluator - Independent]
Presentation 2 Additional Author: Victoria Scott [Assistant Professor - University of North Carolina Charlotte]
Presentation 2 Additional Author: Jonathan P Scaccia [Community Psychologist and Evaluator - Independent]
Presentation 2 Additional Author: Michelle Abraczinskas [Graduate Student - University of South Carolina]
Presentation 3 Additional Author: Amy Reid
Presentation 3 Additional Author: Morgen Palfrey
Presentation 3 Additional Author: Rohit Ramaswamy [University of North Carolina Chapel Hill]
Time: Nov 09, 2017 (08:00 AM - 09:00 AM)
Room: PARK TWR STE 8212

Abstract 1 Title: A Multi-level Evaluation of Technical Assistance: Understanding the Delivery of Coaching for Improvement Science Methods
Presentation Abstract 1:

Technical Assistance (TA) is a strategy for enhancing practitioners' readiness to implement evidence-based interventions. Although TA is widely practiced, it is not well defined or clearly operationalized; more evaluation is therefore needed to inform best practices for delivering TA. In this presentation, we report results from an evaluation of coaching delivery in the SCALE (Spreading Community Accelerators through Learning and Evaluation) initiative. In SCALE, trained improvement coaches provided group and individual support to participating community coalitions in their use of improvement science methods. The evaluation examined the quality of support delivered to communities and the supports the implementation team provided to coaches. Findings indicate that coach support was a critical SCALE component and that coaches served as the conduit for information sharing between the implementation team and communities. However, coaches needed more support in their role, especially quality improvement/quality assurance feedback on the effectiveness of their efforts.


Abstract 2 Title: The Formative Evaluation of Training: Lessons from a Community Health Improvement Initiative
Presentation Abstract 2:

In many initiatives, training is one of the primary methods by which capacity is built to implement an innovation. A well-designed training evaluation can help determine whether a training met its intended goals. This session will describe how we developed a comprehensive, multi-method formative approach to evaluating trainings. We will discuss the different methods we used to assess trainees' experiences, changes in learning, and intent to apply content in their own settings. We will describe our experiences applying and refining this approach across four distinct training administrations within a larger, complex community health improvement initiative. We will share how specific methods and tools spread organically as training participants adopted them in other settings. Finally, we will identify key lessons that can help other stakeholders, with varying levels of capacity, replicate this approach in their own settings.


Abstract 3 Title: Evaluating the Delivery System of a Community Health Improvement Initiative through a Case Study Approach
Presentation Abstract 3:

A component of the Interactive Systems Framework is the delivery system, which can include national, state, and/or local entities (e.g., health and human services organizations, schools, coalitions) that implement an innovation (Wandersman, Chien, & Katz, 2012). This presentation will describe how we used case study methodology to understand a community coalition delivery system. Five SCALE communities were closely monitored to document progress on their health improvement projects. In addition to the data gathered on all coalitions (e.g., surveys), the case study method included structured interviews and observations of meetings and community settings during virtual and in-person site visits. Attendees will learn how we used and refined the case study approach over time as implementation and evaluation needs changed, using virtual visits as one example. We will share lessons learned that can help others interested in using a case study approach to evaluate long-term, complex community health improvement initiatives.


Presentation 3 Additional Author: Victoria Scott, PhD [Assistant Professor - University of North Carolina Charlotte] (vscott10@uncc.edu)

Theme: Learning What Works and Why
Audience Level: All Audiences

Session Abstract (150 words): 

There is a growing focus on improving population health, notably through the Triple Aim and RWJF's Culture of Health. A vast infrastructure of community health coalitions already exists in the U.S., and recent research shows promise for community coalitions as an approach to improving health. However, more evaluation is needed to understand the mechanisms underlying their effects and to identify best practices. In this session, we will describe how the Interactive Systems Framework (ISF) guided data synthesis in a national, multisite community capacity-building initiative. We will present results on community delivery of health improvement projects and on the coaching/technical assistance and training provided to build community capacity for using quality improvement methods. Findings demonstrate how the ISF can be a useful framework for synthesizing data in community capacity-building initiatives. Results generated using the ISF can help distill complex findings into generalizable knowledge for enhancing the effectiveness of community coalition efforts.


