Evaluation 2017: From Learning to Action


Juggling the needs of multiple parties and building capacity through engagement with partners and stakeholders

Session Number: HE6
Track: Health Evaluation
Session Type: TIG Multipaper
Session Chair: Jeanette Treiber [Program Manager - University of California, Davis]
Presenter 1: Dianne Rucinski [Evaluation Officer - HHS/OASH/OMH]
Presenter 2: Alison Mendoza-Walters [Principal - Public Health Impact, LLC]
Presenter 3: Lina M. Acosta [Research Associate - Montclair State University]
Presenter 4: Jennifer Ballentine [President - Highland Nonprofit Consulting]
Presentation 2 Additional Author: Jillian Casey [Senior Manager, Prevention - NASTAD]
Presentation 2 Additional Author: Nicholas Parr [Senior Program Analyst, HIV, STI, and Viral Hepatitis - National Association of City and County]
Presentation 2 Additional Author: Diana Karczmarczyk [Public Health Impact, LLC]
Presentation 3 Additional Author: Erin Bunger Johnson [Senior Research Associate - Montclair State University]
Presentation 4 Additional Author: Shelley Francis Travis [Vice President of Training and Programs - Georgia Campaign for Adolescent Power and Potential]
Time: Nov 09, 2017 (08:00 AM - 09:00 AM)
Room: Marriott Balcony A

Abstract 1 Title: Assessing the impact of partnerships on public health programs
Presentation Abstract 1: Partnerships are routinely promoted as a mechanism to improve the development and impact of programs and policies on populations of interest. Despite their ubiquity, there is scant empirical evidence demonstrating that partnerships have their intended effect as catalysts that amplify program impact or achieve desired outcomes more efficiently. Evaluations of partnerships usually focus on partner satisfaction with the process and/or participation, but rarely incorporate measures of the partnership's impact on program outcomes. This paper will present approaches to assessing the impact of partnerships in two health promotion programs. The paper will explicate and operationalize a measure of partnership impact, including a discussion of the assumptions underlying the proposed measurement scheme and challenges to implementation, using applied examples drawn from grant programs funded by the Office of Minority Health: Addressing Childhood Trauma and Re-Entry Community Linkages.
Abstract 2 Title: You can’t always get what you want: meeting stakeholders’ needs in multi-site evaluation
Presentation Abstract 2: Balancing stakeholder needs in evaluation is always a challenge. We will discuss the extent to which we were able to design an evaluation of a rapid syphilis testing pilot project that was relevant to, useful for, and feasible to implement by multiple stakeholders, including a federal funder, a national nonprofit, five local health departments, and individual clients. We will describe the challenges we faced and the solutions we arrived at in determining which data to collect, integrating data collection into existing processes and procedures, and working with sites that use differing databases and forms.
Presentation 2 Other Authors: Regina Ortiz Nieves, MS
Abstract 3 Title: Designing a Shared Measurement System from Infancy to Adolescence
Presentation Abstract 3: The shared measurement system is a key tenet of the collective impact model. Through the use of shared measures, an initiative can ensure the program is moving in the right direction, align data collection with the initiative's efforts, improve data quality, hold partners accountable, and learn from each other's successes and failures (Kania & Kramer, 2013). This paper will discuss the process of developing a low-budget shared measurement system from its infancy through its adolescence. Evaluation capacity building is a growing area of interest among organizations and agencies; it serves to increase the knowledge and skills of organization staff so they can conduct straightforward evaluations, be better consumers of and contractors for evaluation, and potentially plan more measurable programs (e.g., pilot phase). The paper will explain an iterative process that collective impact initiatives might use to reach the pilot phase of a shared measurement project, which includes collaboratively identifying the measures, planning the system, working with a municipal health department to pilot the system, and using GIS mapping. It will also provide an overview of best practices for launching a low-budget shared measurement system. Kania, J., & Kramer, M. (2013, January 21). Embracing emergence: How collective impact addresses complexity. Stanford Social Innovation Review, pp. 1–7.
Abstract 4 Title: Got Sex Ed? Lessons Learned from Working with Champions to Implement and Evaluate the Working to Institutionalize Sex Education (WISE) Method with School-Based Partners
Presentation Abstract 4: Through the Working to Institutionalize Sex Education (WISE) Initiative, the Georgia Campaign for Adolescent Power and Potential (GCAPP) replicates effective teenage pregnancy prevention programs, builds the capacity of youth-serving professionals to develop and implement effective strategies, and advocates for policy and system changes that support adolescent sexual and reproductive health. In this session, GCAPP will discuss their experience identifying, engaging, and working with local champions to implement the WISE method with school-based partners. An analysis of multi-year evaluation data indicates that GCAPP's provision of targeted training and technical assistance has increased the availability of, and strengthened support for, comprehensive sex education throughout Georgia. Participants will gain valuable insight about what works and why, and will discuss effective ways to deliver, monitor, and maintain fidelity to evidence-based models. Participants will also learn how evaluation results are used to inform and improve training, technical assistance, and program and policy implementation.


