Evaluation 2017: From Learning to Action

Mixed Methods Evaluation and Social Issues

Session Number: MME2
Track: Mixed Methods Evaluation
Session Type: TIG Multipaper
Session Chair: Annette Fay [M&E Specialist - Water CKM Project, Social Impact]
Presenter 1: Jeanette Treiber [Program Manager - University of California Davis]
Presenter 2: Esther Spindler [Research Officer - Institute for Reproductive Health (IRH), Georgetown University]
Presenter 3: Mihaiela R Gugiu [Psychometrician - National Registry of Emergency Medical Technicians]
Presenter 4: Rajiv N Rimal [Professor - George Washington University]
Presentation 1 Additional Author: Robin Kipke [Evaluation Associate - Univ. of California, Davis]
Presentation 1 Additional Author: Jorge T Andrews [Evaluation Associate - UC Davis]
Presentation 1 Additional Author: Sue Haun, M.A. [Evaluation Consultant - Strategies By Design]
Presentation 4 Additional Author: Sabrina McCormick [Associate Professor - George Washington University]
Presentation 4 Additional Author: Ashley Bieniek-Tobasco [The George Washington University Milken Institute School of Public Health]
Presentation 4 Additional Author: Hina Shaikh [Research Director]
Time: Nov 09, 2017 (03:15 PM - 04:15 PM)
Room: Virginia C

Abstract 1 Title: Teaching Mixed Methods for Advocacy Evaluation
Presentation Abstract 1: The California Department of Public Health funds local organizations and county health departments to advocate for and implement tobacco control efforts. This work requires process and outcome evaluation using a mixed methods (MM) approach. The UC Davis Tobacco Control Evaluation Center (TCEC) functions as a training and technical assistance hub, helping these organizations plan, carry out, and report on their evaluations. Drawing on more than 10 years of experience, TCEC presents challenges and solutions in MM advocacy evaluation and its teaching, along with the tools used to support local programs. The greatest challenges our audience faces are identifying appropriate evaluation activities and making them work together for advocacy, determining which activities best measure outcomes versus process, and writing a concise and compelling report based on implementation and MM results. In response, TCEC has developed road maps and a training module for reporting.
Presentation 1 Other Authors: Sarah Hellesen (sehellesen@ucdavis.edu); Diana Cassady (dlcassady@ucdavis.edu)
Abstract 2 Title: Scaling up REAL Fathers: Lessons from Integrating and Evaluating an Effective Family Violence Prevention Intervention within Education and Livelihood Programs in Uganda
Presentation Abstract 2: The REAL Fathers Initiative was pilot tested between 2013 and 2015 as a father-centered mentoring program to reduce intimate partner violence (IPV) and harsh punishment of children in post-conflict Northern Uganda. The evaluation results showed that young fathers exposed to the intervention had significantly greater odds of reducing self-reported IPV at endline compared to those unexposed. Given the positive short- and long-term effects, REAL Fathers is now being scaled up and adapted into two ongoing programs: Youth Initiative for Economic and Sustained Livelihoods for Development (YIELD) in Northern Uganda and Early Child Care and Development (ECCD) centers in Karamoja. A quasi-experimental, mixed-methods evaluation is being implemented to assess the scale-up adaptation and the effectiveness of integrating the mentoring project into the YIELD and ECCD programs in reducing IPV and harsh physical child punishment and increasing use of modern family planning (FP).
Presentation 2 Other Authors: Kim Ashburn (IRH, Georgetown University), Deb Almond (Save the Children US), Dickens Ojamuge (Save the Children International), Mariana Natyang (Save the Children International), Pauline Kabagenyi (Save the Children International), Lizzy Menstell (IRH), Rebecka Lungdren (IRH)
Abstract 3 Title: Utilizing a Mixed Methods Design to Evaluate an Early Head Start-Child Care Partnership
Presentation Abstract 3: In an effort to better address the needs of low-income families with young children (0-3 years), in 2013 the Federal Government committed $500 million for the development of Early Head Start-Child Care (EHS-CC) Partnerships across the nation, with the goal of expanding high-quality early learning and development opportunities for infants and toddlers. The EHS-CC Partnership evaluated here worked with 20 child care providers—12 centers and 8 family child care providers—and 10 community agencies to provide a comprehensive set of services to children and their families (e.g., health screenings and immunization, housing, parenting). This study used an adapted explanatory design to evaluate the first year of implementation of the Partnership, drawing on in-person interviews, focus groups, and online surveys. This paper will present the method developed for integrating the qualitative and quantitative findings, along with challenges encountered in the evaluation study and practical solutions.
Presentation 3 Other Authors: Jane Wiechel, Ph.D., Director for Community Programs & Engagement, The Ohio State University (wiechel.5@osu.edu); Sherrie Sutton, M.S.Ed, LPCC, Service Integration Coordinator, The Ohio State University (cunningham.824@osu.edu)
Abstract 4 Title: Assessing the Effects of a Mass Media Program on Climate Change: A Mixed Method Approach Using Qualitative Interviews, Panel Cohort, and Random Assignment
Presentation Abstract 4: Causal attribution of outcomes to mass media programs remains a primary challenge in evaluation research. We present findings from a multi- and mixed-method evaluation design that assesses the effects of the National Geographic program Years of Living Dangerously, broadcast on cable over eight weeks. To model the process of change, a nationally representative panel (N=165) watched all weekly episodes and answered questions each week. Another national sample was randomly assigned to watch either an episode of the program or a control program (N=313, recruitment ongoing). A third component randomly assigned participants to watch the program or a control video in a laboratory setting (N=371). A qualitative study (N=74) conducted in five US regions provided rich context for the underlying mechanism linking exposure to the program with key outcomes. This panel presents results from all four components to both document outcomes and describe the mechanism of change.
Presentation 4 Other Authors: Madelyn Shafer; Amandeep Kaur; Laura Wagstaff; Tiffany Daniels
Audience Level: None

Session Abstract (150 words):  Mixed Methods Evaluation and Social Issues
