Evaluation 2017: From Learning to Action

Creative Methods to Enhance Learning from Youth Program Evaluations

Session Number: YFE1
Track: Youth Focused Evaluation
Session Type: TIG Multipaper
Session Chair: Christina Olenik [Vice President, Technical Services - Making Cents International]
Presenter 1: Christine Paulsen [Concord Evaluation Group]
Presenter 2: Louisa Houston Vann, MPH, CHES [Research Associate - University of South Carolina Center for Child and Family Studies]
Presenter 3: Inem Chahal [Evaluation Associate - Montclair State University]
Presenter 4: Lauren Michelle Berny [Evaluation Associate - Centerstone Research Institute]
Presenter 5: Ashley Elizabeth Jehn [UW-Stout]
Presentation 2 Additional Author: Misaela Alquicira [Research Associate - Center for Child and Family Studies - College of Social Work at the University of South Carolina]
Presentation 3 Additional Author: Eden Kyse [Director, CREEHS - Montclair State University]
Presentation 4 Additional Author: Stephanie Adams [Research Associate - Centerstone Research Institute (CRI)]
Presentation 4 Additional Author: Wendy Shuran [Research Associate II - Centerstone Research Institute]
Presentation 4 Additional Author: Jenna Conner [Research Associate - Centerstone Research Institute]
Presentation 5 Additional Author: Tiffany Delores Smith
Time: Nov 10, 2017 (08:00 AM - 09:30 AM)
Room: Jefferson

Abstract 1 Title: Evaluation of a Multimedia STEM Education Program for Spanish-speaking Families
Presentation Abstract 1:

The children’s public media project PEEP and the Big Wide World is designed to teach STEM to children aged 3 to 5. This independent evaluation studied the impact of PEEP on Spanish-speaking children and their families. It was a national experiment comparing 100 families who used PEEP resources with 100 families who did not. Compared to Spanish-speaking parents who did not use PEEP, Spanish-speaking parents who used PEEP were significantly more: (1) comfortable exploring math and science with their children, (2) knowledgeable about how to help their children learn math, (3) likely to report enjoying math and science exploration with their children, (4) likely to believe it was their job to help their children understand math, and (5) likely to report that their children demonstrated math and science habits of mind. Spanish-speaking children who used PEEP also scored significantly higher on the knowledge measure than children who did not use PEEP.
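
(Illustrative note: the abstract implies a straightforward treatment-versus-comparison analysis. The sketch below, in Python with entirely hypothetical data and variable names, shows how such a two-group comparison might be run; it is not the study's actual analysis.)

    # Minimal sketch of a treatment-vs-comparison analysis with hypothetical data;
    # an independent-samples t-test compares a parent outcome across the two groups.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    peep_group = rng.normal(4.2, 0.6, 100)   # e.g., comfort scores (1-5 scale), PEEP families
    comparison = rng.normal(3.8, 0.6, 100)   # comparison families who did not use PEEP

    t, p = stats.ttest_ind(peep_group, comparison)
    print(f"t = {t:.2f}, p = {p:.4f}")       # p < .05 would indicate a significant group difference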


Presentation 1 Other Authors: Jessica Andrews, WGBH Boston
Abstract 2 Title: Recruit, relocate, and retain: Strategies for maintaining participation in longitudinal evaluation
Presentation Abstract 2:

Retaining youth participants in longitudinal data collections can be challenging; however, with purposeful, youth-centered strategies, high response rates can be obtained. The National Youth in Transition Database (NYTD) is a federal evaluation of the John H. Chafee Foster Care Independence Program. Every state is required to collect data from youth transitioning out of foster care to evaluate the effectiveness of the program. Youth are followed for five years, and data are collected at ages 17, 19, and 21. Since 2010, the South Carolina NYTD evaluation has obtained a participation rate of over 70% in each wave of data collection. Youth transitioning out of foster care are a marginalized, often hard-to-reach population; therefore, evaluators face specific challenges when attempting to collect data over time. Attendees will leave the session with youth-centered methodological skills to recruit, relocate, and retain youth participants in a longitudinal evaluation.


Abstract 3 Title: Using Dose-Response for Determining the Impact of Youth Development Programs: A Discussion on its Benefits and Limitations
Presentation Abstract 3:

Don’t have the access, time, or resources to establish a control or a comparison group? Don’t think a control or comparison group is appropriate or ethical for your program? This presentation will discuss the dose-response model and its potential for allowing evaluators to assess the impact of youth development programs by measuring participants’ level of exposure to them. The presenters will also discuss the benefits and limitations of this analysis approach compared to the more traditional control/comparison group methods.
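
(Illustrative note: a dose-response analysis of the kind described here is typically a regression of an outcome on a measure of program exposure. The sketch below uses hypothetical data and variable names; it is a schematic of the general approach, not the presenters' model.)

    # Minimal dose-response sketch with hypothetical data: regress an outcome score
    # on hours of program exposure instead of comparing against a control group.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(1)
    hours = rng.uniform(0, 40, 200)                      # exposure ("dose") in program hours
    outcome = 50 + 0.4 * hours + rng.normal(0, 5, 200)   # hypothetical outcome scores

    df = pd.DataFrame({"hours": hours, "outcome": outcome})
    model = smf.ols("outcome ~ hours", data=df).fit()
    # A positive, statistically significant slope on hours suggests a dose-response effect.
    print(model.params["hours"], model.pvalues["hours"])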


Abstract 4 Title: Tailoring Surveys to Adolescents: Learning What Works From Cognitive Interviewing and Pretesting
Presentation Abstract 4:

To elicit valid, high-quality data, a survey must be constructed and tailored to its intended audience. While evaluations frequently involve collecting information from adolescents, the measures used to do so are often incongruent with this target population’s cognitive and social development. To better understand how high-risk youth interpreted and reacted to various survey items, evaluators conducted cognitive interviews and pretested the survey with adolescents (ages 12-19) who participated in a sexual education class at juvenile justice centers. Cognitive interviewing yielded crucial insight into the length of the surveys, mode of administration, phrasing of items and response options, comfort level when answering sensitive questions, and participant burnout. Evaluators revised the survey items to better align them with these findings, which resulted in less confusion, quicker survey administration, and improved data quality. This study highlights the importance of involving the target population to determine what works in an evaluation.


Abstract 5 Title: Telling a Story to Evaluate an At-Risk Music Program
Presentation Abstract 5:

Programming and extracurricular activities are often motivating for youth and encouraged by parents. However, not all youth are in such an environment. This evaluation examined an extraordinary opportunity for at-risk youth to take part in one year of music lessons at the Wausau Conservatory of Music. There were expectations about how the program would affect the children’s lives; however, only by looking through the lenses of all stakeholders involved in the program did the evaluator learn how differently each person viewed it. The purpose of this paper is to look more closely at how various methods and perspectives contribute to what works in programming for at-risk youth. This presentation will demonstrate how to get the most out of an evaluation by switching to “story mode” and gathering data from all perspectives to give a complete picture (Sole & Wilson, 2002).


Theme: My presentation doesn't specifically relate to the theme
Audience Level: All Audiences

Session Abstract (150 words): 

Creative Methods to Enhance Learning from Youth Program Evaluations


