Evaluation 2015: Exemplary Evaluations in a Multicultural World

Disparate Roles of Electronic Capabilities in Evaluation

Session Number: ITE3
Track: Integrating Technology into Evaluation
Session Type: Multipaper
Session Chair: Paul Lorton Jr [University of San Francisco]
Presenter 1: Candace H. Lacey [Southeast Research and Evaluation Associates]
Presenter 2: Andrew MacDonald [ICF International]
Presenter 3: Karl Poonai
Presenter 4: BA Laris
Presenter 5: Shawn J. Hime [Associate Research Scientist - University of Wyoming]
Presentation 1 Additional Author: Amalio C. Nieves [Broward County Public Schools]
Presentation 2 Additional Author: Shauna Clark [ICF International]
Presentation 2 Additional Author: Helene Jennings [ICF International]
Presentation 3 Additional Author: Genevieve Grimes [National Cancer Institute, NIH]
Presentation 3 Additional Author: William M. Leiserson [National Cancer Institute, NIH]
Presentation 4 Additional Author: Pamela Drake [Senior Research Scientist - ETR]
Presentation 4 Additional Author: Mona Desai [Senior Research and Evaluation Manager - Children's Hospital Los Angeles]
Presentation 5 Additional Author: Laurel Allison Wimbish
Time: Nov 14, 2015 (08:00 AM - 09:30 AM)
Room: McCormick

Abstract 1 Title: What Good Is an Evaluation if No One Reads It?
Presentation Abstract 1: As program evaluators, project managers, funders, and project staff members, we agree that our work is for naught if our findings are not read and widely disseminated. Because stakeholders' time is limited and evaluation budgets are shrinking, we sought a new avenue to reach our diverse and geographically dispersed stakeholders: the e-eval! This presentation will focus on sharing our experiences and challenges and the process we used in creating and disseminating the e-evals. We will also discuss the technology requirements and our plans for further expansion of this exciting reporting tool, including multilingual versions of our e-evals.
Abstract 2 Title: Virtual Site Visits for a Multicultural World
Presentation Abstract 2: Evaluating multi-site interventions can be challenging when locations are spread across a wide geographic area and involve diverse groups of stakeholders. Virtual site visits may be valuable tools for gathering data on such interventions, as ICF demonstrated when it prepared case studies of the Foster Grandparent Program (FGP). FGP is a nationwide grant program to connect low-income seniors with service opportunities involving children and youth. FGP brings together participants from multiple generations and experiences, and serves a dual purpose: to help children benefit from the experience of older role models, while also providing opportunities for older adults to stay active in their communities. In this presentation, ICF describes how it used both in-person and virtual site visits to inform the case studies. Presenters discuss the benefits and challenges of virtual site visits and offer tips for maximizing their value, such as pre-visit questionnaires and video conferencing.
Abstract 3 Title: Building Better Websites—A Multipronged Approach to Improving Usability and Audience Reach
Presentation Abstract 3: Providing support to researchers through grant awards, training, and technical assistance is central to the mission of the National Cancer Institute (NCI). To determine how it might better support cancer researchers at large, the NCI conducted an evaluation of its publicly accessible web-based directory of research tools and resources. Known as NCI Research Resources (resresources.nci.nih.gov), this centralized listing of tools, reagents, and services was developed to enable researchers to find information by scientific category, NCI Division, or via a search function. NCI used a mixed-method approach to assess usability and obtain information regarding audience awareness and usage, as well as preferences regarding content, format, and accessibility. Results were used to revamp the site and inform new strategies for promoting usage among researchers. Evaluation methods and metrics are discussed, particularly their potential applicability to other web-based directories that are intended for those working in health-related research fields.
Abstract 4 Title: Using SharePoint to Track Parenting Teens in Longitudinal Evaluation
Presentation Abstract 4: Tracking and monitoring program participants can be a challenge for evaluators, especially in longitudinal evaluations with vulnerable populations. This demonstration session will showcase a system created in SharePoint to track over 1,000 pregnant and parenting teens recruited for participation in an evaluation of a program to prevent rapid repeat pregnancy. The session will show the structure of the system and how it was used by field staff to track their caseloads for over 24 months and by program directors to monitor tracking. We will share the successes of this system as well as its limitations. Recommendations for attendees attempting longitudinal tracking of hard-to-reach youth will be discussed.
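
As a hedged illustration only (the abstract does not describe the presenters' actual SharePoint configuration), the sketch below shows how field staff might pull their caseload from a hypothetical SharePoint list named "Participants" using SharePoint's standard REST API from Python. The site URL, list name, and column names are assumptions; authentication is left to an auth object supplied by the caller.

    import requests

    # Hypothetical SharePoint site hosting the tracking system; adjust to your deployment.
    SITE_URL = "https://example.sharepoint.com/sites/evaluation"

    def get_caseload(staff_name, auth):
        """Return the tracking-list items assigned to one field staff member."""
        # SharePoint's REST API exposes list items under /_api/web/lists.
        # 'Participants', 'CaseManager', and the selected columns are assumed names.
        endpoint = (
            SITE_URL + "/_api/web/lists/getbytitle('Participants')/items"
            "?$filter=CaseManager eq '" + staff_name + "'"
            "&$select=Title,LastContactDate,NextFollowUp"
        )
        resp = requests.get(
            endpoint,
            headers={"Accept": "application/json;odata=nometadata"},
            auth=auth,  # e.g., NTLM or OAuth credentials, depending on the deployment
        )
        resp.raise_for_status()
        return resp.json()["value"]

A view like this lets each field staff member see only their own caseload, while a program director could drop the $filter clause to monitor tracking across all staff.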
Abstract 5 Title: Catching Fish in a Net with Holes Doesn't Make Sense Either—How to Prevent Gaps in Your Client-Collected Data
Presentation Abstract 5: Data gaps can occur when your client collects data for your evaluation. Gaps may be temporal (missed data submission deadlines, unclear time frames for collecting a benchmark or construct, confusing timeframes surrounding enrollment dates) or may take the form of insufficient or incomplete data submissions. Preventing data gaps is crucial when an evaluation compares measurements over time. A properly structured database removes the need for further validation of most collected data, while periodic, consistent validation of the remaining variables improves data quality. In addition, technology tools such as automated reminders help ensure accurate, timely data collection. Using technology to strengthen data validation and obtain accurate data from your client leads to a more successful evaluation.
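
As a hedged sketch of the kind of automated gap check the abstract describes (not code from the presentation itself), the snippet below flags participants whose most recent submission has fallen outside an expected reporting window, so they can be targeted for automated reminders. The column names and the 30-day window are assumptions.

    from datetime import datetime, timedelta

    import pandas as pd

    def find_gaps(submissions, window_days=30):
        """Return participants whose latest submission is older than the reporting window."""
        # Most recent submission date per participant; assumed columns
        # are 'participant_id' and 'submitted_on'.
        latest = submissions.groupby("participant_id")["submitted_on"].max()
        cutoff = datetime.now() - timedelta(days=window_days)
        overdue = latest[latest < cutoff].reset_index()
        overdue.columns = ["participant_id", "last_submission"]
        return overdue

    # Example: participant 103 has not submitted since August and would be
    # flagged for an automated reminder.
    df = pd.DataFrame({
        "participant_id": [101, 101, 102, 103],
        "submitted_on": pd.to_datetime(
            ["2015-09-01", "2015-10-20", "2015-10-25", "2015-08-15"]
        ),
    })
    print(find_gaps(df))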
Audience Level: None

Session Abstract: Disparate Roles of Electronic Capabilities in Evaluation


