Let’s Get Real: Evaluation Methodologies in a Virtual World
Session Number: 2995
Track: Integrating Technology into Evaluation
Session Type: Panel
Session Chair: Hung Pho [Program Evaluator - ICF]
Discussant: Kristin Zagar
Presenter 1: Brooke Shelley [Senior Research Associate - ICF]
Presenter 2: Katie Campbell [Senior Associate - ICF]
Presenter 3: Hung Pho [Program Evaluator - ICF]
Presentation 1 Additional Author: Kathleen Korte Wang [Expert Consultant - ICF]
Presentation 2 Additional Author: Erica McCoy [Associate]
Presentation 3 Additional Author: James Hall
Time: Nov 09, 2017 (10:30 AM - 11:15 AM)
Room: PARK TWR STE 8210
Abstract 1 Title: Design Matters: Encouraging Feedback in a Virtual World
Presentation Abstract 1:
Panelists will highlight methods used to promote survey responses for the evaluation of Child Welfare Information Gateway (CWIG) special initiative microsites, such as the strategic placement and design of widgets on the microsites and outreach campaigns. Findings from survey responses, triangulated with web and outreach metrics, yield actionable recommendations for fostering a more robust evaluation. Panelists will also discuss lessons learned from methods used to evaluate customer satisfaction with the resources and information available on the microsites that promote the initiatives, as well as how lessons learned within each evaluation may be applied across all initiatives.
Abstract 2 Title: Linking Outcomes and Engagement: How Evaluation Can Improve the Webinar Experience
Presentation Abstract 2:
Regularly occurring virtual events give evaluators a chance to collect frequent data about participants' satisfaction and perceived outcomes. This presentation will share a methodology for examining participant experience in relation to the engagement strategies and format used within each event. The methodology illustrates how choices of event format and engagement strategy influence the participant experience and the achievement of event goals. Throughout each year, the Capacity Building Center for States conducts numerous events for different purposes and for a variety of audiences. Event staff receive feedback on each event, along with periodic trend reports, to inform future planning. The panelist will discuss lessons learned from these evaluation efforts, including targeted strategies for increasing response rates and survey design choices.
Abstract 3 Title: Are Participants Really Participating?: Assessing Virtual Engagement
Presentation Abstract 3:
How do you know whether participants are engaging with virtual technology, and does the level of engagement matter? The Capacity Building Center for States convened its first virtual conference in 2016, hosting over 550 participants at the Child Welfare Virtual Expo: Building Capacity to Address Sex Trafficking and Normalcy (Virtual Expo). Panelists will highlight the methods used to develop an "engagement index," a composite score of participation in key virtual activities such as live video-streamed sessions, chatting, and an Exhibit Hall featuring 9 "virtual booths" with downloadable resources and facilitated virtual conversations. A mixed-methods design, using surveys embedded into the flow of virtual activities, post-conference focus groups, and virtual platform metrics, facilitated linkages between engagement and satisfaction, conference outcomes, and achievement of conference objectives.
Theme: Learning What Works and Why
Audience Level: All Audiences
Session Abstract (150 words):
Using virtual technologies is becoming a necessity for all organizations involved in disseminating information and supporting capacity-building efforts for their target audiences. In some ways, the proliferation and use of these technologies has outpaced the field's understanding of which technologies are effective, and under what conditions. Three panelists, drawing from their evaluation work with a well-established clearinghouse service and a newly formed capacity-building organization, will discuss lessons learned from the development and implementation of various evaluation designs and methods for assessing the effectiveness of specific virtual technologies, particularly those focused on the spread of information and the engagement of virtual audiences.