Data Visualization Doesn't Wait: Using Visuals Throughout an Evaluation Life Cycle

Session Number: DVR3
Track: Data Visualization and Reporting
Session Type: TIG Multipaper
Session Chair: Cynthia Phillips [Evaluator - National Science Foundation]
Presenter 1: Veena Pankaj [Director - Innovation Network, Inc.]
Presenter 2: Amanda Buenz Makulec [Visual Analytics Advisor - John Snow, Inc.]
Presenter 3: Jerome De Lisle [Senior Lecturer in Educational Administration - The University of the West Indies]
Presenter 4: Cynthia L. Blitz, Ph.D. [Executive Director and Research Professor - Center for Effective School Practices, Rutgers Graduate School of Education]
Presentation 1 Additional Author: Veena Pankaj [Director - Innovation Network, Inc.]
Presentation 1 Additional Author: Melissa Howlett [Consultant - ORS Impact]
Presentation 3 Additional Author: Jerome De Lisle [Senior Lecturer in Educational Administration - The University of the West Indies]
Presentation 3 Additional Author: Tracey Michelle Lucas [Monitoring and Evaluation Coordinator - Office of the Prime Minister, Gender and Child Affairs]
Presentation 3 Additional Author: Susan Mary Herbert [Lecturer - The University of the West Indies]
Presentation 4 Additional Author: Cynthia L. Blitz, Ph.D. [Executive Director and Research Professor - Center for Effective School Practices, Rutgers Graduate School of Education]
Presentation 4 Additional Author: Maria Salinas [Founder - Dissemination Engagement Strategy Group LLC]
Time: Oct 27, 2016 (04:45 PM - 06:15 PM)
Room: A601

Abstract 1 Title: Incorporating Data Viz Products throughout the Evaluation Lifecycle
Presentation Abstract 1:

Data visualization and information design are gaining traction among evaluation practitioners. This session highlights data visualization products that can be used throughout the evaluation project lifecycle to enhance stakeholder understanding and use of evaluation findings. Interim and final products discussed include theories of change, timelines, strategic debrief decks, visual snapshots, and more. Through practical examples of how data visualization has been incorporated into the evaluation consulting process, specifically to aid in articulating program strategy and/or theory, making sense of evaluation data, and communicating findings, participants will walk away with practical considerations and recommendations they can apply to developing products in their own consulting practice.



Abstract 2 Title: Visualizations with Empathy: Developing Audience Personas
Presentation Abstract 2:

Evaluation stakeholders are more than demographic profiles. They are people, with motivations, needs, challenges, and personal interests. Designing visualizations that help people make decisions requires understanding, in depth, who those audiences are and how they receive, interpret, and connect with information. As part of the human-centered design process, teams typically develop user personas for the different stakeholder groups when designing a product, service, or system. We can use this same technique to craft a shared understanding of the audiences for our data visualizations, from evaluation reports to interactive dashboards. Drawing on the presenter's experience facilitating workshops to build data visualization capacity among community-based organizations across three continents, this session will show participants how to adapt this method from design practice to create a sense of empathy with their audiences and to design more targeted visualizations for key stakeholders.



Abstract 3 Title: Using Implementation Dashboards to Communicate Outcomes for an Enacted Primary Curriculum in Trinidad and Tobago
Presentation Abstract 3:

Data dashboards were used to communicate findings from an implementation evaluation of the Primary Curriculum Re-write (PCR) in Trinidad and Tobago. Data visualization (DV) approaches align with the principles of Utilization-Focused Evaluation, which encourages collaboration with primary intended users. In this evaluation, therefore, clients and stakeholders were intimately involved in planning, sharing, and discussing findings. Thirteen implementation dashboards served as DV tools to guide reflection and decision making by clients. The dashboards displayed national and district data from a quantitative survey of 454 teachers across 58 schools, covering demographics, teachers' reactions to the curriculum, various implementation outcomes, and key implementation drivers. Clients had the opportunity to interpret findings and meaningfully discuss implementation progress, gaps, and strategies. We gathered data from clients on the value of these dashboards in generating understanding and insight. This presentation furthers professional dialogue on the best design for implementation dashboards.


Abstract 4 Title: Dissemination by design: Making dissemination an integral component of evaluation
Presentation Abstract 4:

Dissemination is often treated as an activity external to the design and implementation of evaluations, when in reality it is an integral component. Presenters will walk participants through the rationale for considering dissemination by design at each step of the evaluation process and will share examples and tools for integrating dissemination into program evaluations, leveraging best practices such as tailoring messages and using plain language by systematically engaging prospective audiences (i.e., audience analysis and audience segmentation). Presenters will draw on their current and past work disseminating education research findings and their experience with available tools and resources (e.g., design programs, graphic designers, webinars, infographics, social media).



Audience Level: All Audiences

Session Abstract: 

Data Visualization Doesn't Wait: Using Visuals Throughout an Evaluation Life Cycle