Session Number: 2164
Track: Organizational Learning & Evaluation Capacity Building
Session Type: Multipaper
Tags: Data Use, Data Visualization
Session Chair: Holly Downs [Senior Evaluation Faculty - Center for Creative Leadership]
Discussant: Lindsey D. Varner [Research Administrator and Program Evaluation Specialist - University of Florida]
Presenter 1: Holly Downs [Senior Evaluation Faculty - Center for Creative Leadership]
Presenter 2: Aundrea D. Carter [CDC ORISE Evaluation Fellow - Centers for Disease Control and Prevention]
Presenter 3: Korinne Chiu [Director of Research and Evaluation - VaxTrac]
Presentation 2 Additional Author: Anamika Satsangi [Fellow - Centers for Disease Control and Prevention]
Presentation 2 Additional Author: Lindsey D. Varner [Research Administrator and Program Evaluation Specialist - University of Florida]
Presentation 3 Additional Author: Lauren Spigel, MPH [Monitoring and Evaluation Coordinator - VaxTrac]
Time: Oct 29, 2016 (08:00 AM - 09:30 AM)
Abstract 1 Title: Beyond Pretty Graphs: Considering Theory and Design Concepts for Dashboards
Presentation Abstract 1:
While data dashboards feature increasingly in data visualization discussions and literature (e.g., Evergreen, 2013; Smith, 2013), there is scarce literature on the theory behind dashboards and on the underlying assumptions about the different audiences and contexts that come with using them. More importantly, in utilization-focused evaluation, design concepts must be weighed alongside concerns about audience readiness and support for these newer technologies. This paper explores the theory behind digital dashboards and the design implications for multiple audiences. It also offers tips and outlines the advantages and disadvantages of digital dashboards to help burgeoning designers in evaluation and research select the best design for a given context.
Presentation 1 Additional Author: Mike Raper [Data Analyst - Center for Creative Leadership]
Abstract 2 Title: Increasing Evaluation Use through Strategic Dashboard Planning
Presentation Abstract 2:
From a utilization-focused evaluation perspective, stakeholder engagement is integral to increasing the use of evaluation data and, ultimately, the value of the evaluation. Data are more likely to be used for program improvement when stakeholders are continuously and consistently engaged throughout the monitoring and evaluation process. The online dashboard (i.e., a real-time user interface) is a tool to facilitate user engagement. Dashboards can be incorporated into an evaluation through strategic planning and innovative information design.
This paper establishes considerations for planning and designing data dashboards for program monitoring and evaluation (e.g., identifying primary intended user(s), user needs, data facets; assessing resource availability). This paper also demonstrates the application of these planning considerations through select vignettes of dashboards designed for distinct national and state programs. Finally, we address the implications for evaluation when using dashboards as an interactive tool for stakeholders.
Abstract 3 Title: Building Capacity for Dashboard Use: Examples and Applications in Different Contexts
Presentation Abstract 3:
While it is important to engage stakeholders and dashboard end-users in the design process, it is also important to build their capacity to use dashboards meaningfully in their daily work (Baskett, Lerouge, & Tremblay, 2008). Capacity building for dashboard use refers to training and supporting end-users in their understanding and application of data dashboards to meet their needs. This session will discuss examples of dashboard training, execution, and refinement drawn from operational, analytic, and strategic dashboards serving different end-users. Methods and tools for soliciting feedback on dashboard effectiveness, along with evidence of how stakeholders ultimately used the dashboards, will be shared. Barriers to use and successful practices for designing dashboard training for effective application will be discussed. The data demand and use literature provides insight into how to increase the utility of dashboards by being responsive to user needs and ensuring that organizations use dashboards appropriately in their decision-making processes (Foreit, Moreland, & LaFond, 2006).
Audience Level: Beginner, Intermediate
Session Abstract:
It is important for the utilization-focused evaluator to consider how to strategically design dashboards and build stakeholder capacity to use them. Organizing information so that stakeholders can readily make evidence-based decisions will ultimately increase the use of evaluation findings. As interest in dashboards grows among evaluators and programs alike (Evergreen & Metzner, 2013; Smith, 2013), it becomes essential to strategize how to design dashboards with the stakeholder in mind and to integrate the dashboard, a real-time tool, into the evaluation process. The purpose of these panel presentations is to show how multiple evaluation teams designed and implemented dashboards in evaluations of international, national, and state programs. Spotlighting the theme of the 2016 AEA conference, this session provides insight into how evaluators and program staff actively understand, strategically design, and effectively implement dashboards to improve their respective evaluations.