Eval21 Workshops

AEA is excited to present ten workshops inspired by Eval21 Reimagined: A Virtual Experience! These workshops will take place through our Digital Knowledge Hub from late October through mid-December and require separate registration there. Click on the title of each workshop to register. Take a look at what to expect from each session below:

Blue Marble Evaluation

October 29, 12:00 p.m. EDT

Presenter: Michael Q. Patton

Blue Marble refers to the iconic image of the Earth from space without borders or boundaries, a whole Earth perspective. We humans are using our planet’s resources, and polluting and warming it, in ways that are unsustainable. Many people, organizations, and networks are working to ensure the future is more sustainable and equitable. Blue Marble evaluators enter the fray by helping design such efforts, provide ongoing feedback for adaptation and enhanced impact, and examine the long-term effectiveness of such interventions and initiatives. Blue Marble Evaluation consists of principles and criteria for evaluating transformational initiatives aimed at a more equitable and sustainable world.

Attendees will learn:

  • The nature and niche of Blue Marble Evaluation
  • The principles for designing and implementing Blue Marble Evaluations
  • What a theory of transformation is and how it differs from a theory of change

Register Now


Data Parties: Something New for Your Evaluation Toolbox?

November 2, 12:00 p.m. EDT

Presenter: Kylie Hutchinson

So, you've analyzed your data; now what? Your next task is to put it into some form of report...or is it? What about hosting a data party? Data parties are participatory sessions (in-person or online) where you meet with other staff, community partners, rights holders, and other interested parties to collaboratively and more equitably explore initial findings. Data parties are an important but often underutilized activity for promoting the uptake of recommendations, knowledge translation, and a learning culture. Join us for an informative and interactive session that covers the who, what, why, when, and how of hosting a data party. Through engaging lecture, demonstrations, and real-life examples, we will share practical tips and techniques for data parties, including format options, potential pitfalls, and ways to engage partners in making better sense of data.

Attendees will learn:

  • Reasons for holding a data party
  • How to select an appropriate option for hosting
  • Ways to help stakeholders engage with your data and results
  • Three tips for overall success

Register Now


Mixed Methods Design in Evaluation

November 4, 12:00 p.m. EDT

Presenter: Donna Mertens

Developments in the use of mixed methods have extended beyond the practice of combining surveys and focus groups. The sophistication of mixed methods designs in evaluation will be explained and demonstrated through illustrative examples taken from diverse sectors and geographical regions, with a focus on evaluations of program effectiveness. The designs will be tied to the major branches in evaluation as defined by Alkin (2013) and Mertens and Wilson (2019): methods, use, values, and social justice. The content is based on Mertens (2018), Mixed Methods Design in Evaluation (Sage). Participants will have the opportunity to create mixed methods designs using evaluation vignettes for each type of evaluation.

Attendees will learn:

  • How to identify the components of mixed methods designs in evaluation for the purpose of determining intervention effectiveness
  • How to apply the concepts of mixed methods design to a specific context using a case study
  • How to determine the applicability of different approaches to mixed methods design in evaluation in their own work

Register Now


Evaluability Assessment and Stakeholder Engagement: Nuts and Bolts for Effective Practice

November 16, 12:00 p.m. EST

Presenters: Michael Trevisan & Tamara Walser 

Evaluability assessment has historically been used to determine if a program is ready for an outcome evaluation. It can be used to determine whether program theory matches reality, or as a means to increase program plausibility and effectiveness. This workshop will include a brief overview of current evaluability assessment theory and practice, including its resurgence across disciplines and use globally. The focus of the workshop will be the basics of implementing evaluability assessment using our four-component model as a guiding framework. A key thrust of the workshop will be using evaluability assessment to engage stakeholders in meaningful ways. With stakeholder engagement in place, evaluability assessment also supports culturally responsive evaluation, addresses program complexity, and builds capacity. Examples, case scenarios, small group activities, and discussion will allow participants to connect with the content and gain insight into how they can incorporate evaluability assessment in their work and enhance stakeholder engagement.

Attendees will learn:

  • Current theory and uses of evaluability assessment
  • How to implement an evaluability assessment
  • How evaluability assessment can support and enhance meaningful stakeholder engagement

Register Now


Evaluating Outside the Box for Meeting the Transformational Moment

November 18, 12:00 p.m. EST

Presenter: Scott Chaplowe

This transformational moment is a response to a dire global emergency: society must go beyond incremental change toward radical innovation and global systems change at multiple levels if it is to survive. How can evaluation, a profession in the business of assessment and advising, meet the moment and inform and participate in the transformation? This workshop will explore evaluation's potential role in supporting the transformational agenda of the times. From education and health care to policing and international development, evaluation has become ubiquitous and plays a prominent role in society. As a field that straddles both theory and practice, evaluation is uniquely positioned to support transformational learning and change. However, this potential largely depends on evaluation's ability to transform from within. This session will examine the concept of transformational change and both the barriers to and enabling practices for evaluation to meet the transformational moment. It will provide practical examples and resources for participants to support transformational change in their practice.

Attendees will learn:

  • What is meant by transformational change and transformational evaluation
  • The key drivers for the uptake of transformational evaluation
  • The key barriers to transformational evaluation
  • Key principles, methods, and evaluation criteria for transformational evaluation

Register Now


Designing Quality Survey Questions

December 3, 12:00 p.m. EST

Presenters: Sheila B. Robinson and Kimberly Leonard

Surveys can reach large populations with relatively small investments of time and technology. As survey fatigue grows, however, evaluators must be judicious in using surveys and craft richer, more concise, and more focused questions to yield meaningful data. Successful surveys require understanding the cognitive processes respondents employ in answering questions with accuracy and candor. Using rich examples and interactions, the facilitators will demonstrate why evaluators must engage in a respondent-centered, intentional survey design process to craft high-quality questions, arguably the most critical element of any survey. Participants will learn the survey design process through a series of activities, developing an understanding of the cognitive aspects of survey response and question design. They will leave with the ability to craft high-quality survey questions, resources to further develop their skills, and a copy of the facilitators' checklist for crafting quality questions, published in their book, Designing Quality Survey Questions (SAGE, 2019).

Attendees will learn:

  • Why a respondent-centered, intentional question design process is key to an effective survey
  • Cognitive processes involved in answering survey questions and their implications for question design
  • To identify common problems with survey questions and ways to address them
  • To craft high quality open and closed-ended survey questions with appropriate response options
  • How to employ the facilitators’ checklist to ensure the development of high quality questions

Register Now


Design and Conduct Sound Evaluations Using the CIPP Evaluation Model

December 6, 12:00 p.m. EST

Presenter: Guili Zhang

This professional development workshop will teach participants to design and conduct sound evaluations using the updated CIPP Evaluation Model. The interactive, hands-on workshop will help participants plan, design, budget, contract, conduct, report, and assess program evaluations that meet the requirements of the CIPP Model and professional standards for sound evaluations. The workshop will be taught by the co-author of the authoritative book on program evaluation, The CIPP Model: How to Evaluate for Improvement and Accountability. It will ground participants in the current, updated version of the CIPP Model; acquaint them with selected checklists from the book (design, budgeting, contracting, reporting, and metaevaluation); engage groups of participants in using an illustrative RFP to apply the design checklist in planning a context, input, process, or product evaluation and to assess their completed design against the metaevaluation checklist; and provide relevant follow-up materials so participants leave knowing how to obtain additional information and assistance in applying the CIPP Model.

Attendees will learn:

  • To design sound evaluations
  • To conduct sound evaluations
  • To expertly use the current, updated version of the CIPP Model
  • To effectively use the evaluation checklists (design, budgeting, contracting, reporting, and metaevaluation)
  • To apply the design checklist in planning a context, input, process, or product evaluation and to assess their completed design against the metaevaluation checklist

Register Now


Equitable Data Storytelling

December 7 & 10, 12:30 p.m. EST

Presenters: Jennifer Nulty and Martena Reed

Today, data is everywhere. We have access to massive amounts of data about participants, service administration, and program effectiveness. What isn't so easily accessible is how we can best use those data to advance equity. Extracting useful takeaway messages and framing our data findings in ways that highlight inequities and identify equitable solutions can be challenging. Traditionally, data stories about our programs and the communities we serve are centered on communicating need and deficits. These narratives can perpetuate stereotypes and marginalize communities of color. We have the power and opportunity to transform the way we report on data so that it is culturally responsive and engages the communities we serve.


We will discuss strategies you can use to tell stories in a way that promotes equity and uses intentional narrative, data, and design. Using case examples, we will show how the principles discussed in this workshop can enhance attendees' ability to communicate data findings that highlight underlying systemic drivers of inequity and consider how historical and structural factors impact a program's participants, the program itself, and its mission. We will use lecture-style training integrated with large group exercises and small group breakout rooms to give participants a chance to reflect and practice using some of the techniques discussed. Harnessing the power of data through effective storytelling and visualization techniques will give your organization the tools to communicate program outcomes responsibly.

Attendees will learn:

  • How to frame data findings in ways that identify equitable solutions
  • Tools to engage the communities we serve in the data design and interpretation process
  • Beginner knowledge of how to leverage data and narrative together to visually tell culturally responsive stories

Register Now


Systemic Design Thinking for Evaluation of Social Innovations

December 13, 12:00 p.m. EST

Presenter: Janice Noga

Historically, social innovation has ignored context and complexity in favor of predictability, control, and linearity. Evaluation has followed with randomized control trials, group comparative designs, and predictive models based on linear assumptions. Yet, in the real world, programs and evaluators work within complex situations. Always. Systemic design thinking in social innovation has evolved to design for the reality of systems change as a response to context, complexity, and interconnected systemic factors. So why hasn't evaluation design changed to reflect this? We notice complexity, we describe complexity, but we don't really design evaluations to respond to complexity. This workshop will examine key characteristics and principles of systemic design thinking, systems thinking, and complexity. Discussion and applied activities will focus on integrating the three to explore how we as evaluators can embrace systemic design thinking in ways that attend to whole-systems ecologies and complexity in program design and outcomes.

Attendees will learn:

  • The key characteristics and principles of systemic design thinking, systems thinking, and complexity
  • The role of systemic design thinking in the creation and evaluation of complex, multi-system, multi-stakeholder services and programs
  • How evaluators can use systemic design thinking to inform new approaches to evaluation that attend to the systemic nature of programs and the contexts within which they function

Register Now


Supporting Emerging Evaluators: Building Capacity for Evaluation Dissertations, Theses, and Culminating Projects

December 15, 12:00 p.m. EST

Presenters: Tamara Walser & Michael Trevisan 

Evaluation capstones are increasingly common and include dissertations, theses, and culminating projects where students complete an evaluation as their capstone experience. However, there is a lack of guidance from evaluation scholars and practitioners on completing an evaluation capstone. Instead of fitting evaluation capstones into a traditional research project structure, we share a framework that aligns with the fundamentals of evaluation practice—those characteristics of evaluation that make it distinct from other forms of inquiry. This supports students in developing evaluator competencies, addressing standards and principles of the evaluation profession, and contributing to program and disciplinary knowledge. Student and faculty participants will apply the framework to case examples and their own capstone contexts through guiding questions, small group exercises, and group discussion. This workshop contributes to the evaluation field by building capacity for implementing evaluation capstones that meet academic and professional expectations. 

Attendees will learn:

  • The benefits and challenges of evaluation capstones
  • The unique characteristics of evaluation capstones
  • How to apply this framework to address the unique characteristics of evaluation capstones
  • How to meet academic expectations and address professional evaluation competencies, principles, and standards

Register Now
