Eval20 Workshops

AEA is excited to present ten workshops inspired by Eval20 Reimagined: A Virtual Experience! These workshops will take place through our Digital Knowledge Hub throughout November, December, and January and require separate registration there. Click on the title of each workshop to register. Take a look at what to expect from each session below:


Collective Power: Using Participatory Leadership Facilitation in Evaluation Design

November 11, 12:00 p.m. EDT

Presenter: Rita S. Fierro, Ph.D.

Have you ever been scared to invite certain stakeholders to an evaluation meeting because of the tension you would have to manage in a charged context? Have you watched a stakeholder grow bored while you tried to explain logic models? Participatory Leadership (PL) practices, from the Art of Hosting tradition, provide tools that help evaluators design inclusive evaluations without having to teach evaluation concepts. You can involve more than 250 people in your design with relatively little time, allowing for community inclusion. This creative tension can be channeled into solutions that leverage the group's collective power. These processes increase our ability as evaluators to sense, listen, and recognize the culture and history, the power dynamics, and the common threads among seemingly conflicting voices in the organization or community. When the evaluation process is in sync with stakeholders' perspectives, these practices readily support organizations in producing their own social change. We will introduce tools that, when mastered, will ensure you are never again scared of losing control of a meeting.

Attendees will learn:

  • To experience a meeting environment that facilitates meaningful, deep conversations by creating liberating structures in which all participants contribute and no handful of voices dominates
  • To experience a dynamic engagement where localized knowledge and the inherent power differences amongst multiple stakeholders are made explicit
  • To use three different facilitation technologies for small (<20 people), medium (20-50 people), or large groups (50-300+ people) for evaluation design
  • To use these facilitation technologies to understand the culture, status, and priorities of an organization before finalizing the evaluation design
  • To identify four facilitation skills (sensing, synthesizing, holding space, and pausing) to discuss processes and/or group dynamics
  • To understand two facilitation theories that identify the importance of emergence and collective intelligence in engaging with social complexity, and how they are relevant to evaluation design


Designing Quality Survey Questions

November 17 & November 19, 12:00 p.m. EDT

Presenter: Sheila B. Robinson, Ed.D.

Surveys are a popular data collection tool for their ease of use and the promise of reaching large populations with a potentially small investment of time and technical resources. But as survey fatigue grows, evaluators must be increasingly judicious in using surveys and must craft richer, more concise, and more targeted questions to yield meaningful data. Successful survey research also requires an understanding of the cognitive processes that respondents employ in answering questions with accuracy and candor. Using rich examples and an interactive approach, the facilitator will demonstrate why survey researchers must engage in a respondent-centered, intentional survey design process in order to craft high-quality questions, arguably the most critical element of any survey. Participants in this workshop will learn about the survey design process through a series of activities, developing an understanding of the cognitive aspects of survey response and question design. In this highly interactive workshop, participants will increase their ability to craft high-quality survey questions and leave with resources to further develop their skills, including a copy of the checklist for crafting quality questions published in the facilitator's book.

Attendees will learn:

  • Why a rigorous, respondent-centered, intentional question design process is key to an effective survey
  • Cognitive processes involved in answering survey questions and their implications for question design
  • How to identify common problems with survey questions and ways to address them
  • How to craft high-quality open-ended and closed-ended survey questions with appropriate response options
  • How to employ the facilitator's checklist to ensure the development of high-quality questions


Social Network Analysis: Methods and Use

December 4, 12:00 p.m. EDT

Presenter: Dr. Kimberly Fredericks

Interest in and use of social network analysis (SNA) as a methodology within evaluation continue to climb, and with that comes a need for new approaches and a deeper understanding of the method. This workshop will dive more deeply into the parameters of when and how to use SNA within your evaluation, data collection methods, and statistical analysis of the findings. The workshop is very hands-on, emphasizing the software and using SNA concepts and methods to answer research questions. It also covers the use of network analysis in applied settings. UCINET and Netdraw will be the base programs used to analyze data and discuss results.
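
To give a flavor of the kinds of measures such an analysis produces, here is a minimal, illustrative sketch in R using the igraph package rather than the UCINET and Netdraw software used in the workshop; the advice-seeking edge list and staff names are hypothetical and are not drawn from the workshop materials.

  # Illustrative only: the workshop itself uses UCINET and Netdraw, but the same
  # core network metrics can be computed in R with the igraph package.
  library(igraph)

  # Hypothetical "who asks whom for advice" edge list among five staff members
  edges <- data.frame(
    from = c("Ana", "Ana", "Ben", "Cara", "Dev", "Eli"),
    to   = c("Ben", "Cara", "Cara", "Dev", "Ana", "Ana")
  )
  g <- graph_from_data_frame(edges, directed = TRUE)

  degree(g, mode = "in")      # who is most often sought out for advice
  betweenness(g)              # who brokers between otherwise disconnected actors
  edge_density(g)             # overall cohesion of the network
  plot(g, vertex.size = 30)   # quick network diagram, analogous to Netdraw's role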

Attendees will learn:

  • The parameters around what research questions are best suited for the use of SNA
  • How to match research questions and data collection methods within a larger evaluation context
  • How to enter and manipulate data in UCINET and Netdraw
  • To produce basic output and analyze their findings within UCINET and Netdraw


More Strategies to Address Weaknesses of Typical Logic Models with an Updated “Condition Modeling” Approach

December 8, 12:00 p.m. EDT

Presenter: Kirk Knestis, PhD

Typical graphical logic modeling approaches illustrate elements of a program’s theory of action in terms of the relationships among inputs, activities or processes, outputs, and short- and longer-term outcomes (GAO, 2012; W.K. Kellogg Foundation, 2004). While particular guidance for how to structure such models may vary slightly (e.g., substituting “impacts” for long-term outcomes), the constraining assumptions and structures associated with conventional logic models can leave evaluators, program designers, and managers hung up on vocabulary (is it an “output” or an “outcome”?) or stuck forcing complex programs into too-simple frameworks.

This workshop will share the presenter’s CONDITION MODELING logic mapping approach, a unique derivation of models described by the W.K. Kellogg Foundation in 1998, designed to mitigate problematic aspects of typical modeling strategies. Condition models differ from traditional tabular or linked-component models in important ways, purposefully violating conventions regarding (1) the use of a limited set of a priori component headings; (2) an arbitrary, fixed number of levels of outcomes/impacts; (3) the inclusion of “outputs” as key model components; and (4) the use of semantic distinctions in the language to differentiate activities/strategies from outcomes/impacts. By violating these common assumptions, condition modeling allows designers and evaluators to more accurately represent the theories underlying their programs and more consistently define model elements. This approach is particularly effective in guiding planning, implementation, and evaluation of programs involving multifaceted interventions; complicated mediating/moderating variables; outcomes for multiple, related stakeholder groups; evolving evaluands (e.g., as in developmental evaluations); and multi-site or multi-level implementation models. Condition models illuminate evaluation data needs more effectively than traditional approaches and can enable increasingly powerful uses of modeling in writing proposals for grant funding. The presenter's recent enhancements to the approach improve the usefulness of condition models to plan, assess, and promote sustainability.

Attendees will learn:

  • Differences between condition modeling and more orthodox approaches
  • Potential benefits offered by condition modeling for program and evaluation planning and implementation
  • Specific strategies they can apply in practice, expanding beyond traditional modeling approaches to improve their practices
  • How condition modeling can be used for “virtuoso” evaluation purposes — framing an overarching research agenda, situating an intervention in existing research literature, building a rationale for funding, advancing program theory while improving implementation, and empowering sustainability by focusing on conditions


Principles-Focused Developmental Evaluation

December 10, 12:00 p.m. EDT

Presenter: Michael Quinn Patton

Developmental evaluation (DE) guides innovative initiatives in complex dynamic environments. Principles-focused evaluation (P-FE) is one special application of DE focused on evaluating adherence to effectiveness principles for achieving results and guiding adaptive action. Blue Marble Evaluation (BME), addressing global challenges of sustainability and equity, is the latest advance in principles-focused developmental evaluation. The essential principles of DE, P-FE, and BME will be examined and applied. Participants will learn to use the GUIDE framework, an acronym specifying the criteria for high-quality principles: (G) guidance for action, (U) utility, (I) inspiration, (D) developmental adaptation, and (E) evaluable. Participants will apply the GUIDE framework to their own projects. Integrating DE, P-FE, and BME moves beyond project/program evaluation to evaluate strategies, collaborations, diverse interventions, and systems change initiatives. Complex concepts, systems thinking, and AEA Guiding Principles will be incorporated. Participants will also learn DE, P-FE, and Blue Marble Evaluation methods, designs, applications, and uses.

Attendees will learn:

  • The niche and nature of developmental evaluation (DE), principles-focused evaluation (P-FE), and Blue Marble Evaluation (BME) and how they interconnect
  • Five purposes and applications of developmental evaluation at local and international levels
  • The relationship between complexity, systems thinking, and DE, P-FE, and BME
  • The use of the GUIDE framework for principles-focused evaluation
  • The particular challenges, strengths, and weaknesses of developmental evaluation, principles-focused evaluation, and Blue Marble Evaluation
  • The essential principles for designing and conducting developmental evaluations, P-FE, and BME


Learn How to Conduct a Social Inclusion Analysis

December 15, 12:00 p.m. EDT

Presenter: Meri Ghorkhmazyan

This workshop will be based on the curriculum of World Learning’s Transforming Agency, Access, and Power (TAAP) Toolkit and Guide for Inclusive Development, phase two on Social Inclusion Analysis. Thus far in the field of gender equality and social inclusion, many guidelines have been principle-based and have not offered the practical tools and templates that would allow practitioners to implement social inclusion analysis studies consistently. The TAAP Social Inclusion Analysis provides a consistent framework that follows the steps of general study design and offers data collection, analysis, and reporting tools to collect, understand, and visually demonstrate individuals' identities based on their experiences of inclusion and exclusion, in order to inform development initiatives. This is done across six domains of social fabric:

  1. Laws, Policies, Regulations, and Institutional Practices
  2. Access to and Control over Assets and Resources
  3. Knowledge, Beliefs and Perceptions, Norms
  4. Power and Decision-Making
  5. Roles, Responsibilities, Participation and Time Use
  6. Human Dignity, Safety and Wellness

The first five of these domains come from an analytical framework developed by the Harvard Institute for International Development in the mid-1980s; World Learning adapted them to the current gender and social inclusion debate and added the sixth domain based on its extensive research.

Workshop participants will use a case study to apply the social inclusion analysis tools and to visualize, analyze, and discuss how the data can be used in their own work. By consistently mapping the evidence, they will be able to see the intersectionality of identities that may consistently experience exclusion or inclusion, and to discuss ways to bring included, marginalized, and excluded groups together for more impactful programmatic or policy work. The TAAP toolkit is available online, so after the workshop participants will be able to download it and refer back to the content of the workshop. A TAAP community of practice has also been running since November 2018; participants will be invited to join the CoP and continue learning about this methodology in a peer-group setting with other organizations.

Attendees will learn:

  • Key concepts of inclusion, exclusion, and intersectionality, and how they apply to social inclusion analysis
  • Practical skills in using the TAAP tools for collecting data on various identities
  • Concepts and tools to advocate for inclusive programs using programmatic evidence


Intermediate Skills in R: Analyzing and Reporting Inferential Statistics

December 18, 12:00 p.m. EDT

Presenter: Dana Linnell Wanzer, Ph.D.

R is becoming an increasingly popular statistical analysis software package for evaluators and researchers because it is free and a growing number of resources are available on how to use it. This workshop will provide a hands-on opportunity for evaluators to learn how to conduct basic statistical analyses in R using RStudio with example data. The workshop will use the gradual release of responsibility teaching method: first each statistical technique is demonstrated, then the entire group works through it together, and finally attendees try it on their own while the instructor helps anyone with questions. Attendees will leave the workshop with an R script containing all the statistical analyses covered so they can apply this knowledge to analyze and report quantitative data for future evaluation reports.
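
As a rough preview of the kinds of commands such a script might contain, here is a minimal sketch using base R and the psych package (which must be installed, along with GPArotation for omega); the data frame and its columns are hypothetical and are not part of the workshop materials.

  # Hypothetical example data: 60 participants in two groups at three sites
  library(psych)
  df <- data.frame(
    group  = rep(c("control", "treatment"), each = 30),
    pre    = rnorm(60, mean = 50, sd = 10),
    post   = rnorm(60, mean = 55, sd = 10),
    site   = sample(c("A", "B", "C"), 60, replace = TRUE),
    passed = sample(c("yes", "no"), 60, replace = TRUE)
  )

  t.test(post ~ group, data = df)             # independent t-test
  t.test(df$post, df$pre, paired = TRUE)      # dependent (paired) t-test
  summary(aov(post ~ site, data = df))        # one-way ANOVA
  cor.test(df$pre, df$post)                   # correlation
  summary(lm(post ~ pre + group, data = df))  # linear regression
  chisq.test(table(df$group, df$passed))      # chi-square test of independence

  # Reliability of a hypothetical five-item scale (Cronbach's alpha and omega)
  items <- data.frame(replicate(5, sample(1:5, 60, replace = TRUE)))
  psych::alpha(items)
  psych::omega(items)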

Attendees will learn:

  • How to select, conduct, and interpret basic univariate statistics in R, including independent t-test, dependent t-test, one-way ANOVA, correlation, linear regression, and chi-square
  • How to analyze the reliability of scale measures using Cronbach’s alpha and omega


Living in a Remote-Based World...Let's Learn MURAL, a Digital Workspace for Online Brainstorming, Synthesis and Collaboration

January 8, 12:00 p.m. EDT

Presenter: Mahrukh 'Maya' Hasan

During the COVID-19 pandemic, did you struggle to transition to remote-based work? Are the tools and platforms you've used simply unable to meet your needs or those of your audiences? This workshop is for participants at any level, whether you are just starting out in your career as an evaluator or you've been at it for a very long time. If you're looking for resources and a practical tool to make the transition to highly collaborative, enjoyable, and above all effective virtual work, this workshop is for you.

During this workshop, you will learn how to use MURAL, a powerful digital workspace for visual collaboration. MURAL enables innovative teams to think and collaborate visually to solve important problems. People benefit from MURAL’s speed and ease of use in creating diagrams, which are popular in design thinking and agile methodologies, as well as from its tools for facilitating more impactful meetings and workshops. If you have been constrained by distance when co-creating a theory of change, using systems thinking diagrams or process maps, or carrying out any other visual-heavy practice, this workshop will make a remote-based work session, presentation, or workshop that much easier.

Attendees will learn:

  • How to set up MURAL
  • Virtual ice breakers
  • How to brainstorm on MURAL
  • Different types of remote work tools and their pros and cons
  • Good facilitation techniques: key principles
  • Good remote facilitation: key principles


Introduction to Infographics: Fundamentals and Tools for Developing Infographics

January 12, 12:00 p.m. EDT

Presenter: Stephanie Baird Wilkerson

Join us for a two-day workshop to learn how to use infographics to communicate evaluation findings in an effective and engaging way, and also create an infographic using your own evaluation information! Day one of this workshop introduces infographic basics, best practices, and practical tips for using low-cost tools to produce well-designed infographics for a variety of evaluation stakeholders. Participants will learn about the purpose, features, and use of infographics in evaluation as well as criteria for reviewing infographics. Participants will have the opportunity to view demonstrations of tools and steps for developing an infographic and will gain hands-on experience in creating an infographic.

Day two of this workshop will walk participants through the infographic development process for their own evaluation projects using 10 Steps to Creating an Infographic. The second day allows participants to use their own evaluation findings or information to craft a powerful message, identify evaluation data and visuals to convey that message, and select design elements to bring the message to life in an infographic. Participants must bring evaluation findings, information, and related data visualizations from a project, as well as laptops with Microsoft PowerPoint, for both days of the workshop. No experience with graphic design or infographics is required. Attendees will receive take-home handouts with resources for future use.

Attendees will learn:

  • The purpose, features, and use of infographics
  • Awareness of best practices in creating infographics
  • Awareness of and access to tools and resources for creating infographics
  • How to craft a powerful message for your infographic
  • How to identify accurate and compelling visualizations to convey your message
  • How to select design elements that promote clarity and flow so readers can understand your message
  • How to build your infographic using PowerPoint or a free online template

Utilization of a Culturally Responsive and Racial Equity Lens to Help Guide Strategic Engagement and Evaluation

January 21 & January 22, 12:00 p.m. EDT

Presenters:

  • Mindelyn Anderson, PhD, Mirror Group LLC
  • Kristine Andrews, PhD, Child Trends
  • Paul Elam, PhD, MPHI
  • Tracy Hilliard, PhD, MPHI
  • LaShaune Johnson, PhD, Estella Lucia Evaluation LLC

The field of evaluation is being challenged to use a process that considers both who is being evaluated and who is conducting the evaluation. MPHI has worked to develop useful frameworks, tools, and approaches that evaluators can consider to focus on the ways that race and culture might influence an evaluation process; this work has resulted in a framework for conducting evaluation using a culturally responsive and racial equity lens. This workshop focuses on the practical use of a racial equity lens when conducting evaluation. The framework argues that culture and race are important considerations when conducting an evaluation because critical and substantive nuances are often missed, ignored, and/or misinterpreted when an evaluator is not aware of the culture of those being evaluated. Participants will be provided with a Template for Analyzing Programs through a Culturally Responsive and Racial Equity Lens, designed to focus deliberately on an evaluation process that takes race, culture, equity, and community context into consideration. Presenters will also share a “How-to Process” focused on the cultural competencies of individuals conducting evaluations, how such competencies might be improved, and strategies for doing so. This “How-to Process” grew out of work on a self-assessment instrument for evaluators, is based primarily on the cultural-proficiencies literature, and relates specifically to components of the template. Participants will have the opportunity to engage in small-group exercises to apply the concepts contained in the template to real-world evaluation processes. Based on these experiences, participants will gain practical knowledge of the use of the lens.

Attendees will:

  • Define structural racism and distinguish it from other forms of racism
  • Introduce the rationale for using a race-conscious approach in evaluation
  • Begin the discussion on how an understanding of structural racism and the intentional inclusion of a racial equity lens transforms the composition of the evaluation team, evaluation process, methods and approaches, and rigor and quality
  • Share examples of lessons learned using a racial equity lens