Register

Evaluation 2022 Workshops

AEA is excited to present 60+ workshops inspired by Evaluation 2022: (re)shaping evaluation together! Register here to secure your spot.

Seats are limited for these workshops. 

New Orleans will be operating on Central Standard Time (CST) during the conference.


Two-Day Workshops

Full-Day Workshops

Half-Day Workshops


Two-Day Workshops

01: Principles-Focused Developmental Evaluation

November 7 & 8

9:00 a.m. - 4:00 p.m. 

Presenter: Michael Patton

Developmental evaluation (DE) guides innovative initiatives in complex dynamic environments. One special application of DE focuses on guiding complex adaptive action. The essentials of Principles-Focused Developmental Evaluation will be presented, examined, and applied. Participants will learn to use the GUIDE framework, an acronym specifying the criteria for high-quality principles: (G) guidance for action, (U) utility, (I) inspiration, (D) developmental adaptation, and (E) evaluability. The workshop will include special attention to the relevance and implications of principles-focused developmental evaluation in times of uncertainty and turbulence, as in the global pandemic and accelerating climate emergency. The workshop will cover the niche and nature of developmental evaluation (DE) and principles-focused evaluation (P-FE), and how they interconnect; five (5) purposes and applications of developmental evaluation principles at local and international levels; the particular challenges, strengths, and weaknesses of principles-focused developmental evaluation; and the essential principles and practices for designing and conducting principles-focused developmental evaluations. Concrete case examples will be presented and examined. Small group exercises will provide opportunities to apply concepts and methods. The workshop will include opportunities to work on participants' issues and examples.

02: Utilizing Culturally Responsive and Racially Equitable Evaluation

November 7 & 8

9:00 a.m. - 4:00 p.m. 

Presenters: Tracy Hilliard, PhD, Mindelyn Anderson, PhD, Kristine Andrews, PhD, Paul Elam, PhD, and LaShaune Johnson, PhD

The field of evaluation is being challenged to utilize processes that consider who is being evaluated and who is conducting the evaluation. MPHI has developed a framework for strategic engagement in service of a culturally responsive, racially equitable evaluation (CRREE).

Using this framework can transform evaluations. Critical and substantive nuances are often missed, ignored, or misinterpreted when an evaluator is unaware of the culture of those being evaluated. CRREE can be utilized to undo racism and oppression previously upheld by researchers, evaluators, institutions, and systems.

Full-Day Workshops

03: Qualitative Inquiry in Evaluation: An Introduction to Core Concepts and Data Collection Methods

November 8

9:00 a.m. - 4:00 p.m. 

Presenter: Jennifer Jewiss

This workshop introduces core concepts that provide an important foundation for the use of qualitative methods in evaluation. Three primary data collection methods are featured: interviewing, observation, and document review. Partner and small group activities are woven throughout the session to develop participants’ skills in gathering data via these methods. Group discussions explore essential ethical and methodological considerations, including the practice of reflexivity to examine one’s positionality and subjectivity and to foster cultural humility and inclusivity. In addition, the workshop presents a practitioner-friendly conceptual model that illuminates five processes for enhancing the quality of qualitative evaluations and can serve as a valuable touchstone for future evaluation efforts. (Please note that due to the inherent constraints of a six-hour, introductory workshop, data analysis is not covered in this session.)

04: Reshaping Our Role as Evaluators with Trauma-Informed Practices

November 8

9:00 a.m. - 4:00 p.m. 

Presenter: Martha Brown

Our personal and collective traumas, and how we respond to them, shape how we relate to ourselves and others, who are likely also living with unhealed trauma. By not recognizing the role trauma plays in our lives, we risk perpetuating harm through the evaluation cycle. This workshop will guide us through the journey of realizing the impact of trauma, recognizing its symptoms, avoiding re-traumatization, and promoting healing and resiliency in ourselves and others. Guided by SAMHSA's Principles of Trauma-Informed Care, we will co-create self-care and evaluation practices that heal, not harm; that pay attention to cultural, gender, and historical issues; that empower participants to make choices throughout the evaluation process; and that create safe spaces to do our work.

05: Reshaping Our Positionality and Lens as a Commitment to Transformational and Actionable Culturally Responsive Evaluation Practices with Latinx Communities

November 8

9:00 a.m. - 4:00 p.m. 

Presenter: Lisa Aponte-Soto

Latinos account for nearly 19% of the total U.S. population (US Census, 2021). Enacting culturally responsive evaluation (CRE) with diverse multinational, racial, and ethnic Latinx communities demands highly skilled evaluators who can employ evaluation approaches that align with and support diverse perspectives in all evaluation phases. This workshop will focus on translating contemporary cultural responsiveness into actionable evaluation practices for and with Latinx communities. The facilitator will highlight synthesized literature and draw on indigenous praxis-oriented perspectives. This workshop is structured in three main components. Part I highlights social justice evaluation theories and foundational principles of CRE with an emphasis on Latino Critical Race Theory (LatCrit). Part II focuses on reshaping our equity lens through self-reflection exercises to assess our positionality as evaluators and foster self-cultivation as agents of culturally responsive evaluation (Symonette, 2008). Part III builds on this paradigmatic framing to apply the nine-step CRE process (Frierson et al., 2010) in action with Latinx communities. This component will discuss the cultural values and identities unique to Latinx communities, along with inclusive participatory approaches. The facilitator will also draw from a series of evaluation projects and case studies with Latinx communities to illustrate CRE in practice. Participants should come prepared to 'dig deep' and share their experiences with Latinx-focused evaluation planning and practice.

06: Concepts, Design Strategies, and Methods for Evaluating Advocacy and Policy Change Initiatives

November 8

9:00 a.m. - 4:00 p.m. 

Presenters: Annette Gardner and Jared Raynor

Several factors have fueled the need for skilled evaluators who can design appropriate advocacy and policy change (APC) evaluations to meet diverse stakeholder needs: increased foundation interest in supporting APC initiatives to achieve transformational, systems-level change; evaluation of democracy-building initiatives worldwide; and diffusion of advocacy capacity beyond the traditional advocacy community (such as service providers). Evaluators have met these needs with great success, building a new field of evaluation practice, adapting and creating evaluation concepts and methods, and shaping advocate, funder, and evaluator thinking on advocacy and policy change in all its diverse manifestations. The field of APC evaluation has matured and now has a rich repository of guides, instruments, and a book to support evaluation practice. But the pandemic and the focus on systemic racism have changed the landscape for advocates and their advocacy, such as by shifting advocacy tactics to online platforms. Evaluators must similarly adapt their approaches to this new reality, anticipating and navigating change and building advocate evaluation capacity.

The aim of this workshop is to expand evaluator capacity to design tailored advocacy and policy change evaluations under diverse and complex scenarios. The content of this workshop is guided by evaluation research findings on evaluator practice, which are described in the comprehensive book, Advocacy and Policy Change Evaluation: Theory and Practice (Gardner and Brindis). Participants will also explore options for addressing the challenges associated with evaluation practice, such as the complex, moving-target context in which advocacy activities occur, and the challenge of attributing outcomes and identifying causal factors.

08: Equity-centered Transformative Research: Transforming the Researcher, the Research Content, and the Practice of Researching

November 8

9:00 a.m. - 4:00 p.m. 

Presenters: Jay Feldman, Nitya Venkateswaran, and Daniela Pineda

The Transformative Research Unit for Equity (TRUE) at RTI International has developed an equity-centered transformative research methodology framework outlining how distinct elements of the research enterprise can be transformed to ensure research and evaluation are in the service of equity. In this session we share that framework and provide training within each of its three core areas. Equity-centered transformative research requires 1) transformation of the researcher, 2) an expansion of the research content, and 3) a shift in the process of conducting research. To that end, in this workshop presenters provide training for:

  • Researcher: to build critical consciousness to recognize and interrupt bias and assumptions
  • Research content: to increase knowledge of the current and historical systemic factors that contribute to inequity in order to focus research on systems, not individuals
  • Researching: to develop competencies and skills in culturally responsive and equity-focused evaluation practices

CANCELLED: 09: MEL for Community-led Projects: Principles and Tools

November 8

9:00 a.m. - 4:00 p.m. 

Presenters: Gunjan Veda, Holta Trandafili, Matthew Cruse, and Molly Wright

As the decolonization and locally-led development agendas gain traction globally, there is an increased focus on the need for Community-led Monitoring and Evaluation (ColMEL). Two recent studies – a landscape analysis of 173 CLD programs from 65 countries and a rapid realist review of 56 programs[1] – by the Movement for Community-led Development (MCLD), a global consortium of 1,500+ local civil society organizations and international NGOs, clearly demonstrate that in order to be truly community-led, organizations need to rethink the way they conduct monitoring and evaluation. If communities are leading their own development, they need to know how they are doing, what solutions are and are not working, and how they can improve them. This means that communities and community-based organizations have to be part of all stages of the MEL cycle, from deciding what programs should be evaluated for, how, and by whom, through data analysis, validation, and decisions on how to use the results. This requires a radical shift in the way we think about evaluations – they should not be an instrument to measure "human worth, motivation or achievement,"[2] but rather one to support learning and continuous improvement.

This interactive workshop will focus on the principles of community-led development and community-led MEL. It will share tips and two tools with evaluators and evaluation commissioners for undertaking it. The Participatory CLD Assessment Tool is currently being used by organizations of all sizes across the globe to strengthen their practice of CLD (all the way from design to M&E). The Quality Appraisal Tool for CLD evaluations is a simple Excel-based tool developed by a multi-organizational team of MEL practitioners to appraise an evaluation report in terms of both its rigor and its congruence with the principles of community-led development.

10: (re)Shaping Systems Thinking for Evaluating Complex Interventions

November 8

9:00 a.m. - 4:00 p.m. 

Presenter: Jessica Renger

This workshop will introduce participants to System Evaluation Theory as a framework through which key systems thinking principles can be simplified and applied to meet the demands of complex evaluations. The workshop will draw on a series of published works surrounding systems thinking and systems evaluation (Renger, 2015; Renger, 2016; Renger et al., 2017; Renger et al., 2019; Renger et al., in press; Renger et al., 2020). In the first half of the workshop, the presenter will discuss strategies for determining when a systems evaluation approach may be needed and will introduce participants to the key systems properties of interdependence and emergence, as well as the systems principles of feedback loops, cascading failures, and reflex arcs. In the second half of the workshop, the presenter will explain how Systems Evaluation Theory integrates these systems properties and principles to create a comprehensive framework through which complex interventions can be evaluated. After the workshop, participants will be able to recognize and respond to systems issues in evaluation practice, understand how to adapt their existing program evaluation knowledge to begin evaluating systems, and have access to a practical systems thinking toolkit they can draw upon to feel more confident and capable in their future evaluation endeavors.

11: Social Impact Measurement Using Agile Project Management Approach

November 8

9:00 a.m. - 4:00 p.m. 

Presenter: Marcel Chiranov

Impact measurement seems to have entered the organizational culture of entities implementing projects or programs quite late. Understanding the impact of a program can improve its quality and effectiveness, and thus it is of major interest to many organizations and donors. In a period when budgets are stretched and more and more people are interested in doing "more with less," being able to design and implement projects with tangible real-life impact can be a substantial competitive advantage for any organization.

Still, some question whether impact measurement should be on the evaluators' agenda or on the managers' agenda, since managers are responsible for project planning and implementation. Without trying to resolve this dilemma, the presenter will demonstrate a way to measure the impact of an intervention using an Agile project management approach. It is rare for impact to be fully understood, or correctly anticipated, in the program-planning phase. Quite often this learning builds up during implementation, based on multiple variables (stakeholder interaction and learning, positive developments in the project's environment, synergy with other economic, social, or policy initiatives, etc.).

Being able to understand when, and how, to approach impact measurement can make the difference between a successful project and a less successful one. In this workshop the presenter will treat "impact measurement" as covering the totality of planned and/or unplanned outputs and/or outcomes to which the intervention contributed or can be attributed. In other words, impact measurement represents the real-life changes caused by the project's implementation. Projects act in complex environments, with many social, economic, security, or policy variables. Gaining enough understanding to distinguish between a project's attribution and contribution would require significant resources, which generally are not available or are better used to other ends. For this reason, we prefer to treat attribution and contribution equally.

12: Cost-Inclusive Evaluation: You Can Do It, I'm Doing It, and Here's How

November 8

9:00 a.m. - 4:00 p.m. 

Presenter: Brian Yates

This workshop teaches cost-inclusive evaluation (CIE) with examples from the presenter's 48 years of experience conducting CIE in emergency assistance programs (EAPs) for human rights defenders in international settings, as well as for consumer-centered programs for suicide prevention and mental health services. Illustrations of problems and solutions in CIE are drawn from the presenter's work in EAPs and in treatment and prevention programs for drug abuse, depression, weight control, and adolescent behavioral health. Qualitative and quantitative methods covered include evaluation of costs from multiple interest group perspectives, cost-effectiveness, cost-benefit, and cost-utility. Critiques are provided for Social Return On Investment as well as traditional economic approaches to evaluation. The workshop includes examples of the potential for CIE to reveal hidden inequities, to inadvertently maintain or exacerbate those inequities, and to reduce and remove those inequities. After each topic is taught and illustrated, workshop participants apply their new knowledge to a cost-inclusive evaluation of their own choosing.

13: Deepening Your Participatory Practices

November 8

9:00 a.m. - 4:00 p.m. 

Presenters: Carolyn Fisher, Sofia Alejandra Ladner, and Julia Curbera

Participatory practices are important tools for evaluators working to increase the equity and social justice practices of their work. However, the impact of participatory processes can be limited when they get stuck at a shallow level. In this one-day workshop, facilitators from the Institute for Community Health will lead participants in an exploration of how to go beyond surface-level consultation by presenting on techniques and best practices, leading group discussions and reflections, and engaging with participants in a group project. Topics to be covered include: techniques for co-designing evaluations with folks without previous evaluation background; building deep relationships in the context of an evaluation advisory committee; working in multilingual contexts; making projects more welcoming for people with a wide variety of backgrounds, education levels, cognitive styles, and lived experiences; and making the case for a participatory process to evaluation commissioners. Small groups will conduct a project in which they lay out detailed plans for a participatory evaluation, respond to unexpected situations arising from the co-design process, and make the case for a participatory process to evaluation commissioners.

14: Context and Challenge: A New Stance for Evaluators

November 8

9:00 a.m. - 4:00 p.m. 

Presenter: Gail Vallance Barrington

As evaluators, we have more power and responsibility than we realize, but are we ready to be the bridge we should be between programs and their contexts? Do we recognize our biases, racial frames, and historical beliefs? Do we understand how our power and privilege impact our work? Do we view evaluation methods as absolutes or question their foundational assumptions for their impact on social justice? Do we understand the organizations with which we work and how they may constrain social justice? Do we center our programs within the complex systems in which they are embedded and include the broader dimensions of sustainable development and global equity? Do we serve "the greater good?" Should we? Participants will engage in a problem-solving workshop about our stance at the nexus of evaluation practice, identity, global and natural issues, and organizations that fail to respond to the need for urgent change.

It's time to talk about the changes we need to make as evaluation professionals. Large and small group discussions, brainstorming, and tools such as self-assessment, interest-power matrices, systems modelling, dialectical methods, and critical systems heuristics will be employed to understand and untangle some of these wicked contextual problems. Additional resources will be provided for future use. Participants will have the opportunity to develop personal action plans and make a recommendation to the AEA for the next wise action. All workshop activities will be based on the values of self-reflection, respect, cultural humility, innovation, and personal and professional growth.

16: Making CRE(E) Work: From Design to Dissemination

November 8

9:00 a.m. - 4:00 p.m. 

Presenters: Kimberly N. Harris, Rachel Powell, Jochebad Gayles, Tamarah Moss, and Jennifer Garcia

Inspired by chapters in the forthcoming book Culturally Responsive and Equitable Evaluation: Visions and Voices of Emerging Scholars (Editors: A. Christson Adedoyin, Ndidiamaka Amutah-Onukagha, Chandria D. Jones), these Culturally Responsive and Equitable Evaluators have designed a day-long workshop to teach evaluators how to integrate culturally responsive evaluation methods into their practice. Culturally Responsive and Equitable Evaluation (CRE(E)) is a philosophical and methodological approach that transparently centers evaluation in culture. As a result, CRE(E) methodology increases validity and yields outcomes that better align with what community stakeholders want and truly need. Consequently, CRE(E) tends to amplify the stakeholders who are traditionally the most marginalized, the most ignored, and the most impacted by policies, practices, and programs they seldom create. While making a case for CRE(E) has become relatively easier, amid a backdrop of racial injustice that spans a spectrum of lethality (from persistent and chronic disparities in food access to publicized yet unpunished racial violence against Black and Indigenous bodies), making CRE(E) work in practice is often met with a range of constraints offered as reasons why doing CRE(E) is difficult. These reasons span the whole evaluation engagement and reflect a myriad of resource insufficiencies (e.g., awareness, budget, time in the design process, access to community, and funder appetite).

This workshop will address these challenges with examples of CRE(E) implementation at each step of the evaluation engagement, from Design to Dissemination. Attendees will think through their CRE(E) implementation challenges with each other and the workshop facilitators. In addition, workshop participants will learn systematic strategies to help them integrate CRE(E) into an evaluation engagement more consistently. Participants will learn how to identify and leverage opportunities to more comprehensively integrate CRE(E) into their practice. Upon completion of the one-day workshop, participants will be part of a community of practice actively committed to thought partnership around CRE(E) practice.

17: Systemic Design Thinking for Evaluation Practice

November 8

9:00 a.m. - 4:00 p.m. 

Presenter: Janice Noga

The theme for Evaluation 2022, (Re)shaping Evaluation Together, connects well to the topic of this workshop. In terms of social innovation, we are at a juncture for both program design and evaluation that has us questioning historical conventions around social change as well as how to evaluate programs designed to drive such change. Numerous programs are implemented each year, yet conditions keep worsening. It's a little like using a pebble to plug a breach in a dam. What opportunities have been missed when designing change efforts to meet complex social challenges? What does evaluation tell us? What does it not tell us? Social innovators and program designers are beginning to recognize that conventional thinking for program design cannot properly address issues of complexity, wickedness, social justice, inclusion, and oppression. Many are starting to embrace an approach to thinking about program design that utilizes systems principles within a design philosophy – this is systemic design thinking.

However, if program design is shifting to embrace systems notions of complexity, interconnectedness, non-linearity, and the importance of context, then so must evaluation. Systemic design thinking for evaluation needs to look beyond models and methods of prediction, linearity, and control to ones that examine how social innovations, both as part of a larger system and as systems in their own right, develop as organic responses to pressing needs. Rather than predicting outcomes, evaluation can serve social innovation by helping stakeholders understand what is happening and why, where change opportunities exist, and how to make adaptive adjustments as the innovation evolves. That is the purpose of this workshop: to help evaluators better understand the challenges facing change agents and innovators, to learn the basics of systemic design thinking, to reflect on how and why evaluation will need to change in order to keep pace, and to practice a little on a concrete example of social innovation. Based on research and their own experiences in instructional design and teaching, the presenter believes adults learn best when they are encouraged to put learning into a personal context, question their own practical understanding and beliefs, connect new learning to their personal and professional lives, and identify how such learning can be made useful and relevant.

18: Consulting 101: An Introductory Workshop for Evaluators Who Want to Start Consulting Practices & Consulting 201: Independent Consulting Organizational Principles

November 8

9:00 a.m. - 4:00 p.m. 

Presenters: Matt Feldmann and Laura Keene

This workshop provides the key understandings needed to initiate an independent consulting practice. It reviews key consulting attributes and niche identification, and challenges participants to think through a marketing approach. Topics include: (1) The Personal Factor; introductions; the consulting landscape; Do I have what it takes?; (2) Building a foundation for a business; clients, services, and your competitive edge; niche identification; marketing; (3) Questions and next steps; Parking Lot FAQs and participant questions; Networking Karma; wrap-up and planning next steps.

Next comes part 2 of this classic workshop, which has been offered consistently at AEA since 1993 and is updated each year based on participant feedback and our experiences in the field. The workshop has evolved over time and will be greatly enhanced by materials and information from the Independent Consulting Chats, which Matt initiated in May 2020 during the pandemic and which meet 48 times per year.

19: Digging Up the Seeds of White Supremacy in Evaluation Practice

November 8

9:00 a.m. - 4:00 p.m. 

Presenter: Rita Sinorita Fierro

Based on the book Digging Up the Seeds of White Supremacy, published in May 2022, this workshop invites participants into deep self-reflection about our own internalized white supremacy. The workshop combines personal journeys, history, systems analysis, and deep personal reflection. In this workshop, participants are inspired to undo internalized white supremacy to bring forth our own freedom and transform our society as a whole – just as writing the book inspired the presenter. The workshop combines the systems analysis in the book with processes from the presenter's coaching practice, applying both the micro and macro levels to evaluation practice.

20: Creative Evaluation & Engagement: Theory and Practice

November 8

9:00 a.m. - 4:00 p.m. 

Presenters: Nora Murphy Johnson, A. Rafael Johnson, and Agustín Nancy

Part 1: Theory and Essential Elements of Creative Evaluation & Engagement (CE&E). In this workshop, participants will learn about the theory and essential qualities of CE&E, and experience and view examples of how these practices have functioned with clients. This workshop will include overviews of developmental, principles-focused, and arts-based evaluation and engagement as they relate to CE&E. Discussion will be guided by the presenter's 2022 book, Creative Evaluation and Engagement Essentials.

Part 2: Application of the Phases of Creative Evaluation & Engagement (CE&E). In this session, participants will learn about the four phases of CE&E: Align, Learn, Adapt, and Embody. Participants will be asked to examine a current evaluation project/challenge through the CE&E lens as the essential elements of each phase are broken down and applied in a real-life context.

21: Beyond the Basics: Foundations and Applications of Qualitative Study Design, Analysis, and Reporting

November 8

9:00 a.m. - 4:00 p.m. 

Presenters: Karen Schifferdecker, Rebecca Butcher and Sharon O'Connor

In this interactive workshop, presenters will provide an introduction and overview of qualitative methods in evaluation before jumping into hands-on practice. The morning will include an overview and review of guidelines and standards; study designs (theoretical frameworks, budgeting, timelines); and considerations of the evaluator's role and potential biases. Presenters will use a case study (based on a real project) throughout the workshop. After a short morning break, participants will get substantial hands-on practice time with data coding, analysis, and interpretation. Following lunch, participants will convene again in small groups for continued work with coded data (provided by the presenters). Presenters will close the day with examples of qualitative data reporting and dissemination strategies, seeking input from everyone on the types of projects and audiences they report to in order to ensure the relevance of the discussion. Participants will leave with new knowledge and skills to effectively integrate qualitative methods in their evaluation work.

22: Evaluation Activation + Career Development in Evaluation (Register for Free Here)

November 8

9:00 a.m. - 4:00 p.m. 

Presenters: Maria Montenegro and Amanda Mottershead

This is a free training exclusive to Young Emerging Evaluators (YEEs), who are under 35 years of age or have less than five (5) years of professional work experience in evaluation. This session is only available in person at our annual conference in New Orleans.

This free, interactive workshop is designed to help young and emerging evaluators accelerate their careers. The workshop will discuss the realities of a career in evaluation and what it takes to become an evaluator. It will also define evaluation competencies and their role in evaluators' career planning. Participants will learn about the evaluation landscape and available career pathways, and how to identify professional development opportunities. Through interactive activities, participants will draft a career plan and build practical skills for developing their careers. Participants will leave the training with clear next steps to advance their career journeys.

This training was developed jointly by the EvalYouth Global Network, the UNFPA Evaluation Office, the P2p+ initiative, and the Global Evaluation Initiative to promote career development among young and emerging evaluators. It was developed with input from youth representatives from across the world. EvalYouth North America adapted the training to ensure the content is relevant to young and emerging evaluators in North America.

CANCELLED: 23: Working with Micro-narratives for Monitoring, Evaluation, Learning and Decision-making in Complex Programs: An Introduction to the Practice of SenseMaker

Space is limited to 30 participants for this workshop.

November 8

9:00 a.m. - 4:00 p.m. 

Presenter: Steff Deprez

In this interactive workshop, participants will be introduced to the principles and practice of the SenseMaker methodology. Participants will start off with the theory and principles of SenseMaker and the use of micro-narratives for monitoring and evaluation of complex programmes. We will take sufficient time for each step of the SenseMaker process (design, collection, analysis, and sensemaking). There will be a variety of examples and cases where SenseMaker was used, such as for strategic planning, programme formulation, real-time monitoring, participatory M&E systems, and (impact) evaluations. This workshop will be a mix of presentations, group work, Q&A, exercises, and guided discussions. Participants will also be able to experience the approach for themselves.

25: Better Being for Clearer Seeing: An Evaluator Lens “Truth-Up” Promoting Transformation Towards Anti-Racist, Equitable and Liberatory Evaluators and Evaluations

November 8

9:00 a.m. - 4:00 p.m. 

Presenters: Geri Peak and Chloe Greene

In this full-day, interactive workshop, participants will take the morning session to explore how social conditioning informs our unexamined beliefs about race and how racialization subverts our presumptions of bias, rigor and “confidence” in our approaches. We will review the construction of race and how ideas about racial distinction and hierarchy link to beliefs, suppression, violence and oppression. Participants will break into groups at their tables or into triads to explore their own reactions and experiences. After the break, in the second half of the morning we will look at ourselves and consider how personal growth, reflection and transformation in response to deeper acknowledgement of our racialized reality polishes our lens and mirror, sharpens our insight and makes our work more meaningful.

During the workshop, we will look into recent literature and frameworks that explore the impact of epistemicide and decoloniality, and that connect and expand the value of diverse ways of knowing. We will also refocus our lens from our trajectory of personal transformation towards the tools and processes we use in support of transforming society, examining how culturally responsive, equitable, and liberatory ideas inform the excellence of all of our evaluation practices, holistically. We will close out our time together applying some of the frameworks. Participants will have the opportunity to examine the tools they use through an equitable/liberatory lens.

CANCELLED: 15: Designed for Justice – Embodying Community Feedback in Criminal Legal System Data & Accountability

November 9

8:00 a.m. - 2:15 p.m. 

Presenter: William Faulkner

In this session we will take participants through a firsthand experience of the New Orleans Criminal Legal System (CLS), demonstrating the complexities of how data flows through a local CLS-focused non-profit and providing a forum to shape this data flow for culturally responsive, equitable community involvement. New Orleans is an epicenter of mass incarceration, with one of the largest incarcerated populations in the most incarcerated state (Louisiana) in the most incarcerated country (the USA) in the world. Among the pillars upholding this status quo are poor public awareness of trends in the CLS, low rates of interest and participation in CLS reform efforts, and, specifically for our purposes, underutilization of opportunities to enhance transparency and hold CLS stakeholders accountable.

CourtWatch NOLA (CourtWatch), the local organization around which this workshop will center, uses volunteers and basic, off-the-shelf observation methods to collect, analyze, and communicate data on trends in the ever-evolving tangle of the New Orleans court system. The organization brings deep local experience, including many personal relationships with CLS actors. CourtWatch is currently in the early stages of a multi-year effort to reimagine both the technical and human sides of how evaluative conclusions and learning from its data can feed back to stakeholders (especially the court watchers themselves). This workshop centers on experiential learning, putting participants face-to-face with CLS actors and the reality of the court system, and garnering meaningful contributions from participants to help overcome barriers CourtWatch faces in incorporating the larger community of stakeholders into data collection, analysis, and communication. The participatory activities will proceed step-by-step through the organization's data flow.

Half-Day Workshops

26: Designing Quality Survey Questions

November 9

8:00 a.m. - 10:45 a.m.

Presenters: Sheila Robinson and Kimberly Leonard

Facilitators will open this workshop by making a case for an intentional, rigorous, and respondent-centered survey design process, demonstrating three powerful examples of how minor changes in survey item wording resulted in dramatic differences in responses. Participants will then examine a set of survey questions and attempt to identify the various problems embedded in the questions. Facilitators will present real-world examples of how these problems can slip through the cracks, even for seasoned researchers/evaluators and multi-million-dollar companies. They will then share the cognitive aspects of survey design with a demonstration of the demands survey researchers put on respondents with the types of questions we expect them to answer. Participants will engage in a peer review exercise with their own surveys (or facilitator-provided surveys) using the facilitators' checklist for quality survey questions, published in their text, Designing Quality Survey Questions.

28: Paradise by the Dashboard Light: A Beginner’s Crash Course in Power BI

November 9

8:00 a.m. - 10:45 a.m.

Presenter: Joseph Travers

Data Dashboards are hot these days. Everyone wants them. They hold tons of data, are dynamic and interactive, and they get all the attention. They’re like the popular kids at the high school dance.

However, just like the popular kids, dashboards can seem inaccessible, both in terms of learning how to use dashboard software and in terms of making dashboards approachable for your stakeholders. Dashboards hold so much data and so many visuals that it's often hard for stakeholders to know what to pay attention to and to pull key insights from them easily.

Microsoft's Power BI is one of the leading dashboard software packages today, but it can be difficult to learn when you're first starting out. This workshop takes you from complete novice to knowing how to make a simple dashboard: connecting to data, creating simple visuals, and assembling a dashboard that immediately answers the questions your stakeholders have about the data.

Participants must have a laptop with a recent version of Power BI Desktop installed. Power BI is a Windows-only program. It is part of an Office 365 organizational license, but it can also be downloaded for free. Workshop data will be provided to participants before the conference (if possible) or at the workshop itself.
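
The workshop itself is point-and-click, but for a flavor of the data-shaping step that precedes dashboard building, here is a minimal, hypothetical sketch in Python (a scripting option Power BI Desktop supports as a data source). The file name and columns are invented for illustration and are not part of the workshop materials.

    # Hypothetical example: aggregate raw program data to the shape a
    # dashboard page needs. Power BI Desktop can ingest the resulting
    # `summary` table via Get Data > Python script. The CSV and its
    # columns are invented.
    import pandas as pd

    df = pd.read_csv("program_enrollment.csv")  # hypothetical raw export

    summary = df.groupby(["site", "quarter"], as_index=False).agg(
        participants=("participant_id", "nunique"),  # unique people served
        completion_rate=("completed", "mean"),       # share who completed
    )

    # In Power BI, `summary` would feed bar/line visuals and slicers that
    # answer stakeholders' questions at a glance.
    print(summary.head())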

30: LGBTQ+ 101: Fostering Cultural Responsiveness and LGBTQ+ Diversity in Evaluation Practice

November 9

8:00 a.m. - 10:45 a.m.

Presenters: Josh Boeger and Kye Adams with support from Gregory Phillips II

In order to (re)shape evaluation together, we need to create a space to re-imagine the concepts of sex, sexual orientation, and gender identities (SSOGI), and to examine how current evaluation practice has promoted limiting and exclusionary categories that fail to capture the complexities and diversity of lesbian, gay, bisexual, transgender, queer, and other sexual and gender minority (LGBTQ+) individuals and communities. This introductory-level workshop will provide a theoretical, practical, justice-focused approach to LGBTQ+ evaluation, challenging attendees to broaden the ways they think of capturing and categorizing data. Recent discourse within the field of evaluation urges evaluators to examine their positions and the power they have to transform the system they work within to build towards a more equitable future. In her 2017 Culturally Responsive Evaluation and Assessment Conference keynote, Dr. Robin Miller called for the field to do better when it came to accounting for sex, sexual orientation, and gender identity within evaluation practice. To do so, it is crucial that evaluators have the language, understanding, and strategies to be inclusive of the LGBTQ+ community in their work.

This workshop will highlight the importance of LGBTQ+ cultural responsiveness and community inclusion both within evaluations tailored to this population, as well as in more general evaluation work. This work draws on a variety of theoretical and practical approaches to evaluation, including Culturally Responsive Evaluation, Empowerment Evaluation, and Systems Evaluation. After laying the groundwork for why this work is necessary, the facilitators will use activities and discussion sections to clearly define the process and steps of what inclusive and culturally responsive language entails when working with the LGBTQ+ community. The remainder of the session will focus on an interactive, collaborative approach to teaching attendees how to apply this mindset and information to evaluation design, data collection and interpretation, and community engagement, and will provide guidance for attendees to help strategize how they will implement this knowledge in their work moving forward.

This workshop is designed for individuals new to working with the LGBTQ+ community, new to conducting evaluations responsive to the needs of LGBTQ+ populations, interested in learning the basics of LGBTQ+ data collection and interpretation, or interested in participating in a collaborative environment designed to advance LGBTQ+ cultural responsiveness within the field of evaluation as a whole.

31: An Intro to Qualitative Comparative Analysis: An Approach for Investigating Nuance and Complexity

November 9

8:00 a.m. - 10:45 a.m.

Presenter: Jason Torres Altman

Do you strive for a cutting-edge evaluation strategy that promotes utility in formative and developmental evaluation, in a way that some traditional methods seem less suited for? Our values drive us to engage stakeholders through our collection methods to elevate the voices of those who are often not heard. We also attempt to make visible those who go unseen; however, many traditional analysis methods reduce them to means or regress them to averages, completely ignoring the important nuance we know exists. Qualitative Comparative Analysis may be the method that comes to the rescue, purposefully investigating complexity by asking in what ways, and to what extent, multiple pathways influenced desired implementation across cases. This workshop will provide food for thought for participants but will also challenge intermediate and advanced evaluative thinkers. Please join us for a hands-on and engaging walk through this method, and leave with all of the tools you need to activate and accentuate a more realist perspective and meaningful utility through evaluation.
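
For readers new to the method, here is a minimal, hypothetical sketch of the crisp-set QCA idea in Python: cases are scored on binary conditions, grouped into configurations, and each configuration's consistency with the outcome is checked. The case names, conditions, and threshold are invented for illustration; dedicated QCA software handles the full method, including minimization of the resulting pathways.

    # Hypothetical crisp-set QCA-style truth table. Conditions (FUNDING,
    # TRAINING, LEADERSHIP) and the outcome (IMPLEMENTED) are invented.
    import pandas as pd

    cases = pd.DataFrame({
        "case":        ["A", "B", "C", "D", "E", "F"],
        "FUNDING":     [1, 1, 0, 1, 0, 1],
        "TRAINING":    [1, 0, 1, 1, 0, 0],
        "LEADERSHIP":  [1, 1, 1, 0, 0, 1],
        "IMPLEMENTED": [1, 1, 1, 0, 0, 1],
    })

    # Truth table: one row per configuration of conditions, with the share
    # of its cases showing the outcome ("consistency") and the case count.
    truth_table = (
        cases.groupby(["FUNDING", "TRAINING", "LEADERSHIP"])["IMPLEMENTED"]
        .agg(consistency="mean", n="size")
        .reset_index()
    )

    # Configurations clearing a chosen consistency threshold are treated
    # as candidate pathways to implementation.
    print(truth_table[truth_table["consistency"] >= 0.8])

Unlike averaging across all cases, the truth table preserves each distinct combination of conditions, which is how QCA keeps the nuance the description above refers to.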

32: Consulting 301: Taking Your Independent Consulting Practice to the Next Level

November 9

8:00 a.m. - 10:45 a.m.

Presenters: Matt Feldmann and Laura Keene

This intermediate consulting workshop considers practical consulting firm topics including insurance, contracting with professionals, organizational incorporation, contracting, and working with subcontractors. Laura Keene and Matt Feldmann manage successful consulting practices (Keene Insights and Goshen Education Consulting, respectively) and have used the concepts from this intermediate workshop to strengthen their practices. In addition to providing rich information and opportunities to engage in structured activities around these business development concepts, this session will include valuable samples, worksheets, and insider tips. Is it time to get serious about your independent consulting practice? This lively workshop will help you find out and will provide what you need to tackle the tasks ahead. 

33: Facilitation Matters: Techniques for Evaluators

November 9

8:00 a.m. - 10:45 a.m.

Presenter: Rachel Scott

As evaluators we are called to “act with urgency to help transform the systems, policies, and practice that have created today’s challenges, and help build toward a more equitable, sustainable future”. However, sometimes we can be left wondering how to start this process. With so many voices that need to be heard, how can we be mindful to intentionally value and respect diverse perspectives in doing the work of evaluation? How do we encourage stakeholders to become active participants in evaluation?

This professional development workshop seeks to answer those questions by growing evaluator skill in facilitation techniques. In this workshop, the presenter will share a set of facilitation protocols designed to help evaluators and organizations collect data, interpret data, and contemplate how to use data to make positive change. Session attendees will actively participate in a series of facilitation protocols and leave with new tools to encourage dialogue and collect and use data.

34: Harnessing the Power of Power BI to Support Evaluations and Rapid Data Use

November 9

8:00 a.m. - 10:45 a.m.

Presenters: Gizelle Gopez, Cathy Lesesne, Raj Parihar, Jenica Reed, and Michele Sadler

Power BI has recently gained traction across a variety of federal evaluation clients seeking to transform their data into actionable insights. Power BI can combine data from multiple sources, including readily available Microsoft Excel files, and is user-friendly for first-time users seeking to streamline data collection and data visualization. This interactive workshop will introduce evaluators to a range of ways to use Power BI as a key part of evaluation, with a special emphasis on using this tool for rapid evaluation, where quickly sharing and visualizing data can influence program delivery and optimize data-informed decision-making.

Incorporating evaluative thinking in the design, implementation, and use of Power BI dashboards, this workshop will: 1) provide an in-depth demonstration of the Power BI suite of tools and their functionality; 2) present three use case examples where Power BI has supported evaluation efforts; 3) give participants the opportunity to practice using Power BI (free version) by applying the concepts from each of the case examples; and 4) engage participants in dialogue around effective use of Power BI in their program evaluations. Evaluation concepts are interwoven throughout the workshop, such as defining the purpose, users, and use of the dashboard; defining evaluation questions and data needs; and identifying the visualizations that work best for answering evaluation questions. The presenters will use a combination of interactive group activities such as practice cases, group discussions, and peer-to-peer learning as a method for deeper discussion about connecting evaluation data to off-the-shelf tools like Power BI to further rapid data use, data monitoring, and continuous quality improvement.

35: Putting CREE into Practice: Elevating, Listening and Incorporating the Voices of Differently Abled Stakeholders

November 9

8:00 a.m. - 10:45 a.m.

Presenters: Kimberly Harris, Phillip Eaglin, and Delmar Wilson

A culturally responsive evaluation (CRE) practice centers stakeholders' voices, acknowledges the importance of lived experiences, and seeks to intentionally and comprehensively integrate both into each phase of the evaluation engagement. The structures and policies that create context and consequences play out at every level of one's lived experience, from the individual to the societal. For students living the intersectional reality of differently-ablism and non-whiteness, career identity formation is far too infrequently contemplated. Considering that fewer than 10% of people in STEM are reported as having a disability (NSF, 2019), people with intersecting "otherized" identities are invisible, with neither their capabilities nor their creative wisdom acknowledged or appreciated. Moreover, while differently abled students account for nearly 14% of the school-age population, they make up only between 9% and 10% of students pursuing undergraduate STEM degrees. These proportions are roughly halved among differently abled students pursuing graduate STEM degrees (5%) (Ghadiri et al., 2018).

This half-day workshop will explore the challenges and strategies for designing and conducting culturally responsive and equitable evaluations of education programs designed exclusively for differently-abled students. What are some of the crucial considerations when designing a culturally responsive evaluation of education programs for differently abled learners? How do we co-create wisdom and meaning making? How can the evaluation process be used to elevate the voices of student learners? How do teachers and program developers partner with students as allies and mentors? What can we learn from the challenges along the way? We will utilize both a case-study and work-group approach to learn some best practices and co-create potential strategies that can be applied in the field. This workshop will be facilitated by a team comprising the PI of an NSF-funded initiative, Computer Science for All (https://www.changeexpectations.org/csforallrpp), a program educator, and the program evaluator. We will present our experience as a case study, chronicling our journey and assessing its alignment with the CRE(E) methodological approach.

37: Most Significant Change (MSC) Technique Today

November 9

8:00 a.m. - 10:45 a.m.

Presenter: Dr. Jess Dart

Most Significant Change (MSC) technique is a story-based and participatory approach for unearthing and making sense of expected and unexpected instances of impact. The User Guide, authored by Dr. Rick Davies and Dr. Jess Dart, was released in 2005. Since then, it has been translated into 14 languages and is used across the world. Despite launching more than 15 years ago, MSC remains an invaluable tool for evaluators and change makers looking to gain greater insight into the impact of their initiatives. It is also finding new relevance in today's world as a means of unpacking outcomes in even highly challenging and uncertain environments. MSC deals well with emergence and is often cited as part of the evaluator's toolkit when working with complexity. It gives voice to people with lived experience. It can be used by adequately prepared novices and can be a useful tool to drive self-determination. It is also evolving in the digital context and can form a key part of a sophisticated organisational impact measurement framework.

This workshop will provide a basic introduction to this practical and powerful tool, taking participants through the core steps. We will then explore the evolution of Most Significant Change and how this tool can be used to address some of the challenges presented by complexity and uncertainty, and to support genuine participation and voice in today's context.

38: Transformative Mixed Methods: Supporting Equity and Justice

November 9

8:00 a.m. - 10:45 a.m.

Presenter: Donna Mertens

Transformative mixed methods evaluation designs are consciously employed to address issues of equity and justice. Evaluators who work with populations that experience marginalization, discrimination and oppression can benefit from learning more about how their evaluation designs can address these challenging issues through development of transformative mixed methods designs. This workshop will explain the transformative framework and demonstrate its application through illustrative examples taken from diverse sectors and geographical regions with populations that experience discrimination based on contextually relevant dimensions such as race, ethnicity, gender, disability, economic status and sexual identity. Transformative mixed methods designs will focus on evaluations of program development and effectiveness. Participants will have the opportunity to create mixed methods designs using evaluation vignettes tailored to their interests.

39: Digital Data Placemats

November 9

8:00 a.m. - 10:45 a.m.

Presenter: Brianna Roche

Participatory evaluation techniques are key to including participant voice in our work. Data Placemats can be an effective tool to co-create meaning with stakeholders, but how does this method translate to a virtual space? This session will introduce attendees to "Digital Data Placemats" that can be used in virtual, hybrid, or in-person formats. Bring your data and a laptop; no prior knowledge of Data Placemats is required!

40: Longitudinal Methods: Building and Maintaining Participant Commitment to Longitudinal Evaluation

November 9

8:00 a.m. - 10:45 a.m.

Presenter: Anna Woodcock

Many processes unfold over time, necessitating longitudinal evaluation designs. Whether spanning a week, a month, a year, or decades, longitudinal research and evaluation poses a host of methodological challenges, foremost of which is participant attrition. This workshop introduces the Tailored Panel Management (TPM) approach to explore how psychological research informs recruitment and retention strategies in longitudinal studies. Using examples and case studies from more than a decade of research, we will focus on practices regarding compensation, communication, consistency, and credibility that promote sustained commitment to longitudinal evaluation participation.

This workshop will provide practical skill-building for real-world evaluation and an in-depth understanding of the TPM approach that participants can apply to current and future longitudinal evaluation projects.

41: Letting the Data Speak: Inclusively Integrating and Synthesizing Data to Tell the Story

November 9

8:00 a.m. - 10:45 a.m.

Presenters: Jonathan Jones and Ghazia Aslam

Participants will learn about and experience first-hand the Data Analysis, Integration and Synthesis (DAIS) process EnCompass uses to compile, affinity-map, and synthesize data in stages, using appreciative, participatory, and inclusive methods that "let the data tell the story" and provide the team with a coherent, traceable, evidence-based narrative from data collected using up to six different methods and multiple sources of interest (e.g., geographies, stakeholder groups).

Participants will learn and practice the methods and concepts through small group work and individual practice. The focus will be on applying the method, and aspects of the method, to participants' own organizational and evaluation contexts. We will first present an introduction to the approach, the Data Analysis, Integration and Synthesis (DAIS) workshop, through an interactive video, a mini-lecture, and a discussion around structuring a DAIS workshop. What is it? What is the benefit? Who needs to be there? How will it be formatted?

42: Moving from Resistance to Action: Navigating Resistance in Service of Equity in Evaluation

November 9

8:00 a.m. - 10:45 a.m.

Presenters: Jasmine Williams-Washington, Amber Trout, PhD, and Michelle Revels

Since the summer of 2020, using hashtags to signal support for communities impacted by structural racism is no longer the limit of how far organizations will go. Today, organizations are moving past virtue signaling and responding to elevated tensions and increased awareness of structural racism by changing how they do their work. We see organizations express a desire, and take steps, to become anti-racist and pro-Black, shifting the status quo and sharing power with others. This shift will result in some resistance. Resistance is a natural human response to change and often emerges during organizational change efforts. Staff may feel uncertain, uncomfortable, and unsafe. Equity-centered evaluation is not exempt from these feelings. Evaluators can use scenario planning to identify the conversations necessary to 'unfreeze' resistant staff so they can see possible changes, building capacity to pursue equity relentlessly.

43: Practical Causal Mapping for Evaluators with the Causal Map App

November 9

8:00 a.m. - 10:45 a.m.

Presenter: Steve Powell

Causal mapping presents an exciting refinement of existing non-experimental evaluation methodologies, offering evaluators a clear and straightforward way to analyse large quantities of narrative, qualitative information about causal connections that people make in speech and writing.

This workshop will give a practical introduction to analysing qualitative data causally, in particular using the Causal Map software, leaving participants ready to start analysing their own data. This approach won the UK Evaluation Society Ipsos MORI Innovation in Methodologies Prize in 2021.

Causal mapping can be used in an exploratory and 'goal-free' way to identify and analyse respondents' perceptions of causal change, and/or to find out about their perceptions of the effects (intended and unintended, positively or negatively regarded) of a specific intervention, and to confirm whether these are consistent with predetermined objectives and theories of change. Causal mapping positions itself at an interesting point halfway between quantitative and qualitative approaches. It is loosely related to newer quantitative methods of causal inference and Directed Acyclic Graphs (DAGs), and can also be compared with traditional Qualitative Data Analysis (QDA), though the process of coding and analysis in causal mapping is much more structured. This helps to produce evidence in a transparent and credible way.
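
To illustrate the underlying data structure, here is a minimal, hypothetical sketch in Python: coded cause-and-effect claims from respondents are aggregated into a weighted directed graph. The factor labels are invented, and this is not the Causal Map app's API, just the general idea of a causal map.

    # Hypothetical causal map: each tuple is one coded causal claim
    # ("cause", "effect") extracted from a respondent's narrative.
    from collections import Counter

    import networkx as nx

    coded_links = [
        ("training", "new skills"),
        ("training", "new skills"),
        ("new skills", "higher income"),
        ("drought", "crop failure"),
        ("crop failure", "lower income"),
        ("training", "confidence"),
    ]

    # Edge weight = how many coded claims assert that link.
    graph = nx.DiGraph()
    for (cause, effect), count in Counter(coded_links).items():
        graph.add_edge(cause, effect, weight=count)

    # Queries an analyst might run on the map:
    print(list(graph.out_edges("training", data="weight")))  # reported effects of training
    print(nx.has_path(graph, "training", "higher income"))   # is there a causal chain?

Because every edge traces back to coded source passages, the resulting map stays auditable, which is the transparency the approach emphasizes.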

44: Design and Conduct Sound Evaluations Using the CIPP Evaluation Model

November 9

8:00 a.m. - 10:45 a.m.

Presenter: Guili Zhang

This professional development workshop will teach participants to design and conduct sound evaluations using the updated CIPP Evaluation Model. The interactive, hands-on workshop will help participants (evaluators, evaluation clients, evaluation students, etc.) plan, design, budget, contract, conduct, report, and assess program evaluations that meet the requirements of the CIPP Model and professional standards for sound evaluations. The workshop will be taught by the co-author of the authoritative book on program evaluation, The CIPP Model: How to Evaluate for Improvement and Accountability. The workshop will ground participants in the current, updated version of the CIPP Model; acquaint them with selected checklists contained in the CIPP Evaluation Model book (design, budgeting, contracting, reporting, and metaevaluation); engage groups of participants in using an illustrative RFP to apply the design checklist in planning a context, input, process, or product evaluation and to assess their completed design against the metaevaluation checklist; and provide follow-up materials so participants leave knowing how to obtain additional information and assistance related to applying the CIPP Model.

45: Understanding and Managing Conflict in Evaluation

November 9

8:00 a.m. - 10:45 a.m.

Presenters: Jeanne Zimmer and Sandra Ayoo

Conflict can occur at any stage of an evaluation, and unresolved conflict can challenge even the most skilled evaluators. Conflict between evaluators and clients and among stakeholders can create barriers to the successful completion of evaluation projects. Evaluators often lack the competency to diagnose and manage conflicts, and may also find themselves at a power disadvantage.

One way to transform the practice of evaluation is to give voice to experiences of conflict and to empower evaluators to understand the nature of conflict in evaluation so that they are not derailed from their practice. A significant literature on unaddressed workplace conflict indicates that it can lead not only to burnout but also to a toll on physical and mental health. Evaluators typically need to develop their interpersonal competencies as they enter practice, since graduate programs in evaluation may not sufficiently teach non-technical evaluation skills, such as communication and conflict resolution. Evaluators must ensure that the resolution of conflict does not come at the expense of those who are historically and deliberately marginalized, and must understand how to hold space for all voices.

Through a hands-on, experiential approach using real-life examples from program evaluation, participants will learn practical applications of conflict-resolution skills as they apply to situations in program evaluation. Attendees will have the opportunity to assess their approach to handling conflict and to improve their self-awareness towards conflict-resolution fluency.

52: Social Determinants of Health

November 9

8:00 a.m. – 10:45 a.m.

Presenter: Becky Garrow

This workshop will cover social determinants of health and how they can influence evaluation design, goals, and plans, as well as best practices for collecting evaluation data with socio-cultural competency. Understanding the educational, medical, social, economic, and built-environment factors that influence health, opportunity, and willingness to participate in evaluation or research studies can help strengthen evaluation planning and procedures.

63: Approaches, Frameworks, and Strategies to Evaluate Coalitions and Collaboratives

November 9

8:00 a.m. – 10:45 a.m.

Presenters: Susan Wolfe and Ann Price

This workshop will introduce coalitions and collaboratives; their benefits and potential pitfalls; frameworks, models, and principles that guide and structure their evaluation; and relevant tools and resources. The focus will be on understanding how to evaluate internal structures, processes, and dynamics. The facilitators will employ a variety of teaching/learning strategies during this workshop. All teaching will be supplemented by a participant workbook containing the materials and information needed for the activities and exercises. Some strategies include:

  • Think, Pair, Share Small Group Discussions – The workshop layout will seat participants in groups of 4 to 8 individuals. Discussion topics will be presented, and participants will have an opportunity to think, talk in pairs, and then share with their smaller group.
  • Interactive Exercises – The presenters will simulate interactive data sharing strategies such as data placemats and data walks.
  • Individual Exercises – Worksheets will be provided to participants that will allow for reflection and individual work.
  • Facilitated/Large Group Discussions – Throughout the workshop the presenters will use an interactive style to engage participants in their learning.

29: Evaluating Coalitions and Collaboratives' Activities, Outcomes, and Impact

November 9

11:30 a.m. – 2:15 p.m.

Presenters: Susan Wolfe and Ann Price

This workshop will cover topics that include how to apply a participative, community-empowered approach to evaluation; helping coalitions find and use data for planning and informing their communities; techniques to help coalitions develop action plans with clear output and outcome measures; assisting coalitions with interpreting and using their outcome data; and overcoming challenges when working with coalitions.

The presenters will employ a variety of teaching/learning strategies during this workshop. All teaching will be supplemented by a participant workbook containing the materials and information needed for the activities and exercises. Strategies include:

  • Lecture – Very brief lectures will present new topics and content as precursors to each of the more active and interactive strategies.
  • Think, Pair, Share Small Group Discussions – Discussion topics will be presented, and participants will be provided an opportunity to think, talk in pairs, and then share with their smaller group.
  • Interactive Exercises – Presenters will simulate interactive data sharing strategies such as data placemats and data walks.
  • Individual Exercises – Worksheets will be provided to participants that will allow for reflection and individual work.
  • Large Group Discussions – Presenters will use an interactive style to engage participants in their learning.

36: Theory, Practice, and Praxis for Liberatory LGBTQ+ Evaluation

November 9

11:30 a.m. – 2:15 p.m.

Presenters: Josh Boegner and Erik Glenn with support from Gregory Phillips II

This advanced-level workshop will provide a theoretical, practical, and justice-focused approach to lesbian, gay, bisexual, transgender, and queer (LGBTQ+) Evaluation. Through hands-on activities and collaborative inquiry, participants will develop advanced skills enabling them to confidently, competently, and collectively work towards LGBTQ+ inclusion and liberation in their practice. Principles of Equity, Justice, and Cultural Responsiveness within the field of evaluation form the basis of some of our profession’s most important work. Centering marginalized voices, experiences, and individuals in our practice is critical to ensuring we as evaluators are poised to best support our clients and partners. While a large body of theory and practice has grown around these principles, recent discourse has acknowledged the historical exclusion of the LGBTQ+ community from most discussions of culturally responsive and transformative evaluation. Though sometimes mentioned, LGBTQ+ Evaluation has rarely been seriously considered or discussed by the field at large. Moreover, to date, no frameworks, models, or principles specific to LGBTQ+ Evaluation have been widely adopted and implemented.

Recent years have seen the beginning of a sea change in LGBTQ+ Evaluation, as this work has begun to be recognized for its importance within the field. The presenters of this workshop are particularly proud of having been at the forefront of much of the theoretical work that has helped to advance this conversation, including forthcoming keystone publications on the topic in both the American Journal of Evaluation and a special issue of New Directions for Evaluation for which we serve as Guest Editors. However, even with the growing availability of this body of theory, many evaluators may justifiably feel intimidated or uncertain about how to practically approach designing and implementing an LGBTQ+ Evaluation, given that there is relatively little prior literature in the field to draw on and no specific guidance available from professional organizations regarding practice with LGBTQ+ communities.

This workshop draws on the momentum of the LGBTQ+ Evaluation theory-building process and its relevance to the field while addressing gaps in specific standards or guidance by providing attendees with the conceptual, methodological, and practical skills they will need to become effective LGBTQ+ evaluators.

“Theory, Practice, and Praxis for Liberatory LGBTQ+ Evaluation” is an advanced, project-based workshop aimed at providing attendees with hands-on, collaborative, collective opportunities to reflect and build upon LGBTQ+ liberation in their own practices. The workshop takes an inquiry-based approach to teaching LGBTQ+ evaluation, encouraging attendees to thoughtfully and critically interrogate what it means, in theory, practice, and praxis, to conduct evaluation with LGBTQ+ communities in a way that is culturally responsive, equitable, and transformative. Liberatory adult education theories and texts, such as Pedagogy of the Oppressed, will guide both curriculum and teaching throughout the session. Participants will be encouraged to practice creativity and experiment with anti-oppressive practices.

46: Negotiation Skills for Evaluators

November 9

11:30 a.m. - 2:15 p.m.

Presenters: Sandra Ayoo and Jeanne Zimmer

This is an interactive workshop. It opens with a brainstorm on what participants know about negotiation: definitions or personal reflections on negotiation as a competency in the Planning and Management domain. A lecture and discussion of concrete examples will cover interest-based negotiation and persuasion for managing a feasible evaluation plan, budget, resources, and timeline using the interests, options, criteria, and alternatives (IOCA) framework. In group work, participants will build out the IOCA negotiation framework, with each group working on one of the four concepts. Attendees will also role-play a case study to recognize tactics such as extreme demands, commitments, take-it-or-leave-it offers, unreciprocated offers, personal insults, good-cop/bad-cop, and bluffing.

47: Renovating Organizational Culture: A Look at Measuring Changes in Knowledge, Skills, and Behaviors

November 9

11:30 a.m. - 2:15 p.m.

Presenters: Tanya Hurst and Marisa Acierno

For decades, academics, practitioners, and governments alike have invested resources in strengthening knowledge, skills, and behaviors at the individual and organizational levels to address development challenges. From traditional classroom trainings to one-on-one mentorship, from on-the-job learning to certification programs, individuals and organizations have received targeted, tailored capacity-strengthening support to improve outcomes and achieve sustainable change in agriculture and food security, climate change, democracy, human rights, and governance, private sector engagement, gender equality and women’s empowerment, and global health, among many other areas. As researchers and evaluators, we are often asked to assess the effectiveness of these programs using data gathered from pre- and post-testing, observations, interviews, and self-assessment questionnaires. However, these data collection methods and approaches have limitations that constrain our ability to understand whether change has truly occurred and how effective it has been.

Drawing on our team's experience supporting the United States Agency for International Development (USAID) Africa Bureau Division for Economic Growth, Environment and Agriculture (EGEA) Practical, Innovative, On-the-Job Training (PIVOT) model, including the Climate and Finance Practicum (CFP), DevTech will facilitate a hands-on, participatory action-learning session exploring a new approach to building knowledge, skills, and behaviors by renovating one's organizational culture, along with an innovative approach to measuring its effectiveness and its ability to bring about sustainable, scalable change.

48: Data Quality Management

November 9

11:30 a.m. - 2:15 p.m.

Presenter: Ana Coghlan

A major purpose of many program evaluations is to generate data for decision making. However, how can we be sure that our quantitative data are of good enough quality to make well informed decisions? While evaluators may receive training in aspects of data quality, overarching ways to enhance and manage data quality are rarely addressed. In this workshop, evaluators will be introduced to a comprehensive data quality management system for quantitative data, first developed by the Global Fund and several international development agencies, that consists of specific data quality assessment criteria and standard operating procedures. Through large and small group discussions, participants will first identify their own data quality issues. Participants will then review and relate their own experiences to certain assessment criteria and identify procedures for strengthening the quality of their data. Lastly, participants will review the basic components of a Data Quality Management Plan.
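
As a rough illustration of what codified assessment criteria look like in practice, the Python sketch below applies two generic checks, completeness and validity, to a toy dataset. The criteria and data are invented examples, not the actual criteria or standard operating procedures of the Global Fund system described in the workshop.

    # Toy data quality checks; criteria and data are illustrative only.
    records = [
        {"site": "A", "clients_served": 120},
        {"site": "B", "clients_served": None},  # missing value
        {"site": "C", "clients_served": -5},    # out-of-range value
    ]

    total = len(records)
    missing = sum(1 for r in records if r["clients_served"] is None)
    invalid = sum(1 for r in records
                  if r["clients_served"] is not None and r["clients_served"] < 0)

    # Completeness: share of records with a reported value.
    print(f"Completeness: {(total - missing) / total:.0%} of records have a value")
    # Validity: records failing a simple non-negative range check.
    print(f"Validity: {invalid} record(s) fail the range check")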

49: Leadership Matters in Evaluation

November 9

11:30 a.m. - 2:15 p.m.

Presenter: Dr. Quinn Motivates

This workshop is designed as an executive coaching framework aimed at enhancing evaluators’ value as trusted advisors leading data-driven organizational and programmatic change. As organizations envision scaling and sustaining effective programs and initiatives, they seek to retain evaluators who fully understand how to leverage evaluation data for greater impact. Expectations of evaluators as leaders are even higher for evaluators serving on collective impact projects, collaborative partnerships, innovative high-profile initiatives, and multi-site programs. This means evaluators need to build competencies in thought leadership, strategic planning, and organizational transformation. The workshop addresses the shift in skills and competencies required of evaluators in recent years as federal and foundation-funded programs expand evaluators’ roles in data utilization, strategic planning, and organizational capacity-building. Strategies and tactics designed to help evaluators raise their leadership capacity will be shared via three modules as follows:

  • Module 1: Deconstructing Leadership Styles for Maximizing Evaluators’ Individual and Organizational Performance
  • Module 2: Building Organizational Capacity as Key to Transformational Leadership

50: Harnessing Dynamic Theories of Change to Drive Learning and Achieve Impact

November 9

11:30 a.m. - 2:15 p.m.

Presenters: Michael Moses and Amanda Stek

The social sector has reached an inflection point. Challenging long-held assumptions about power, who holds it, and how it is wielded is more important than ever before. Dynamic theories of change, an updated take on a familiar tool, when combined with evaluative thinking and collaborative approaches, can help changemakers learn to achieve impact more effectively.

In this interactive workshop, attendees will work through a practice scenario to develop adaptive theories of change. They will be guided through an exercise in which they use learning tools, such as learning logs and after-action reviews, to reflect on emerging data and apply evaluative thinking to adjust their theories of change in light of new information. By the end of the workshop, participants will have developed, tested, and iterated on their theories of change, getting ever clearer about goals, definitions, boundaries, perspectives, and meaning as they do so.

51: Research Methodologies for Decolonization and Co-labor

November 9

11:30 a.m. - 2:15 p.m.

Presenter: Anelise Gregis Estivalet

Collaborative research carries a double meaning: the link with those who preceded us and have been trying to decolonize the sciences since the 1950s, and its specificity relative to other attempts at decolonized investigation. Furthermore, researchers and evaluators play an essential role in the perception that colonized people have of themselves, including intellectuals' own perception of themselves as colonized beings. It is necessary to decolonize not only the way knowledge is produced in universities, but also evaluation practice and the selection of knowledge taken up in education and in scientific production. The researcher/evaluator is thus fundamental to advancing social justice, provided we maintain the epistemological vigilance that must constantly permeate our work. Moreover, it is necessary to decolonize not only how we produce knowledge in universities but also how we work with social organizations and society in general.

The workshop's main proposal is twofold: to present the approach of collective research and to revisit its antecedents as an evaluation approach that seeks to align academic knowledge with that of the actors being investigated. All of these perspectives converge on one point: knowledge produced in collaboration must be useful to people. For this, it is essential to have a collaborative researcher aligned with the groups in which the research is carried out.

53: Forming, Sustaining, and Funding Multi-Agency Collaborative Evaluation Teams to Promote Equity

November 9

11:30 a.m. - 2:15 p.m.

Presenters: Carolyn Ziembo, Sara Shuman, PhD, MPH, Lauren Wechsler, MPP

Program evaluation is typically undertaken and completed at the individual organizational level. However, recent events have highlighted the interconnectedness of people and organizations and the need for increased cross-agency collaboration to achieve racial, health, and economic equity.

In this workshop we will introduce a model for engaging in, sustaining, and funding multi-agency collaborative evaluation — which can be a powerful tool for promoting equity and justice in communities and populations that face social and structural barriers to health and well-being. This session will be taught by a team representing a community-based organization, a funder, and an evaluation consultant.

Specifically, this workshop will outline how multi-agency collaboration in evaluation can promote equity and justice across spaces and populations, focusing on the roles and perspectives of agency leadership, funders, and evaluators. Through case studies, discussion, and guided activities, participants will learn the benefits and opportunities of multi-agency evaluation, discuss strategies to form and sustain effective teams, identify evaluation tools that work well across organizations, develop strategies for approaching funders, and create plans of action for their own organizations and work.

54: Mobile Data Collection: Survey Design Using KoBoToolbox and XLSForms

November 9

11:30 a.m. - 2:15 p.m.

Presenter: Qundeel Khattak

KoBoToolbox, developed by the Harvard Humanitarian Initiative, is an open-source suite of tools for data collection and analysis in humanitarian emergencies and other challenging environments. Kobo supports built-in checks, skip logic, and automatic calculation of indicators, so enumerators do not have to perform them manually, which yields higher-quality data and fewer human errors. Data can also be collected offline, surveys can be created and deployed in multiple languages, and forms can be imported and exported as XLSForms. XLSForm is a form standard created to simplify survey authoring in Excel, and it is compatible with various mobile data collection tools, including KoBoToolbox, ODK, CommCare, and SurveyCTO. Ultimately, KoBoToolbox and XLSForm promote better use of time and save costs by eliminating the manual data entry and analysis required by traditional paper-based surveys.
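
To make the XLSForm standard concrete, here is a minimal sketch that uses Python's pandas library to write the two required sheets of a small form, survey and choices, including skip logic and a calculated field. The question names and file name are hypothetical; in practice many authors edit the spreadsheet directly in Excel.

    # Minimal, hypothetical XLSForm written with pandas (requires pandas
    # and openpyxl). Column names follow the XLSForm standard.
    import pandas as pd

    survey = pd.DataFrame([
        # A select_one question drawing on the "yes_no" choice list below.
        {"type": "select_one yes_no", "name": "attended",
         "label": "Did you attend the training?"},
        # Skip logic: the "relevant" column shows this question only when
        # the previous answer was 'yes'.
        {"type": "integer", "name": "sessions",
         "label": "How many sessions did you attend?",
         "relevant": "${attended} = 'yes'"},
        # A calculated field: computed automatically, never by the enumerator.
        {"type": "calculate", "name": "sessions_doubled",
         "calculation": "${sessions} * 2"},
    ])

    choices = pd.DataFrame([
        {"list_name": "yes_no", "name": "yes", "label": "Yes"},
        {"list_name": "yes_no", "name": "no", "label": "No"},
    ])

    # XLSForm expects sheets named "survey" and "choices".
    with pd.ExcelWriter("minimal_form.xlsx") as writer:
        survey.to_excel(writer, sheet_name="survey", index=False)
        choices.to_excel(writer, sheet_name="choices", index=False)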

55: Clear the Deck for Evaluations: Organizational Effectiveness

November 9

11:30 a.m. - 2:15 p.m.

Presenter: Becky Garrow

Organizational effectiveness is key to ensuring your group or organization can execute a strong evaluation project. Many people struggle with the distractions and sometimes ineffective practices of the modern workplace, such as managing an email inbox, running effective and results-oriented meetings, prioritizing time and energy for large knowledge-based tasks, delegating, and communicating effectively, especially across generations. This workshop will cover the major principles of organizational effectiveness and will provide participants with real-world strategies, tools, and resources they can implement immediately, for themselves and/or their workplace, to increase productivity.

Increasing organizational effectiveness can help evaluators focus on the work that matters: conducting quality evaluations and supporting communities and clients in meeting their process, outcome, or impact evaluation goals.

56: An Introduction to Cost Benefit Analysis

November 9

11:30 a.m. - 2:15 p.m.

Presenter: Susan Horne

This year's conference theme highlights disruptive shifts in evaluation, including "an emergence of new funders and social finance actors." To improve accountability for impacts, new methods of assessment are needed. In addition, government-funded evaluation projects and certification agencies may require an assessment of a project's financial efficacy. A cost benefit analysis can provide evidence that a project is financially feasible or that its costs exceed any benefits that may accrue. The workshop will introduce attendees to the basics and provide a hands-on exercise performing a cost benefit analysis.
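
As a taste of the mechanics, the Python sketch below discounts illustrative cost and benefit streams to present value and reports the net present value and benefit-cost ratio. All figures, including the 5% discount rate, are made up for the example and are not drawn from the workshop.

    # Illustrative cost-benefit calculation with made-up figures.

    def npv(cash_flows, rate):
        """Net present value of yearly cash flows; year 0 is undiscounted."""
        return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

    costs = [100_000, 20_000, 20_000, 20_000]   # upfront cost, then yearly upkeep
    benefits = [0, 60_000, 60_000, 60_000]      # benefits begin in year 1
    rate = 0.05                                 # assumed 5% discount rate

    pv_costs = npv(costs, rate)
    pv_benefits = npv(benefits, rate)
    print(f"NPV of net benefits: {pv_benefits - pv_costs:,.0f}")
    print(f"Benefit-cost ratio:  {pv_benefits / pv_costs:.2f}")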

57: The Basics of Using Theory to Improve Evaluation Practice

November 9

11:30 a.m. - 2:15 p.m.

Presenters: John LaVelle and Stewart Donaldson

This workshop is designed to provide practicing evaluators with an opportunity to improve their understanding of how to use theory to improve evaluation practice. Lecture, exercises, and discussions will help participants learn how to apply evaluation theories, social science theories, and stakeholder theories of change to improve the accuracy and usefulness of their evaluations. A range of examples from evaluation practice will be provided to illustrate main points and take-home messages.

58: Actionable Tools to De-bias Your Evaluation Practice

November 9

11:30 a.m. - 2:15 p.m.

Presenter: Deepika Andavarapu

This workshop provides you with three powerful tools to de-bias your evaluation. Equity is an explicit value that many of us share, yet it is difficult to reach or hear from under-served populations, and when you do reach them, it can be hard to build trust. In this workshop we will provide hands-on tools that operationalize equity and help ensure your evaluation is truly representative of all groups.

59: Evaluation Plans in Action: Tackling Real-World Roadblocks

November 9

11:30 a.m. - 2:15 p.m.

Presenters: Carissa Beatty and Mary Davis

Developing and executing an evaluation plan are essential foundations for quality programs and initiatives. Collecting and using evaluation data as outlined in your evaluation plan helps to assess needs, drive continuous quality improvement, evaluate outcomes, and inform future program decisions. So what happens when you run into real-world roadblocks while rolling out your evaluation plan? How can you pivot to salvage and rescue your best-laid evaluation plans when things go wrong?

During this workshop, presenters will discuss the steps to planning and implementing an evaluation plan in a way that provides a clear path to success and explore, through case studies or examples, common situations that cause evaluation plans to stall or fail. In addition, this session will provide peer-to-peer opportunities to discuss barriers and roadblocks to evaluation plan implementation, and brainstorm solutions to these hurdles.

60: Moving from “Silos” to “Symphonies”: Orchestrating Institutional Change Through Diversity Action Planning

November 9

11:30 a.m. - 2:15 p.m.

Presenters: Janelle Coleman and Chris Kilgore

One of the challenges of institutionalizing diversity, equity, and inclusion initiatives at larger flagship institutions is establishing buy-in and university-wide accountability. In this engaging workshop, we will discuss our process for developing a framework for diversity action plans, educating campus leaders on how to design them, creating a means to evaluate them, and developing a structure for implementing and reporting on progress toward meeting established objectives. Presenters will share lessons learned throughout the process, as well as some general practices that participants can implement at their respective institutions. Come ready to engage in meaningful dialogue around strategic planning for inclusive excellence!

61: Developing All Types of Evaluation Budgets Using Checklists

November 9

11:30 a.m. - 2:15 p.m.

Presenter: Guili Zhang

This professional development workshop will teach participants to design the full range of sound evaluation budgets. The interactive, hands-on workshop will ground participants in the six factors to consider in developing an evaluation budget; the ethical imperatives in budgeting an evaluation; evaluation budget line items (personnel, travel, consultants, supplies, equipment, services, and indirect costs); and the different types of evaluation budgets (fixed-price budgets, budgeting under grants, cost-reimbursable budgets, cost plus a fee, cost plus a grant, cost plus a profit, budgeting under cooperative agreements, and modular budgets). The workshop will engage participants in using illustrative RFPs to apply the checklist in designing the full range of evaluation budgets, and will provide relevant follow-up materials on how to obtain additional information and assistance related to designing evaluation budgets.
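
As a simple illustration of how line items combine into a total, the Python sketch below sums hypothetical direct-cost line items and applies an assumed 25% indirect cost rate. The figures are invented; a real budget would follow the workshop's checklists and the funder's rules.

    # Hypothetical evaluation budget; all amounts and the indirect rate
    # are illustrative, not figures from the workshop.
    direct_costs = {
        "personnel": 85_000,
        "travel": 6_500,
        "consultants": 12_000,
        "supplies": 1_800,
        "equipment": 3_200,
        "services": 4_500,
    }

    indirect_rate = 0.25  # assumed rate applied to total direct costs

    total_direct = sum(direct_costs.values())
    indirect = total_direct * indirect_rate

    for item, amount in direct_costs.items():
        print(f"{item:<12} {amount:>10,.2f}")
    print(f"{'indirect':<12} {indirect:>10,.2f}")
    print(f"{'TOTAL':<12} {total_direct + indirect:>10,.2f}")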

62: Designing a Performance Quality Improvement System That Combines Compliance, Quality Assurance, and Performance Measurements to Meet the Needs of a Large Multi-Service Non-Profit

November 9

11:30 a.m. - 2:15 p.m.

Presenter: Stanley Capela

This workshop will focus on training internal evaluators to incorporate compliance, quality assurance, and performance measurement into a performance quality improvement system that can best meet the needs of small to large multi-service government and non-profit organizations. The lecture draws on 43 years of experience as an internal evaluator for a multi-service non-profit, as well as experience reviewing more than 149 government and non-profit organizations and programs across 37 states, the District of Columbia, Canada, and military bases in three countries.

64: Creating Anti-Racist Collaborative Evaluation Teams 

November 9

11:30 a.m. - 2:15 p.m.

Presenter: Monique Liston

This workshop is for experienced evaluators who are interested in building highly collaborative anti-racist evaluation teams across organizations or utilizing multiple independent consultants. In this workshop, we will explore barriers to anti-racist collaboration within the evaluation context, build a toolbox for anti-racist collaboration, and prepare for building anti-racist collaborative teams in real life situations.

65: How to Write Evaluation Reports for Non-Evaluators

November 9                                

11:30 a.m. - 2:15 p.m.                              

Presenter: Anane Olatunji

During the first half of the workshop, the presenter will discuss real-life examples of writing challenges, examine what makes a well-written report and why, and visit under-utilized report sections: scope and limitations. Special attention will be given to the "backside" of reports, where the report moves from analysis to conclusions and recommendations. Ironically, these tend to be the most challenging sections for evaluators, yet the most meaningful for non-evaluators.

The second half of the workshop will be spent developing skill directly through writing. Participants will apply the knowledge and insights from the first half to produce improved written sections of documents, so participants are encouraged to bring their own reports and work samples to the session. Under the instructor's guidance, the group will offer specific feedback. By the end of the workshop, you will feel more confident and capable of writing evaluation reports that add value for clients and stakeholders.

 
