From John Gargani, 2016 AEA President
For many, the word design conjures up images of sleek cars, elegant interiors, and high fashion. What do these things have to do with evaluation? Very little. That is not to say evaluators don’t have style. Heck, we invented the “data-nerd-cool” style long before there were apps, iPhones, the internet, or even desktop computers.
At Evaluation 2016, I’m asking evaluators to bring their data-nerd-cool to the consideration of an entirely different notion of design. For a growing number of professionals, design has come to represent a method, mindset, and moral obligation for making the world a better place. I believe this perspective deserves our attention — and our participation. Evaluation 2016, with its theme of Evaluation + Design, is our opportunity to join in.
Design professionals are a lot like evaluators. They work as individuals and groups to understand the needs, values, and culture of other people, at both an intellectual and a visceral level. Developing a deep connection with others — to no longer see other people as others — is a central premise of their work. This fits squarely with our longstanding emphasis on culturally responsive, empowerment, participatory, and other human-centered approaches to evaluation.
Their work spans the full continuum of evaluation contexts. Designers seek innovative solutions to pressing problems, such as global poverty, public health, gender equity, and environmental sustainability. Their solutions may be products (solar power systems for the rural poor), services (traveling health clinics), or policies (national education systems). They may scale up innovations with nonprofit organizations, for-profit companies, hybrid social enterprises, or government action.
A constant throughout this rich and varied work is the critical role of evaluation. Yet, evaluation as we understand it is only just making its way into the design community, its purpose often limited to that of establishing the effectiveness of innovations implemented at scale. Evaluation can do much more. So perhaps we should take a page out of the designers’ book and see evaluators and designers as members of one community with a common purpose. Working together, we can accomplish what neither can alone.
And perhaps we should see ourselves not only as evaluators, but also as designers. Design plays a critical role in our work. In collaboration with stakeholders, we help design programs, evaluations, and ways in which information is communicated. We depend on design, yet it sits at the edge of our professional identity, training, and conversations.
At Evaluation 2016, I hope to bring design closer to the center of our profession. I am working to do that by providing opportunities for imagination, play, collaboration, reflection, and beauty. I believe that is how we, as one community, can become more effective change makers.
I look forward to making that happen with you — by design.
From Denise Roosendaal, AEA Executive Director
AEA has offered the International Partnership Program (IPP) for two years. In that time, three partnerships have been approved for funding.
In March 2015, Washington Evaluators invited members of the National Monitoring and Evaluation Network of the Kyrgyz Republic and the Monitoring and Evaluation Community of Practice (MonEvCoP) of Tajikistan to come to Washington, D.C., to share operational and organizational experiences. Two representatives from these organizations shared what they had learned and consulted with a wider audience of U.S.-based evaluators at a presentation attended by Washington Evaluators members, government agency representatives, universities, and consulting firms.
The Central Asian evaluators increased their skills and knowledge about effective, culturally competent, contextually ethical evaluation practice through their interactions with the evaluators they met in Washington, D.C. They also had the opportunity to form relationships with organizations in D.C. that work in Kyrgyzstan, strengthening their sense of affiliation with the evaluation community. The evaluators were able to take lessons about the visibility and value of evaluation back to their countries to enhance evaluation’s role there. Washington Evaluators supported these outcomes because they believe in strong affiliations with evaluators around the world. – Donna Mertens, professor emeritus, Gallaudet University, and international initiative liaison for the Washington Evaluators
In June 2016, Laurie Stevahn will travel to South Africa to meet with the board of directors of the South African Monitoring and Evaluation Association (SAMEA) to discuss the challenges of professionalizing evaluation that the United States and South Africa share. Because AEA and SAMEA are taking different approaches to this topic, there is much each can learn from the other to inform the broader debate on professionalizing evaluators.
There are few experts in professionalizing evaluation. Having such people inform SAMEA, AEA, and the wider key stakeholder community is invaluable not just to SAMEA, but to South Africa. – Dr. Laura Wildschut, academic program coordinator, Centre for Research on Evaluation, Science, and Technology, and SAMEA board member
In September 2016, the Oregon Program Evaluators Network (OPEN) will host Nguyen Thi Thu Que, president of the Vietnam Network for Monitoring and Evaluation (VNME), for a five-day visit to Portland to conduct trainings, including one on culturally responsive evaluation, a topic relevant to OPEN as it reaches out to Portland’s large Vietnamese-American community and engages with the local press. She will receive training from OPEN leadership to gain tools and knowledge to take back to VNME. Que will also meet with representatives of Mercy Corps and other relevant organizations to discuss building institutional capacity.
This year, OPEN has chosen to focus our conference, trainings, and other programming on culturally responsive evaluation. Our leadership team identified the IPP opportunity as a perfect way to bring further expertise to Oregon. We chose to partner with the Vietnam Network for Monitoring and Evaluation (VNME) due to an identified opportunity to share expertise.
Oregon has a large and growing population of Vietnamese Americans, leading to a high demand for cultural understanding among Oregonian evaluators. Additionally, VNME is a newly established VOPE, whereas OPEN has been around for 20 years. We see this as an opportunity to share the lessons we have learned over the years as an organization, in hopes of helping a young VOPE flourish. – Lindsey Smith, president, Oregon Program Evaluators Network (OPEN)
This program is an avenue through which AEA can respond to and propose opportunities for partnership between AEA and peer Voluntary Organizations for Professional Evaluation (VOPEs) in the international arena, and between various organizing bodies (AEA affiliates, TIGs, AEA working groups) and officially recognized VOPEs in other countries.
In line with AEA’s values, the objective of this program is to strengthen the field of evaluation by strengthening the professional organizations and practitioners of evaluation within a two-way framework of communication and knowledge sharing. AEA values a global and international evaluation community and understanding of evaluation practices, and hopes to gain much knowledge and perspective through these partnerships. Therefore, the partnerships (and the associated support) are directed to the VOPEs.
These partnerships are examples of how AEA is helping organizations and evaluators around the globe connect in constructive and collaborative ways.
The final application deadline for the 2016 fiscal year is June 30. Stay tuned for more information on application deadlines for the next fiscal year’s program.
From Zachary Grays, AEA Headquarters
DEADLINE: Monday, June 20, 2016
AEA welcomes applications for its Graduate Education Diversity Internship (GEDI) Program, which provides paid internship and training opportunities during the academic year. The GEDI Program works to engage and support students from groups traditionally under-represented in the field of evaluation. The goals of the GEDI Program are to:
- Expand the pool of graduate students of color and from other under-represented groups who have extended their research capacities to evaluation.
- Stimulate evaluation thinking concerning under-represented communities and culturally responsive evaluation.
- Deepen the evaluation profession's capacity to work in racially, ethnically, and culturally diverse settings.
Interns may come from a variety of disciplines, including public health, education, political science, anthropology, psychology, sociology, social work, and the natural sciences. Their commonality is a strong background in research skills, an interest in extending their capacities to the field of evaluation, and a commitment to thinking deeply about culturally responsive evaluation practice.
The Internship: Building on the training content described below, interns work the equivalent of approximately two days per week at an internship site near their home institutions from approximately September 1 to July 1. Interns may work on a single evaluation project or multiple projects at the site, but all internship work is focused on building skills and confidence in real-world evaluation practice. Interns receive an $8,000 stipend upon completion of the internship and satisfactory fulfillment of program requirements, including any deliverables due to the host agency, progress reports, and reflections on the internship experience.
Training and Networking Components: It is assumed that students come to the program with basic qualitative and quantitative research skills. The GEDI Program then works to extend those skills to evaluation through multiple activities:
Fall Seminar. A five-day intensive seminar, held in Claremont, California, provides an orientation that expands students’ knowledge and understanding of critical issues in evaluation, including building the capacity to evaluate across cultures and diverse groups. Interns complete a self-assessment in the fall, clarifying their own goals for program participation.
AEA Annual Conference. Interns will spend a week at the American Evaluation Association annual conference. While there, they attend (a) pre-conference workshops selected to fill gaps in their knowledge and skills, (b) conference sessions exploring the breadth and depth of the field, and (c) multiple networking events to connect them with senior colleagues. The interns also conduct a small service-learning project in the form of an evaluation of one component of the conference.
Winter Seminar. A three-day seminar, held in January or February, provides the students with additional training, coaching on their evaluation projects, and panel discussions with evaluation practitioners working in a range of contexts.
Evaluation Project. Interns will have the opportunity to support an agency’s evaluation activities near their graduate institution. Interns will provide three updates on their evaluation project activities as part of the internship program, describing and reflecting on how they applied their evaluation knowledge to the actual project activities.
Monthly Webinars. The students gather each month for a two-hour webinar to check in on evaluation projects and site placements, add to existing skill-sets, and learn from invited guest speakers.
AEA/CDC Summer Evaluation Institute. The program ends with attendance at the Summer Evaluation Institute held in Atlanta each June. There, students once again connect and finalize project reporting, attend training workshops, and participate in a graduation ceremony.
Specific Support Mechanisms: Interns are supported by colleagues at school, at their site placements, and within the sponsoring association:
An Academic Advisor. The academic advisor at the Intern's home institution supports and coordinates coursework and other activities, while helping to integrate the internship program with the student's plan of study.
A Sponsoring Agency. Students generally are matched with sponsoring agencies near their graduate institution that provide the opportunity to perform evaluation activities compatible with students' research interests and skills.
Supervising Mentor. A colleague at the host site with evaluation experience acts as a guide and mentor throughout the program.
GEDI Program Leadership. GEDI Program Director and AEA President (2015) Dr. Stewart Donaldson is an experienced evaluator. Working with a cadre of colleagues, he and Co-Director Dr. Ashaki M. Jackson oversee the curriculum and site placements. Throughout the internship, the leadership team is available to guide, advise, and support the interns in achieving their professional goals and the goals of the program.
AEA Staff Support. AEA staff provides logistical support throughout the internship. Post-internship, they work to connect program graduates with opportunities for leadership, participation, and networking within the association.
Online Community. The GEDI cohort uses an online community space for checking in, turning in updates, asking questions, and informal networking.
Student Benefits: Interns receive support from advisors and mentors, quality training focused on evaluation, real-world work experience, registration waivers and guidance at two professional evaluation conferences, and multiple opportunities for professional networking. In recognition of the time involved in the program (approximately two days per week), each intern also receives a stipend and is reimbursed for major travel expenses related to the program (airfare and shared hotel specifically), but is responsible for travel incidentals (to and from home/airport, to/from hotels, meals not taken together, etc.).
Eligibility: We seek students not already enrolled in an evaluation program/specialization or pursuing an evaluation degree who:
- Are enrolled in a masters or doctoral-level program in the United States and have completed the equivalent of one full year of graduate level coursework;
- Are residing in the United States;
- Have already been exposed to research methods and substantive issues in their field of expertise;
- Demonstrate via written essays the relevance of evaluation training to their career plans and their commitment to culturally responsive practice;
- Are eligible to work for pay in the United States outside of an academic environment (non-U.S. citizens will be asked to provide documentation of current eligibility); and
- Have support from their academic advisor.
Criteria for Selection: The interns will be selected based on their completed applications, materials provided, and subsequent finalist interviews focusing on:
- Their thinking around and commitment to culturally responsive evaluation practice;
- The alignment between their skills, aspirations, locale, and internship site placement needs;
- The quality of their academic, extracurricular, and personal experiences as preparation for GEDI; and
- Their capacity to carry out and complete the program, including support from an academic advisor.
To apply: Download the GEDI Application and return all requested materials via email as described on that document on or before Monday, June 20, 2016. Please note that it may take a few weeks to compile the requested information, so we recommend that you begin well before the deadline.
More about the program: Go to the GEDI homepage
From Sheila B. Robinson, Potent Presentations Initiative Coordinator
Greetings, Potent Presenters! Have you given much thought to the role your body plays in your presentation? You may think you appear cool as a cucumber on stage, behind the podium, or in the front of the room, but your hands, feet, and face cannot be trusted to keep your secret if you’re shaking and quaking on the inside. Even if you do feel calm and confident when you present, does your body tell a consistent story to your audience?
Now is a great time to think about body language and public speaking. With conference acceptance notices still more than a month away, perhaps you’re not quite ready to dive into crafting your content or creating your slides. Learning and practicing effective body language for public speaking is time well spent for any type of presentation.
No matter what you’re feeling, it turns out your body is likely telling the audience about it even more than your face! In this Princeton University study, Don't Read My Lips! Body Language Trumps the Face for Conveying Intense Emotions, researchers found that “in four separate experiments, participants more accurately guessed the pictured emotion based on body language — alone or combined with facial expressions — than on facial context alone.”
What is a presenter to do? Fortunately, there is no shortage of advice on body language dos and don'ts for presenters. Check out this pair of articles from Business Insider: In 10 Body Language Tips To Make Your Next Presentation Great we are reminded to use the space we have available and do some walking, while in The 10 Worst Body Language Mistakes People Make While Giving Presentations we’re cautioned about crossing arms and legs.
Here, two other sources answer the age-old question “What should I do with my hands?” with a somewhat surprising and similar answer: pretty much nothing. In What Should I Do With My Hands?, we find out that “luckily, your hands are quite capable of taking care of themselves. Hands — or any other part of the body — are usually a problem when your attention is in the wrong place.” In How to Use Natural, Strong Gestures in Public Speaking, we learn that “any movement that reinforces or amplifies your message is good, and any movement that detracts from your message is not.”
And let us not forget the role of your feet! Watch Your Feet is an amusing video with a serious message on how your feet tell your story. Are you a candlestick, the Eiffel Tower, or a ship at sea? Lin Sagovsky explains how “the feet are the most eloquent storytellers of all!”
Finally, many, if not most, readers are likely already familiar with Amy Cuddy’s famous TED Talk, Your Body Language Shapes Who You Are. With over a bazillion views (okay, 33,878,457 as of this writing), presenters are power posing like crazy at conferences and other presentation venues all over the world.
From Cheryl Oros, Consultant to the Evaluation Policy Task Force (EPTF)
With spring comes renewal, and the Evaluation Policy Task Force (EPTF) is updating its work plan and has added several new members for three-year terms. You may want to consider serving on the EPTF in the future as part of your contribution to the field of evaluation, or becoming involved in its current efforts.
AEA instituted the EPTF to develop an ongoing capability to influence evaluation policies critical to practice, such as those defining evaluation and its necessary requirements, methods, implementation, resources, and budgets. The EPTF works with congressional committees, the White House, and agencies to promote sound evaluation policies in the federal government. Its projects have included providing guidance on evaluation to the Office of Management and Budget (OMB) in the Executive Office of the President; helping draft legislative language relevant to evaluation; exploring state evaluation efforts; and assisting U.S. federal agencies and representatives of foreign governments in setting policies to expand evaluation capacity.
Learn more about the EPTF’s work and review policy guidelines in The Evaluation Roadmap for a More Effective Government.
The EPTF currently comprises 10 members, six of whom are returning:
George Grob, EPTF chair and former consultant; director, Center for Public Program Evaluation; former deputy inspector general, evaluation, HHS and the Federal Housing Finance Agency
Grob has conducted numerous evaluations; taught evaluation; served on the Editorial Advisory Board of the American Journal of Evaluation and the Advisory Board of the Eastern Evaluation Research Society; and developed performance management systems. He is the former executive director of the Citizens' Health Care Working Group; former director of planning and policy coordination at HEW; and former co-chair of the Evaluation and Inspections Round Table of the President’s Council on Integrity and Efficiency. Grob is a recipient of the AEA Alva and Gunnar Myrdal Government Award.
Mel Mark, former EPTF chair; head of the Psychology Department at Pennsylvania State University
Mark is the former president of the AEA, author of numerous book chapters and articles on evaluation, former editor of the American Journal of Evaluation, and recipient of the AEA Lazarsfeld Award for Contributions to Evaluation Theory.
Stephanie Shipman, assistant director, Center for Evaluation Methods and Issues in the Applied Research and Methods Team, GAO
Shipman is the author of numerous GAO reports on evaluation methods and policies; founder of Federal Evaluators, an informal network of officials in the federal government interested in the evaluation of public programs and policies; and recipient of the AEA Alva and Gunnar Myrdal Government Award and the Eastern Evaluation Research Society’s Invited Author Award.
Jonathon Breul, adjunct professor, McCourt School of Public Policy at Georgetown University
Breul is a member of the United Nations Education, Scientific, and Cultural Organization’s Oversight Advisory Committee; former executive director and partner of the IBM Center for the Business of Government; former senior advisor to the deputy director for management at OMB; and recipient of the AEA Alva and Gunnar Myrdal Award for Evaluation Practice.
Rakesh Mohan, director, Office of Performance Evaluations, an independent agency of the Idaho State Legislature
Mohan serves on the evaluation advisory committee for VOICES for Healthy Kids, a joint initiative of the Robert Wood Johnson Foundation and the American Heart Association. He also serves on the editorial advisory board of the American Journal of Evaluation; has served on the AEA board of directors and on the U.S. Comptroller General’s Advisory Council on Government Auditing Standards; and is a recipient of the AEA Alva and Gunnar Myrdal Government Evaluation Award and the Excellence in Research Methods Award from the National Conference of State Legislatures.
Cindy Clapp-Wincek, independent consultant
Clapp-Wincek is the former director of the USAID Office of Learning, Evaluation, and Research and serves on the AEA International Working Group.
The four new members to join the EPTF this year are:
George Julnes, professor, School of Public and International Affairs, University of Baltimore, Maryland
Julnes served on the AEA Board of Directors and serves on the editorial boards of the American Journal of Evaluation, New Directions for Evaluation, and Evaluation and Program Planning. He has authored publications on evaluation theory and methodology and was a recipient of the AEA Lazarsfeld Award for Contributions to Evaluation Theory.
Nick Hart, senior program examiner at OMB
At OMB, Hart focuses on federal Social Security and anti-poverty programs. He is president-elect of Washington Evaluators.
Mary Hyde, director of the Corporation for National and Community Service’s Office of Research and Evaluation
Hyde is responsible for an ambitious research and evaluation agenda capable of comprehensively addressing the agency’s mission and illuminating its most effective policies, programs, and practices. She is a community psychologist with 20 years of experience using empirical evidence and scientific inquiry to improve outcomes for programs, organizations, and communities.
Katrina Bledsoe, director, ThinkShift, DeBruce Foundation
At the DeBruce Foundation, Bledsoe focuses on solving the challenge of upward mobility and economic security. She is a former consultant to the Annie E. Casey Foundation, the Office of Juvenile Justice and Delinquency Prevention, and the National Science Foundation. Bledsoe has taught evaluation and policy, is the author of publications on evaluation, serves on the editorial board of the Journal of Multi-Disciplinary Evaluation, and is a recipient of the AEA Multiethnic Issues in Evaluation Topical Interest Group’s Scholar Award and the Eastern Evaluation Research Society’s Invited Author Award.
The AEA board liaison to the EPTF is:
Kathryn Newcomer, president-elect of AEA and board member
Newcomer is currently serving as director of the Trachtenberg School of Public Policy and Public Administration at the George Washington University. She is a fellow of the National Academy of Public Administration; a member of the Comptroller General’s (GAO) Educators’ Advisory Panel; past president of the National Association of Schools of Public Affairs and Administration; and author of books and articles on evaluation, public administration, and leadership.
The EPTF is supported by Denise Roosendaal, AEA executive director, and Cheryl Oros, consultant to the EPTF. For further information and to volunteer to assist the EPTF, contact Cheryl at EvaluationPolicy@eval.org.
From Mike Hendricks, AEA Representative to the International Organization for Cooperation in Evaluation (IOCE), with contributions from Jim Rugh, EvalPartners Co-Coordinator
EvalPartners, the global movement to strengthen evaluation (an important movement in which AEA plays a vital role), has recently published the first-ever global evaluation agenda. It’s titled Global Evaluation Agenda 2016-2020, often shortened to EvalAgenda2020, and in our humble opinion, this Agenda is required reading for anyone interested in evaluation, whether international or domestic. You can read either the full report or just the seven-page executive summary, but we urge you to read at least one. AEA is also tracking various other international activities on our International Connections webpage.
Two beliefs motivated the development of EvalAgenda2020: first, that evaluation has enormous potential to improve society, and second, that evaluation has yet to reach its full potential. To achieve more, four key areas need to be strengthened: (1) the enabling environment for evaluation (especially greater demand for evaluation); (2) institutional capacities to conduct and utilize evaluations; (3) individual capacities to conduct evaluations; and (4) the interlinkages among the first three. EvalAgenda2020 devotes a separate chapter to each of these areas, and each chapter includes a conceptual framework, a theory of change, and specific ideas for how to move forward.
This is the what of EvalAgenda2020, but it’s also important to know how this Agenda was created. Specifically, it was not created by a small group of “evaluation experts” working in secrecy. Quite the contrary, the contents were developed over a 15-month period of world-wide discussions, beginning in September 2014 with several weeks of online consultations among many participants from around the world. After that start, many of the 90+ evaluation conferences and meetings held during the International Year of Evaluation 2015 discussed different aspects of the emerging Agenda and provided their own suggestions. AEA, for example, contributed four pages of ideas that are appended in Section B of the full Agenda. The near-final draft was presented and discussed in Kathmandu, Nepal, this past November as part of EvalPartners’ Global Evaluation Forum.
Once you read the Agenda, there are three ways you can help bring it to life. First, you can support the eventual outcomes of AEA’s International Working Group (IWG), which is right now identifying ways AEA can help strengthen each of the key areas. The IWG will soon recommend actions to the AEA Board of Directors, and after the board decides, President-elect Kathy Newcomer and Executive Director Denise Roosendaal will report AEA’s action plans during a special session at our annual conference in Atlanta this October. There will be many opportunities for you to volunteer with that important work, once it is clarified.
Second, if you are involved in a Topical Interest Group (TIG) or Local Affiliate, you can help your TIG or Local Affiliate to support EvalAgenda2020 by seeing where their interests overlap with the Agenda. A quick review of the list of TIGs and Local Affiliates tells us there are lots of possibilities here.
Third, each of us as individual evaluators can also support the Agenda. Strengthening our own enabling environment(s), strengthening the capabilities of institutions we work in and/or with, and strengthening our own capabilities and those of our colleagues – each of these supports EvalAgenda2020 and is a great contribution.
These are exciting times for evaluation around the world, and EvalAgenda2020 is energizing the global community of evaluators. Be a part of the action! We are creating a pool of volunteers to draw from for specific projects/tasks related to the international agenda. To submit your name, please email email@example.com. If you are a TIG or Affiliate leader, you can enter details about how your TIG/Affiliate might already be working toward these goals. To enter your activity, please go to http://www.eval.org/p/is/ty/type=8.