AEA Newsletter: January 2017

Message from the President - An Eye Toward 2017

From Kathy Newcomer, AEA President

Happy New Year! Thank you, AEA Members, for giving me this incredible opportunity to work with you to advance evaluation practice and our association! As I embark on this exhilarating journey, I am grateful for all the work that has gone before. Thank you, John Gargani, for leading our association, and thank you to our outgoing board members — Melvin Hall, Robin Miller, Donna Podems, and Stewart Donaldson — for all of your past and ongoing contributions. I look forward to welcoming and working with our new Board members — Rita Fierro, Veronica Olazabal, Deborah Rugg, and our President-Elect Leslie Goodyear! I am also delighted to continue working with our dedicated treasurer, Susan Tucker! We are fortunate to have such talented and committed evaluation professionals serving on the board, as TIG and affiliate leaders, and on working groups!

My goals for AEA over the coming year are to:

  • enhance engagement with our members, the broader evaluation community, and current and potential consumers of evaluation with several initiatives designed to promote learning through evaluation, and
  • boost our efforts to promote professionalization within our field, both domestically and with our many international partners.

Recognizing that our society has experienced some wrenching moments of divisiveness and sorrow over the last two years, and that there is uncertainty in the current political landscape that affects public policies, I want to remind us that program evaluation presents a tool to help society learn about itself and how to act in the public interest. To quote our guiding principles: “Evaluators have obligations that encompass the public interest and good. These obligations are especially important when evaluators are supported by publicly-generated funds; but clear threats to the public good should never be ignored in any evaluation. Because the public interest and good are rarely the same as the interests of any particular group (including those of the client or funder), evaluators will usually have to go beyond analysis of particular stakeholder interests and consider the welfare of society as a whole.” This obligation is indeed our challenge moving forward, but I know evaluators are up to the task!

For example, consider that inequality and social immobility have been vexing challenges for public policy at all levels of government and society for decades, even generations. While evaluation itself will not unilaterally address the factors that cause these issues to persist, understanding the role of socioeconomic, racial, and ethnic diversity in our world can enhance the context-appropriateness of our evaluations and contribute to more informed recommendations about the policies and programs we evaluate, ultimately contributing to a more fair and just society. To engage our members and communities this year, I am happy to announce that, thanks to the vision and leadership of Dr. Melvin Hall, AEA will convene three dialogues about the role of race and class in evaluation work and impact leading up to our November conference. The first dialogue will be held January 30 in Washington, D.C., followed by a second jointly convened with the American Educational Research Association (AERA) in San Antonio this May and a third with CREA in Chicago in September. We will live stream all three 90-minute dialogues, which will culminate in a plenary at our Annual Conference this November in Washington, D.C.

As we continue to tackle the challenges faced by evaluators, intentional and strategic professionalization of our field is needed now more than ever. I will continue to work closely with our Competencies Task Force to develop our next steps in getting our competencies promoted and used. Along with our active Evaluation Policy Task Force, we will work to encourage longer-term evaluation infrastructure that supports a range of methods as part of the professionalization of evaluation.

Our TIGs and affiliates are key partners in professionalization efforts, and I will work closely with both sets of leaders to ensure we support collaborative initiatives. For example, I will work with the AEA management team and TIG and affiliate leaders to support them in hosting more regional conferences, such as those pioneered by affiliates like the Eastern Evaluation Research Society (EERS), the Oregon Program Evaluators Network (OPEN), the Arizona Evaluation Network (AZENet), and others!

I have already begun collaborating with our fellow evaluation associations and key groups within the International Organizations of Evaluation, such as EVAL YOUTH, in endeavors such as EVAL YOUTH’s recent and highly successful Virtual Conference on December 2.

"From Learning to Action" is the theme of our Annual Conference, and in line with this theme, I challenge our members to:

  • Think creatively about innovative ways to engage audiences at the annual conference — please think beyond panels and posters, and submit something new;
  • Invite evaluators or evaluation users who might not normally attend AEA, but are clearly stakeholders in our work, to participate in conference sessions; and
  • Submit a 60-second video highlighting how we can foster learning from evaluation in a variety of settings. Please watch for contest guidelines to follow!

My door and ears are always open, and I welcome engaging with our members to learn how AEA can better serve us and promote evaluation practice and evidence-informed policy! Despite the challenges we may face, we are up to it! We have the capacity and skills to help society learn from evaluation! Please help me foster membership engagement — contact me at president@eval.org.

 

Diversity - Evaluating the Design Loft Workshops

From Zachary Grays, AEA Staff, and the 2016-17 GEDI Cohort


Over the course of the winter break, the GEDI scholars completed an evaluation report on the Design Loft sessions at last year’s American Evaluation Association annual conference. The sessions, conceptualized by Dr. Cameron Norman (Cense Research + Design) and AEA Past President John Gargani (Gargani + Company), were an experimental, hands-on educational space that challenged participants to think creatively about the intersection of design and evaluation. Dr. Norman hosted eight consecutive 45-minute workshops that we were responsible for evaluating. Our guiding questions for this evaluation were: How satisfied were attendees with the Design Loft? And did attendees consider the content conveyed during the sessions to be useful?

We answered the evaluation questions using participant surveys, observations, and, our favorite method, Photovoice. As GEDI scholars, we also wanted to ensure that each aspect of the evaluation process was culturally responsive to both the environment and attendees of different expertise levels and fields. Our approach included lay terminology in our measures, multiple ways for respondents to elaborate on their experiences, and opportunities for participants to ask us questions as we collected data in real time. In this way, we aimed to maintain the learning milieu created by the conference and make the evaluation more accessible as it unfolded.

One hundred and twenty total participants, including many new and junior evaluators, attended the Design Loft sessions. Paper Prototyping and User Persona were the most popular and best-liked workshops. The hands-on, project-based approach added a unique quality and learning value to the conference and also brought people together. Qualitative and quantitative findings indicated that participants enjoyed meeting new people and networking. Observations revealed what we describe as an activation of childlike curiosity and creativity, the wellspring of a designer’s innovation. Participants seemed to enjoy drawing on the dry erase tables, which gave them the freedom to visualize their thinking. This inventiveness, imaginative thinking that goes beyond the technical aspects of an evaluation, can be key to understanding what works in programs.

Participants thought the Design Loft sessions provided creative and innovative tools for engaging stakeholders and a way to make the logic model/evaluation framework process more interactive for clients. Although a majority of survey respondents (80 percent) stated that they would apply design thinking in their work, others felt the instruction needed to be more explicitly connected to evaluation. In addition, when asked whether the lessons were relevant to their employment, we received an equal number of “very relevant” and “somewhat relevant” responses. Some participants wrote that they were unsure how to incorporate the lessons. One offered that their Design Loft session was “not 100 percent relevant, but that [they] will find a way [to build in the lesson].” The GEDI data revealed some uncertainty and lack of confidence among participants in applying the new skills, because many arrived at or left the sessions with gaps in their knowledge as new evaluators. The Design Loft sessions succeeded at sparking participants’ interest but could not fully ensure the transfer of a concrete skill in the time allotted. Overall, participants were moderately satisfied with the Design Loft sessions.

Developing and implementing this project gave us, as GEDI scholars, a better understanding of the social responsibility that evaluation carries and of the privilege that translating participants’ perspectives affords us. The major shift in our current political arena has also moved us to rethink standard approaches to evaluation practice and process, including how to elevate voices and interpret data carefully so that we are honest, offer useful insight, and advance the work or program without harming one another.

Thanks to the creativity of Dr. Norman and Dr. John Gargani, and the leadership of Dr. Ashaki Jackson and Dr. Stewart Donaldson, we all had an opportunity to showcase our individual strengths. This project has truly transformed us as evaluators. In the end, we hope AEA continues to create more spaces where people not only learn a new skill and express their creativity, but also build community while advancing evaluation practices that integrate design thinking.

 

 

Potent Presentations Initiative - What Can We Learn from Motivational Speeches?

From Sheila B. Robinson, Potent Presentations Initiative Coordinator

Watching and listening to motivational speeches can be, well, motivating! Two-thirds of the Potent Presentations formula are message and delivery (the other third being design, of course), and these are the cornerstones of motivational speeches. While motivational speeches typically don’t include the design element of slides, there is much we can learn from them that is relevant to the kinds of presentations we deliver: conference presentations, workshops, stakeholder meetings, evaluation reports, and others.

While conference sessions, workshops, and meetings don’t necessarily have the explicit purpose of motivating people the way commencement speeches or locker room pep talks do, in effect every presentation we give is motivational in nature. We want to motivate participants to become more interested in our topic, to feel a sense of ownership of a program, or to take some sort of action.

HubSpot’s 16 Motivational Speeches to Inspire Your Next Presentation is a collection of commencement speeches, film clips, performances, and even a cartoon, delivered by writers, authors, actors, comedians, tech executives, and one fake (yes!) thought leader. You read that right — a fake thought leader*. The speeches are each entertaining in their own way. Some are moving, some are humorous, and some are inspiring. Each speaker’s central message is different, but what unites them for us as presentation designers is that by focusing on how each is crafted and delivered we can gain considerable inspiration for the message and delivery components of our next presentation.

Like any great presentation, the speakers’ messages in this collection feel tailored to and appropriate for their audiences, and the speakers use humor, personal stories, and repetition of main points. Their words and ideas evoke a range of emotions. Their delivery sounds natural, poised, and practiced; this is true even for those reading their speeches (p2i’s messaging model speaks to the importance of memorizing the beginnings and endings of presentations). It’s clear they have rehearsed their words before the performance and given significant attention to tone of voice and pacing.

We need your help!

  • Have you successfully used p2i tools or p2i principles in your presentations?
  • Do you have “before” and “after” slide examples you would be willing to share?
  • Do you have ideas for, or are you interested in writing a blog article on Potent Presentations?
  • Do you have an interest in sharing your tips for Potent Presentations through a brief video or webinar?

*Consider what we can learn from poking a little fun at our craft. Check out these fun parodies of public speaking: ‘Thought Leader’ gives talk that will inspire your thoughts and How to sound smart in your TEDx Talk.

Please contact me at p2i@eval.org and let’s talk! I’m happy to help, offer guidance, or collaborate on any of these.

 

Announcing the 2016 AEA Alva and Gunnar Myrdal Evaluation Practice Award Winner

AEA honored four individuals and one organization at its 2016 Awards Luncheon in Atlanta. This year’s recipients, in five categories, are involved in cutting-edge evaluation and research initiatives that have affected citizens around the world. We will spotlight each award winner in upcoming issues. In this issue, we extend our congratulations to Gail Vallance Barrington.

Gail Vallance Barrington, President, Barrington Research Group, Inc.

2016 AEA Alva and Gunnar Myrdal Evaluation Practice Award

Gail Vallance Barrington is a well-known Canadian evaluator, consultant, teacher, and writer. Since founding her consulting firm, Barrington Research Group, Inc., in 1985, she has conducted more than 130 program evaluation studies and at one time managed a staff of 20. She continues to practice, mainly in the fields of education and health. Her work has won awards from AERA and CES, including the CES Award for Contribution to Evaluation in Canada in 2008.

Her love of teaching has been a continuing theme in her life. For the last 10 years she has taught an online course on health services and systems evaluation for the master's program at the Centre for Nursing and Health Studies at Athabasca University. Recently, she began teaching qualitative and mixed methods for the online master's in program evaluation at Michigan State University. In 2017, she will teach a course on program evaluation for the master of education in health sciences education at the University of Alberta. Dr. Barrington enhances the evaluation practice of other evaluators, especially independent consultants, through her many professional development contributions. Since the 1990s, she has offered in-person workshops at annual conferences hosted by CES, AEA, and EES; at the Summer Institute hosted by AEA; and at the Claremont Graduate University Summer Professional Development series. In 2013, she began offering webinars on consulting skills and was named to the Dynamic Dozen, the top-rated presenters and trainers at AEA; her webinar on intermediate consulting skills was a top-rated e-learning webinar that year. She has mentored many young evaluators and students over the years and continues to work with several on an individual basis.

Her love of writing is a continuing focus. She writes newsletter columns and blogs on evaluation and consulting topics. Her book, Consulting Start-up and Management: A Guide for Evaluators and Applied Researchers (SAGE, 2012), remains a popular reference for those contemplating independent practice. She is a member of the editorial board of New Directions for Evaluation and co-edited issue #111 (2006), on independent evaluation consulting, with Dawn Hanson Smart.

Dr. Barrington has helped to shape our standards of excellence in evaluation practice through her ongoing leadership within AEA and CES. She served on the AEA Board of Directors (2006-2008) and chaired a number of committees, including the Ethics and Nominations committees. She is currently a member of the AEA Competencies Task Force. Simultaneously, she has supported CES in many capacities, and is currently vice president of the national CES Board of Directors and chair of the Credentialing Board, responsible for the Credentialed Evaluator (CE) designation.

She is a graduate of McGill University (BA) and Carleton University (MA) and holds a doctorate in educational administration from the University of Alberta (1981). She is a Credentialed Evaluator and a certified teacher. In 2014, she was made a fellow of the Certified Management Consultants of Canada.

AEA remains central to her professional life. For more information, see www.barringtonresearchgrp.com.

Email: gbarrington@barringtonresearchgrp.com

 

Face of AEA - Meet John Burrett

Name: John Burrett
Affiliation: Haiku Analytics Inc.
Degrees: B.Sc., M.A. (Economics)
Years in the Evaluation Field: I have been in and out of evaluation for the better part of 25 years
 
Why do I belong to AEA?
 
I belong to the AEA because of the breadth and quality of the information it makes available and the networking it makes possible, bringing me into contact with new techniques and perspectives. The AEA membership comprises a wide variety of interests and organizations, so you see a broad range of both requirements for evaluation and approaches to it. This is much broader than is the case in Canada, where the breadth and depth of private sector foundation activity, for example, is more limited and most evaluation activity is government-driven.
 
The membership is, moreover, very active and seems to contribute information and experience constantly. This is reflected in the sheer quantity and quality of information available through AEA's website, forums, workshops, and more, which I find very valuable.
 
What's the most memorable or meaningful evaluation you have been a part of? 
 
There have been a few, including, early in my career, major nationwide evaluations of urban social housing and housing for aboriginal people. These were both huge programs serving a great need for many people, and I think our work was important to both the continuation and refinement of the programs.
 
But the one that stands out most for me now is one I was part of recently: an evaluation by the National Research Council Canada (NRC) of its program of research in metrology. Metrology is the science of measurement, and in NRC's case this means research into making extremely precise measurements of such things as radioactivity, gravity, or the passage of time. NRC conducts the research and then promulgates the results to the research and commercial worlds. The organization wanted to know more about who was using this research and how those parties are interconnected, so it engaged me to conduct a social network analysis. What impressed me was not so much the choice of technique as NRC's attitude toward learning.
 
What made this great for me was simply the client's openness to doing something very new to them (social network analysis), as well as the openness of the subjects of the evaluation to participating and their interest in learning what their networks looked like, particularly the NRC scientists who wanted to know how to expand their reach.
 
This was gratifying because I have long been excited by the potential of this technique, in an evaluation context, to uncover how research and information dissemination programs work or could work better. This client got the idea and ran with it, with me. I don't know whether that was because of their relatively scientific bent or something else, but, regardless of the specific technique, it was a refreshing experience of approaching evaluation in a creative and positive way.
 
What advice would you give to those new to the field? 
 
See evaluation as an open book. Always look for new ways to answer the questions and keep developing new skills. Keep it fresh and always bring the best technique you can, even when it often seems that the best you can do is the standard approach. Try not to fall into the trap I have seen of doing what you always do just to get things done. You can always do something better, learning from the last time.
 
I also think, echoing John Gargani's remarks at the latest AEA conference, that human skills are very important. In my career, I have worked on the policy and political sides as well as on the evaluator's "side of the fence." That has always made clear to me how important it is to put yourself in the shoes of the "owners" of the program or initiative you are studying. While keeping in mind the absolute necessity of impartiality, it has always worked much better when internal stakeholders, in particular, are a real part of the process and can see their perspectives and knowledge being sought and understood.