Dear AEA colleagues,
In this month’s column, my final one as AEA president, I’d like to give a recap of the highlights of the work of the AEA Board and leadership.
AEA and Its External Connections
Let’s begin with the big picture and move inward. I want to mention three important aspects of the work of the board that position AEA in the bigger world.
Internationally, one of the highlights this year was the AEA Board’s endorsement of the declaration of 2015 as the International Year of Evaluation. You’ll be hearing much more about this as we move into 2015. Let me also extend a big welcome to Mike Hendricks as our new representative to IOCE. We are honored to have Mike serve in this way with his rich array of experience and knowledge, caring and respectful way of living in the world, and commitment to AEA and the field of evaluation globally.
Within the U.S., the Evaluation Policy Task Force (EPTF) continues to be very active in its attention to federal policy related to evaluation. We have had the opportunity to support some important legislation this year. (See, for example, Cheryl Oros' Policy Watch column this month.) Additionally, the EPTF is strengthening its attention to state policy alongside its ongoing federal work. I'm especially pleased that Rakesh Mohan, with his expertise in state policy, joined the EPTF this year.
As we position ourselves alongside other evaluation-related disciplines, I’m pleased to report that the board approved the recommendations of a board task force to pilot test some more intentional and broader connections with other disciplines.
Development of the Evaluation Profession and AEA Members
At Evaluation 2014, board member Donna Podems led a well-attended listening post about evaluator competencies to inform our board task force on this topic and its connections to credentialing and professional development. This work, which will continue into 2015, is a key topic for our profession.
Happenings at AEA
As you know, we transitioned to SmithBucklin as our association management company in 2013. However, 2014 is the first year that SmithBucklin was responsible for the full cycle of preparation and execution of the annual meeting. It is also a year in which we have been working out the boundaries between the governance role of the board and the operational roles of SmithBucklin.
We are also now in our sixth year under policy-based governance. It made sense to do a thorough review and reorganization of our policy manual to make it more accessible to the membership and to update policies that needed some tweaking in light of the changes in the services of SmithBucklin compared to Kistler and Associates. In early 2015, you can expect to see a more accessible policy manual on the website. The manual will reflect the ways that we are clarifying and positioning the link between AEA governance and AEA operations.
This year, the board updated the “ends goals” to clarify the results – the ends – we are seeking for evaluators, evaluation users, and the larger society. The clarified ends goals are now being used by Denise Roosendaal, AEA executive director, to build the strategic plans of the organization and ensure that our programs and activities are in support of those goals. We are also creating a multiyear planning and budgeting process to better support our goals.
We are also working to strengthen the connection between conference themes from year to year and their link to the work of the board. Although the conference theme is the responsibility and choice of the president, I've kept the board informed as we have developed the conference theme and sought to link it with the direction the board is going in support of its ends goals. President-Elect Stewart Donaldson and I have been working closely to build links between our themes. For the first time, we had a conference session in the presidential strand in which Stewart and I looked at the link between the themes for this year and next year.
We are also excited about our work on leadership development. As the association grows, we need more transparent and proactive ways to build leadership pipelines within the association. We are looking at how to build the pipelines for Topical Interest Group (TIG) leadership, as well as overall organizational leadership. Speaking of TIGs, we welcomed five new TIGs into AEA this year. TIGs themselves are important leadership components in AEA and the field of evaluation.
All in all, we feel that we are making good progress on our board governance roles, our support of Denise and the AEA staff, and our attention to being both outwardly and inwardly focused. As a board, we are eager to strengthen our contribution to the field of evaluation and to support the public good through the application of evaluation in many sectors of society.
For me, the highlight of the year has been the opportunity to focus on my main interest regarding evaluation: visionary evaluation for a sustainable, equitable world. As I shift more of my attention back to InSites, the organization I direct as my regular job, I plan to continue focusing on using systems thinking, building relationships, and more fully understanding what it means to live in a way that supports a sustainable, equitable world for generations to come. I also plan to follow up on the direction emphasized in the closing plenary of the 2014 conference.
I close with my deep gratitude for the many people who have supported me, the board, and the association as a whole over this past year. May our AEA community continue to grow in its commitment and capacity to support the public good for generations to come.
And now, let’s welcome Stewart Donaldson, 2015 AEA president. Stewart, I know we’re in great hands with your leadership and I’m here to support you in my role as past president!
AEA 2014 President
Name: Nora F. Murphy
Affiliation: Consultant and Founding Member, TerraLuna Collaborative
Degrees: B.A. in Education (Earlham College); M.A. in Research Methodology (Psychology in Education, University of Pittsburgh); Ph.D. in Evaluation Studies (Organizational Leadership, Policy, and Development, University of Minnesota)
Years in the Evaluation Field: Nine
Joined AEA: 2006
Why do you belong to AEA?
I was working as a program manager for the Student Conservation Association in Pittsburgh at a time when foundations in the area were just starting to think more critically about the role of evaluation in their grantmaking. I received training in evaluation as part of a pilot project to develop standards for youth-serving organizations, and it was as though a whole new world opened up for me. I was able to ask better questions about my program and to see new ways in which to answer those questions. I was hooked on evaluation! But, as with many things, opening the door to this new world led to more questions than answers. I took classes to learn more, and my professor, Bill Bickel, was adamant that we engage with AEA through conferences, workshops, and EvalTalk. I feel lucky that he instilled that idea in me early in my evaluation career.
The first time I attended an AEA conference I felt such a sense of relief – I’d found my people. They were warm, easy to approach, nerdy like me (in a good way), and cared about the world. People spoke the same language as me and expanded my thinking about both what’s possible and what my responsibilities are. Over the years I have seen my mentors Jean A. King and Michael Q. Patton contribute actively to and benefit from participation in AEA. I’ve learned from watching that active participation in AEA is critical to my own professional development (often to my personal development, as well), and to the development of the field.
Why do you choose to work in the field of evaluation?
Because this is my path. Because this is how I can make a difference in the world.
I believe fiercely that we all need to work toward a more just and equitable world. Evaluation lets me use my skills and talents in partnership with organizations that share that same belief. It may sound idealistic, but I believe that through these partnerships we can collaboratively heal our communities (and ourselves) and eliminate inequities. Developmental evaluation has become, for me, a powerful way to engage in these partnerships with social innovators. As of today, all of my evaluation contracts are for developmental evaluations, something that thoroughly excites and delights me.
What's the most memorable or meaningful evaluation that you have been a part of?
I recently worked on a Principles-Focused Developmental Evaluation of the Otto Bremer Foundation's support for collaboration among agencies serving homeless youth in the Twin Cities of Minnesota. In 2012, six nonprofit organizations supporting homeless youth began collaborating around shared principles. Working with Michael Q. Patton, the agencies first identified shared values and common principles. They found that their work was undergirded and informed by nine essential principles. They then began a process of designing a study to examine evidence of the effectiveness of these principles. At this point they hired me (then a PhD candidate) to lead the evaluation process as a part of my dissertation research. I conducted 14 in-depth case studies of homeless youth who had interacted with the agencies. The results of the case studies were synthesized, and the agency leadership participated in reviewing every case study to determine which principles were at work and their effectiveness in helping the youth meet their needs and achieve their goals. Taken together, this set of principles provides a cohesive framework that guides practice in working with homeless youth. The case studies and additional information can be found here: http://www.terralunacollaborative.com/publications/.
What advice would you give to those new to the field?
I have five pieces of advice:
- Get involved. There are so many ways to do this. Start small and, as they say, the rest will follow.
- Learn. Find colleagues close by with whom you can collaborate and learn.
- Read. Read books and articles by people who think like you and those who don’t.
- Search. Look back at the EvalTalk and AEA365 archives when you have a question. Often there has already been a rich exchange about your topic that you can learn from.
- Ask. Don’t be afraid to approach people in the field – even if they are authors or “big names.” I’ve never encountered someone who wasn’t gracious and giving of their time.
I am Patrick Germain, director of strategy and evaluation at Project Renewal, a large human services organization in New York City, and adjunct assistant professor at NYU Wagner Graduate School of Public Service. I am also the president of the New York Consortium of Evaluators (NYC’s local AEA affiliate) and blogger at www.MeasuredNonprofit.com.
Like many of you, I came to evaluation on a non-linear path. After undergraduate study in Spanish literature, I found myself working at a homeless family shelter in San Francisco. Coming from a relatively comfortable suburban childhood, this was something of a culture shock, and the beginning of a journey of inquiry and improvement that I am still on today.
This journey is rooted in the underlying values of humility and continuous self-improvement, values that became more central to my career as I moved into evaluation. As a well-educated white male working in poor communities of color, I must constantly check my privilege, listen purposefully, and ensure that my work is ethically sound and culturally attuned to the communities in which I work.
Evaluators are often the arbiters of the “truth” with the moral obligation to speak that truth to power. Working in communities that have traditionally been excluded from power structures means that the quality of my work carries a much higher obligation than professional commitment. Conducting high-quality utilization-focused evaluation is not merely good work, it is value-laden and value-creating action that leads directly to enhancement of the public good.
Nearly three years ago, a few local evaluators and I started NYCE when we discovered, somewhat to our shock, that there was no New York City AEA affiliate. When the opportunity of being president of NYCE came to me, I hesitated because I doubted my qualifications after only one year of "official" evaluation experience. But reflecting on my first AEA conference the prior year, I realized that although I had never called myself an evaluator, I had been applying evaluative values and approaches for much of my professional career.
Inclusion, outreach, and community building are among my top priorities at NYCE, because there are many people out there like I used to be, unaware that they have a claim to the title “evaluator” and that there is a professional community specifically for them. By creating an open, diverse, and welcoming evaluation community, NYCE can become not just a community of evaluators, but also a community that creates evaluators.
When I look at AEA’s values, I see more than just aspirational language; I see a reflection of the commitment I have made to humility, to continuous self-improvement, and to creating a better world.
As we say goodbye to 2014 and hello to 2015, the International Year of Evaluation, I’ll briefly recap what our year in diversity was like here at AEA. So many exciting things happened in 2014 that broadened the capacity of evaluators to conduct culturally competent evaluations and increased the diversity of evaluators entering the field. It has been an incredible journey, and there is no doubt that what we have in store for 2015 will contribute immensely to our efforts in promoting diversity and culturally responsive evaluation. Here are just a few of the highlights from 2014’s year in diversity.
The Graduate(s): The 10th GEDI Cohort Graduates at Summer Institute
The Graduate Education Diversity Internship (GEDI) graduation at the Summer Institute 2014 marked 10 years of GEDI excellence! Continuing the legacy of diversity at AEA, the GEDI program engages and supports students from groups traditionally underrepresented in the field of evaluation. Crystal Coker, Shipi Kankane, Bailey Murph, and Anael Ngando made their final presentations of the internship during an intimate luncheon attended by special guests, including AEA President Beverly Parsons. Read more here.
AEA also had the pleasure of welcoming the 2014-2015 GEDI Cohort, now well into their program work!
- Danielle Cummings, Harlem Children's Zone, New York City
- Kevin Lee, Opportunity Fund, Berkeley, California
- Kristin Mendoza, National Cancer Institute, Washington, D.C.
- Iliana Perez, Harder + CO, Claremont, California
- Erica Roberts, National Cancer Institute, Washington, D.C.
- Kisha Woods, Education Development Center, Washington, D.C.
- Natalia Woolley, Kaiser Permanente, Los Angeles, California
Visit the following pages to learn more about the GEDI interns, GEDI program, GEDI host sites, and GEDI Program Directors. AEA will accept applications for host sites and interns for the 2015-2016 GEDI Cohort soon. Stay tuned!
OPENing Up About Cultural Competence
On June 26, 2014, AEA affiliate Oregon Program Evaluators Network (OPEN) sponsored Cultivating Cultural Competence in Evaluation: Because Evaluations Are Not Culture-Free, a panel and roundtable discussion focused on actively engaging issues of culture in evaluation activities and on applying one of AEA’s greatest resources, the AEA Statement of Cultural Competence. The event was led by Kari Greene, senior research analyst at the Oregon Public Health Division and leader of the policy subgroup of the AEA Cultural Competence Working Group, and respected local panelists shared their experiences with cultural competence in evaluation in their respective disciplines. Attendees had the unique opportunity to engage with one another on the many burning questions surrounding culturally competent evaluation. What resulted was an exuberant exchange and a heightened desire to contribute to cultural awareness in practice. Read more here.
Postcards from Evaluation 2014: The AEA Cultural Competence Working Group Asks: What is a Culturally Competent Evaluator?
One of the biggest highlights of the year was Evaluation 2014 in Denver, CO! Tucked away in a corner at the Hyatt Regency Denver, AEA’s very own Cultural Competence Working Group set up the sequel to their photo booth media project. In a project aimed at addressing how evaluators increase their capacity to conduct culturally responsive evaluation, the group set out to ask the more than 3,000 evaluation professionals in attendance: What is a culturally competent evaluator? What is culturally competent evaluation?
Led by Derrick Gervin of the Cultural Competence Working Group, attendees responded to this simple statement: Cultural competence in evaluation is …. The answers varied, taking shape from both the personal experience of the respondents and their future goals for being culturally competent. Read more here.
Meet the New MSI Fellows
AEA is proud to introduce the newly selected fellows of the 14th Minority Serving Institution (MSI) fellowship. With more than 40 exceptional applicants, narrowing the selection to five fellows was incredibly difficult. Meet the 2014-2015 MSI Fellows!
- Tiffeny Jimenez, Assistant Professor, National Louis University (NLU)
- Julia Lechuga, Assistant Professor, The University of Texas at El Paso
- Tamarah Moss, Ph.D., MPH, MSW, Assistant Professor, Howard University
- José A. Muñoz, Assistant Professor, California State University
- Elizabeth Williams, Ph.D., Assistant Professor, Tennessee State University
AEA welcomes the newest MSI cohort! Visit the AEA website to learn more about the MSI Program and this year’s fellows. Interested in reading the first-hand experiences of MSI alumni? Check out their posts here on aea365 (week of December 7-13, 2014)!
This has been an extraordinary year in diversity here at AEA. As we bring the year to a close, I would like to challenge you as members to become more active in championing diversity not only here at AEA but also in evaluation practice as a whole. Inclusivity, diversity, and cultural competency are key to the longevity of the profession and the association. There are many ways to get involved (hosting a GEDI scholar, participating in the Cultural Competence Working Group, etc.) and contribute to this goal. Next year is the International Year of Evaluation, and AEA couldn’t be more excited about extending an olive branch on a global platform to our colleagues abroad. Stay tuned for more exciting diversity news and opportunities in 2015.
Have a story to share for the AEA diversity column? Contact me at firstname.lastname@example.org to learn how to share your diversity in evaluation stories. I would love to share your story!
In late November, the Senate and House Budget Committee chairs, Senator Murray and Representative Ryan, introduced the Evidence-Based Policymaking Commission Act of 2014, which would take steps to enhance the conduct and quality of evaluations of federal programs. The bill would establish a commission to study how best to expand the use of data to evaluate the effectiveness of federal programs and tax expenditures. The Commission would determine whether the federal government should establish a clearinghouse for program and survey data, which researchers from both the private and public sectors could access and use to perform program evaluations and policy-relevant research. The Commission’s findings and recommendations would be due no later than 15 months after the majority of members are appointed. The National Academy of Public Administration would administer the Commission.
AEA sent a letter to the Committee chairs endorsing efforts to embed evaluation into program design and to ensure that data are available for evaluation of federal programs. At the same time, AEA explained that, to make more effective use of evaluation methods, it is important to amend the bill to promote methods best suited for answering pressing questions about federal programs and policies. This would allow for the greater variety of methods that may be needed to evaluate federal programs, without overemphasizing a single method of impact analysis (such as the randomized controlled trial specifically mentioned in the bill). Otherwise, policy makers would be deprived of evidence based on the most appropriate evaluation methods. AEA also offered its assistance to the new Commission should the bill pass.
Three recent publications are likely to be of interest to those concerned about evaluation policy. In early December, Brookings featured a new book, “Show Me the Evidence: Obama's Fight for Rigor and Results in Social Policy” by Ron Haskins and Greg Margolis. The book describes the development, enactment, and implementation of six evidence-based social policy initiatives by the Obama Administration. Evidence of effectiveness was incorporated into the grant selection process for initiatives covering education, employment and training, health, teen pregnancy, and community-based programs.
In November, a new book, “Moneyball for Government” by Peter Orszag and Jim Nussle, was announced by Results for America, a nonprofit organization. It consists of a compilation of essays by a high-level, bipartisan group of advocates for evidence-based decision making in federal programs.
The Pew Charitable Trusts and the MacArthur Foundation released a study in November, “Evidence-Based Policymaking: A Guide for Effective Government,” on states’ efforts to assimilate performance and evaluation information into evidence-based policymaking. Based on these and other state activities, the guide offers “a framework that governments can follow to build and support a system of evidence-based policymaking.” Its framework has five components: program assessment, budget development, implementation oversight, outcome monitoring, and targeted evaluation. The guide notes that several federal grant programs have provided incentives for states to invest in evaluation by targeting about $5.5 billion for seven initiatives that support proven programs.
If you know of any bills related to federal evaluation that may be introduced in the next Congressional session or have any comments, suggestions, or questions, please contact me at EvaluationPolicy@eval.org. We look forward to your input.
There’s a new podcast in town! Presentation pro Jon Schwabish and I have launched the Rad Presenters Podcast as a platform to talk about the sticky presentation topics. Our p2i site has heaps of good information to get you started on your presentation, the kind of thing anyone can use and learn from. The Rad Presenters Podcast is for those who have some presenting experience under their belts and are ready for the next big step.
We have intentionally kept the podcast episodes to 45 minutes – just right for your morning commute, your lunchtime walk, or your dinner preparation time. And (this is my favorite part) in most episodes, we interview another rad presenter. So far, we have hosted presentation luminaries such as:
- Nolan Haims – We talked templates!
- Sheila Robinson – Listen right now to the nitty gritty of audience engagement with one of AEA’s very own: http://www.radpresenters.com/Recordings/RadPresenters_EpisodeNo3.mp3
Upcoming guests include a frequent p2i mention, Gavin McMahon, and AEA member Ann Emery. You can get notification of new episodes by following us on Twitter, subscribing on our site, or subscribing to the podcast on iTunes.
One more bonus: At the start of each episode, we answer tough questions submitted by the audience. What’s puzzling you? Send it in to email@example.com and we will reply on the air.
The Rad Presenters Podcast is an excellent complement to the Potent Presentations Initiative!