Sarah Mason, Ph.D. student in the Evaluation and Applied Research Methods program at Claremont Graduate University, describes her experience representing AEA on a panel of emerging evaluators at the Australasian International Evaluation Conference in Melbourne.
Arriving at the Australasian International Evaluation Conference in Melbourne, I felt a bit like an imposter.
“Oh, you’re from the American Evaluation Association!” one conference goer exclaimed on the first day, looking at the label on my badge.
“Yeah, um, sort of,” I replied – less eloquently than I would have hoped – the sound of my slowly dwindling Australian accent echoing through.
As an Australian-born but U.S.-based evaluation Ph.D. student who has lived eight of the past 12 years outside Australia, it felt odd to represent AEA on a panel of emerging evaluators – particularly in Australia. I felt neither American enough to represent any kind of American evaluator, nor experienced enough in the Australian context to be considered an Australian one.
But over the next three days it slowly dawned on me that Australian-based evaluators aren’t “all Australian” any more than U.S.-based evaluators are “all” anything.
In fact, a large number of the Australian-based evaluators I met during the conference were born in Great Britain, New Zealand, the Pacific Islands, Canada, or the U.S. And those who were Australian came from a multitude of backgrounds: some lived in rural or remote locations, others were Indigenous, and others still (like me) had spent the bulk of their adult lives living internationally.
It was this diversity that struck me most during my time at the AES conference. Not simply the diversity of participant backgrounds, but rather the AES's choice to actively seek out such diversity – of opinions, perspectives, and experiences – in shaping conference proceedings.
Among its invited speakers, the AES conference invited perspectives from Europe (Marlene Laubli Loud, the Swiss Evaluation Society), the United Kingdom (Penny Hawkins, UK Department for International Development), the United States (Stewart Donaldson, American Evaluation Association), New Zealand (Kate McKegg, Aotearoa New Zealand Evaluation Association), and Canada (Benoit Gauthier, Canadian Evaluation Society). In addition to the voices of established evaluation academics and practitioners, conference convenors invited views from new and emerging evaluators, including myself, Ruth Aston (AES, University of Melbourne), Melitta Rigamoto (ANZEA, Pasifika Futures), and Allan Mua Illingworth (Fiji, Secretariat of the Pacific Community).
And although speakers and participants raised a number of familiar issues that seem close to the hearts of evaluators across the globe (think: professionalization, evaluator competencies, the need to demonstrate the value of evaluation), the AES's focus on inviting diversity enabled conference goers to examine these issues by looking outwards at others' experiences before reflecting inwards on their own.
It is this globally focussed “outwards-in” mindset that I will take with me from the AES conference: a reminder not to forget the vast wealth of evaluation knowledge held by evaluators across the globe, and the value of drawing on their multiplicity of experiences – not simply those close to home – when contemplating my own evaluation practice.
Moreover, it is through attending international meetings such as the AES conference that AEA members can actively play the valuable role of the “outsider,” inviting evaluators across the globe to reflect upon, critique, and learn from our experiences in order to advance their own.
From Sheila B. Robinson, Potent Presentations Initiative Coordinator
Happy fall! Fall is a favorite time of year for many people. Perhaps it’s the start of a new school year, the vibrant colors of changing foliage, the anticipation of a holiday season, or simply looking forward to the return of cooler temperatures. (Okay, maybe not so much that last one.) For many of us, fall brings a flurry of activity as we painstakingly prepare our presentations for Evaluation 2015. Excitement builds as we wonder who our audiences will be and how our work will be received.
A Method to the Madness
For the last three months, these articles have featured links and resources for message, design, and delivery, the three building blocks of p2i. Now, it’s time to put the pieces together and make them work for you.
One resource that bears another mention is the Presentation Preparation Checklist. While it features items that should ideally be considered up to about three months prior to the conference, it’s a useful tool at any time, even if you think you have it all covered. It’s kind of like having that travel checklist many of us use. It never hurts to review the list, even when you think you’re not likely to forget anything on it!
Perhaps you have already crafted your message, designed great slides or other visuals, and are prepared with your best public speaking skills for delivery. Why not check out this list of 90 quick tips from the premier public speaking organization Toastmasters International? The world leader in communication and leadership development is celebrating its 90th anniversary through October 2015.
Have you been participating in the virtual Slide Clinic? This four-part webinar series is jointly sponsored by our Potent Presentations Initiative (p2i) and the Data Visualization and Reporting Topical Interest Group (DVR TIG). The purpose of this series is to help AEA members prepare outstanding and informative presentations for this year's annual conference and beyond. Tips from the series include:
- Tailor to your presentation type – e.g., a demonstration, panel, roundtable, Ignite, or think tank
- Identify three to five big ideas to cover
- Include an activity or story
- Use the Messaging Model Handout to allocate the right amount of time to each segment of the presentation
Missed the live webinars? Watch all four recordings in the Coffee Break archive.
Slides from Virtual Slide Clinic Coffee Break Webinar Series
From Cheryl Oros, Consultant to the Evaluation Policy Task Force (EPTF)
AEA recently sent a letter to the United States Ambassador to the United Nations (U.N.) with comments on the newly approved U.N. Agenda 2030: Transforming the World Follow Up and Review (that is, monitoring and evaluation) processes. We applauded the Agenda 2030 for its expressed principle that “follow up and review will be rigorous and based on evidence informed by country-led evaluations and data which is high quality, accessible, timely, and reliable.” At the same time, we noted that more was needed to plan for useful evaluations.
We proposed the following complementary steps:
- Develop an equally robust and transparent plan for other aspects of the follow up and review process, particularly analysis of the indicator data, complementary evaluation, and use of the evidence generated.
- Use Sustainable Development Goal data in the context of evaluation studies. Those data and the related evaluations could convey the complexity of initiatives as they are implemented and provide a deeper understanding of the actual progress toward goal attainment or what can be changed to raise achievements. The U.N. Agenda 2030 only minimally addresses the potential use of these data.
- Develop a U.N.-coordinated but multistakeholder plan for building evaluation capacity. Activities held as part of the current 2015 International Year of Evaluation (UN Res A/69/237) emphasized the importance of the U.N. Agenda 2030 principle of “enhanced capacity-building support for developing countries, including the strengthening of national data systems and evaluation programs.”
The Sustainable Development Goals (and therefore indicators) are to be “achieved” by 2030. The goals are accompanied by targets and will be further elaborated through indicators focused on measurable outcomes. They are action oriented, global in nature, and universally applicable. Compared to the Millennium Development Goals that ended this year, the Sustainable Development Goals target complex systemic outcomes and integrate considerations of equity and sustainability. The indicator development process supporting follow up and review described by the U.N. Statistical Commission has been transparent and participatory.