From John Gargani, AEA President
ImpCon is a new direction for AEA: a pre-conference event at which we will host the annual convening of our partner, Social Value International.
Across the globe, impact investors, social entrepreneurs, and socially responsible corporations are applying market-based solutions to the world’s most challenging problems. Their efforts are leveraging unprecedented amounts of private capital for the public good. They are critical to the success of the sustainable development goals. They are transforming the way business is being done around the world.
At ImpCon, evaluators and private sector actors will come together to tackle the problem of measuring impact in this new and growing sector. A series of international speakers, expert panels, and social events will culminate in an action agenda for improving the state of the art of impact measurement.
If you work in these areas or are curious about them, I invite you to register now and join us at the Carter Center. Seats are limited.
EVAL 16: Evaluation + Design
Of critical importance for AEA is collaborating with our membership around the world to design the profession of evaluation. There is a global push to improve the quality and impact of evaluation. At this conference, you will be asked to consider what evaluation can become, why it matters, and how we can make it matter more.
Everything we evaluate is designed. Every evaluation we conduct is designed. Every report, graph, or figure we present is designed. In our profession, design and evaluation are woven together to support the same purpose—making the world a better place. By considering both as parts of a whole, we can advance that purpose.
Be part of it. Join us in Atlanta!
From Zachary Grays - AEA Staff
DEADLINE: October 14, 2016
The AEA MSI Faculty Initiative supports early career faculty at Minority Serving Institutions (MSIs) by:
- Broadening their understanding of evaluation as a profession; and
- Strengthening their knowledge of evaluation theory and methods through workshops, webinars, mentoring, and experiential projects.
The goals of the program are to help faculty at MSIs to:
1. Enhance the evaluation activities and/or curriculum in their departments or universities;
2. Orient students to evaluation as a career/profession;
3. Disseminate information about evaluation and AEA to colleagues;
4. Expand their knowledge of evaluation; and
5. Encourage collaborative writing projects that reflect cross-disciplinary ideals.
Participating MSI faculty are expected to take part in the following programmatic components (exact dates are subject to change):
- Orientation via webinar/conference call
- Approximately monthly webinar-based Experts Exchange sessions with leaders in teaching evaluation
- Monthly teleconferences for working and reporting on individual and group initiatives, collaboration, and peer support
- Attendance and participation in the AEA/CDC Summer Evaluation Professional Development Institute
- Participation in an individual and/or a group/cohort culminating evaluation exercise
- Webinar-based summer training debrief/focus group after summer training
- Attendance and participation at AEA's annual conference in November 2017, including networking opportunities and specialized training for the MSI faculty group, as a culminating activity
- Webinar-based conference debrief/focus group in late November
- Ongoing access to resources through a specialized webpage
- Ongoing affiliation with AEA
- Ongoing affiliation with a local AEA affiliate, if one is present in the region
The following financial support is provided to those participating in the MSI Faculty Initiative:
- Registration fee waiver to AEA annual conference and workshops
- Registration fee waiver to Summer Evaluation Professional Development Institute
- For those not local to the Washington, D.C. area, airfare and hotel while at the annual conference (Evaluation 2017)
- For those not local to the Atlanta area, airfare and hotel while at the Summer Institute professional development series
To be considered for the AEA MSI Faculty Initiative, applicants must:
1. Be a full-time, early career faculty member at an MSI within the continental United States or Puerto Rico.
2. Have a course assignment that includes the teaching of a significant evaluation and/or research methods course within one's academic department.
3. Teach in education, the social/behavioral sciences, the physical/natural sciences, the humanities, public health, business, or non-profit administration.
4. Demonstrate interest (through a written essay) in learning more about evaluation theory, methods, and the profession, as well as a commitment to integrating new learning from initiative participation within their class structure.
5. Demonstrate commitment to program participation and completion, including submission of a brief final report on their plans and progress toward enhancing their research and/or evaluation courses.
6. Propose and deliver a final "product" that demonstrates the benefit of participation and contributes to the profession and AEA (e.g., a presentation, publication, teaching materials, or modules).
7. Provide a letter of support from the appropriate department chair or dean.
Click here to apply for this year's MSI Fellowship. All materials must be received by Zachary Grays at the AEA offices on or before October 14, 2016.
Want to learn more about the MSI Fellowship? Contact MSI Coordinator Art Hernandez at email@example.com.
Do you go to a professional conference with a notebook or electronic device at the ready? If so, I’ll bet you take plenty of great notes on the content of a good presentation; jot down the presenter’s contact information; or note any good websites, books, or other resources the presenter mentions. But have you ever considered why a particular presentation is successful or unsuccessful?
I recently attended a presentation on a topic of great interest. I was excited to learn and ready to come away with fabulous information I could apply in my own context. Unfortunately, I found myself so distracted by the presenter’s mannerisms that I almost lost out on learning the content. One mannerism I remember in particular was that the presenter wore long sleeves — sleeves that were, in fact, a little too long — and she spent the better part of the presentation tugging on her right sleeve, scrunching it up in her palm, letting it go, and tugging and scrunching again. Was she nervous? Had she practiced this presentation? Did the audience members, most of whom were much older and more experienced than she, intimidate her? Was she knowledgeable about the topic but just not an experienced presenter? Or was she less than confident in her own grasp of the topic? I wondered about all of this as the minutes ticked by, and I missed much of what she said about the topic.
Have you ever attended a presentation on a topic that excites you only to find yourself disengaged and disappointed? Have you ever attended a presentation on a topic you dreaded hearing about but were pleasantly surprised to find yourself completely engaged?
What worked or did not work for you in each of these circumstances? This is a topic of constant study for me, and I have reams of notes, not only on the content of presentations, but also on the presenters and the presentations themselves.
I take notes not only on what the presenter does, or how the presentation is structured, designed, and delivered, but also on how the audience reacts to the presenter. Do they laugh when the presenter makes a joke? Do they tune in or tune out at different times? Do multiple people walk out of the presentation, and what is going on when this happens? Does the presenter do anything in particular that reinforces or distracts from the message?
So, why take notes? There are two important reasons to do so. One is, of course, to avoid forgetting information. The other is to facilitate later study and, even more important, reflection. There is plenty of research to back the importance of taking notes for learning, and there is much on reflective practice as well. I learn to be a better presenter from attending presentations, noting what works and what doesn’t, and reflecting on how I can improve my practice.
How should you take notes? Well, that’s certainly up to you, but check out this article from Scientific American, A Learning Secret: Don’t Take Notes with a Laptop, for a little food for thought. When I want to indulge in luxurious note-taking, I prefer Moleskine notebooks. If you take notes on a laptop, tablet, phone, or other electronic device, there are certainly plenty of options for both taking and organizing notes. I use Apple devices and find the built-in notes app adequate, but I also use Evernote, a popular and feature-rich app available for most devices.
From Cheryl Oros, Consultant to the Evaluation Policy Task Force (EPTF)
The new Evidence-Based Policymaking Commission Act of 2016 was signed by the president on March 30, 2016, after several years of bipartisan efforts. The primary mission of the commission is to examine how to increase the availability and use of government data to build evidence and inform program design while protecting the privacy and confidentiality of those data.
More specifically, the commission is required to:
- Determine how to integrate administrative and survey data (including which incentives may facilitate interagency sharing of information to improve programmatic effectiveness and enhance data accuracy and comprehensiveness) and make those data available to facilitate research, program evaluation, analysis, continuous improvement, policy-relevant research, and cost-benefit analyses by qualified researchers and institutions while protecting privacy and confidentiality;
- Recommend how to overcome legal and administrative barriers, improve data infrastructure to facilitate data merging and access for research purposes, ensure database security, and modify statistical protocols to best fulfill the integration and increased availability of data, including linkages across administrative data series;
- Recommend how best to incorporate “rigorous evaluation” (i.e., “outcomes measurement, randomized controlled trials, and rigorous impact analysis”) into program design; and
- Consider whether a federal clearinghouse should be created for government survey and administrative data, particularly for program evaluation and federal policymaking; how such a clearinghouse could be self-funded; which types of researchers, officials, and institutions should have access to data and what their qualifications should be; and what limitations should be placed on the use of data provided.
The 15-member commission is composed of economists, lawyers, data security and confidentiality experts, academic researchers, and data managers appointed by congressional leaders and the president. The commission held its inaugural meeting on July 22, 2016, and a second meeting focused on privacy, confidentiality, and data security concerns in dealing with federal data on September 9 (http://www.census.gov/about/adrm/data-linkage/what/policymaking.html). A public hearing will be held on October 21, followed by a meeting focused on evaluation issues on November 4 and another on health and justice issues on December 12 — all in Washington, D.C. (http://www2.census.gov/about/linkage/updates/2016/update-2016-09-01.pdf). The commission will have a budget of about $3 million, support staff from the Census Bureau, and bipartisan support to study and submit to Congress and the president a detailed statement of its findings and conclusions, together with its recommendations for legislation or administrative actions, by early September 2017.
The AEA offered its support via letter when the bill was introduced, provided another letter summarizing AEA’s policy positions on evaluation to the commission, and has been invited to testify on evaluation issues on November 4. The general public can also provide comments to the commission, which must be received by November 14, 2016 (see https://www.regulations.gov/document?D=USBC-2016-0003-0001).
The Office of Management and Budget (OMB) provided several papers to the commission, which will be of interest to AEA members and can be found online:
- Federal evidence-building efforts.
- Using administrative and survey data to build evidence.
- Using administrative data for evidence building.
AEA will continue to support and advise the commission. Of equal or greater importance, it will provide advice to the incoming administration on evaluation policies, both on its own and by working with various groups. If you have suggestions on groups to contact, please email me at EvaluationPolicy@eval.com. Please join the EPTF to learn more about its policy work and offer your suggestions at the AEA annual conference this October in Atlanta, Georgia; meeting time to be announced.