Evaluation 2015: Exemplary Evaluations in a Multicultural World


Improving the Evaluation Process through Community Networks

Session Number: ACA5
Track: Arts, Culture, and Audiences
Session Type: Multipaper
Session Chair: Audrey Kremer [National Geographic Society]
Discussant: Antonio C. Cuyler [Florida State University]
Presenter 2: Don Glass [arts | education | research]
Presenter 3: Catherine Nameth [UCLA]
Presenter 4: Micaela Mercado, Research Associate [New York University]
Presentation 1 Additional Author: Shani James [Senior Technical Specialist - ICF International]
Presentation 1 Additional Author: Sharika Bhattacharya [Senior Research Associate - ICF International]
Presentation 4 Additional Author: Alexandra Gensemer [Research & Evaluation Program Manager - McSilver Institute for Poverty Policy & Research]
Presentation 4 Additional Author: Cathleen Plazas, MSW [Program Coordinator - Brooklyn Academy of Music (BAM)]
Time: Nov 13, 2015 (01:45 PM - 03:15 PM)
Room: Plaza B

Abstract 1 Title: Building and Evaluating Communities of Practice—Application in the Arts
Presentation Abstract 1: Communities of practice (CoP) are increasingly important tools for people with common interests and concerns to work toward and achieve their shared goals. The U.S. Department of Education's Arts in Education grant programs have provided a practical platform for using a CoP to foster effective and sustainable arts programs across the country. As the technical assistance provider for these grant programs, ICF has developed effective strategies for establishing and evaluating a CoP among these grantees. ICF's framework for this CoP can serve as a model for CoPs in a variety of contexts.
Abstract 2 Title: Improving the Integration of Arts Learning and STEM
Presentation Abstract 2: This paper will provide an overview of how improvement science methods have been applied to the practical measurement and improvement feedback of an art museum education program that integrates arts learning and STEM. The problem of practice, drivers of change, and practical measures will be traced across several collaborative design and evaluation cycles with a team of museum educators, content experts, teachers, and an evaluator. A comparison will be made between an initial provisional report generated by the external evaluator and the final report that included the participation and insights of the education team who engaged in the improvement science process. Methodological connections will also be drawn between improvement science methods and collaborative, participatory, and developmental evaluation approaches in the education context.
Abstract 3 Title: Pause-Commit-Engage—A Rubric for Direct Observation in Informal Learning Environments
Presentation Abstract 3: At public arts or science festivals, little if any effort is typically devoted to evaluation. When evaluation is considered, the person or persons tasked with it may be constrained by budget and preparation time as well as by event location and popularity. In the case of a public science festival held on a university campus, the sheer size of the event and the number of attendees challenge evaluators to go beyond counting visitors and to focus instead on aspects potentially more informative for evaluation reporting and program redesign, such as attendee learning. The presenter will introduce a brief rubric for the direct observation of event attendees' informal learning based on observing the interaction among attendee, presenter, and activity.
Abstract 4 Title: Creating a Value-Added Evaluation Framework for the Arts
Presentation Abstract 4: A mixed methods program evaluation was designed to evaluate a range of programs provided by a partner nonprofit organization, including arts administration, critical viewing of and writing about performance and film, dance and choreography, and examining issues of social justice. The evaluation is grounded in "value-added" initiatives aligned with the Common Core State Standards and the Blueprint for Teaching and Learning in the Arts. This paper discusses the collaborative process used to determine the feasibility of program evaluation given the diversity of programs and the various skills participants may acquire as a result of participating in these programs.
Audience Level: None

