Rubric Design to Promote Understanding and Evaluation Use
Session Number: 2779
Track: Use and Influence of Evaluation
Session Type: Panel
Tags: Rubrics, Rubrics Methodology
Session Chair: Thomaz Kauark Chianca [Managing Partner - COMEA Relevant Evaluations]
Discussant: Michael Scriven [Claremont Graduate University]
Presenter 1: Krystin Martens [Lecturer / Coordinator of Online Learning - Centre for Program Evaluation / University of Melbourne]
Presenter 2: Julian King [Director, Julian King & Associates Limited - Kinnect Group]
Presentation 1 Additional Author: E Jane Davidson [President - Real Evaluation]
Time: Oct 27, 2016 (01:00 PM - 01:45 PM)
Abstract 1 Title: Making evaluation reasoning explicit (MERE) rubrics
Presentation Abstract 1:
A well-designed and well-implemented rubric makes evaluative reasoning explicit by revealing how judgments of merit, worth, or significance are made. This practice promotes user understanding and buy-in. The author will share insights from three research-on-evaluation studies she conducted on rubrics and rubric use in program evaluation. From this work, a set of MERE rubric design considerations has emerged that can assist you in making evaluative reasoning explicit in your next evaluation project. The MERE design considerations are informed by the peer-reviewed program evaluation literature, interviews with practitioners who use rubrics as an evaluation-specific methodology, and parallel literatures. The MERE design considerations are grounded in Scriven's nature and logic of evaluation (NALE); the connection and nomenclature will be outlined so you can NALE your next evaluative rubric!
Presentation 1 Note: E. Jane Davidson will serve as an additional discussant, as Michael Scriven may be unable to attend due to health issues.
Abstract 2 Title: Getting from rubrics to reports: tools and tricks to assist sense-making
Presentation Abstract 2:
Once you've designed your rubric, you then need to use it to inform decisions about what evidence to gather, and how to gather it (e.g., indicators, data sources, and methods). If that includes mixed methods, it helps to have an efficient process and structure for managing the streams of evidence so that it all maps back to the rubric and follows a logical thematic sequence for reporting. This becomes doubly important where teams of evaluators and themes of qualitative information are involved. In this presentation, the author will share a practical set of steps developed for managing this process. You will learn how to use a rubric to determine the structure and content of your evaluation report, and how to get from the rubric to the report via a sense-making template.
Audience Level: All Audiences
Evaluators who use rubrics as an evaluation-specific methodology often also use them more broadly, to frame and focus evaluations from start to finish. This may be because the core form (characteristics and configuration) and function (natural purpose) of a rubric embody the nature and logic of evaluation and, in so doing, support such use. The papers in this panel session lead participants through rubric design considerations and show how to design and implement a rubric to streamline evaluation activities. The discussion will touch on how to know whether you have designed, developed, and implemented a good rubric: one that can assist evaluators in drawing evaluative conclusions and that promotes understanding and evaluation use.