Evaluation 2016: Evaluation + Design


Applying Theory-Driven Evaluation in Program Design

Session Number: 1447
Track: Program Theory and Theory-Driven Evaluation
Session Type: Panel
Session Chair: Huey T. Chen [Professor - Mercer University]
Discussant: Jonathan Morell [Director of Evaluation - Syntek]
Presenter 1: Leslie Ann Fierro [Assistant Clinical Professor of Evaluation - Claremont Graduate University]
Presenter 2: Katrina L Bledsoe [Senior Research Director - DeBruce Foundation]
Presenter 3: Huey T. Chen [Professor - Mercer University]
Time: Oct 29, 2016 (08:00 AM - 09:30 AM)
Room: International South 4

Abstract 1 Title: Using Theory to Improve Program and Evaluation Design
Presentation Abstract 1:

Despite the common but misguided view that theory is neither practical nor relevant, this presentation will illustrate how theory can be used to improve program and evaluation design. Drawing on theories of change, evaluation theories, and social science theories, it will explore the principles of sound program and evaluation design. Challenges and opportunities of using theory in design will be examined, with specific examples illustrating optimal use. Finally, future directions for improving the use of theory in program and evaluation design will be discussed.


Abstract 2 Title: Culturally Responsive Program Design
Presentation Abstract 2:

Cultural responsiveness in evaluation is becoming a “must do” for producing valid, reliable, and credible evaluations of both programs and policies. However, cultural responsiveness must start earlier than the evaluative phase: programs, program theories, and evaluation and method designs all need to be culturally reflective and responsive. I contend that without understanding the cultural theory that underlies a program, both program development and evaluation will tell only a “half-story.” This presentation will discuss how theory-driven evaluation can provide the foundation for program design that is reflective of and responsive to culture and cultural context. Discussion between the audience and the presenter is expected to focus on how cultural responsiveness in program design ultimately leads to reflective, accurate, and credible evaluations.


Abstract 3 Title: Contribution of Stress Test to Program Design: A Theory-Driven Evaluation Approach
Presentation Abstract 3:

The author proposes a stress test for intervention program plans, through which evaluators can contribute to program design. When a program plan is completed but before implementation begins, everyone benefits from an assessment of the plan’s quality. The stress test, based on the action model/change model schema, is intended to provide such information. Evaluators invite key stakeholders, the design team, and other experts to conduct a set of exercises that challenge crucial assumptions of the program plan in order to detect its potential strengths and weaknesses. The test assesses the extent of support secured from stakeholders and the community, the feasibility of reaching the target population, the degree of difficulty of implementation, and the likelihood of producing desirable changes and unintended consequences. The test reveals a plan’s potential weaknesses so that improvements can be made, as well as its potential strengths. A program plan that has passed the stress test will have a greater chance of success when implemented in a real-world setting.


Audience Level: Intermediate

Session Abstract: 

This panel explains why theory-driven evaluation (TDE) is useful for designing a program: what concepts and approaches can be applied in the planning process, and how evaluators can contribute to a design team’s work. The three papers in this panel each contribute to the application of TDE in program design in their own way. Donaldson’s paper discusses the relationships between theory, evaluation, and program design; Bledsoe’s paper focuses on how to use theory to design a culturally responsive program; and Chen’s paper discusses how to conduct a stress test on the major theoretical assumptions underlying a program, exposing the potential weaknesses and strengths of a program plan so that it can be improved. Evaluation and Program Planning journal editor Jonny Morell and the Chair of the TDE TIG, Charles Gaper, will serve as discussants, providing feedback and insights on the presentations and on theory-driven evaluation’s growing role in program design.
