Session Number: 2769
Track: Program Design
Session Type: Topical Interest Group (TIG) Business Meetings
Tags: DesignThinking, Developmental Evaluation, Program Design
Session Chair: Chi Yan Lam [Queen's University]
Presenter 1: Huey T. Chen [Mercer University]
Presenter 2: John Gargani [Gargani + Company Inc]
Presenter 3: Brenda Stead [Stead Consultants]
Presenter 4: Cameron Norman, CE [CENSE Research + Design]
Time: Nov 12, 2015 (03:00 PM - 04:30 PM)
Abstract 1 Title: Advancing Program Design and Evaluation Theory through Bilateral Learning
Presentation Abstract 1: Abundant evaluation literature discusses how program design can benefit from principles and guidance provided by evaluation theory. For example, the experimentation evaluation approach has demonstrated that the Campbellian validity typology is useful for guiding the design of a program in a way that allows for a rigorous assessment of its effectiveness. However, far less literature discusses how evaluation theory can learn from program design and its application in the real world. This paper attempts to address that imbalance by arguing that knowledge of program design can provide innovative ideas and insights for developing new evaluation concepts and theories. The paper will illustrate that stakeholders do not share identical views with researchers regarding program design, implementation, and effectiveness. By closely examining these differences and their rationales, evaluators could further advance evaluation theory. For example, program design concepts such as program complexity and openness have the potential to integrate various stand-alone evaluation theories. Both program design and evaluation theory can benefit from bilateral learning.
Abstract 2 Title: Promoting Exemplary Programs with New Program Design Methods
Presentation Abstract 2: Evaluators have participated in program design and redesign efforts for at least 70 years. Yet as a profession, we do not have a well-developed understanding of how programs should be designed or how evaluators can best support the design process. Patton’s concept of Developmental Evaluation has renewed interest in program design, but it does not fully bridge the theory-practice divide. I describe recent work on program design methods that Stewart Donaldson and I have undertaken in an effort to turn the messy work of design into a systematic, collaborative effort. It is our hope that this work opens the door to teams of designers, evaluators, and stakeholders working together to create programs that reliably deliver the impacts that are most valued.
Abstract 3 Title: Engaging Evaluators in Program Design and Policy
Presentation Abstract 3: It is predominantly the case that evaluators are not “invited to the party” where policies are developed and programs are conceptualized, designed or re-designed. How can we better bridge evaluation with policy and program design? How can evaluators engage ‘early on’ in policy and program decision-making, and use their competencies to add value to program design, theory, and practice?
As a consultant, educator, evaluator, trainer, and former federal government director of strategic program planning, evaluation, and organizational change, our next panel member will share the recent experience she and her academic colleague had in developing and delivering a training initiative for a provincial government department in Canada. This initiative was undertaken with senior officials, senior program managers, managers, and program and evaluation consultants to build team capacity in program design and evaluation. Also, as a Canadian Evaluation Society (CES) Board member heavily involved in the Professional Learning Committee, the Joint Committee on Standards for Educational Evaluation, and the AEA Program Design TIG, she will discuss how evaluator competencies can be applied to facilitate program design and policy at an early stage.
Abstract 4 Title: Social Innovation and Program Design
Presentation Abstract 4: Social innovation is a process of bringing together the needs of society with a plan to develop something new as a means of addressing those needs. Yet what is considered new, what value it holds, and who comes together to create those innovations, and how, are areas where evaluation can and does make a substantive contribution. Evaluation knowledge is the key to connecting ideas to a design and its outcomes as it operates within systems of influence. In this presentation, the role of evaluation as a part of program design for social innovation will be explored through examples from academic research and practice, with references to relevant design and systems theories. The aim is to showcase the role of the evaluator and designer as part of the social innovation process.
Audience Level: Advanced
Notions of design have entered the mainstream in both the public and private sectors. Underpinning this shift is an emerging realization that the approaches and mindsets designers employ to solve complex problems, once the province of design professionals, may be applied to other contexts. Bridging evaluation with design holds the potential to reconceptualize both the theories and practices of evaluation and, as a consequence, to enhance evaluation's influence. This panel of expert evaluators draws on their theoretical and practical experience to explore what ‘program design’ could mean for evaluators and evaluation practice.