Evaluation 2016: Evaluation + Design


Iterating Empirical Data Collection and Agent-Based Modeling as an Evaluation Methodology

Session Number: 1174
Track: Systems in Evaluation
Session Type: Panel
Tags: complexity, systems
Session Chair: Melanie Hwalek [CEO - SPEC Associates]
Presenter 1: Jonathan Morell [Director of Evaluation - Syntek]
Presenter 2: Jonathan Morell [Director of Evaluation - Syntek]
Presenter 3: Kirk Knestis [CEO - Hezel Associates]
Presenter 4: Melanie Hwalek [CEO - SPEC Associates]
Presentation 2 Additional Author: Kirk Knestis [CEO - Hezel Associates]
Presentation 2 Additional Author: Melanie Hwalek [CEO - SPEC Associates]
Presentation 2 Additional Author: Caitlin Griffin [Lead Research Associate - Network For Teaching Entrepreneurship]
Presentation 3 Additional Author: Caitlin Griffin [Lead Research Associate - Network For Teaching Entrepreneurship]
Time: Oct 27, 2016 (08:00 AM - 09:30 AM)
Room: L406

Abstract 1 Title: History and context
Presentation Abstract 1:

The work presented in this panel derives from a longstanding research agenda whose overall purpose is to advance the contribution of “complexity” to the field of evaluation. We differentiate “complexity” from “complex system” because the latter is a subset of the former. A major aspect of our research has been to explore the contribution of agent-based modeling to evaluation as it is traditionally conducted. This presentation will provide an overview of the work we have done to date and will prime the audience for important elements to look for in the video that follows.


Presentation 1 Other Authors: H. Van Dyke Parunak (van.parunak@gmail.com), VP, Technology Innovation, AxonAI, Inc.
Abstract 2 Title: Video Presentation: Integrating Traditional Evaluation and Agent Based Modeling – History, Development, and Lessons Learned
Presentation Abstract 2:

This session will consist of a video that tells the story of our efforts. Topics will include: 1) development of the program theory through several iterations between the data analysis and the executable model; 2) an overview of what the data were and decisions about how they could be used to best advantage; 3) assumptions about the program that would not have been revealed using only traditional logic modeling methods; 4) evaluative knowledge about the program that resulted from iterating between traditional data analysis and modeling; 5) knowledge and understanding about evaluation that resulted from collaboration among team members with diverse backgrounds in STEM education, school support, community-based care, mental health, organizational behavior, and computer science; and 6) an understanding of how the methodology should be applied in the future.


Presentation 2 Other Authors: H. Van Dyke Parunak (van.parunak@gmail.com), VP, Technology Innovation, AxonAI, Inc.

Abstract 3 Title: Negotiating Access to AGEC Data
Presentation Abstract 3:

Given the purpose of the study, access to actual evaluation data was a requirement. Project partner Hezel Associates was charged with identifying such data from the firm’s past projects. The right data set would have to: 1) include enough variables to allow ABM experimentation with different theories of action, and 2) have been analyzed previously to provide context for this research. The data would also have to come from a willing partner who saw the additional analysis as being in its interest. Such a partner was found in AZTransfer, an Arizona higher-education collaborative working to facilitate student transfer from community colleges to four-year universities. The AZTransfer steering committee negotiated access to data documenting the AGEC program and continues to support the effort. This session will discuss the implications of this partnership for the study and for providing novel, useful information to the client.


Abstract 4 Title: Implications for Other Settings
Presentation Abstract 4:

Many of the educational institutions and nonprofit organizations that Melanie Hwalek works with collect longitudinal data as part of their normal monitoring and evaluation activities. Melanie’s presentation will provide examples of how the model being developed through the Faster Forward Fund grant might be applied to other types of data and other types of decision making. As a practicing evaluator with decades of experience evaluating a wide array of social programs, she will explore ways that evaluations SPEC Associates conducted in the past, such as those of community-based long-term care services, mental health interventions, and school-based support services, could have been enhanced had these methods been available then.


Audience Level: All Audiences

Session Abstract: 

This panel discusses the value of integrating three themes: 1) modeling and simulation, 2) agent-based approaches to complexity, and 3) combining modeling with traditional evaluation over the course of an evaluation’s life cycle. With funding from the Faster Forward Fund, we have been applying agent-based simulation to an empirical evaluation data set. The program we have been working with is the Arizona General Education Curriculum (AGEC), a program designed to facilitate students’ transition from community colleges to four-year colleges. The core of the presentation will be a video that describes the process through which our work evolved and the results of our efforts. The video will be embedded in a set of presentations that describe the history of our efforts, negotiating access to AGEC data, implications for use in other settings, and observations about the value and limitations of the method we have developed.
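To make the approach concrete, the sketch below shows, in highly reduced form, what an agent-based simulation of a transfer program of this kind might look like. It is not the AGEC model: every class name, parameter, and rate is a hypothetical assumption introduced for illustration, and the actual work iterates a far richer model against real longitudinal evaluation data.

```python
import random

# Hypothetical, minimal agent-based simulation sketch. All names and rates
# below are illustrative assumptions, not the AGEC model or its data.

class Student:
    """A community-college student who may complete an AGEC-style curriculum
    and transfer to a four-year institution."""

    def __init__(self, completion_prob, transfer_boost):
        self.completed_agec = False
        self.transferred = False
        self.completion_prob = completion_prob  # assumed per-term chance of finishing the curriculum
        self.transfer_boost = transfer_boost    # assumed lift in transfer odds after completion

    def step(self, base_transfer_prob):
        """Advance the student one term."""
        if not self.completed_agec and random.random() < self.completion_prob:
            self.completed_agec = True
        if not self.transferred:
            p = base_transfer_prob + (self.transfer_boost if self.completed_agec else 0.0)
            if random.random() < p:
                self.transferred = True


def run_simulation(n_students=1000, n_terms=8, completion_prob=0.15,
                   transfer_boost=0.10, base_transfer_prob=0.05, seed=42):
    """Run one scenario and return the share of students who transferred."""
    random.seed(seed)
    students = [Student(completion_prob, transfer_boost) for _ in range(n_students)]
    for _ in range(n_terms):
        for s in students:
            s.step(base_transfer_prob)
    return sum(s.transferred for s in students) / n_students


if __name__ == "__main__":
    # Compare two theories of action by varying the assumed completion rate,
    # the kind of "what if" experiment the panel describes running against data.
    for cp in (0.10, 0.20):
        rate = run_simulation(completion_prob=cp)
        print(f"completion_prob={cp:.2f} -> transfer rate={rate:.2%}")
```

In the methodology discussed in this panel, the parameters of such a model would not be invented; they would be calibrated against the empirical data set, and the model's behavior would in turn be compared with the data to refine the program theory across iterations.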


