Evaluating Programs for Elementary and Middle School Students

Session Number: PreK6
Track: PreK-12 Educational Evaluation
Session Type: TIG Multipaper
Session Chair: Krista Collins [Director, Strategy and Innovation - Boys and Girls Clubs of America]
Discussant: Shira Solomon [Independent Evaluator - Solomon Evaluation]
Presenter 1: David Fleming [Associate Professor - Furman University]
Presenter 2: Jennifer Sulewski [Senior Research Associate - Institute for Community Inclusion]
Presenter 3: Linlin Li [Senior Research Associate/Project Director - WestEd]
Presenter 4: Teresa C King [Coordinator, Applied Research and Program Evaluation - Fort Worth ISD]
Presentation 1 Additional Author: David Fleming [Associate Professor - Furman University]
Presentation 2 Additional Author: Agnieszka Maria Zalewska [Research Associate - School for Global Inclusion and Social Development, UMass Boston]
Presentation 2 Additional Author: Jennifer Sulewski [Senior Research Associate - Institute for Community Inclusion]
Presentation 3 Additional Author: Linlin Li [Senior Research Associate/Project Director - WestEd]
Presentation 3 Additional Author: Cathy Ringstaff [Senior Research Associate - WestEd]
Presentation 3 Additional Author: Kylie Flynn [Senior Research Associate - WestEd]
Presentation 4 Additional Author: Teresa C King [Coordinator, Applied Research and Program Evaluation - Fort Worth ISD]
Time: Oct 28, 2016 (08:00 AM - 09:30 AM)
Room: A601

Abstract 1 Title: A Mixed-Method Evaluation of Public Montessori Programs in South Carolina
Presentation Abstract 1:

This paper describes an ongoing, mixed-method study of public Montessori programs in South Carolina by the Riley Institute at Furman University. The research team is using state databases, teacher and principal surveys, classroom observations, and new measures of non-academic outcomes to provide a holistic evaluation of Montessori programs. In this paper, I detail the evaluation design decisions and tradeoffs. Of particular importance is an innovative case selection strategy to mitigate the problem of selection bias. The Riley Institute research team also used a matching procedure to create a sample of non-Montessori students for analysis purposes. Further, I discuss our preliminary results on a host of important outcomes, such as standardized test scores, model fidelity, and student behavior. 


Abstract 2 Title: Using an iterative approach to evaluate a web-based college readiness intervention for middle school students
Presentation Abstract 2:

The Future Quest Island project uses an interactive, accessible web-based game for middle school students, as well as accompanying tools for teachers, to introduce college and career readiness topics into inclusive classrooms. Evaluation is an important part of this effort, informing a yearly Knowledge to Action Iterative Cycle (KTAIC). This cycle, currently in its third of five school years, includes implementing the intervention in partner schools; collecting data from students, teachers, and parents; and making changes to both the intervention and the evaluation tools each year. Evaluating such a dynamic and developing project has presented challenges, including finding or designing the right instruments, coordinating and tracking data collection with an ever-growing number of participants, and figuring out the best ways to collaborate with schools. This presentation will focus on how we addressed these challenges, some of the lessons learned, and how using an iterative process is improving our evaluation.


Abstract 3 Title: Evaluation of Word Learning Strategies: A Program for Upper-Elementary School Students
Presentation Abstract 3:

A significant number of our nation’s students do not develop the level of reading proficiency they need to achieve in school, successfully join the increasingly knowledge-oriented workforce, and assist the U.S. in competing in the global economy. Reading is a complex process involving multiple interrelated components, and vocabulary is one of the most important of these components (Bowers & Kirby, 2009; Carlisle, 2010; McCutchen & Logan, 2011). The Word-Learning Strategies (WLS) program is a comprehensive supplementary program designed to develop upper-elementary students’ vocabularies in order to improve reading comprehension. The goals of this paper are to address (1) the feasibility of implementing the WLS program in urban elementary schools with high numbers of English learners and students from poverty backgrounds; (2) the potential impact of the WLS program on students’ vocabulary and reading comprehension; (3) implications for vocabulary instructional practice; and (4) implications for evaluation theory, method, and practice.


Presentation 3 Other Authors: Rachel Tripathy, rtripat@wested.org
Abstract 4 Title: Rethinking Early Warning Systems: How Early Is Early Enough?
Presentation Abstract 4:

This session will provide an overview of the design of a school district's Early Warning System. Early warning systems using indicators associated with student dropout began gaining momentum between 2005 and 2008, and the ISD created its system in 2009. Using the work of Jerald and others (Balfanz, Allensworth, & Jerald, 2008; Heppen & Therriault, 2008; Jerald, 2006), the ISD built an early warning system based on historical data of students who dropped out of the district between 2005-06 and 2007-08. Upon completion, the system focused on identifying four critical indicators: core course failure, absenteeism, discipline referrals, and federally defined at-risk indicators. While the system does not predict dropout, it generates a daily weighted index that identifies students with excessive absences and/or discipline referrals, failing core course grades, and multiple at-risk criteria, allowing the district an opportunity to intervene.


Audience Level: None

Session Abstract:  Evaluating Programs for Elementary and Middle School Students