Responsive Evaluation to Broaden Participation in STEM

Session Number: 2561
Track: STEM Education and Training
Session Type: Multipaper
Session Chair: Karen Mutch-Jones [Director, STEM Education Evaluation Center - TERC]
Presenter 1: Debra Bernstein [Senior Researcher - TERC]
Presenter 2: Sabrina Gisselle De Los Santos [Research and Development Associate]
Presenter 3: James Hammerman [Director, STEM Education Evaluation Center (SEEC) - TERC]
Presenter 4: Michael Cassidy [TERC]
Presentation 1 Additional Author: Karen Mutch-Jones [Director, STEM Education Evaluation Center - TERC]
Presentation 2 Additional Author: Eric Hochberg [Senior Researcher and Evaluator - TERC]
Presentation 3 Additional Author: Audrey E Martinez-Gudapakkam [Research Associate - TERC]
Presentation 4 Additional Author: Santiago Gasca [Research Associate - TERC]
Presentation 4 Additional Author: Rachel Hayes [Project Manager - TERC]
Time: Nov 15, 2019 (04:30 PM - 05:30 PM)
Room: CC 200 B

Abstract 1 Title: The Evaluation Landscape: Identifying Challenges and Motivating Change
Presentation Abstract 1:

As members of the STEM Education Evaluation Center at TERC, much of our work focuses on designing and implementing evaluations that are responsive to underserved populations in STEM, and that inform and support programs intended to broaden the participation of students from preschool through graduate school. In particular, we study learning experiences and outcomes for students of color, students with disabilities, English language learners, and girls/women in STEM classrooms and informal settings.

Based on evidence from the field and results of numerous projects, our first presentation will: frame evaluation issues and challenges that must be addressed if our goal is to ensure high-quality learning opportunities for all; suggest how evaluation data can motivate critical changes at the program and organizational levels; and identify how mistakes and successes enabled staff to improve all phases of evaluation work, from design to dissemination, leading to more inclusive and equitable educational experiences.


Abstract 2 Title: Evaluation Design that Promotes Inclusion
Presentation Abstract 2:

We will share how the evaluation questions and design features of specific projects allowed us to identify programmatic barriers that affect participants, recommend evidence-based revisions to the partners who build and run programs, and demonstrate the value of inclusivity to stakeholders (e.g., funders and those who authorize the activities and study). For example, we will describe our evaluation of Backyard Wilderness, a project promoting environmental awareness and the activation of citizen scientists, highlighting ways in which measures of effective Bioblitz implementation, family engagement, and motivation required revision to fully capture the experiences of certain remote-rural and urban communities. In addition, we will describe how evaluators found that some implementations of Learn to Mod, a project where students learn computer science through gaming, were more supportive of girls’ participation than others. Analysis of these features helped program partners improve activities, which cultivated broader interest and participation. The evaluation designs for both projects informed future programming.


Abstract 3 Title: Accessible and Appropriate Instrumentation and Data Collection
Presentation Abstract 3:

The experiences of participants who have disabilities and/or are non-English speaking compelled us to identify and design accessible instruments to improve the quality of the data collected. To measure learning in IDATA, a project where blind and visually impaired students engage with astronomy data, online survey tools required testing to ensure screen reader compatibility and appropriate navigation options. Measuring computational thinking (CT) required the creation of non-visual CT problem-solving contexts because most existing assessments are visually oriented. For Aprendiendo de familias: Learning Math Talk from PreK Spanish Speaking Families, which aims to close the kindergarten-readiness gap for children without access to affordable PreK, quick and manageable mobile applications are being developed to collect poll, quiz, video, and user data from caregivers. By increasing the evaluation comfort of non-English-speaking immigrant parents, we increase the likelihood of collecting sufficient and nuanced data to address the needs of this important population.


Abstract 4 Title: Analysis, Reporting, and Dissemination to Support Change and Empowerment
Presentation Abstract 4:

Data analysis requires a broadening-participation mindset, even in well-designed and well-executed studies. Small studies warrant checking individual outliers to identify how demographic variables or prior experiences qualify findings. For example, probing anecdotal data for instances of success in Zoombinis, a game to support computational thinking, revealed that some of the more engaged, higher-performing students were on the autism spectrum. As a result, targeted future studies of gaming to support these students are underway. In a large-scale elementary math study, fidelity of program implementation significantly narrowed the achievement gap in high-need districts. Probing further, we found that administrative barriers influenced teachers’ consistency of program use.

By sharing examples from reports and dissemination products with attendees, we will encourage discussion about accessible evaluation processes and about effective communication of findings on STEM experiences and persistent challenges for underrepresented learners and educators. Together, we will consider dissemination outlets and approaches to advance educational equity in STEM.

Audience Level: All Audiences

Session Abstract (150 words): 

Responsive program evaluations can create access to educational opportunities for underrepresented students in Science, Technology, Engineering, and Mathematics (STEM).  This session will include presentations on effective practices for evaluating learning experiences and outcomes for students of color, students with disabilities, English language learners, and girls/women in STEM classrooms and informal settings, as we identify how mistakes and successes enabled evaluator-presenters to improve all phases of evaluation work—from design to dissemination.  Within each presentation, we provide examples (via report excerpts, artifacts, and video) to illuminate evaluation processes and evidence-based recommendations.  We will encourage discussion and questions about design decisions, accessible instruments to improve data collection and quality, and analytic approaches that yield more complex results, enabling project partners and stakeholders to value and work toward inclusivity.  Lastly, we will discuss dissemination approaches and outlets that enable evaluators to share findings and advocate for broadening participation in STEM.