Evaluation 2018: Speaking Truth to Power


Giving an active voice to respondents through effective survey design

Session Number: 1427
Track: Quantitative Methods: Theory and Design
Session Type: Panel
Tags: quantitative methods, questionnaire design, Survey Design, Survey Methodology, Survey Research
Session Chair: Zehra Mirza [Evaluation Officer - Institute of International Education (IIE)]
Discussant: Laura Pryor [Social Policy Research Associates]
Presenter 2: Carina Omoeva [Director, Research - FHI 360]
Presenter 3: Ann Doucette [Professor - The Evaluators' Institute]
Presenter 4: Zehra Mirza [Evaluation Officer - Institute of International Education (IIE)]
Session Facilitator: Mirka Martel [Head of Monitoring, Evaluation and Learning - Institute of International Education]
Second Author or Discussion Group Leader: Ann Doucette [Professor - The Evaluators' Institute]
Third Author or Discussion Group Leader: Carina Omoeva [Director, Research - FHI 360]
Fourth Author or Discussion Group Leader: Zehra Mirza [Evaluation Officer - Institute of International Education (IIE)]
Time: Nov 02, 2018 (04:30 PM - 05:30 PM)
Room: CC - 21

Abstract 2 Title: How do evaluators measure educational inequities in surveys to ensure data comparability?
Presentation Abstract 2:

Measuring equity in surveys is a widely recognized challenge for education systems and, more broadly, for societies, and the new global agenda makes improved equity in education a central focus of the Sustainable Development Goals. To address this challenge, the Education Equity Research Initiative developed recommendations for improving surveys, including standardized modules for collecting data on SES, ethnicity, orphanhood, and displacement status, as well as the new Washington Group module on disability. The presentation will focus on using these modules to streamline definitions of equity dimensions, improve data comparability, and measure equity dimensions that are commonly overlooked. It will be of particular interest to those considering what more we could reasonably do to measure inequality and work toward greater equity in and through education. In doing so, the presentation stresses the important role of development programming in building equity at the community level and frames it as an essential element of efforts to achieve the global equity agenda.
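As a concrete illustration of how a standardized module can support comparability, the sketch below applies the published Washington Group Short Set cutoff (a respondent is identified as having a disability when any of the six functional domains is rated "a lot of difficulty" or "cannot do at all") to a single respondent record. This is a minimal, hypothetical example; the field names and respondent data are illustrative and are not drawn from the Initiative's materials.

```python
# Minimal scorer for the Washington Group Short Set on Functioning,
# one of the standardized modules mentioned above. The cutoff rule follows
# the published Washington Group guidance; variable names are hypothetical.

WG_DOMAINS = ["seeing", "hearing", "walking", "cognition", "self_care", "communication"]

# Standard four-point response categories for each domain question.
RESPONSES = {
    1: "no difficulty",
    2: "some difficulty",
    3: "a lot of difficulty",
    4: "cannot do at all",
}

def has_disability(record: dict) -> bool:
    """Recommended cutoff: disability is identified if any domain is rated
    'a lot of difficulty' (3) or 'cannot do at all' (4)."""
    return any(record.get(domain, 1) >= 3 for domain in WG_DOMAINS)

# Hypothetical respondent record keyed by domain.
respondent = {"seeing": 2, "hearing": 1, "walking": 3,
              "cognition": 1, "self_care": 1, "communication": 2}
print(has_disability(respondent))  # True: 'walking' meets the cutoff
```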


Abstract 3 Title: Response Options and Scales – What do they really reveal?
Presentation Abstract 3:

While we struggle to ensure that the language and content of survey items are understandable and unbiased, far less attention is given to developing reliable response options for those items. We assume that, in using survey rating scales, respondents construct a mental schema for determining the appropriate response to each item and apply it consistently throughout the survey, establishing choice boundaries that differentiate strongly agree from agree, or often from almost always. These scales are then used to segment samples by strength of agreement, judgments about duration and frequency, factual accuracy, and so on. This presentation will address how item response theory (IRT) measurement models more accurately detect and assess the measurement error associated with response scale options. It will include examples from evaluation outcome and impact studies in which secondary analyses using IRT revealed that disappointing findings were attributable to measurement error rather than to the program, and will offer alternatives to the heavily relied-upon rating scale response format.
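To make the concern about response-scale measurement error concrete, here is a minimal simulation sketch, not drawn from the presenter's studies, of a single five-point item under a graded response model. When category thresholds are unevenly spaced, equal steps in the raw Likert score do not correspond to equal steps in the underlying trait, which is the kind of distortion an IRT analysis can detect. All parameter values below are hypothetical.

```python
import numpy as np

def graded_response_probs(theta, a, thresholds):
    """Category probabilities for one item under a graded response model.
    thresholds are the locations separating adjacent response categories."""
    # Cumulative probability of responding in category k or higher.
    cum = 1.0 / (1.0 + np.exp(-a * (theta[:, None] - thresholds[None, :])))
    cum = np.hstack([np.ones((theta.size, 1)), cum, np.zeros((theta.size, 1))])
    return cum[:, :-1] - cum[:, 1:]          # P(X = k) for k = 0..4

# Hypothetical 5-point item with unevenly spaced category thresholds:
# moving from 'agree' to 'strongly agree' is a much larger step on the
# latent trait than moving from 'neutral' to 'agree'.
a = 1.5
thresholds = np.array([-2.0, -1.0, -0.3, 1.8])

theta = np.linspace(-3, 3, 13)               # true latent trait levels
probs = graded_response_probs(theta, a, thresholds)
expected_raw = probs @ np.arange(5)          # expected raw score, coded 0..4

for t, r in zip(theta, expected_raw):
    print(f"theta={t:+.1f}  expected raw score={r:.2f}")
# Equal raw-score gains do not correspond to equal gains in theta, so a
# program effect near the top of the scale can look 'disappointing' in raw
# scores even when the latent change is substantial.
```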


Abstract 4 Title: Improving the Robustness of Surveys to Evaluate Fellowship Programs
Presentation Abstract 4:

As with all development interventions, assessing the impact of higher education fellowship programs requires objective, well-structured surveys. IIE’s Alumni Tracking Study for the Ford Foundation International Fellowships Program (IFP) developed two global alumni surveys to collect data from 4,300 alumni across 22 countries. This presentation will share how IIE’s approach to survey design has evolved over the course of the tracking study to capture more nuanced data and provide inclusive response categories to survey participants. Because this longitudinal tracking study relies on advanced quantitative methods, cultural sensitivities were considered in survey design, especially when addressing topics such as political unrest and barriers to employment. Response scales were also modified so that respondents could speak to any negative impacts the fellowship may have had on their personal and professional trajectories. An integrated approach to designing the second alumni survey, informed by qualitative fieldwork, has proven most effective in giving a voice to survey participants, particularly for a study population comprising individuals who were historically excluded from higher education opportunities.


Audience Level: Beginner, Intermediate, Advanced, All Audiences

Session Abstract (150 words): 

Surveys are a popular tool for evaluators because they carry the most weight with sponsors seeking quantifiable evidence of their programs’ impact. Survey design, however, is not always guided by principles that promote equity and truth. This panel will speak to a range of methodological approaches to survey design that advance inclusiveness and truth, such as the phrasing and ordering of questions and the drafting of unbiased response scales that promote honest responses. Panelists will reflect on the various ways in which their organizations create surveys to collect data that are comprehensive and reliable, yet also sensitive to respondents.


