Evaluation 2017: From Learning to Action


Innovative Qualitative Approaches to Capturing Complexity: Three Case Studies

Session Number: 2682
Track: Qualitative Methods
Session Type: Multipaper
Session Chair: Jeremy Braithwaite [Evaluation Manager - Social Solutions International, Inc.]
Presenter 1: Ellen Walker [Research Associate - Social Solutions International, Inc.]
Presenter 2: Karen Chen [Senior Monitoring and Evaluation Specialist - Social Solutions International, Inc.]
Presenter 3: Hillary Eschenburg [Research Assistant - Social Solutions International, Inc.]
First Author or Discussion Group Leader: Jeremy Braithwaite [Evaluation Manager - Social Solutions International, Inc.]
Second Author or Discussion Group Leader: Karen Chen [Senior Monitoring and Evaluation Specialist - Social Solutions International, Inc.]
Third Author or Discussion Group Leader: Hillary Eschenburg [Research Assistant - Social Solutions International, Inc.]
Fourth Author or Discussion Group Leader: Ellen Walker [Research Associate - Social Solutions International, Inc.]
Time: Nov 09, 2017 (08:00 AM - 09:00 AM)
Room: PARK TWR STE 8205

Abstract 1 Title: Multiple Qualitative Methodologies in a Call Center Evaluation
Presentation Abstract 1:

Call center evaluations have historically employed multiple methodologies to capture data on the impact, reach, and quality of these operations. Interviews have been the primary qualitative measure in the majority of call center evaluation studies (An et al., 2006; Gould et al., 2012; Schauer et al., 2008; Spaulding et al., 2013). However, a more recent evaluation of a teenage pregnancy hotline augmented interview and focus group activities with agency site visits to glean additional insights on program goals and training needs (Wei et al., 2010). Building from this empirical literature, the presenter designed a unique evaluation of a women’s health call center that incorporated multiple qualitative methodologies (in-depth interviews, staged quality assurance calls, and on-site observations) to provide a more holistic assessment of the call center’s impact, reach, and effectiveness.


Abstract 2 Title: Using Most Significant Change (MSC) to Capture Outcomes in International Human Rights Programs
Presentation Abstract 2:

The field of evaluation practice for international democracy and human rights programs has struggled to measure impact and success, particularly for programs operating in restrictive societies and in countries under democratic transition. As noted in Krishna Kumar’s 2013 publication, Evaluating Democracy Assistance, democracy and human rights programs face unique evaluation challenges not found in other foreign assistance programs. The path to democracy is not linear; success can be incremental and may take years, even decades. One donor funding international human rights programs has required its implementing partners to incorporate Most Significant Change (MSC) as one of their monitoring and evaluation tools for capturing and reporting programmatic outcomes. Because MSC was a new M&E technique for them, both the donor and the implementers initially struggled with it, but MSC ultimately proved an effective evaluation tool that captured more substantive programmatic outcomes. The presenter will share lessons learned and best practices.


Abstract 3 Title: Strategies for Securing High Response Rates to Qualitative Questions in Online Surveys
Presentation Abstract 3:

The evaluators have identified strategies for securing high item response rates to qualitative questions (which have notoriously low response rates) while ensuring methodologically sound data points. As part of a needs assessment of a global organization’s workforce, conducted to inform the redesign of its leadership training curriculum, the evaluators administered a large online survey that included both quantitative and qualitative indicators. Between 63 and 73 percent of respondents answered two of the open-ended questions, yielding approximately 1,750 responses for one question and more than 2,000 for the other. These high response rates provided uniquely rich data that the evaluators capitalized on. The presenter will share her insights for garnering more qualitative responses to online surveys and discuss how the evaluators overcame the distinctive challenge of analyzing large sets of raw qualitative data while ensuring highly robust and valid data points.


Theme: Learning to Enhance Evaluation
Audience Level: All Audiences

Session Abstract (150 words): 

Qualitative methodology is a powerful data collection tool for addressing complex issues, which are often hard to quantify. But analyzing qualitative data can feel overwhelming, and evaluators can struggle to deliver meaningful results while maintaining high scientific rigor. In this session, evaluators from Social Solutions International, Inc. will share case studies from three evaluations that incorporated innovative qualitative methodologies and discuss the best practices and challenges associated with each. The first case study describes an evaluation that combined multiple qualitative methods to provide a more holistic assessment. The second uses Most Significant Change (MSC) to capture outcomes from organizations implementing international human rights programs. The third discusses an evaluation that secured high response rates to qualitative questions while still ensuring robust and valid data. An engaging discussion will help attendees develop strategies for incorporating qualitative methods into their own evaluations.


