Evaluation 2017: From Learning to Action


Bayesian approach in evaluation

Session Number: Quant5
Track: Quantitative Methods: Theory and Design
Session Type: TIG Multipaper
Session Chair: Haiyan Bai [Associate Professor - University of Central Florida]
Presenter 1: Robert A. Petrin [Vice President - Ipsos Public Affairs]
Presenter 2: Claudia Gonzalez Martinez [Senior Economist - Econometrica, Inc.]
Presentation 1 Additional Author: Joseph Zappa [Ipsos Public Affairs]
Presentation 1 Additional Author: Meghana Raja [Assistant Statistician - Ipsos Public Affairs]
Time: Nov 09, 2017 (11:30 AM - 12:15 PM)
Room: Marriott Balcony A

Presentation Abstract 1: Evaluation researchers often use mixed-effects models to analyze data from repeated-measures designs. These models generally require that the distribution of the program outcome being analyzed is reasonably well known and stable over time. They do not accommodate a situation common in longitudinal evaluations: one in which the intervention itself influences measurement error across the observation period. Finally, conventional mixed-effects models do not readily allow researchers to model anything beyond outcome means. The result is a loss of the richness inherent in longitudinal evaluations, which compromises program learning. This paper presents Bayesian quantile regression (BQR) as a remedy for many of these challenges. The empirical portion of the paper includes simulations that illustrate the advantages of BQR in longitudinal evaluation studies, followed by an application of BQR to data from a women’s entrepreneurship program in South Africa using a five-wave, repeated-measures design.
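The abstract does not include code, but the machinery BQR rests on can be sketched briefly. Bayesian quantile regression typically replaces a Gaussian likelihood with an asymmetric-Laplace working likelihood, whose kernel is the check (pinball) loss, so the posterior concentrates around a chosen quantile rather than the mean. The toy sketch below (all names and settings are hypothetical illustrations, not the presenters' implementation) estimates a single posterior quantile of a sample with a hand-rolled Metropolis sampler, fixing the scale at 1 and using a flat prior; a real analysis would use a full regression model and an established sampler.

```python
import math
import random

def check_loss(u, tau):
    """Pinball loss: tau * u if u >= 0, else (tau - 1) * u."""
    return tau * u if u >= 0 else (tau - 1) * u

def log_post(q, data, tau):
    # Asymmetric-Laplace working log-likelihood, scale fixed at 1, flat prior.
    return -sum(check_loss(y - q, tau) for y in data)

def metropolis_quantile(data, tau, n_iter=5000, step=0.5, seed=0):
    """Posterior mean of the tau-quantile via random-walk Metropolis."""
    rng = random.Random(seed)
    q = sum(data) / len(data)          # start at the sample mean
    lp = log_post(q, data, tau)
    draws = []
    for _ in range(n_iter):
        prop = q + rng.gauss(0, step)  # symmetric random-walk proposal
        lp_prop = log_post(prop, data, tau)
        if math.log(rng.random()) < lp_prop - lp:
            q, lp = prop, lp_prop
        draws.append(q)
    kept = draws[n_iter // 2:]         # discard the first half as burn-in
    return sum(kept) / len(kept)

# Hypothetical data: 500 standard-normal draws; target the 0.9 quantile.
random.seed(1)
data = [random.gauss(0, 1) for _ in range(500)]
q90 = metropolis_quantile(data, tau=0.9)
```

The posterior mean `q90` lands near the empirical 0.9 quantile of the sample; the same check-loss kernel, applied to regression residuals, is what lets BQR model the full conditional distribution rather than only its mean.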
Presentation 1 Other Authors: Dominick Hannah; Ipsos Public Affairs
Abstract 2 Title: Estimating and Attributing Impact in the Presence of Confounding Initiatives and Spillover Effects
Presentation Abstract 2: We develop an approach that combines Empirical Bayesian (EB) methods with a Conditional Demand Analysis (CDA) framework to attribute impact in situations where confounding initiatives and spillover effects make it difficult to identify control groups.
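The abstract gives no detail on the EB step, but the general idea can be illustrated. Empirical Bayes pools noisy unit-level impact estimates by shrinking each toward the overall mean, with the amount of shrinkage driven by how much of the observed spread is sampling noise rather than real variation. The sketch below (the function name and the method-of-moments variance estimate are illustrative assumptions, not the presenters' EB/CDA estimator) applies this to a set of site-level estimates with known sampling variances.

```python
def eb_shrink(estimates, variances):
    """Shrink noisy unit-level estimates toward the grand mean (toy EB step).

    estimates: observed impact estimates, one per unit
    variances: known sampling variance of each estimate
    """
    n = len(estimates)
    grand = sum(estimates) / n
    # Observed between-unit variance of the estimates.
    s2 = sum((e - grand) ** 2 for e in estimates) / (n - 1)
    # Method-of-moments estimate of the true between-unit variance tau^2:
    # observed spread minus average sampling noise, floored at zero.
    mean_v = sum(variances) / n
    tau2 = max(s2 - mean_v, 0.0)
    shrunk = []
    for e, v in zip(estimates, variances):
        w = tau2 / (tau2 + v)  # weight on the unit's own estimate
        shrunk.append(w * e + (1 - w) * grand)
    return shrunk

# Hypothetical example: two noisy site estimates with unit sampling variance.
sites = eb_shrink([0.0, 4.0], [1.0, 1.0])
```

When the observed spread is no larger than the sampling noise, `tau2` hits zero and every unit is pulled all the way to the grand mean; as the signal grows relative to the noise, the weights `w` approach one and the estimates are left nearly untouched.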
Audience Level: None

Session Abstract (150 words):  Bayesian approach in evaluation
