Evaluation in health department settings: Examples from state/local STD programs

Session Number: 1490
Track: Government Evaluation
Session Type: Panel
Session Chair: Brandy L. Maddox [Centers for Disease Control and Prevention]
Discussant: Marion Carter [Team Lead for Program Evaluation, Division of STD Prevention]
Presenter 1: Nicole Olson Burghardt [Epidemiologist - California Department of Public Health]
Presenter 2: Roxanne Ereth [STD Control Epidemiology Program Manager - AZ Dept. of Health Services]
Presenter 3: Mary Roach [Michigan Department of Community Health]
Presenter 4: Kirsten Durzy [Evaluation Specialist - New Hampshire Department of Health]
Presenter 5: Trang Nguyen [Epidemiologist - San Francisco Dept of Public Health]
Time: Oct 27, 2016 (08:00 AM - 09:30 AM)
Room: International South 5

Abstract 1 Title: Increases in adherence to gonorrhea treatment recommendations in three California local health jurisdictions
Presentation Abstract 1:

Rationale: To counter emerging drug-resistant gonorrhea (GC), adherence to recommended treatment guidelines is essential. Treatment monitoring in the state surveillance system indicated that improved provider adherence and reporting were needed. The California Department of Public Health partnered with local health jurisdictions (LHJs) to increase treatment adherence and reporting.
Evaluation Question: Did contacting low-performing providers result in improved treatment adherence at the provider and LHJ level?
Methods: From March to December 2015, three LHJs selected local providers that were incorrectly treating GC cases and/or not reporting GC treatment, and contacted them using visits, phone calls, and/or letters.
Findings: All LHJs increased treatment adherence (a 32.6% absolute change from 2013 to 2015, p<.0001) and data completeness. All contact methods had a positive impact at both the provider and LHJ levels.
Barriers: This activity required local resources and may not be scalable elsewhere. Certain contact methods were more resource intensive (visits) than others (phone).
Positive Gains: By prioritizing low-performing providers, all three LHJs improved treatment adherence and reporting.

Abstract 2 Title: Evaluation of Efforts to Increase DIS Partner Services in Arizona
Presentation Abstract 2:

In 2015, the Arizona STD Control Program (STDCP) conducted an evaluation to determine the effectiveness of new trainings designed to increase partner services for primary and secondary (P&S) syphilis cases.
Evaluation questions were designed to measure partners elicited and interviewed, time to interview, duplicate and linked records, and missing data. For comparison, 2014 baseline measures were calculated for the partner, cluster, and disease intervention indices (PI, CI, and DII, respectively). Preliminary results showed that all indices increased: the DII by 30.0%, the PI by 65.9%, and the CI by 466.6%.
Barriers to evaluation included a lack of staff with the knowledge to develop evaluation plans, changes in county processes and miscommunications between the state and counties that interfered with evaluation plans, and difficulty obtaining accurate measures. However, the evaluation experience improved relationships as data were shared, measures increased, and missed opportunities for disease intervention were identified.

Abstract 3 Title: What’s Next? Engaging Program Staff in Participatory Evaluation Process: Lessons Learned from Michigan’s STD Syphilis Partner Notification Program
Presentation Abstract 3:

In 2015, given the staff and resources dedicated to reducing primary and secondary syphilis, Michigan's STD Program chose to assess the barriers to, and facilitators of, productive client interviews. A mixed-method participatory evaluation plan was implemented, including client surveys, DIS focus groups, interview observation forms, and a partnership survey. The primary evaluation questions were:
1. What barriers and facilitators influence the individual’s decision to interview with a DIS?
2. What strategies result in a productive partner service interview?
3. What barriers and facilitators affect existing relationships between community providers and the State STD team?

While numerous facilitators of and barriers to a productive interview were identified through the evaluation, perhaps the most important lessons learned came from the process itself. The participatory approach employed by the evaluator provided the basis for staff to take ownership of the project, use the findings, and create actionable and sustainable plans for program improvement.

Abstract 4 Title: Targeting evaluation efforts in STD programs: New Hampshire's experience
Presentation Abstract 4:

The New Hampshire STD program is evaluating the use of Public Health Detailing (PHD) to (1) increase knowledge of STD testing and treatment guidelines and recommendations and (2) increase the implementation of practice and policy modifications that support STD services in medical provider offices across NH. The evaluation process includes an assessment of an on-site educational visit, during which information about the practice is collected, followed by two surveys 90 days apart that collect information about knowledge shift, intention to modify services, and whether any actual changes have occurred as a result of the detailing visit. To date, NH has provided PHD services to over 200 providers. Initial results indicate a strong shift in knowledge of STD testing and treatment guidelines, as well as an intention to make practice modifications to support increased STD services.

Abstract 5 Title: Targeting evaluation efforts in STD programs: San Francisco's experience
Presentation Abstract 5:

The San Francisco Department of Public Health (SFDPH), one of six local jurisdictions funded directly by CDC for STD prevention, conducts more direct service and program activities than state health departments typically do. Developing a targeted evaluation plan at first felt onerous, given our limited experience conducting formal evaluations. The CDC guidance tempered our concerns by providing clear requirements and examples of simple yet robust plans. Despite guidance on selecting a single activity to evaluate in one year, SFDPH expanded its activity mid-year from 2 clinics to 10 clinics because of opportunities to engage with the local health network. Engaging this larger provider community, however, brought more stakeholders into the process, leading to delays in decision-making and action. CDC's flexibility to adjust expectations as local issues arose was important, but continually redefining short-term goals for the sake of evaluation should itself be evaluated.

Audience Level: All Audiences

Session Abstract: 

This panel will focus on evaluation in health departments, using examples from sexually transmitted disease (STD) control programs. Like other public health programs, STD programs are complex, diverse, and changing, yet they typically lack staff with much evaluation expertise. As part of its funding for STD programs, the Centers for Disease Control and Prevention (CDC) implemented a "targeted evaluation" requirement to foster quality evaluation that remained flexible to each jurisdiction's needs and capacity. The results have been mixed, but largely positive. The panel will include presentations from five STD programs and a discussion by CDC. Each will describe their experiences with this requirement, including the process for targeting evaluation, barriers, facilitators, and outcomes obtained. The purpose is to illustrate ways that relatively small health department programs have planned, executed, and used evaluation, and to relay lessons learned for others seeking to support and conduct evaluation in resource-constrained public sector settings.