Open Letter

Using developmental evaluation to implement an oral pre-exposure prophylaxis (PrEP) project in Kenya

[version 1; peer review: 2 approved with reservations]
PUBLISHED 15 Oct 2020

Abstract

Oral Pre-Exposure Prophylaxis (PrEP) is highly effective in lowering HIV transmission risk. The Bill & Melinda Gates Foundation-funded Jilinde Project was designed to identify the best ways to introduce and support PrEP services in Kenya for female sex workers, men who have sex with men, and adolescent girls and young women. We chose Developmental Evaluation (DE) as a core project approach because our goal was not just to recruit 20,000 new PrEP users, but to learn how to deliver PrEP effectively to optimally benefit users in a complex, dynamic, resource-limited setting. This paper describes how we incorporated DE into the Jilinde Project, and shares experiences and lessons learned about the value of DE in PrEP service implementation in a real-world situation.

With the Ministry of Health, Jilinde developed consensus about the structure and roll-out of PrEP services. The DE evaluator, embedded in Jilinde, designed and implemented the five-step DE methodology—collect, review, reflect, record and act—according to a core set of project guiding principles. The paper describes how we operationalized the five elements, citing findings reported and actions taken as a result of reflecting on the data. It summarizes challenges to DE implementation, such as uneven uptake and competing demands, and how we addressed those challenges.

Used consistently, DE helped adapt and refine PrEP services, improve service access, reach target audiences and improve continuation rates. The look, feel and yield of our DE efforts evolved over time, becoming increasingly integrated into existing systems and providing deeper and richer understanding; in the process, we learned how to implement DE better in the future.

This case study provides practical guidance for using a DE approach in program design. The DE process can be used successfully when working with partners on a common, complex public health challenge within a dynamic environment, in a way that feeds back into and improves programs.

Keywords

PrEP, Developmental Evaluation, HIV prevention, key populations, Sub-Saharan Africa, public health practice, program design, intervention development

Introduction

What is Oral PrEP?

Oral Pre-Exposure Prophylaxis (PrEP) is a proven, highly effective intervention for lowering the risk of HIV transmission. When taken as prescribed, in accordance with the recommended dosing regimen during periods of HIV risk, oral PrEP is over ninety percent effective at protecting HIV-negative individuals against acquiring HIV infection (Baeten et al., 2012; Desai et al., 2017; Grant & Glidden, 2016; McCormack et al., 2016; Thigpen et al., 2012; Van Damme et al., 2012). Subsequently, WHO issued guidelines recommending PrEP “for people at substantial risk of HIV infection as part of combination HIV prevention approaches” (WHO, 2016). Biological protection requires sufficient dosing preceding the anticipated HIV exposure and thereafter according to the dosing regimen. Side effects are infrequent and mostly minor, but adherence is critical to the drug’s protective effect.

The Jilinde Project

Because this approach to preventing HIV infection is quite new, the best ways to reach those who can benefit most, and to support those who choose PrEP services to enroll and continue using PrEP effectively over time, are not well understood. To learn more, the Bill & Melinda Gates Foundation (BMGF) funded ‘Jilinde’ (a Swahili word that means “protect yourself”), a five-year (2016–2021) project in Kenya, to demonstrate and document an affordable, effective and sustainable model for scaling up oral PrEP as an HIV-prevention intervention under low-resource, real-market conditions. The project is implemented by a consortium of five partners: Jhpiego (lead), Kenya’s National AIDS and STI Control Programme (NASCOP), International Centre for Reproductive Health-Kenya (ICRHK), Population Services Kenya (PS Kenya) and Avenir Health. The Jilinde Project specifically targets enrolling female sex workers (FSWs), men who have sex with men (MSM) and adolescent girls and young women (AGYW) into oral PrEP services (Were et al., 2018; Were et al., 2019; Were et al., 2020).

Why choose Developmental Evaluation (DE)?

We chose DE as a core project approach because our goal was not just to recruit 20,000 new PrEP users, but to understand and implement the best, most effective PrEP services, within a complex, dynamic environment. We wanted to know the best ways to introduce people in each target group to PrEP services, and how to support them to continue using PrEP effectively over time.

The DE approach to intervention design, improvement and evaluation was developed by Michael Quinn Patton (Patton, 2011). It is deeply and continuously informed by those the intervention is meant to serve and fundamentally rooted within the context of beneficiaries’ communities. DE is methodologically agnostic, employing a range of rigorous qualitative and quantitative methods, and supplementing with other available data sources, for example, routine monitoring data, service statistics, key informant insights, and document reviews. DE implementers add data sources as they identify knowledge gaps to be addressed. DE has been defined as “an approach to understanding the activities of a program operating in dynamic, novel environments with complex interactions. It focuses on innovation and strategic learning rather than standard outcomes” (Patton, 2011). DE is particularly useful to develop innovative solutions under conditions of uncertainty and complexity, which is why we chose DE for the Jilinde Project (see Box 1).

Box 1. Features of Oral PrEP Conducive to DE in Jilinde

  • PrEP is an emerging HIV prevention approach; though it is proven effective, no standards for PrEP scale-up exist.

  • The context is complex with multiple stakeholders; DE is an approach designed for complex systems, requiring participant feedback and engagement, iterative probing, sensing, and responding.

  • The national, international, and local context is fluid regarding PrEP; DE will adapt to changing context and environment where no stable benchmarks exist.

  • PrEP would be integrated onto existing delivery platforms, the sustainability and neatness of fit depending on adoption and resourcing by anchor HIV programs.

  • PrEP’s delivery is largely under the purview and stewardship of health system players who might exhibit biases.

DE has been used in a variety of settings to address a range of complex challenges (Patton et al., 2016), and donors’ interest in using DE to address global health challenges appears to be increasing. For example, USAID funded the DE Pilot Activity (DEPA-MERL) within the US Global Development Lab to test the approach as a way to iterate and fine-tune interventions more efficiently under complex conditions. Similarly, this BMGF-funded project employed the DE activities described here to harvest as much learning as possible about PrEP services throughout its life.

The purpose of this paper is to describe how we incorporated DE into the design and adaptation of the Jilinde Project, and to share experiences, early findings, and lessons learned about the value of DE in PrEP implementation in real-world situations. We describe how we implemented and used DE at different levels to determine the best way to implement oral PrEP services to reach and retain high-risk individuals in the public and private sectors in Kenya, and to make PrEP services more acceptable and easier to access.

The Jilinde Project

Setting

The Jilinde Project was designed to provide PrEP to 20,776 high-risk individuals in Kenya across three “clusters”: Nairobi (which included Nairobi, Machakos and Kiambu Counties), Lake (Kisumu, Migori and Kisii Counties) and Coast (Mombasa, Kwale, Kilifi, and Taita Taveta Counties). To reach this target, the Jilinde Project identified 93 sites, including 45 public health facilities, 12 private health facilities and 36 drop-in centers (DICEs). Although the project had set an aspirational target to initiate over 20,000 clients on PrEP, the core mandate was to learn as much as possible about how to deliver PrEP effectively to optimally benefit users in resource-limited settings.

Intervention description

Jilinde’s implementation approach was to create demand for PrEP services, facilitate the delivery of PrEP services, and document and disseminate the lessons learned during implementation. Under the umbrella of the Ministry of Health, Jilinde project staff worked closely with stakeholders to develop consensus about how to structure and roll out PrEP services. The project was launched in September 2016; since then, it has facilitated whole-site orientation for the 93 sites and contributed to the training of 667 national and county trainers of trainers (TOTs) and 1,140 health providers on PrEP services. By the end of June 2020, 35,524 individuals had initiated PrEP through the project (18,447 FSW; 5,219 MSM; 3,389 AGYW; 3,851 individuals in sero-discordant relationships; and 4,618 individuals from the general public). One of the key data sources is the routine health facility service statistics, collected monthly using nationally approved registers and reported electronically through the existing health information management system. The project supports collation and analysis of data and facilitates meetings to discuss the findings.

Implementing DE

The Jilinde Project recruited an internal DE evaluator, who was embedded in the project (AM). The DE evaluator had previous experience implementing DE and a deep understanding of the project, the context, and the problem Jilinde was designed to address. His primary role was to lead the DE efforts across the project at all levels and champion its implementation, although he also had other program and research responsibilities. The DE evaluator was integrated into the daily work of the project, involved in all project planning, and engaged in discussions about implementation, donor requirements, data collection and reporting. Our DE approach was deliberate but opportunistic. That is, because Jilinde had only one dedicated DE evaluator across three clusters and 93 sites, the DE evaluator leveraged the most willing and interested partners and colleagues at each level—community, facility, county, and project—nurturing and supporting engagement along the way. DE review sessions were referred to as “pause, reflect and document” sessions, in an effort to simplify the process and enable the project team to connect easily with the concept of DE, which they initially perceived to be complex and difficult. These DE sessions were conducted in routine monthly, quarterly, or semi-annual performance review meetings sponsored by the project or by other partners. The DE evaluator trained and supported interested stakeholders in the DE process and methods. The DE approach itself also evolved: as needs, partner capacity and donor requests changed, the way DE was carried out changed accordingly.

Implementing DE required following a clear set of steps that the project developed early on. Figure 1 summarizes the five elements of Jilinde’s overall DE methodology: collect, review, reflect, record and act. We describe how each of the five elements was operationalized, citing actual findings reported and actions taken as a result of reviewing and reflecting on the emerging data.


Figure 1. Principal elements of Jilinde’s developmental evaluation process.

1. Collect. The first step in the iterative process required pulling together available data. We included data that were formal and informal, quantitative and qualitative, primary and secondary, depending on the level of reflection, data availability and stakeholder interests. Typically, this included service use statistics, feedback from clients, key informant interviews with health providers and mobilizers, and meeting minutes. Before each session, data were collected and compiled to be ready for systematic review. For example, during a county “pause, reflect, and document” session, attendees (county-level health managers, health information officers, facility and DICE program managers and representatives, and Jilinde staff) reviewed the data together. They broke down the number of clients tested for HIV, counseled about PrEP, those who initiated PrEP, and those who continued using PrEP at different months after initiation (as evidenced by prescription refills). They also queried the reasons PrEP clients discontinued services, and under what conditions they returned. At cluster level, stakeholders generally collected similar service data, but disaggregated those data to allow analysis of, for example, progress to target and populations reached by target group (FSW, MSM, AGYW; see example in Figure 2).


Figure 2. Pre-exposure prophylaxis (PrEP) cascade disaggregated by population type illustrating actionable differences.

Implementers at the cluster level were also interested in outcomes such as initiation and continuation by service delivery model—private, public and DICEs (see Figure 3). At the program level, stakeholders reviewed synthesized data from utilization-focused evaluations, including client exit surveys; focus group discussions with providers, potential and actual users, and mobilizers; and individual in-depth and key informant interviews with AGYW, parents of AGYW and relevant stakeholders. Additionally, program stakeholders reviewed emerging relevant evidence from other local PrEP programs and from global implementers (e.g., implementation research or routine program data).


Figure 3. Percent of pre-exposure prophylaxis initiations by service delivery model.

2. Review. During the second step, stakeholders critically reviewed all the findings from relevant data. Though “review” and “reflect” are described separately here, in practice they happened together and iteratively, particularly during the “pause, reflect, and document” sessions. For activity-, facility- and county-level sessions, stakeholders met in person, while stakeholders for region- and project-level sessions participated either in person or virtually. Session membership evolved according to the groups’ current needs, but typically included community, facility or project leaders, PrEP clinicians, data specialists and representatives from the Jilinde Project (see Table 1). Sessions generally occurred monthly or quarterly, lasted 1–6 hours each, and usually included refreshments (lunch, or coffee/tea and a snack) provided by the project or by other implementing partners. The DE evaluator or a designee typically facilitated the session, coordinating closely in advance with key facility and county stakeholders to ensure full participation and to encourage participants to bring complete, relevant data to the review session. During the session, frontline workers presented and interpreted their own data. This was followed by an open discussion allowing other delegates attending the session to share their interpretations (part of “reflection”) and to suggest implementation changes required to optimize desired outcomes. Each group also discussed the outcomes of the new actions taken since the previous session. After all groups finished presenting their data, a facilitator (typically identified before the session based on their interest in DE) guided an activity to highlight common themes, identify new data needs, and propose new participants to invite to the next session.

Table 1. Review session participants.

Facility

  • Facility/DICE director

  • Facility PrEP coordinator

  • Facility PrEP providers

  • Facility data analyst

  • Community mobilizers

  • PrEP beneficiaries

  • Occasionally: Jilinde DE evaluator

Focus: Improving PrEP services, uptake, continuation rates; creating sustainable demand

County

  • County-level MOH officials

  • County-level M&E/data staff

  • Facility PrEP coordinators

  • Facility PrEP providers

  • Jilinde DE evaluator

  • Jilinde M&E officers

  • Jilinde program officers

  • Occasionally: PrEP beneficiaries

Focus: Strengthening coordination and quality of PrEP services; better reporting and data use

Cluster/Project

  • Jilinde Project Director/COP

  • Jilinde DE evaluator

  • Jilinde cluster leads

  • Jilinde cluster-level data analysts

  • Jilinde technical advisors

  • Jilinde program officer(s)

  • Consortium partner advisors and managers

Focus: Improving DE process and supporting the project towards achieving evolving goals

In the first year of the project, the DE evaluator took advantage of opportunities to introduce DE and the “pause, reflect, and document” process, and to provide supportive coaching and mentorship at each level, guided by the six principles listed in Box 2. The DE evaluator designed a cascading approach focusing mentoring and capacity building first on project staff within cluster teams, who in turn coached and mentored other staff within the local project team, and then county and facility staff. The DE evaluator first identified project staff members with keen interest in DE and provided quarterly mentorship sessions lasting about two hours. Then the DE evaluator arranged for the staff member to co-lead DE sessions, conducted real-time practical coaching during active DE sessions and convened feedback sessions to reflect on performance. Mentees were supported by the DE evaluator to lead several DE sessions before they graduated to execute sessions independently. DE sessions were also reinforced at regional level by Cluster Managers (regional staff playing administrative roles for the project). In this way, the DE evaluator built local capacity within the team, and early adopters mentored others. The DE evaluator employed a team approach to create new DE champions, bringing together and drawing on trainees’ unique efforts and skills to improve the process.

Box 2. Principles Guiding DE Implementation in Jilinde Project

  • Ensure attendance from the widest range of stakeholders

  • Encourage active participation from each attendee

  • Integrate DE into ongoing structures and processes rather than creating new ones

  • Build sustainable DE capacity at all levels

  • Encourage frequent, routine but low-burden review events

  • Ensure the voices of beneficiaries are heard at all levels

3. Reflect. During step three, participants dedicated time to thinking—individually and together—about the meaning of the data reviewed. The extent to which the review session was meaningful and useful—and DE as an approach was effective—depended directly on the quality and degree of reflection. We found that many groups were accustomed to using reflection routinely in their work, but reflection was often unstructured, unsystematic, unrecorded, and not informed by all available relevant data. To structure, guide and support the reflection component of the review, we used these four simple questions: 1) What do the data mean? 2) What other data do we need? 3) Who else should be invited to the review? 4) Did we make our planned changes? For example, at one county session, one health facility team presented the facility’s routine service data. They also presented supplementary data they had collected by talking to each client who dropped out of service to find out why he or she dropped out. Looking together at the reasons clarified where facility staff should focus energies next, what type of outreach and what type of support might help, and what changes were needed in service delivery. Other facilities commented that they also needed such supplementary data and needed to find a way to collect them so they too would have the advantage of understanding clients’ needs better.

Deepening the quality of reflection at review sessions was a significant focus of the DE evaluator, and how this was done evolved over time. For example, the DE evaluator observed that during county meetings participants were reluctant to draw attention to gaps and provide critical analysis. Sitting with peers, and sometimes supervisors, raising issues that could be considered highlighting poor performance was perceived as inconsistent with cultural norms. The DE evaluator adjusted the sessions, for example hanging poster paper with key reflection questions, and asking participants to write their observations on sticky notes and add them to the appropriate poster, which provided a degree of anonymity to participant comments. Then the facilitator organized posted comments in broad themes and engaged participants to discuss comments and themes. Session attendees were also encouraged to reflect throughout the session and to document these reflections on a ‘parking lot’ space, which the facilitator shared with the group for discussion. These adjustments in the process improved both the amount and depth of participation.

Reflection also changed and deepened, becoming richer and more useful as part of the developmental process itself, and for the purpose of yielding the desired results for the range of stakeholders involved in the PrEP scale-up. First, as users identified new data sources, discussions expanded to include those new data and their implications. Second, as government and program priorities changed, reflection focused on different areas. For example, there was a shift in discussions about how to optimize HIV risk screening in some sites when PEPFAR’s policy pivoted from mass HIV testing to targeted testing in order to increase HIV testing yield. Additionally, the observation that AGYW PrEP uptake was persistently low changed conversations to focus specifically on AGYW and those least covered by the intervention. Third, as anticipated, the landscape within which PrEP services were being provided changed, with the entry of new players and donors, demanding their inclusion in the learning process.

4. Record. As described in step 3, though many groups were used to reflecting on their work—if not their data—they were not used to documenting their reflections. We shared with the groups a streamlined template with six simple questions to guide documentation of the main points from the review session (see Box 3).

Box 3. Recording Questions

  • What data sources were examined?

  • What are the main findings of interest?

  • What are the implications of the findings?

  • What changes does the group recommend?

  • What other data would be helpful?

  • Who else should attend the next meeting?

Recording reflections and planned actions to improve programs was meant to serve several purposes. First, the document was the source for a common recollection across members of the review group. Second, the information documented formed the basis for planning related actions. Third, the document helped members feel accountable to implement those changes they decided on together.

Although we encouraged systematic recording, more often than not groups, particularly at facility level, did not record their findings, and this challenge persisted throughout the project. Unless someone was identified and held accountable to record and share the information, participants tended not to record deliberations. At the county level, however, for example when the county AIDS and STI coordinator was the convener, recording was virtually guaranteed.

5. Act. The DE approach is meant to ultimately lead to meaningful adjustments in implementation, in line with the changing needs of the community, and within the complex and fluid context of that community. In selecting changes, we encouraged groups to focus on those changes considered by the group to be highest priority, achievable, and expected to make the biggest impact. However, we also encouraged groups to identify “low-hanging fruit”—positive changes considered easy to make.

In Jilinde’s case, DE guided action at each level at which it was used. For example, facilities providing specialized services to AGYW had to confront very low PrEP initiation numbers, far below both their own targets and the Jilinde Project targets. Public facilities were also registering extremely low initiation yield despite consuming substantial resources. Through reflection and discussion, and after several months of trying to bolster the prevailing approach, public health facilities decided to completely overhaul the service delivery model to make it more attractive to users. The new approach introduced peer educators in the public health facilities, who conducted assisted referrals, coupled with community outreach to safe spaces, to address intra-facility referral and access challenges.

An example of action taken at county level was diversification of service channels. Outside of DICEs, most PrEP implementation was being done in comprehensive care clinics (CCCs) for HIV services. Staff noticed that people were shying away from these clinics because only a few providers were trained to provide PrEP, making it appear to be an intervention for only a few, thereby stigmatizing PrEP services. In response, county health managers requested that implementing partners facilitate training of additional providers and diversification of service delivery points to include maternal and child health, family planning and outpatient clinics, so that services were more widely available and viewed as “routine.” To meet the need for more providers offering PrEP at the expanded service delivery points, the project pivoted toward modular facility-based training, which was cheaper per trainee than the residential training used previously.

At the cluster and overall project levels, we made several program changes stemming from our DE approach. One example derived directly from reflecting on the data shown in Figure 3. At the start of the project, we were committed to reaching the widest group of clients with PrEP services in DICEs, public facilities and private facilities. Based on initial data, we redoubled efforts in public and private facilities. However, month after month we saw PrEP initiations stagnate in these facilities compared to DICEs. By the third year, we decided to scale down direct technical assistance by Jilinde staff to public and private facilities (transitioning them to MOH support) and instead focused intensive support on DICEs. This eventually culminated in reaching project targets for key populations by the end of the third year, and also improved continuation rates in some of the intensively supported sites.

In another example, when reviewing initiation numbers for FSW and MSM, we were happy to observe that we were on track to achieve targets early. However, our continuation rates were much lower than we had hoped. We had conversations with facility staff, and with clients who dropped out, about challenges and reasons for dropping out. After several months of increased follow-up support to new clients to increase continuation rates, the Jilinde team decided to try something new: slow down enrollment during targeted outreach and conduct new initiations with continuation in mind. We focused on creating PrEP-specific clinic days, initiating PrEP within the current pool of clients attending other DICE services. Here, the focus was on initiating fewer clients with high commitment to PrEP, while still providing the opportunity for those uncertain about committing to PrEP to initiate at their convenience. We provided pre-initiation adherence counseling to support clients to make informed choices. In the process, clinicians gauged clients’ commitment to PrEP and assessed specific continuation indicators. This also allowed us to examine closely whether we were initiating the right clients. Following this pivot, we still achieved our targets, continuation rates continued to improve, and we learned important lessons for implementing daily PrEP.

Discussion

Through the use of DE, we were able to successfully adapt and refine PrEP services for FSW, MSM and AGYW seeking services to reduce the risk of HIV transmission. Used consistently over time, DE yielded insightful lessons and innovations which helped Jilinde improve service access, reach target populations in expected numbers, and improve continuation rates. The look, feel and yield of our DE efforts evolved over time, becoming increasingly integrated into existing systems and providing deeper and richer insights. In the process, we gained useful insights about how to better implement DE in the future.

DE guiding principles

Patton describes eight guiding principles essential for DE (Patton, 2010). Each principle was present to varying degrees in Jilinde, though the project was not intentionally designed around them (see Table 2). Considering that Jilinde was created as a “learning laboratory”, closer attention to these guiding principles during the original project design and throughout the project would have strengthened the effectiveness of Jilinde’s DE approach.

Table 2. Examples of the eight developmental evaluation (DE) guiding principles as expressed in the Jilinde Project.

DE Guiding PrincipleExemplified in the Jilinde Project
Developmental purpose: DE is used specifically to
continuously refine and improve the intervention as the
environment, needs and context change.
The donor explicitly requested that the intervention evolve toward one
that could be scaled. Jilinde’s purpose was to introduce PrEP into the
market, and iteratively adapt to make services acceptable, show the
approach worked, and find a scalable model.
Evaluation rigor: Data and other information used in the
DE are collected using strong scientific methods; the data
have integrity, and data sources are valid for the purposes
intended.
Jilinde collected and analyzed multiple types of (quantitative and
qualitative) data available at multiple times in the project. We
developed standard indicators, which improved comparability across
programs. Data collection tools were developed under the leadership of
the national government, thus easily adopted. Data sources continued
to change, and the data system evolved, providing more consistent and
higher quality data over time.
However, monitoring data drawn from the district health information
had inherent limitations of routine data systems such as quality. Also,
there were no counterfactuals to decrease bias and improve rigor.
Utilization focus: The DE is not an academic exercise.
Instead, the results are meant to be used directly and
immediately by those most involved and most affected by
the intervention.
The DE approach, the project and all data were explicitly designed to
enhance the evolution of the intervention so that more people could
be reached more effectively with PrEP services. The intervention was
developed
for eventual use in the public sector, with sustainability and government
ownership in mind. This meant, for example, supporting county health
officials to coordinate and facilitate “pause, reflect, and document”
sessions over time.
Complexity perspective. DE is chosen because the
intervention context is complex, and that complexity is
reflected in how DE is carried out, e.g., who is invited to
participate, when and where reflections take place, and
what new data are added.
Increasingly, the PrEP landscape had multiple players handling different
pieces and making contributions that would inherently affect success.
At the same time, many individual, community, and health system
challenges had to be surmounted for services to be made accessible
to clients. Therefore, Jilinde ensured that the varied players’ interests were
considered and that everyone, particularly beneficiaries, was at the “DE table”
so that services were provided in the friendliest environment possible.
Innovation niche: The problem addressed by the
intervention requires innovative solutions to make a real
difference and those innovative solutions are welcomed.
PrEP services were operating in a new space with interest from multiple
stakeholders, which required an iterative process. Jilinde’s DE
approach provided fertile ground to brainstorm and test new ideas,
rapidly sifting through them, prioritizing, applying, assessing, and
dropping unsuccessful ones.
However, the larger public health system was not very responsive to
rapid change, which meant eventual changes were often slow to be
adopted.
Systems thinking: The DE approach considers the range of
interconnected players, perspectives and processes and
promotes actions consistent with an understanding of
those connections.
DE facilitators used systems thinking to determine the composition of
meetings to represent all the critical players, while creating room for
others. Facilitators tried to point out how every part of the program
had a vital contribution. They also nurtured a spirit of not apportioning
blame when data illuminated gaps, examining gaps as products of the
system rather than of individuals.
Co-creation: The details of the DE process and the
intervention itself—from who is involved, to how, when and
what data are considered—are built and refined with shared
responsibility across the range of intervention stakeholders,
including program implementers, beneficiaries, donors, and
government officials.
Jilinde used co-creation throughout the project, starting with intervention
development, during which “co-design workshops” brought together
target population members and program developers to identify
prototypes for demand creation and service delivery. Providers were
involved in discussions to address health systems issues. However, only
later in the project were beneficiaries consistently present during review
meetings to provide testimonies of service impact and needs from their
perspectives.
Timely feedback: Information about how the intervention is
working, and about the surrounding changing context, is
available as needed to inform the DE process.
Obtaining timely feedback was both an early focus and consistent
challenge for Jilinde. This improved over time as the feedback process
became more streamlined, and as facilitators continued to articulate
why timely feedback was vital. Supporting all players to comprehend the
value of timely feedback took substantial project time and effort.

Nonetheless, application of the eight principles was critical to the DE process and to attaining the goal of designing a scale-up model that was fit for purpose in the Kenyan context. For example, considering the principle of timely feedback: because we integrated review meetings into already-existing routine meetings, we often had to discuss competing topics. As meeting facilitators, we had to accommodate other emerging, non-PrEP-related priorities. Other delays frequently occurred because we were implementing in government space. Had we anticipated these challenges more directly, we could have focused more specifically on creating systems to support timely feedback from the project’s outset.

Lessons learned from using DE for PrEP program implementation in Kenya

Through the Jilinde Project, we learned several lessons about using DE to scale up PrEP among high-risk groups in Kenya. First, and most importantly, we learned that DE can be a powerful tool to engage stakeholders in finding the best solutions to current and ongoing implementation challenges. From the larger Jilinde Project perspective, this meant DE was successfully used to determine what was working in some counties and sites, and how to improve processes and outcomes. When done well, DE put those with the most at stake—beneficiaries, service providers, Ministry of Health officers, PrEP project managers—in the driver’s seat. This gave them opportunities to define and redefine both the problem and the desired outcomes. The review sessions normalized candid conversations about challenges, helping staff at all levels feel increasingly comfortable expressing opinions. The results did not always align with how some key stakeholders had defined the problem and proposed solutions. For example, county and health facility leadership had conveniently positioned PrEP at the CCC because they believed it was easier to manage and account for commodities there, and because CCC staff were trained to deal with HIV and ARVs. However, the DE process revealed low initiations when PrEP was positioned in the CCC. Through engaging with the DE process over time, county and facility leadership were able to see the value of, and allow for, diversified PrEP service provision through a range of service delivery points, thereby improving uptake and continuation rates.

Second, the DE process can be learned and used at multiple levels, including drop-in centers, health facilities, and the county level. Supported with the right systems and tools, DE was feasible to use. However, in most cases adoption did not happen spontaneously. Providers, program directors, and county health managers are busy, with PrEP and the DE approach competing with other health priorities and approaches. Allocating time specifically for DE activities was a critical predictor of success. Even then, project staff found that sustaining any gains required intense and tactful advocacy and support.

Third—and related to the second—building DE capacity, structures, and systems takes time and effort. DE is a new concept; users first need to be exposed to the concepts, helped to turn concepts into concrete systems and actions, and then supported to practice using DE before they feel comfortable. Teams need time and experience—and need to see a real benefit to them and their programs—before they routinely embed reflection as a core component of their work. Because Jilinde had only one dedicated DE evaluator, who had multiple responsibilities, as is common in such a role (Baldwin & Lander, 2018), he selected the most fertile ground for building capacity, structures, and systems. This meant helping to integrate DE first where routine monthly data review meetings already existed, or where facility and county managers were used to soliciting feedback from beneficiaries. Implementers emphasized the advantage of integrating DE into project implementation from the very beginning, establishing the “pause, reflect, and document” cycle early rather than trying to add it later.

As we proceeded with implementation, we gained a better understanding of DE capacity gaps and, in response, enrolled the Jilinde team in relevant online courses offered by the University of Washington to build data-driven decision-making skills. In the Jilinde Project, the foundation on which DE was being layered was quite fluid. It would have been better to develop the foundational DE capacity of key champions before starting the project, avoiding “building the ship as we sailed”; the DE process should have been introduced at the project start-up phase, when time was available.

Fourth, we learned that simpler is better. Because DE is new to many, and the context for implementing DE for PrEP services is crowded and complex, we tried to make DE as accessible and streamlined as possible. This was done through: i) demystifying the language, substituting “developmental evaluation” with “pause, reflect, and document”; ii) distilling the recording template to answer three questions instead of the initial six (What data are we considering? What are the key findings, and what can we do about them? What additional data or people would help us better understand the situation?); and iii) preparing and promoting the use of easy-to-understand-and-use reference materials for teams until they gained confidence with DE.

Reflections and next steps

Example of utilization focus. The lessons we learned about effective PrEP implementation among those at high risk of HIV transmission in Kenya are being shared with local and national stakeholders. Lessons learned from Jilinde have continued to inform the national roll-out of PrEP services. For example, national communication materials were developed using insights generated by Jilinde, and Jilinde’s research and service delivery lessons have illustrated the need for diversification and integration of services, informing the piloting of innovative integration models for PrEP services. Jilinde project learnings have also been shared with other countries with a high HIV burden, including Eswatini, Lesotho, Namibia, Mozambique, and Tanzania, which have sent in-country teams to Kenya to learn from the Jilinde Project about designing their own PrEP programs. We believe DE could be a powerful tool in similar contexts to address this and other complex emerging public health issues.

Principles-focused evaluation. We did not start the project with explicit operating principles that guided our DE (Patton, 2008; Patton et al., 2016). However, retrospectively asking ourselves “what operating principles guided us?” revealed strongly held implicit principles (see Box 2). At the project design stage, the team proposed an “agile learning methodology” and only crystallized the DE approach when the project started. If we had articulated these guiding principles early on, we could have shared them with partners and used them to create a simple checklist to ensure fidelity of implementation and to address deviations continuously. Ultimately, this could have helped consolidate the approach earlier and made it easier to replicate.

Unanswered questions and next steps for DE with PrEP. Implementing DE in the context of PrEP programs in Kenya answered some questions and raised others. As stated above, because of donor requirements and limited resources, we took a “diffusion of innovation” path, opportunistically staggering growth toward a critical mass of competent evaluators, and we demystified the DE language along the way to make DE more accessible. This raised several questions. First, how can we accommodate adaptations to the DE methodology without losing its benefits? How do we maintain “fidelity” of the process? Applying DE to the complexity of introducing PrEP required ongoing adaptation of the implementation principles, over time and to the context. It remains unclear to what extent adaptations can continue without negating the value that DE is meant to deliver.

Second, how do we make the DE process as “light” as possible while retaining its utility? Because the public health sector in LMICs is under stress, with competing demands and insufficient resources, DE, helpful as it may be in this context, must be as low-burden as possible.

Third, what additional reference tools could be useful to build capacity and support DE implementation? The paucity of simple, easy-to-use, self-paced, and inexpensive reference materials or learning content that can be made available to interested mentees remains a challenge when trying to build a critical mass of DE users. We propose that a certifiable course delivered through innovative teaching approaches (preferably online or mobile-based) could mitigate this gap.

Finally, how could we have adapted our DE approach to allow the Kenya MOH to adopt it for scaling up and continuing to develop the national PrEP program? The application of DE continues at the national PrEP technical working group (TWG) and in decentralized units, and reflection sessions continue to inform changes to the program. The continuing and expanding application of DE within and beyond these structures depends on the goodwill and vision of leadership at the different levels of the Ministry of Health. This also applies to the spread of DE use in other sectors.

This case study provides practical guidance, through concrete examples, for using a DE approach in program implementation. We found that the DE process can be used quite successfully, working closely with government partners on a complex public health challenge within a dynamic environment, in a way that feeds back into and improves programs. More generally, we believe this example suggests that DE is a powerful tool that could be used in global public health to quickly evolve and refine interventions in complex settings to meet the needs of a variety of clients over time. Possible areas include the introduction of other new HIV interventions, such as the vaginal ring, vaccines, and injectable forms of PrEP; addressing global epidemics; and creating innovations to address complex wellbeing challenges, such as poverty reduction programs. There are still many unanswered questions in global health, and new challenges and innovative solutions are always emerging. DE should be considered as a general approach to designing and implementing evolving projects, allowing innovations to emerge in relation to population-defined needs.

Data availability

No data are associated with this article.

How to cite this article: Fogarty L, Musau A, Kabue M et al. Using developmental evaluation to implement an oral pre-exposure prophylaxis (PrEP) project in Kenya [version 1; peer review: 2 approved with reservations]. Gates Open Res 2020, 4:158 (https://doi.org/10.12688/gatesopenres.13184.1)