Study Protocol

Performance monitoring and accountability: The Agile Project’s protocol, record and experience

[version 1; peer review: 1 approved, 1 approved with reservations]
PUBLISHED 25 Mar 2020

Abstract

The Performance Monitoring and Accountability 2020 (PMA2020) project implemented a multi-country sub-project called PMA Agile, a system of continuous data collection for a probability sample of urban public and private health facilities and their clients that began November 2017 and concluded December 2019. The objective was to monitor the supply, quality and consumption of family planning services. In total, across 14 urban settings, nearly 2300 health facilities were surveyed three to six times in two years and a total sample of 48,610 female and male clients of childbearing age were interviewed in Burkina Faso, Democratic Republic of Congo, India, Kenya, Niger and Nigeria. Consenting female clients with access to a cellphone were re-interviewed by telephone after four months; two rounds of the client exit and follow-up interviews were conducted in nearly all settings. This paper reports on the PMA Agile data system protocols, coverage and early experiences. An online dashboard is publicly accessible, analyses of measured trends are underway, and the data are publicly available.

Keywords

Family planning, service delivery, clients, supply, quality, consumption

Introduction

When monitoring and evaluating (M&E) the performance, progress and impact of large-scale population-level interventions, the standard practice in developing country settings has been to rely on cross-sectional surveys conducted by third parties, usually at the beginning and the end of the project, to provide information on changes in outcomes of interest. These specialized surveys are usually extensive in scope, rich in detail and instrumental for global monitoring (Boerma & Sommerfelt, 1993), but their deployment is also resource- and time-intensive. Findings from these surveys often become available much later than needed to modify the design or continued implementation of the project. Moreover, it is often necessary to relate the findings to contextual information from other data sources in order to gain insight into why and how the project succeeded or not. In short, the traditional M&E approach is not designed for tracking and acting on performance results on an ongoing basis (Nordberg, 1988; Rowe, 2009).

At the same time, health information systems, while increasingly digitized, are constrained in the types of measures available, their selective coverage of facilities and clients, accuracy of gathered data, and timeliness of reporting. Efforts to eliminate these deficiencies are growing, especially to address prevention and control of large-scale epidemics of infectious diseases (Braa et al., 2007; Walsham & Sahay, 2006). The lack of systematic and programmatically relevant, continuous and timely information available at subnational levels, however, has posed a formidable challenge to nimble and effective project decision-making and response (see Guenther et al., 2014; Maina et al., 2017). PMA Agile was designed to move away from the traditional M&E approach by establishing a near-continuous monitoring system that collects, links and aggregates data at different levels on a focused set of indicators in a cost-effective manner. The Agile system was developed to reduce the lag time between steps in a project learning process: recognizing a project’s information needs, identifying sensitive performance indicators, collecting relevant primary and secondary data, analyzing the collected data, producing actionable insights, and enabling the use of the insights to adjust and fine-tune programs.

PMA Agile evolved out of a combined interest of the research and evaluation staff at the Bill and Melinda Gates Foundation and the PMA project at the Johns Hopkins Bloomberg School of Public Health to develop an innovative data system that could track the performance of two large projects at the subnational level, in this case urban areas and their poor. In Agile’s first year, the selection of urban geographies for monitoring depended on these two projects’ own plans for locating their program resources. It later became clear that Agile would need to proceed with geographic selection more independently than originally planned in order to realize its objectives in the awarded project period. The eventual selection of Agile geographies did not affect its design or its four objectives, which are to:

  • 1. Provide a flexible, continuous and cost-effective data collection system that can triangulate with routine and other survey information;

  • 2. Serve as an adaptable, replicable M&E platform for program implementation addressing needs of the urban poor;

  • 3. Measure core indicators that reflect program performance at the health facility and client levels;

  • 4. Promote actionable findings to enable evidence-based decisions by government officials, non-government stakeholders and researchers.

This paper describes the protocols, record and experiences, to date, of PMA Agile to accompany the findings that are already available on public dashboards. Because the data will become publicly available, this description provides important background to inform current and future users.

Protocol

Urban settings for PMA Agile

Cities have become home to growing, underserved poor communities. More than half of the world’s population currently lives in cities, and urbanization is projected to reach 70 percent by 2050, especially in Africa and Asia. Cities benefit from economic growth, but their governments struggle to accommodate rising demands for services. Reaching urban women and girls with reproductive health services has become a social welfare imperative.

Cities also offer the advantages of spatial proximity to health services with transportation systems, population density, and structural networks among individuals and institutions to support a private health sector that ranges in type from retail kiosks and pharmacies dispensing over the counter medications to specialist doctors and hospitals that cater to more elite client needs. To monitor the implementation of any large-scale health initiative aimed at reducing urban disparities and improving health equity requires gathering information on the private, as well as public, health sectors. Social marketing projects largely target urban populations to reach a market that enables subsidizing commodities for the poor. They function by providing a range of high-quality, affordable, and novel brands of contraceptive and sexual health products to the market through well established distribution and supply chain mechanisms.

Selection of PMA Agile sites

PMA Agile activities officially began November 2016, with the first set of five country and 12 city geographies decided in July 2017: Burkina Faso (Ouagadougou and Koudougou); Democratic Republic of Congo (DRC) (Kinshasa); India (Ferozabad, Indore, Puri and Shikohadbad/Tundla cities); Kenya (Kericho, Migori and Uasin Gishu counties); and Nigeria (Lagos, Ogun and Kano). Site selection was based on collaboration with existing PMA implementing partners, consideration of local intervention activities, government health administrations’ interest in and willingness to act on results, and input from Gates Foundation staff. Capitals of two Francophone countries (Abidjan in Cote d’Ivoire and Niamey in Niger) were held for future consideration, with Niamey subsequently added in early 2019 as the 14th PMA Agile site.

Implementing partners (IPs)

IPs’ capacity and connections with local stakeholders were key for the successful dissemination and actionability of PMA Agile results. The implementing partners were the Center for Research, Evaluation Resources and Development and the University of Ibadan for Nigeria, the Indian Institute for Health Management Research in India, the Institut National de Statistiques of the Government of Niger, the Institut Superieur de Sciences de la Population in Burkina Faso, the International Center for Reproductive Health in Kenya, and the University of Kinshasa School of Public Health for DRC.

Data elements

PMA Agile’s data system has four main elements: a baseline and quarterly follow-up health facility survey, a semi-annual client exit interview survey of male and female clients, a follow-up phone interview of consenting female clients, and a youth survey based on respondent-driven sampling. Table 1 provides an overview of each element’s purpose, mode of administration, sample, eligibility criteria, target sample size and periodicity.

Table 1. Features of elements for PMA Agile data system.

| Element | Purpose | Mode of administration | Sample | Eligibility criteria | Target sample size | Periodicity |
| --- | --- | --- | --- | --- | --- | --- |
| Health facility survey | Measure availability and status of key indicators of contraceptive service delivery | Face-to-face interview with facility manager or knowledgeable informant | Probability sample of urban public and private health facilities | Registered health facilities | 220 public and private per Agile site | Quarterly |
| Client exit survey | Service experience and satisfaction | Face-to-face interview | Systematic sample of clients using facility services | Female clients age 18 to 49 and male clients age 18 to 59, upon completion of visit | 10 clients per selected facility | Semi-annual |
| Client follow-up survey | Measure any change in contraceptive status and satisfaction with services | Telephone follow-up | Female clients | Baseline clients who consent and provide phone number(s) | All eligible clients | Semi-annual (in following quarter) |
| Youth respondent driven sample (YRDS) | Measure contraceptive procurement among youth | Computer-assisted self-administered interview | Respondent driven sample in 3 selected urban cities | Unmarried females and males age 15 to 24, recruited by seeds | Abidjan: 2000; Nairobi: 1300; Lagos: 1300 | One time |

Sample selection and size

Health facility or Service Delivery Point (SDP)

Respondents for the SDP questionnaire were primarily the in-charge/manager of the health facility; however, once the respondent had given consent for the SDP to participate, other personnel at the facility occasionally contributed answers based on their expertise/knowledge of the subject matter. These other respondents could include medical staff, pharmacists or accountants. All SDPs that participated in the baseline SDP survey became eligible for subsequent quarterly follow-up surveys.

The size of the SDP sample was determined using the proportion of facilities that provide three or more contraceptive methods. In Kenya, the first PMA Agile site, this proportion was 77% of SDPs based on data from five earlier rounds of PMA surveys. With 80% power, alpha of 0.05, and allowing for a 5.5% margin of error, the required simple random sample size was 204 health facilities. After allowing for 15% non-response, the sample size for SDPs was fixed at 220 across all Agile sites and evenly divided into 110 public and 110 private facilities. Lists of registered public and private health providers were obtained from relevant official authorities. The lists included facility names, type of facility and addresses. The facilities were stratified into public and private, and the proportionate distribution of facility type was then calculated. The 110 facilities in each sector were then randomly selected. If a site had fewer than 110 facilities in a sector, all facilities in that sector were selected to be surveyed. This panel of SDPs was then visited quarterly for follow-up surveys. Preliminary field checks were made to assess the accuracy of the lists; more commonly, if a sampled facility at baseline was found to be non-existent, closed or transformed into another type of facility, it was replaced with another facility of the original type drawn from the list.
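The stratified facility selection just described can be sketched as follows (a minimal Python illustration under stated assumptions; the function name `select_sdp_sample` and the dict-based facility records are hypothetical, not the project’s actual code):

```python
import random

def select_sdp_sample(facilities, per_sector=110, seed=42):
    """Select up to `per_sector` facilities from each sector (public/private).

    `facilities` is a list of dicts with at least a 'sector' key. If a
    sector has fewer facilities than requested, all are taken, mirroring
    the census rule described in the protocol.
    """
    rng = random.Random(seed)
    sample = []
    for sector in ("public", "private"):
        stratum = [f for f in facilities if f["sector"] == sector]
        if len(stratum) <= per_sector:
            sample.extend(stratum)                 # take all facilities
        else:
            sample.extend(rng.sample(stratum, per_sector))
    return sample
```

A fixed seed keeps the draw reproducible so the same panel can be re-derived from the facility lists if needed.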

Mobile-phone based survey forms, akin to those used by PMA (see Zimmerman et al., 2017), were developed to consent and interview the in-charge or owner of the health facility on a quarterly basis. The baseline questionnaire takes about 30 minutes to administer, and the quarterly follow-up about 15–20 minutes. Consent rates at baseline and retention rates for continued quarterly survey participation have been relatively high across sites, as will be seen below. Incentives were not given to SDP survey participants, except in Nigeria, where retaining the participation of private health facilities over time required providing an incentive (a PMA Agile-branded wall clock).

Client exit interviews

The CEI survey was aimed at capturing the service experiences of adults seeking ambulatory health care. It targeted interviews with 10 clients per sampled facility. This number is based on a sample power calculation using a modern contraceptive prevalence of 50% (assumed to be fixed across all Agile sites), a margin of error of 3% and a design effect of 2. This yielded a sample size of 2106 clients which, divided across 220 health facilities, resulted in approximately 10 clients per facility.

Eligibility criteria for the CEIs were: female clients 18 to 49 years old or male clients 18 to 59 years old. Clients were recruited systematically by the field interviewer (known as the resident enumerator or RE) as they exited the sampled SDPs over the course of one or two interview days. The RE was provided the average daily client volume for the SDP, obtained during the baseline survey, and a sampling interval. For example, if the SDP saw an average of 150 clients per day, the RE was given a sampling interval of 15 to select 10 clients. The RE used a random start number between 1 and 15 and began recruitment with the Nth client who exited. REs worked in pairs at large health facilities, such as hospitals, and positioned themselves at the outpatient and primary care clinics for survey recruitment. At small facilities, such as pharmacies, the same systematic selection procedures were followed, and REs could work in pairs depending on client volume. CEIs were generally completed outside the pharmacies.
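The systematic exit sampling described above (interval = average daily client volume divided by the target of 10, with a random start within the first interval) can be sketched as follows (the helper `client_selection_plan` is hypothetical, a simplification of the actual field procedure):

```python
import random

def client_selection_plan(avg_daily_volume, target=10, seed=None):
    """Return the exit positions of clients to recruit: sampling interval
    is the average daily volume divided by the target count, and the
    first selected client is a random position within the first interval.
    """
    rng = random.Random(seed)
    interval = max(1, avg_daily_volume // target)
    start = rng.randint(1, interval)           # random start, 1..interval
    return [start + k * interval for k in range(target)]
```

For an SDP averaging 150 clients per day this yields ten positions spaced 15 apart, all falling within the day’s expected client flow.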

Clients were approached to participate after they received or attempted to receive care from the SDP. Trained team enumerators introduced themselves, explained the Agile survey to clients and consented the client to participate. Clients consenting and completing the survey were provided with $1 equivalent in cell phone airtime or offered a material good of equivalent value as compensation for their time.

The CEI was approximately 20 minutes in length and collected information on client experience and satisfaction with the health site’s services, with family planning content prioritized. The CEI was fielded in the second and fourth quarterly surveys (Q2 and Q4) each year, i.e., twice over a 12-month period. Participation rates were above 50% in all settings; non-consent among female clients ranged from 4% in Kenya to 35% in Nigeria (data not shown).

CEI Follow-Up

To assess contraceptive practice, only female clients were recruited for the CEI phone follow-up survey. Upon completion of the CEI, the female client was asked if she was willing to participate in a follow-up interview to occur in approximately 4 months. If she consented, she was asked for a primary and secondary phone number (cell or landline) at which she could be reached. Often female clients provided their male partners’ phone numbers and the script used at the beginning of the call was general to avoid disclosing any confidential health behavior.

The phone follow-up interview asked about the female client’s adoption of contraception (among those not using a method at the time of the CEI), continued use or switching of contraceptive methods, and continued satisfaction with the health facility visited. The four-month interval was selected to enable re-supply of short-term methods such as the three-month injectable and to optimize retention of the client sample. In the absence of much published literature on participation rates for follow-up surveys administered by telephone in developing country settings, it was expected that approximately half of the client sample would be female and that half of those would consent to the phone follow-up, leading to approximately 500 clients re-interviewed. In actuality, the average proportion of CEIs with females across the 14 sites and all rounds was 64.7%, highest in Niamey, Niger at 92.5% and lowest in Puri, India at 32.1%. Follow-up participation rates ranged from 37.3% in Shikohadbad-Tundla, India to 96.6% in Migori, Kenya, with an average of 70.2%.

The RE team set aside specific days to conduct the CEI follow-up in a project office. REs were provided their individual lists of consenting females, typically ones they had interviewed themselves, and phone numbers to call. Direct touch-dialing enabled the RE to avoid having to enter (or mis-enter) the client’s phone number. The factors underlying the relatively high retention rates across sites are under analysis. One factor appears to be that female clients recognize the RE who originally interviewed them and are therefore more willing to be re-interviewed.

Youth RDS Survey

The Youth RDS Survey (YRDSS) was born of a need to measure contraceptive awareness, procurement and use among urban adolescents and youth as they enter a period of probable sexual activity. The target sample was unmarried female and male youth ages 15–24 years. Capturing this information from youth clients at health facilities, especially unmarried females, was likely to be biased due to social and familial sanctions on sexual activity and contraceptive use. In this age group, it is suspected that youth may be procuring contraceptives via other means, making their use effectively “hidden” to clinic staff and compromising the accuracy of clinic-based survey measures. Their sexual partners, relatives or other adults may be assisting with procurement. As a set of special studies, PMA Agile collaborated with youth-serving organizations in Abidjan and Nairobi, and a third study has recently been launched in Lagos, to survey unmarried youth using respondent-driven sampling (RDS) methodology. This sampling method, which can be adjusted post-enumeration to weight to a known population distribution, takes advantage of youth networks for rapid recruitment and reach into diverse communities.

The sample sizes were powered on the estimated modern contraceptive prevalence level for unmarried females 15–24 years obtained from the most recent PMA2020 survey. “Seed” respondents recruited three additional respondents, who each recruited another three, until the desired sample size and gender balance, which were monitored daily, were reached. The survey was self-administered on a tablet, with an attendant RE available to guide the respondent’s use. The findings were disseminated in country once the technical report and briefs on selected topics were produced. All Agile questionnaires were translated (and back-translated) into the local languages when required.
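The coupon-style recruitment described above can be illustrated with a simplified simulation (the function `rds_recruit` is hypothetical; real RDS fieldwork also tracks coupon IDs and recruiter-recruit links, which are needed for post-enumeration weighting):

```python
from collections import deque

def rds_recruit(seeds, target, coupons=3):
    """Simulate respondent-driven recruitment: each enrolled participant
    receives `coupons` coupons and recruits up to that many peers, wave
    by wave, until the target sample size is reached (simplified sketch).
    """
    enrolled = []
    queue = deque(seeds)       # participants awaiting enrollment, in wave order
    next_id = 0
    while queue and len(enrolled) < target:
        person = queue.popleft()
        enrolled.append(person)
        for _ in range(coupons):
            next_id += 1
            queue.append(f"{person}.{next_id}")   # hypothetical recruit label
    return enrolled
```

The breadth-first queue mirrors how recruitment spreads through successive waves of a youth network; in practice daily monitoring would also stop recruitment once gender balance is achieved.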

Outcomes measured

PMA Agile is indicator-driven, i.e., it measures core indicators in the service supply, quality and consumer demand environments that are known to influence contraceptive and other health practices and to be of value to program officials, such as commodity stock flows, client volume, client satisfaction, and adherence to medication or product use. Key indicators for PMA Agile are listed in Table 2, grouped under the dimensions of supply, service quality, and demand. Additionally, the system can incorporate new measures desired by local stakeholders in any subsequent round of data collection.

Table 2. PMA Agile data system components, indicators and health access dimension addressed.

| Data unit/Indicator | Dimension |
| --- | --- |
| Health facility | |
| Provision of different FP methods | Supply |
| Commodity methods in/out of stock | Supply |
| Monthly client volume | Supply |
| Commodities distributed/sold in past month | Supply |
| Commodities received/purchased in past month | Supply |
| Trained providers present at time of visit | Supply |
| Reports data to Health Management Information System | Supply |
| Community outreach activities conducted | Supply |
| Fees charged | Supply |
| Provision of other Reproductive Health (RH) commodities | Supply |
| Provision of other Sexual and RH services (MCH, HIV) | Supply |
| Client | |
| Satisfaction with FP services/provider | Quality |
| Current use of contraception | |
| Type of method used | |
| Method obtained if came for FP visit | Quality |
| Counseled on side effects, additional methods | Quality |
| Provided follow-up/return information | Quality |
| Willingness to return/refer relative or friend | Quality |
| Out-of-pocket costs for FP services | Quality |
| Intention to use in future (for non-users) | Demand |
| Exposed to Behavioral Change Communications on FP | Demand |

*Additional project-specific indicators are included on a site-specific basis.

Ethical approval

Agile data collection protocols were reviewed and approved by the Johns Hopkins Bloomberg School of Public Health Institutional Review Board and the in-country counterpart review board: Kenyatta National Hospital-University of Nairobi Ethics Research Committee; National Health Research Ethics Committee of Nigeria; MOH-Burkina Comité d'Ethique pour la Recherche en Santé; University of Kinshasa School of Public Health Institutional Review Board; Indian Institute for Health Management Research Ethical Review Board; MOH-Niger Comité National d'Ethique pour la Recherche en Santé. All participants provided verbal consent in accordance with country-specific approved consent procedures. Written consent was not deemed necessary because the study was determined to be of minimal risk.

Training and data collection

Recruitment of resident enumerators

Desired attributes of resident enumerators included: completion of secondary school, English or French literacy, local language fluency, residence in the selected city, a minimum age of 21 years, not a paid health worker, not a health activist, no physical restrictions on conducting fieldwork, familiarity/experience with cell/smart phones, and personal awareness and support of family planning as a health intervention. In addition, preferred personal traits included maturity, self-confidence, dependability, trustworthiness, ability to protect confidentiality and respondent privacy, and social interaction skills. Recruited REs received one week of hands-on intensive training, a smartphone and airtime.

RE/field supervisor teams

Agile field teams were composed of six to eight interviewers and one or two field supervisors. Each city had one field team. The field supervisor assisted in the baseline selection and recruitment of SDPs to participate in the surveys. S/he also supported and oversaw the systematic sampling of clients at SDPs and their follow-up phone interviews. Each interviewer was assigned approximately 25–35 SDPs to interview each quarter depending on the geography and conducted 250–350 CEIs and another 150–200 phone interviews every six months. Field staff were salaried and retained for the entire Agile data collection period.

Mobile phone data collection and transmittal

The collection of SDP and client data was completed with a smart mobile phone. All countries except Nigeria used JHU Collect, forked from ODK Collect community version 1.4.8, for the first two quarters of data collection. For all subsequent quarters, countries downloaded the latest version of the community ODK Collect application available on the Android Play Store prior to data collection for each quarter; the versions used ranged from v1.17.1 to v1.23.3. Nigeria used the latest available versions of SurveyCTO, ranging from v2.40 to v2.60, through its six quarters of data collection. Nigeria also used community ODK Collect v1.17.1 and v1.25.1 to leverage its dialer app feature for the phone follow-up surveys conducted in its 3rd and 5th quarters of data collection, respectively. Each RE was provided a basic smartphone running Android OS (version 4.1 or higher) with adequate memory and a GPS receiver with 6-meter accuracy. The smartphone held the enumeration templates used to record the information for each SDP. The RE uploaded each case record to a secure cloud-based server after the interview was completed. If there was no network reception at the end of the interview, the RE stored the interview on the phone until she reached network availability and then transmitted the record to the server.
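The store-and-forward behavior described above (records queued on the phone until the network is available) can be sketched as follows (a hypothetical `StoreAndForwardUploader`; this is the general pattern, not ODK Collect’s or SurveyCTO’s actual implementation):

```python
from collections import deque

class StoreAndForwardUploader:
    """Queue finished interview records locally; flush them to the server
    in order whenever the network is available (a sketch of the
    store-and-forward pattern described in the protocol)."""

    def __init__(self, send, network_up):
        self._send = send              # callable(record) -> uploads one record
        self._network_up = network_up  # callable() -> bool, network status
        self._pending = deque()

    def submit(self, record):
        """Store a completed interview, then try to transmit immediately."""
        self._pending.append(record)
        self.flush()

    def flush(self):
        """Transmit queued records while the network remains available."""
        while self._pending and self._network_up():
            self._send(self._pending.popleft())

    @property
    def pending_count(self):
        return len(self._pending)
```

Records are sent first-in first-out, so interview order is preserved once connectivity returns.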

Data were initially stored on a Google Cloud server, with access restricted to designated members of the data management and PI team. Data were downloaded from the cloud server daily to a secure server maintained either by the partner institution or by the Agile project at Johns Hopkins University (JHU). Once data collection within a round was finished, all data were deleted from the cloud server and maintained only within the in-country partner’s and JHU’s systems.

Figure 1 illustrates the data collection and transmission cycle. A quarterly cycle can take between 11 and 17 weeks.


Figure 1. Schematic of PMA agile data collection, transmission, archiving flows.

Implementation schedule

Figure 2 provides an overview of the surveys implemented in each Agile site by year, and then in Table 3 by round and coverage of SDPs, CEIs and CEI follow-ups. The estimated population of each Agile city is also provided for context.


Figure 2. PMA Agile Data Collection Schedule.

Table 3. Coverage characteristics of PMA Agile data system.

| Country/Site | Population estimate LYA (000s)* | Quarter | Dates | # SDPs | # CEIs | # Female CEIs | % female | # female follow-up CEIs | % follow-up |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Burkina Faso | | | | | | | | | |
| Ouagadougou | 2,531 (2018) | 1 | March 2018-May 2018 | 212 | | | | | |
| | | 2 | August 2018-October 2018 | 205 | 1774 | 1063 | 59.9 | | |
| | | 3 | February 2019-April 2019 | 212 | | | | 876 | 82.4 |
| | | 4 | June 2019-September 2019 | 172 | 1576 | 886 | 56.2 | | |
| | | 5 | October 2019-November 2019 | 191 | | | | 660 | 74.5 |
| Koudougou | 92 (2012) | 1 | March 2018-May 2018 | 57 | | | | | |
| | | 2 | August 2018-October 2018 | 57 | 525 | 449 | 85.5 | | |
| | | 3 | February 2019-April 2019 | 57 | | | | 323 | 71.9 |
| | | 4 | June 2019-September 2019 | 50 | 449 | 372 | 82.9 | | |
| | | 5 | October 2019-November 2019 | 55 | | | | 263 | 70.7 |
| Democratic Republic of Congo | | | | | | | | | |
| Kinshasa | 13,171 (2018) | 1 | December 2017-January 2018 | 200 | | | | | |
| | | 2 | March 2018-June 2018 | 197 | 1636 | 1058 | 64.7 | | |
| | | 3 | September 2018-November 2018 | 189 | | | | 766 | 72.4 |
| | | 4 | February 2019-April 2019 | 186 | 1857 | 1219 | 65.6 | | |
| | | 5 | June 2019-August 2019 | 184 | | | | 834 | 68.4 |
| India | | | | | | | | | |
| Ferozabad, Uttar Pradesh | 604 (2011) | 1 | February 2018-April 2018 | 109 | | | | | |
| | | 2 | July 2018-August 2018 | 103 | 1045 | 505 | 48.3 | | |
| | | 3 | November 2018-January 2019 | 99 | | | | 170 | 33.7 |
| | | 4 | February 2019-May 2019 | 97 | 967 | 487 | 50.4 | | |
| | | 5 | June 2019-August 2019 | 96 | | | | 305 | 62.6 |
| | | 6 | September 2019-December 2019 | 96 | 1008 | 583 | 57.8 | | |
| Shikohadbad and Tundla | | 1 | February 2018-April 2018 | 77 | | | | | |
| | | 2 | July 2018-August 2018 | 74 | 737 | 249 | 33.8 | | |
| | | 3 | November 2018-January 2019 | 68 | | | | 93 | 37.3 |
| | | 4 | February 2019-May 2019 | 68 | 679 | 285 | 42.0 | | |
| | | 5 | June 2019-August 2019 | 68 | | | | 176 | 61.8 |
| Indore, Madhya Pradesh | 1,994 (2011) | 1 | April 2018-May 2018 | 131 | | | | | |
| | | 2 | July 2018-September 2018 | 128 | 1239 | 535 | 43.2 | | |
| | | 3 | November 2018-January 2019 | 119 | | | | 240 | 44.9 |
| | | 4 | February 2019-May 2019 | 114 | 975 | 492 | 50.5 | | |
| | | 5 | June 2019-August 2019 | 110 | | | | 263 | 53.5 |
| | | 6 | September 2019-December 2019 | 108 | 992 | 506 | 51.0 | | |
| Puri, Orissa | 201 (2011) | 1 | April 2018-May 2018 | 97 | 499 | | 98.6 | | |
| | | 2 | August 2018-October 2018 | 89 | 794 | 307 | 38.7 | | |
| | | 3 | November 2018-January 2019 | 83 | | | | 156 | 50.8 |
| | | 4 | February 2019-June 2019 | 80 | 699 | 226 | 32.3 | | |
| | | 5 | June 2019-August 2019 | 78 | | | | 129 | 57.1 |
| | | 6 | September 2019-December 2019 | 75 | 663 | 213 | 32.1 | | |
| Kenya | | | | | | | | | |
| Kericho | 902 (2019) | 1 | November 2017-January 2018 | 204 | | | | | |
| | | 2 | March 2018-August 2018 | 200 | 1973 | 1439 | 72.9 | | |
| | | 3 | October 2018-December 2018 | 198 | | | | 1186 | 82.4 |
| | | 4 | February 2019-June 2019 | 192 | 1926 | 1307 | 67.9 | | |
| | | 5 | July 2019-September 2019 | 202 | | | | 1152 | 88.1 |
| | | 6 | October 2019-January 2020 | 207 | 2070 | 1255 | 60.6 | | |
| Migori | 1,116 (2019) | 1 | November 2017-January 2018 | 205 | | | | | |
| | | 2 | March 2018-August 2018 | 203 | 2011 | 1511 | 75.1 | | |
| | | 3 | October 2018-December 2018 | 199 | | | | 1460 | 96.6 |
| | | 4 | February 2019-June 2019 | 203 | 2030 | 1470 | 72.4 | | |
| | | 5 | July 2019-September 2019 | 205 | | | | 1407 | 95.7 |
| | | 6 | October 2019-January 2020 | 204 | 2040 | 1399 | 68.6 | | |
| Uasin Gishu | 1,163 (2019) | 1 | November 2017-January 2018 | 209 | | | | | |
| | | 2 | March 2018-August 2018 | 191 | 1858 | 1481 | 79.7 | | |
| | | 3 | October 2018-December 2018 | 180 | | | | 1295 | 87.4 |
| | | 4 | February 2019-June 2019 | 178 | 1750 | 1385 | 79.1 | | |
| | | 5 | July 2019-September 2019 | 184 | | | | 1279 | 92.3 |
| | | 6 | October 2019-January 2020 | 182 | 1810 | 1289 | 71.2 | | |
| Niger | | | | | | | | | |
| Niamey | 1,214 (2018) | 1 | April 2019-June 2019 | 180 | | | | | |
| | | 2 | July 2019-October 2019 | 178 | 1012 | 936 | 92.5 | n/a | n/a |
| | | 3 | | | | | | | |
| Nigeria | | | | | | | | | |
| Kano | 2,828 (2006) | 1 | November 2017-January 2018 | 215 | | | | | |
| | | 2 | March 2018-August 2018 | 204 | 1715 | 1202 | 70.1 | | |
| | | 3 | September 2018-November 2018 | 203 | | | | 748 | 62.2 |
| | | 4 | February 2019-May 2019 | 201 | 1816 | 1290 | 71.0 | | |
| | | 5 | June 2019-August 2019 | 198 | | | | 1004 | 77.8 |
| | | 6 | September 2019-November 2019 | 197 | 1780 | 1154 | 64.8 | | |
| Lagos | 9,112 (2006) | 1 | November 2017-January 2018 | 201 | | | | | |
| | | 2 | March 2018-August 2018 | 194 | 1606 | 1294 | 80.6 | | |
| | | 3 | September 2018-November 2018 | 191 | | | | 850 | 65.7 |
| | | 4 | February 2019-May 2019 | 185 | 1487 | 1184 | 79.6 | | |
| | | 5 | June 2019-August 2019 | 184 | | | | 912 | 77.0 |
| | | 6 | September 2019-November 2019 | 179 | 1417 | 1101 | 77.7 | | |
| Ogun | 3,751 (2006) | 1 | January 2018-March 2018 | 217 | | | | | |
| | | 2 | March 2018-August 2018 | 211 | 1707 | 1259 | 73.8 | | |
| | | 3 | September 2018-November 2018 | 209 | | | | 728 | 57.8 |
| | | 4 | February 2019-May 2019 | 202 | 1696 | 1316 | 77.6 | | |
| | | 5 | June 2019-August 2019 | 202 | | | | 933 | 70.9 |
| | | 6 | September 2019-November 2019 | 200 | 1538 | 1200 | 78.0 | | |
| Total/Average | | | | 2314 | 48610 | 33907 | 64.1 | 18707 | 70.2 |

*These population estimates are obtained from official census sources (Kenya) when possible but can be dated (Nigeria).

Data quality monitoring, cleaning, preparation for analysis

Data cleaning and quality monitoring

Use of ODK allows constraints and limiters to be built into the questionnaire, minimizing entry errors. The date and time stamps from ODK and GPS coordinates allowed determination of the locations and times of data collection. Since these were monitored in real time, messages requesting correction were sent to the team when unusual patterns were seen, and corrective actions were taken as needed.

Data cleaning checks occurred throughout data collection and after completion. Analytic routines (e.g., Stata *.do files) were prepared and executed to generate data quality indicators, which were reviewed further for outlier or illogical values by both the in-country survey IPs and the PMA Agile team at JHU. Table 4 illustrates, for Kericho in Kenya, one routine for weekly monitoring of completion status for three types of Agile data. Data managers at IPs tracked progress toward the sample targets on a daily basis and worked with field supervisors to troubleshoot as needed.
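Checks of this kind can be sketched in Python (illustrative only; the project’s actual routines were Stata *.do files, and the field names below are hypothetical):

```python
def quality_flags(record):
    """Return a list of data-quality flags for one client exit record.

    Illustrative checks based on the protocol's eligibility rules; the
    project's real routines were more extensive.
    """
    flags = []
    age, sex = record.get("age"), record.get("sex")
    if sex == "female" and not (18 <= age <= 49):
        flags.append("female age outside 18-49")
    if sex == "male" and not (18 <= age <= 59):
        flags.append("male age outside 18-59")
    if record.get("interview_minutes", 0) < 5:
        flags.append("implausibly short interview")
    return flags
```

Records returning a non-empty flag list would be routed back to the field supervisor for verification.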

Table 4. Illustrative table of weekly survey monitoring process for Kericho in Kenya.

SDP (Quarter 4)

| Report week | Sample size | Completed | Partially completed | Refused | Not at facility/Respondent absent | Other |
| --- | --- | --- | --- | --- | --- | --- |
| 20/02/2019 | 8 | 8 | 0 | 0 | 0 | 0 |
| 27/02/2019 | 23 | 15 | 0 | 0 | 0 | 0 |
| 6/3/2019 | 31 | 8 | 0 | 0 | 0 | 0 |
| 13/03/2019 | 41 | 10 | 0 | 0 | 0 | 0 |
| 20/03/2019 | 54 | 13 | 0 | 0 | 0 | 0 |
| 27/03/2019 | 64 | 10 | 0 | 0 | 0 | 0 |
| 3/4/2019 | 74 | 10 | 0 | 0 | 0 | 0 |
| 10/4/2019 | 86 | 12 | 0 | 0 | 0 | 0 |
| 17/04/2019 | 91 | 5 | 0 | 0 | 0 | 0 |
| 24/04/2019 | 98 | 6 | 0 | 0 | 1 | 0 |
| 1/5/2019 | 122 | 20 | 0 | 0 | 4 | 0 |
| 8/5/2019 | 161 | 35 | 0 | 0 | 1 | 3 |
| 15/05/2019 | 166 | 4 | 0 | 0 | 1 | 0 |
| 22/05/2019 | 175 | 5 | 0 | 1 | 0 | 3 |
| 29/05/2019 | 198 | 7 | 0 | 5 | 8 | 3 |
| 05/06/2019 | 213 | 13 | 0 | 0 | 0 | 2 |
| 12/06/2019 | 223 | 9 | 0 | 1 | 0 | 0 |
| 19/06/2019 | 226 | 2 | 0 | 0 | 1 | 0 |
| Total | | 192 | 0 | 7 | 16 | 11 |

Client (Quarter 4)

| Report week | Sample size | Completed | Ineligible | Partially completed | Refused | Other |
| --- | --- | --- | --- | --- | --- | --- |
| 20/02/2019 | 0 | 0 | 0 | 0 | 0 | 0 |
| 27/02/2019 | 60 | 60 | 0 | 0 | 0 | 0 |
| 06/03/2019 | 185 | 125 | 0 | 0 | 0 | 0 |
| 13/03/2019 | 365 | 180 | 0 | 0 | 0 | 0 |
| 20/03/2019 | 516 | 151 | 0 | 0 | 0 | 0 |
| 27/03/2019 | 617 | 99 | 2 | 0 | 0 | 0 |
| 03/04/2019 | 740 | 116 | 2 | 0 | 5 | 0 |
| 10/04/2019 | 828 | 85 | 0 | 0 | 3 | 0 |
| 17/04/2019 | 865 | 36 | 0 | 0 | 1 | 0 |
| 24/04/2019 | 967 | 98 | 4 | 0 | 0 | 0 |
| 01/05/2019 | 1023 | 56 | 0 | 0 | 0 | 0 |
| 08/05/2019 | 1067 | 44 | 0 | 0 | 0 | 0 |
| 15/05/2019 | 1273 | 204 | 1 | 0 | 1 | 0 |
| 22/05/2019 | 1430 | 151 | 2 | 0 | 4 | 0 |
| 29/05/2019 | 1583 | 153 | 0 | 0 | 0 | 0 |
| 05/06/2019 | 1748 | 159 | 3 | 0 | 3 | 0 |
| 12/06/2019 | 1906 | 157 | 1 | 0 | 0 | 0 |
| 19/06/2019 | 1960 | 51 | 1 | 0 | 2 | 0 |
| Total | | 1925 | 16 | 0 | 19 | 0 |

Client follow-up (Quarter 5)

| Report week | Sample size | Completed | Participant not available | Phone switched off/No answer | Wrong number | Other |
| --- | --- | --- | --- | --- | --- | --- |
| 17/07/2019 | 182 | 182 | 0 | 0 | 0 | 0 |
| 24/07/2019 | 386 | 204 | 0 | 0 | 0 | 0 |
| 31/07/2019 | 531 | 145 | 0 | 0 | 0 | 0 |
| 07/08/2019 | 591 | 60 | 0 | 0 | 0 | 0 |
| 14/08/2019 | 645 | 54 | 0 | 0 | 0 | 0 |
| 21/08/2019 | 753 | 108 | 0 | 0 | 0 | 0 |
| 28/08/2019 | 835 | 82 | 0 | 0 | 0 | 0 |
| 04/09/2019 | 935 | 100 | 0 | 0 | 0 | 0 |
| 11/09/2019 | 993 | 58 | 0 | 0 | 0 | 0 |
| 18/09/2019 | 1070 | 77 | 0 | 0 | 0 | 0 |
| 25/09/2019 | 1131 | 61 | 0 | 0 | 0 | 0 |
| 02/10/2019 | 1148 | 17 | 1 | 0 | 0 | 0 |
| 09/10/2019 | 1152 | 4 | 7 | 28 | 7 | 13 |
| Total | | 1152 | 8 | 28 | 7 | 13 |

Data analysis

Dashboards

Once data files were cleaned for duplicate records, outlier or illogical values, or missing records, another set of analysis files generated the pre-selected core performance indicators, such as the proportion of SDPs reporting method-specific stockouts at the time of survey. The indicator metrics were integrated into a digital database, aka “dashboard”, which displayed the quarterly indicator data and trends therein. Public users could then view the performance statistics for the SDPs and clients on separate dashboards. Special dashboards with password access were prepared for the two large projects, TCI and DKT International. The dashboard was built such that participating SDPs could also access their own data using a unique ID provided to them. This would allow them to view their indicator data over time in relation to others in the sample (all with identities masked).
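An indicator such as the stockout proportion can be computed along these lines (a sketch of the dashboard calculation; the record field names `quarter`, `methods_offered` and `stocked_out` are hypothetical):

```python
from collections import defaultdict

def stockout_rate_by_quarter(sdp_records, method):
    """Proportion of SDPs reporting a stockout of `method` at the time of
    survey, per quarter, among SDPs that offer the method (illustrative
    version of one dashboard indicator)."""
    totals = defaultdict(int)
    stockouts = defaultdict(int)
    for r in sdp_records:
        if method in r["methods_offered"]:
            totals[r["quarter"]] += 1
            if method in r["stocked_out"]:
                stockouts[r["quarter"]] += 1
    return {q: stockouts[q] / totals[q] for q in totals}
```

Restricting the denominator to facilities that offer the method keeps the indicator comparable across the public and private sectors, where method provision differs.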

Figure 3A illustrates quarterly trends in one of the dashboard indicators: the average number of client visits in the past month for specific contraceptive methods in urban Kericho county, Kenya, among public and private SDPs (top and bottom panels, respectively). Fluctuations are evident over the six quarters. Public SDPs do not dispense emergency contraception. An increasing trend in monthly client visits for injectables is visible, as are higher numbers of client visits for implants in quarters 3 and 6. Tracking client visits in public facilities is indicative of demand and can be juxtaposed with stockout levels over the same quarters to assess how well the commodity supply chain performs in meeting client needs. Private consumption of contraceptives (lower panel of Figure 3A) shows more fluctuation. The average number of return client visits appears relatively stable, except for injectables, while the numbers of new client visits are greater, especially for ECs and implants in early quarters.


Figure 3.

Sample screens from the PMA Agile dashboard showing the quarterly average number of client visits by method for public and private health facilities (A) and contraceptive prevalence across two rounds of client exit surveys (B) in Kericho county, Kenya. Dashboard URL for (A): https://www.pmadata.org/pma-agile-dashboard-kenya-sdp (note that public facilities did not dispense emergency contraception at the time of the surveys); dashboard URL for (B): https://www.pmadata.org/pma-agile-dashboard-kenya-cei.

Figure 3B illustrates the client indicators over two rounds of data collection in Kericho. Modern contraceptive prevalence (mCPR) is assessed among all clients, as seen in the client dashboard example in Figure 3B. Although mCPR is usually measured for females of childbearing age only, the figures here cover both female and male clients, where the latter have a female partner aged 15 to 49. The mCPR was 61.7% in the first round, conducted in Quarter 2, and 67.8% in Quarter 4. Differences over the two quarters by client age and method mix are also shown.
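The client-level mCPR just described combines two eligibility rules: female clients aged 15-49, and male clients whose female partner is in that age range. A minimal sketch of that logic, with hypothetical field names (`sex`, `age`, `partner_age`, `uses_modern_method`) standing in for the survey variables:

```python
def mcpr(clients):
    """Modern contraceptive prevalence (%) among eligible clients:
    females of childbearing age plus males whose female partner is 15-49,
    mirroring the dashboard definition described in the text."""
    def eligible(c):
        if c["sex"] == "F":
            return 15 <= c["age"] <= 49
        # Male clients qualify through their partner's age.
        return c.get("partner_age") is not None and 15 <= c["partner_age"] <= 49

    pool = [c for c in clients if eligible(c)]
    if not pool:
        return None
    return 100 * sum(c["uses_modern_method"] for c in pool) / len(pool)

clients = [
    {"sex": "F", "age": 28, "uses_modern_method": True},
    {"sex": "F", "age": 35, "uses_modern_method": False},
    {"sex": "M", "age": 40, "partner_age": 33, "uses_modern_method": True},
    {"sex": "M", "age": 61, "partner_age": 58, "uses_modern_method": True},  # excluded
]
print(round(mcpr(clients), 1))  # 66.7
```

The dashboard figures are additionally weighted and filtered by survey round; this sketch shows only the eligibility and prevalence arithmetic.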

The dashboards for SDPs and client indicators are publicly accessible for the 14 settings in the 6 countries at www.pmadata.org/technical-areas/pma-agile. After accessing the dashboard of interest, users can filter the indicators for each setting, public/private sector, type of facility or for clients by background characteristic (e.g., gender, age) and facility type at which the interview was conducted.

Data dissemination

Table 5 provides an overview of PMA Agile’s dissemination activities, which are described herein.

Table 5. Dissemination activities for PMA Agile.

Stakeholder meeting
  Objective: Share latest findings
  Frequency: Quarterly
  Beneficial results: Stakeholder reaction to findings for confirmation and actionability
  Disadvantages: Stakeholder expectation of indefinite support

PMA Agile dashboard
  Objective: Provide online, real-time access to latest quarterly results
  Frequency: Continuous
  URL to example: https://www.pmadata.org/pma-agile-dashboard-kenya-sdp
  Beneficial results: Ease of referral of new stakeholders to results; ease of updating
  Disadvantages: Access limited by weak internet connectivity in some settings

PMA Agile briefs
  Objective: Share summary of indicator trends for stakeholder meetings at advanced project phases
  Frequency: Occasional
  URL to example: https://www.pmadata.org/sites/default/files/data_product_results/PMA%20Agile-BF-Ouagadougou-Client-FR.pdf
  Beneficial results: Easy absorption of material in conventional printed format; user retention of hardcopy
  Disadvantages: Labor requirements

Technical reports
  Objective: Provide written summary of special data initiatives
  Frequency: Occasional
  URL to example: https://www.pmadata.org/publications/abidjan-youth-respondent-driven-sampling-survey-abidjan-yrdss-final-report
  Beneficial results: Ease of online access; focused topic
  Disadvantages: Labor requirements

Agile partner meeting
  Objective: Share lessons learned across sites and updates on protocols going forward
  Frequency: Annual
  Beneficial results: Built network capacity and cross-site learning

Professional conferences
  Objective: Share selected findings in depth with research and practice audiences
  Frequency: Occasional
  Beneficial results: Opportunity to exchange insights with communities of research and practice, often visually augmented
  Disadvantages: Low priority

Journal publications
  Objective: Share selected findings in depth with research and practice audiences
  Frequency: Occasional
  Beneficial results: Contribute to evidence base and implementation research base; publication offers permanence and archival benefits
  Disadvantages: Medium priority

Stakeholder meetings

Agile carried out a range of dissemination activities to promote data utilization. Foremost among these were stakeholder meetings organized by the IPs, usually co-sponsored with the local public health department. In addition to national and local government health officials, health staff from non-governmental and research organizations and from international projects and donor agencies were invited. One example of a close connection forged for data utilization was the use of PMA Agile data in the family planning costed implementation plans for Kenyan counties. These stakeholder meetings also enabled local confirmation of measured and observed trends in the indicators, as was voiced in Ogun state, Nigeria, and in Ouagadougou and Koudougou, Burkina Faso.

In addition to these two main dissemination efforts, PMA Agile has produced summary briefs on SDP and client indicator trends, capturing the dashboard information, for dissemination at stakeholder meetings7. These have been necessary where internet connectivity is weak, and helpful in expanding knowledge about the measures and their interpretation. Technical reports are prepared for the YRDS and other special studies; and, as quarterly data have accumulated, the project has begun a series of analyses for journal publication and conference presentation. Three annual partner meetings have been held to exchange findings, best practices and lessons learned.

Cost

The costs of externally sponsored survey data collection efforts are often difficult to obtain and may not cover the same set of cost elements (e.g., personnel, transportation, training). Up to the time of this report, PMA Agile had expended $2,736,681 on IP subawards to support personnel (central, data management, field workers), training, travel, equipment (smartphones, server), supplies, airtime, and other incidental data collection costs. Institutional indirect cost rates varied between 10% and 15%. IPs report their costs monthly, enabling calculation of average costs per type of survey and over time; these are shown in Table 6. Unit costs for health facilities varied from $120 in Niger to $374 in DRC in Quarter 1 and tended to be slightly lower in Quarter 3. Client interview unit costs ranged from $9 in Niger to $40 in DRC in Quarter 2 and declined in most countries by Quarter 4. Client follow-up interview costs ranged from $29 in Burkina Faso to $69 in DRC, with the costs not yet known for Niger. The large sample sizes for client interviews make them relatively cost-efficient. The total local costs for four quarters of data collection were $303,428 in Burkina Faso (two sites), $464,155 for Kinshasa, DRC, $424,550 for three cities in India, $1,070,938 for three counties in Kenya, and $1,082,773 for three cities in Nigeria.

Table 6. Estimates of PMA Agile in-country costs per data type and over four quarters.

                                                    Unit cost ($)
Country/Data type        # of units in baseline*    Q1     Q2     Q3     Q4
Burkina Faso
  Health facility        269                        186    203    175    144
  Client exit interview  2301                       –      20     –      14
  Client follow-up       1198                       –      –      29     –
DR Congo
  Health facility        200                        374    371    349    365
  Client exit interview  1637                       –      40     –      33
  Client follow-up       766                        –      –      69     –
India
  Health facility        337                        274    186    170    202
  Client exit interview  3077                       –      18     –      18
  Client follow-up       566                        –      –      59     –
Kenya
  Health facility        618                        255    240    295    268
  Client exit interview  5688                       –      23     –      24
  Client follow-up       3941                       –      –      38     –
Niger
  Health facility        180                        120    90     N/A    N/A
  Client exit interview  1522                       –      9      –      N/A
  Client follow-up       N/A                        –      –      N/A    –
Nigeria
  Health facility        633                        301    325    268    185
  Client exit interview  5034                       –      35     –      19
  Client follow-up       2326                       –      –      55     –

*Baseline is defined as the first quarter in which each respective survey was implemented: for SDPs, Q1; for client exit interviews, Q2; for client follow-up, Q3.

†SDP units lost to follow-up are included in the quarterly and total average costs.

‡Q1 costs include start-up/prep costs
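The per-unit figures in Table 6 are simple quotients of reported quarterly in-country expenditure over units completed. A minimal sketch, using hypothetical expenditure figures (the table reports only the resulting unit costs, not the quarterly spend):

```python
def unit_cost(total_cost, units_completed):
    """Average in-country cost ($) per completed unit (facility survey
    or client interview) in a quarter, rounded to the nearest dollar."""
    return round(total_cost / units_completed)

# Illustrative only: a quarter costing $50,000 spread over 269 facility
# surveys yields a unit cost of about $186, the order of magnitude of
# the facility-survey figures in Table 6.
print(unit_cost(50_000, 269))  # 186
```

Because monthly IP cost reports feed this calculation, unit costs can be tracked quarter by quarter, as the table shows.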

The total PMA Agile award from the sole funding agency, The Bill & Melinda Gates Foundation, was for $4,993,285 including indirect costs of 10% for the grant period November 15, 2016 to May 31, 2020. Subawards were budgeted at $2,743,626 (63% of total direct costs). The YRDS studies ranged from $142,676 to $220,987 over the three sites and are not included in the unit cost calculations.

Discussion

Establishing and maintaining a well-functioning M&E system that routinely collects data on the supply chain systems and management (Mukasa et al., 2017) to report on the performance of public and private family planning programs plays a critical role in addressing gaps in access to contraceptive information and services in low- and middle-income countries (LMICs). In this paper, we described the design, organization and implementation of a reliable and standardized M&E system that regularly collected, linked and aggregated data at different levels on a focused set of indicators in a cost-effective manner. Facility and client data can be linked, enabling an appreciation of the consumer environment (see Ahmed et al., 2014; Larson et al., 2019). The system has made a needed contribution in rapidly producing survey estimates of the indicators at a sub-national level using mobile phone technology and a dedicated small team of enumerators and supervisors.

The PMA Agile platform demonstrated that it is possible to regularly collect data on over 2300 service delivery points (both public and private) to track client volume, commodity sales and stockouts. It interviewed nearly 34,000 female and 16,000 male clients of childbearing age, irrespective of their current contracepting status, and reached over 18,700 of the female clients for follow-up phone interviews, across 14 urban settings in six countries (Burkina Faso, DRC, India, Kenya, Niger and Nigeria) and within a 26-month timeframe. The platform also rapidly posted indicator data on 14 publicly accessible dashboards at the SDP and client levels for stakeholders to view and monitor program progress. To understand procurement of contraceptive supplies by unmarried young persons, youth respondent-driven sampling surveys were also conducted in three major cities. By design, the PMA Agile system leveraged stakeholder engagement early in planning, implementing and monitoring family planning services. It provided valuable learning tools for health workers and a less expensive means for program managers to obtain local and actionable information to improve city services. At stakeholder dissemination events, local public health providers and officials often confirmed that the results aligned with their own perceptions.

Despite its innovations and strengths, PMA Agile also encountered implementation challenges. In several geographies, the local teams had to resolve issues related to incomplete master SDP lists: facilities were found to be non-operational or closed, or their addresses had changed by the time of the survey. In a few settings the distinction between a public and a private SDP was blurred in practice. The systematic sampling of clients in advanced facilities often required a second enumerator, since high client volume could slow completion of interviews, and casual interviewing could otherwise introduce information bias, selection bias and sampling error (Eisele et al., 2013). One critical perspective missing from PMA Agile's assessment of the health system's performance in family planning is that of providers. Their interactions with clients, counseling skills and technical competence are important to evaluate (Hutchinson et al., 2011; Solo & Festin, 2019) and could be added to the platform on a regular basis. Resources permitting, this addition could be a useful means to assess human resource needs.

The PMA Agile platform was designed to be replicable, expandable and adaptable, obtaining data at scale with the potential to be linked with population, spatial, administrative and other types of information for district-level planning. It has been implemented following standardized protocols with strict quality control across all aspects of sample selection, data collection, analysis and dissemination. Ideally, external data systems should complement publicly established ones, not duplicate effort or require new resources. In the case of family planning, however, the quality of LMIC health information system data has typically been weak and confined to government facilities. Since considerable contraceptive care is obtained from private providers, neglecting measurement of this sector's contribution can significantly bias the understanding of where access gaps exist. The PMA Agile platform can also support implementation research, and as such its potential will hopefully be exploited in the coming years.

Data availability

De-identified data from PMA Agile are publicly available from each individual country. To request PMA Agile data, please email the relevant country-specific address: Burkina Faso Agile Data Request burkinafaso.agile.data@pma2020.org; DRC Agile Data Request drc.agile.data@pma2020.org; India Agile Data Request india.agile.data@pma2020.org; Kenya Agile Data Request kenya.agile.data@pma2020.org; Niger Agile Data Request niger.agile.data@pma2020.org; Nigeria Agile Data Request nigeria.agile.data@pma2020.org.

There are no restrictions on who can apply to access the data. Those interested in using the data will be asked to complete a form that includes the purpose of the analysis, and confirmation of various data use considerations.

Notes

1 Agile retains the core innovation of PMA (formerly PMA2020), where women are recruited from or near the selected enumeration area and trained to collect data using smartphones on a repeated and quick-turnaround basis (Zimmerman et al., 2017).

2 At the time, these were The Challenge Initiative, an urban-focused family planning initiative located in the Bill & Melinda Gates Institute for Population and Reproductive Health at Johns Hopkins Bloomberg School of Public Health, and an expansion social marketing project implemented by DKT International.

3 Questionnaires for each can be accessed at https://pmadata.org/technical-areas/pma-agile.

4 We use health facility and SDP interchangeably.

5 In India the number of urban primary health centers was very small in each site. The private sector sample size was accordingly increased.

6 These can be accessed on the PMA Agile webpage, for example https://www.pmadata.org/sites/default/files/2019-07/English_CI-YRDSS_Report_FINAL.pdf

7 An example of the Burkina Faso SDP brief can be accessed at https://www.pmadata.org/sites/default/files/data_product_results/PMA%20Agile-BF-Ouagadougou-SDP-French2.pdf. A client indicator brief is available in English and French as well.

How to cite this article: Tsui A, Anglewicz P, Akinlose T et al. Performance monitoring and accountability: The Agile Project’s protocol, record and experience [version 1; peer review: 1 approved, 1 approved with reservations]. Gates Open Res 2020, 4:30 (https://doi.org/10.12688/gatesopenres.13119.1)