Running a Randomized Controlled Trial in a Real Health System

Alleviate Poverty and Promote Economic Growth Fund
February 16, 2017

Innovations for Poverty Action

By Anne Karing and Arthur Baker

Editor’s note: This cross-post originally appeared on the Behavioral Economics in Reproductive Health Initiative (BERI) blog.

Together with the Ministry of Health and Sanitation of Sierra Leone and CEGA’s Behavioral Economics in Reproductive Health Initiative (BERI), IPA Sierra Leone has launched a new randomized controlled trial (RCT) to evaluate the impact of social incentives, in the form of colored bracelets, on demand for antenatal care (ANC) and delivery with a doctor or nurse.

In 2016, we wrote a post on our efforts to pilot the intervention in order to test our assumptions and finalize our research design. In January 2017 we launched the experiment. Our hands-on experience has taught us a huge amount about running an RCT embedded in a real health system, using administrative data, and working with government health workers. The nurses who deliver ANC are both implementing the project and recording all outcome data. This has major benefits, but it also presents challenges. This post explains our final design and discusses these challenges.

Government clinics provide ANC on a weekly or monthly basis. Most women in Sierra Leone make at least one ANC visit, but in some districts as few as 57% of women complete the recommended number of visits. Further, only 54.4% of women give birth with a skilled attendant.

Design
We randomized 90 clinics, about 7% of Sierra Leone’s basic health facilities, into three arms. Women in all arms are sensitized on the importance of ANC and skilled delivery assistance. In Arm 1, no bracelets are given and ANC services are delivered as normal. In Arm 2, women receive a bracelet when they come for their first ANC visit and can choose their preferred color. The bracelet is exchanged for an identical one during later visits; these bracelets do not show how many visits a woman has made.

In Arm 3, women can signal that they came on time for the recommended number of ANC visits. At the first visit, women receive a pink or a purple bracelet, depending on whether that visit is on time. This bracelet is exchanged for a yellow one if they make 5 or more visits, including one in the 8th or 9th month, and for a multi-colored bracelet if they deliver with a skilled attendant. The bracelets allow women to show their communities that they are looking after their own and their baby’s health.
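To make the arm structure concrete, here is a minimal sketch (in Python) of how 90 clinics might be assigned to three equal arms. The clinic IDs, seed, and round-robin assignment are illustrative assumptions; the post does not describe the actual randomization procedure (for example, whether it was stratified).

```python
import random

# Hypothetical arm labels corresponding to the three arms described above
ARMS = ["control", "uniform_bracelet", "signaling_bracelet"]

def randomize_clinics(clinic_ids, seed=2017):
    """Shuffle clinic IDs and split them evenly across the three arms."""
    rng = random.Random(seed)  # fixed seed so the assignment is reproducible
    ids = list(clinic_ids)
    rng.shuffle(ids)
    # Round-robin over the shuffled list -> 30 clinics per arm for 90 clinics
    return {clinic: ARMS[i % len(ARMS)] for i, clinic in enumerate(ids)}

# Example with 90 placeholder clinic IDs
assignment = randomize_clinics([f"clinic_{n:03d}" for n in range(1, 91)])
print(sum(arm == "control" for arm in assignment.values()))  # -> 30
```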

We want implementation of this intervention to be as realistic as possible. This means that the nurses who provide ANC services are the ones giving out the bracelets and recording outcome data, exactly as they would if the project were scaled up. The objective is to make this study more useful to policy-makers. However, this presents considerable challenges. For our findings to be valid, we need to verify two assumptions: first, that nurses implement the intervention consistently across all clinics; second, that nurses accurately record visits.

Implementation
People need to understand what the different bracelets mean, so that by wearing them women can show that they are following their ANC schedule. This means that more than 180 nurses who deliver ANC services in 90 clinics need to consistently give out the right bracelets, at the right times, with the right explanation.

To make implementation work, we developed simple implementation materials, which we honed over several months of piloting. During the launch at each clinic, field staff train nurses and support them as they implement the program for the first time. After the launch, clinics receive regular monitoring visits. During these visits, field staff conduct surveys with a random sample of women to make sure they understand the bracelets. They also collect data on ANC attendance and bracelet distribution from weeks when they were not present, to make sure that bracelets are given out properly even when field staff are not there to observe.
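As an illustration of the kind of consistency check this monitoring enables, the sketch below flags clinic-weeks where the number of bracelets given out does not match the number of ANC visits recorded. The field names and zero-tolerance threshold are assumptions for illustration, not the project’s actual monitoring protocol.

```python
def flag_discrepancies(weekly_records, tolerance=0):
    """Flag clinic-weeks where recorded ANC visits and bracelets given diverge.

    weekly_records: list of dicts with 'clinic', 'week', 'anc_visits'
    (from the ANC register) and 'bracelets_given' (from the bracelet log).
    """
    flags = []
    for rec in weekly_records:
        gap = abs(rec["anc_visits"] - rec["bracelets_given"])
        if gap > tolerance:
            flags.append((rec["clinic"], rec["week"], gap))
    return flags

# Example usage with made-up numbers
records = [
    {"clinic": "clinic_001", "week": "2017-W05", "anc_visits": 24, "bracelets_given": 24},
    {"clinic": "clinic_002", "week": "2017-W05", "anc_visits": 31, "bracelets_given": 25},
]
print(flag_discrepancies(records))  # -> [('clinic_002', '2017-W05', 6)]
```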

Data collection
Our main outcome data comes from administrative clinic records rather than from surveys. The main reason is that administrative data is recorded by nurses at the point of delivery, not self-reported later on. This means it does not suffer from recall bias or social desirability bias on the part of pregnant women, which can be a problem with surveys. Digitizing administrative data is also much less expensive than collecting survey data.

However, nurses are not enumerators. Recording data is not their only responsibility, and nurses may forget to record ANC visits or may record visits that did not take place in order to show high clinic attendance. To detect such irregularities, field staff regularly collect data from several different sources at the clinics. Comparing these different sources, and back-checking them with surveys, means our outcome data will be the most accurate and detailed data on ANC take-up available for these clinics. This data has relevance well beyond our experiment. The Ministry of Health and Sanitation of Sierra Leone is working hard to improve healthcare data, but it is difficult and expensive to evaluate the accuracy of various data sources. Our data will give the Ministry a clearer picture of how accurate its data sources are.
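The sketch below illustrates one simple way such a cross-check could work, comparing visit counts in the clinic register against what a back-check survey with a sample of women reports. The data structures are hypothetical; the study’s actual verification procedures may differ.

```python
def compare_sources(register_visits, survey_visits):
    """Compare ANC visit counts from the clinic register with back-check survey reports.

    Both arguments are dicts mapping a woman's ID to her number of ANC visits.
    Returns the IDs where the two sources disagree, with both counts.
    """
    mismatches = {}
    for woman_id, reported in survey_visits.items():
        recorded = register_visits.get(woman_id, 0)  # missing from register counts as 0
        if recorded != reported:
            mismatches[woman_id] = {"register": recorded, "survey": reported}
    return mismatches

# Example with made-up records
register = {"W001": 4, "W002": 2, "W003": 5}
survey = {"W001": 4, "W002": 3}  # W002 reports one more visit than the register shows
print(compare_sources(register, survey))  # -> {'W002': {'register': 2, 'survey': 3}}
```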

If the evaluation succeeds, it will achieve two things. First, we will have a better understanding of the role of social signaling in health decision-making. Second, we will be able to make recommendations on how to consistently implement the intervention and improve the accuracy of administrative data. Both will be crucial for implementing the intervention at scale.
