Asking the right question to get the right policy

  • Blog Post Date 04 April, 2016

There is consensus in the development community on the importance of bridging the gap between researchers and practitioners; however, misaligned incentives underlie this gap. In this article, Pande, Moore, and Dodge of Harvard Kennedy School explain how bringing policymakers together with researchers to work more iteratively ensured that data from MNREGS – the world’s largest public works programme – became accessible and relevant to those who use it.

If the development community agrees on one thing, it is the need to ‘bridge the gap’ between researchers and practitioners. If it agrees on a second thing, it’s that misaligned incentives underlie this gap.

The usual narrative goes like this:

Policymakers are under pressure from higher-ups in their ministries who, in turn, face pressure from officials with an eye on the upcoming election. This creates demand for short project cycles and evidence that is specific to their little corner of the world.

By contrast, researchers seek to make a major contribution to their field. This reduces their incentives to investigate quirky local phenomena and puts them on a much longer timeline than policymakers.

But, the usual narrative presumes that the only task at hand is to identify and test the policy solution. Before you arrive at the solution, you need to identify the problem. If you find a way to bring researchers and practitioners together at the stage of problem identification, you may align their incentives and create valuable collaborations.

While no easy task, our recent work provides some perspective on how it can function. With support from the UK Government Department for International Development’s (DFID) Building Capacity to Use Research Evidence programme, our team based at Harvard Kennedy School’s Evidence for Policy Design (EPoD) is working with the Government of India to make administrative data from the Mahatma Gandhi National Rural Employment Guarantee Scheme (MNREGS) – the world’s largest public works programme – usable by programme officials, researchers, and the public.

MNREGS benefits up to 50 million households at an annual cost of nearly US$5.47 billion. When we began our collaboration in 2013, MNREGS had one of the largest databases for a social programme in the developing world. But the website providing access sprawled over thousands of pages and required extensive knowledge to operate.

As we worked alongside technicians at India’s National Informatics Centre (NIC) on what would become the MNREGA Public Data Portal, we identified factors that had effectively buried this gold mine of data. Predictably, some related to capacity (overstrained servers, few computers), and others to bureaucratic quirks (measures against cronyism had prevented the hiring of technical staff).

But a key set of factors related to organisational structure. After witnessing frequently mistimed communications between bureaucrats and technicians, we introduced principles of agile software development, in which developers quickly produce prototypes, gain early and frequent feedback from the client, and then proceed through cycles of cooperative iteration.

This experience has influenced how we view project cycles. To those who fund impact evaluations and other late-stage aspects of research, we make the case for investing in the stage of joint problem discovery with the policy partner. Tweaking the problem definition upfront can lead to a very different – and potentially more positive – evaluation outcome. The delivered policy solution will be more likely to work at scale and to match policymakers’ needs.

Our embedded collaboration has, for now, produced a durable product. Since the launch of the Data Portal, the Ministry itself has maintained and updated the platforms, making them more robust and versatile. As we write, one component of the Portal alone, the Reports Dashboard, is approaching 150,000 views by over 100,000 users in one year.

The impact evaluation of the Data Portal must wait for future stages where we will measure how policymakers are using the data. But the nature of the collaboration makes it more likely our results will be policy relevant. And both researchers and policymakers can reap the benefits of a functioning product in the meantime.

This article first appeared on Alliance for Useful Evidence on 14 October 2015.

