Never mind the policymakers, it is the policy wonks that researchers should be engaging with…
Perhaps one of the laziest terms used by the research and policy community across sectors is ‘policymaker’. Research funding bids, how-to guides, blogs, academic papers and policy briefs are all awash with references to the ubiquitous policymaker. And before you point it out – yes, I am guilty of it too. Who exactly are these policymakers and how do they use research evidence? This is the question the ESRC-DFID Impact Initiative for International Development Research asked in a scoping study of evidence use behaviours amongst those working to reduce global child poverty and inequality.
A study of evidence use behaviours
We interviewed a range of senior development professionals working in global organisations such as the World Bank and the UN’s Economic Commission for Africa, in bilaterals such as the Department for International Development (DFID), in INGOs and in government ministries, as well as in research organisations. This all took place in Ethiopia during the Putting Children First conference, where 160 experts from a range of development organisations gathered to discuss how to support a global campaign to end child poverty.
On the research side, interviewees were generally upbeat. They spoke of an emerging movement to address child poverty and a more holistic view of how to achieve this. There was also strong support for governments taking research agendas forward, based on the majority view that locally generated data has more traction than studies handed down from multilaterals and international organisations.
Connections not collections
When it came to asking how the relationship between research and policy processes could be strengthened, things got a little more complicated. The two areas that seemed to be of interest were the use of evidence to:
1. Inform programme design
2. Support advocacy and influencing at a national and international level.
However, some interviewees were less keen on advocacy and instead emphasised learning. This could be the ‘advocacy bogeyman’, which I have written about before, striking again. Whatever the emphasis, the point everyone seemed to agree on was that longer-term partnerships were key. Or as one respondent put it: ‘it’s about connections not collections.’
What also fell out of the interviews was a pushback against the concept of the policymaker as the primary audience for research and evidence. The term means very little when you consider the diversity of policy actors, practitioners, donors and activists. One government official wryly explained that there was only one policymaker in his government department, and that was the Secretary of State herself.
To be fair, advocacy organisations seem pretty good on the whole at stakeholder mapping and power analysis to inform their policy communications. Many research programmes also claim to have a pretty good idea of who they are targeting with their research. Despite these pathways-to-impact approaches, the generic policymaker still crops up continually. Although this is still better than the even sillier ‘research end user’.
Our respondents were strongly divided over the assumed wisdom that researchers must produce very short and simplified briefings with specific recommendations for non-academic audiences. While a few stuck to the dominant narrative that the ‘busy policymaker’ finds short and pithy briefings particularly useful, others felt that the role of researchers was to influence the influencers with far more nuanced materials. One respondent went further, insisting that researchers should not dumb down their findings and should feel able to challenge dogma. ‘That’, he said emphatically, ‘is what research is for’.
Knowing your policy wonks from your practitioners
The central challenge seems to be that you need to go that extra mile to acquire a more nuanced understanding of your audiences’ diverse roles in change processes. You shouldn’t really be lumping together: members of parliament who are engaged but sometimes under-informed; ministers who might be well briefed but have other priorities; advisors who are highly politicised; development agency policy staff who are well read but time poor; and NGO practitioners who are technical experts but highly focused on one specific intervention.
When engaging other experts, such as policy wonks located in the World Bank, or DFID or Save the Children or practitioners in UN agencies and civil society organisations, researchers should be seeking to support them to make better use of evidence. This might mean directing them to the most appropriate papers before jumping straight into trying to provide the answers to their questions. After all, as one interviewee said: ‘we don’t always know which questions to ask’.
We have also been conducting similar scoping studies of evidence use in education and health, and we have come across similar issues that suggest a need for clarity on who researchers think they need to target and how best to approach this engagement. Our interviews challenged the idea that no one reads academic papers and everything has to be presented as a policy brief. In fact, it was suggested that in an age of perceived fake news, academic papers may be making a bit of a comeback.
We are all knowledge intermediaries
It has also emerged across our three studies that building the capacity of these knowledge intermediaries (no one we spoke to actually defined themselves as such), whose jobs involve identifying, assessing and repackaging evidence for a range of audiences, is an essential part of strengthening evidence use.
So perhaps we need to stop going on about policymakers all the time and focus much more on supporting those working in these intermediary roles. Some come from an academic background themselves and others are very capable policy analysts. However, many lack the time and the capacity needed to be effective knowledge brokers in their own institutions, as do some of the researchers trying to engage them. There is of course the most important issue of all – how the political and organisational context affects evidence use – but I’ll save that for another blog.
Five things we learnt from our study of evidence use behaviours
1. Be careful of the overly instrumental use of evidence – some agencies just go from the evaluation of one programme to another without much real learning.
2. Evidence developed in partnership is better – ‘connections not collections’.
3. Get ownership from the start – include those you hope will be influenced in the research process itself – it is ‘research as development, not for development’.
4. Beware of dumbing down – you can make research accessible to non-academic audiences without oversimplification, and some audiences might be better at assessing evidence than you think.
5. A lack of human resources for knowing what research is already available and making use of it is a major barrier to evidence-informed policy and practice. All power to the knowledge intermediaries and knowledge brokers. This sometimes applies to researchers too, who often fulfil these roles in a knowledge system.
Blog post by James Georgalakis.
James is the Director of Communications and Impact at the Institute of Development Studies (IDS), with overall leadership of the Institute’s communications, impact and engagement work. James works with the Director and IDS members and partners to lead the development and refinement of major strategic goals in this area.
James has worked in the not-for-profit sector since 2000, predominantly in media and advocacy communications roles. He joined the international development sector in 2005. Since joining IDS in 2010 he has played a key role in ensuring that IDS research has a real and lasting impact on policy and behaviours, and in strengthening the positive external perceptions and identity of the Institute. He is the Director of the Impact Initiative.
Republished with kind permission. January 2018
Blog originally published in The Impact Initiative