
commentary

(The Royal Society)

November 30, 2018

Evidence Synthesis—Behind the Scenes


by Sarah Giles, Susan Guthrie, Catriona Manville

In September, the Royal Society and RAND Europe published an evidence synthesis on the impacts of ammonia emissions from agriculture on biodiversity (PDF). As well as aiming to provide a useful summary of the evidence to inform ongoing policy discussions, the study was intended to test the Royal Society's recently developed principles for good evidence synthesis for policy.

The principles are intended to provide underpinning guidance and standards for producing an evidence synthesis to inform policymaking. They should be applicable to any topic, any context and any scale of evidence synthesis. But how did they hold up in practice?

Air quality is a tricky policy issue: it crosses both health- and environment-related domains, and the industries and behaviours responsible for emissions are often not those that suffer the consequences. Synthesising evidence on an air quality topic was therefore a perfect test for our principles!

To make things even more challenging, we also wanted to work to a deadline, so we set ourselves a time limit of three months for the study. In the end, this was enforced by a hard deadline—the report had to be ready for submission to a consultation on the UK's clean air strategy in mid-August.

The principles for good evidence synthesis for policy (PDF) were an effective and useful tool to guide and inform our methodological approach and test our progress throughout the project.

The bullet points below summarise some of these benefits, along with the main lessons learnt and the challenges and trade-offs we encountered.

  • The value of involving policymakers
    This is a key aspect of the 'inclusive' principle, and we found engaging with the relevant Defra and Public Health England teams to be extremely valuable. This was particularly true at the outset of the synthesis, where it helped us refine the exact study question and ensure the synthesis would be useful and used. The time and effort involved were relatively minimal, yet slight tweaks to the exact framing made all the difference to the report's usefulness and impact.
  • The role of good templates to ensure consistency and reduce bias
    This was particularly important when applying the principles to a relatively rapid three-month synthesis. Related to the 'rigorous' principle, we screened and read a large volume of literature quickly and systematically by drawing on staff from the wider Royal Society and RAND Europe teams. Templates were vital for ensuring that the same information was extracted from each paper and from each expert interview.
  • The value of non-subject experts
    The fact that none of the core project team had an academic background in air quality or ammonia was actually very beneficial for reducing bias and ensuring accessibility. We relied entirely on our systematic screening of the literature, together with expert interviews and expert review, to build up a picture of the evidence, and then presented that evidence clearly, in lay terms. We had no natural bias towards particular research groups, papers or schools of thought, and did not naturally reach for subject-specific jargon.
  • Some aspects of the evidence synthesis process are inherently prone to bias
    It is well known that rapid reviews are more prone to bias, mostly because they cannot hope to identify and review every relevant paper on a topic within a rapid timescale, and they cut corners by not double-screening papers or double-extracting data. However, a rapid approach can be very valuable when evidence is needed quickly to inform policymaking. Two areas that this synthesis highlighted as particularly bias-prone were identifying experts for interview, and selecting additional papers for review based on references cited in the papers already identified. We attempted to minimise the first by choosing interviewees predominantly from among well-published authors identified through the literature review. The process of identifying additional references, however, seems inherently biased, as it relies on the reviewer making a judgement call about whether a reference they have come across is worthy of consideration.
  • The value of a summary document
    Writing a short, accessible report is not easily compatible with being fully transparent and setting out every complexity, uncertainty and area of contention. It was also challenging to write a very concise synthesis document within a short timeframe. We addressed these challenges by producing a summary, providing more detail in the main report, and including still more, particularly on the methodology, in annexes.

Overall, the principles were a helpful tool to support the development of what we hope is a useful and relevant policy report. Based on our experience we expect that, despite some of the challenges and trade-offs, the principles should be relevant and applicable to all types of evidence syntheses for policy. We found that they provided a useful framework for discussion and we referred back to the principles on a weekly basis as we completed this synthesis.

The Royal Society is continuing to use the principles on further evidence syntheses over different timeframes and using different techniques—for example, as part of the new Living Landscapes programme. RAND Europe also continues to develop new opportunities for evidence syntheses across a range of fields.


Sarah Giles is a senior policy adviser at the Royal Society. Susan Guthrie and Catriona Manville are research leaders at RAND Europe.

This commentary originally appeared on The Royal Society on November 28, 2018. Commentary gives RAND researchers a platform to convey insights based on their professional expertise and often on their peer-reviewed research and analysis.