Examining proposal evaluation processes used in Horizon 2020

Benchmarking phrase on face of pocket watch, photo by tashatuvango/Adobe Stock

Broadly speaking, proposal evaluation processes in Horizon 2020 were fair and transparent. However, there may be scope to improve consistency and the feedback provided, and to reduce the burden and conservatism of the process.

What is the issue?

Public research and innovation (R&I) funding is central to the research system because it helps develop the R&I workforce, creates infrastructure for doing research and enables the conduct of research studies. The main method for informing decisions on the allocation of research funding is peer review. However, this has received criticism in recent decades, for issues such as waste and burden, bias against some types of research or researchers, and – perhaps most significantly – failure to effectively safeguard the quality and integrity of research. It is therefore essential to assess proposal evaluation processes to ensure that grant evaluation systems are achieving their desired aims, enabling them to support the best R&I.

How did we help?

Horizon 2020 (H2020) was the eighth framework programme of the European Union and the main mechanism for funding research and innovation in Europe, with nearly €77 billion of funding available over seven years (2014 to 2020).

RAND Europe conducted a comprehensive review of proposal evaluation processes across H2020 to inform the development and adaptation, where required, of the approach taken to proposal evaluation in the next R&I framework programme, Horizon Europe.

The research questions underpinning this work were the following:

  1. How relevant is the current proposal evaluation system to achieving the objectives of H2020?
  2. How efficient is the current proposal evaluation system?
  3. How effective is the current proposal evaluation system?
  4. What are the lessons learnt and areas for improvement for the future? What needs to be adapted to accommodate the novel features of Horizon Europe?

Our approach to addressing the research questions consisted of a comprehensive review of the literature on proposal evaluation processes, an analysis of the burden and effectiveness of existing processes in H2020, and seven case studies looking at proposal evaluation processes used by international R&I funders.

What did we find?

Our study, sponsored by the European Commission’s Directorate-General for Research and Innovation, found that, broadly speaking, proposal evaluation processes were fair and transparent. However, there may be scope to improve consistency and the feedback provided, and to reduce the burden and conservatism of the process. Specifically:

  • Work is needed to encourage more female applicants to the framework programmes. Although the evaluation processes are broadly fair across H2020 applications, the main imbalance in award holders is by gender, largely driven by the difference in application rates.
  • Most of the burden of the proposal evaluation process falls on the applicants. Survey evidence suggests they bear around 80–85% of the total burden.
  • Two-stage application processes are typically intended to reduce burden, but they may actually increase it overall. Our analysis finds that the time spent preparing the full application is much the same for single- and multi-stage proposals, so the first stage adds effort without reducing the work required later.
  • Novel approaches, such as double-blind review and lotteries, could reduce burden and bias and support more innovative research.
  • More flexible use of consensus meetings may reduce reviewer burden and also enable more flexibility in the use of reviewers.
  • There may be scope to increase the clarity and consistency of the application of funding criteria, for example through a clear framework for assessment and better alignment across actions.
  • Horizon 2020 is broadly transparent compared with international benchmarks, but there may be scope to build on this further, particularly at the pre-call stage.
  • Peer review–based proposal evaluation processes tend to be conservative, and there are indications this may apply in the case of H2020.
  • Feedback could be more focused on improvement and learning and more clearly structured around the evaluation criteria.

What can be done?

  1. Conduct regular reviews of the fairness of the process. Our analysis suggests that proposal evaluation processes are fair, but case studies suggest that it is good practice to review fairness regularly and present data openly to the R&I community. More comprehensive information should be collected on diversity and personal characteristics on the application forms.
  2. Encourage more female applicants. This could include allowing applicants to declare career breaks and having these considered within the assessment process, providing training and guidance to reviewers on the consideration of gender, and ensuring appropriate participation of women in assessment panels and other leadership positions.
  3. Limit the use of multi-stage processes.
  4. Explore and experiment with novel approaches, such as double-blind review and lotteries. Such approaches have the potential to reduce bias and conservatism in proposal evaluation processes and are likely to be acceptable to the R&I community provided they are transparently implemented.
  5. Streamline the use of consensus meetings. A remote format (as currently used) is likely to save time and money and be more environmentally sustainable, without having any impact on outcomes.
  6. Improve the clarity of assessment criteria and ensure that the evaluation process is centred around them.
  7. Consider more targeted use of reviewers depending on their expertise. This could include increasing the number of reviewers in some cases and targeting their assessment to specific aspects of the proposal and to criteria that align with their expertise.
  8. Explore the scope to use novel technologies to improve the process.
  9. Provide constructive feedback, especially for unsuccessful applicants.
  10. Consider routes to ensure innovative R&I is supported, such as targeted funding streams. The evidence on how to reduce conservatism in proposal evaluation processes is limited, but one commonly used approach is to create targeted funding streams specifically for innovative or risky R&I.
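One way to picture the partial-lottery approach mentioned above: reviewers first screen proposals against a quality threshold, and funding within the fundable pool is then decided by random draw rather than by fine-grained ranking. The sketch below is a minimal illustration of that idea in Python; the function name, data structure, threshold and budget are all illustrative assumptions, not the H2020 process or any funder's actual implementation.

```python
import random

def allocate_by_lottery(proposals, budget, quality_threshold, seed=None):
    """Illustrative partial lottery (hypothetical, not the H2020 process):
    peer review sets a quality bar; among proposals that clear it, the
    funding order is randomised instead of relying on fine-grained rank
    distinctions between closely scored proposals."""
    rng = random.Random(seed)
    # Keep only proposals judged fundable by peer review.
    eligible = [p for p in proposals if p["score"] >= quality_threshold]
    rng.shuffle(eligible)  # the lottery: random order within the fundable pool
    funded, remaining = [], budget
    for p in eligible:
        if p["cost"] <= remaining:  # fund while the budget allows
            funded.append(p["name"])
            remaining -= p["cost"]
    return funded
```

A design choice worth noting: randomisation applies only above the quality bar, so a lottery of this kind removes the pressure to make spurious distinctions between closely scored proposals without funding proposals that reviewers judged weak.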