European societies expect science, technology and innovation to promote economic growth and improved wellbeing. Research alone is insufficient; its translation into new processes, services and products is vital. Evaluating the impact of scientific and social science research can also lead to better funding decisions.
The factors that lead to innovation are complex and constantly evolving. We use evidence to untangle that complexity, recognise connections and open channels for new thought. Our research aims to maximise the gains that science, technology and innovation can bring to society and the economy.
To inform the development of the guidance and criteria for preparing impact case studies in the Research Excellence Framework (REF) 2021, we examined the case studies submitted to REF 2014 and identified quantitative indicators of impact. We then developed guidance on how these indicators could be standardised for potential use in REF 2021.
Citizen science-based idea management platforms offer a digital social space to generate, discuss, refine and evaluate ideas. In addition, a variety of methods exist for exploring expert consensus, many of them Delphi-based. Researchers provide a practical overview of these online approaches to opinion gathering.
Researchers are developing recommendations for the College of Policing and the Education Endowment Foundation on how to assess evidence from research studies and make evidence-informed recommendations for practitioners and policymakers.
Stakeholders and 'customers' say the Eastern Academic Health Science Network's TSU—which helps implement innovative solutions to challenges faced by health and care providers—has supported them with their pilots and network building. An evaluation provides recommendations for further improvements.
Using the GCSCC’s Cyber Security Capacity Maturity Model, researchers developed a proof-of-concept toolbox that presents guidelines and approaches for government officials and cybersecurity practitioners interested in cybersecurity capacity building.
Systematic reviews are often time-consuming and costly. In the second of a three-part learning report series, RAND Europe researchers outline how crowdsourcing can make the systematic review process more efficient without sacrificing quality.
The Canadian Institutes of Health Research asked RAND Europe to update the 2009 study on grant peer review to provide a more widely applicable source of evidence on the strengths and weaknesses of peer review for grant funding assessment.
Through a rapid review of the available literature and interviews with four experts, researchers compiled a practical overview of crowdsourcing in citizen science. The report highlights the benefits of crowdsourcing, the most useful tools, and best practices.
Researchers from RAND Europe and Open Evidence are using state-of-the-art knowledge and data to understand the media literacy and online empowerment issues raised by algorithms in online media services and platforms.