Analysis of Research and Education Indicators to Support Designation of Academic Health Science Centres in England
RAND Health Quarterly, 2013; 3(2):2
RAND Health Quarterly is an online-only journal dedicated to showcasing the breadth of health research and policy analysis conducted RAND-wide.
In April 2013, the Department of Health (DH) announced an open competition to designate Academic Health Science Centres (AHSCs) in England. To support the current competition, the DH commissioned RAND Europe to compile and analyse various types of publicly available data and quality assessments in the domains of medical research and health education.
This article presents the results of this analysis in the form of summary “tables of excellence,” focusing on medical schools/academic partners likely to seek AHSC status. A detailed bibliometric analysis of health-related research publications has also been carried out and is presented. In addition, the article provides an overview of the publicly available data and outlines the significant caveats to using the data to produce education and research rankings for institutions.
Given the various caveats and the requirements to balance two domains of activity (research and education), the ranking methodology presented in this article can be used in an “advisory” capacity to provide a general indication of the quality of the candidate AHSC institutions. The analysis is intended to assist potential applicants in deciding whether to submit a pre-qualifying questionnaire as part of the procurement process, and subsequently to inform the deliberations of the selection panel for the AHSCs.
The Department of Health (DH) announced an open competition in April 2013 to designate Academic Health Science Centres (AHSCs) in England. The DH awarded AHSC status to institutions in England for the first time in March 2009. To support the current competition the DH commissioned RAND Europe to compile and analyse various types of publicly available data and quality assessments in the domains of medical research and health education. This article primarily focuses on medical schools/academic partners likely to seek AHSC status but, where available, an analysis of research and education quality metrics has also been presented for NHS institutions in England.
This article presents the results of the analysis in the form of summary “tables of excellence.” A consolidated table showing the research and education domain ranking lists is presented below (Table 1). To provide an impression of the relative performance of the listed institutions across the research and education domains, a colour code is applied to each indicator ranking list in the table.
It should be noted that the analysis is not intended to provide a definitive shortlist of institutions. Several caveats about the methods we have used should be borne in mind when reviewing the results. Our analysis is perhaps overly selective in that we have examined only the performance of universities with medical schools, whereas partnerships of any NHS provider/university in England that judge themselves able to demonstrate the characteristics of an AHSC and to meet the published designation criteria may submit a pre-qualifying questionnaire. A more serious concern is that most of the indicators are proxies for quality. Because we are depending on pre-existing indicators, the unit of analysis for which an indicator is compiled is sometimes not the same as our unit of analysis. Furthermore, some of the indicators used as quality measures in either the education or research domain are in fact basket indicators that span both domains, which raises the related issue of multiple counting of indicators across the two domains. Another key issue to bear in mind when interpreting the results is that rankings exaggerate the differences between institutions whose performance is similar, and understate the differences between institutions whose performance is far apart.
Possibly the most significant weakness of the approach arises where we have had to combine sub-indicators for each institution into a single institutional indicator for ranking. In these instances, we have had to generate our own rankings by devising a way to combine an institution's scores into one indicator. Because there is no accepted way of combining such scores, this leaves us open to the criticism that we have used an arbitrary or inappropriate method of combination. Furthermore, university ranking lists that are compiled on a rolling basis are subject to temporal fluctuations (in part because of changes in the methodologies applied to arrive at the rankings). We have not carried out any time series analysis of the ranking data and have restricted our analysis to the most recently available data. Finally, we note that there are a number of well-known limitations to bibliometric analyses, and our results need to be considered within that context.
Given the various caveats and the requirements to balance two domains of activity (research and education), the ranking methodology presented in this article can be used in an “advisory” capacity to provide a general indication of the quality of the candidate AHSC institutions. The analysis is intended to assist potential applicants in deciding whether to submit a pre-qualifying questionnaire as part of the procurement process, and subsequently to inform the deliberations of the selection panel for the AHSCs.
Table 1
An Indicative Alphabetical Institutional Ranking List for the Research and Education Indicators (rank 1–5: green cell; rank 6–10: yellow cell; rank 11–20: blue cell)
The research described in this article was prepared for the Department of Health (England) and conducted by RAND Europe.
RAND Health Quarterly is produced by the RAND Corporation. ISSN 2162-8254.