Congratulations to all of the institutions and individuals involved in what is a great achievement!
From the initial consultations in 2009 through to the submission deadline in November 2013, 155 higher education institutions across the UK dedicated a huge amount of time and resources to this process.
The results published on 17 December highlighted both the quality and non-academic impact of each institution's research and marked the culmination of years of hard work.
If you would like to learn more about how Elsevier has supported the REF 2014 and how we support other national assessments, please make sure you look at these two great resources:
REF2014 Results Analysis tool
On 16 December, all of the UK Higher Education Institutions (HEIs) that submitted returns for the REF2014 had two days to analyse their results before they were made publicly available.
That is why Elsevier built the REF2014 Results Analysis tool (approved by HEFCE for HEI use) – to allow instant assessment of performance, both nationally and relative to chosen peers, and to help institutions prepare their internal and external responses.
The web-based REF2014 Results Analysis tool includes:
- a comprehensive analysis of the overall results and sub-profiles, including the impact elements
- quality, volume and combined measures
- institutional and unit of assessment (UoA) level rankings
- regional and comparator group analysis
- various download options
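The quality measures listed above are built on each submission's REF quality profile – the percentage of activity judged at each star level. As a minimal sketch (the function names and the combined "power" measure are illustrative, not taken from the Elsevier tool itself), the standard grade point average (GPA) and a combined quality-and-volume measure can be computed like this:

```python
# Minimal sketch of REF-style quality measures, assuming a quality
# profile given as percentages of activity at each star level.
# Names and the combined measure are illustrative assumptions.

def gpa(profile):
    """Grade point average: star-weighted mean of the profile.

    profile maps star level (4, 3, 2, 1, 0) to a percentage;
    the percentages should sum to 100.
    """
    return sum(star * pct for star, pct in profile.items()) / 100.0

def power(profile, fte):
    """A combined quality-and-volume measure: GPA scaled by staff FTE."""
    return gpa(profile) * fte

example = {4: 30.0, 3: 45.0, 2: 20.0, 1: 5.0, 0: 0.0}
print(gpa(example))        # 3.0
print(power(example, 25))  # 75.0
```

The same profile arithmetic applies at both institutional and unit of assessment (UoA) level, which is why the tool can rank at either granularity.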
How it worked
- 16 December: Individual institutional results were released and individual HEIs could upload their own results to the tool.
- 17 December: Each HEI received the sector results and could then use the tool to analyse all results.
- 18 December: HEIs could view and rank submissions by the proportion of staff submitted, as derived from the HESA contextual data.
Note: This is an initial mock-up of the tool with dummy data. The final product may vary in look and functionality.
The HESA contextual data was used in the tool to calculate the proportion of staff submitted. This was available to view and there was an option to rank submissions by this measure.
We decided not to combine the proportion of staff submitted with the quality measures (e.g. GPA × proportion of staff submitted) to create a new metric.
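For illustration only, the metric described above would simply scale GPA by submission intensity; this was not adopted in the tool, and the names below are hypothetical:

```python
# Hypothetical sketch of an intensity-weighted GPA (GPA scaled by the
# proportion of staff submitted). This metric was NOT adopted in the tool.

def intensity_weighted_gpa(gpa, staff_submitted, eligible_staff):
    """Scale a submission's GPA by the proportion of staff submitted."""
    return gpa * (staff_submitted / eligible_staff)

# A GPA of 3.0 with 40 of 50 eligible staff submitted:
print(intensity_weighted_gpa(3.0, 40, 50))  # 2.4
```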
Administrative users could create comparator groups at any point after being given access to the tool. Any groups created remained available after the RAE2008 results data was removed and the REF2014 data uploaded. When an administrator created new comparator groups, these were accessible to all users within their HEI.
Institutions wishing to view rankings with small, specialist institutions removed were required to create such a group locally; the tool did not include this comparator group by default.
The following comparator groups were available in the tool from the outset:
- Northern Ireland
- Mission Groups
  - Russell Group
  - University Alliance
  - GuildHE
- English Regions
  - East Midlands
  - East of England
  - North East
  - North West
  - South East
  - South West
  - West Midlands
  - Yorkshire & the Humber
We planned the following model of user roles and rights:
Elsevier created one administrative account/user per institution (the single named contact whose details were provided when signing up for the tool). This user could:
- upload results
- create and edit comparator groups
- edit weightings/thresholds for the different measures:
  - quality index
  - threshold for the number of submissions above a certain GPA
  - threshold for the number of submissions above a certain proportion of 4*/3* activity
- create other user accounts within the institution (including other administrator accounts if needed):
  - an indefinite number of users
  - accounts must be created manually, one by one
  - a generic institutional account can be created
- view and download data
Non-admin users could only view and download data; they had no ability to create comparator groups, change the weightings, and so on.
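The two threshold measures an administrator could tune amount to counting submissions above a cut-off. A minimal sketch, with hypothetical data and field names:

```python
# Hypothetical sketch of the two threshold measures: count submissions
# above a GPA cut-off, and above a cut-off on the combined proportion
# of 4*/3* activity. Data, field names, and thresholds are illustrative.

submissions = [
    {"uoa": "Physics",   "gpa": 3.1, "pct_4star": 35.0, "pct_3star": 45.0},
    {"uoa": "History",   "gpa": 2.8, "pct_4star": 20.0, "pct_3star": 40.0},
    {"uoa": "Chemistry", "gpa": 3.4, "pct_4star": 45.0, "pct_3star": 40.0},
]

def count_above_gpa(subs, threshold):
    """Number of submissions at or above a GPA threshold."""
    return sum(1 for s in subs if s["gpa"] >= threshold)

def count_above_top_activity(subs, threshold_pct):
    """Number of submissions at or above a 4*/3* activity threshold."""
    return sum(1 for s in subs if s["pct_4star"] + s["pct_3star"] >= threshold_pct)

print(count_above_gpa(submissions, 3.0))            # 2
print(count_above_top_activity(submissions, 75.0))  # 2
```

Adjusting the thresholds changes which submissions count, which is why only administrators could edit them: every user in the HEI sees rankings derived from the same settings.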