What is Knowledge Exchange?
Higher Education Institutions (HEIs), such as universities, teach students and undertake research that creates new and useful knowledge. But they also work with many different types of partner to ensure that this knowledge can be used for the benefit of the economy and society - this is known as knowledge exchange (KE).
These partners range from individual members of the public who may attend events organised by a university, to a multinational company partnering with a university to develop new medicines. The activities might include hosting public events, giving businesses access to specialist equipment or facilities, undertaking consultancy, or licensing intellectual property so that others may use it. Universities often also play important roles in their local area.
Who are Research England?
Research England are part of UK Research & Innovation. We are a public body that funds Higher Education Institutions to undertake research and knowledge exchange. You can find out more about Research England on our main website.
How do I use the KEF dashboards?
A strength of the English higher education sector is its diversity. The KEF groups institutions into ‘clusters’ of peers – institutions with similar characteristics such as how much research they do and in what subject areas. Their performance is then presented alongside the average performance of this peer group. For example, you’ll see arts specialists clustered together. By not comparing everyone to everyone, you can better see how different types of institutions perform.
The KEF’s interactive dashboards are designed to provide details on a wide range of activities across seven ‘perspectives’ of knowledge exchange – from how institutions approach community engagement and contribute to local growth, to the volume of work they undertake with businesses.
The KEF allows you to compare the performance of an institution relative to their peers, but it is important to note that the dashboards are not intended to be used to derive a single overall ‘score’ (or rank institutions into a league table). We do not expect institutions to be above the average score for their peer group in every area; rather, the dashboards enable you to explore the data and descriptions of activity as follows:
KEF Overview level
At the perspective level, we display the performance of each institution as a decile score, where each decile represents one tenth of the KEF provider population. For example, a decile score of ‘1’ indicates that the institution is in the top-performing 10% of the KEF population.
Each perspective decile is calculated from the three-year average of a range of metrics. The selected institution’s decile score is displayed in relation to the average decile score of their peer-group cluster. You can therefore see whether the selected institution is performing above or below the average for their cluster.
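To make the scoring mechanics concrete, the sketch below shows one way such a score could be computed. It is purely illustrative and is not the official KEF calculation (see the KEF Technical notes for that): it assumes each provider already has a three-year average for a perspective, that higher values indicate more knowledge exchange activity, and that decile ‘1’ holds the top-performing 10% of providers.

```python
# Illustrative sketch only - not the official KEF methodology.
# Assumes: higher three-year averages indicate more KE activity,
# and decile 1 contains the top-performing 10% of providers.
import math

def decile_scores(three_year_averages: dict[str, float]) -> dict[str, int]:
    """Rank providers by three-year average and bucket them into deciles 1-10."""
    ranked = sorted(three_year_averages, key=three_year_averages.get, reverse=True)
    n = len(ranked)
    return {
        provider: min(10, math.floor(rank / n * 10) + 1)
        for rank, provider in enumerate(ranked)
    }

# Hypothetical providers and their three-year perspective averages
print(decile_scores({"HEI A": 0.82, "HEI B": 0.35, "HEI C": 0.61, "HEI D": 0.12}))
# -> {'HEI A': 1, 'HEI C': 3, 'HEI B': 6, 'HEI D': 8}
```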
Detailed information about the mathematical calculations used to determine the scores is provided under KEF Technical notes.
By tapping or hovering over a segment of the chart you can drill down to more detailed information about the metrics that make up that perspective for the selected institution.
Institution context
Adjacent to the polar area chart is the ‘Institution Context’ - a 120-word summary, with a link below it to the full description. This introduces the organisation, including its mission, strategic priorities and areas of strength.
Cluster summary
Below the Institution Context is the ‘Cluster Summary’, a description of the key characteristics of the selected institution’s cluster.
Perspective level
At the perspective level, the individual metrics that make up the perspective decile score are displayed as bar charts, with the selected institution shaded orange, showing:
- Three-year average across all metrics
- Three-year average for each of the metrics that make up the perspective decile.
A dotted horizontal line on each chart denotes the cluster average for that metric. Note that the bars are scaled relative to all institutions for each metric (i.e. 100% is the highest value and 0% is the lowest, irrespective of the actual minimum and maximum values for the metric).
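As a rough illustration of that scaling (an assumption-laden sketch, not the KEF’s published code), the function below rescales one metric’s three-year averages so that the highest value across the institutions shown maps to 100% and the lowest to 0%:

```python
# Illustrative sketch only - mirrors the relative (min-max) scaling described
# above, where bar lengths reflect the highest and lowest values observed for
# a metric rather than any absolute scale.
def scale_metric(values: dict[str, float]) -> dict[str, float]:
    lo, hi = min(values.values()), max(values.values())
    span = (hi - lo) or 1.0  # avoid division by zero if all values are equal
    return {hei: 100 * (v - lo) / span for hei, v in values.items()}

print(scale_metric({"HEI A": 2.0, "HEI B": 8.0, "HEI C": 5.0}))
# -> {'HEI A': 0.0, 'HEI B': 100.0, 'HEI C': 50.0}
```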
Metric level
At the metric level, detailed information about a single metric is displayed for the selected institution in the following formats:
- Relative performance - Scaled bar chart showing the selected institution’s three-year average compared with other cluster members.
- Trend data - Line graph showing the performance against the metric for the selected institution over a three-year period.
- Annual performance relative to cluster – Sortable table showing how the metric has changed over the last three years.
Perspective narrative statements
For two of the perspectives, Local Growth and Regeneration and Public and Community Engagement (shaded in grey), the currently available metrics are limited. We have therefore asked institutions to provide additional narrative to help explain their work in these areas.
Each narrative statement (maximum 2,000 words) can be accessed by hovering over the relevant perspective segment and following the link provided.
The narrative statements are designed to be factual, evidenced statements, and are structured to allow comparison between institutions, with each statement providing information about their:
- Strategies and the needs they have identified
- Activities that they undertake to address the identified needs
- Results and impacts of their activities
Detailed information about the narrative statements and the guidance provided to institutions to prepare them is available on the Research England website.
Clusters
Here you can find an overview of all the KEF cluster average polar area charts, enabling you to compare the relative strengths of each. Selecting one of the cluster charts along the bottom of the screen displays the membership of the selected cluster and its average decile scores for each perspective.
Clustering for fair comparison
The purpose of clustering is to group the KEF participants into KEF clusters that have similar capabilities and resources available to them to engage in knowledge exchange activities.
It is important to note that the KEF clusters are not ranked in any way; they are intended to promote fair comparisons between similar types of institutions in a very diverse sector.
More detailed information about the clustering process is provided under KEF Technical notes and the full details may be found in this report.
Find an institution
This function enables you to search for a given institution either by typing part of the name in the search bar and pressing enter, or by clicking on the interactive map of institutions by location.
Once your selected institution’s full name appears in the search results, select the name and click on the pop-up hyperlink, or select ‘KEF overview’ in the menu bar, to see the overview dashboard for that institution.
Comparing institutions
This function enables you to select two different institutions, either from the same cluster or from different clusters, and see their high-level decile dashboards side by side. Select or hover over individual perspectives to see further detail of the metrics for either institution. Care should be taken when comparing institutions from different clusters – you should always consider their performance relative to their cluster.
How did the KEF come about?
The timeline below provides a brief overview of the development of the KEF, including links to key documents published during the development process.
KEF development timeline
- February 2022
- March 2021: Publication of the first iteration of the KEF.
- October 2020: Deadline for participating institutions to submit narrative statements to Research England.
- April 2020: We publish a circular letter outlining the revised timescales for the KEF to allow for the disruption caused by the Covid-19 pandemic.
- March - April 2020: We host two webinars to assist higher education institutions participating in the KEF with their preparations.
- March 2020: We publish final details of the KEF cluster descriptions and cluster placements. We also publish templates for the submission of KEF narrative statements.
- January 2020: We publish the KEF decisions report, setting out how we will implement the first iteration of the KEF, with detailed information about the data sources that will be used and the methodology for presenting the data.
- August 2019: We publish a report detailing the outcomes of the KEF consultation and pilot workshops.
- March - May 2019: We hold five KEF pilot workshops with 21 volunteer HEIs from across the sector.
- January 2019: We publish a consultation on proposals for the KEF, including an invitation to participate in a series of pilot KEF workshops.
- November 2018: We (Research England) publish a technical report of a cluster analysis of English Higher Education Institutions to inform the development of the KEF.
- April 2018: Research England assumes responsibility for development of what was to become known as the Knowledge Exchange Framework (KEF) as part of its wider KE policy and funding remit.
- November 2017: The Minister of State for Universities, Science, Research and Innovation commissions the Higher Education Funding Council for England (HEFCE) to provide more information about higher education institutions’ achievements in serving the economy and society for the benefit of the public, business and communities.
Who can I contact if I have further questions?
Please email KEF@re.ukri.org.