
Research Assessment Exercise (RAE), Research Excellence Framework (REF) and Quality Related (QR) funding



Presentation Transcript


1. Research Assessment Exercise (RAE), Research Excellence Framework (REF) and Quality Related (QR) funding. Ellie James

2. Dual support system. Research funding reaches universities through two streams: QR funding, allocated on the basis of the RAE, and Research Council grants.
• In 2009/10 Keele's total income for research was £18.3m
• £6.6m came from QR (based on the RAE 2008 results)
• £11.7m came from research grant income
Research Assessment Exercise:
• A national exercise, run every 6 years
• Its purpose is to assess, by discipline (Unit of Assessment, or UoA), the quality of research in each university since the last RAE
• The process is currently based on 'peer review'
• Results are the basis for allocating QR funding (set out in the HEFCE grant letter)

3. Definitions of quality levels (RAE 2008):
• 4* Quality that is world-leading in terms of originality, significance and rigour
• 3* Internationally excellent, but falling short of the highest standards of excellence
• 2* Recognised internationally
• 1* Recognised nationally
• Unclassified: below the standard of nationally recognised work, or outside the definition of research

4. Keele's RAE submissions. RAE 2001:
• 357.7 FTE staff were submitted across 27 Units of Assessment
• 80% of academic staff were submitted
• Two thirds of Keele's submissions were rated '4' or above (national to international excellence)
RAE 2008:
• A more selective approach, with high-quality, focused submissions
• Fewer staff were submitted to fewer UoAs: 286.15 FTE staff across 14 Units of Assessment
• 48% of academic staff were submitted

5. RAE 2008 results. All universities' results: 52,409 FTEs across 2,363 submissions. Keele's results: 286 FTEs submitted to 14 UoAs.

  6. Keele results

7. Conclusions from RAE 2008 results. Keele performed well in its high-volume submissions (General Engineering, Social Policy, and Business & Management). Each discipline needs to compare its results to the sector:
• Social Policy was rated 12th in the sector (out of 68)
• History and Music had 20% of work rated 4* (above the sector average of 17%)
• The sciences did well
• Law and Politics performed poorly
• Disappointing results for Primary Care and Astrophysics compared to the sector (but not financially….)

8. Overall feedback from RAE 2008:
• The research environment benefited from the establishment of research institutes
• A high proportion of early-career researchers
• Below-average numbers of postgraduate research students
• Below-average research income

9. Translating results into funding:
• The 'rules of the game' are never known in advance
• Each UoA is allocated a funding 'pot'
• HEFCE decided to weight funding as follows: 1* attracts no funding; 2*/3*/4* are funded in the ratio 1:3:9; the 2* weight was reduced to 0.294 in February 2011 (see the sketch below)
• Emphasis on science-based (STEM) subjects
In RAE 2008 a much larger number of universities were successful, which means funding is spread more widely and therefore more thinly.
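To make the weighting concrete, here is a minimal sketch of how a quality-weighted volume could be computed from a submission's quality profile. It uses the ratios stated above; the profile shares, FTE figure and function names are illustrative assumptions, not HEFCE's actual allocation model.

```python
# Minimal sketch of QR quality weighting (illustrative, not HEFCE's model).

# Funding weights per quality level: 1* unfunded; 2*:3*:4* = 1:3:9,
# with the 2* weight reduced to 0.294 from February 2011.
WEIGHTS_PRE_FEB_2011 = {"1*": 0.0, "2*": 1.0, "3*": 3.0, "4*": 9.0}
WEIGHTS_FROM_FEB_2011 = {"1*": 0.0, "2*": 0.294, "3*": 3.0, "4*": 9.0}


def weighted_volume(profile, fte, weights):
    """Quality-weighted volume for one UoA submission.

    profile -- fraction of activity at each quality level (shares sum to 1.0)
    fte     -- full-time-equivalent staff submitted
    """
    return fte * sum(weights[level] * share for level, share in profile.items())


# Hypothetical quality profile: 15% 4*, 40% 3*, 35% 2*, 10% 1*.
profile = {"4*": 0.15, "3*": 0.40, "2*": 0.35, "1*": 0.10}
print(weighted_volume(profile, fte=30.0, weights=WEIGHTS_FROM_FEB_2011))
# A submission's share of its UoA's 'pot' is proportional to this figure,
# so cutting the 2* weight shifts funding towards 3*/4* work.
```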

10. REF: main changes from the RAE
• Assessing impact
• Standardising the three elements across UoAs
• Reducing the number of UoAs (from 67 to 30) and main panels (to 4)
• Up to a maximum of 4 outputs
• Submissions due 2013, assessed 2014, informing funding from 2015/16
• Possibly including citation data (in some areas, to help assess outputs)

11. REF = RAE 2008 + impact + citations. Three elements contribute to overall excellence. The consultation proposed the weightings below; the final weighting is still to be decided, and impact will probably be less than 25% following consultation responses.
• Outputs 60%
• Impact 25%
• Environment 15%
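As a worked sketch of how the proposed weightings would combine, assuming the three elements are averaged linearly (the sub-profile scores below are hypothetical grade averages, not real results):

```latex
\begin{aligned}
\text{Overall} &= 0.60\,\text{Outputs} + 0.25\,\text{Impact} + 0.15\,\text{Environment} \\
               &= 0.60 \times 3.0 + 0.25 \times 2.5 + 0.15 \times 3.5 \\
               &= 1.80 + 0.625 + 0.525 = 2.95
\end{aligned}
```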

12. Impact categories. The REF assesses retrospective impact; the Research Councils (RCs) assess prospective impact.

13. Impact pilot. 29 institutions, 5 UoAs:
• Clinical Medicine
• Physics
• Earth Systems and Environmental Sciences
• Social Work and Social Policy & Administration
• English Language & Literature

14. Evaluation framework (work in progress). [Diagram: a timeline distinguishing underpinning research (1993-2004) from recent activity (2005-09), tracing research findings, interaction with 'users', influence and impact, and future capacity; a snapshot of activity across the 'whole unit', with two examples linking underpinning research to user interaction.]

15. Impact pilot. Submissions included:
• An impact statement for each unit
• Case studies (1 per 10 FTE staff submitted)
• Supporting 'indicators of impact'
Panels will assess:
• 'Reach': how widely the impacts have been felt
• 'Significance': how transformative the impacts have been
Time lag issue: research may have been undertaken 10-15 years earlier (but impact must be evident between 2008 and 2012).

16. Pilot outcomes
• Weighting to be agreed after the pilot completes at the end of 2010
• The pilot was successful: panels can differentiate quality
• Case study templates have since been revised
• The definition of impact (and benefit) may be broadened
• The impact statement for each unit is likely to be excluded (or folded into environment)
• 'Clarity of presentation' is key (writing teams)
• Each case study was reviewed by 4 panel members (2 users and 2 academics)
• Many case studies were based on individuals, not groups
• Impact preparations are underway across the sector

17. Revised impact case study template:
1. Short summary of the case study (maximum 150 words)
2. Underpinning research (maximum 500 words): provide information about the research and the specific insights that underpin the impact or benefit claimed in this case study
3. References to the research: provide references to key research outputs, any key research grants, and evidence of the quality of the research (maximum of 10 references)
4. The contribution, impact or benefit (maximum 750 words): describe the impact or benefit and how the research contributed to it
5. References to corroborate the contribution, impact or benefit (normally a maximum of 10 references)

18. Lessons learned from pilot HEIs
• It takes time to understand the concept of 'impact', 'non-academic impact' or 'benefit'; HEIs need to raise awareness NOW
• Interim impact is ambiguous compared with final impact
• A standard approach emerged: central administration project-managing submissions; departmental academics leading the drafting; a high-level committee reviewing and giving tactical advice
• Acquiring supporting evidence is a big challenge (heavy reliance on the personal knowledge of senior academics)
• Impact will impose real additional costs
• Subject-specific challenges, e.g. English is more conceptual
• Pilot HEIs advised other HEIs to start preparations now!

19. Lessons learned from pilot panels
• The best case studies make explicit the non-academic benefit from the research
• 'Brief is best'
• Good case studies showed the link between research and impact and provided supporting evidence
• Case studies can get a high rating on either 'reach' or 'significance' (or both)
• Engagement isn't impact
• It is not convincing simply to state 'distinguished professor'
• Universities need to improve their presentation of evidence
• There are issues for new departments, early-career researchers and small submissions
• Don't expect panels to follow up references; these are just for verification

20. REF timetable
• 17 September 2010: Deadline for applications for sub-panel chairs
• 8 October 2010: Deadline for nominating panel members
• October 2010: Sub-panel chairs appointed
• November 2010: Reports from the impact pilot exercise
• December 2010: Panel members appointed
• Early 2011: Panels begin meeting
• Mid 2011: Guidance on submissions published
• Mid 2011: Panels consult on criteria
• Late 2011: Panel criteria and methods published
• Early 2013: Submission system operational
• Mid- to late 2013: Panels meet to prepare for the assessment; further nominations sought and assessors appointed
• Late 2013: Submissions deadline
• 2014: Panels assess submissions
• December 2014: Outcomes published

21. Further info:
• http://www.hefce.ac.uk/research/ref/
• New UoAs: http://www.hefce.ac.uk/research/ref/pubs/2010/01_10/
• Pilot HEIs: http://www.hefce.ac.uk/research/ref/impact/Institutions_byUOA.pdf
