
Research Assessment in the UK: Changing rules, tactics and impact.

Explore the evolving rules and tactics of research assessment in the UK, and their impact on institutions and individuals. Learn about REF 2021 and Glasgow's approach, and discover the outcomes and behaviours driven by research assessment.



Presentation Transcript


  1. Research Assessment in the UK: Changing rules, tactics and impact. Prof David Fearn, Dean for Global Engagement (Americas) UNESP - Monday 8th October 2018

  2. Overview • Introduction • History • Evolution of rules • Playing the game – tactics • Preparation • Impact • REF2021 and Glasgow’s approach • Conclusions

  3. What are the outcomes? • Grade or, more recently, quality profile. • This drives: • Reputation and league tables. • Allocation of some £1.6b unrestricted funds (£45m to UofG). (Most often, the detailed allocation formula is only decided after the results are published, and can change from year to year.) • Behaviours: • Short-term, to optimise submission. There may be a compromise between reputation and funding. • Medium-term, to improve and demonstrate research quality.
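Although the detailed allocation formula is only fixed after results are published and varies from year to year, quality-related funding of this kind is distributed as a pro-rata share of a fixed pot, weighted by quality profile and staff volume. A minimal sketch of that shape, under entirely hypothetical weightings and numbers (none of these figures are the published formula for any year):

```python
# Illustrative only: the real QR allocation formula is set after results
# are published and changes between exercises. The 4:1:0:0 weighting below
# is a hypothetical example of "removal of funding from lower grades".
GRADE_WEIGHTS = {"4*": 4.0, "3*": 1.0, "2*": 0.0, "1*": 0.0}

def weighted_volume(profile: dict, fte: float) -> float:
    """Quality-weighted research volume for one submission.

    profile -- fraction of activity at each grade, e.g. {"4*": 0.3, ...}
    fte     -- staff FTE returned (the volume measure)
    """
    return fte * sum(GRADE_WEIGHTS.get(g, 0.0) * share for g, share in profile.items())

# Funding is then distributed pro rata to weighted volume:
units = {
    "unit_a": weighted_volume({"4*": 0.30, "3*": 0.50, "2*": 0.20}, fte=40),  # 68.0
    "unit_b": weighted_volume({"4*": 0.10, "3*": 0.60, "2*": 0.30}, fte=40),  # 40.0
}
pot = 1_000_000  # hypothetical pot, not the real £1.6b
total = sum(units.values())
shares = {u: pot * v / total for u, v in units.items()}
print(shares)  # unit_a ~£629,630; unit_b ~£370,370
```

Note how steeply the hypothetical 4:1 weighting rewards 4* work: unit_a and unit_b return identical FTE, but unit_a attracts over 60% more funding.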

  4. REF2014 Detailed results: http://results.ref.ac.uk/ Database of impact case studies: http://impact.ref.ac.uk/CaseStudies/

  5. History

  6. Some trends • Increasing focus on rules that are transparent and comprehensive. • Increasing focus on standardisation of rules and mechanisms to ensure equality of standards between different Units of Assessment. • Increasingly formal processes for appointment of Panel and Sub-Panel chairs and members. • Improvements in providing panel members with easy access to outputs. • Changes in rules on who can be submitted, and allowing for special circumstances. For example, only 2 publications required for ECRs. • Removal of funding from lower grades. Grade inflation. • Increasing effort in institutions, analysing outcomes in detail and making preparations for the next exercise. Estimated cost for RAE2008 = £50m.

  7. Some tactics • Recruitment of “star” researchers prior to the census date. • Retention of “star” researchers beyond normal retirement age (on fractional contracts). • Selection of staff for submission. Particularly critical pre-2008, when, for example, the reputational and financial difference between a 4 and a 5 was huge. But what are the implications for staff not submitted? • Encouraging/incentivising staff to apply/agree to be panel members.

  8. REF 2014 vs REF 2021: Overview • REF 2014 • Selective return: 83% for Glasgow, typical of other RG HEIs • 4 outputs per person (reduced tariff applied to ECRs and other circumstances) • Portable outputs • 36 UoAs • Score weightings: 65% outputs, 20% impact, 15% environment • REF 2021 • Full return of staff with “significant responsibility to undertake research” • Average number of outputs per staff FTE is 2.5 • Minimum of one output and maximum of five outputs attributed to each staff member • Transition to non-portability of outputs • 34 UoAs • Score weightings: 60% outputs, 25% impact, 15% environment

  9. REF 2021: Developmental Timetable

  10. REF 2021: General Decisions • As in REF 2014, each submission will be assessed according to: • output quality • impact • environment • Profile weightings have been revised: • Outputs 60% (down from 65% in REF 2014) • Impact 25% (up from 20% in REF 2014) • Environment 15% (unchanged from REF 2014) • All institutions will be required to provide a Code of Practice, which will include their process for selecting outputs.
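The published weightings combine the three sub-profiles into an overall quality profile as a simple weighted average. A minimal sketch (the sub-profile figures are invented for illustration):

```python
# REF 2021 profile weightings: outputs 60%, impact 25%, environment 15%.
WEIGHTS = {"outputs": 0.60, "impact": 0.25, "environment": 0.15}

def overall_profile(sub_profiles: dict) -> dict:
    """Combine sub-profiles (percentage of activity at each star level)
    into an overall quality profile using the REF 2021 weightings."""
    grades = ["4*", "3*", "2*", "1*", "unclassified"]
    return {
        g: round(sum(WEIGHTS[k] * sub_profiles[k].get(g, 0.0) for k in WEIGHTS), 1)
        for g in grades
    }

# Hypothetical sub-profiles for one submission (percentages summing to 100):
subs = {
    "outputs":     {"4*": 30.0, "3*": 50.0, "2*": 20.0},
    "impact":      {"4*": 40.0, "3*": 40.0, "2*": 20.0},
    "environment": {"4*": 25.0, "3*": 50.0, "2*": 25.0},
}
print(overall_profile(subs))
# -> {'4*': 31.8, '3*': 47.5, '2*': 20.8, '1*': 0.0, 'unclassified': 0.0}
```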

  11. REF 2021: Staff • The census date for staff = 31 July 2020. • All staff with “significant responsibility for research” should be returned. • Eligible staff are those who, at the census date: • have a contract of employment of 0.2 full-time equivalent (FTE) or greater • have ‘research’ or ‘teaching and research’ as their primary employment function • are an independent researcher • and are on the payroll of the submitting institution. • For all returned staff on the minimum 0.2 FTE contract of employment a statement will be required that describes the connection of the staff member to the submitting institution. • Research-only staff: definition of ‘independent researcher’ to be clarified by the funding bodies in due course. Research assistants are not included in this category. • ORCID not mandatory but strongly encouraged.
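The eligibility criteria above read as a straightforward conjunction. A sketch of the mechanical part of the check at the census date (field names are my own, not from any REF system; the “significant responsibility for research” judgement is deliberately left out because it is not a mechanical test):

```python
from dataclasses import dataclass

@dataclass
class StaffRecord:
    # Illustrative field names, not from any REF system.
    fte: float                  # contracted full-time equivalent
    employment_function: str    # "research", "teaching and research", ...
    independent_researcher: bool
    on_payroll: bool            # on the submitting institution's payroll

def ref2021_eligible(s: StaffRecord) -> bool:
    """Mechanical part of the eligibility test at the census date
    (31 July 2020), per the criteria listed above."""
    return (
        s.fte >= 0.2
        and s.employment_function in {"research", "teaching and research"}
        and s.independent_researcher
        and s.on_payroll
    )

# Example: a 0.5 FTE teaching-and-research member of staff on the payroll.
print(ref2021_eligible(StaffRecord(0.5, "teaching and research", True, True)))  # True
```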

  12. REF 2021: Outputs • Assessment period = 1 January 2014–31 December 2020. • Average number of outputs required per FTE = 2.5. Number of outputs for each submission = 2.5 x FTE of submitted staff. • A minimum of one output and a maximum of five outputs may be attributed to each submitted member of staff. • For staff who have left (including those who are retired or deceased) a maximum of five outputs may be attributed. • A staff member can be a co-author on > 5 submitted outputs, where these are attributed to other staff in the unit. • Portability: Outputs may be submitted by both the institution employing a member of staff on the census date and the originating institution where the staff member was previously employed when the output was first made publicly available. This will allow the submission of outputs by staff who have left the institution, including those who have moved into a different sector, died or retired. • Open access: Articles and conference proceedings must be deposited in the University repository no later than three months after the acceptance date.
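The output requirement is simple arithmetic: a pool of 2.5 outputs per submitted FTE, with a per-person floor of one and ceiling of five. A sketch of a feasibility check (function names are my own; rounding to the nearest whole number is assumed):

```python
def required_outputs(total_fte: float) -> int:
    """Size of the output pool: 2.5 outputs per submitted FTE,
    rounded to the nearest whole number (assumed rounding)."""
    return round(2.5 * total_fte)

def attribution_feasible(n_staff: int, total_fte: float) -> bool:
    """Check the pool can be attributed with a minimum of one and a
    maximum of five outputs per submitted member of staff."""
    pool = required_outputs(total_fte)
    return n_staff <= pool <= 5 * n_staff

# Example: 20 staff totalling 18.8 FTE -> a pool of 47 outputs,
# comfortably between the bounds of 20 (1 each) and 100 (5 each).
print(required_outputs(18.8), attribution_feasible(20, 18.8))  # 47 True
```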

  13. REF 2021: Impact • Period for underpinning research = 1 January 2000–31 December 2020. • Impact must occur 1 August 2013–31 July 2020. • Number of impact case studies: one case study, plus one further case study per (up to) 15 FTE staff returned for the first 105 FTE, then one further case study per (up to) 50 FTE returned thereafter. • i.e. each unit will be required to submit a minimum of two case studies. • The research underpinning impact case studies must be assessed by the panel to be of at least 2* quality; however, there is an intention to broaden the definition of underpinning research to include a ‘wider body of work or research activity’. • Case studies can be a continuation of a REF 2014 case study, but they will need to be identified as a continued case study and provide evidence of additional impact between 1 August 2013 and 31 July 2020.
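The sliding scale for case-study numbers is worth making explicit. A sketch of the rule as stated on this slide (the ceiling-based reading of “per up to N FTE” is my interpretation):

```python
import math

def case_studies_required(fte: float) -> int:
    """One case study, plus one per (up to) 15 FTE for the first 105 FTE
    returned, then one per (up to) 50 FTE thereafter. The minimum is
    therefore two."""
    if fte <= 105:
        return 1 + math.ceil(fte / 15)
    return 8 + math.ceil((fte - 105) / 50)  # 8 = 1 + 105/15

for fte in (5, 15, 16, 105, 106, 200):
    print(fte, "FTE ->", case_studies_required(fte), "case studies")
# 5 -> 2, 15 -> 2, 16 -> 3, 105 -> 8, 106 -> 9, 200 -> 10
```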

  14. REF 2021: Environment • Template will be more structured than in REF2014 and use more quantitative data. • Template will include explicit focus on support for: • Interdisciplinary research • Collaboration with organisations beyond higher education • Impact • Equality and diversity • Open research, including where this goes above and beyond the REF open access policy requirements, and the effective sharing of research data.

  15. University REF Webpages www.gla.ac.uk/ref

  16. Preparing for REF 2021 • REF 2014 Unit of Assessment Reviews (2015–2016) • Panel-based external review of each of our REF 2014 UoAs • Lessons learned: • Increase number of 4* outputs. (Glasgow was 20/24 in the RG for 4* percentage.) • Enhance leadership skills • Tackle areas where improved performance is required • Create research focus • Interim Reviews (2017–2018) ONGOING • Panel-based external review of outputs, environment measures, impact preparations • Two exercises running in parallel: • Interim Research Reviews (outputs calibration, environment): 33 panels to meet by July 2018 • Interim Impact Reviews (case studies): 5 panels to meet by April 2018

  17. Assessment of output quality • There will be different opinions from different assessors. • Panel members will not necessarily be experts in the sub-field. • Panel members may have no more than 10–15 minutes per output. • Internal review processes need to account for this. External assessors (ideally previous panel members) can help. • Journal “quality” is not necessarily an indicator of the quality of an individual paper. • Citations are not a primary indicator of quality. • Outputs need to be well-written, in particular to emphasise the originality and importance of the work to a non-expert who is skim-reading.

  18. Interim Research Review (Physics & Astronomy) • Key input from two external assessors (Chair and deputy from the REF 2014 panel). • Focus on output quality: • The school selected a sample of 30 outputs spanning a range of quality. • A school panel individually assessed each output, then conferred to give an agreed grade on a 12-point scale. • The two external assessors similarly assessed individually, then agreed a grade. • The school and external grades were then compared. Externals scored higher on average, but there were disagreements in both directions. • Cases of significant disagreement were then discussed in detail. • The discussion was very helpful to the school panel in its calibration. • The aim is that this will help the school in its selection of outputs for REF 2021. • Key criteria: originality, significance, rigour. • Issues: reviews, follow-on papers, author contribution, interdisciplinarity, citations.
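A sketch of the comparison step described above, on the 12-point scale (the grades and the “significant disagreement” threshold are illustrative):

```python
# Illustrative calibration data on the 12-point scale; not real grades.
school   = {"output_01": 9,  "output_02": 7,  "output_03": 11, "output_04": 5}
external = {"output_01": 10, "output_02": 10, "output_03": 11, "output_04": 4}

THRESHOLD = 2  # assumed definition of "significant disagreement"

diffs = {k: external[k] - school[k] for k in school}
print("mean external - school:", sum(diffs.values()) / len(diffs))  # 0.75: externals higher on average

# Flag cases of significant disagreement (in either direction) for discussion.
for output, d in diffs.items():
    if abs(d) >= THRESHOLD:
        print(f"{output}: school {school[output]} vs external {external[output]} -> discuss in detail")
# output_02: school 7 vs external 10 -> discuss in detail
```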

  19. Impact: Definition of terms • Knowledge exchange: systems and processes by which knowledge, expertise and skilled people transfer between the research environment and user communities, including the general public. • Impact: the contribution of research to the economy, society, culture, public policy or services, health, the environment or quality of life, beyond academia – demonstrable, evidenced, measurable change/benefit. Examples: new product, policy change, new practice, new model, public debate, new understanding, informing education, spin-out company.

  20. What did we need to show in our narratives?

  21. How did we assess our potential case studies? Candidate case studies were mapped by reach and significance (e.g. narrow reach, high significance).

  22. What was the process for production? • Conducted a pilot first to arrive at the most effective process. • Writing team was centrally based, with support from inside the Colleges; writing was shared (in some instances academics were more comfortable writing the first draft). • Steering Group of senior academics, chaired by the VP Research, reviewed and assessed the case study drafts and returned them for amendment. • Trawled for stories in the Colleges, taking suggestions and also looking at major grant holders. • Conducted in-person interviews to get the ‘story’ first and to identify potential impacts.

  23. Selection of final submission • Pipeline – extra stories developed in case some fell by the wayside • Look into the past (retired or relocated staff) • Consider ‘reach’ and ‘significance’ • Keep checking against assessment criteria • Those furthest along the pathway to impact • Most believable (no unsubstantiated claims) • Choose your strongest rather than trying to get a representative spread

  24. Impact • Separate internal review process: broader-based, at panel level rather than UoA level. • Each review included four external reviewers who had been REF 2014 impact assessors. • For example, Panel D: 41 impact pro-formas across 12 UoAs reviewed. • Critical issues: • Writing and presentation. • Reach and significance. Including a weak strand to broaden reach may reduce the final score. • Evidence. • The research needs to be central to, and linked back from, the impacts.

  25. Writing and preparation • The title and 100-word impact summary are critical: • the title should state the impact claimed. • the summary should contain all the critical points, and identify: • the issue or context. • the research and its contribution to resolving that issue. • the scale and importance of the change that has been achieved. • Write in plain English, and avoid words such as ‘will’, ‘might’ or ‘maybe’ that could introduce doubt. • Answer the ‘exam question’, i.e. provide the requested information. • Focus on the strongest impact strands. • Be creative in presenting the information; this can include images, for example. • Clearly link the underpinning research to the impact.

  26. Evidence • Plan early. • Include important information in the case study itself; don’t depend on the assessor clicking through to links. • Quantitative evidence is not necessarily stronger. • The strongest cases included a variety of evidence types; don’t depend on a single supportive source. • For collaborative work, the UoG researchers’ contribution to impact generation needs to be articulated.

  27. What did our impact profile look like? Our narratives fell into a set of themes and subcategories (covering every impact strand cited in our 143 case studies).

  28. Promoting our research excellence • Six broad, cross-disciplinary areas of research with a track record of attracting major external investment. • Launched May 2017. • www.glasgow.ac.uk/research

  29. Conclusions • Major exercise determining reputation and distribution of funding. • Know the rules. • Encourage/support your staff to be involved with assessment panels. • Encourages positive behaviours: • Quality, not quantity of outputs. • Well-written outputs. • Impact. • Are internal processes such as annual review/promotion/recruitment aligned? • And don’t forget Teaching. (We now have the TEF in the UK.) • And then there will be the KEF (Knowledge Exchange Framework).

  30. Questions. #UofGWorldChangers @UofGlasgow
