RESEARCH PERFORMANCE: RECOGNISING, REPORTING, REWARDING

2014 INORMS Congress, Washington DC, 10-13 April. Michelle Duryea, Manager, Research Quality and Policy, ECU Office of Research and Innovation.

Presentation Transcript


  1. 2014 INORMS Congress, Washington DC, 10-13 April

    RESEARCH PERFORMANCE: RECOGNISING, REPORTING, REWARDING

    Michelle Duryea, Manager, Research Quality and Policy, ECU Office of Research and Innovation; ARMS (Australasian Research Management Society) Treasurer and Accreditation Program Training Fellow. Tel. (+618) 6304 2617, m.duryea@ecu.edu.au
  2. Overview
     Research performance assessment mechanisms
     Compare and discuss global models
     Definitions and key indicators
     Research excellence, quality and impact
     Other drivers of performance, i.e. rankings
     Role of research management
     Defining the research performance "goal posts"
     Incentivising and rewarding researchers
     Capturing and reporting performance data
     Questions and discussion
  3. A bit about me…
     Australian Technology Network (ATN), 2005-2007: Senior Policy Officer; facilitated the ATN RQF Trial; member of the Impact Working Group
     Australian Research Council (ARC), 2008-2009: Assistant Director, Research Evaluation Policy; involved in developing the inaugural ERA guidelines
     Edith Cowan University (ECU), 2010 to date: Manager, Research Quality and Policy; responsible for ERA submissions and the internal reward scheme
  4. Performance Assessments
     International models contrasted and compared:
     UK Research Assessment Exercise (UK RAE)
     UK Research Excellence Framework (REF)
     Hong Kong Research Assessment Exercise (HK RAE)
     New Zealand Performance-Based Research Fund (PBRF)
     Australian Research Quality Framework (RQF)
     Excellence in Research for Australia (ERA)
     What about the USA? The Center for Measuring University Performance
  5. Background
  6. Similarities: Some "Magical" Numbers
     Assessment Period = 6 years (for REF impact, underpinning research from 1993 onwards, i.e. 20 years)
     Selection of 'Best' Outputs = 4 (exception: ERA requires comprehensive reporting, with nomination of 30% of outputs for peer review in certain disciplines)
     Assessment Outcomes Ratings Scales = 5 points*
     Assessment Indicators = output quality is central, "the most universally recognized token of research performance."1
     Assessment Process = peer/expert review (international)
     1 The Center for Measuring University Performance, The Top American Research Universities 2012 Annual Report (p. 4)
  7. Purpose
     Validation – confirming performance claims
     Justification – public investment in research
     Identification – areas of excellence/benchmarks
     Allocation – informing funding decisions/formulae
  8. Evidence Portfolios
     Research Outputs – traditional publications and non-traditional outputs; in all but the Australian ERA, declared outputs are considered on merit, i.e. no output type is considered of higher quality than any other
     Research Income – external grants and research commercialisation income
     Eligible/Active Researchers – staff summary/circumstances and esteem
     Research Students/Studentships/HDR Completions – not in ERA
     Contextual Statements – textual descriptions/explanatory statements discussing research strategy, facilities, collaborations and environment
     Research Impact – case studies or applied measures; engagement with end users and strategy for achieving impact
  9. Performance Indicators
     Research Quality – citations data, peer review, competitive income
     Research Inputs/Activity (Productivity?) – quantitative elements: total income, outputs, researchers
     Research Environment – research training (NZ PBRF includes end-user engagement)
     Research Impact – beyond academia (not Impact Factors!); research application
     Peer Esteem?
  10. What is 'research excellence'?
      "Asking to explain excellence is a bit like asking a biologist what extra-terrestrial life would look like. The answer: we don't know it, but we will when we see it."
      Claus Madsen, Senior Adviser at the European Organisation for Astronomical Research in the Southern Hemisphere2
      2 http://euroscientist.com/2013/03/towards-research-excellence-rather-than-excellence-itself/
  11. What is 'research quality'?
      Rarely explicitly defined in assessment guidelines; implied by the indicators, with disciplinary differences
      Quality of research outputs – determined by significance, originality, rigour
      Quality of research environment – determined by strategy, training, infrastructure
      Quality of researchers – determined by reputation, recognition, influence
  12. The impact debate
      Should definitions of research excellence or research quality include research impact?
      This year the ARC has started requiring 75-word statements on intended impacts or benefits (Discovery vs Linkage Grant applications)
      Research impact: "the demonstrable contribution that research makes to the economy, society, culture, public policy or services, health, the environment, or quality of life, beyond contributions to academia."
      The UK REF definition includes "the reduction or prevention of harm, risk, cost or other negative effects."
  13. Research Global Article3
      3 Research Global, February 2007 (pp. 8-10)
  14. University Rankings
      "..the rise of an industry devoted to the evaluation, assessment and ranking of academic research universities on both a national and international basis."4
      Academic Ranking of World Universities (ARWU), Shanghai Jiao Tong University, China – uses six "objective" indicators, including numbers of Nobel Prizes and Fields Medals, citation-indexed publications, numbers of articles in Nature and Science, highly cited researchers selected by Thomson Scientific/Science Citation Index, etc.
      Times Higher Education (THE) World University Rankings – "powered by Thomson Reuters", i.e. Citations = 30%
      4 The MUP Center, The Top American Research Universities 2012 Annual Report (p. 4)
  15. Rankings "reputation surveys"
      "A defect with some of these efforts is their use of the notion of prestige to define prestige, creating a circular and self-reinforcing opinion cycle that offers little basis for understanding the substance of an institution's reputation. The circularity occurs when a league table uses the results of a survey of consumers or experts who are asked to identify the most prestigious institutions as evidence that a university is indeed prestigious. This approach may well identify an element of public opinion, but it does not provide information on the substance of university performance that is the basis of prestige."5
      5 The Center for Measuring University Performance, The Top American Research Universities 2012 Annual Report (p. 4)
  16. Role of research management
      How do we in research management define and communicate research performance expectations to researchers/academics within our institutions?
      Is there a need to incentivise and reward research performance at an individual researcher level?
      If we prescribe the "goal posts" and reward the goals, is there the potential for perverse outcomes/behaviours?
      Is the intention really about improving research quality or excellence, or just maximising the institution's performance-based funding and rankings reputation?
  17. University of Melbourne Law School: Definition of Research Excellence6
      A substantial body of high-quality work (outputs)
      Recognition as a leading researcher in the field (esteem)
      Research leadership in the law school or university (environment)
      Aspirational standards for scholars at all levels, including professors
      6 http://www.law.unimelb.edu.au/index.cfm?objectId=E0885610-E97D-11E0-86FF0050568D0140
  18. ECU's 'ASPIRE' Scheme
      A system for measuring and quantifying research performance that rewards both activity and quality
      Provides guidance on what the University values in research performance
      Individual incentives in support of enhancing the University's overall level of research excellence
      Major indicator categories are listed on the next slide; a quantification sketch follows it.
  19. ASPIRE Performance Indicators
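A minimal sketch (Python) of how an indicator-based reward scheme like ASPIRE might quantify performance: weighted counts per category are summed into a single score. The category names, weights and counts below are invented for illustration and are not ECU's actual ASPIRE indicators or weightings.

    # Illustrative only: categories, weights and counts are assumptions,
    # not ECU's actual ASPIRE indicators or weightings.
    from dataclasses import dataclass

    @dataclass
    class IndicatorScore:
        category: str   # e.g. "refereed_outputs", "hdr_completions"
        count: float    # verified activity for the assessment period
        weight: float   # points per unit of activity, set by policy

    def reward_score(indicators: list[IndicatorScore]) -> float:
        """Sum weighted activity across all indicator categories."""
        return sum(i.count * i.weight for i in indicators)

    example = [
        IndicatorScore("refereed_outputs", count=3, weight=10.0),
        IndicatorScore("external_income_per_10k", count=5, weight=4.0),
        IndicatorScore("hdr_completions", count=1, weight=15.0),
    ]
    print(reward_score(example))  # 65.0

In practice, a scheme that "rewards activity and quality" would also weight within categories (e.g. more points for outputs in ranked outlets) rather than treating all counts equally.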
  20. Creative Works
      ERA non-traditional research outputs (NTROs) include visual arts, creative writing, music, dance, performance, theatre, film/video/TV, digital media, recordings/renderings, exhibitions, curation, etc.
      New for 2015 ERA: "Research Report for an External Body"
      Research Statements (2,000 characters): Background – field, context, research question; Contribution – innovation, new knowledge; Significance – evidence of excellence
      Discipline-based peer review panels
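Because the research statement format above is effectively a fixed schema (a 2,000-character limit with three named parts), it can be pre-checked mechanically before submission. The helper below is a hypothetical sketch, not part of any ERA tooling.

    # Hypothetical pre-submission check: the section names and the limit
    # come from the slide above; everything else is invented.
    REQUIRED_SECTIONS = ("Background", "Contribution", "Significance")
    CHARACTER_LIMIT = 2000

    def check_research_statement(text: str) -> list[str]:
        """Return a list of problems; an empty list means the statement passes."""
        issues = []
        if len(text) > CHARACTER_LIMIT:
            issues.append(f"{len(text)} characters; limit is {CHARACTER_LIMIT}")
        issues += [f"missing section: {s}"
                   for s in REQUIRED_SECTIONS if s not in text]
        return issues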
  21. Data collection and reporting
      Research performance data is collected and validated by the research office (see the validation sketch below)
      Source systems include the grants management system, the publications system and the digital repository
      Data structures align with external reporting schema
      Data feeds into the internal EIM reporting warehouse
      Rewards also incentivise the reporting of activities
      The repository supports open access to outputs
      Reporting informs promotions and web profiles
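The collection-and-validation step described above can be sketched as a simple gate between source systems and the warehouse. All field names, output types and record shapes below are placeholders, not ECU's actual systems or the external reporting schema.

    # Hypothetical sketch of research-office validation before records feed
    # the reporting warehouse; field and type names are placeholders.
    REQUIRED_FIELDS = {"staff_id", "title", "year", "output_type"}
    VALID_OUTPUT_TYPES = {"journal_article", "book", "book_chapter",
                          "conference_paper", "non_traditional_output"}

    def validate_record(record: dict) -> list[str]:
        """Return problems found in one output record; empty means loadable."""
        problems = [f"missing field: {f}"
                    for f in sorted(REQUIRED_FIELDS - record.keys())]
        if record.get("output_type") not in VALID_OUTPUT_TYPES:
            problems.append(f"unknown output_type: {record.get('output_type')}")
        return problems

    def loadable(records: list[dict]) -> list[dict]:
        """Keep only records that pass validation, ready for the warehouse."""
        return [r for r in records if not validate_record(r)]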
  22. Summary
      "Has the 'currency' of research performance really changed?"
      Research managers have an important role in navigating the 'game' and providing guidance
      Expectations heavily dependent upon national, institutional, disciplinary and career contexts
      Hopefully we can recognise, support and nurture 'good' research wherever it occurs...
  23. INORMS 2016: 11th-15th September 2016, www.inorms2016.org