  1. Rankings from the perspective of European universities • Tia Loukkola, Director of Institutional Development • EUA General Assembly, April 2015

  2. EUA’s work on rankings in short • 2009-2010 Council working group • Two reviews with main focus on ranking methodologies • 2011 Global University Rankings and Their Impact • 2013 Global University Rankings and Their Impact – Report II • 2012-2015 RISP project • 2014 Rankings in Institutional Strategies and Processes: Impact or Illusion? • Mapping EUA members’ participation in U-Multirank • 2015 Report on the experiences from the first round

  3. Summary of key RISP findings • While highly critical of rankings, HEIs still use them to: • Fill information gaps • Benchmark • Inform institutional decision-making • Develop marketing material • Institutional processes affected by rankings fall into 4 categories: • Mechanisms to monitor rankings • Clarification of institutional profile and adaptation of core activities • Improvement to institutional data collection • Investment in enhancing institutional image

  4. Conclusions from RISP • Institutions need to improve their capacity to generate comprehensive, high-quality data and information: • to underpin strategic planning and decision-making • to provide meaningful, comparative information about institutional performance to the public • Rankings can be an important ingredient in strategic planning… nevertheless, it is vital that each university stays “true” to its mission and is not “diverted or mesmerised” by rankings • The report ends with guidelines on how institutions could use rankings for strategic purposes

  5. Background to UMR survey • A variety of views among the membership • EUA represented on the Advisory Board • First results published in May 2014 • A survey to map the views and experiences of individual members on the initiative

  6. EUA members in UMR *47.1% of EUA members in 2014 **59.4% of EUA members in 2015

  7. UMR in short • Multidimensional ranking with seed-funding from the EC • Indicators cover 5 areas: teaching and learning, research, knowledge transfer, international orientation and regional engagement. • Data • provided by the institutions directly • international bibliometric and patent databases • student surveys (completed by students at participating institutions) • Fields covered by UMR • 2014: business studies, electrical engineering, mechanical engineering and physics • 2015: psychology, computer science and medicine
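To make the “multidimensional” idea on this slide concrete: unlike league-table rankings, U-Multirank publishes a grade per indicator, grouped under the five dimensions, rather than one composite rank. The following is a minimal, hypothetical Python sketch of that structure only; the dimension names come from the slide, while the indicators, grades and data are invented for illustration and are not UMR’s actual methodology or results.

```python
# Hypothetical sketch of a multidimensional ranking structure:
# per-indicator grades grouped by dimension, with no composite score.
# Dimensions follow the slide; indicators and grades are invented.

from collections import defaultdict

DIMENSIONS = [
    "Teaching and learning",
    "Research",
    "Knowledge transfer",
    "International orientation",
    "Regional engagement",
]

# (dimension, indicator, grade) triples; grades run from "A" (very good)
# to "E" (weak), as in U-Multirank's published results.
example_results = [
    ("Research", "Research publications", "A"),
    ("Research", "Citation rate", "B"),
    ("Teaching and learning", "Graduation rate", "C"),
    ("International orientation", "International joint publications", "B"),
]

def group_by_dimension(results):
    """Group per-indicator grades under their dimension instead of
    collapsing them into one overall score."""
    grouped = defaultdict(list)
    for dimension, indicator, grade in results:
        grouped[dimension].append((indicator, grade))
    return grouped

if __name__ == "__main__":
    grouped = group_by_dimension(example_results)
    for dimension in DIMENSIONS:
        print(dimension)
        for indicator, grade in grouped.get(dimension, []):
            print(f"  {indicator}: {grade}")
```

The point of the structure is simply that no overall ranking is computed; users compare institutions indicator by indicator within each dimension.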

  8. Survey results: 85 universities that actively participated in data provision

  9. Reasons for active participation

  10. Views on UMR indicators

  11. Resources required • Considerable resources used to provide data • Only 12% had fewer than 5 persons involved • Half involved 5-15 persons • 29.2% used more than 30 working days • 20% spent less than 10 days “The field data collection process is very time consuming. There were some difficulties in interpreting some definitions and to adjust them to the national context.” (University from Portugal)

  12. Using results • 60% are using UMR results for something; of those …

  13. Cooperation with UMR consortium

  14. Survey results: 7 universities included in UMR through publicly available data “We cannot see how U-Multirank can overcome the differences in how data is interpreted among universities and between countries.” (University from Sweden) “We had concerns about the validity of the exercise, the cost of data collection and the difficulty of quality assurance for this self-reported data.” (University from United Kingdom)

  15. Survey results: 34 universities not included in UMR

  16. Reasons for not contributing to the data collection

  17. Key findings • There is increasing interest among EUA members in taking part in UMR • Cooperation with the UMR consortium worked quite well • Benefits of participation or use of UMR results remain unclear • Data collection required considerable resources • Concerns over validity of data following difficulties in interpreting indicators • UMR struggles with reliability and comparability of the data -> how to overcome this?

  18. Conclusions • Use of rankings at institutional level is not systematic • Developing institutional research capacity is vital • Would we need an international or European common dataset?

  19. All publications are available at http://www.eua.be/Publications.aspx
