
Ranking effects upon students

Ranking effects upon students. Carmen Dobocan. Academic cooperation and competitiveness – University Ranking Methodologies, 17–20 September. National Alliance of Student Organizations in Romania (ANOSR), member of the European Students' Union (ESU).



Presentation Transcript


  1. Ranking effects upon students Carmen Dobocan Academic cooperation and competitiveness – University Ranking Methodologies 17–20 September National Alliance of Student Organizations in Romania (ANOSR) Member of the European Students' Union (ESU)

  2. The National Alliance of Student Organizations from Romania (ANOSR) is composed of 64 local student organizations in different university cities across the country. Mission: ANOSR acts as an informational and decision-making interface between the local level, represented by the local organizations, and the national level, the central actor of the educational environment in Romania. ANOSR's main goal is to represent the common interests of students in Romania and to stimulate their participation in educational, social, economic and cultural activities.

  3. The National Alliance of Student Organizations from Romania (ANOSR) • Principles: • The Principle of Democracy • The Principle of Legitimacy and Representativeness of the Students • The Principle of Involvement in the Decision-Making Process • The Principle of Non-Partisanship

  4. We, the students, have witnessed an increasing interest in following and developing tools for the assessment of institutional performance. Moreover, at the Leuven Ministerial Conference, performance and profiling of higher education institutions (HEIs) were set as a priority for the new decade.

  5. Context Within the basic ideological framework for ranking, in which transparency and increased quality through competition are taken as guaranteed, students face particular issues that do not receive the attention of the academic community.

  6. The confusion within… Rankings, classification, typology, mapping, benchmarking, assessing learning outcomes, transparency tools WHAT ARE WE TALKING ABOUT?

  7. Rankings Aim: provide easily interpretable information on the standing of higher education institutions Most widely known and used: Times Higher Education Supplement (THES)

  8. Ranking – declared purposes • Better information for increasingly mobile students and academics • Provide information to HEIs themselves regarding their relative position • Provide information to public authorities, prospective students and their families, employers, and wider society

  9. Ranking – declared purposes • Policy Advice – to Inform (and map) Policy Decisions • Quality Assurance of Public Spending in Higher Education • Measuring the Improvement of Institutions • Identifying Trends and Developments • Harmonization pressure

  10. Parallel view – THES The THES ‘World University Rankings’ (top 200 universities) uses the following indicators: 1. Peer Review Score – 40% (1,300 academics in 88 countries) 2. Recruiter Review – 10% 3. Citations of faculty members – 20% 4. Faculty-to-student ratio – 20% 5. International Faculty – 5% 6. International Students – 5%

  11. Parallel view – Shanghai Rankings The Shanghai Rankings (top 500 universities) use the following indicators: 1. Nobel prizes and international awards – 30% 2. Citations – 40% 3. Publications in ‘Science’ and ‘Nature’ journals – 20% 4. Compensation for small universities – 10%
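The composite scores behind both league tables follow the same weighted-sum logic. The sketch below uses the THES percentages from the slide above; the two universities and their per-indicator scores are invented purely for illustration, to show how the choice of weights, not the underlying data, can decide the final ordering:

```python
# Weighted-sum composite score, as used by ranking schemes such as THES.
# Weights follow the percentages cited on the slide; university scores
# below are hypothetical (0-100 scale), chosen only for illustration.

THES_WEIGHTS = {
    "peer_review": 0.40,
    "recruiter_review": 0.10,
    "citations": 0.20,
    "faculty_student_ratio": 0.20,
    "international_faculty": 0.05,
    "international_students": 0.05,
}

def composite_score(indicator_scores, weights):
    """Return the weighted sum of normalized indicator scores."""
    return sum(weights[k] * indicator_scores[k] for k in weights)

# Hypothetical University A: strong peer review, weak elsewhere.
uni_a = {"peer_review": 90, "recruiter_review": 70, "citations": 80,
         "faculty_student_ratio": 60, "international_faculty": 50,
         "international_students": 40}

# Hypothetical University B: better on most indicators except peer review.
uni_b = {"peer_review": 60, "recruiter_review": 85, "citations": 70,
         "faculty_student_ratio": 95, "international_faculty": 90,
         "international_students": 88}

print(composite_score(uni_a, THES_WEIGHTS))  # approximately 75.5
print(composite_score(uni_b, THES_WEIGHTS))  # approximately 74.4
```

Note that University A ranks first even though University B scores higher on four of the six indicators: the 40% peer-review weight dominates the result. This is exactly the weighting arbitrariness criticized on the next slide.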

  12. Which of the criteria mentioned are relevant for students?

  13. Critiques of rankings • Selection of indicators: no theoretical reflection; reliance on convenient data; bias. • Weighting of indicators: no theoretical underpinning; no consistency in weights. • Reliability: newspapers like the drama of significant changes year on year, which doesn't reflect real annual changes in institutions. • Statistical insignificance: rankings built on minute, insignificant differences. • Focus of attention: rankings of institutions treat them as homogeneous when there is huge variability within institutions.

  14. Dangers for students • Impact on the social dimension • Hindering mobility • Harmonization pressure according to a restricted number of criteria and promotion of a single model of institution • Financial impacts • Misleading assessments, especially when comparing evolution over time • Shift of focus from comprehensive institutional QA efforts to a check-list approach

  15. Dangers for students • No regard for student involvement • Employers may accept only students from top universities • Students seen as consumers • Less popular universities/faculties could end in failure

  16. Typology and classification Typology – phase one of the project “Classifying European Institutions” funded by EC Aim: “better understanding of the various types of higher education institutions, their mission and provisions, support to the European aim of increasing student mobility, inter-institutional and university-industry cooperation, the recognition of degrees and hence the international competitiveness of European higher education”

  17. Typology and classification • Classification • Grouping HEI according to their mission profiles (it puts institutions in “mission boxes”) • Different from typology as it addresses real institutional characteristics, not conceptual ideal profiles • Based on a spider-web scheme approach • Allowing for institutional profiling and strategy development • In theory, not a ranking

  18. Conclusions Academically interesting tools, but politically very dangerous…

  19. Dangers, or ‘why don't we just be reasonable and support typology and classification?’ • Hindering student and staff mobility – difficulty in jumping from one box to another • Leading to ranking interpretations • Potentially hindering diversity and cooperation between HEIs • A predictable influence on the financing systems of HEIs • An expensive exercise with no clear contribution to further developing the EHEA

  20. QA vs RANKING → complementary rather than contradictory QA - Built on trust and trust-building - Works on QUALITATIVE standards and procedures (the ESG, for example), not quantitative ones - A system of peer review against agreed standards – room for a diversity of approaches, equally valued - HEIs bear the main responsibility for QA, and stakeholder engagement is valued - Processes and outcomes are both important – it's about learning - Thick, complex, multidimensional analysis – difficult to compare - Internal responsibility of each institution – a matter of self-development - A community of academics and students: engagement and ownership

  21. QA vs RANKING → complementary rather than contradictory Rankings - Minimum standards are not quality (distrust) - Relativism and scoring despite the diversity of approaches - External control and analysis – HEIs are not responsible for the assessment, and there is no stakeholder involvement - Focus on measurable outputs, not on processes - Indicators and league tables in limited areas – comparing what is not comparable - Students as clients needing information – a matter of satisfying consumer demand - Students as mere sources of feedback for indicators

  22. Why ranking? What more useful information can a ranking provide than a quality assurance report (elaborated by a qualified agency which maintains a database of all the institutional programmes of HEIs)?

  23. Thank you! www.anosr.ro Email: carmen@anosr.ro Tel: 004 0743 021 079
