A scientometric perspective on university ranking
Nees Jan van Eck
Centre for Science and Technology Studies (CWTS), Leiden University
Center of Scientometrics (CoS), National Science Library, Chinese Academy of Sciences
Beijing, China, April 12, 2019
Centre for Science and Technology Studies (CWTS)
• Research center at Leiden University
• Science and technology studies, with a considerable emphasis on scientometrics
• About 50 staff members
• Our mission: Making the science system better!
• Research groups:
– Quantitative Science Studies
– Science and Evaluation Studies
– Science, Technology, and Innovation Studies
• Commissioned research for research institutions, funders, governments, companies, etc.
1
About myself
• Master's degree in computer science
• PhD thesis on bibliometric mapping of science
• Senior researcher at CWTS
– Bibliometric network analysis and visualization
– Bibliometric data sources
– Bibliometric indicators
• Head of ICT
2
VOSviewer
3
Outline
• CWTS Leiden Ranking
• Responsible use of university rankings
4
CWTS Leiden Ranking
5
CWTS Leiden Ranking
6
CWTS Leiden Ranking
• Provides bibliometric indicators of:
– Scientific impact
– Scientific collaboration
• Calculated based on Clarivate Analytics Web of Science data
7
Selection of universities (2018 edition)
• All universities worldwide with ≥1000 Web of Science publications in the period 2013–2016
• 938 universities from 55 countries
8
Indicators
• Size-dependent and size-independent indicators
• Scientific output:
– P
• Scientific impact:
– P(top 1%) and PP(top 1%)
– P(top 5%) and PP(top 5%)
– P(top 10%) and PP(top 10%)
– P(top 50%) and PP(top 50%)
– TCS and MCS
– TNCS and MNCS
• Scientific collaboration:
– P(collab) and PP(collab)
– P(int collab) and PP(int collab)
– P(industry) and PP(industry)
– P(<100 km) and PP(<100 km)
– P(>5000 km) and PP(>5000 km)
PP(X) = P(X) / P
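The relation between size-dependent and size-independent indicators can be sketched as follows (a minimal sketch with made-up numbers; P and PP(top 10%) as defined above):

```python
# Minimal sketch with hypothetical numbers: every size-dependent indicator
# P(X) has a size-independent counterpart PP(X) = P(X) / P.
p = 1200        # P: total number of publications of a (made-up) university
p_top10 = 150   # P(top 10%): publications among the 10% most cited in their field

pp_top10 = p_top10 / p   # PP(top 10%): share of top-10% publications
print(f"PP(top 10%) = {pp_top10:.1%}")  # 12.5%
```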
Differences with other university rankings
• No composite indicators
• Focused on research, not on teaching
• Based purely on bibliometric indicators; no survey data or data provided by universities
• High-quality bibliometric methodology
• Multiple views, not just a simple list
10
11
12
13
14
15
16
17
Advanced bibliometric methodology
• Field classification system
• Counting citations vs. counting highly cited publications
• Full counting vs. fractional counting
• Bibliographic database
18
19
About 4000 fields of science in the Leiden Ranking, grouped into five main fields:
– Biomedical and health sciences
– Life and earth sciences
– Mathematics and computer science
– Physical sciences and engineering
– Social sciences and humanities
Why count highly cited publications?
• Leiden Ranking counts number of highly cited publications (top 10%)
• THE, QS, and US News count number of citations
• Effect of counting number of citations:
20
Why count highly cited publications?
21
Why count highly cited publications?
22
Counting citations vs. counting highly cited publications, before and after leaving out Göttingen's most cited publication
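The effect illustrated above can be mimicked with hypothetical citation counts: one extremely highly cited paper dominates a total-citations indicator, but changes P(top 10%) by at most one. The citation threshold of 40 is an arbitrary assumption for illustration:

```python
# Hypothetical citation counts for two small universities.
outlier_profile = [2000, 5, 3, 2, 1]    # one extreme outlier paper
steady_profile = [60, 55, 50, 45, 40]   # consistently well-cited papers

THRESHOLD = 40  # assumed field-specific top-10% citation threshold

def total_citations(citations):
    return sum(citations)

def p_top10(citations):
    # Count publications at or above the top-10% threshold.
    return sum(1 for c in citations if c >= THRESHOLD)

print(total_citations(outlier_profile), p_top10(outlier_profile))  # 2011 citations, 1 top-10% paper
print(total_citations(steady_profile), p_top10(steady_profile))    # 250 citations, 5 top-10% papers
```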
How to handle publications co-authored by multiple institutions?
• THE, QS, and US News:
– Co-authored publications are fully assigned to each co-authoring institution (full counting)
• Leiden Ranking:
– Co-authored publications are fractionally assigned to each co-authoring institution (fractional counting)
23
This publication is assigned to Leiden with a weight of 2/4
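Full vs. fractional counting for this example can be sketched as follows. The address strings are hypothetical; the fractional weight is taken as the institution's share of the author addresses, matching the 2/4 in the example above:

```python
# Sketch of full vs. fractional counting for one co-authored publication.
addresses = ["leiden univ", "leiden univ med ctr", "mit", "harvard univ"]  # hypothetical
leiden_variants = {"leiden univ", "leiden univ med ctr"}

# Full counting: the publication counts fully for every co-authoring institution.
full_weight = 1 if any(a in leiden_variants for a in addresses) else 0

# Fractional counting: the weight is the institution's share of the addresses.
fractional_weight = sum(a in leiden_variants for a in addresses) / len(addresses)

print(full_weight, fractional_weight)  # 1 and 0.5 (= 2/4)
```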
Why use fractional counting?
• Full counting is biased in favor of universities with a strong biomedical focus
24
Choice of bibliographic database:
Is more data always better?
• Universities from China, Russia, France, Germany, etc. may not benefit at all from having more data
• Indicators should be based on a restricted database of publications
• Leiden Ranking uses Web of Science, but excludes national scientific journals, trade journals, and popular magazines
25
Responsible use of university rankings
26
Responsible use of university rankings
• Ten principles for responsible use of rankings:
– Design of rankings
– Interpretation of rankings
– Use of rankings
• Covers university rankings in general, not only the Leiden Ranking
27
Source: www.cwts.nl/blog?article=n-r2q274
Responsible use of university rankings
28
Source: www.researchresearch.com/news/article/?articleId=1368350
Source: https://vimeo.com/279712695
Design of rankings
1. A generic concept of university performance should not be used
2. A clear distinction should be made between size-dependent and size-independent indicators
3. Universities should be defined in a consistent way
4. University rankings should be sufficiently transparent
29
Design of rankings
1. A generic concept of university performance should not be used
2. A clear distinction should be made between size-dependent and size-independent indicators
3. Universities should be defined in a consistent way
4. University rankings should be sufficiently transparent
30
Do not use a generic concept of university performance
31
Composite indicator
Do not use a generic concept of university performance
32
Do not use a generic concept of university performance
33
Design of rankings
1. A generic concept of university performance should not be used
2. A clear distinction should be made between size-dependent and size-independent indicators
3. Universities should be defined in a consistent way
4. University rankings should be sufficiently transparent
34
Distinguish between size-dependent and size-independent indicators
What are the wealthiest countries in the world?
35
GDP per capita GDP
Distinguish between size-dependent and size-independent indicators
• Shanghai, THE, QS, and US News use composite indicators
• These composite indicators combine size-dependent and size-independent indicators
• It is unclear which concept of scientific performance is measured
36
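A toy computation (entirely made-up weights and numbers) shows why such composites are hard to interpret: a large university with average impact can outscore a small university with excellent impact purely through size.

```python
# Hypothetical composite mixing a size-dependent indicator (P) with a
# size-independent one (PP(top 10%)); the 50/50 weighting and the
# normalization constant are arbitrary assumptions for illustration.
def composite_score(p, pp_top10, weight=0.5, p_norm=10000):
    return weight * (p / p_norm) + (1 - weight) * pp_top10

large_average = composite_score(p=8000, pp_top10=0.10)    # 0.45
small_excellent = composite_score(p=1000, pp_top10=0.26)  # 0.18

# The large, average university wins, although the small one produces far
# more top-10% work relative to its size.
print(large_average > small_excellent)  # True
```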
Distinguish between size-dependent and size-independent indicators
37
Design of rankings
1. A generic concept of university performance should not be used
2. A clear distinction should be made between size-dependent and size-independent indicators
3. Universities should be defined in a consistent way
4. University rankings should be sufficiently transparent
38
Define universities consistently
• leiden univ
• leiden state univ
• state univ leiden
• leiden univ hosp
• state univ leiden hosp
• univ leiden hosp
• univ hosp leiden
• lumc
• univ leiden
• leiden univ med ctr
• leiden state univ hosp
• leiden observ
• sterrewacht leiden
• acad hosp leiden
• rijksuniv leiden
• rijksherbarium
• gorlaeus labs
• leiden inst brain & cognit
• leiden inst chem
• sylvius labs
• acad ziekenhuis leiden
• leiden cytol & pathol lab
• rijksherbarium hortus bot
• ...
39
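Unifying such address variants is essentially a normalization step. A minimal sketch (the lookup table below is an assumption covering only a few of the variants listed above):

```python
# Hypothetical lookup table from raw Web of Science address variants to one
# consistently defined institution.
LEIDEN_VARIANTS = {
    "leiden univ", "rijksuniv leiden", "univ leiden",
    "leiden univ med ctr", "lumc", "acad ziekenhuis leiden",
    "leiden observ", "sterrewacht leiden",
}

def normalize(raw_address):
    """Return the unified institution name, or None if unrecognized."""
    key = raw_address.strip().lower()
    return "Leiden University" if key in LEIDEN_VARIANTS else None

print(normalize("Rijksuniv Leiden"))  # Leiden University
print(normalize("some other univ"))  # None
```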
Design of rankings
1. A generic concept of university performance should not be used
2. A clear distinction should be made between size-dependent and size-independent indicators
3. Universities should be defined in a consistent way
4. University rankings should be sufficiently transparent
40
University rankings should be sufficiently transparent
41
Source: www.issi-society.org/open-citations-letter/
Interpretation of rankings
5. Comparisons between universities should be made keeping in mind differences between universities
6. Uncertainty in university rankings should be acknowledged
7. An exclusive focus on ranks of universities should be avoided; values of underlying indicators should be taken into account
43
Interpretation of rankings
5. Comparisons between universities should be made keeping in mind differences between universities
6. Uncertainty in university rankings should be acknowledged
7. An exclusive focus on ranks of universities should be avoided; values of underlying indicators should be taken into account
44
Interpretation of rankings
5. Comparisons between universities should be made keeping in mind differences between universities
6. Uncertainty in university rankings should be acknowledged
7. An exclusive focus on ranks of universities should be avoided; values of underlying indicators should be taken into account
45
Interpretation of rankings
5. Comparisons between universities should be made keeping in mind differences between universities
6. Uncertainty in university rankings should be acknowledged
7. An exclusive focus on ranks of universities should be avoided; values of underlying indicators should be taken into account
47
Take into account values of indicators
48
Rockefeller University
Rank 1 (PP(top 10%) = 28.2%)
Queen Mary University London
Rank 50 (PP(top 10%) = 14.8%)
University of Bari Aldo Moro
Rank 550 (PP(top 10%) = 8.0%)
The difference in PP(top 10%) between the universities at ranks 1 and 50 is about twice as large as the difference between those at ranks 50 and 550
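Using the values reported above, the claim can be checked directly: equal-looking rank differences hide very different indicator differences.

```python
# PP(top 10%) values as reported above (Leiden Ranking 2018).
pp_rank_1 = 0.282    # Rockefeller University
pp_rank_50 = 0.148   # Queen Mary University London
pp_rank_550 = 0.080  # University of Bari Aldo Moro

gap_top = pp_rank_1 - pp_rank_50     # indicator gap over 49 rank positions
gap_tail = pp_rank_50 - pp_rank_550  # indicator gap over 500 rank positions

# The gap near the top is roughly twice the gap over the much longer
# stretch of ranks further down.
print(round(gap_top / gap_tail, 1))  # 2.0
```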
Use of rankings
8. Dimensions of university performance not covered by university rankings should not be overlooked
9. Performance criteria relevant at the university level should not automatically be assumed to have the same relevance at the department or research group level
10. University rankings should be handled cautiously, but they should not be dismissed as being completely useless
49
Simplistic use of rankings
50
Proper use of rankings
51
Encouraging proper use
52
Conclusions
• Rankings provide valuable information...
• …but only when designed, interpreted, and used in a proper manner
• Ranking producers, universities, governments, and news media need to work toward shared principles for responsible university ranking
53
CWTS Leiden Ranking 2019
• Two new types of indicators, in addition to impact and collaboration indicators
• Open access indicators (based on Unpaywall data)
– P(OA)
– P(gold)
– P(green)
– P(unknown)
• Gender indicators
– A(male)
– A(female)
– A(unknown)
54
Thank you for your attention!
55