1. A scientometric perspective on university ranking
Ludo Waltman
Centre for Science and Technology Studies (CWTS), Leiden University
Hangzhou Dianzi University
April 16, 2019
2. Centre for Science and Technology Studies (CWTS)
• Research center at Leiden University
• Science and technology studies, with a considerable
emphasis on scientometrics
• About 50 staff members
• Our mission: Making the science system better!
• Commissioned research for research institutions,
funders, governments, companies, etc.
6. CWTS Leiden Ranking
• Provides bibliometric indicators of:
– Scientific impact
– Scientific collaboration
• Calculated based on Clarivate Analytics Web of Science data
7. Differences with other university rankings
• Focused on research, not on teaching
• Based purely on bibliometric indicators; no survey data or data provided by
universities
• High-quality bibliometric methodology
• No composite indicators
• Multiple views, not just a simple list
8. Selection of universities (2018 edition)
• All universities worldwide with ≥1000
Web of Science publications in period
2013–2016
• 938 universities from 55 countries
9. Indicators
• Size-dependent and size-independent indicators
• Output:
– P (number of publications)
• Impact:
– TCS and MCS (total and mean citation score)
– TNCS and MNCS (total and mean normalized citation score)
– P(top 1%) and PP(top 1%)
– P(top 5%) and PP(top 5%)
– P(top 10%) and PP(top 10%)
– P(top 50%) and PP(top 50%)
• Collaboration:
– P(collab) and PP(collab)
– P(int collab) and PP(int collab)
– P(industry) and PP(industry)
– P(<100 km) and PP(<100 km)
– P(>5000 km) and PP(>5000 km)
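The distinction between size-dependent and size-independent indicators can be illustrated with a minimal sketch (not the official Leiden Ranking code; the publication records are hypothetical). Each record marks whether the publication belongs to the top 10% most cited in its field and year:

```python
# Hypothetical publication records for one university. "top10" marks
# membership of the top 10% most cited publications in field and year.
publications = [
    {"title": "Paper A", "top10": True},
    {"title": "Paper B", "top10": False},
    {"title": "Paper C", "top10": False},
    {"title": "Paper D", "top10": True},
    {"title": "Paper E", "top10": False},
]

P = len(publications)                             # output: total publications
P_top10 = sum(p["top10"] for p in publications)   # size-dependent impact
PP_top10 = P_top10 / P                            # size-independent impact

print(P, P_top10, PP_top10)
```

P(top 10%) grows with university size; PP(top 10%), the proportion, does not, which is why the two answer different questions about performance.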
17. Advanced bibliometric methodology
• Field classification system
• Counting citations vs. counting highly cited publications
• Full counting vs. fractional counting
• Bibliographic database
18. About 4000 fields of science in the Leiden Ranking
[Figure: the ~4000 fields are grouped into five main fields: social sciences and humanities; biomedical and health sciences; life and earth sciences; mathematics and computer science; physical sciences and engineering]
19. Why count highly cited publications?
• Leiden Ranking counts number of highly cited publications (top 10%)
• THE, QS, and US News count number of citations
• Effect of counting number of citations:
21. Why count highly cited publications?
[Figure: effect of leaving out Göttingen's most cited publication, comparing citation counting with counting of highly cited publications]
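The sensitivity argument can be sketched with made-up numbers (these are illustrative, not Göttingen's actual data, and the top-10% threshold is an assumption):

```python
# One extremely highly cited paper dominates the mean citation score,
# but barely affects the count of highly cited publications.
citations = [3, 5, 2, 8, 4, 1, 9000]   # hypothetical citation counts
threshold = 50                          # assumed top-10% citation threshold

mean_with = sum(citations) / len(citations)
mean_without = sum(citations[:-1]) / len(citations[:-1])

highly_cited_with = sum(c >= threshold for c in citations)
highly_cited_without = sum(c >= threshold for c in citations[:-1])

# Leaving out the extreme paper collapses the mean citation score from
# about 1289 to under 4, while the highly cited count only drops from 1 to 0.
```

This is why a single blockbuster publication can move a university many places in a citation-based ranking but not in a ranking based on highly cited publications.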
22. How to handle publications co-authored by multiple
institutions?
• THE, QS, and US News:
– Co-authored publications are fully assigned to each co-authoring institution (full counting)
• Leiden Ranking:
– Co-authored publications are fractionally assigned to each co-authoring institution (fractional
counting)
[Figure: a publication co-authored by Enschede, Twente, and Leiden is assigned to each institution with a weight of 1/3]
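The two counting methods for the three-institution example above can be sketched as follows (an illustrative sketch, not the Leiden Ranking implementation):

```python
# Full vs. fractional counting for one publication co-authored by
# three institutions, as in the Enschede / Twente / Leiden example.
coauthoring_institutions = ["Enschede", "Twente", "Leiden"]

# Full counting: each institution receives the whole publication.
full = {inst: 1.0 for inst in coauthoring_institutions}

# Fractional counting: the publication is split equally, weight 1/3 each.
fractional = {inst: 1.0 / len(coauthoring_institutions)
              for inst in coauthoring_institutions}

# Under full counting the publication contributes three times in total;
# under fractional counting the weights sum to exactly one.
```

Because co-authorship rates differ strongly between fields, full counting inflates the totals of universities active in heavily co-authored fields, which motivates the fractional approach.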
23. Why use fractional counting?
• Full counting is biased in favor of universities with a strong biomedical focus
24. Choice of bibliographic database:
Is more data always better?
• Universities from China, Russia, France, Germany, etc. may not benefit at all
from having more data
• Indicators should be based on a restricted database of publications
• Leiden Ranking uses Web of Science, but excludes national scientific
journals, trade journals, and popular magazines
26. Responsible use of university rankings
• Ten principles for responsible use
of rankings:
– Design of rankings
– Interpretation of rankings
– Use of rankings
• Covers university rankings in
general, not only the Leiden
Ranking
Source: www.cwts.nl/blog?article=n-r2q274
27. Responsible use of university rankings
Source: www.researchresearch.com/news/article/?articleId=1368350
Source: https://vimeo.com/279712695
28. Design of rankings
1. A generic concept of university performance should not be used
2. A clear distinction should be made between size-dependent and size-independent indicators
3. Universities should be defined in a consistent way
4. University rankings should be sufficiently transparent
29. Do not use a generic concept of university performance
30. Do not use a generic concept of university performance
[Figure: composite indicator]
31. Do not use a generic concept of university performance
32. Distinguish between size-dependent and size-independent indicators
What are the wealthiest countries in the world?
[Figure: country rankings by GDP per capita vs. total GDP]
33. Distinguish between size-dependent and size-independent indicators
• Shanghai, THE, QS, and US News use composite
indicators
• These composite indicators combine size-dependent
and size-independent indicators
• It is unclear which concept of scientific performance
is measured
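The GDP analogy can be put in code (the figures are made up for illustration): a size-dependent indicator and its size-independent counterpart can rank the same countries in opposite orders, so combining them in one composite blurs what is being measured.

```python
# Illustrative sketch: total GDP (size-dependent) vs. GDP per capita
# (size-independent) rank two hypothetical countries in opposite orders.
countries = {
    # name: (population in millions, GDP in billions)
    "Largeland": (1000, 5000),
    "Smallland": (5, 300),
}

by_gdp = sorted(countries, key=lambda c: countries[c][1], reverse=True)
by_gdp_per_capita = sorted(
    countries, key=lambda c: countries[c][1] / countries[c][0], reverse=True)

# Largeland leads on total GDP; Smallland leads on GDP per capita.
```

A composite of the two would place both countries somewhere in the middle, answering neither the "largest economy" nor the "wealthiest per person" question.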
37. Need for consistency and transparency
Weak correlation between citation scores in Leiden Ranking and
Times Higher Education ranking
38. Interpretation of rankings
5. Comparisons between universities should be made keeping in mind the differences between them
6. Uncertainty in university rankings should be acknowledged
7. An exclusive focus on ranks of universities should be avoided; values of
underlying indicators should be taken into account
39. Uncertainty in university rankings should be
acknowledged
Stephen Curry, the Guardian, October 6, 2015
40. Take into account values of indicators
Rockefeller University
Rank 1 (PP(top 10%) = 28.2%)
Queen Mary University London
Rank 50 (PP(top 10%) = 14.8%)
University of Bari Aldo Moro
Rank 550 (PP(top 10%) = 8.0%)
The difference in PP(top 10%) is two times larger between the universities at ranks 1 and 50 than between those at ranks 50 and 550
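The arithmetic behind this point, using the PP(top 10%) values from the slide:

```python
# PP(top 10%) values for three universities at very different ranks.
pp_rank_1 = 28.2    # Rockefeller University, rank 1
pp_rank_50 = 14.8   # Queen Mary University London, rank 50
pp_rank_550 = 8.0   # University of Bari Aldo Moro, rank 550

gap_top = pp_rank_1 - pp_rank_50       # 13.4 percentage points
gap_bottom = pp_rank_50 - pp_rank_550  # 6.8 percentage points

# 49 rank positions at the top of the list correspond to roughly twice
# the indicator difference that 500 positions do further down.
```

Equal steps in rank therefore correspond to very unequal steps in performance, which is why the indicator values, not just the ranks, should be examined.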
41. Use of rankings
8. Dimensions of university performance not covered by university rankings
should not be overlooked
9. Performance criteria relevant at the university level should not automatically be assumed to have the same relevance at the department or research group level
10. University rankings should be handled cautiously, but they should not be dismissed as completely useless
45. Conclusions
• Rankings provide valuable information...
• …but only when designed, interpreted, and used in a proper manner
• Ranking producers, universities, governments, and news media need to work
toward shared principles for responsible university ranking
46. CWTS Leiden Ranking 2019
• Two new types of indicators, in addition to impact and collaboration indicators
• Open access indicators (based on Unpaywall data)
– P(OA)
– P(gold)
– P(green)
– P(unknown)
• Gender indicators
– A(male)
– A(female)
– A(unknown)