Ranking universities responsibly
1. Ranking universities responsibly
Ludo Waltman
Centre for Science and Technology Studies (CWTS), Leiden University
Ranking best practices
Institut de Mathématique d'Orsay
June 25, 2019
2. Centre for Science and Technology Studies (CWTS)
• Research center at Leiden University
• Science and technology studies, with a considerable
emphasis on scientometrics
• About 50 staff members
• Our mission: Making the science system better!
• Commissioned research for research institutions,
funders, governments, companies, etc.
6. Differences from other university rankings
• Focused on research, not on teaching
• Based purely on bibliometric indicators; no survey data or data provided by
universities
• High-quality bibliometric methodology
• No composite indicators
• Multiple views, not just a simple list
7. Types of indicators and data sources
• Types of indicators:
– Scientific impact (citations)
– Collaboration (co-authored publications)
– New in 2019: Open access publishing (gold, hybrid, bronze, and green OA)
– New in 2019: Gender balance (male and female authors)
• Data sources:
– Clarivate Analytics Web of Science
– Unpaywall
– Gender API
8. Indicators
• Size-dependent and size-independent indicators
• Output:
– P (number of publications)
• Impact:
– TCS and MCS (total and mean citation score)
– TNCS and MNCS (total and mean normalized citation score)
– P(top 1%) and PP(top 1%)
– P(top 5%) and PP(top 5%)
– P(top 10%) and PP(top 10%)
– P(top 50%) and PP(top 50%)
• Collaboration:
– P(collab) and PP(collab)
– P(int collab) and PP(int collab)
– P(industry) and PP(industry)
– P(<100 km) and PP(<100 km)
– P(>5000 km) and PP(>5000 km)
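The size-dependent / size-independent distinction above can be made concrete with a small sketch (toy data, not CWTS code): for each publication we assume a field-normalized citation score (`ncs`) and a flag for whether it is among the top 10% most cited in its field, then compute P, TNCS, MNCS, P(top 10%), and PP(top 10%).

```python
# Sketch with toy data: size-dependent indicators (TNCS, P(top 10%))
# grow with output, while size-independent ones (MNCS, PP(top 10%))
# are averages/proportions. "ncs" and "top10" are assumed inputs.

pubs = [
    {"ncs": 2.4, "top10": True},
    {"ncs": 0.3, "top10": False},
    {"ncs": 1.1, "top10": False},
    {"ncs": 5.0, "top10": True},
]

P = len(pubs)                              # output
TNCS = sum(p["ncs"] for p in pubs)         # total normalized citation score
MNCS = TNCS / P                            # mean normalized citation score
P_top10 = sum(p["top10"] for p in pubs)    # number of top-10% publications
PP_top10 = P_top10 / P                     # proportion of top-10% publications

print(P, round(TNCS, 1), round(MNCS, 1), P_top10, PP_top10)
# 4 8.8 2.2 2 0.5
```

Doubling the publication list would double P, TNCS, and P(top 10%), but leave MNCS and PP(top 10%) unchanged, which is exactly why both families of indicators are reported.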
16. Advanced bibliometric methodology
• Field classification system
• Counting citations vs. counting highly cited publications
• Full counting vs. fractional counting
• Selection of publications
17. Fine-grained field classification system
[Diagram: the fine-grained fields grouped into five main fields]
• Social sciences and humanities
• Biomedical and health sciences
• Life and earth sciences
• Mathematics and computer science
• Physical sciences and engineering
18. Why count highly cited publications?
• Leiden Ranking counts number of highly cited publications (top 10%)
• THE, QS, and US News count number of citations
• Effect of counting number of citations:
20. Why count highly cited publications?
[Chart: rankings based on counting citations vs. counting highly cited publications, with and without Göttingen’s most cited publication]
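The Göttingen example can be illustrated numerically (hypothetical citation counts, not Göttingen's actual data): removing a single extreme outlier collapses the mean number of citations while barely changing the count of highly cited publications.

```python
# Hypothetical citation counts for one university; the threshold for
# being in the top 10% of a field is an assumed value.
citations = [3, 5, 60, 80, 12, 2000]   # one extreme outlier
TOP10_THRESHOLD = 50                   # assumed field top-10% threshold

mean_all = sum(citations) / len(citations)
top_all = sum(c >= TOP10_THRESHOLD for c in citations)

# Leave out the single most cited publication:
rest = sorted(citations)[:-1]
mean_rest = sum(rest) / len(rest)
top_rest = sum(c >= TOP10_THRESHOLD for c in rest)

print(mean_all, top_all)    # 360.0 3
print(mean_rest, top_rest)  # 32.0 2
```

The mean drops by more than a factor of ten, whereas the highly-cited count drops only from 3 to 2: rankings based on counting highly cited publications are far more robust to a single outlier.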
21. How to handle publications co-authored by multiple
institutions?
• THE, QS, and US News:
– Co-authored publications are fully
assigned to each co-authoring institution
(full counting)
• CWTS Leiden Ranking:
– Co-authored publications are fractionally
assigned to each co-authoring institution
(fractional counting)
– Fractional counting weights are
determined based on the proportion of
authors affiliated with an institution
This publication is assigned to University of Paris-Sud with
a fractional counting weight of 5.5 / 11 = 0.5
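The weight rule above can be sketched as follows (assumed input format, not CWTS code): each author contributes equally to the publication, and an author with multiple affiliations splits their contribution across them.

```python
# Sketch: fractional counting weight of an institution on a
# co-authored publication, equal to the share of authors affiliated
# with it. An author with k affiliations contributes 1/k to each.

def fractional_weight(author_affiliations, institution):
    """author_affiliations: one list of affiliations per author."""
    n_authors = len(author_affiliations)
    credit = sum(
        1 / len(affs) for affs in author_affiliations if institution in affs
    )
    return credit / n_authors

# Hypothetical 11-author publication: 5 authors only at Paris-Sud,
# 1 author at both Paris-Sud and another institution (counts 0.5),
# 5 authors elsewhere -> 5.5 / 11 = 0.5, matching the example above.
authors = (
    [["Paris-Sud"]] * 5
    + [["Paris-Sud", "Other"]]
    + [["Other"]] * 5
)
print(fractional_weight(authors, "Paris-Sud"))  # 0.5
```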
22. Why use fractional counting?
• Full counting is biased in favor of universities with a strong biomedical focus
23. Selection of publications: Is more always better?
• Leiden Ranking is based exclusively on publications in international scientific
journals
• Leiden Ranking uses Web of Science ‘flagship’ citation indices:
– Science Citation Index Expanded
– Social Sciences Citation Index
– Arts & Humanities Citation Index
• Leiden Ranking excludes national scientific journals, trade journals, and
popular magazines
24. Selection of publications: Is more always better?
• Universities from China, Russia, France, Germany, etc. may not benefit at all
from including more publications
36. Selection of universities
• Only universities, no other research organizations
• At least 1000 publications in the period 2014–2017
– Only publications indexed in Web of Science (Science Citation Index Expanded, Social Sciences
Citation Index, and Arts & Humanities Citation Index)
– Only publications in international scientific journals
– Only publications classified as article or review
• Publications are counted fractionally
• 963 universities from 56 countries
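The publication threshold in the criteria above can be sketched with toy data (hypothetical universities, not the actual selection pipeline): `frac_pubs` stands for the fractionally counted number of article/review publications in the core Web of Science indices for 2014–2017.

```python
# Toy selection step: keep only universities with at least 1000
# fractionally counted publications in 2014-2017. Names and counts
# are made up for illustration.

universities = [
    {"name": "A", "frac_pubs": 2400.0},
    {"name": "B", "frac_pubs": 950.5},   # below threshold: excluded
    {"name": "C", "frac_pubs": 1000.0},  # exactly at threshold: included
]

THRESHOLD = 1000

selected = [u["name"] for u in universities if u["frac_pubs"] >= THRESHOLD]
print(selected)  # ['A', 'C']
```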
37. Fractional counting of publications
(Same fractional counting approach as described earlier: co-authored publications are fractionally assigned to each co-authoring institution, with weights based on the proportion of authors affiliated with the institution.)
39. Identifying the publications of a university:
Three challenges
• Academic hospitals
• University systems
• Tutelles
40. Academic hospitals
• In the Leiden Ranking, publications of an academic hospital associated with
a university are assigned to the university only if one of the following
conditions is satisfied:
– The university is mentioned as an affiliation
– The university is not mentioned as an affiliation, but the hospital is legally part of the university
– The university is not mentioned as an affiliation and the hospital is an independent legal entity, but
there is evidence of a tight integration between the hospital and the university
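The three assignment conditions above amount to a simple decision rule, sketched here with assumed boolean inputs (determining those booleans from affiliation data is the hard part in practice):

```python
# Sketch: should an academic hospital's publication be assigned to the
# associated university? Inputs are assumed to be already determined.

def assign_to_university(university_mentioned: bool,
                         hospital_legally_part: bool,
                         tightly_integrated: bool) -> bool:
    if university_mentioned:
        return True            # condition 1: university in the affiliation
    if hospital_legally_part:
        return True            # condition 2: hospital legally part of it
    return tightly_integrated  # condition 3: independent but tightly integrated

print(assign_to_university(False, False, True))   # True
print(assign_to_university(False, False, False))  # False
```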
41. University systems
• University systems in which the constituent universities remain largely
autonomous (e.g., University of London, University of California, ComUEs in
France) are not included in the Leiden Ranking
• Instead, the constituent universities are included in the Leiden Ranking,
provided that they meet the selection criteria
• Publications are assigned to constituent universities, not to university
systems
• If only a university system is mentioned as an affiliation, the publication
cannot be assigned to a university
42. Tutelles (supervising institutions of French research units)
• Publications are fractionally assigned to all universities mentioned in an
affiliation
• We also make use of information about the organizational structure of tutelles
45. Ten rules for ranking universities
1. One size doesn’t fit all
2. Separate the relative from the absolute
3. Be explicit about the definition of a university
4. Be transparent
5. Compare and contrast
6. Acknowledge uncertainty
7. Look at the underlying data
8. Not everything that counts can be counted
9. Know your level
10. Handle with care, but don’t discard
52. Conclusions
• Compared with other university rankings, Leiden Ranking uses a much more
sophisticated bibliometric methodology
• Be aware that Leiden Ranking covers only selected aspects of university
performance!
• Need for close collaboration between ranking producers and universities to
properly deal with the complexities of the French research system
• Universities, ranking producers, and news media have a joint responsibility to
promote proper use of university rankings; we need to take a more critical
stance!