This document provides an overview and demonstration of the SciVal and Snowball Metrics tools. SciVal is a set of integrated modules that enables institutions to make evidence-based strategic decisions about research performance, benchmarking, and collaboration opportunities. It contains data on over 220 nations and 4,600 research institutions. The demonstration shows how to access and use the Overview, Benchmarking, and Collaboration modules to analyze the research performance of the University of Cape Town across metrics like publications, citations, collaborators, and more. It also provides guidance on selecting appropriate metrics and defining custom research areas for analysis in SciVal.
2. Agenda
• What are Snowball metrics?
• What is SciVal?
• How to access SciVal
• Overview module
• Benchmarking module
• Collaboration module
3. What are Snowball Metrics?
Snowball Metrics are a set of standard metrics to measure
research performance.
For more information: http://www.snowballmetrics.com/
6. SciVal is a set of integrated modules that enables an institution to make evidence-based strategic decisions.
SciVal consists of three modules:
• Overview – Get an overview of the research performance of your institution and others based on output, impact, and collaborations.
• Benchmarking – Determine your strengths and weaknesses. Compare your research institution and teams to others based on performance metrics. Model different test scenarios.
• Collaboration – Identify and analyze existing and potential collaboration opportunities. Identify suitable collaboration partners. See who others are collaborating with.
7. What is SciVal?
SciVal offers quick, easy access to the research performance of 220 nations and 4,600 research institutions worldwide, as well as groups of countries and institutions.
• Overview – ready-made, at-a-glance snapshots of any selected entity
• Benchmarking – flexibility to create and compare any research groups
• Collaboration – identify and analyze existing and potential collaboration opportunities
12. Login
Access: www.scival.com
The same login credentials as for ScienceDirect/Scopus can be used.
Click here if you forgot your password.
Click here to create a new login.
15. Summary tab – shows an institution’s disciplinary focus. Also answers the question: How can my institution evaluate the impact of its research portfolio?
Here you can see that UCT has averaged 9.1 citations per publication over a five-year period.
17. Publications tab – shows how productive the institution is, using the Scholarly Output metric.
18. Answers the question: How can my institution demonstrate research excellence?
Look at Outputs in Top Percentiles and Publications in Top Journal Percentiles. As the chart on the left shows, 19.6% of the University of Cape Town’s publications were in the top 10% of the most-cited publications worldwide.
In the chart below, 21.6% of UCT’s publications appeared in the top 10% of journals worldwide, as measured by SNIP (Source-Normalized Impact per Paper), a metric of a journal’s citation impact.
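The "Outputs in Top Percentiles" idea can be sketched in a few lines of Python. This is a purely illustrative, simplified version using a nearest-rank percentile threshold on invented toy data; it is not SciVal's actual methodology, which is computed over the full Scopus citation distribution.

```python
def share_in_top_percentile(inst_citations, world_citations, top=10.0):
    """Fraction of an institution's papers whose citation count places
    them in the top `top` percent of the world distribution.
    Simple nearest-rank threshold; illustrative only."""
    ranked = sorted(world_citations, reverse=True)
    k = max(1, round(len(ranked) * top / 100))  # papers in the top slice
    threshold = ranked[k - 1]                   # minimum count to qualify
    in_top = sum(1 for c in inst_citations if c >= threshold)
    return in_top / len(inst_citations)

# Toy data: 10 "world" papers and 3 institutional papers.
world = [0, 1, 1, 2, 3, 5, 8, 13, 21, 40]
inst = [2, 21, 45]
print(f"{share_in_top_percentile(inst, world):.1%}")  # → 33.3%
```

With these toy numbers, only papers with at least 40 citations qualify for the top 10%, so one of the institution's three papers counts.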
19. Answers the question: Where does the institution have the highest impact?
Select Publications, then select by Journal category.
21. Citations tab – highlights an institution’s citation impact.
‘Field-Weighted Citation Impact’ adjusts for the differences in citation behaviour across disciplines and puts everything on the same level. It is the total number of citations received divided by the total citations expected, based on the global average for the field.
• More than 1.00 means citations are higher than expected.
• Less than 1.00 means citations are lower than expected.
The world citation impact is 1 by definition. So this shows that over a five-year period UCT’s citation impact was 1.74: citations were 74% higher than expected based on the global average.
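The FWCI definition reduces to a one-line ratio. Below is a minimal sketch with invented expected-citation values; in SciVal, the expected counts come from Scopus world averages for each paper's field, publication year, and document type.

```python
def fwci(citations_received, citations_expected):
    # Field-Weighted Citation Impact: total citations actually received
    # divided by the total citations expected from world averages.
    return sum(citations_received) / sum(citations_expected)

# Three toy papers: actual citation counts, and the (invented here)
# world-average expectation attached to each paper.
received = [10, 4, 12]
expected = [5.0, 4.0, 6.0]
print(round(fwci(received, expected), 2))  # → 1.73, i.e. 73% above expected
```

An FWCI of exactly 1.00 would mean the papers were cited precisely as often as the world average for their fields.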
22. Collaboration tab – gives a high-level overview of how an entity collaborates geographically.
23. Collaboration tab – gives a high-level overview of how an entity collaborates geographically.
24. Competencies tab: this is another way to show research strengths (competencies).
Answers the question: How can my institution identify its research strengths?
25. Competencies tab – the competencies map provides a view of worldwide research strengths. Bubbles represent research areas in which UCT is a global leader.
A competency shows where an institution has a leading position compared to other institutions in terms of:
• number of publications,
• number of highly cited publications,
• innovation, or
• the recency of cited publications.
29. Select any desired combination of research entities you wish to benchmark.
Select a year range between 1996 and the current year.
Filter by subject area using 27 top-level and 334 lower-level subject areas based on the Scopus ASJC classification.
33. The only metrics rule: Triangulate!
• No single metric is perfect. Always use at least two metrics to provide insight into your question.
• No data set is going to give you a perfect answer.
• Reinforce your evidence-based conclusions with at least one
of, and ideally both, peer review and expert opinion.
34. Selection of appropriate metrics
• Know your Question!
1. Performance evaluation of an institution, faculty, researcher etc.
2. Demonstration of excellence
3. Scenario Modelling
• E.g. if modelling recruitment in biochemistry, you might not need to worry about different citation rates between fields, because you are only looking at one field
35. “Non-performance variables”?
• Size – if you’re doing an evaluation, this is important! Not everything is equal: use size-normalised metrics and percentages.
• Discipline – some disciplines show higher values because of their researchers’ citation behaviour.
• Publication type
• Database coverage
• Manipulation
• Time
38. Question 2
How does UCT perform relative to Stellenbosch University in Medicine?
Taking the disciplinary focuses into account by using the filter gives me other options for metrics to use – but why would I want to do this?
39. Field-Weighted Citation Impact
• Suitable for:
• Fair comparison of entities regardless of size, disciplinary profile and publication-types
• Avoiding the apparent crash in citation counts for recent years (new publications have had little time to accrue citations)
• Immediate understanding of extent of brilliance (1 = average)
• Selection in case of uncertainty
• However…
• You lose the idea of the magnitude of your impact
• People might not like to see low numbers
• Always using FWCI as default severely restricts richness of information/data
• Complicated calculation that users will not be able to validate themselves
40. Defining a Research Area
SciVal offers the flexibility to create your own Research Areas, representing a field of research defined by you.
Research Areas can represent a strategic priority, an emerging area of science, or any other topic of interest, using the following building blocks:
• Search terms – search for publication sets using keyword(s)
• Entities – select and combine any of the following:
o Institutions (+ groups)
o Countries (+ groups)
o Journals and journal categories
• Competencies – select and combine the competencies of any desired institutions or countries
41. Define your Research Areas
Select “Define a new Research Area”.
Search by keywords.
Apply filters to refine the search results.
Name it and save.
43. SciVal Collaboration Module
How can my institution find collaboration partners? International collaborations can increase your impact and visibility, which could lead to more funding opportunities.
How can you identify suitable international collaboration partners? Which countries should we focus on? And which institutions are active in which disciplines?
45. There are 207 institutions in Germany that are active in biotechnology but are not yet collaborating with UCT in this field. Click on the number.
A group of research-intensive universities in the UK – those listed in the slide here – defined sets of metrics that could help to show research strengths by benchmarking apples with apples.
These metrics are used to make strategic decisions. The group tested various methodologies and made it clear that they are not attached to any provider of data. They call these metrics ‘recipes’, which are available to institutions free of charge. The goal is for these metrics to become global standards for measuring research activities.
The Snowball Metrics team views these metrics as a complement to traditional metrics, not a replacement. They provide a means of confidently answering questions and supply information on institutional strengths and weaknesses. Only the metrics that may answer your questions need to be used – you can pick and choose the ones that will help you make informed decisions.
SciVal’s integrated modular platform allows you to configure, visualize and export information according to your specific institution’s needs and preferences, so that you can benchmark with meaning and accuracy to better understand your institution’s position relative to your peers, as well as global and domestic standards.
Access comprehensive research performance summaries of any desired research entities, and identify their unique research strengths and multidisciplinary research areas.
• Retrieve at-a-glance, standardized reports instantly
• Access competency maps for all institutions and countries
These performance indicators are all Snowball Metrics; we will look at them in closer detail later.
What are SNIP and SJR?
Source-Normalized Impact per Paper (SNIP) and SCImago Journal Rank (SJR) are journal metrics used to measure a journal’s citation impact.
• SNIP (Source-Normalized Impact per Paper) – measures the citation impact of a journal, normalized for the journal’s subject field by weighting citations against the number of citations expected in that field.
• SJR (SCImago Journal Rank) – measures the prestige of the citations received by a journal. The subject field, quality, and reputation of the citing journal have a direct effect on the value of a citation.
See www.journalmetrics.com for more details on SNIP and SJR.
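The field-normalization idea behind SNIP can be illustrated with a hypothetical simplification: divide a journal's raw citations per paper by the citation level typical of its field. The published SNIP formula is more involved (a three-year window and a "database citation potential" term), so the function and numbers below are purely illustrative, not the real metric.

```python
def snip_sketch(citations_per_paper, field_citation_potential):
    # Simplified field normalization: raw impact per paper divided by
    # the citation level that is typical for the journal's field.
    # NOT the published SNIP formula -- it only conveys the idea.
    return citations_per_paper / field_citation_potential

# A journal in a high-citation field and one in a low-citation field
# can have very different raw impacts yet the same normalized value.
print(round(snip_sketch(6.0, 4.0), 2))  # high-citation field → 1.5
print(round(snip_sketch(1.5, 1.0), 2))  # low-citation field  → 1.5
```

This is why SNIP makes journals from fields with different citation habits comparable, as the slide's "top journal percentiles" charts assume.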
The publication behaviour of some disciplines results in higher coverage in Scopus than that of others.
1. Let’s say that UCT is looking for a collaboration partner in Europe for its expanding biotechnology research.
2. Go to the Potential Collaboration tab in the Collaboration module. Select the Scopus journal category “Biotechnology” from the dropdown menu at the top of the page.
3. The analysis shows 1466 institutions in Europe that haven’t yet collaborated with UCT. In other words: UCT has not co-authored any publications with these institutions within the selected time period.
4. Click on Europe to see potential collaboration partners.
7. Each orange circle in Germany represents an institution. The number inside the circle shows the publication output of that institution within the selected field. You can also switch to a different metric (Citations, Citations per Publication, or Field-Weighted Citation Impact) to view the potential collaboration partners by citation impact.
8. Technische Universität München stands out. There are 877 authors within biotechnology at this institution, with 532 publications in this field.
9. Click on 532. Choose “Potential co-authors” to see the top 100 authors (by publications) at Technische Universität München who are not yet collaborating with UCT authors.
If you want to see who Technische Universität München is collaborating with, select that university in the selection panel, then choose Collaboration and Current collaboration. You can view the information in a table.