Altmetrics: how librarians can support researchers in improving their impact

1. Altmetrics: How librarians can support researchers in improving their impact
Dott.ssa Valeria Scotti
Center for Scientific Documentation
Fondazione IRCCS San Matteo di Pavia
2. Today we are going to talk about:
Bibliometrics;
Alternative metrics, or Altmetrics;
Altmetrics tools;
Altmetrics applications for:
Researchers
Funders and research evaluators
Institutions
Patients
Librarians
Our trial and other correlations;
Conclusions.
3. Introduction
The problem of measuring the scientific and social impact of research publications has been of great interest to scientists and scholars since the inception of modern science, but it has always been hard to solve.
Evaluating the importance of an article before reading it is crucial for researchers who lack the time to read all relevant papers.
4. Methodology
There are two main methodologies for evaluating the quality of academic research output:
Qualitative: peer review
Quantitative: metrics
Why is measuring research quality important?
5. Peer review
A group of expert scholars working in the same scientific area (peers) evaluates submitted research works and academics’ performance, and assesses scientific journals in a particular field.
Peer review can be used as an objective and reliable evaluation measure and is seen by many as the “gold standard”.
However:
o It is (extremely) time-consuming as well as expensive;
o Experts (referees) can genuinely disagree;
o It is surrounded by mysticism and may create an elite club that can be difficult to enter.
6. Bibliometrics
Bibliometrics is the application of quantitative analysis and statistics to publications, such as journal articles, and their accompanying citation counts.
The main tool of bibliometrics is citation analysis, which applies to:
journals (impact factor),
individuals (h-index),
and articles (citation impact).
8. Scientific Publishing Cycle
A crucial limit is timeliness.
http://www.napavalley.edu/Library/Pages/ScientificInformationLiteracy.aspx
9. Traditional indicators: limits
Not all journals are indexed in the Journal Citation Reports, a new edition of which is published only once a year;
Journals belonging to different subject fields cannot be compared, as citation practices vary widely by discipline;
Self-citations (an author citing his own work) and real “exchanges of courtesies”;
Peer review: it is time-consuming and expensive;
Young researchers are disadvantaged, since they have published fewer articles than senior researchers.
10. Web 2.0
The development of increasingly Web 2.0-oriented tools has profoundly changed the scientific communication process.
Web 2.0 = 2nd-generation web services
And the library became:
Library 2.0 = a real-time library
11. New Tools
Scientific communication is now largely conditioned by web-based tools, particularly by e-only journals.
The development of increasingly Web 2.0-oriented tools has profoundly changed the scientific communication process.
New tools emerge…
http://en.wikipedia.org/wiki/File:Web_2.0_Map.svg
13. Altmetrics
Altmetrics combines traditional bibliometric tools with the use of the web.
In this context, many web tools are often referred to as ‘social media’ due to their role in supporting communication and building communities.
15. Altmetrics terms
The term “altmetrics” was proposed for the first time in 2010, in a tweet posted by Jason Priem.
(https://twitter.com/jasonpriem/status/25844968813)
16. A Manifesto…
NO ONE CAN READ EVERYTHING…
“altmetrics is the creation and study of new metrics based on the Social Web for analyzing, and informing scholarship.”
http://altmetrics.org/manifesto/
October 26, 2010
17. Why new metrics?
Peer review: often slow and inefficient;
Citations: only consider those who write papers;
Impact factor: easily manipulated.
The work of researchers has shifted to the web, where you can:
count downloads,
read tweets and comments on Facebook and Google+,
post a video on YouTube.
All these traces measure impact on the scientific community.
18. DORA (Declaration on Research Assessment)
General Recommendations:
1. Do not use journal-based metrics, such as Journal Impact Factors, as a surrogate measure of the quality of individual research articles, to assess an individual scientist’s contributions, or in hiring, promotion, or funding decisions.
7. Make available a range of article-level metrics to encourage a shift toward assessment based on the scientific content of an article rather than publication metrics of the journal in which it was published.
19. Bibliometrics measures citation
Engagement type vs. audience:
                 Scholarly              Public
Recommended      –                      –
Cited            Traditional citation   –
Discussed        –                      –
Saved            –                      –
Traditional bibliometrics fills only one cell of the matrix: scholarly citation.
Jason Priem, Altmetrics presentation: https://docs.google.com/presentation/d/1fV8iFINfjdy7FAOk1qenrGmblQ70hJ3tki09W14Wk4k/edit?pli=1#slide=id.i133
21. IMPACT
So how do we measure impact?
Altmetrics measure the influence of research outputs and the influence of a researcher, across both traditional and digital scholarship, in a more granular and more accurate way. (Altmetrics.org, 2010)
22. Altmetrics measure impact
Engagement type vs. audience:
                 Scholarly              Public
Recommended      F1000, listservs       Popular press
Cited            Traditional citation   Wikipedia
Discussed        Scholarly blogs        Blogs, Twitter
Saved            Mendeley, CiteULike    Delicious
Viewed           PDF views              HTML views
Jason Priem, Altmetrics presentation: https://docs.google.com/presentation/d/1fV8iFINfjdy7FAOk1qenrGmblQ70hJ3tki09W14Wk4k/edit?pli=1#slide=id.i133
23. Variety of Research
The impact of a research paper has a flavour. It might be champagne: a titillating discussion piece of the week. Or maybe it is a dark-chocolate mainstay of the field. Strawberry: a great methods contribution. Licorice: controversial. Bubblegum: a hit in the classrooms. Low-fat vanilla: not very creamy, but it fills a need.
Heather Piwowar, “31 Flavors of Research Impact Through #Altmetrics”
24. What are “altmetrics”?
“Alternative metrics”:
new ways of measuring different, non-traditional forms of impact, potentially of non-traditional outputs;
“alternative to only using citations”, not “alternative to citations”;
complementary to traditional citation-based analysis.
Altmetrics in a nutshell
25. The data that define altmetrics are aggregated from various online sources:
26. Altmetrics sources can be categorized
Usage:
HTML views, PDF/XML downloads (various sources – journal, PubMed Central, FigShare, Dryad, etc.)
Captures:
CiteULike bookmarks, Mendeley readers/groups, Delicious
Mentions:
blog posts, news stories, Wikipedia articles, comments, reviews
Social media:
tweets, Google+, Facebook likes, shares, ratings
Citations:
Web of Science, Scopus, CrossRef, PubMed Central, Microsoft Academic Search
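The categorization above can be sketched as a simple lookup. This is an illustrative example only: the category names follow the slide, while the individual event names are hypothetical labels invented for the sketch.

```python
# Illustrative sketch: bucketing raw altmetric events into the five
# categories listed above. Event names are hypothetical examples.
CATEGORIES = {
    "usage": {"html_view", "pdf_download", "xml_download"},
    "captures": {"citeulike_bookmark", "mendeley_reader", "delicious_bookmark"},
    "mentions": {"blog_post", "news_story", "wikipedia_article", "comment", "review"},
    "social_media": {"tweet", "google_plus", "facebook_like", "share", "rating"},
    "citations": {"wos_citation", "scopus_citation", "crossref_citation"},
}

def categorize(event: str) -> str:
    """Return the altmetrics category for a raw event name."""
    for category, events in CATEGORIES.items():
        if event in events:
            return category
    return "unknown"
```

For instance, `categorize("tweet")` falls into the social-media bucket, while a Mendeley reader counts as a capture.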
30. Altmetrics Tools:
There are various portals that use altmetrics. The main ones are:
Altmetric.com: a collection of article-level metrics;
PLOS Article-Level Metrics: a series of measures that can be used to monitor the impact of research at the level of individual articles (instead of journals) published by PLOS;
ImpactStory: aggregates metrics from numerous sources and generates reports for the individual researcher;
Plum Analytics: the provider of PlumX, a product that delivers a more complete picture of research and answers questions on research impact for everyone (including researchers, librarians, administrators, and funders).
31. Altmetric Tools:
There are other tools that aggregate altmetrics:
Mendeley,
Academia.edu,
ResearchGate,
Kudos,
ReaderMeter, etc.
32. Different Providers:
Chamberlain, Scott. Consuming Article-Level Metrics: Observations and Lessons. Information Standards Quarterly (ISQ), Summer 2013, Volume 25, no. 2.
34. Altmetric.com:
Altmetric (http://www.altmetric.com) began as a London-based start-up founded by Euan Adie in 2011.
Their mission is “to make article level metrics easy”.
The portal provides four main products:
- Explorer
- Altmetric for Institutions
- Bookmarklet
- Badge
Individual users and librarians can use Altmetric.com with a free account, while a commercial license is required for publishers, funders and institutions.
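For light, non-commercial use, Altmetric also exposes a free public REST API. A minimal sketch, assuming the v1 endpoint `https://api.altmetric.com/v1/doi/<doi>` (check the current API documentation before relying on it):

```python
# Hedged sketch: querying Altmetric's public API for one paper's
# attention data, assuming the v1 /doi/<doi> endpoint.
import json
import urllib.request

API_BASE = "https://api.altmetric.com/v1"

def altmetric_url(doi: str) -> str:
    """Build the API URL that returns the attention record for a DOI."""
    return f"{API_BASE}/doi/{doi}"

def fetch_altmetrics(doi: str) -> dict:
    """Fetch and decode the JSON attention record (requires network)."""
    with urllib.request.urlopen(altmetric_url(doi)) as resp:
        return json.load(resp)

# Example (network access and a real DOI required):
# data = fetch_altmetrics("10.1038/480462a")
# print(data.get("score"))
```

The returned JSON includes the article's attention score and per-source counts, which is what the donut and score slides below visualize.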
39. Donuts
The colour and the number inside the donut change from paper to paper.
The colours reflect the mix of sources in which the article was mentioned. For example, blue means it has been tweeted.
40. Score
This is the quantitative measure of the attention given to the paper.
It considers three aspects of that attention:
• Volume (how many people interact with it),
• Sources (what medium it is shared in),
• Authors (who interacts with it).
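The volume-and-sources idea can be sketched as a weighted sum of mention counts. Note that the weights below are purely illustrative: Altmetric's actual scoring algorithm is proprietary and also accounts for who is doing the mentioning.

```python
# Illustrative sketch of a source-weighted attention score.
# The weights are invented for the example, not Altmetric's real ones.
SOURCE_WEIGHTS = {
    "news": 8.0,
    "blog": 5.0,
    "tweet": 1.0,
    "facebook": 0.25,
}

def attention_score(counts: dict) -> float:
    """Weighted sum of mention counts: volume x source weight."""
    return sum(SOURCE_WEIGHTS.get(src, 0.0) * n for src, n in counts.items())

# One news story plus four tweets:
print(attention_score({"news": 1, "tweet": 4}))  # 12.0
```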
44. ImpactStory
ImpactStory is an open-source web service that helps researchers explore and share the different impacts of all their research products.
The mission of ImpactStory lies in “helping the researchers tell data-driven stories about their impact”, moving “from raw altmetrics data to impact profile” (https://impactstory.org/faq).
ImpactStory delivers:
open source,
free and open data, to the extent permitted by data providers,
radical transparency and open communication.
The data are provided for each researcher and for each single item.
46. ImpactStory…
9 Scopus citations.
This article has more citations than 91% of items indexed in that same year (2012).
• The bars show a range, which represents the 95% confidence interval around the percentile.
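The percentile statistic in the example can be read as "the share of same-year items with strictly fewer citations". A minimal sketch (the cohort list is invented for illustration; ImpactStory's actual computation also adds the confidence interval shown by the bars):

```python
# Sketch of a citation percentile: the percent of a same-year cohort
# that the article outperforms. Cohort values are made up.
def citation_percentile(citations: int, cohort: list) -> float:
    """Percent of cohort items with strictly fewer citations."""
    return 100.0 * sum(c < citations for c in cohort) / len(cohort)

# e.g. an article with 9 citations in a 10-item cohort:
cohort = [0, 0, 1, 1, 2, 3, 4, 5, 7, 12]
print(citation_percentile(9, cohort))  # 90.0
```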
47. ImpactStory…
The bars show a range, which represents the confidence interval around the percentile.
18 Scopus citations.
This article has more citations than 96% of items indexed in that same year (2013).
48. Total Impact
In the summary of the altmetrics results, BLUE blocks indicate a measure of scholarly impact, while GREEN blocks indicate a measure of public impact.
49. PLOS ALM
The PLOS Article-Level Metrics (ALM) project started in 2009 and tracks usage, citations and social-web activity for all PLOS articles.
As an ever-growing collection, it aims to continually cover a range of subjects, including statistical analysis of altmetrics data sources.
These ALMs comprise data points that capture the ways in which research articles are:
read,
saved,
shared with others,
commented on,
cited.
51. An Example of PLOS Article-Level
Metrics on a PLOS ONE Article
52. Plum Analytics was founded in 2012 and in January 2014 was acquired by EBSCO;
The mission of Plum Analytics is:
“to give researchers and funders a data advantage when it comes to conveying a more comprehensive and timely impact of their output”;
An analysis tool aimed at helping institutions understand the influence of researchers’ work through altmetrics;
Plum Analytics is the provider of PlumX.
64. Scientific Cycle Change
Adie E. The grey literature from an altmetrics perspective – opportunity and challenges. Research Trends, Issue 37, June 2014, p. 23-25.
65. The Mantra… Change
BE VISIBLE… OR PERISH!
http://pubs.acs.org/bio/ACS-Guide-Writing-Manuscripts-for-the-Digital-Age.pdf
70. How to use Altmetrics?
Do not ask what altmetrics can do for you,
but…
what you can do with altmetrics.
71. Your Profile as a Scientist
• If you are an active scientist – i.e. already published, an active researcher, a data generator, early, mid- or late career – there is lots to do!
• If you are a junior scientist, the benefits of investing time now will provide a strong foundation for your future!
• So what to do?
72. Some Ideas:
Register with ORCID and import your works;
Add your ImpactStory links to your CV or your email signature;
Use Altmetric.com to monitor the progress of your publications;
Create an account with Mendeley, ResearchGate, Google Scholar, etc. to share your work and to find researchers working in your own research field;
Create a public professional profile on Twitter or Facebook, post your work and comment on that of others;
Use altmetrics data for proposals, grants and competitions.
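Once a researcher has an ORCID iD, their publication list can even be pulled programmatically. A hedged sketch, assuming ORCID's public API v3.0 and its `/works` endpoint (consult the current ORCID API documentation before use):

```python
# Hedged sketch: listing a researcher's works via ORCID's public API,
# assuming the v3.0 /works endpoint and JSON via the Accept header.
import json
import urllib.request

def orcid_works_url(orcid_id: str) -> str:
    """Build the public-API URL listing a researcher's works."""
    return f"https://pub.orcid.org/v3.0/{orcid_id}/works"

def fetch_works(orcid_id: str) -> dict:
    """Fetch the works summary for an ORCID iD (requires network)."""
    req = urllib.request.Request(
        orcid_works_url(orcid_id),
        headers={"Accept": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

This is one way a librarian could combine the ideas above: gather a researcher's outputs from ORCID, then look each DOI up in an altmetrics service.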
78. Real-time Data…
Altmetrics allow authors (and publishers) to see what people are saying about their paper, and can tell them how much attention a paper is receiving and where it is coming from.
http://thomasrcox.com/2015/06/03/altmetric-score-nears-200-in-the-7-days/
94. Library…
“More and more librarians are being called upon to help track and report on research impact and outputs.”
See more at: http://libraryconnect.elsevier.com/articles/2014-06/librarians-and-research-impact-download-and-share-new-infographic-0#sthash.OE3ZCGDg.dpuf
95. And What about Librarians?
Libraries can fruitfully support the entire research cycle;
Librarians can also help researchers evaluate the impact of their publications and better understand the needs of their patients;
Finally, they can provide important additional support to their users in three ways:
informing them about emerging conversations within the latest research,
supporting experimentation with emerging altmetrics tools,
and engaging in early altmetrics education and outreach.
This role change corresponds to what Priem calls a “scholarly communication specialist”.
96. And What about Librarians?
Some examples of how librarians use altmetrics include:
understanding the usage of digital special collections and institutional-repository content;
collection development;
faculty research support;
documenting their own professional staff!
Create a study group:
“I think… if librarians had access to article-level data, they would use it.”
103. In other words:
Knowledge of altmetrics is central to libraries’ and librarians’ new educational role: helping researchers and institutions to understand and manipulate their own impact.
Sutton, Sarah W. (2014). “Altmetrics: What Good are They to Academic Libraries?” Kansas Library Association College and University Libraries Section Proceedings: Vol. 4: No. 2. http://dx.doi.org/10.4148/2160-942X.1041
106. Do Altmetrics correlate with..?
Altmetric counts are low (15-24%) and not very frequent in scientific publications, although their presence is increasing;
Social sciences, humanities, and medical & life sciences had the highest presence of altmetrics;
A weak positive correlation was found between altmetrics and citations – reflecting that altmetrics do not capture the same concept of impact;
Altmetrics are valued as a complementary tool to citation analysis.
Costas, R., Zahedi, Z., & Wouters, P. (2014). Do altmetrics correlate with citations? Extensive comparison of altmetric indicators with citations from a multidisciplinary perspective. arXiv preprint arXiv:1401.4321.
107. Do Altmetrics correlate with..?
Mohammadi E, Thelwall M, Haustein S and Larivière V (2014). “Who Reads Research Articles? An Altmetrics Analysis of Mendeley User Categories”, Academia.edu pre-print. http://www.academia.edu/6298635/Who_Reads_Research_Articles_An_Altmetrics_Analysis_of_Mendeley_User_Categories
Suggests that “Mendeley readership can reflect usage similar to traditional citation impact, if the data is restricted to readers who are also authors, without the delay of impact measured by citation counts”.
108. Do Altmetrics correlate with..?
Thelwall M, Haustein S, Larivière V and Sugimoto CR (2013). “Do Altmetrics Work? Twitter and Ten Other Social Web Services”, PLoS ONE 8(5): e64841. http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0064841
Provides evidence that altmetrics can provide intelligence on the readership of academic research that traditional citation metrics can’t: “It seems that altmetrics probably capture a broad, or at least a different, aspect of research visibility and impact in comparison to citation counts”.
111. An article published in a top-tier journal with 0 citations after 2 years;
an article published in a lower-impact journal with tens of citations.
Which article made a bigger impact?
112. An article with many citations;
an article widely discussed on the social web;
an article with lots of downloads;
an article discussed on networks.
Which article made a bigger impact?
114. Altmetrics Advantages…
Altmetrics are user-friendly, graphic and self-explanatory (they can be used by non-specialized readers), rapidly evolving and interactive (with media and the public, or users);
Altmetrics could act as a reliable tool for evaluating both researchers and institutions;
Altmetrics provide data in real time;
they go beyond citations alone;
they give context;
they help young researchers.
115. Limits:
But there are some limits…
Firstly, there is no distinction, when dealing with citations and mentions, between positive and negative comments;
Gaming;
No standard for reporting altmetrics;
They need time to develop!
116. NISO: National Information Standards Organization
http://www.niso.org/topics/tl/altmetrics_initiative/
117. In general, altmetric numbers:
don’t represent the quality of research;
don’t indicate the quality of individual researchers;
don’t tell the whole story – always look for qualitative data as well.
118. In general, altmetric numbers:
Predicting scientific impact:
• similar to, but more timely than, citations.
Measuring societal impact:
• a different, broader impact than that captured by citations.
“Value all research products”:
• the impact of various outputs.
Piwowar (2013)
119. Conclusive Remarks
Altmetrics are useful and may well be considered reliable.
They can represent an interesting and relevant complement to citations.
Together with traditional metrics, they can be a useful tool in guiding decision makers when funding public research.
Nevertheless, further investigation is still needed to explore and understand what they measure and how they can be used in research evaluation.
Librarians could play an active role.
121. Ten principles:
1. Quantitative evaluation should support qualitative, expert assessment.
2. Measure performance against the research missions of the institution, group or researcher.
3. Protect excellence in locally relevant research.
4. Keep data collection and analytical processes open, transparent and simple.
5. Allow those evaluated to verify data and analysis.
6. Account for variation by field in publication and citation practices.
7. Base assessment of individual researchers on a qualitative judgement of their portfolio.
8. Avoid misplaced concreteness and false precision.
9. Recognize the systemic effects of assessment and indicators.
10. Scrutinize indicators regularly and update them.
122. Thank you for your attention!
Questions?
Valeria Scotti
v.scotti@smatteo.pv.it