Altmetrics: Using Big Data to Measure Scholarly Impact

One of the places where librarians really have a chance to shine in the data arena is our understanding of the various measures of scholarly influence and impact.

Known as bibliometrics, these measures are founded on the concept of “who cited whom” in high-quality peer-reviewed journals. Bibliometrics have been around since the 1960s, when Eugene Garfield created the Science Citation Index (now part of Web of Science).

The advent of many web-based means of communication has sped up the feedback process for scholarly work. This has led many to believe that the old model of only looking at peer-reviewed journals for cited and citing references does not necessarily capture the full influence or impact of scholarly activity. In 2010, a group of upstart scholars and information scientists published Altmetrics: a manifesto. They believe that the measure of influence and impact can be made much broader by also looking at references in social media and on sites like Zotero, Mendeley, GitHub, SlideShare, websites, blogs, you name it.

[Image: an explainer of the Altmetric ‘donut’ and the sources it draws from.]
The Altmetric ‘donut’ is made up of colors that reflect the mix of sources mentioning an article. The Altmetric ‘score’ for an article appears in the middle of the donut, and is a measure derived from volume, sources, and authors, each having an assigned value. Image via
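To make that idea concrete, a score built from per-source weights can be sketched in a few lines of Python. The source names and weight values below are purely illustrative assumptions on my part; Altmetric's actual weighting scheme is its own and is not published here.

```python
# Hedged sketch of a weighted attention score in the spirit of the
# Altmetric donut. The weights are made-up illustrative values, NOT
# the ones the real service uses.
SOURCE_WEIGHTS = {
    "news": 8.0,
    "blog": 5.0,
    "twitter": 1.0,
    "facebook": 0.25,
}

def attention_score(mentions):
    """Sum the mention counts per source, each scaled by that source's weight."""
    return sum(SOURCE_WEIGHTS.get(source, 0.0) * count
               for source, count in mentions.items())

# Example: an article with 2 news stories, 1 blog post, and 10 tweets
print(attention_score({"news": 2, "blog": 1, "twitter": 10}))  # 31.0
```

The point of the sketch is simply that different sources count for different amounts: one news story moves the score far more than one tweet.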

The quicker turnaround these platforms provide, they argue, gives a more immediate understanding of the influence of new research. Furthermore, there are many more types of scholarly output than just published articles: datasets, conference proceedings, presentations, and software code, to name a few. Using the tools and techniques of Big Data analytics, these scholars have created a new means of measuring scholarly activity. The metrics based on this type of analysis were dubbed altmetrics.

To be honest, there are many unanswered questions about how these altmetrics are going to work, and what they truly measure. As the Altmetrics Manifesto itself states:

Researchers must ask if altmetrics really reflect impact, or just empty buzz. Work should correlate between altmetrics and existing measures, predict citations from altmetrics, and compare altmetrics with expert evaluation. Application designers should continue to build systems to display altmetrics, develop methods to detect and repair gaming, and create metrics for use and reuse of data. Ultimately, our tools should use the rich semantic data from altmetrics to ask “how and why?” as well as “how many?” (Source: Altmetrics: a manifesto)
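The "correlate between altmetrics and existing measures" item on that agenda is easy to prototype. Here is a minimal sketch that computes a Pearson correlation between per-article mention counts and later citation counts; the data values are hypothetical numbers I made up for illustration, not real study results.

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-article data: social-media mentions vs. later citations
mentions  = [3, 12, 7, 45, 1, 20]
citations = [1,  9, 4, 30, 0, 15]
print(round(pearson(mentions, citations), 3))
```

A value near 1 would suggest altmetrics track traditional citations; a value near 0 would support the "empty buzz" worry. Real studies, of course, need far larger samples and controls than this toy example.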

One really neat tool that has come out of the altmetrics discussions is ImpactStory. This tool allows a scholar or researcher to set up a profile to monitor views, downloads and usage of his or her work online. Full disclosure: I am an ImpactStory Advisor, which means I am testing the tool, showing it to others, and providing the ImpactStory team with feedback that comes my way. Here is my ImpactStory profile.

There are other tools as well, such as Altmetric Explorer and PaperCritic. If you think you may be interested in an academic career as a researcher, scholar, or librarian, you may want to take a look at a few of these and see what you think.

Elaine Lasda Bergman

CAS in Data Science '15, Librarian at the University at Albany


  • PC M

    I had to share the Impact Story link with my Scholarly Communication committee members.

