Visibility and impact: Measuring impact with altmetrics

Introduction

Altmetrics, or alternative metrics, measure the attention and engagement that your research receives online. Unlike traditional metrics such as citation counts, or digital communication measures such as views and downloads, altmetrics track a broader range of interactions, including social media mentions, news coverage, blog posts and even policy documents. As such, altmetrics can complement traditional citation data and other metrics to offer a more complete picture of research performance and impact. 

This guide explores how researchers can use altmetrics to enhance the visibility and impact of their work. Brunel administrators and researchers have access to Altmetric Explorer, a web-based platform that provides detailed insights into online attention and engagement that a research output receives. 

 

How to access Altmetric Explorer

Brunel University staff can browse Altmetric Explorer on Brunel-networked computers and, after registering for a free user account, can access it from anywhere. Log in here to access Altmetric Explorer via the institutional account. First-time users will be prompted to register using their Brunel email address. 

Altmetrics for impact

What is Altmetric Explorer?

Altmetric Explorer tracks the attention that academic articles and datasets receive online, and is designed for institutions, researchers, librarians and funders to help them track and analyse research impact across a range of online sources.

It includes data from:

  • Global mainstream media coverage (e.g. The Guardian, The New York Times) and specialist media (e.g. New Scientist, Scientific American)
  • Reference managers such as Mendeley
  • Public policy documents (including sources such as NHS NICE evidence bank, World Health Organisation and UNESCO)
  • Post-publication peer review forums including PubPeer and Publons
  • Citations from Scopus
  • Social media platforms such as X (formerly Twitter), Facebook, Google+ and Pinterest, as well as blogs
  • English language contributions to Wikipedia

 

Understanding the Altmetric Doughnut and the Attention Score

What is the Altmetric Doughnut?

The Altmetric Doughnut is designed to help you easily and visually identify how much, and what type of, attention a research output has received. The Altmetric Attention Score is an automatically calculated, weighted count of all of the attention a research output has received, and is presented at the centre of the doughnut. The higher the score, the greater the attention, and the score increases in real time. The surrounding colours reflect the mix of sources mentioning that output: blue is used for X (formerly Twitter), yellow for blogs and red for mainstream media sources. You can click on the doughnut to visit the details page for the particular research output and to see the original mentions and references that have contributed to the Attention Score. 

How does the Altmetric Attention Score work?

The Altmetric score is a quantitative measure of the attention that a scholarly article has received and is based on three main factors:

  • Volume: The score for an article rises as more people mention it. Only one mention from each person per source is counted, so if you or someone else tweets about the same paper more than once, Altmetric counts only the first mention.
  • Sources: Each category of mention contributes a different base amount to the final score. For example, a newspaper article contributes more than a blog post, which contributes more than a tweet.
  • Authors: Altmetric considers how often the author of each mention talks about scholarly articles, whether there is any bias towards a particular journal or publisher, and who the audience is. For example, a doctor sharing a link with other doctors counts for far more than a journal account pushing the same link out automatically. (Source: http://www.altmetric.com/whatwedo.php)
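The three factors above can be sketched as a toy calculation. The weights and field names below are illustrative assumptions for demonstration only; Altmetric's actual algorithm and weightings are more sophisticated and are not reproduced here:

```python
# Toy sketch of a weighted attention score combining volume, source type
# and author weighting. All weights are illustrative assumptions, not
# Altmetric's real values.

# Hypothetical base weights per source type (Sources factor)
SOURCE_WEIGHTS = {"news": 8.0, "blog": 5.0, "tweet": 1.0}

def attention_score(mentions):
    """mentions: list of (author_id, source_type, author_weight) tuples.
    author_weight models the Authors factor (audience and posting habits)."""
    seen = set()
    score = 0.0
    for author, source, author_weight in mentions:
        key = (author, source)
        if key in seen:  # Volume rule: one mention per person per source
            continue
        seen.add(key)
        score += SOURCE_WEIGHTS.get(source, 1.0) * author_weight
    return score

mentions = [
    ("dr_smith", "tweet", 2.0),    # practitioner sharing with peers: weighted up
    ("journal_bot", "tweet", 0.5), # automated journal account: weighted down
    ("dr_smith", "tweet", 2.0),    # duplicate tweet: ignored
    ("daily_news", "news", 1.0),   # newspaper coverage: higher base weight
]
print(attention_score(mentions))  # → 10.5
```

Note how the duplicate tweet contributes nothing, and the single news story outweighs both tweets combined, mirroring the volume and source rules described above.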

Analysing attention

The Altmetric Doughnut appears in BRAD (Symplectic Elements) against every published output record in the system that contains a DOI, enabling individual researchers to track the attention generated by their research simply by logging in to their BRAD account and navigating to the relevant output. 

Logging in to Altmetric Explorer enables researchers and administrators at all levels to analyse the data in a variety of ways. For example, the data can be filtered by date, keyword, funder or publication, or aggregated to report on the work of a group or department. University-wide analysis is also possible by topic or research area using various publisher-defined categories. 
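The kind of filtering and aggregation described above can be illustrated with a short sketch. The records and field names here are hypothetical, not Altmetric Explorer's actual export schema:

```python
# Sketch of filtering and aggregating attention data, as Explorer's filter
# panel does. The records and field names are hypothetical examples.
from collections import Counter
from datetime import date

outputs = [
    {"dept": "Physics", "funder": "UKRI", "published": date(2024, 3, 1), "score": 42},
    {"dept": "History", "funder": "AHRC", "published": date(2023, 11, 5), "score": 7},
    {"dept": "Physics", "funder": "UKRI", "published": date(2022, 6, 9), "score": 15},
]

# Filter by funder and publication date
recent_ukri = [o for o in outputs
               if o["funder"] == "UKRI" and o["published"] >= date(2023, 1, 1)]

# Aggregate total attention per department
by_dept = Counter()
for o in outputs:
    by_dept[o["dept"]] += o["score"]

print(len(recent_ukri))    # → 1
print(by_dept["Physics"])  # → 57
```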

 

What is analysed?

Although altmetrics are often referred to as if they are a single class of indicator, they are actually quite diverse and include:

A record of attention: This class of metrics can indicate how many people have been exposed to and engaged with a scholarly output. Examples of this include mentions in the news, blogs, and on Twitter; article pageviews and downloads; GitHub repository watchers.

A measure of dissemination: These metrics (and the underlying mentions) can help you understand where and why a piece of research is being discussed and shared, both among other scholars and in the public sphere. Examples of this would include coverage in the news; social sharing and blog features.

An indicator of future citations: Articles saved to reference managers such as Mendeley are increasingly reliable predictors of future citation counts within the academic community.

An indicator of influence and impact: Some of the data gathered via altmetrics can signal that research is changing a field of study, the public’s health, or having any other number of tangible effects upon larger society. Examples of this include references in public policy documents; or commentary from experts and practitioners.

Each of these dimensions helps to provide a more nuanced understanding of the value of a piece of research than citation counts alone. However, it is important to remember that all metrics (including impact factors and other citation-based metrics) are merely indicators. They highlight spikes in different types of attention but are not in themselves evidence of impact. True evidence of impact requires a much closer look at the underlying data: who is saying what about the research, where in the world the research is being cited, reused, read, and so on.

Source: https://www.altmetric.com/about-altmetrics/what-are-altmetrics/

How to use Altmetric Explorer

Key benefits

For researchers:

  1. Track publications: Use Altmetric Explorer to monitor the online impact of your published work. You can search for publications using DOIs, titles or author names. Set up alerts, or check the platform regularly, to stay informed about the latest mentions of your research. By doing so proactively, you may be able to respond quickly to discussions or amplify positive coverage. 
  2. Identify key influencers: Explore who is discussing your research. Understanding who the key influencers are, such as journalists, policy makers or other researchers, can help you identify potential collaborators and inform outreach strategies.
  3. Share insights: You can include Altmetric Attention Scores or notable mentions in your CV, grant applications or performance reviews.
  4. Engage with audiences: Understanding who is using and discussing your work offers the opportunity to consider how to engage with them further to enhance the real-world impact of your research. 
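For researchers who prefer a programmatic view, the public Altmetric API also exposes per-DOI attention data (e.g. a GET request to https://api.altmetric.com/v1/doi/&lt;doi&gt; returns a JSON record). The sketch below parses a sample response; the field names follow that API's documented format, but treat them as assumptions and check the official documentation before relying on them. The DOI, title and figures are invented for illustration:

```python
# Sketch of extracting headline figures from the kind of JSON record the
# public Altmetric API returns for a DOI. Field names are assumptions based
# on the API's documented response; all values here are invented examples.
import json

response_text = """{
  "doi": "10.1234/example",
  "title": "An example paper",
  "score": 25.5,
  "cited_by_posts_count": 40,
  "cited_by_feeds_count": 2,
  "readers_count": 118
}"""

record = json.loads(response_text)
summary = (f"{record['title']} (doi:{record['doi']}): "
           f"score {record['score']}, "
           f"{record['cited_by_posts_count']} posts, "
           f"{record['readers_count']} readers")
print(summary)
```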

For institutions: 

  1. Monitor impact: Altmetric Explorer allows the institution to track the attention its research outputs receive as a whole, which can be used to demonstrate impact. 
  2. Strategic planning: data from Altmetric Explorer can be used to identify areas of research strength or to plan public engagement initiatives. 
  3. Real-time attention monitoring: by setting alerts, communications teams can identify and amplify positive media coverage and attention as it happens. 

Altmetrics: a beginner's guide

Why use Altmetrics?

  • Assess early-stage impact: online activity around a paper is most likely to occur in the first few weeks after publication, in contrast to traditional metrics, which take longer to accrue. 
  • Track attention for specific projects, groups or departments. 
  • Provide evidence of impact for reports or applications. 
  • Identify potential research collaborators. 

Key features

  • Real-time tracking: Mentions of research outputs are tracked in Altmetric Explorer in real time, allowing you to monitor the immediate impact of your work.
  • Altmetric Attention Score: Each output is given an Altmetric Attention Score, a weighted count of the attention it has received across different platforms. This score helps you quickly assess the reach and engagement of your work. 
  • Mentions and attention: The platform provides detailed information about who is mentioning your research and where. Information is categorised by source type (e.g. mentions in the news, policy documents and social media) and by geographical location. This information can be particularly valuable in disciplines that produce a greater range of non-publication outputs, e.g. the arts and humanities.
  • Institutional insights: Altmetric Explorer offers aggregated data that can help the institution understand the broader impact of research output at department or institutional level. This provides supplementary context to citation data and can be used to benchmark performance against peer institutions. 
  • Downloadable reports: the platform can be used to generate and export detailed reports including specific mentions, attention scores and broader trends. These reports can be useful to help demonstrate impact for institutional reporting, annual reviews and even grant applications. 


Responsible and ethical use of research metrics

When using altmetrics, including Altmetric Explorer, it's crucial to adhere to the principles of responsible research metrics, such as those outlined in the Leiden Manifesto and the San Francisco Declaration on Research Assessment (DORA), to which the University is a signatory. These guidelines emphasise the importance of the responsible, fair and transparent use of metrics. When interpreting altmetric data, consider the following: 

  1. Is the attention positive or negative? Not all attention is beneficial. Critiques or controversy can inflate metrics and may not reflect positive impact. 
  2. Is the source credible? Evaluate the credibility and relevance of the sources mentioning your research. Mentions in reputable news outlets or policy documents may hold more weight than those in less rigorous forums. 
  3. Consider discipline norms. Different fields have different engagement patterns, for example, arts, humanities and social sciences may receive more attention on social media compared to engineering or physical sciences, which might see more citations. 
  4. Use data ethically. When presenting altmetric data, the source and nature of the engagement should be accurately presented, without exaggerating its significance. 
  5. Be mindful of the privacy of individuals: It is important to consider the privacy of individuals who may be discussing or sharing research online, and avoid using engagement data in ways they might not expect. 

It is also important to interpret altmetrics in context. A high Altmetric Attention Score or frequent mentions do not necessarily indicate high-quality or impactful research. Instead, consider the nature and quality of the engagement or mention. Altmetric Attention Scores should be used to complement, not replace, traditional measures of research impact.