Altmetrics, or alternative metrics, measure the attention and engagement that your research receives online. Unlike traditional metrics such as citation counts, or digital usage measures such as views and downloads, altmetrics track a broader range of interactions, including social media mentions, news coverage, blog posts and even policy documents. As such, altmetrics can complement traditional citation data and other metrics to offer a more complete picture of research performance and impact.
This guide explores how researchers can use altmetrics to enhance the visibility and impact of their work. Brunel administrators and researchers have access to Altmetric Explorer, a web-based platform that provides detailed insights into the online attention and engagement that a research output receives.
How to access Altmetric Explorer
Brunel University staff can browse Altmetric Explorer on Brunel-networked computers; after registering for a free user account, they can access it from anywhere. Log in here to access Altmetric Explorer via the institutional account. First-time users will be prompted to register using their Brunel email address.
What is Altmetric Explorer?
Altmetric Explorer tracks the attention that academic articles and datasets receive online, and is designed to help institutions, researchers, librarians and funders track and analyse research impact across a range of online sources.
It includes data from:
Understanding the Altmetric Doughnut and the Attention Score
What is the Altmetric Doughnut?
The Altmetric Doughnut is designed to help you easily and visually identify how much and what type of attention a research output has received. The Altmetric Attention Score is an automatically calculated, weighted count of all of the attention a research output has received and is presented at the centre of the doughnut. The higher the score, the greater the attention, and the score updates in real time as new mentions are picked up. The surrounding colours reflect the mix of sources in which the output has been mentioned: blue is used for X (formerly Twitter), yellow for blogs, and red for mainstream media sources. You can click on the doughnut to visit the details page for the particular research output and see the original mentions and references that have contributed to the Attention Score.
How does the Altmetric Attention Score work?
The Altmetric Attention Score is a quantitative measure of the attention that a scholarly article has received, calculated from 3 main factors:
The Altmetric Doughnut appears in BRAD (Symplectic Elements) against every published output record in the system that contains a DOI, enabling individual researchers to track the attention generated by their research simply by logging in to their BRAD account and navigating to the relevant output.
Logging in to Altmetric Explorer enables researchers and administrators at all levels to analyse the data in a variety of ways. For example, the data can be filtered by date, keyword, funder or publication, or aggregated to report on the work of a group or department. University-wide analysis is also possible by topic or research area using various publisher-defined categories.
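For outputs with a DOI, attention data can also be retrieved programmatically through Altmetric's free public details API (the api.altmetric.com/v1/doi endpoint). The short Python sketch below is illustrative only: it assumes that endpoint is available and that the third-party requests library is installed, and the DOI shown is simply a placeholder example.

    import requests  # third-party HTTP library: pip install requests

    def fetch_attention_score(doi):
        """Return the Altmetric Attention Score for a DOI, or None if no attention has been tracked."""
        response = requests.get("https://api.altmetric.com/v1/doi/" + doi, timeout=10)
        if response.status_code == 404:
            return None  # Altmetric has no record of attention for this DOI
        response.raise_for_status()
        return response.json().get("score")

    # Example usage with a placeholder DOI:
    print(fetch_attention_score("10.1038/480426a"))

For routine reporting, however, the Altmetric Explorer interface described below will usually be the more convenient route.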
What is analysed?
Although Altmetrics are often referred to as if they are a single class of indicator, they’re actually quite diverse and include:
A record of attention: This class of metrics can indicate how many people have been exposed to and engaged with a scholarly output. Examples include mentions in the news, in blogs and on Twitter; article page views and downloads; and GitHub repository watchers.
A measure of dissemination: These metrics (and the underlying mentions) can help you understand where and why a piece of research is being discussed and shared, both among other scholars and in the public sphere. Examples include coverage in the news, social sharing and blog features.
An indicator of future citations: Saves and downloads in reference managers such as Mendeley are increasingly reliable predictors of future citation counts within the academic community.
An indicator of influence and impact: Some of the data gathered via altmetrics can signal that research is changing a field of study, improving public health, or having other tangible effects on wider society. Examples include references in public policy documents, or commentary from experts and practitioners.
Each of these different dimensions helps to provide a more nuanced understanding of the value of a piece of research than citation counts alone. However, it is important to remember that all metrics (including impact factors and other citation-based metrics) are merely indicators. They highlight spikes in different types of attention but are not in themselves evidence of impact. True evidence of impact requires a much closer look at the underlying data: who is saying what about the research, where in the world it is being cited, reused, read and so on.
Source: https://www.altmetric.com/about-altmetrics/what-are-altmetrics/
For researchers:
For institutions:
See the video above for a further outline of how Altmetric works. Alternatively, you can watch a video here.
When using altmetrics, including Altmetric Explorer, it is crucial to adhere to the principles of responsible research metrics, such as those outlined in the Leiden Manifesto and the San Francisco Declaration on Research Assessment (DORA), to which the University is a signatory. These guidelines emphasise the importance of the responsible, fair and transparent use of metrics.
It is also important to interpret altmetrics in context. High Altmetric Attention Scores or frequent mentions do not necessarily indicate high-quality or impactful research. Instead, consider the nature and quality of the engagement or mention. Altmetric Attention Scores should be used to complement, not replace, traditional measures of research impact.