An introduction to altmetrics for librarians, researchers and academics


I’ve previously written for this blog about altmetrics and why they should matter for library and information professionals. Since then I have completed an edited book on the topic for Facet Publishing, to help those in academia understand altmetrics better and devise effective ways of using them in their institutions.

The book looks at how altmetrics came about and their connection back to the first rumblings of the social web at the start of this century. It also looks at traditional research metrics and the wider use of bibliometrics before moving to the present day and tools like Altmetric.com, Impactstory and Mendeley. It is part practical, part theoretical and acts as a guide for LIS professionals and researchers who may have varying degrees of expertise on the topics of metrics, altmetrics, social media, mobile working and open peer review. 

About altmetrics

Altmetrics first appeared at the start of the decade, and one of the messages that soon spread around the web was that they would break the vice-like grip of traditional metrics. This message came from nowhere in particular, but like a game of Chinese whispers it changed with each retelling. Most knew that traditional metrics were flawed and did not tell the whole story, focused as they were on journal-level measures. This is one of a herd of elephants in the corner of the academic room.

In a world that was increasingly being dissected and discussed across social media, traditional metrics only told part of the story. Journals had impact factors and papers had citations, both of which took time to accrue and were rooted in the journal system. Elsewhere on the web, journal papers, datasets, software, presentations and posters were being discussed, shared and cited, but for the most part no one was aware of this activity. Papers were being published and, in some cases, conversations about them were taking place across the web, yet there was no system to feed this back to their authors. Altmetric platforms started to appear post-2010, with tools like Figshare, Impactstory, Altmetric.com and Mendeley among others taking the lead.

Tools such as Altmetric.com track the attention a piece of research receives across the web. With Altmetric.com, for example, this can include tweets, along with mentions and citations in policy documents, Wikipedia, blogs and news outlets. It also monitors Facebook shares, Mendeley saves and mentions on other social media platforms. The data is broken down by platform, geographical location and date period, and cross-indexed against similar research. Finally, the research receives an Altmetric score, weighted according to where it has been mentioned: newspaper coverage, for example, is worth more than a blog post, which in turn is worth more than a tweet.
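
To make the weighting idea concrete, here is a minimal sketch in Python of a weighted attention score. The source types and weights are invented purely for illustration; Altmetric.com’s actual scoring is proprietary and considerably more nuanced than this.

```python
# Illustrative only: a toy weighted attention score in the spirit of the
# Altmetric donut. These weights are assumptions made up for this sketch,
# not Altmetric.com's real weightings.
WEIGHTS = {
    "news": 8.0,       # newspaper coverage counts for more than a blog post...
    "blog": 5.0,       # ...which counts for more than a tweet
    "policy": 3.0,
    "wikipedia": 3.0,
    "tweet": 1.0,
    "facebook": 0.25,
}

def toy_attention_score(mentions: dict) -> float:
    """Sum the mentions of each source type, weighted by how much it 'counts'."""
    return sum(WEIGHTS.get(source, 0.0) * count for source, count in mentions.items())

# Example: a paper picked up by two news outlets, one blog and twenty tweets.
print(toy_attention_score({"news": 2, "blog": 1, "tweet": 20}))  # 41.0
```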

Altmetrics complement traditional metrics; they don’t replace them

Critics have taken issue with the term altmetrics, which implies an alternative to traditional metrics, one that would replace existing systems. The truth is that altmetrics are much better viewed as alternative indicators, not some wholly new system of measurement looking to usurp citations and impact factors. Yet the replacement narrative is something I often hear and see, usually from those quick to point out that (as with impact factors and citations) quantity does not mean quality.

Yet with something like Altmetric.com there is a score, and a very visible one for researchers, just as there is a score on ResearchGate, although for most it means little. It is another way of comparing like with like and of ranking research artefacts within peer groups, organisations and journals. In a world where research needs to be more accountable and transparent, that is no bad thing if used correctly. It is when the scores are used as an indicator of an academic’s value that we might start to have problems.

A tool like Altmetric.com gives us new insights that we could not get from impact factors and citations; it augments and complements existing research measurement. Altmetrics allow us to see data at the article level, rather than having papers lumped together under their journal’s impact factor. They also open up the possibility of measuring artefacts we had previously ignored: datasets, posters, grey literature. Platforms such as Figshare allow researchers and students to share and host these neglected artefacts and, as a result, to communicate them and receive feedback and analytics.

Altmetrics provide feedback long missing in academia

When I first came across altmetrics in 2012 I saw them as more than numbers and scores; I saw them as the missing piece of the jigsaw that followed on from what Web 2.0 brought us in 2005. They were a feedback loop that academia had long been missing. It was obvious from around the time of Web 2.0 that a change was afoot on the web, one that academia was sadly slow to adapt to. Conversations between academics and the general public had long taken place on public and private discussion forums, but Web 2.0 opened up the opportunities for so much more.

Altmetrics extend the potential for communication, feedback and collaboration. The thing they bring to the table above anything else is the ability to see who is talking about your research, what they think of it, who they share it with and in which parts of the world. Unlike citations, impact factors and h-indexes, this new data is of interest to more than just the researcher: anyone involved, from the funder to the director of research, can pick up on the conversation. Again, the worry for both the manager and the academic is that they may become too hung up on the score it generates.

Altmetrics align with the tools that help communicate research

Research is being shared more than ever before. The impact agenda and the REF are not reliant on altmetrics, but altmetrics do help capture the communications and discussions around a piece of research.

Many researchers are taking an interest in communicating their findings, but are not sure how best to do it. I have had lots of conversations with everyone from students to professors about how to disseminate their work to wider audiences or target specific ones. The problem many have is that they are keen to get their work out there, but rarely think about how to measure that attention, or how to respond to it once the work has flown the nest.

As I’ve said previously to colleagues, putting your content on the web without a strategy and hoping interested parties will find it is like stuffing a copy of your research paper into a bottle and casting it out to sea. There is a slim chance it will be found, but you may never discover by whom, and the reality is that it probably won’t be found at all. So altmetrics are part of a bigger strategy: they align with the tools that help communicate the research, while also showing you, to some extent, how well it has been communicated.

This leads to another argument among detractors: that some academics will end up being driven to communicate their existing work rather than produce new research. This would leave them with a very high score on the Kardashian Index (a social media following far larger than their citation record would predict) and, detractors argue, altmetrics would then showcase the best communicators rather than the most productive researchers. Of course engaging with scholarly communications takes time, but it really should form part of the same processes that seminars, conferences and webinars do. It seems a missed opportunity not to embrace tools that can reach peers and audiences globally when creating research that may ultimately have some impact on them.
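
For readers unfamiliar with it, the Kardashian Index comes from Neil Hall’s tongue-in-cheek 2014 paper in Genome Biology: it compares a researcher’s actual Twitter following with the following their citation count would predict. A minimal sketch of that calculation is below, using the constants from Hall’s paper as I recall them, so treat the exact numbers as an assumption.

```python
# Sketch of the Kardashian Index (after Hall, 2014, Genome Biology).
# Hall modelled the expected Twitter following for a researcher with C
# citations as F(C) = 43.3 * C ** 0.32, and defined K as the ratio of
# actual followers to that expectation (constants quoted from memory).

def expected_followers(citations: int) -> float:
    """Followers predicted from citation count, per Hall's fitted curve."""
    return 43.3 * citations ** 0.32

def kardashian_index(followers: int, citations: int) -> float:
    """Ratio of actual to expected followers; Hall's satirical cut-off was K > 5."""
    return followers / expected_followers(citations)

# Example: 5,000 followers but only 100 citations.
print(kardashian_index(5000, 100))  # well above the satirical threshold of 5
```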

Altmetrics are part of a bigger change within academia that includes open and post-publication peer review. Done properly, we all benefit; done badly, we will end up with a bigger mess than we started with. Hopefully the aforementioned book will go some way towards addressing many of these problems, and in time we will see online scholarly communications as the norm. Altmetrics can help give value to these communications, as well as provide opportunities for library and information professionals to support and engage with the research community in using them.

Altmetrics: a practical guide for librarians, researchers and academics


This book, by Andy Tattersall, gives an overview of altmetrics, their tools and how to implement them successfully to boost your research output.

Find out more at facetpublishing.co.uk.

 
