Products Archives - Altmetric
Discover the attention surrounding your research

Open Access Week: Climate Justice (24 October 2022)

The annual Open Access Week has arrived once more, and the chosen theme for this year is Climate Justice. But what do we know about this topic, and how can we access impactful content in this area?

Typically, Open Access Week encourages academics, researchers and the wider community to learn more about Open Access and to share what they’ve learned with a broader audience. This year’s theme, Climate Justice, refers to the fact that the climate crisis affects different groups of people and countries unequally. With that in mind, Open Access Week encourages us to gain a better understanding of how the climate crisis will affect us all, rather than letting some groups and countries bear the brunt of it.

In last year’s blog post on Open Access Week, we discussed the rise in popularity of Open Access (OA) publishing following the Plan S initiative, supported by cOAlition S. The initiative aims to make all scholarly publications available in Open Access journals, on Open Access platforms, or through Open Access repositories.

Different types of Open Access

Using Altmetric Explorer, you can find out more about accessing relevant content and even filter by the four different types of Open Access:

Gold Open Access – an article that is immediately free to read on the journal’s website, usually under a Creative Commons license

Hybrid Open Access – an article published Open Access, for a fee, in an otherwise subscription-based journal

Green Open Access – a free-of-charge, peer-reviewed version of an article archived in an institutional repository

Bronze Open Access – a version that is free to read on the publisher’s website but cannot be reused, adapted or shared.

Like last year, we conducted our own case study on the Climate Justice topic using Altmetric Explorer. We wanted to see how many research outputs were Open Access compared to closed access (e.g. books or some news articles), and which types of Open Access they were.

[Image: a selection menu with a rectangle around one option]

With Altmetric Explorer, you can choose to search for Open Access outputs only, and you can narrow the search down to a specific Open Access type. Our query returned 1,267 research outputs, of which only 826 had received at least one mention.

We then ranked the results by Altmetric Attention Score and looked at the top 100 research outputs in more detail to see how many were Open Access. Of those, 47% were Open Access. Below is the specific breakdown of Open Access types:

[Image: a bar chart showing the breakdown of Open Access types]

That 47% breaks down into 7% Gold OA, 18% Hybrid OA, 17% Green OA and 5% Bronze OA, compared with 53% non-OA research outputs. Breaking the outputs down further by format, both the open access and the closed access groups consisted mostly of resources available online, which makes the information easier to access and disseminate than a chapter or a book.

 

Format        Open Access    Closed Access
Articles          44%             31%
News               1%             14%
Chapters           1%              1%
Books              1%              7%

This is a prime example of the insights Altmetric Explorer can give us about a specific topic, particularly one as important as Climate Justice. For more information, or if you have any specific questions about the tool, please get in touch.

We hope that Open Access Week 2022 has encouraged important conversations around the world about the climate crisis and we look forward to seeing what happens next.

How to: Using the Sustainable Development Goals search filter in Altmetric Explorer (22 April 2022)

The Sustainable Development Goals (SDGs) are targets for global development adopted by the United Nations in 2015. Comprised of 17 interconnected goals, they are a universal call to action to end poverty, protect the planet, and improve the lives and prospects of people everywhere.

The SDG data is provided to Altmetric by Dimensions, which has implemented automatic classification of publications, aligning them to the goals using supervised machine learning built on extensive training sets and curated keyword searches. While this process is constantly being improved, it is still not perfect! Keep in mind that not all publications are classified with an SDG. Visit Digital Science’s SDG classification page to learn more.

Getting started is easy

Step-by-step guide

First, click the blue edit search button at the top of the screen in Explorer to access your advanced search. You’ll see Sustainable Development Goals as a search filter option.

Please note that this filter may appear in a slightly different place in your advanced search interface than shown in the image below, depending on your organization’s subscription.

Now, you can search for a specific SDG by name or by number. For example, if you type in “water”, you’ll see the two SDGs with water in the title. You can search for more than one SDG at a time, and they will be ‘ORed’ together in your search. Click run search to view your results.

[Image: a selection menu with an arrow pointing to a dropdown menu and a circle around a button]

Now you can see that there are over 51,000 outputs with attention that are classified with these two SDGs. If you navigate to the Research Outputs tab, and view the results as a list, you’ll see the SDG information on the right side of the screen.

[Image: a circle around a menu bar option, a circle around a results-list icon, and an arrow pointing to a selection menu option]

Going back to the advanced search, you can also combine this filter with other search filters. For example, you can limit by publication date, search by journal title, or search for a specific company or institution, such as Harvard University. If you run the search pictured below, you can view over 220 outputs from Harvard University classified with these two SDGs and explore the attention they are receiving online.

[Image: a selection menu with a circle around one option]

If you have any questions about how to use this feature in Altmetric Explorer, please contact our support team.

A ‘no update’ update: setting the record straight (27 August 2021)

Unlike the majority of our blog posts which are intended to tell you about new and existing things happening at Altmetric, the purpose of this post is actually just to tell you that something has not changed.

You may have seen a blog post by Kent Anderson last week which indicated that Altmetric had changed the way we score Twitter as part of the Altmetric Attention Score. This is incorrect. We have not changed the Altmetric scoring algorithm. What we have done recently is update our documentation – something we do from time to time, whenever we feel we can provide users with better clarity about what we do.

Unfortunately, we are unable to post a clarifying comment directly on the original blog (only paying subscribers can comment). We hope this post clarifies the situation and eases the confusion caused by the erroneous information in the post.


Twitter weighting and the Altmetric Attention Score 

When Kent Anderson contacted us, our Customer Support Manager correctly informed him that we have made no updates to the Twitter score contributions. As mentioned above, we have refined our support documentation so that it more accurately reflects the average weighting that drives the Altmetric Attention Score.

For clarity, we take a number of factors into account in determining the weighting of a single tweet, including whether the tweeter has shared that research output before and how often they have tweeted outputs hosted on the same domain within a given period. Individual tweet weightings range from 0.25 to just over 1, with the most common weighting being 0.25 – hence the update to the documentation, amending the documented Twitter contribution from 1 to the more common 0.25.

We are the first to admit that scoring is complex – both technically and intellectually. I highly recommend reading one of Euan’s early posts, ‘Gaming altmetrics’, which does a great job of explaining some of the thinking that went into establishing the Altmetric Attention Score algorithm.   

The important concept to remember is that the Altmetric Attention Score is designed as an indicator to help users identify where there is activity to explore, and to help them easily see how the volume of that activity compares between research outputs. We have always emphasized, and continue to emphasize, the importance of scrutinising the underlying data rather than taking the score as an indicator of the value or quality of any research output.

I hope this helps clarify any queries you may have about this topic but please don’t hesitate to contact us at support@altmetric.com if you have any more questions.

Leaping into the future: Why I’m excited about what’s ahead (15 March 2018)

One day in a staff meeting, I listened to a fascinating presentation on a research project the foundation I was working for had funded. After the presentation was over, it was time for questions. One of the very first questions, from the back of the room, was “This is great – has anyone picked it up? Is anyone talking about it?” I practically leapt from my seat.

You see, three years prior to this question being asked, it would have taken me a considerable amount of time and effort to provide an answer. The research I conducted would have been centered on fairly general searches in Google, with a myriad of quasi-relevant results, which would in turn need to be qualified: Is this talking about the specific study, or the researcher? If it is about the research, what is being said? Hours of my life would have been spent on this relatively simple question.

But I leapt from my seat because we were now using Altmetric’s Explorer for Institutions. I knew that I only needed the DOIs, and I could provide an answer – and did – within minutes. This was precisely the type of question the foundation could now answer with a great degree of confidence, quickly, and with context. Having these answers could help us better target our engagement efforts, strategize about funding, and communicate to our stakeholders, among other things.




When I decided to leave the foundation and come to work for Digital Science, it was in large part because I became obsessed with the ways in which technology could help advance research and science in general. Think about how many institutions around the world lack basic resources for researchers – how can we collaborate to help bridge that gap with all of these amazing tools? How do we help accelerate the pace of research globally, while making it as inclusive as possible? How can we do this in an economically fair and sustainable way?

SIDEBAR: Am I a technical person? Absolutely not. I don’t and can’t code; I can barely wrap my head around algorithms. I just know that these things are rapidly expanding the world of what is possible in amazing ways that I can’t wait to discover.

How could having this data take the burden off of researchers when reporting on grants? How could this data help researchers win funding to begin with? How can this data arm funding institutions with actionable insights into the work they’re funding? What informal networks could be discovered that would help researchers connect with each other globally? How can funders use this information to help nurture these networks?

Now that I’m here at Digital Science, “behind the curtain” so to speak, I’m even more enthusiastic. The people who come up with these products are not only brilliant, but they aspire to the same goals: advance scientific research, provide useful (and gorgeous!) tools, and, perhaps most importantly, listen to the needs of the scientific community. Knowing and seeing how the development of technologies is made in concert with principles I admire and aspire to makes me genuinely happy to be an advocate here.

Not to mention that with the advances in natural language processing, artificial intelligence and machine learning, our imaginations may one day be our only actual limits, to paraphrase generously.  

I could honestly leap from my seat all day.

Dimensions Badges: A new way to see citations (26 January 2018)

Last week, the entire Digital Science family was thrilled to finally announce the Dimensions platform to the world. We at Altmetric were especially excited, as we played a major role by developing a related product: the Dimensions badges.

Dimensions badges are interactive visualizations that showcase the citation data for individual publications. Each publication that is indexed in the Dimensions database gets its own badge. Although the first thing you see on every Dimensions badge is the citation count, clicking through to the accompanying details page reveals even more useful information, from new citation performance metrics such as the Field Citation Ratio (FCR) to a visualization that indicates the publication’s relative influence on specific research areas.


Introducing the new Dimensions badges

[Image: a Dimensions badge with citation data and, below it, the details page with tabs of detailed citation data]

Early in 2017, the Altmetric team was asked to create something new and compelling that would showcase the richness of the Dimensions citations data. Inspired by our Altmetric badges, we thought another form of embeddable badge would be the easiest way for end users to showcase their own citation metrics. Having gained extensive data visualization expertise through various Altmetric projects over the years, we felt that we could apply many of the design lessons we’d learned to the creation of the new Dimensions badges.

Our primary goal was to make the Dimensions badges and associated details pages as simple, attractive, and easy to use as possible. We wanted the metrics to be clear and understandable, and thus decided to provide human-readable explanations that would make it easier to place the numbers in context.

As such, on the Summary tab of each Dimensions badge details page, we have presented four main citation metrics (total citation count, recent citation count, Field Citation Ratio or FCR, and Relative Citation Ratio or RCR) as visual gauges, with accompanying interpretative text. The interpretative text changes depending on the value of the Field Citation Ratio, and is helpful for giving users a sense of whether the number of citations received by the publication is high, low, or average when compared to other publications in the same field. You can read more about what the metrics mean, how they are calculated, and how the interpretative text is generated here.

The other tabs on the Dimensions badge details page are intended to make the citations data more auditable and useful – for example, if we give users citation counts, then it should always be possible to look at the actual citing publications and their details, too. On the Citations tab, users can view up to 3 recent citing works, and then go directly into the free Dimensions platform to browse through the full list.

The Citing Research Categories tab provides users with a visualization that illustrates how many times the publication has been cited in various research areas, as indicated by Fields of Research (FoR) codes. This represents a new way to visualize the publication’s influence on research, within and across disciplines.


Behind the scenes: the design process

We had a lot of fun designing the new Dimensions badges – something you may have noticed by the fact that the circular badge spins when you mouse over it! We wanted to ensure that the badges would be both eye-catching and also suitable for any website style. As such, we have provided many configuration options, so that users can adapt the badge for use on any website.

Following last week’s Dimensions launch, I have been asked a few times about the meaning of the colors in the badges – do they change with the metrics, in a similar way to the Altmetric badges? The answer is actually no – the colors are meant to represent the Dimensions branding. In turn, the Dimensions logo colors reflect the multi-faceted contributions to the project by each Digital Science portfolio company.

Earlier on in the design process, we did briefly explore the idea of modulating Dimensions badge colors alongside changes in the citation metrics. However, when we tested some of these initial ideas out with researchers, the overall response was that the color shifts were interesting, but also quite confusing if one wasn’t familiar with metrics like the FCR. In the end, we decided to keep things simple. As one of our objectives with the badge was to help introduce Dimensions to the world, we wanted the Dimensions logo to stay recognizable. We hope that, in this instance, our users will agree with our decision to keep the colors simple!

[Image: a Dimensions publication page with lists of publication data and the Dimensions and Altmetric badges in the right-hand panel]

Additionally, as we have always regarded altmetrics and citations data as complementary to each other, we knew that many potential users of the Dimensions badge might want to include it alongside an Altmetric badge. To ensure that the two embeds would look great next to each other, we made the two Dimensions badge styles (circular and rectangular) available in the same sizes as the Altmetric badges. The Dimensions platform makes full use of this by displaying the new Dimensions and Altmetric badges side-by-side on individual publication pages.


How do I get the Dimensions Badges for my website?

Dimensions badges can easily be embedded into any webpage with just a simple line of code. The documentation includes a handy “Badge Builder”, which can help users configure the badge appearance and behavior. The badges are free for individual researchers, academic institutional repositories, and certain publishers to use (but terms of commercial use may vary – click here for more information).
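As a rough sketch of what an embed looks like (the authoritative snippet comes from the Badge Builder in the documentation – the script URL, class name and data attributes below are from our recollection of that documentation, so verify them there first), you add the badge script once and then a placeholder element per publication:

    <!-- Load the badge script once per page -->
    <script async src="https://badge.dimensions.ai/badge.js" charset="utf-8"></script>

    <!-- One placeholder per publication; the script replaces it with a badge for this DOI -->
    <span class="__dimensions_badge_embed__" data-doi="10.1038/nature.2012.9872" data-style="small_circle"></span>

The DOI and style value here are illustrative; the Badge Builder will generate the exact attributes for your chosen size and shape.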

Show us your badge integrations!

We would love to see where you are using the Dimensions badges, so please do share screenshots and links with us via Twitter or by emailing us at info@dimensions.ai. Please also feel free to share any feedback you have, so we can continue to improve the badges and accompanying details pages in future.

The Altmetric score is now the Altmetric Attention Score (30 June 2016)

You’re probably already familiar with the Altmetric score (that handy number seen inside an Altmetric “donut”). The score is a weighted count of all of the mentions Altmetric has tracked for an individual research output, and is designed as an indicator of the amount and reach of the attention an item has received.

We have now decided to formally name this indicator the Altmetric Attention Score.

The new name will make it easier for newcomers to altmetrics to understand at a glance what our score indicates: the volume and likely reach of the attention a research output has received, not its quality or impact. We’re also aiming to make it clearer that this is a number calculated and assigned by Altmetric, the company, as opposed to an ‘altmetrics’ score.

If you refer to the “Altmetric score” in presentations, LibGuides, or anywhere else, we encourage you to update your resources as soon as possible.


Why do we have the score?

Occasionally people ask why we use a score at all, and internally we’ve discussed it a few times. The strength and weakness of the score is that it simplifies something quite handwavy: what kind of attention did this research output receive? The strength comes from allowing you to rank and quickly understand attention. The potential weakness is that it might encourage people to oversimplify, or to use a single number where more careful analysis is called for.

There are many things that make the score useful, and it is for these reasons that we continue to apply it:

It allows for ranking

The score helps users see at a glance which pieces of research have received a lot of attention – providing a useful indicator of where there might be attention and discussions that are worth exploring further. That activity might be positive or negative, but it can help readers at least begin to decide what altmetrics data to dive into first.

It allows for context

Of course not all research outputs should be compared to one another – there are huge discrepancies in the way research is shared and discussed between disciplines, formats and geographies. What the score does enable users to do, between articles published in the same journal or same timeframe, is to benchmark and see how the amount of attention for one item compares with that of another. Such information can be useful for authors, for example, in determining which journal to publish in (although of course you should always also check what type of attention the research is receiving, and why!). 

Identifying trends and hot topics is easier

Because the score is so easy to track, it’s possible to monitor the uptake of many items at once – similar to the way download counts might be used to monitor popular content. A rapidly rising score might bring an item to the attention of the press office or editors, who may otherwise not have been aware of its resonance amongst a broader audience.


How is it calculated?

The Altmetric Attention Score is a weighted count of all of the online attention Altmetric have found for an individual research output. This includes mentions in public policy documents and references in Wikipedia, the mainstream news, social networks, blogs and more. Click here to view a full list of the sources we track for mentions of research.

Mendeley readers, CiteULike bookmarks and Scopus citations do not count towards the score because we are unable to show you all of the details of who saved or cited the item.

More detail on the weightings of each source and how they contribute to the attention score is available here.
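If you’d like to retrieve the score programmatically, our public API exposes the attention data per research output. A minimal sketch, assuming the v1 DOI endpoint and the score and details_url response fields as documented at api.altmetric.com (check the API documentation before relying on them):

    <script>
      // Fetch the attention data for a single DOI from the public Altmetric API
      fetch('https://api.altmetric.com/v1/doi/10.1038/nature.2012.9872')
        .then(response => response.json())
        .then(data => {
          console.log('Altmetric Attention Score:', data.score);
          console.log('Details page:', data.details_url);
        });
    </script>

The DOI above is illustrative; substitute the identifier of the output you care about.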


How should researchers, publishers and institutions use the score?

We’d always advise that you use the score only as a very preliminary indicator of the amount of attention an item has received. It can help you identify where there are ‘mentions’ that would be worth digging into, and signifies where an item has achieved a high level of engagement. In CVs, performance reviews or grant applications, it might be useful to provide information such as:

“This item has received an Altmetric Attention Score of 150, putting it in the top 5% in terms of attention for papers published in this journal. Coverage included stories in the Washington Post and Der Spiegel, as well as commentary from several leading bloggers. A full summary of the attention score and record of all of the online mentions can be found here” (and include a link to the associated details page)

At the institutional level, the attention score makes it easy to identify where there is a lot of activity taking place – helping to find influencers or departments that are doing a particularly good job of communicating their research more broadly, or ones that may need support from a scholarly communications office or similar. It can also highlight where there is opportunity for improvement – for example, a great piece of research that has not received the attention it deserves, or a subject area in which the institution is aiming to become more established. The immediate, real-time feedback that the score represents also makes for an engaging starting point in discussions between the library or research support office and faculty.

Publishers can use the attention score to identify their most popular content and keep an eye on how their publications are being received – something that suddenly attracts a lot of attention (causing the score to rise) might warrant further investigation. Press, marketing and editorial teams can then use this information to monitor the activity, respond if necessary, and get an idea of which content resonates most amongst their audience.

Numbers behind Numbers: The Altmetric Attention Score and Sources Explained (26 May 2015)

In the last blog post in our researcher series, we included perspectives on Altmetric from some metrics-savvy researchers. One of the responses was from Jean Peccoud, who commented on the Altmetric attention score, saying it “can [sometimes] feel a little like black magic”.

This isn’t the first time we’ve heard this, or similar, and we appreciate that people are keen to understand more about what goes on in the background to calculate the score for each research output. Our aim for this blog post, therefore, is to provide more detail around the Altmetric scoring system, and to offer insight into the weighting we give to each source we’re tracking.

We hope this post will help to answer some of the questions researchers new to altmetrics may have about how Altmetric collects and displays attention data. For those who are already familiar with Altmetric and use it to monitor the attention for their research, we hope this post will refresh their memories and provide a bit more context around the data.


Where can I find the Altmetric attention score?

[Image: an Altmetric donut]

The Altmetric attention score appears in the middle of each Altmetric donut, our graphical representation of the attention surrounding a research output. It can often be found on publisher article pages, and it also appears when you use any of our apps or the Altmetric Bookmarklet.

The colours of the donut represent the different sources of attention for each output:

[Image: two columns listing the attention sources and their corresponding colours]

Why do Altmetric assign a score for articles at all? 

[Image: three columns with donut icons beside lists of articles]

The Altmetric attention score is intended to provide an indicator of the attention surrounding a research output. While it may be straightforward enough to monitor the attention surrounding a single research output, it becomes harder to identify where to focus your efforts when looking at a larger set. The number alone cannot, of course, tell you anything about what prompted the attention, where it came from, or what people were saying, but it does at least give you a place to start – “is there online activity around this research output that would be worth investigating further?”

We work with a lot of publishers and institutions who want to be able to see which articles are getting the most (or indeed the least) attention. They’re interested in monitoring the attention of not only single articles, but also in placing that measure within the context of the journal the article comes from, or comparing it with other publications from peers. Again, we’d always encourage anyone looking at our data to also click through to the Altmetric details page for each output to read the content of the mentions and see what people are saying about the item, rather than using the numbers alone to draw conclusions about the research.


How is the attention score calculated?

[Image: an Altmetric donut with square icons underneath, detailing what each colour represents]

The Altmetric attention score is calculated automatically, using a weighted algorithm based on 3 main factors:

1. The volume of the mentions (how many were there?)
2. The source of the mentions (were they high-profile news stories, re-tweets, or perhaps a Wikipedia reference?)
3. The author of the mentions (was it the journal publisher, or an influential academic?)

Combined, the score represents a weighted approximation of all the attention we’ve picked up for a research output, rather than a raw total of the number of mentions. You can see this in the example pictured above – the article has been mentioned in 2 news outlets, 2 blogs, 6 Facebook posts, 84 tweets, 1 Google+ post and 1 Reddit post. However, the score is 85, not 96.

That said, each source is assigned a default score contribution – as detailed in the list below:

[Image: two columns listing the default score contribution for each source]

These default scores are designed to reflect the reach and level of engagement of each source: a news story, for example, is likely to be seen by a far wider audience than a single tweet or Facebook post. It’s also worth mentioning that social media posts are scored per user: if someone tweets about the same research output twice, only the first tweet will count. Blog posts are scored per feed: if two posts from the same RSS feed link to the same article, only the first post will be counted.

You’ll have noticed that the Altmetric attention score for any individual research output is always a whole number – each time a new mention is picked up, the score is rounded to a whole number. For example, a single Facebook post about an article would contribute 0.25 to the score, but if that post were the only mention, the displayed score for the article would be 1. Equally, four Facebook posts mentioning a research output would still only contribute 1 to the overall score.


Weighting the score

Beyond tracking and calculating based on these default score contributions, another level of filtering is applied to more accurately reflect the type and reach of the attention a research output has had. This is where the ‘bias’ and ‘audience’ of specific sources play a further part in determining the final score.

News outlets

News sites are each assigned a tier, which determines the amount that any mention from them will contribute to the score, according to the reach we determine that specific news outlet to have. This means that a news mention from the New York Times will contribute more towards the score than a mention from a niche news publication with a smaller readership, such as 2Minute Medicine. Each mention is counted on the basis of the ‘author’ of the post – therefore if a news source publishes two news stories about the same article, these would only be counted as one news mention.

Wikipedia 

In addition to the news weighting, scoring for Wikipedia is static. This means that if an article is mentioned in one Wikipedia page, the score will increase by 3; if it is mentioned in several Wikipedia pages, the score will still only increase by 3. The rationale is that Wikipedia articles can reference hundreds of research outputs, so a mention of a paper as one reference among many is not really comparable (in terms of reach and attention) to a mainstream news story devoted to a single research paper. We consulted a Wikipedia expert when deciding on the appropriate scoring, and eventually kept the score static to reduce the potential for gaming: if the score increased with each Wikipedia mention, people could game it by manually adding their publications as references to old articles, biasing their scores through illegitimate attention.

Policy Documents

The scoring for policy documents depends on the number of policy sources that have mentioned a paper. Mentions in multiple policy documents from the same policy source only count once. If, for example, a research output is mentioned in two policy documents from the same source, this will contribute 3 to the score. However, if two policy documents from two different policy sources mention the same research output, these would both count towards the score, so the score would increase by 6.

Social media posts

For Twitter and Sina Weibo, the original tweet or post counts for 1, but retweets or reposts count for 0.85, as this type of attention is more secondhand (and therefore does not reflect as much engagement as the initial post). Again, the author rule applies: if the same Twitter account tweets the same link to a paper more than once, only the first tweet will count towards the score (although you’d still be able to see all of the tweets on the details page). For tweets, we also apply modifiers that can sometimes mean the original tweet contributes less than 1 to an article’s score. These modifiers are based on three principles:

  • Reach – how many people is this mention going to reach? (This is based on the number of people following the relevant account.)
  • Promiscuity – how often does this person tweet about research outputs? (This is derived from the number of articles mentioned by this Twitter account in a given time period.)
  • Bias – is this person tweeting about lots of articles from the same journal, thereby suggesting promotional intent?

These principles mean that if, for example, a journal’s Twitter account regularly tweets about papers it has just published, those tweets will contribute less to the scores for those articles than tweets from individual researchers who have read the article and simply want to share it – again, we are trying to reflect the true engagement and reach of the research shared. This can also work the other way: if a hugely influential figure such as Barack Obama were to tweet a paper, that tweet would have a default score contribution of 1.1, which could be rounded up to a contribution of 2.
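To make the idea concrete, here is a toy sketch – emphatically not Altmetric’s actual formula, and the thresholds are invented – showing how reach, promiscuity and bias modifiers could scale a tweet’s default contribution within the 0.25 to 1.1 range described above:

    <script>
      // Toy illustration only – NOT Altmetric's real algorithm or thresholds.
      function tweetWeight(followerCount, outputsTweetedRecently, shareFromOneJournal) {
        let weight = 1.0;                                    // default contribution of an original tweet
        if (followerCount > 1000000) weight *= 1.1;          // reach: hugely influential accounts count a little more
        if (outputsTweetedRecently > 50) weight *= 0.5;      // promiscuity: prolific paper-tweeters count less
        if (shareFromOneJournal > 0.8) weight *= 0.5;        // bias: likely promotional accounts count less
        return Math.max(0.25, Math.min(weight, 1.1));        // clamp to the observed range
      }
    </script>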

Combating gaming

Gaming is often mentioned as a risk of altmetrics (in principle, this applies to any metric that can be influenced by outside behaviour). Researchers are keen to compare themselves to others, and many in the academic world have taken to using numbers as a proxy for ‘impact’. Altmetric have taken steps to combat practices that could constitute gaming or otherwise illegitimately influence the score, including:

  • Capping measures for articles that have more than 200 Twitter or Facebook posts with the exact same content. For articles such as these, only the first 200 Twitter or Facebook posts will count towards the score, in order to prevent articles with lots of identical social media posts from having much higher scores than articles with examples of more legitimate, unique attention.
  • Flagging up and monitoring suspect activity: where an output sees an unusual or unexpected amount of activity, an alert is sent to the Altmetric team, who investigate to determine whether or not the activity is genuine.

The most powerful tool we have against gaming, however, is that we display all of the mentions of each output on the details page. By looking beyond the numbers and reading the mentions, it is easy to determine how and why any item has attracted the attention that it has – and therefore to identify whether or not it is the type of attention that you consider of interest.


What’s not included in the attention score?

Lastly, it’s useful to remember that some sources are never included in the Altmetric attention score. This applies to Mendeley and CiteULike reader counts (because we can’t show you who the readers are – and we like all of our mentions to be fully auditable), and any posts that appear on the “misc” tab on the details page (misc stands for miscellaneous).

We get asked about the misc tab quite a lot, so I thought it would be good to explain the rationale behind it. We add mentions of an article to the misc tab when they could not have been picked up automatically at the point we were notified of them – either because we weren’t tracking the source, or because the mention did not include the right content for us to match it to a research output. By adding posts like this to the misc tab, we can still display all the attention we’re aware of for an article without biasing the score through excessive manual curation.

We hope that by posting this blog, we’ve managed to shed some light on the Altmetric score and the methods that go into calculating it. As always, any comments, questions or feedback are most welcome. Thanks for reading!

Updated 27 June 2016 to change “Altmetric score” to “Altmetric attention score” throughout the post.

The Altmetric Widget is Here! (27 January 2014)

One of the great things about altmetrics is that they can bring to your attention papers which may otherwise have gone unnoticed. Unless you have the time to endlessly trawl tables of contents or RSS alerts, keeping up with the big news and developments in your field of interest is often a challenge.

With this in mind, we’re excited to announce the release of the Altmetric widget generator, built for bloggers. To create your own custom embed code, all you need to do is input a keyword and/or select a subject area; the generator will then pull in relevant articles, along with their Altmetric scores, based on those criteria. You can specify the size of the widget (thin, medium, or wide) depending on where you would like it to sit on your page, and it will update automatically as new papers are published and begin to receive attention online.

In this example we’ve used the keyword “altmetrics”:

[Image: the Altmetric widget – two columns with donut icons beside lists of articles]

Help your readers discover content that will be of interest to them by creating your Altmetric widget today!

You might also like to check out our WordPress plugin, which will help you embed our delightful donuts wherever you mention a research article.
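If you’d rather hand-craft an embed for a single article, the standard badge snippet looks roughly like this (the script URL and data attributes are from our recollection of the embed documentation, so double-check them there before using):

    <!-- Load the Altmetric embed script once per page -->
    <script type="text/javascript" src="https://d1bxh8uas1mnw7.cloudfront.net/assets/embed.js"></script>

    <!-- A donut badge for a specific article, identified by its DOI (the DOI here is illustrative) -->
    <div class="altmetric-embed" data-badge-type="donut" data-doi="10.1038/nature.2012.9872"></div>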

Getting Scholarly Conversations Instantly: Altmetric It! (10 July 2013)

The Altmetric Bookmarklet: a free and useful reading companion

People are talking about scholarly papers online, but what are they saying? And what digital tools are they using to communicate their ideas?

In this blog post, we’d like to introduce you to the Altmetric Bookmarklet, a free browser tool that lets you easily find out how much attention recent papers have received online. When you use it, the first thing you’ll see is the Altmetric donut, which is colour-coded according to which sources have mentioned the article. Inside the donut is the Altmetric score of attention for the paper. Below the donut, the metrics show how many times the paper has been mentioned in various sources, such as social media, mainstream news, and blogs (the altmetrics data). You can even see the number of readers who have saved the paper in their online reference managers.

With the Bookmarklet, you’ll also have a direct portal into the Altmetric article details page, on which you can find out exactly what has been said about the article.

If it’s a new paper, or it just hasn’t seen much attention yet, use the “Get email updates…” link in the bottom left-hand corner to sign up for an email alert whenever somebody mentions it online.

We reckon that the Bookmarklet is a really useful reading companion, especially for any academics who are interested in quickly seeing the attention that has been paid to their own articles.


Get the scholarly conversations with just one click

The Altmetric Bookmarklet is a browser add-on for Chrome, Safari, and Firefox that is extremely simple to use. If you’re reading a scholarly article online and click the Bookmarklet’s “Altmetric It!” button, all of the altmetrics data for that paper will appear on the right side of the page.
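Under the hood, a bookmarklet is simply a bookmark whose URL is a snippet of JavaScript; clicking it runs that snippet on the page you’re viewing. A generic illustration (this is not our actual bookmarklet code, and the loader URL is made up):

    <!-- Illustrative only: dragging a link like this to the bookmarks bar creates a bookmarklet -->
    <a href="javascript:(function(){var s=document.createElement('script');s.src='https://example.com/loader.js';document.body.appendChild(s);})();">Altmetric it!</a>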

To install the Bookmarklet, all you need to do is visit this page, drag the “Altmetric It” button to your bookmarks bar, and you’re all set. Watch the video walkthrough below.


[Video: installation instructions and demo]


The small print

At the moment, the Bookmarklet works only on PubMed, arXiv, or pages that contain a DOI (the Altmetric API handles other identifiers too, but we haven’t added this functionality to the bookmarklet yet). Furthermore, Altmetric only started collecting data in mid-2011, so any articles published before then will have patchy coverage: don’t expect to see everything!

By default, the Bookmarklet supports publishers who embed Google Scholar-friendly citation metadata (an example is shown below). If a particular journal isn’t supported at the moment, just let us know you’re keen to see it and we’ll add it to the bookmarklet development queue.
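For publishers wondering what that metadata looks like, Google Scholar reads Highwire Press-style meta tags in the page head. A short illustrative sketch (the tag names citation_title and citation_doi are standard; the values here are made up):

    <!-- Machine-readable citation metadata in the page head -->
    <meta name="citation_title" content="An example paper title" />
    <meta name="citation_doi" content="10.1234/example.5678" />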

If no DOI is found but you can see one on the page, reload the page, select the DOI (as if you were going to copy and paste it), and then hit the “Altmetric it” bookmarklet while the DOI is selected.

More generally if you have any issues installing the Bookmarklet, please refer to our Bookmarklet FAQ page.

We’d love to hear what you think of the Bookmarklet. Tweet us or e-mail us with your thoughts!
