
Nopparat Khokthong / Shutterstock
By Jeffrey Brainard
To spread the word about a new study, some scientists take to Twitter to share a link. But tweeted links rarely draw attention to papers, a recent study finds.
A review of 1.1 million Twitter links to scientific papers found that half received no clicks at all, and another 22% drew only one or two. Only about 10% of links received more than 10 clicks, according to the Jan. 23 survey in the Journal of the Association for Information Science and Technology.
Such lean click-through rates are not unusual, other studies of Twitter have found. Tweets linking to news media stories don't fare much better on average. But although most of the research articles included in the new study drew no clicks, a small minority went viral: an article about freshwater fish contaminated with radioactive cesium released during the 2011 Fukushima nuclear disaster received more than 25,000 clicks.
The Twitter study broke new ground as one of the first to measure how users of the social media platform respond to tweeted scientific articles using a metric other than clicks on the like or retweet buttons. Other research had shown that, because of Twitter's 280-character limit per tweet, many tweets about papers display only the paper's title; as a result, a like or retweet may represent no more than a cursory gesture of interest based on limited information. Clicking a link, by contrast, provides a sign, though not proof, that someone has actually read the paper, say the authors of the new study, which was led by Zhichao Fang of Leiden University.
Because of technical limitations, the researchers could only examine links to articles published between 2012 and 2017. They therefore have no data on whether clicks have increased since the start of the COVID-19 pandemic, when many researchers turned to Twitter to share and comment on papers.
Another data limitation: The team examined only the subset of links created by the bit.ly link-shortening service, which lets social media users compress long URLs. About 15% of all tweets during the study period contained bit.ly links. (Twitter introduced its own link-shortening feature in 2017, but does not make data about clicks on those links generally available.)
Even with these limitations, Fang's team found that the most popular tweets, as measured by clicks and likes, don't seem to have much impact on later scholarship. Papers mentioned in popular tweets, for example, didn't receive noticeably more citations. This may reflect the fact that tweets are usually posted quickly and with little deliberation, whereas citations are chosen after careful consideration, other research suggests. "Science and social media fit into two different spaces of engagement," says Rodrigo Costas Comesana of Leiden University, who co-authored the new study. "Each of them has its own rules."
Costas Comesana and Fang say that if Twitter were willing to provide more data on links, it could help them better understand why scientists, and non-scientists, click on some tweeted articles and not others. For example, they wonder whether the fame of the tweeter, or the journal in which a paper is published, makes a difference. (Using the existing data, the research team couldn't determine how many of the people clicking links were scientists.)
The new study adds to our understanding of how science is communicated on Twitter, says Nicolás Robinson-García of the University of Granada, who was not involved in the work. He and his colleagues published a separate analysis in 2017 that found it "impossible" to use the content of a tweet alone to "conclude that there was any kind of engagement with the paper itself," he noted in an email. By contrast, he wrote, examining link clicks can provide a clearer, though not complete, picture of what users are doing.
Robinson-García’s own work suggests Twitter is not an effective medium for catalyzing meaningful, sustained discussion of new findings. The 2017 analysis he conducted with colleagues examined the content of 8,247 tweets referring to 4,358 articles published in dental journals. They found that many tweets were simply retweets or duplicates sent from the same account, some likely posted by bots. Only 6% of the tweets, which came from just 1% of the Twitter accounts studied, showed evidence that the tweeter had read the paper, such as comments in the tweet about the article's conclusions or other aspects.
It would be interesting, they wrote, “to identify the tweets and accounts that are truly informative, relevant, and indicative of research reception and discussion.”