Just a small percentage of Twitter users were responsible for sharing fake news during the 2016 U.S. presidential election, a new study found, but those who did were frequent Tweeters.
The study, published in the peer-reviewed journal Science, found that among more than 16,000 accounts, 1 percent of users were exposed to more than 80 percent of the fake news links prevalent on Twitter in the run-up to the 2016 election, while just 0.1 percent of users actively shared 80 percent of fake news links. Northeastern University professor David Lazer, one of the authors of the study, told VentureBeat that while he expected the sharing of fake news to be heavily concentrated among a small subset of users, the results were even more concentrated than he anticipated.
“Fake news as we define it — which is news like the content coming from websites that don’t produce accurate content — seems to be modest in scale. I wouldn’t say it’s nothing, but it seems to be a problem — at least on Twitter — in a small neighborhood,” Lazer told VentureBeat.
The researchers attempted to look only at Tweets seen and sent by real U.S. voters by linking a sample of U.S. voter registration records to Twitter accounts. They examined Tweets these users sent between August and December 2016, as well as a random sample of Tweets posted by the people they follow, to get a sense of what they were seeing in their Feeds.
The researchers used a list of 300 fake news sites, culled by fact-checkers, journalists, and academics, as well as fact-checking site Snopes, and combed through the sample of Tweets to see how many users Tweeted out links to those sites. Sites listed as purveyors of fake news included Infowars and the far-right Gateway Pundit.
Older users and conservatives were found to be more likely to spread fake news during the 2016 election. A study published last week about fake news on Facebook also found that these two demographics were more likely to share fake news.
Those who shared or were exposed to 80 percent of fake news on Twitter sent a median of 71 Tweets per day, compared to 0.1 Tweets per day sent by the median Twitter user in the sample group.
There are a few limitations of this study worth noting. One is that it counts only users who Tweeted links to articles on fake news sites; if a user mentioned the Pizzagate conspiracy theory in a Tweet but didn’t include a link to one of those sites, that wouldn’t count as spreading fake news. And though the researchers were able to see what kinds of Tweets users may have seen during the 2016 election, they couldn’t perfectly reconstruct each user’s Feed, since they had no way to replicate how Twitter algorithmically sorts Tweets.
Though the number of Twitter users who frequently share fake news is small, the fact that they see and send far more Tweets than the average user means it may be harder to break them out of the fake news echo chamber. Additionally, though most fake news is spread among a small subset of Twitter users, that doesn’t mean it can’t travel far and wide if it gets enough oxygen. Another study published in Science last year found that fake news was about 70 percent more likely to be retweeted than real news.
Lazer also stressed that fake news is just one way of looking at the broader spread of misinformation on platforms like Twitter.
“There’s a whole menagerie of information manipulation on social media, and fake news is just one animal of that menagerie … part of what I’d really like to dive into is all the more subtle ways that our perceptions are manipulated on social media,” Lazer told VentureBeat.