

Trick or tweet, or both? How social media is messing up politics

Misinformation that goes viral online could be undermining the democratic process. Is this something we can fix?

By Chris Baraniuk

12 July 2016

What will you trumpet today?

KENA BETANCUR/AFP/Getty Images

IF IT’S your job to check the kinds of things people say online, you’ll know it is hard work at the best of times. But doing it in the run-up to the US presidential election is exhausting. “It’s been ridiculous,” says journalist Brooke Binkowski at the myth-busting website snopes.com.

Since Donald Trump announced his candidacy for the Republican nomination last year, claims and counterclaims by all candidates or their supporters have been spreading at a blistering pace online, says Binkowski. “People seize on these ideas.”

“People use social media to share stories that reinforce their world view rather than challenge it”

Next week, the Republican party is expected to formally pick Trump as its candidate for president. In the race to the White House, fact checkers like Binkowski are our last defence against online misinformation. What are we up against?

Trump himself has given fact checkers plenty to do over the past eight months, making “an inordinate number” of false claims, according to Eugene Kiely at FactCheck.org. Another website, PolitiFact.com, looked into 158 claims made by Trump since the start of his campaign and found that four out of five were at best “mostly false”.

Fact checkers correct stories doing the rounds both on social media and in the mainstream press. In the hunt for the truth, they spend their days poring over interview transcripts, scouring video footage and checking references. It can be a thankless job. Rarely are Binkowski’s attempts to present the facts received sympathetically.

For example, she recently posted an article debunking claims that a crowd started shouting “English only!” at an event addressed by the Hispanic civil rights veteran Dolores Huerta. “We got hundreds of emails calling us unprofessional, saying we were biased, saying we were anti-Latina,” says Binkowski.

With roughly six in 10 US adults getting news from social media, according to a recent Pew Research survey, the issue of accuracy might seem to be ever more important. “One of the things that give social media potency to impact political views is the immediacy of it,” says Julia Shaw, a psychologist at London South Bank University. Then there is the issue of blending fact with opinion. “You might even get an opinion before the information,” says Shaw, which can colour people’s judgement.

Walter Quattrociocchi, a computational social scientist at the IMT School for Advanced Studies in Lucca, Italy, is one of a growing number of researchers concerned about this trend. He and his team have spent the last few years trawling through data from sites like Facebook. In particular, Quattrociocchi has explored how social networks can give rise to echo chambers – spaces in which groups of individuals share stories that reinforce their world view, and rarely get to see anything that challenges their beliefs.

Algorithms not guilty

This phenomenon is often blamed on the algorithms used by social media sites to filter our newsfeeds. But Quattrociocchi’s most recent study suggests it is partly down to our own behaviour. His team compared identical videos posted on Facebook and YouTube, and found that echo chambers formed around them on both sites. Since the two platforms promote content in different ways, this suggests the algorithm was not the decisive factor.

“It’s the narrative that is attracting the users, not the content,” says Quattrociocchi. “The narrative is identified by the group and the group frames the narrative.”

In his book Lies Incorporated: The world of post-truth politics, US radio host Ari Rabin-Havt talks of an industry of misinformation, although he agrees that we bring much of it on ourselves. “When people are given a choice, they’re going to choose what’s comforting and easy for them,” he says. “They’re going to avoid information that challenges them and therefore get stuck in echo chambers.”

And since people only hear what they want to hear, it isn’t straightforward to counter falsehoods spreading online. Shaw says that politicians exploit our willingness to remember something that appeals to us, regardless of whether it will eventually prove unfounded.

“Trump, for example, consistently says things that are demonstrably untrue and then takes them back,” she says. “He is getting people to believe things and relying on them to forget that he, or someone else, may correct it later on,” says Shaw.

What’s more, any sharing of the results of fact-checking typically lags the misinformation by 10 to 20 hours, according to a recent study by Chengcheng Shao at the National University of Defense Technology in Changsha, China, and colleagues.

Still, there are those who challenge the idea that online media has dramatically sidelined truth from politics. After all, the popularity of conspiracy theories is nothing new.

“Many of the same things were happening before Facebook,” says David Lazer, a computer and political scientist at Northeastern University in Boston. “I have not seen a compelling answer to whether this has really changed.”

Binkowski thinks otherwise. “There’s something about this perfect storm of identity politics plus the internet,” she says.

“What the post-truth era allows is for politicians to get away with it with no consequence,” says Rabin-Havt. It’s all just part of politics – but the web speeds everything up.

Even if the truth is more of a hard sell than ever, Binkowski says it’s worth it if snopes.com’s efforts to set the record straight reach just 1 per cent of people. Kiely at FactCheck hasn’t lost hope either. “We’re seeing huge spikes in our traffic,” he says.

Biased bots

It isn’t just other people’s claims on social media that we should be wary of. Misinformation is increasingly circulating via social media accounts run by bots. Political bots were particularly active prior to the UK’s European Union referendum, for example.

A recent analysis by staff at the investigative website sadbottrue.com found that Trump has retweeted bots 150 times. They also claim that a recent Hillary Clinton tweet, in which she invited Trump to delete his Twitter account, was quickly retweeted by many bots.

Emilio Ferrara, a computer scientist at the University of Southern California in Los Angeles, thinks that political bots could influence the outcome of elections – and that this has been going on for several years. “We suspect bots were involved in spreading some form of misinformation or in some cases very explicit smear campaigns during the 2012 [presidential] election – on both sides,” he says.

This article appeared in print under the headline “Trick or tweet? Or both?”

Correction, 15 July 2016: The description of the results of the Pew Research survey has been amended since this article was first published.
