Kurt Withrow, a former insurance salesman from Palatine, Illinois, didn’t mean to become one of the thousands of people who helped propel forward one of the most persistent fake news stories of the last year. But in January, a story making its way around Facebook caught his eye.
The headline, on many versions of the fabricated story, blared: “BREAKING: First Full Supreme Court Ruling In Over A Year Has Obama FURIOUS.”
Withrow read on about how, “Just last week the United States Supreme Court issued a direct and final blow to the Islamic Indoctrination of the young in this nation.” With a “tie-breaking vote” from Justice Neil Gorsuch, the court handed down a decision “banning Sharia Law and Islam from being taught in classrooms,” the story claimed.
Enthused by a decision he thought long overdue, Withrow copy-and-pasted chunks of the story into a post on Facebook, and added a bit of his own commentary. “This is the kind of story everyone SHOULD be hearing or reading in the media… but obviously it is not,” he wrote. A photo of Gorsuch shaking hands with President Trump appeared beside the text.
One of Withrow’s friends shared the post, a handful of others threw it a like or heart, and a couple more chimed in with criticisms. But none of Withrow’s Facebook friends — he currently has nearly 800 — questioned whether the claimed fact, that the Supreme Court had issued a ruling on teaching Islam in schools, was true.
“I’m very bothered now because I do try to discern stuff like this,” Withrow said. “It’s hard to know what to trust.”
But like Withrow, many other people — fueled by powerful social dynamics and psychological forces — also trusted this story, to the point that they forwarded it to their family and friends, posted it to their social media feeds, and discussed this non-existent court ruling on message boards.
Months later, Withrow told POLITICO that he could not remember what site the story had originally come from, though he thought it might have been The New York Times, which he reads regularly, along with The Washington Post and some conservative outlets, like Judicial Watch. “I was surprised actually because typically they are liberal,” he said, before conceding that maybe it wasn’t the Times, but something mixed in with all the other stories on his newsfeed.
The story never appeared in the Times. It was initially published in April 2017 on a purported satire site, but many seem to have missed the joke: Almost immediately, the story went viral, boosted by pro-Trump blogs, news aggregators, and shady clickbait sites that copied the original post verbatim or rewrote their own versions with minor tweaks. Withrow encountered it nine months after it was first published and, even today, more than a year later, despite multiple fact-check debunkings, the story continues to circulate through social media and email-forward chains.
In years past, it was easier. News was handed down from on high, for better or worse, with editors in newspaper offices and at TV and radio stations deciding what did and didn't merit attention. But with the decline of establishment media (including local media, which has been particularly devastated by the industry's shifting economics), Americans are looking elsewhere for their news, which increasingly means Facebook and other social media platforms.
Whereas the editors at the local paper might have selected a story because they thought it provided important information, people share stories on Facebook and Twitter for very different reasons, said Daniel Kreiss, a professor at the University of North Carolina’s school of journalism.
“It’s a much more social than cognitive process,” he said. “People don’t often share the really deeply sourced informational content.”
So what exactly is it that compels people to share one story, but not another? Or to want to believe a news item that, given a moment’s thought, is obviously false? In order to understand the impulses that propel a story forward, this POLITICO analysis breaks down the mechanics of a fake news story from stem to stern — how it was created, how it spread and what it says about the way people interact, as well as the future of news.
***
This particular false story started not in Russia or Macedonia, but in Maine, as the work of a self-proclaimed satirist.
And while the distinction between false and legitimate news can sometimes be subtle or subject to interpretation, Christopher Blair admits that this story, like all the ones he writes for his website, America’s Last Line of Defense, is indisputably false.
Blair describes himself as a liberal troll, out to antagonize conservatives. But his work is so voluminous, and so consistently not interpreted as satire, that the fact-checking site Snopes has a tag and archive devoted just to stories from it, with more than 35 entries.
He published this story on the Supreme Court ruling on April 11, 2017, a few days after Gorsuch was sworn in. His stories are often riddled with syntax errors and typos — the types of mistakes that might signal Last Line of Defense is not exactly a professional news operation. For instance, in this story, clearly meaning to include the word “not,” he quoted Gorsuch as writing, “We should [sic] be teaching any religions in this country besides standard Judeo-Christianity, as our founders wanted.”
But those errors tend not to matter, as readers usually don’t pause to consider a site’s credibility.
After all, even if visitors to Blair's site miss that it is supposedly satire, a quick web search would have shown that this particular story about Islam and the Supreme Court was untrue: Gorsuch didn't hear his first case until April 17 and didn't write any opinions until June. The post also claimed the Obama administration had previously won the case before "the 17th district court," but no such federal court exists.
Basic knowledge of the First Amendment's separation of church and state also might have been an immediate tip-off. But still, the story spread like wildfire. Dozens of sites shared the story with their readers, as recently as this April. (Some of those sites deleted their posts when contacted by POLITICO.)
According to Talia Stroud, a communications professor at the University of Texas at Austin, people have so much information flying at them so quickly now, and are so often pressed for time, that it’s difficult to always consider whether or not a story is true.
She said that people “have more gut reactions and they’re more likely to make these snap partisan judgments rather than engaging in a critical process that might lead them to say, wait, maybe there’s something going on here.”
POLITICO reached out to several people who shared the story.
One Twitter user, Lucy, who goes by @LucyForLiberty, told POLITICO that she would not have shared the story last month if she had realized it was fake. “Sometimes you just slip up,” she said. “Usually I verify. Busy month.”
Blake Gardner, an artist from Clayton, Georgia, said he shared the story on Facebook because the news of the ruling thrilled him. He said it was the type of decision that he and “millions” of others had been hoping to see.
“Greater than even we expected! Wonderful news,” one of Gardner’s friends commented at the time.
For Garry Jenkin, a marketing consultant in Dallas, the motivation was what he sees as the dangers of Islam.
“As more Muslims run for political offices, I feel obliged to follow the trends,” said Jenkin, who shared the story on Twitter in January.
What Gardner, Jenkin and thousands of others like them who shared the story have in common is this: it pricked their emotions.
In October, Ahmed Al-Rawi, a communications professor at Concordia University, authored a study showing that people are more likely to share stories that make them happy. That would include the response of Gardner, who was excited by the ruling. Meanwhile, other studies have found that negative feelings, like anger and anxiety, also drive sharing. That was the case with Jenkin, who was scared of “radical Islam.”
Jonah Berger, a marketing professor at the University of Pennsylvania’s Wharton School, co-authored a 2011 study that analyzed the sharing patterns of three months’ worth of New York Times stories and found that stories that evoke “high-arousal” emotions, whether positive or negative, drive virality.
That study analyzed which Times stories were most emailed, but Berger said that similar dynamics hold for sharing stories over social media.
“The most interesting thing to me about why we share fake news is that it’s exactly the same reason that we share regular news,” Berger said.
The question, he said, is, “What does it say about me to share this thing? Just like the car we drive and the clothes we wear says something about us.”
Such motivations applied in earlier decades, too, when a person might clip an interesting piece from the newspaper and send it to a friend. But the Internet has opened up all kinds of opportunities for those feelings to be exploited. That's why, in part, false stories that strike at divisive social issues or prejudices are able to spread so widely. This particular false story, which plays on Islamophobia, is a case in point.
“There’s at least a portion of our country that is very anxious about Muslims,” Berger said. “They’re sharing it because of that anxiety and to hopefully resolve that feeling of anxiety.”
Patrick Van Roy, a blogger in Pennsylvania, said that whenever he posts something, he is highly conscious of how it will make him appear to friends and followers. In a way, it’s no different than someone carrying an NPR tote bag on the subway, or wearing a Breitbart T-shirt down the street.
“I am trying to build a rep as hard-nose constitutional conservative,” Van Roy said, explaining why he uses Twitter in the first place. “I am not a Trump bot, or a GOP bot.”
Van Roy shared another article from Last Line of Defense, also about Islam and the Supreme Court, before vetting it. To some degree, he was just interested in seeing how his followers would react to it.
“I wanted to see what bounced back when it was tweeted,” he said.
Once one member of a social network has set a link in motion, it spreads easily among other people the original poster is connected to. “There’s some research showing that if your friends are endorsing any piece of content, you’re more likely to read it,” Stroud said.
***
In the aftermath of the 2016 election, Facebook embarked on an effort to stop false stories from spreading on its platform. That included attempting to disrupt the financial incentives for makers of fake news, changing its newsfeed algorithms, and partnering with fact-checking groups, like Snopes, FactCheck.org, PolitiFact, the Associated Press and others, to address stories circulating on the platform.
But if a lie can travel halfway around the world before the truth gets out of bed, a fake news story can take several more spins and stop for a long weekend at the beach before it gets fact-checked. Though this false story was published on April 11, Snopes didn't publish its fact check until April 17, while FactCheck.org and the AP both weighed in on April 21.
Saranac Hale Spencer, a fact-checker at FactCheck.org, has dealt with other similar stories originating from Last Line of Defense. She explained the process: first, the fact-checkers see a story surface in their Facebook interface, which means it is probably already circulating widely. Out of hundreds of options, she said, they try to select stories at the right nexus of popularity and importance to debunk. Once they decide to perform a check, even for stories that are obviously false, sites like hers strive for airtight work, which takes time.
“We have a rigorous editing process,” Spencer said. “It does take usually a couple of days from start to finish.”
“The thing is, it takes almost no time to write this stuff, you can slap down whatever fiction you want in half-an-hour,” Spencer said. “To go through and show why each claim is false, it takes a long time.”
Part of the value of the fact-check is that it gets entered into Facebook’s machine-learning algorithms, to help train them to recognize and minimize viewership for similar stories. But even after all the fact checks, this story continued to move. Most of the people who spoke with POLITICO saw and shared the story after the debunker articles were published.
One problem is that it’s common for other fake-news websites to copy and paste false news stories wholesale and run them as their own.
"It's a giant game of whack-a-mole," Spencer said. "It is frustrating, but that's the nature of fake news."
On April 17, 2017, the same day Snopes published its fact check, for instance, a far-right site called Freedom Daily, which PolitiFact deemed a “fake news site,” reblogged the story, including the bogus Gorsuch quotes.
“God Bless our new Supreme Court Justice Neil Gorsuch and our President Donald J. Trump!” wrote one of their bloggers, Al Waisman, adding “this should have been a unanimous decision, not 5 to 4.”
Freedom Daily took the post down after POLITICO reached out to Waisman, who did not respond to inquiries. The site has since shut down entirely.
But the Freedom Daily post was shared more than 60,000 times on social media before it was taken down, according to an archived version. It also spread via such popular Facebook groups as “United American Patriots,” which has more than 1 million followers.
It also prompted conspiracy theories about why the mainstream media ignored this apparently important piece of news, a view consistent with President Donald Trump’s contention that mainstream sources are biased.
"Did anyone see anything about this on any news channel? I didn't," one follower noted in the comments, his skepticism drowned out by the thumbs-ups and comments celebrating a ruling that didn't exist. Nearly 500 people shared the post from "United American Patriots."
On April 19, 2017, another 2,000 people shared a version of the nonsense story posted to an unofficial Facebook fan page for Rep. Trey Gowdy (R-S.C.), which has more than 200,000 followers.
“EXACTLY why we need conservative judges on the bench,” wrote one woman who shared it from the Gowdy fan page post.
Those who commented that the story seemed implausible were shouted down.
“Snopes is hardly reliable,” replied one commenter when someone linked to a debunker. Distrust of fact-checking websites like Snopes and PolitiFact was a common refrain among commenters and those who spoke with POLITICO.
Earl Copeland, a former staffer of Sen. Tim Scott (R-S.C.), weighed in on the Gowdy page, trying in vain to warn others against believing the story. In reply, an acquaintance noted with concern that one of the page's administrators had to approve the article before it was posted.
“I think that folks who embellish or lie about events actually hurt the cause that they think that they are helping,” Copeland said in an email to POLITICO.
***
Still today, more than 12 months after it was published, the story continues to circulate in various forms, including in email-forward chains, having virtually become part of the online right-wing bloodstream.
This is the explicit goal of Blair and his satire site, America's Last Line of Defense: to take advantage of the very emotions that make people likely to believe and rebroadcast a story in order to inject ludicrous claims into conservative social media circles. More often, though, the result has been to create a runaway freight train of a fake news story.
Blair, also known by his handle, Busta Troll, declined an interview request from POLITICO but said in a statement that he believes he actually has a positive impact.
“In the world of ‘fake news’ there’s this one little anomaly that has done a world of good but that narrative doesn’t fit with what the liberal base will click, therefore I’ve been lumped in with Macedonians, Freedom Daily and [DisInfoMedia founder] Jestin Coler, all of whom sold their garbage as real,” Blair wrote by email.
In a recent interview with the Boston Globe, Blair said he and his writing partner often build their posts around conspiracy theories already spreading among conservatives, effectively tapping into the confirmation bias that can make false stories seem true.
Blake Gardner, the artist from Georgia who shared the story, said that he hoped it would die out, but he hasn’t deleted his post. This wasn’t the first time he’d shared a false story: he’d been “burned a few times by posting false or exaggerated info,” he said.
Having been fooled himself, Gardner said he understands why people continue to spread the claim. “Out of hope, and fear,” he said, “they keep finding and posting it.”