The closer we get to crucial elections like those for Baltimore mayor and American president in 2020, the angrier I get about Facebook and what it is doing to our democracy. After years of arguing that TV was still the principal storyteller of American life and the dominant medium of political persuasion, it feels like we are at a tipping point where the dominant electoral discourse of 2020 is going to be on Facebook and other social media platforms.
From Facebook chairman and CEO Mark Zuckerberg saying in November 2016 that it was “crazy” to think Facebook influenced the presidential election, to his disingenuous argument this fall that he was actually serving freedom of expression by allowing political candidates to lie on his social media platform, Zuckerberg has become easier and easier to hate. Yeah, that’s the word: hate.
And as we now see politicians like President Trump publishing bald-faced conspiracy-theory lies about Democratic candidate Joe Biden in his Facebook ads, or Gov. Larry Hogan posting a faked version of a Baltimore Sun front page on Facebook, it is not so very hard to predict how bumper-to-bumper traffic is going to be on the low road that Zuckerberg has paved for American politicians in 2020 with his OK-to-lie-your-butts-off policy. Yet another reason to blame him for the debased state of our civic conversation.
But now comes a study from the McCombs School of Business at The University of Texas at Austin that says maybe we share in some of the blame for social media’s toxic effect on democracy as a result of the way we interact with the false information and lies on Facebook. The title of an article published last month in the peer-reviewed scholarly journal “Management Information Systems Quarterly” suggests just how complicit some of us might be: “Fake news on social media: People believe what they want to believe when it makes no sense at all.”
And that goes for people across the political spectrum, not just the folks in the MAGA hats at the Trump political rallies heckling the press corps and cheering the ugly smears Trump delivers against those he considers opponents.
A team of three researchers set out to assess whether the warning flags Facebook introduced in late 2016 actually helped. The flags were a response to fierce criticism of Facebook's practice of publishing fake news on its platform without distinguishing it from truthful information, and the warnings came from a system of third-party fact checkers that lacked transparency.
The short answer is that the flags, which Facebook dropped on its own within a year, had very little if any positive effect. But in digging deeper for reasons they didn’t work, the researchers shed significant light on the role those of us who consume political news and information on Facebook play in this devil’s dance of disinformation and lies between the platform and its followers.
First, the researchers found most people think they are much better at spotting fake news than they are. Second, even when Facebook users are told something is false, they still tend to believe it if it reinforces their biases. And third, most people come to Facebook – and social media in general – in a far less critical mindset than they do in reading a newspaper or watching a TV news program. As a result, their guard is down in thinking logically about the information. They come to social media mainly seeking pleasure, and so they engage most deeply with those stories or bits of information in their news feeds that make them feel good.
All of which makes them more susceptible to believing and clinging to information that plays to their confirmation bias, while going out of their way to avoid or ignore information, no matter how true it might be, that creates cognitive dissonance for them.
“We all believe that we are better than the average person at detecting fake news, but that’s simply not possible,” lead author Patricia Moravec said in a University of Texas article about the study. “The environment of social media and our own biases make us all much worse than we think.”
The methodology that Moravec and her colleagues, Randall K. Minas of the University of Hawaii at Manoa and Alan R. Dennis of Indiana University, employed involved 83 undergraduates who were judged to be social media proficient. They were connected to EEG monitors that measured activity in their brains as they were presented with 50 fact-based news headlines and asked to assess their credibility. The brain activity indicated cognitive dissonance when they were presented with information at odds with what they believed to be true.
Having taught media research methods for 18 years at Goucher College, I can vouch for the methodology. But what really intrigues me is the explanation and exploration in the study of the way we receive and process political information on Facebook.
It comes to us wrapped in a warm and fuzzy blanket of messages and updates from friends and family, lovable and goofy videos of pets, plus news and information selected by Facebook’s algorithms to give us pleasure. We are at our most relaxed and intellectually vulnerable as we scroll through the feed that we and the algorithms have constructed for us.
And now, Zuckerberg has given the green light to politicians to pump their worst lies, wildest conspiracy theories and most vile slanders into that soothing media stream.
We saw how Trump saturated Facebook just before the impeachment hearings with ads saying it is Democratic candidate Joe Biden – not Trump – who should be investigated for his actions in Ukraine. Trump ads claimed Biden used his office when he was vice president to withhold aid to Ukraine and get a prosecutor fired who was investigating the activities of Biden's son, Hunter, in his role on the board of a Ukrainian energy company. Trump is still trying to sell that conspiracy theory on Fox News even though an investigation of the Bidens found no wrongdoing.
The Biden campaign asked Facebook to take down the ad. Zuckerberg refused under the phony guise of his so-called commitment to the First Amendment and free speech. The ad spread like a virus through the media ecosystem, and is still out there in social media doing its dirty work on behalf of the president.
In 2018, the Republican Governors Association used the latitude given by Facebook even before the OK-to-lie-your-butts-off policy to characterize Gov. Larry Hogan’s Democratic opponent Ben Jealous in such a negative racial context that it was compared to the infamous “Willie Horton” ad in the 1988 presidential campaign of George H.W. Bush.
“The Republican Governors Association’s current ads for Larry Hogan harken back to the GOP’s worst dog-whistle campaigns,” University of Maryland Law Professor Larry Gibson wrote in a letter to the Sun.
“It’s bad enough that the ads selectively edit Democrat Ben Jealous’ words to distort his message,” Gibson’s letter continued. “What is worse about the RGA ads is how images in the commercials are intentionally distorted to give the impression that Mr. Jealous, a former national president of the NAACP, is an ‘angry black man’ with extreme political views. It’s a throwback to the Republicans’ racist ‘Willie Horton’ ad of the 1988 presidential campaign, and Ronald Reagan’s ‘welfare queen’ comments.”
I wrote a lot about Hogan’s ad campaigns and how his own team appeared to be using Facebook video to test-market imagery, ideas and text before making a big TV buy. One of the things I believe they, like the RGA, were test-marketing was how far they could go in attack ads with such depictions of Jealous.
As the 2020 election season gets nasty, look for Facebook to be the place that battle will mainly be fought. It’s cheaper and it’s micro-targeted and now, candidates can lie and smear opponents all they want. If you thought TV attack ads took political campaigning into the gutter, brace yourself for a trip straight down into and through the sewer in coming months thanks to Zuckerberg and our willingness to let him use us.
David Zurawik has been TV/media critic at The Baltimore Sun since 1989. He can be reached at firstname.lastname@example.org