Facebook founder and CEO Mark Zuckerberg recently stepped in it, as he is wont to do, in an interview with Recode in which he was asked about the spread of misinformation on his platform. In responding to a question about Sandy Hook conspiracy theorists’ use of social media, Zuckerberg made an unprompted defense of Holocaust deniers’ right to be heard.
“I’m Jewish, and there’s a set of people who deny that the Holocaust happened,” he said, apparently as an example. “I find that deeply offensive. But at the end of the day, I don’t believe that our platform should take that down because I think there are things that different people get wrong,” he continued, bizarrely adding: “I don’t think that they’re intentionally getting it wrong.”
Predictably, Zuckerberg was pilloried for a stance that was tone deaf at best and naive at worst. The notion that Holocaust deniers are merely “getting it wrong” and not being willfully dishonest drew widespread criticism, prompting Zuckerberg to clarify his remarks the next day, saying: “I personally find Holocaust denial deeply offensive, and I absolutely didn’t intend to defend the intent of people who deny that.”
Let’s be honest: the 34-year-old billionaire is an easy target for derision. Just observe the widespread schadenfreude Wednesday when Facebook’s stock fell some 24 percent in extended trading, wiping out about $17 billion of Zuckerberg’s net worth. But as awkward as his comments were, it’s understandable that the CEO of the world’s largest social media platform would be reluctant to limit the range of thought on his platform.
Facebook, like any website, does have a responsibility to prevent abuse and other crimes. But purging all dishonest content from such a large forum would be a far more difficult and subjective task, and even if it succeeded, it would amount to a top-down fix for the problem of a gullible public.
To be sure, Facebook is rife with misinformation. A 2016 analysis of Facebook engagement statistics by BuzzFeed showed the disturbing reach of some of that year’s most notorious — and most easily debunked — fake news stories. “Pope Francis Shocks World, Endorses Donald Trump” led the way with 966,000 engagements, while “ISIS Leader Calls for American Muslim Voters to Support Hillary Clinton” got 522,000.
The fact that so many people are willing to fall for fictional narratives that confirm their preconceived notions of the world would remain a problem even if Facebook could somehow expel every last huckster from the site. The solution isn’t censorship, but rather teaching media literacy — and depriving liars of a credulous audience.
To this end, legislators in Italy added an experimental media literacy program to the curriculum in October. The program teaches students how to spot suspicious URLs, sniff out hoaxes and verify facts for themselves. It’s a novel concept worth adopting in U.S. schools. But adults, too, need to adopt a more rigorous skepticism when using social media — and understand that the creators of online content aren’t always as benign as one might assume.
Facebook, to its credit, has introduced a few tools to determine whether malevolent forces have been targeting you. The company compiled a list of pages from a Kremlin-linked propaganda outfit called the Internet Research Agency, and provided a tool letting users know whether they interacted with any during the 2016 presidential campaign. Facebook has also started appending fact-check links right below the most egregiously false stories being shared in its news feed.
Every news story you read online should raise a few questions: Who is behind it? Do they have a reputation to uphold? If they get it wrong, do they set the record straight later? This newspaper has been part of the community for more than 125 years, and when we err, we print corrections. We aren’t perfect, but we take our responsibility and commitment to truth seriously.
The spread of propaganda on social media is merely a symptom; the malaise itself is a public that is too easily fooled. And it’s both unreasonable and unwise to expect Facebook’s programmers to cure us.