Identifying fake news is less science than alchemy, a task residing somewhere between asking “What is art?” and “What is beauty?” In other words, fake news is largely, as the aphorism goes, in the eye of the beholder.
Though used excessively by Democrats and Republicans alike, the term itself is nothing more than a worthless catchall for content we might find objectionable for any number of reasons. Does the content seem biased and unfair against “our guy”? Fake news. Is the content from a news source we don’t like? Fake news. Did Jim Acosta file the story? Oh, definitely fake news.
Clearly there are examples of objectively fake news, where an outlet (or blog) intentionally publishes misinformation or fraudulently masks its legal identity. In the case of the former, which is rare in the overall context of “fake news,” it is difficult, if not impossible, to prove the intent was malicious rather than the product of shoddy journalism. As for the latter, the truly fraudulent masking of news sources, legal channels already exist to address the problem.
However, Facebook’s recent announcement that it is recommitting itself to fighting this particular scourge brings to mind not so much a methodical, scientific endeavor that will reveal an objective truth, but rather an exercise in digital alchemy premised on what is revealed to the eye of the employee-beholder.
The real-world results have revealed this dilemma. For example, Facebook CEO Mark Zuckerberg got into hot water last week for off-handedly musing that Sandy Hook mass-shooting conspiracies would be removed, but perhaps not content questioning the Holocaust. Then Facebook took flak for removing an ad featuring a centuries-old painting of Christ because of “nudity.” Both Zuckerberg’s statement and the ad’s removal were likely honest mistakes, but they illustrate the next-to-impossible challenge facing Facebook and other social media sites when attempting to censor content perceived as fake or objectionable.
The new content standard Zuckerberg set this week, targeting content that “may lead to violence” (emphasis added), does not make the company’s task any easier; if anything, it adds another layer of subjectivity to be deciphered. Take, for instance, a recent tweet by the liberal activist group MoveOn.org (yes, they’re still around), which branded low wages “violence.” Would a post debunking the so-called benefits of minimum wage laws then be categorized as violence? Would an algorithm know the difference, and should it? Would a human analyst with the same political bias as MoveOn?
Which is fake news? Vox? Stokes? Both? Neither? Add to this that much gun-related content has already been pulled by social media platforms, including YouTube, and the problem becomes distressingly murky. Would the result of such an exercise, then, be that Vox is allowed to post its content while rebuttals are removed because they include traditional conservative alternatives to gun control, such as concealed carry to lethally stop mass shooters? It is doubtful even Zuckerberg could say for sure.
Therein lies the slippery slope of social media censorship. Even if well intentioned, algorithms designed to police subjective content can do no better than the humans who have already proven themselves either too biased or too ignorant to handle such a responsibility. At some point, as was already an issue before the fake news fervor, good content gets removed, bad content continues to flourish, and many users decide it is simply not worth the hassle.
This is the other challenge for Zuckerberg: as a private company, Facebook may face considerable business impact from such decisions. For example, producers of the gun content once hosted on YouTube did not suddenly change their content to get back online, or stop posting altogether; they moved to a pornography site because of its more open content policy. Is losing Facebook users, and spending millions on staff and engineering to win an unwinnable fight, what Zuckerberg “the business leader” really wants? At the end of the day, probably not; yet the desire to be seen as “doing something” appears hard to resist.