
How Should Facebook (and Twitter, and YouTube, and...) Decide What Speech To Allow?

• Reason - Nick Gillespie

Everywhere you turned in 2018, Facebook, Twitter, and other social media platforms were in the news for policing speech in ways that either delighted or infuriated users. YouTube refused to host certain sorts of videos altogether and "demonetized" others (meaning the channels couldn't run ads and earn revenue). Patreon, a service that allows people to pay creators directly, recently deplatformed Sargon of Akkad, a controversial anti-feminist, which sparked a public exodus by a number of "Intellectual Dark Web" folks, such as Jordan Peterson, Sam Harris, and Dave Rubin.

As a legal and practical matter, there seems to be no question that such services are free to disallow pretty much whatever content they choose. Earlier in the year, YouTube (owned by Google, which is in turn part of Alphabet) won a lawsuit brought by Prager U, which charged that the site was minimizing the reach of conservative points of view, if not outright censoring them. The crux of that case turned on whether YouTube should be treated as the equivalent of a government-licensed broadcast radio or television network and thus have to provide equal distribution to all participants. The ruling was unequivocal that YouTube (and, by extension, other social media services) is a private business. From The Hollywood Reporter's writeup of the ruling:

Since the First Amendment free speech guarantee guards against abridgment by a government, the big question for U.S. District Court Judge Lucy Koh is whether YouTube has become the functional equivalent of a "public forum" run by a "state actor" requiring legal intervention over a constitutional violation.

Koh agrees with Google that it hasn't been sufficiently alleged that YouTube is a state actor as opposed to a private party.

"Plaintiff does not point to any persuasive authority to support the notion that Defendants, by creating a 'video-sharing website' and subsequently restricting access to certain videos that are uploaded on that website, have somehow engaged in one of the 'very few' functions that were traditionally 'exclusively reserved to the State,'" she writes. "Instead, Plaintiff emphasizes that Defendants hold YouTube out 'as a public forum dedicated to freedom of expression to all' and argues that 'a private property owner who operates its property as a public forum for speech is subject to judicial scrutiny under the First Amendment.'"

That settles the large legal issue: The platforms can decide what stays and what goes. But most peeks into how they actually make those decisions are troubling. In August, The New York Times sat in with Twitter's "safety team" as it wrestled with banning Alex Jones and Infowars. (They eventually got bounced, albeit later than from Facebook, YouTube, and Spotify.) All agreed that "dehumanizing language" should not be tolerated, but the devil is in the details; accounts often get suspended or banned in ways that seem arbitrary or simply wrong. A few days ago, the Times reported on "Facebook's secret rule book for global political speech." The platform has about 7,500 moderators who make these decisions, often about situations they know little about, and often relying on autotranslation services because they don't speak the languages involved.