The New York Times ran an article yesterday about mounting pressure on Facebook to censor websites full of awful misogynistic material. The company said it was reviewing its processes for dealing with content under its hate speech policy.
As free speech and internet freedom advocates, what are we to make of this story? It seems to me that much of the ambiguity in these cases arises because there are two separate, incommensurable frames or paradigms through which we can interpret the situation.
In many respects, Facebook is a publisher like the New York Times, and like any editor it has the right to exercise discretion in what it publishes, or allows others to publish in its pages. Within that framework (which is undoubtedly the legal framework that currently applies), those who care about internet freedom and free expression can relax, confident in the knowledge that whatever Facebook decides to do, all is well. In fact, from this point of view, we might note that social pressure against awful points of view is a perfectly acceptable means of restricting expression. Indeed, while we don't want the government censoring racist, misogynist, homophobic, or other hate speech, it is perfectly legitimate—desirable even—for citizens to express their social disapproval of such speech, and to socially sanction those who use it. As a publisher, and as a participant in the raucous conversation of public life, Facebook is entirely within its rights to act against such antisocial material.
But Facebook can also be viewed through another lens. Facebook is in charge of a sprawling empire—a vast realm in which hundreds of millions of people vehemently express themselves and their opinions about the full range of human activity. It has to manage this swarming hive of activity, impose a minimum degree of order, and ensure that everything runs smoothly. It has the power to set rules, and to enforce them. In short, when it comes to the vast realm it oversees, Facebook is a government.
Viewed as a government, Facebook is almost an experiment in political science. What happens when a realm of human endeavor is governed by a near-absolute ruler that can set any rule, squelch any speech, expel any "citizen" for any reason, with only the due process protections it sees fit to spend money on?
Inevitably, without the democratic checks and balances, pressure valves, and escape mechanisms that allow people to act when they are unhappy, frustrations build, and privately run online worlds, like undemocratic countries, can be susceptible to social unrest and instability.
At the same time, the absolute nature of Facebook's power over its realm can actually help when it comes to one danger that affects democratic governments in particular: the tyranny of the majority. As our Founding Fathers were well aware, the democratic passions of the majority, if not checked by a strong judiciary, can themselves lead to the trampling of rights. Unpopular points of view are especially susceptible. If Facebook members could vote, there is no question that a wide variety of speech would be disallowed. So Facebook plays the role not just of the executive and legislative branches, but also of the judiciary.
Like any government, Facebook also has an interest in separating itself from responsibility for bad things. In the case of Facebook, "bad things" generally means bad speech. As I've written about before, once companies open the door to any censorship, they open themselves to blame and recriminations not only for anything they choose to censor, but also for anything they choose not to censor. And soon they are embroiled in pressures and counter-pressures over what to allow and what to forbid. If they declare their forum a free speech zone, on the other hand, they can credibly disclaim responsibility for anything that is said there.
I think Facebook recognizes this, and so has been commendably resistant to many calls for censorship, which are legion. In addition to the controversy over misogynistic sites, for example, there are ongoing campaigns to remove various racist sites and sites that glorify animal cruelty. Despite these pressures, Facebook often seems hesitant to play the role of censor.
I suspect that Facebook also recognizes that hosting a lively, raucous forum where passionate debates rage is much healthier not only for speech but also for the bottom line of a company that sells page views. As outrage over various awful material builds and gets forwarded and circulated around, Facebook benefits not only from the additional page views but also from the general passion, engagement, and simple attention it generates among its users. For Facebook, boring is bad and anything else is good.
At the end of the day, however, Facebook is not an absolute potentate; it is a for-profit company dependent on advertising revenue, and that gives advertisers power over it. And behind the advertisers stand consumers. So though Facebook has been admirably restrained in exercising its plenipotentiary powers over its little world, ultimately unpopular speakers within that world are susceptible to an attenuated version of the tyranny of the majority.
I think that over the long term, Facebook's best course is to absorb some short-term pain and stick fast to the principle that its forum shall be one where speech is free. Eventually people will learn that on Facebook, as in a public forum run by the real government, censorship of speech is just not an option, and will direct their outrage where it belongs: against the speaker, not against the forum.
Update: A colleague points me to this recent piece by Jeffrey Rosen detailing the struggles of Facebook and other companies to deal with content issues, and arguing, as I have done, that it is in their interest to maintain a strong pro-speech stance.