
The Oversight Board’s Trump Decision Highlights Problems with Facebook’s Practices

Donald Trump's Facebook profile on a web browser.
The ACLU believes that political speech deserves the greatest protection to ensure the functioning of our democratic system. We don’t want Mark Zuckerberg making these important decisions alone.
Kate Ruane,
Former Senior Legislative Counsel,
ACLU
Vera Eidelman,
Staff Attorney,
ACLU Speech, Privacy, and Technology Project
Jennifer Stisa Granick,
Surveillance and Cybersecurity Counsel, ACLU Speech, Privacy, and Technology Project
May 6, 2021

The ACLU condemns the bald-faced lies that President Trump repeatedly propounded after decisively losing the Nov. 3 election, and we called for his impeachment for his concerted effort to subvert our democratic process, leading to the Jan. 6 assault on the U.S. Capitol. We also recognize that Facebook is a private entity with its own First Amendment rights to control the content it publishes. But Facebook’s decision to ban Trump nonetheless illustrates serious shortcomings in its content-related decision making — as its own Oversight Board (OB) properly declared yesterday in reviewing the decision. Facebook exercises quasi-monopoly power over a critical forum in our marketplace of ideas, and for many of the same reasons that we would be suspicious of a central government authority controlling what can and cannot be said, we have concerns with Facebook exercising such unchecked power.

The OB ruled that Facebook’s initial decision to suspend former President Trump’s account for 24 hours on Jan. 6, 2021, after the attack on the Capitol, was proper, but that its subsequent decision to suspend his account indefinitely — a sanction that is not mentioned in Facebook’s policies — was inappropriate. The board put the decision about a permanent ban back in Facebook’s hands, to be made in the next six months according to the rules the company applies to other users.

As pernicious as Trump’s speech was, the decisions by Facebook and other social media companies to remove Trump from their platforms highlight the immense power these corporations wield over our collective ability to speak online. For the foreseeable future, Facebook, Twitter, and a handful of other corporations will make some of the most consequential decisions affecting free expression. They present themselves as platforms for free speech rather than edited or curated newspapers. But historically they have failed to apply their own rules consistently, equitably, and transparently, or to adhere to basic notions of fair process in how they exercise the awesome power to decide what gets published on, and who can access, their forums.

Perfectly consistent content moderation is impossible in light of the scale at which these platforms operate. But Facebook’s failure to abide by basic principles of fairness and transparency is unacceptable given the influence it exerts over our national debate. Facebook and similar platforms should err on the side of free expression, not censorship, while also offering users direct control over the content they see. Facebook effectively determines the boundaries of political speech for billions of users, even as it remains beholden to its bottom line, not the public interest.

In an attempt to add some accountability and transparency, Facebook created an Oversight Board to help it review hard questions regarding content moderation. But the company still holds too much unaccountable power over the process. The OB rightly highlighted many concerns with Facebook policies and practices that we share, but the decision also leaves crucial questions unanswered. Below we break down our take on this issue and the board’s decision.

What does the ACLU think of the Oversight Board’s decision?

Facebook’s initial decision to suspend Trump’s account for a defined and limited time, and the OB’s decision to uphold it, are understandable in light of the events of Jan. 6 and Trump’s part in spreading outright lies about the electoral process in the weeks and days leading up to those events. But the rule Facebook claimed to apply here — its community standard prohibiting “praise and support of dangerous individuals and organizations” — is too vague, and its application in this case offers little clarity. That standard, which Facebook explains is meant “to prevent and disrupt real-world harm,” bans those who “proclaim a violent mission or are engaged in violence” from the platform, including those engaged in “terrorist activity,” “organized hate,” and “organized violence or criminal activity.” It also bans content that “expresses support or praise for [the people and organizations] involved in these activities.”

That’s a disturbingly nebulous and far-reaching standard. Indeed, it’s worth keeping in mind that on Jan. 6, Facebook also banned “calls for protests – even peaceful ones — if they violate the curfew in D.C.” It’s not hard to imagine Facebook’s rule against “organized ... criminal activity” getting misapplied to any plans for protests after curfews, whether in Kenosha, Wisconsin, last summer or in Elizabeth City, North Carolina, today.

In addition, as the board’s lengthy opinion makes clear, when assessing the potential for speech to cause “real-world harm,” context matters. Words typed on a screen are often not enough to stoke “real-world harm” on their own, nor do they suffice to assess likely impact, yet that is often all that Facebook relies upon. In this week’s decision, the board properly calls on Facebook to consider context when assessing “issues of causality and the probability and imminence of harm” for posts by politicians and other “influential users.” We call on Facebook to consider context for all users.

As the board also properly noted, the penalty of indefinite suspension raises concerns. Unlike removing content, suspending an account for a limited period of time, or removing an account entirely, “indefinite suspension” appears nowhere in Facebook’s own rules. Facebook needs to make clear to users when, how, and according to what standards the company will indefinitely suspend accounts — particularly given that such a blunt tool removes a speaker from the platform entirely rather than focusing on specific content that violates policies in a more tailored way. Again, we condemn the pernicious, baseless, and demonstrably false statements Trump often made, but the issue here is bigger than Donald Trump.

Should there be special rules for political figures?

The ACLU believes that political speech deserves the greatest protection to ensure the functioning of our democratic system. We have parted company with other advocacy organizations that have been more willing to limit the speech of political leaders on social media platforms. The ACLU believes that the speech of former President Trump should be presumed important to the functioning of our democratic system given his prior role in government. Most of what politicians and political leaders say is, by definition, newsworthy, and can at times have legal or political consequences. While their words may have greater capacity for harm, there is also a greater public interest in having access to their speech. For example, courts considered President Trump’s tweets as evidence in several challenges to his official acts, including the transgender military ban and the Muslim ban. Given the importance of protecting political speech by political figures, Facebook’s primary recourse should be striking discrete statements by President Trump that run afoul of its standards, rather than imposing a lifetime, outright ban.

At a minimum, statements of political leaders relate to government transparency. We agree with the OB that if Facebook decides to censor a public official, the company should have a consistent plan in place for preserving the offending speech for transparency, accountability, research, and historical record purposes. In addition, Facebook should publicly explain its rules for removing posts and accounts of political figures. And its rules, as the OB recommended, must take into account the needs of human rights advocates, researchers, journalists, and others to access rule-violating content.

What else did we learn about Facebook’s relationship with the Oversight Board in this decision?

This week’s decision also highlights a number of the problems with Facebook’s approach to the Oversight Board. The board is only as powerful as Facebook lets it be, and that is problematic. For example, as the board’s decision makes clear, Facebook refused even to answer several questions the OB found relevant to its review. These included “questions about how Facebook’s news feed and other features impacted the visibility of Mr. Trump’s content,” “whether Facebook has researched, or plans to research, those design decisions in relation to the events of Jan. 6, 2021,” “questions related to the suspension of other political figures and removal of other content,” and “whether Facebook had been contacted by political officeholders or their staff about the suspension of Mr. Trump’s accounts.” Facebook should answer these questions.

In addition, Facebook denies users whose accounts it has suspended any opportunity to appeal to the board. That means that if Facebook permanently bans Trump’s account, the OB will have no say over the decision — unless, of course, Facebook itself asks for the OB’s opinion a second time. Facebook should give users who are subjected to its bluntest tools the option to appeal to the Oversight Board.

The OB is purportedly an effort to ensure that content moderation decisions are accountable. We approve of that impulse. We don’t want Mark Zuckerberg making these important decisions alone. The process and transparency the OB has the potential to provide are important, but this week’s decision makes clear that many roadblocks still stand in the way of fulfilling that goal.

Will this decision have an impact on regular Facebook users?

Not really — and that suggests a problem with the selection process of cases for the board. High-profile decisions like this might be interesting, but they're not the ones that actually matter for most users, including, importantly, those who don't have other outlets to speak online — as President Trump does. Account suspensions and deactivations can be devastating for such users.

The board’s decision doesn’t tell regular users looking at Facebook’s standards — including the “praise” standard applied here — what those standards mean for them. And regular users whose accounts are suspended don’t have the opportunity to appeal to the Oversight Board. The precedent set by this decision is very limited. The board repeats throughout that its ruling is fact-bound. In other words, it doesn’t address important questions about regular people’s use of Facebook.

Again, Facebook is a private entity not governed by the First Amendment. And President Trump’s actions in the wake of the Nov. 3 election were deplorable. But the broader issue here is how an extraordinarily powerful private corporation regulates access to one of the country’s most important forums for discussion and debate. We believe Facebook can and must do more to ensure that it operates its platform consistent with principles of free expression and fair process for all. We’ll be paying close attention to see how Facebook’s approach evolves, and whether the Oversight Board plays a meaningful role in protecting political speech and free expression rights online.
