The Biggest Supreme Court Case That Nobody Seems to Be Talking About
On Monday, the Supreme Court will hear arguments in a pair of cases out of Texas and Florida that could force major social media platforms to carry posts from Donald Trump or others who lie about elections being stolen or obliquely encourage election-related violence. A ruling in favor of these states would turn the First Amendment upside down and create the conditions for undermining American democracy. If there weren't so much else swirling around our elections and democracy right now, this case would be commanding everyone's attention.
Moody v. NetChoice LLC and NetChoice LLC v. Paxton arise out of the actions that Facebook, Twitter (now X), and other social media companies took in removing Trump from their platforms after the attack on the U.S. Capitol on Jan. 6, 2021. Trump had been relentlessly calling the 2020 election results into question despite having no reliable evidence of widespread fraud or irregularities. In an infamous tweet in December 2020, he encouraged his supporters to come to Washington for "wild" protests on Jan. 6, the day Congress would count the states' Electoral College votes to confirm Joe Biden as the election victor. After Trump and his supporters gave speeches on the Ellipse on Jan. 6, a crowd stormed the Capitol. The violence left 140 law enforcement officers injured (four later died by suicide) and four protesters dead. After Trump failed to immediately condemn the violence and call for the siege to end, the platforms had had enough, determining that Trump had violated their terms of service and needed to be removed.
In response to the removal of Trump and concern over what they call "censorship" of conservatives, Florida and Texas each passed laws that make content moderation difficult, if not impossible, for major social media companies. The laws differ in some particulars, but both would make it illegal to remove the kinds of content we saw from Trump before he was deplatformed in early 2021. A coalition representing the platforms sued, arguing that the laws violated the platforms' First Amendment rights to decide what content to include or exclude on their platforms. The coalition won its primary arguments in the Florida case but lost in the Texas case, and the Supreme Court is hearing both on Monday.
The key First Amendment question is how to treat the platforms when they curate content. The platforms argue that they are private actors, just like Slate or the Wall Street Journal, with a constitutional right to include or exclude content as they see fit. It's a strong argument. In a 1974 case, Miami Herald v. Tornillo, the court held unconstitutional a (different) Florida law that required newspapers to print the reply of someone who had been criticized in the newspaper. The court held that private actors like newspapers have every right under the First Amendment to include or exclude content as they see fit.
The states argue that we should treat social media platforms as "common carriers," the way we treat the phone company. There are laws that forbid the phone company from denying you service because it doesn't like the messages you might communicate by voice or text. In an amicus brief that I filed in the cases with political scientist Brendan Nyhan and journalism professor Amy Wilentz, co-authored with Nat Bach and his team at Manatt Phelps, we argue that the common carrier argument is a weak one.
As professor Eugene Volokh, one of the originators of the common carrier analogy, explains, what separates entities such as newspapers from entities such as phone companies is whether they produce a "coherent speech product." Those that do are entitled under the First Amendment to exercise editorial discretion. Social media platforms, however, surely do produce such coherent products. Of course, the public reasonably associates a controversial politician's speech with a platform's editorial message. People may be attracted to or repulsed by Trump's speech on a platform, but they will perceive that speech as part of the platform. (In contrast, no one perceives private text messages sent over AT&T's network as AT&T's speech.) People know that Truth Social, where Trump commonly posts, is different from a platform where people rarely, if ever, see posts from Trump, or from a platform marketed to Democrats and organized around criticizing Trump.
It should be no surprise that after Elon Musk took over Twitter and changed its moderation policies to make the platform’s content less trustworthy and more incendiary, users and advertisers reevaluated the platform’s strengths and weaknesses, with many choosing to leave. Content moderation policies shape how the public perceives a platform’s messages. Content moderation decisions—including Musk’s, whether wise or not—are the exercise of editorial discretion. The public then decides which platforms to patronize, value, or devalue.
There's also a huge irony in seeing people like Volokh or Justice Clarence Thomas express support for the common carrier theory, which would require private companies to carry speech they may disagree with or even find dangerous. In his amicus brief supporting Florida's appeal, Trump approvingly quoted Volokh: "Recent experience has fostered a widespread and growing concern that behemoth social media platforms … have 'seriously leverage[d their] economic power into a means of affecting the community's political life.' "
That kind of equalization rationale has been rejected by the libertarians on the court in cases like Citizens United, which freed corporations to spend unlimited sums in support of candidates for election to office. There, the court wrote (quoting a 1976 case, Buckley v. Valeo) that it is "wholly foreign" to the First Amendment to seek to equalize speech, and that the First Amendment cannot be used to stop those with economic power from translating it into political power.
Now that it is conservatives yelling "censorship" rather than liberals complaining about big corporations seeking to have an outsize influence on who is elected and on public policy, is the court really going to change its position on whether the government can mandate speech equalization depending on whose ox is being gored?
This case, though, implicates more than just the First Amendment right of platforms to curate content. It may affect the survival of American democracy itself, as the 2020 election demonstrated.
During the campaign and post-election periods, these platforms labeled and fact-checked many of Trump's false and incendiary statements and limited the sharing of some of his content. But after Trump failed to condemn (and even praised) the Jan. 6 rioters, many major platforms, fearing additional violence fomented by the president, decided to remove or suspend Trump's social media accounts. The platforms made voluntary decisions about labeling, fact-checking, demoting, and deplatforming content that undermined election integrity, stoked violence, and raised the risk of election subversion and democratic collapse.
In so doing, the platforms participated in the open marketplace of ideas by exercising their sound editorial judgment in a socially responsible way to protect democracy. Even if certain moderation decisions were imperfect in hindsight, the platforms’ efforts were vastly preferable to an alternative in which government fiat deprives platforms of the power to remove even dangerous speech.
We should learn by June if the Supreme Court agrees.