The Supreme Court appeared skeptical Monday of arguments by Florida and Texas that they are justified in regulating social media content moderation, in a pair of landmark cases with major implications for speech on the internet.
The court heard oral arguments in two major speech-related cases on Monday: NetChoice v. Moody and NetChoice v. Paxton. The technology industry group NetChoice sued Texas and Florida over laws passed by Republicans that were meant to hold social media platforms accountable for banning users based on viewpoint.
Florida’s law would allow residents to take legal action and the state to fine companies if they remove political candidates from social media platforms. The Texas law would require platforms to be “content-neutral” and allow the state’s attorney general and residents to sue platforms for removing content or blocking accounts. The court pressed the states to provide a justification for restricting speech. The justices, though, also asked questions aimed at determining the extent of Big Tech’s power over speech on the internet.
NetChoice v. Moody
Florida Solicitor General Henry Whitaker was the first to appear before the court, arguing in NetChoice v. Moody. He said that platforms had to be “neutral” when it comes to content moderation and that the law merely regulates a platform’s “conduct” rather than its content. He also argued that platforms such as Facebook and Google should be treated as “common carriers.” Being defined as a common carrier, a term initially applied to public transportation services and utilities and later extended to telegraph and telephone services, would subject platforms to additional restrictions, including anti-discrimination regulations.
Multiple members of the court appeared skeptical of Florida’s law, noting that it was written broadly and covered far more platforms than the state suggested. “[Florida’s law is] covering almost everything,” Justice Sonia Sotomayor said. “The one thing I know about the internet is that its variety is infinite.”
Justice Samuel Alito noted that Florida’s statute contains no list of the platforms it covers. That breadth made it difficult to address the case’s particulars, Justice Clarence Thomas argued. “We’re not talking about anything specific,” Thomas said. “Now we’re just speculating as to what the law means.” Justices repeatedly cited the e-commerce platform Etsy as an example of a platform that could be inadvertently swept up by Florida’s law.
Paul Clement, arguing for NetChoice, responded that Florida’s law violated the First Amendment “multiple times over.” He also drew a distinction between content moderation decisions made by government entities and those made by private entities. “There are things that if the government does, it’s a First Amendment problem, and if a private speaker does it, we recognize that as protected activity,” Clement argued.
Solicitor General Elizabeth Prelogar, representing the Biden administration, echoed Clement’s position, arguing in favor of NetChoice and in favor of limiting Florida’s power over speech.
NetChoice v. Paxton
The court reconvened a short time later to hear arguments over Texas’s law. Clement returned to represent NetChoice, arguing that the law’s neutrality requirement would make social media less attractive to users and advertisers because it would force platforms to host both anti-suicide and pro-suicide content, as well as “pro-Semitic and antisemitic” content.
He also emphasized to the justices that a social media company is more like a parade or a newspaper than a common carrier, an analogy meant to frame content moderation as the platforms’ own protected speech.
Aaron Nielson, Texas’s solicitor general, countered that social media platforms closely resemble telegraphs and that this common-carrier-like character is what entitles the state to restrict the platforms’ censorship of users.
Nielson was questioned multiple times about how the state’s viewpoint-neutrality requirement would work in practice. Asked how platforms could handle subjects such as terrorism in a viewpoint-neutral way, Nielson said platforms could simply remove the topic altogether. Instead of saying that “‘you can have anti-al Qaeda but not the pro-al Qaeda,’ if you just want to say, ‘Nobody is talking about al Qaeda here,’ they can turn that off,” Nielson argued.
Court conclusions
The court appeared divided on how far content moderation, and the regulation of it, should be allowed to go. On one hand, the justices treated government-imposed moderation rules as constitutionally questionable, especially when those rules turn on content. On the other hand, they criticized the power Big Tech companies exert over speech. Justice Neil Gorsuch raised the example of private messaging services such as Gmail deleting communications because of the viewpoints they express, a scenario multiple justices put to Clement.
The court appeared bothered that both cases were brought as “facial challenges,” in which a party claims that a law is unconstitutional in all of its applications and should be voided entirely. That posture offers the Supreme Court little flexibility, since the court cannot strike down the law as applied to one specific form of speech while leaving the rest of the law intact.
Section 230, the part of the Communications Decency Act that shields platforms from liability for content posted by third parties, also came up multiple times. The justices weighed how that law would interact with the states’ statutes and with NetChoice’s arguments on behalf of the platforms. Thomas suggested that NetChoice’s insistence that platforms exercise editorial control undermined their Section 230 defense: that law rests on treating platforms as mere hosts of third-party speech, while the First Amendment argument treats moderation as the platforms’ own expression.
The court is expected to release decisions in both cases sometime before July. Because the justices are reviewing only the preliminary injunctions, the decisions may come more quickly than in other cases, and they will determine whether the lower court orders blocking the laws are upheld or overturned rather than finally settling the laws’ constitutionality.