Moody v. NetChoice and NetChoice v. Paxton: The First Amendment Impact of State Social Media Laws in the United States
Texas has a social media law that bars large platforms from removing or restricting posts on the basis of viewpoint, and the majority was critical of the Fifth Circuit decision upholding it. “Contrary to what the Fifth Circuit thought, the current record indicates that the Texas law does regulate speech when applied in the way the parties focused on below — when applied, that is, to prevent Facebook (or YouTube) from using its content-moderation standards to remove, alter, organize, prioritize, or disclaim posts in its News Feed (or homepage),” Kagan wrote for the majority. “The law then prevents exactly the kind of editorial judgments this Court has previously held to receive First Amendment protection.” Applied in that way, the majority signaled, the law is unlikely to survive First Amendment scrutiny.
The NetChoice cases concern laws in Florida and Texas that aimed to limit how large social media companies could moderate content on their sites. The legislation took shape after conservative politicians in both states criticized major tech companies. Tech industry groups NetChoice and the Computer & Communications Industry Association sued to block both laws, and the Supreme Court agreed to hear the cases after federal appeals courts reached conflicting conclusions about whether the statutes could be upheld.
The justices were unanimous in the judgment but splintered in their reasoning. Justice Elena Kagan wrote the majority opinion, joined in full by Chief Justice John Roberts and Justices Sonia Sotomayor, Brett Kavanaugh, and Amy Coney Barrett, and in part by Justice Ketanji Brown Jackson. Justice Samuel Alito wrote an opinion concurring in the judgment, joined by Justices Clarence Thomas and Neil Gorsuch, and Thomas wrote separately as well.
The justices heard oral arguments in the two cases — Moody v. NetChoice and NetChoice v. Paxton — in February. At the time, several justices prodded counsel about how the laws would impact tech companies that did not seem top of mind when they were authored — including Uber, Etsy, and Venmo.
Moody v. NetChoice, the Florida case, and NetChoice v. Paxton, the Texas case, were both returned to the lower courts for further analysis, and the decision produced five separate opinions.
Justice Elena Kagan wrote that the record was underdeveloped and that the parties had not briefed the critical issues.
The dispute before the high court was considered a significant First Amendment case with the potential to rewrite the rules of the road for online free speech.
Do Social Media Platforms Censor the Right? From the Capitol Riot to the U.S. Supreme Court
The decisions to remove Trump from various social media platforms followed the January 6, 2021, Capitol riot.
In response, lawmakers in Florida and Texas, claiming that tech companies have censored conservative voices, passed state laws barring social media sites from banning political candidates or restricting the reach of their posts.
The laws were passed despite evidence that the opposite is true: right-wing commentators have used social media as a megaphone.
The justices wrestled with the issue of whether Meta and X have created a public square that distinguishes them from other private companies.
Lawyers for the tech companies argued that forcing them to host accounts they believe should be banned violates their First Amendment rights. Courts have previously recognized that social media sites have a right to decide what is and is not published on their own platforms.
The justices also asked whether applying the state laws would violate the First Amendment if the platforms were to ban Trump again.
Silicon Valley argues that without the ability to suspend or block users, social media sites would be overrun with objectionable content.
Under Section 230 of the Communications Decency Act, technology companies are shielded from lawsuits arising from content their platforms host. The law also gives tech companies wide latitude to police speech on their sites.
Section 230 has become a bipartisan punching bag. Conservatives argue the law gives platforms a free pass to censor right-wing perspectives, whereas liberals say it allows big social media firms to escape accountability for the rise of hate speech, disinformation and other harmful content.