(Bloomberg) -- The Supreme Court’s decision to hear a case challenging a legal shield for social media platforms puts the justices in the middle of a politically fraught debate over whether some of the world’s most powerful companies should be protected as neutral forums for speech or held accountable for the content.
At stake is the very business model that has allowed companies like Alphabet Inc.’s Google, Meta Platforms Inc. and Twitter Inc. to reap enormous profits while shaping civic discourse.
The case involves a lawsuit brought by the family of Nohemi Gonzalez, a 23-year-old US citizen who was among 129 people killed in coordinated attacks by ISIS in Paris in November 2015. Gonzalez’s family says Google’s YouTube service, through its algorithms, violated the Anti-Terrorism Act by recommending the terrorist group’s videos to other users.
In announcing Monday that it was taking up Gonzalez v. Google, the Supreme Court agreed to hear a challenge to Section 230 of the 1996 Communications Decency Act, which protects online platforms from liability for user-generated content.
“The Supreme Court hearing the Gonzalez case and ruling on where the limits are in immunizing these tech companies is a pivotal moment in culture and the law,” said Carrie Goldberg, a victims’ rights lawyer who has sued social media companies. “Our society has gone from seeing big tech platforms as untouchable by law and legislation to finally recognizing that when left to run amok they can cause atrocious personal and social injuries.”
Courts have interpreted Section 230 as giving a legal shield to internet companies when they decide how to display third-party content. Gonzalez’s family said the algorithmic recommendations are equivalent to editorial judgment and shouldn’t be protected by the liability provision.
Google says YouTube at the time of the attack used a sidebar tool to queue up videos based on user inputs including browsing history. The company says the only alleged link between the Paris attacks and YouTube was that one attacker was an active user of the video-sharing service and once appeared in an ISIS propaganda video.
The court on Monday also announced it would hear a related case involving allegations that Twitter and other social media sites helped facilitate acts of international terrorism through their services. That case, Twitter v. Taamneh, could clarify an anti-terrorism law that allows victims to sue those who have aided a terrorist group.
Google, Twitter and Meta declined to comment on the court’s decision to take up the cases.
Congress for years has debated the need to reform or revoke Section 230, but there has been little action in the face of partisan differences over the best approach and the practical difficulties of writing legislation for complex technology.
The issue was reignited last year when a Facebook whistle-blower, Frances Haugen, released thousands of internal documents revealing that the company understood the control it exerts over the flow of information and that the mechanics “are not neutral.”
Republicans, including former President Donald Trump, have threatened to revoke Section 230 protections as punishment for platforms they accuse of censoring conservative viewpoints. Democrats have made the opposite argument: that tech companies should do more to remove offensive content or face legal consequences.
Republican-controlled legislatures in Texas and Florida have each passed laws taking aim at what they view as social media censorship. The Texas law bars social media platforms with more than 50 million users from discriminating on the basis of viewpoint, and the Florida law requires platforms to host political candidates.
The laws were challenged by a tech industry trade association, and while one federal appeals court struck down Florida’s law in May, another court last month upheld Texas’ law. NetChoice, which represents Meta, Twitter, and other tech companies, has petitioned the Supreme Court to resolve the split.
Google Chief Executive Officer Sundar Pichai told Congress last year that revoking Section 230 would revert content moderation to the early 1990s before that protection from lawsuits existed. “Platforms would either over-filter content or not be able to filter content at all,” Pichai said.
‘Remove’ Section 230
The White House, in remarks released in September after a technology policy roundtable, called for reforming the industry. Among the recommendations: “remove special legal protections for large tech platforms” afforded by Section 230. Now the Supreme Court has the opportunity to do just that.
The high court in 2020 declined to hear an appeal regarding Facebook’s liability for posts advocating violence in Israel. But a concurring opinion from Justice Clarence Thomas in a different case earlier this year made clear that he was willing to entertain challenges to this provision of the 1996 law.
“Assuming Congress does not step in to clarify Section 230’s scope, we should do so in an appropriate case,” he wrote. “When Congress enacted the statute, most of today’s major internet platforms did not exist.”
It is in some ways an unusual case for the court to take up, according to Carter Phillips, a veteran Supreme Court advocate, since it wouldn’t be resolving a conflict between two different appeals courts. Agreeing to hear the case suggests that the high court is “unusually interested in getting at the question of the scope of Section 230 and how that will regulate the tech industry,” Phillips said Monday on Bloomberg TV.
This court has shown a willingness to consider controversial cases with sweeping implications for society and the law, such as this year’s decision overturning the precedent that protected abortion access. But it’s unclear where the ideological lines will fall on this case, according to Jeff Kosseff, a cybersecurity law professor at the US Naval Academy.
“You can tell that some justices thought it was important enough to hear, but I don’t know if they all thought that for the same reason,” Kosseff said. “You need five justices to agree on a particular way to read a statute. I don’t know if we have that.”
Two lower courts, including the San Francisco-based 9th US Circuit Court of Appeals, sided with Google and said the lawsuit should be dismissed.
Eric Goldman, an internet law professor at Santa Clara University who supports leaving liability protections intact, said the San Francisco-based court’s opinion “exuded an unusually high degree of hostility to Section 230” in what he described as an “extremely problematic” ruling. Where the Supreme Court comes down -- and the scope of an eventual ruling -- could drastically change how user-generated content moves around the internet, Goldman said.
Altering Section 230, or making its protections conditional on other factors, would effectively negate the provision, according to Cathy Gellis, an internet law attorney who has written appeals court amicus briefs supporting Section 230.
“Every time we try to narrow Section 230 to not cover this, or not cover that, or cover something only conditionally, then all of the sudden there’s no point in having Section 230 at all,” Gellis said. “Because you’re going to go to court to fight over whether Section 230 applies to you.”