We’ve wasted countless hours with online recommendation algorithms that suggest we might enjoy watching another cat video or following another influencer. But in just a few months, social media platforms may need to find new ways to keep users engaged, and the Internet may undergo a major overhaul.
On Tuesday, the Supreme Court began hearing arguments in a case called Gonzalez v. Google, which questions whether tech giants can be held legally responsible for content promoted by their algorithms. The case points to a cornerstone of today’s Internet: Section 230, a statute that protects online platforms from liability for content produced by others. If the Supreme Court weakens the law, platforms may need to review or remove the recommendation algorithms that govern their feeds. And if the Court strikes down the law entirely, it will leave tech companies more vulnerable to lawsuits based on user content.
“If there are no protections for user-generated content, I don’t think it’s an exaggeration to say that this is probably the end of social networking,” says Hany Farid, a computer scientist at the University of California, Berkeley. Social platforms such as Twitter and YouTube rely heavily on two things: user-created content and recommendation algorithms that promote the content most likely to grab other users’ attention and keep them on the platform as long as possible. The Court’s verdict could make one or both strategies riskier for tech companies.
Gonzalez v. Google stemmed from the events of November 2015, when gunmen affiliated with the terrorist organization ISIS killed 130 people in six coordinated attacks across Paris. Nohemi Gonzalez, a 23-year-old student, was the only American killed in the attacks. Her family subsequently sued Google, which owns YouTube, arguing that the video platform’s recommendation algorithm promoted content from the terrorist group.
Google argues that using algorithms to sort content is “publishing par excellence,” something necessary for users to navigate the Internet and therefore protected by Section 230. That statute, originally part of the Communications Decency Act of 1996, states that computer service providers cannot be treated as the publishers of information created by another person. The measure dates back to the early days of the Internet and was intended to keep technology companies from having to heavily police what happens online.
“This law was designed to maximize speech, which means that by giving companies fairly broad immunity from liability, it allows them to create platforms where people can speak without a lot of proactive oversight,” says Gautam Hans, an associate clinical professor of law at Cornell Law School.
Gonzalez argues that recommendation algorithms go beyond simply deciding what content to display, as “neutral tools” such as search engines do, and instead actively promote content. But some experts disagree. “This distinction just doesn’t make sense,” says Brandie Nonnecke, a technology policy specialist and director of the CITRIS Policy Lab, based at UC Berkeley. She contributed to a brief on the case arguing that both types of algorithms use preexisting information to determine what content to display. “Differentiating content viewing and content recommendation is impossible,” Nonnecke says.
In deciding Gonzalez v. Google, the Supreme Court could follow one of three paths. If the Court sides with Google and finds that Section 230 stands as written, little will change. In the most extreme case, the Court could strike down Section 230 entirely, leaving the tech giants open to lawsuits not only over the content their algorithms recommend but also over what users say on their sites.
Or the Court could take a middle course, narrowing the statute so that technology companies face additional liability in specific circumstances. That scenario could play out somewhat like a controversial 2018 amendment to Section 230, which held platforms accountable for third-party content linked to sex trafficking. Given the contours of Gonzalez v. Google, such an amendment could involve changes such as excluding terrorism-related content from the law’s protections, or requiring companies to rein in algorithms that push increasingly extreme content and prioritize advertising profits over the interests of users or society, Farid says.
Hans does not expect the Supreme Court to issue its decision until the end of June. But he warns that if Section 230 falls, big changes will come to the Internet quickly, with repercussions reaching far beyond YouTube and Google. Technology platforms, already dominated by a handful of powerful companies, may consolidate further. And the companies that remain may crack down on what users can post, giving the case implications for people’s free speech. “That’s the downstream effect that I think we should all be concerned about,” says Hans.
Even if the Supreme Court sides with Google, experts say momentum is building for the government to rein in Big Tech, either by amending Section 230 or by introducing other measures. Hans says he hopes Congress will take the lead, though he notes that lawmakers have yet to pass any new legislation to that end. Nonnecke suggests an alternative approach could focus on giving users more control over recommendation algorithms, or a way to opt out of sharing personal information with them.
Nor does the Supreme Court seem likely to back away from the issue. A second case being argued this week, Twitter v. Taamneh, also addresses the liability of technology platforms for terrorist content. And experts already expect the Supreme Court to take up cases exploring two conflicting state laws on content moderation by social media platforms.
“No matter what happens in this case, the regulation of technology companies will continue to be a problem for the Court,” says Hans. “We’re still going to be dealing with the Supreme Court and technology regulation for a while.”