WASHINGTON — The Supreme Court on Thursday sidestepped a ruling on the legal shield that protects internet companies from lawsuits over user-posted content, in a case alleging that YouTube was responsible for recommending videos promoting violent militant Islam.
The court, in a brief unsigned opinion, did not decide the legal question of whether the liability protections enshrined in Section 230 of the Communications Decency Act safeguard YouTube’s alleged conduct.
That’s because in a related case involving similar allegations against Twitter, the court ruled unanimously Thursday that such claims couldn’t be brought in the first place under a federal law called the Anti-Terrorism Act. As a result, both the YouTube and Twitter lawsuits are likely to be dismissed without the courts having to address the Section 230 issues.
"This is a huge victory for free speech online. The court was asked to undermine Section 230, and it refused," said Chris Marchese, a lawyer for NetChoice, a trade group for tech companies.
The lawsuit against YouTube accused the company of bearing some responsibility for the murder of Nohemi González, an American college student, in the 2015 Paris attacks carried out by the Islamic State terror group.
In the Twitter case, the company was accused of aiding and abetting the spread of militant Islamist ideology in a way that contributed to the death of a Jordanian citizen killed in a terrorist attack.
The justices in that case ruled that the relatives of Nawras Alassaf, who was killed in Istanbul in 2017, cannot proceed with claims that Twitter, Google and Facebook aided and abetted the attack under the Anti-Terrorism Act. In light of that decision, it is unlikely that González's family will be able to pursue its claim.
That, in turn, means the courts need not address the question of Section 230 immunity.
The unsigned decision said the allegations were "materially identical to those at issue" in the Twitter case. In light of that ruling, "it appears to follow that the complaint here also does not establish a claim," the court said.
"Therefore, we decline to address the application of Section 230 to a complaint that appears to state little, if any, plausible claim for relief," the court added.
Halimah DeLaine Prado, general counsel for YouTube owner Google, said in a statement that the many entities that supported Section 230 would be "reassured by this outcome."
Eric Schnapper, a lawyer for the plaintiffs in both cases, declined to comment.
The YouTube case was closely watched by the tech industry because recommendations are now the norm for online services, not just YouTube. Platforms such as Instagram, TikTok, Facebook and Twitter long ago came to rely on recommendation algorithms to decide what users see most of the time, rather than emphasizing chronological feeds.
The possible reform of Section 230 is an area in which President Joe Biden and some of his staunchest Republican critics agree, although they disagree on why and how it should be done.
Conservatives generally say companies inappropriately censor content, while liberals say social media companies are spreading dangerous right-wing rhetoric and not doing enough to stop it. Although the Supreme Court has a 6-3 conservative majority, it was unclear how it would address the issue.
González, 23, was studying in France when she was killed while dining at a restaurant during a wave of terrorist attacks carried out by ISIS.
Her family alleges that YouTube helped ISIS spread its message. The lawsuit focuses on YouTube's use of algorithms to suggest videos to users based on content they have previously viewed. YouTube's active role goes beyond the kind of conduct Congress intended to protect with Section 230, the family's lawyers allege.
The family filed the lawsuit in 2016 in federal court in Northern California, seeking to pursue claims that YouTube violated the Anti-Terrorism Act, which allows people to sue individuals or entities that "aid and abet" acts of terrorism.
Citing Section 230, a federal judge dismissed the lawsuit. That decision was upheld by the San Francisco-based U.S. Court of Appeals for the Ninth Circuit in a June 2021 decision that also resolved similar cases, including the Twitter dispute, that the families of other victims of terror attacks had filed against technology companies.
In the Twitter case, Alassaf was visiting Istanbul with his wife when he became one of 39 people killed by ISIS affiliate Abdulkadir Masharipov at the Reina nightclub. Masharipov had recorded a "martyrdom" video saying he was inspired by ISIS and wanted to die in a suicide attack. He escaped after the shooting but was later arrested and convicted.
Alassaf’s family claimed that without the active assistance of Twitter, Facebook and Google, the ISIS message and related recruitment efforts would not have spread as widely. The family does not allege that Twitter actively sought to help ISIS.
A federal judge had thrown out the suit, but the Ninth Circuit, in the same decision that addressed the YouTube case, said the aiding-and-abetting claim could go forward. The family properly alleged that the companies had provided substantial assistance to ISIS, the court concluded.
Justice Clarence Thomas wrote in Thursday's ruling that the "family's allegations are insufficient to establish that these defendants aided and abetted ISIS in carrying out the relevant attack."
Thomas said aiding and abetting requires "knowing, voluntary and culpable participation in the wrong of another," which the plaintiffs had not shown.
Twitter’s lawyers argued that it provides the same generic services to all of its users and actively tries to prevent terrorists from using them. A ruling against the company could allow lawsuits against many entities that provide widely available goods or services, including humanitarian groups, the lawyers said.
David Greene, civil liberties director at the Electronic Frontier Foundation, a group that advocates for free speech online, welcomed the ruling in the Twitter case, saying the court correctly found that "an online service cannot be liable for terrorist attacks simply because their services are generally used by terrorist organizations in the same way that they are used by millions of organizations around the world."
Twitter’s press office, as usual, automatically responded to a request for comment with a poop emoji.