It’s a big week for Big Tech… maybe its biggest ever.
On Wednesday, Twitter found itself before the Supreme Court, arguing it cannot be held liable for a terrorist attack that may have been organized in part on its platform. Just a day earlier, Google was at the center of a case brought by victims of a different terrorist attack allegedly linked to content posted on YouTube. Both cases have been viewed as challenges to the protected status that internet platforms have long enjoyed under Section 230.
Breaking the “Backbone of the Internet”
Both cases center on tragedies. In Twitter v. Taamneh, family members of one of the 39 victims slain in a 2017 shooting at an Istanbul nightclub, carried out by an Islamic State extremist, are suing Twitter for allegedly allowing the Islamic State and similar extremist groups to use the platform for recruitment, fundraising, and inciting violence. The plaintiffs in Gonzalez v. Google, meanwhile, are family members of an American college student who was one of over 100 people killed in a string of terrorist attacks in Paris in 2015. The family accuses Google of recommending extremist recruiting videos on YouTube.
Both cases, and Google’s in particular, intersect with and challenge Section 230’s protections for internet platforms, the longstanding rule that grants websites legal immunity for hosting potentially illegal content posted to their sites by outside users (e.g., the owner of a blog cannot be held liable if a commenter posts libel in the comment section). Without Section 230, YouTube, for example, would be liable for the content of the roughly 270,000 hours’ worth of video uploaded each day.
While courts have long read Section 230 as granting near-blanket protection to internet sites and platforms, both cases present unique challenges to the statute:
- The Twitter case hinges on language in anti-terrorism legislation passed in 2016 that allows terrorism victims to seek compensation from entities that “aid and abet” terrorism, like, say, a bank that provides a loan to terrorist groups. Social media platforms, the plaintiffs argue, should fall under that category, Section 230 be damned.
- In the Google case, the plaintiffs argue that YouTube’s video recommendation algorithm falls outside the protections of Section 230. In other words, while YouTube cannot be held liable for merely hosting illicit content, it could still be held liable for promoting or recommending illicit content.
Middle Ground: “Courts often like to reach middle ground decisions,” Jared Carter, assistant professor at Vermont Law School, told The Daily Upside. “[SCOTUS] could create a carveout” to hold social platforms accountable for the content they recommend. “Maybe that’s a third way.” If so, a terrorism carveout to Section 230 wouldn’t be dissimilar to the one created by 2018 legislation intended to curb online sex trafficking. Rulings in both cases are expected sometime before July.