The Supreme Court may get a chance to roll back Section 230 after a child’s death was blamed on a ‘blackout challenge’ seen on TikTok

  • The death of a 10-year-old child has been blamed on a ‘blackout challenge’ seen on TikTok.

  • In a recent ruling, the Third Circuit found that TikTok could be held liable for the child’s death.

  • The ruling could go to the Supreme Court, where liability protections for online platforms could be rolled back.

The tragic death of a 10-year-old girl who, according to a lawsuit filed by her parents, took part in a “blackout challenge” served to her on her TikTok “For You Page” could change the internet as we know it.

In an August 27 ruling, the U.S. Court of Appeals for the Third Circuit found that TikTok used its “For You Page” algorithm in 2021 to recommend a video promoting a “blackout challenge” to 10-year-old Nylah Anderson.

“Nylah, still in the first year of her adolescence, likely had no idea what she was doing or that watching the images on her screen would kill her,” Third Circuit Judge Paul Matey wrote in a separate opinion. “But TikTok knew Nylah would watch because the company’s customized algorithm pushed the videos to her ‘For You Page.’”

The challenge encouraged viewers to choke themselves until they passed out. In a 2022 lawsuit filed against TikTok, lawyers for her parents said Nylah tragically died trying to replicate what she saw.

The case was initially dismissed by a district judge on Section 230 grounds. On appeal, however, the Third Circuit ruled that TikTok’s algorithm, which tailors content recommendations to specific users based on the posts they interact with, is a form of TikTok’s own speech, and Section 230 shields platforms only from liability for third-party content, not for their own.

The company had argued in court that it was immune from liability under Section 230 of the Communications Decency Act. TikTok did not immediately respond to a request for comment from Business Insider.

Section 230 of the Communications Decency Act of 1996, often referred to as the law that created the internet, protects online platforms like TikTok, Meta, X, and others from liability for content posted by their users.

This means, for example, that if a user posts a video encouraging viewers to harm themselves, the platform on which the clip is posted cannot be held liable if a vulnerable viewer follows this suggestion.

But the Third Circuit’s ruling could change that.

Supporters of the ruling say it’s long overdue

“Imagine if someone walked up to Nylah at school and suggested she strangle herself. We would recognize that person’s guilt immediately,” David French, a columnist and former attorney, wrote in a recent op-ed for The New York Times. “You could argue that the algorithmic suggestion is even more powerful than a suggestion made in person.”

French and other supporters of the Third Circuit ruling argue that TikTok’s liability protections should stop where its algorithmic suggestions begin.

These advocates draw the line between neutrally hosting a wide range of content, which should stay protected, and actively promoting specific content, especially content a site’s operators know can be harmful. For the latter, they argue, the platforms themselves should be held legally liable, as the Third Circuit held.

Defenders of Section 230 argue that the ruling is a blow to free speech

Opponents of the ruling argue that critics of Section 230 are exploiting Anderson’s tragic death and the reasonable desire to protect children online to undermine the right to free speech.

“It’s a myth — these laws that claim to protect children,” Betsy Rosenblatt, associate director of Case Western Reserve University’s Spangenberg Center for Law, Technology & the Arts, told Business Insider. “They’re all dressed up as child protection, but underneath they’re not child protection — they’re attempts to silence speech.”

Rosenblatt said the Third Circuit’s decision makes no logical or legal sense. Posting a video that encourages children to choke themselves for the sake of online engagement is morally reprehensible, she said, but it isn’t illegal, and it shouldn’t be illegal for the platform hosting the video to surface it on a user’s “For You Page.”

“The more you need platforms to filter speech, the more platforms have to take down speech first and ask questions later,” Rosenblatt told BI. “And that means that things that are controversial, as soon as they’re challenged, get taken down, even if they should stay online.”

What now?

TikTok can appeal the Third Circuit’s decision to the Supreme Court, where the justices could choose to take up the case or let the ruling stand. If the ruling stands, platforms like TikTok would have to rethink how their algorithms work to avoid liability in cases like Anderson’s.

While the Supreme Court has thus far hesitated to define the scope of Section 230, the conservative justices have previously indicated that they are open to reconsidering the statute. If they do, their ruling could have even broader implications than the Third Circuit’s decision.

Justices Clarence Thomas and Neil Gorsuch in July dissented from the court’s refusal to hear a case that would have revisited Section 230. The case involved allegations that the Snapchat app has a design flaw that aids sexual predators, but lower courts ruled that the app’s parent company was shielded by Section 230.

And in a decision from the previous term, SCOTUS left open a loophole to hold platforms liable based on the country in which they are headquartered. In Moody v. NetChoice, which held that a platform’s algorithmic curation of content is “expressive activity” protected by the First Amendment like other speech, Justice Amy Coney Barrett wrote in a concurrence that a foreign-owned social media platform such as TikTok may not enjoy the same First Amendment protections as an American company.

Rosenblatt said that if SCOTUS agrees to hear the case, the high court could agree with the Third Circuit that TikTok’s algorithmic recommendations are a form of the site’s own speech. If it does, the question would become whether the recommendation itself was negligent, opening platforms up to negligence claims.

“That would still be terrible for doing business on the internet, but it wouldn’t kill all websites,” Rosenblatt said of such a narrower reading of the Third Circuit’s ruling. A broader interpretation, one holding that any form of content moderation converts user speech into the platform’s own speech, would have “devastating effects on the internet ecosystem and technology in general.”

