
US Appeals Court Cracks Open Big Tech Liability Shield

A US appeals court has rejected a long-standing argument from social media platforms: that a federal law grants them blanket immunity from liability for harm caused by users — even the wrongful death of children.

A Pennsylvania mother won the right to sue TikTok over the death of her 10-year-old daughter in a ruling by a three-judge appeals court panel on Tuesday. The court said TikTok could be liable because its algorithm provided dangerous content to the child.

Three weeks before Christmas 2021, Nylah Anderson was found dead in her mother’s closet in suburban Philadelphia. She had accidentally strangled herself with her mother’s purse strap while imitating a “Blackout Challenge” video that TikTok had promoted to her.

That algorithmic recommendation is the crux of the decision by the US Court of Appeals for the Third Circuit.

The judges overturned a lower court’s decision to dismiss the case based on social media’s standard liability defense: Section 230 of the Communications Decency Act of 1996. Under Section 230, online platforms are shielded from liability for content posted on their sites by third parties.

But by promoting a self-suffocation video to a child, TikTok exceeded the scope of the passive intermediary protected by Section 230, the ruling said. Instead, it became an active promoter of dangerous content. The Anderson family’s lawsuit seeking to hold TikTok liable for the “knowing distribution and targeted recommendation of the Blackout Challenge” may continue, it said.

The ruling could have major implications for any website built on user-generated content that site owners do not vet before it is posted. The liability shield has allowed social media platforms to grow to enormous scale because they tend to review user posts only when other users flag them, and even then they often do so using artificial intelligence. With so much content, these same platforms have increasingly relied on algorithmic recommendation to keep users watching whatever will hold their attention, supporting their advertising businesses.

“Big Tech just lost their ‘get out of jail free’ card,” said Jeffrey Goodman, a partner at Saltz Mongeluzzi Bendesky, who argued on behalf of the family, in response to the ruling. If social media platforms cause harm, “they will now have to face their day in court,” he said.

A TikTok spokesperson declined to comment on the decision or whether the company plans to appeal. In an earlier statement about the case, the company said: “TikTok remains vigilant in its commitment to user safety and would remove any Blackout Challenge-related content from the app.”

The Anderson family declined an interview request. In a statement released by Goodman, they said: “Nothing will bring back our beautiful little girl, but we are comforted to know that – by holding TikTok accountable – our tragedy can help other families avoid future, unimaginable suffering.” Social media platforms must “stop exploiting children for profit,” they said in the statement.

The appeals court ruling comes amid growing concerns about the harm social media has caused to a generation of children. Hundreds of lawsuits have been filed against social media platforms in recent years, alleging they created addictive products that promoted suicide and self-harm content to children and connected young users to drug dealers and sextortionists.

Bloomberg Businessweek published a 2022 cover story on the Blackout Challenge that was cited in Goodman’s appeals court brief. The Blackout Challenge is a dare in which participants choke themselves with household items such as shoelaces or power cords until they pass out, filming the rush they feel as they regain consciousness.

The Businessweek story linked the dare to the deaths of at least 15 pre-adolescent children. It also found evidence that TikTok knew its algorithm was sending videos promoting the Blackout Challenge to children, and that some of them had died attempting it, before Anderson’s death.

In May 2022, five months after her death, Anderson’s parents filed a lawsuit against TikTok, alleging product liability, negligence and wrongful death.

Two months later, the Social Media Victims Law Center filed a second lawsuit against TikTok on behalf of the families of Arriani Arroyo, 9, of Wisconsin, and Lalani Walton, 8, of Texas, who both died accidentally while attempting the Blackout Challenge. The families claimed the dangerous dare was recommended to the girls via TikTok’s For You feed, a page of personalized content curated for each user to keep them scrolling for as long as possible. The Arroyo and Walton case is ongoing.

In the Anderson case, TikTok argued that it was “fully protected” by Section 230, and in October 2022 a district court granted the company’s motion to dismiss the case, saying TikTok could not be held liable for a video that a third party posted on its site. The family appealed a week later.

“By promoting this challenge and populating it on the ‘For You’ pages of children across America, TikTok is putting children at risk — and in many cases killing them — in the name of corporate greed,” the Anderson family’s appeal filing said.

The family’s lawyers went on to challenge how Section 230 “has been tragically applied to protect the goliaths of the tech industry.” Enforcing the law as a blanket shield of immunity, they argued, “has empowered social media companies to develop increasingly predatory and manipulative technologies designed to make users dependent and control their actions. Children, more than anyone else, pay the price.”

The appeals court agreed, saying TikTok knew the deadly Blackout Challenge was spreading on its app, that its algorithm was feeding the challenge to children, and that several children had died attempting it.

“Nylah, still in the first year of her adolescence, likely had no idea what she was doing or that following along with the images on her screen would kill her,” wrote Judge Paul Matey in a partial concurrence. The judges overturned the lower court’s decision, giving Anderson’s parents the right to sue.

The appellate decision also went beyond the Anderson case, criticizing how Section 230 “steps in to shield corporations from virtually any claim related to content posted by a third party, regardless of the cause of action and regardless of the provider’s actions.” Congress did not intend to “create a lawless no-man’s-land of legal liability,” the judges said.

That expansive reading of the law, they said, has immunized social media platforms “from the consequences of their own conduct” and allowed them to ignore the ordinary obligations other businesses face, such as preventing their services from causing “devastating harm.”

The case is Estate of Nylah Anderson v. TikTok, Inc., et al., Case No. 22-3061 (on appeal from the US District Court for the Eastern District of Pennsylvania, Case No. 2:22-cv-01849).

Photo: Brent Lewin/Bloomberg

Copyright 2024 Bloomberg.
