Metropolitan News-Enterprise

 

Friday, August 23, 2024

 

Page 3

 

Ninth Circuit:

Allegedly False Representations Are Not Shielded by CDA

 

By a MetNews Staff Writer

 

The Ninth U.S. Circuit Court of Appeals yesterday partially reinstated an action against the maker of a messaging app that enables anonymous communications, holding that there’s no statutory bar to claims that the defendant falsely represented that users who engaged in harassment and bullying would be barred and their identities revealed.

Sec. 230 of the Communications Decency Act (“CDA”), which immunizes interactive computer services from liability for postings by third parties, does shield defendant YOLO Technologies, Inc. against product liability claims, the court declared, but does not apply to claims based on misrepresentation. The lead named plaintiff in the putative class action is the estate of Carson James Bride who, at the age of 16, was so distraught over anonymous taunting messages that he committed suicide. The complaint seeks redress on behalf of “approximately 10 million users of app ‘YOLO’.”

Also suing are three anonymous teenagers and the Tyler Clementi Foundation, a national advocacy group seeking to combat cyberbullying. Clementi, 18, committed suicide after a video of him was posted online.

The action against YOLO was dismissed with prejudice by District Court Judge Fred W. Slaughter of the Central District of California based on the CDA. 

YOLO argued on appeal:

“Because the Putative Class Members cannot demonstrate any error in that ruling, this Court should affirm the judgment of dismissal for Defendant-Appellee YOLO Technologies, Inc.…

“In so doing, the Court will protect a foundational pillar that supports the growth and preservation of a key national interest—the Internet. Section 230 of the CDA grants immunity to Internet publishers of content created or developed not by them, but instead by third parties. Congress enacted this law to promote the free exchange of information and ideas over the Internet, and to encourage voluntary monitoring for offensive or obscene material.”

The appellants contended:

“Here, the District Court first erred by not distinguishing the failure to warn and misrepresentation claims which solely focus on YOLO’s own statements and conduct: a conspicuous and misleading notification that it would reveal and ban bad actors on the platform.”

They maintained that, with a staff of no more than 10, YOLO could not have fulfilled its promise to protect against improper communications given daily traffic of about 10 million users.

Ninth Circuit’s Opinion

Yesterday’s opinion partially reversing that action was authored by Sixth U.S. Circuit Court of Appeals Judge Eugene E. Siler, sitting by designation.

Siler noted that “§ 230 protects apps and websites which receive content posted by third-party users (i.e., Facebook, Instagram, Snapchat, LinkedIn, etc.) from liability for any of the content posted on their services, even if they take it upon themselves to establish a moderation or filtering system, however imperfect it proves to be.”

Explaining why §230 does not apply to the misrepresentation claims, he wrote:

“YOLO repeatedly informed users that it would unmask and ban users who violated the terms of service. Yet it never did so, and may have never intended to. Plaintiffs seek to enforce that promise—made multiple times to them and upon which they relied—to unmask their tormentors. While yes, online content is involved in these facts, and content moderation is one possible solution for YOLO to fulfill its promise, the underlying duty being invoked by the Plaintiffs…is the promise itself….Therefore, the misrepresentation claims survive.”

He went on to say:

“Section 230 prohibits holding companies responsible for moderating or failing to moderate content. It does not immunize them from breaking their promises.”

Product Liability Claims

The plaintiffs asserted in their product liability claims that YOLO’s app is inherently dangerous because it allows anonymous communications, which have led teenage users to commit suicide.

“At root, all Plaintiffs’ product liability theories attempt to hold YOLO responsible for users’ speech or YOLO’s decision to publish it,” he said, concluding that “[t]his is essentially faulting YOLO for not moderating content in some way, whether through deletion, change, or suppression.”

The jurist declared that the “product liability theories…attempt to hold YOLO liable as a publisher of third-party content” and are “foreclosed” by §230.

The case is Estate of Bride v. YOLO Technologies, Inc., 23-55134.

 

Copyright 2024, Metropolitan News Company