Teen sues to destroy the nudify app that left her in constant fear

A spokesperson told The Wall Street Journal that “nonconsensual pornography and the tools to create it are explicitly forbidden by Telegram’s terms of service and are removed whenever discovered.”

For the teen suing, the prime target remains ClothOff itself. Her lawyers think it's possible that she can get the app and its affiliated sites blocked in the US, the WSJ reported, if ClothOff fails to respond and the court awards her a default judgment.

But no matter the outcome of the litigation, the teen expects to be forever “haunted” by the fake nudes that a high school boy generated without facing any charges.

According to the WSJ, the teen girl sued the boy who she said made her want to drop out of school. Her complaint noted that she was informed that “the individuals responsible and other potential witnesses failed to cooperate with, speak to, or provide access to their electronic devices to law enforcement.”

The teen has felt “mortified and emotionally distraught, and she has experienced lasting consequences ever since,” her complaint said. She does not know whether ClothOff could continue to distribute the harmful images, or how many teens may have posted them online. Because of these unknowns, she’s certain she’ll spend “the remainder of her life” monitoring “for the resurfacing of these images.”

“Knowing that the CSAM images of her will almost inevitably make their way onto the Internet and be retransmitted to others, such as pedophiles and traffickers, has produced a sense of hopelessness” and “a perpetual fear that her images can reappear at any time and be viewed by countless others, possibly even friends, family members, future partners, colleges, and employers, or the public at large,” her complaint said.

The teen’s lawsuit is the newest front in a wider attempt to crack down on AI-generated CSAM and NCII. It follows prior litigation filed by San Francisco City Attorney David Chiu last year, which targeted ClothOff among 16 popular apps used to “nudify” photos, mostly of women and young girls.

About 45 states have criminalized fake nudes, the WSJ reported, and earlier this year, Donald Trump signed the Take It Down Act, which requires platforms to remove both real and AI-generated NCII within 48 hours of victims’ reports.
