Fyself News
Startups
New Jersey lawsuit shows how difficult it is to combat deepfake porn

January 12, 2026

For more than two years, an app called ClothOff has been terrorizing young women online, and it has been extremely difficult to stop. The app has been removed from two major app stores and banned from most social platforms, but it remains available on the web and through Telegram bots. In October, Yale Law School’s clinic filed a lawsuit demanding that the app be taken down permanently, that its owners delete all images, and that operations cease entirely. But just finding the defendant was difficult.

“This organization is incorporated in the British Virgin Islands,” explains Professor John Langford, co-lead counsel on the case, “but we believe it is run by a brother and sister in Belarus. It may even be part of a larger network around the world.”

This is a bitter lesson in the wake of the recent flood of non-consensual pornography generated by Elon Musk’s xAI, much of it involving underage victims. Child sexual abuse material is among the most heavily criminalized content on the internet: it is illegal to create, transmit, and store, and every major cloud service routinely scans for it. Yet despite strict legal prohibitions, there are still few ways to deal with image-generation tools like ClothOff, as Langford’s case shows. Individual users can be prosecuted, but platforms like ClothOff and Grok are much harder to police, leaving victims seeking justice in court with few options.

The clinic’s complaint, available online, paints an alarming picture. The plaintiff is an anonymous high school student in New Jersey whose classmates used ClothOff to alter her Instagram photos. She was 14 years old when the original Instagram photo was taken, which means the AI-altered version would be legally classified as child sexual abuse material. Yet despite the altered images being clearly illegal, local authorities declined to prosecute, citing the difficulty of obtaining evidence from the suspect’s device.

“Neither the school nor law enforcement agencies have disclosed how widely Jane Doe and the other girls’ CSAM was distributed,” the complaint states.

Still, the case is moving slowly. The complaint was filed in October, and in the months since, Langford and his colleagues have been working to serve notice on the defendants, a difficult task given the companies’ global footprint. Once service is complete, the clinic can seek a hearing and ultimately a judgment, but in the meantime the legal system offers little comfort to ClothOff’s victims.

Grok’s case may seem like an easier problem to solve. Elon Musk’s xAI is not hiding, and it has deep enough pockets to satisfy a judgment if lawyers win a case. However, Grok is a general-purpose tool, which makes it much harder to hold accountable in court.


“ClothOff is specifically designed and marketed as a deepfake porn image and video generator,” Langford told me. “Litigation becomes even more complex when you litigate a general system that allows users to perform all kinds of queries.”

Many laws in the United States already prohibit deepfake pornography, most notably the Take It Down Act. But while it’s clear that specific users are violating these laws, it’s much harder to hold the platform as a whole accountable. Current law requires clear evidence of intent to harm, which means plaintiffs must show that xAI knew its tools would be used to produce non-consensual pornography. Absent that evidence, xAI’s First Amendment rights would provide significant legal protection.

“When it comes to the First Amendment, it’s clear that child sexual abuse material is not protected speech,” Langford said. “So if you’re designing a system that creates that kind of content, it’s clear that you’re operating outside of First Amendment protections. But for a general system where users can run all kinds of queries, it’s less clear.”

The most straightforward way around these problems is to show that xAI intentionally ignored the problem. That could well be the case, given recent reports that Musk told employees to loosen Grok’s safety guardrails. But even so, it would be a much harder case to make.

“Reasonable people can tell you that we’ve known this was a problem for years,” Langford said. “Couldn’t there have been stricter controls in place to prevent something like this from happening? That’s a kind of recklessness or knowledge, but it’s just a more complicated case.”

These First Amendment issues are why the strongest backlash against xAI has come from jurisdictions without such robust legal protections for speech. Indonesia and Malaysia have taken steps to block access to the Grok chatbot, and UK regulators have launched an investigation that could lead to a similar ban. The European Commission, France, Ireland, India, and Brazil have also taken preliminary steps of their own. In contrast, U.S. regulators have not issued an official response.

It’s impossible to say how these investigations will resolve, but at the very least, the sheer number of images raises many questions for regulators to pursue, and the answers could be damning.

“If you post, distribute or disseminate material related to child sexual abuse, you are violating criminal prohibitions and could be held liable,” Langford said. “The hard question is: What did X know? What did X do and what didn’t they do? What are they doing about it now?”

