Zelda Williams, the daughter of the late actor Robin Williams, has sent a poignant message to her father’s fans.
“Please stop sending me AI videos of my dad. Stop believing that I want to see them or that I’ll understand them. I don’t and I never will,” she posted on her Instagram Story on Monday. “If you have any decency, please stop doing this to him, to me, to everyone. Stop it completely. It’s ridiculous and a waste of time and energy. And believe me, it’s not what he’d want.”
It’s probably no coincidence that Williams was moved to make this post just days after the release of OpenAI’s Sora 2 video model and Sora social app. The Sora app lets users generate highly realistic deepfakes of themselves, their friends, or certain cartoon characters.
That includes the dead, who, according to the Student Press Law Center, cannot legally be defamed, which on its face makes them fair game.

Sora cannot generate videos of living people unless the likeness belongs to you or to a friend who has given you permission to use it (a “cameo,” as OpenAI calls it). These restrictions do not apply to the dead, however, most of whom can be generated without obstacles. The app, which is still available by invitation only, is filled with videos of historical figures like Martin Luther King Jr., Franklin Delano Roosevelt, and Richard Nixon, as well as deceased celebrities like Bob Ross, John Lennon, Alex Trebek, and, yes, Robin Williams.
It’s unclear how OpenAI draws the line when it comes to producing videos of dead people. For example, Sora 2 won’t create videos of former President Jimmy Carter, who died in 2024, or Michael Jackson, who died in 2009, but it did create videos that looked like Robin Williams, who died in 2014, according to TechCrunch’s tests. And while OpenAI’s cameo feature lets people dictate how they appear in videos created by others (guardrails put in place in response to early criticism of Sora), the deceased have no such say. Richard Nixon would roll over in his grave if he saw the deepfake I made of him advocating for police abolition.

OpenAI did not respond to TechCrunch’s request for comment on the acceptability of deepfakes of dead people. But deepfaking dead celebrities like Williams may be within the company’s tolerance: under existing case law, it is unlikely to be held liable for defaming the deceased.
“It’s infuriating to see real people’s legacies boiled down to ‘we kinda look alike and that’s good enough’ just so others can manipulate them into churning out horrible TikTok slop,” Williams wrote.
OpenAI’s critics have accused the company of playing fast and loose with such issues, which is why Sora was immediately flooded with AI clips of copyrighted characters like Peter Griffin and Pikachu when it launched. CEO Sam Altman initially said Hollywood studios and agencies would have to explicitly opt out if they didn’t want their IP included in Sora-generated videos. The Motion Picture Association has since called on OpenAI to take action, declaring in a statement that “established copyright law protects the rights of creators and applies here as well.” Altman later said the company would walk back that position.
Sora is probably the most dangerous deepfake-capable AI model available to the public to date, given how realistic its output is. Other platforms, such as xAI’s, lag behind in realism but have even fewer guardrails than Sora and can produce pornographic deepfakes of real people. As other companies catch up with OpenAI, we will set a terrible precedent if we continue to treat real people, dead or alive, like our personal playthings.
