ChatGPT users may want to think twice before turning to the AI app for therapy or other kinds of emotional support. According to OpenAI CEO Sam Altman, the AI industry hasn't yet figured out how to protect user privacy when it comes to these more sensitive conversations.
Altman made these comments on a recent episode of Theo Von's podcast, This Past Weekend w/ Theo Von.
In response to a question about how AI fits into today's legal system, Altman said one problem with not yet having a legal or policy framework for AI is that there is no legal confidentiality for users' conversations.
"People talk about the most personal sh** in their lives to ChatGPT," Altman said. "People use it – young people, especially, use it – as a therapist, a life coach; having these relationship problems and [asking] 'What should I do?' And right now, if you talk to a therapist or a lawyer or a doctor about those problems, there's legal privilege for it. There's doctor-patient confidentiality, there's legal confidentiality. And we haven't figured that out yet for when you talk to ChatGPT."
This could create a privacy concern for users in the case of a lawsuit, Altman added, because OpenAI would be legally required to produce those conversations today.
"I think that's very screwed up," he said. "I think we should have the same concept of privacy for your conversations with AI that we do with a therapist or whatever."
The company understands that this lack of privacy could be a blocker to broader user adoption. In addition to AI's demand for so much online data during its training period, it is also being asked to produce data from users' chats in some legal contexts. OpenAI is already fighting a court order in its lawsuit with The New York Times that would require it to save the chats of hundreds of millions of ChatGPT users globally, excluding those from ChatGPT Enterprise customers.
In a statement on its website, OpenAI called the order an "overreach." If the court could override OpenAI's own decisions around data privacy, it could open the company up to further demands for legal discovery or law enforcement purposes. Today's tech companies are regularly subpoenaed for user data to aid in criminal prosecutions. But in recent years, there have been additional concerns about digital data as laws have begun limiting access to previously established freedoms, like a woman's right to choose.
When the Supreme Court overturned Roe v. Wade, for example, customers began switching to more private period-tracking apps or to Apple Health.
Altman also asked the podcast host about his own ChatGPT use, given that Von had said he didn't talk to the AI chatbot much because of his privacy concerns.
"I think it makes sense … to really want the privacy clarity before you use [ChatGPT] a lot – like the legal clarity," Altman said.