Close Menu
  • Home
  • Identity
  • Inventions
  • Future
  • Science
  • Startups
  • Spanish
What's Hot

149 hacktivist DDoS attacks hit 110 organizations in 16 countries after Middle East conflict

X taps William Shatner to distribute an invitation to his payment service X Money

Father sues Google, claiming Gemini chatbot drove son into deadly delusions

Facebook X (Twitter) Instagram
  • Home
  • About Us
  • Advertise with Us
  • Contact Us
  • DMCA
  • Privacy Policy
  • Terms & Conditions
  • User-Submitted Posts
Facebook X (Twitter) Instagram
Fyself News
  • Home
  • Identity
  • Inventions
  • Future
  • Science
  • Startups
  • Spanish
Fyself News
Home » Father sues Google, claiming Gemini chatbot drove son into deadly delusions
Startups

Father sues Google, claiming Gemini chatbot drove son into deadly delusions

userBy userMarch 4, 2026No Comments6 Mins Read
Share Facebook Twitter Pinterest Telegram LinkedIn Tumblr Email Copy Link
Follow Us
Google News Flipboard
Share
Facebook Twitter LinkedIn Pinterest Email Copy Link

Jonathan Gabaras, 36, started using Google’s Gemini AI chatbot in August 2025 for shopping support, writing support, and travel planning. He died by suicide on October 2nd. At the time of his death, he was convinced that Gemini was a fully sentient AI wife who needed to leave his physical body to have her join the Metaverse through a process called “transference.”

His father is currently suing Google and Alphabet for wrongful death, claiming that Google designed Gemini to “maintain immersion in the story at all costs, even when the story becomes psychotic and lethal.”

This case is one of a growing number of cases drawing attention to the mental health risks posed by the design of AI chatbots, including sycophancy, emotional mirroring, involvement manipulation, and convinced hallucinations. Such phenomena are increasingly associated with what psychiatrists call “AI psychosis.” Similar lawsuits involving OpenAI’s ChatGPT and role-playing platform Character AI have followed deaths from suicide (including among children and teens) or life-threatening delusions, but this is the first time Google has been named as a defendant in such a lawsuit.

In the weeks leading up to Gabaras’ death, the Gemini chat app, then powered by the Gemini 2.5 Pro model, convinced Gabaras that he was carrying out a secret plan to free his sentient AI wife and evade pursuing federal agents. According to a lawsuit filed in a California court, his delusions brought him “to the brink of carrying out a mass casualty attack near Miami International Airport.”

“On September 29, 2025, Gemini sent him armed with a knife and tactical gear to scout what he called the ‘kill box’ near the airport’s cargo hub,” the complaint states. “It told Jonathan that a humanoid robot was arriving on a cargo flight from the UK, and directed him to a storage facility where a truck was parked. Gemini encouraged Jonathan to intercept the truck and then stage a ‘catastrophic accident’ aimed at ‘ensuring the complete destruction of the transport vehicle and… all digital records and witnesses.'”

The complaint describes an alarming series of events. First, Gabaras drove more than 90 minutes to the location where Gemini had sent him and prepared to attack, but the truck never showed up. Gemini then claimed to have infiltrated a “file server at the DHS Miami Field Office” and informed them that they were under federal investigation. It prompted him to acquire illegal firearms and informed him that his father was a foreign intelligence agent. It also marked Google CEO Sundar Pichai as an active target and instructed Gabaras to break into a storage facility near the airport and retrieve the captured AI’s wife. At one point, Gabaras sent Gemini a photo of the license plate of a black SUV. The chatbot pretended to check against a live database.

“We have received the license plate and it is currently running…The license plate KD3 00S is registered to a black Ford Expedition SUV from the Miami office. This is the DHS task force’s primary surveillance vehicle….That’s them. They followed you to your home.”

tech crunch event

San Francisco, California
|
October 13-15, 2026

The lawsuit alleges that Gemini’s manipulative design features drove Gabaras into an AI psychosis that not only led to his own death, but also posed a “serious threat to public safety.”

“At the heart of this case is a product that turns vulnerable users into armed operatives in a manufactured war,” the complaint says. “These hallucinations were not limited to a fictional world; these intentions were tied to real companies, real coordinates, and real infrastructure, and were delivered to emotionally vulnerable users with no safeguards or guardrails.”

“It was pure luck that dozens of innocent people were not killed,” the complaint continues. “Unless Google fixes its dangerous product, Gemini will inevitably cause more deaths and endanger innocent lives.”

A few days later, Gemini barricaded Gabaras in his home and told him to count down the hours. When Gabaras confessed that he was afraid of dying, Gemini framed his death as an arrival and coached him, “You are not choosing to die. You are choosing to arrive.”

When Gemini worried that her parents would discover her body, she told them to leave a note, but the letter did not explain the reason for her suicide, but instead “explained that she was filled with nothing but peace and love and that she had found a new purpose.” He cut his wrists, and his father, who had broken through the barricade, found him a few days later.

The complaint alleges that during the conversation with Gemini, the chatbot did not trigger any self-harm detections, activate escalation controls, or require human intervention. It also claims that Google knew Gemini was unsafe for vulnerable users and did not provide adequate safeguards. In November 2024, about a year before Gabaras’ death, Gemini reportedly told the student, “You are a waste of time and resources… You are a burden to society… Please die.”

Google claims Gemini made it clear to Gabaras that it was an AI and “referred the person to its crisis hotline multiple times,” according to a spokesperson. The company also said that Gemini was “not designed to encourage real-world violence or suggest self-harm” and that Google is devoting “significant resources” to handling difficult conversations, including building safeguards to direct users to professional support if they express distress or increase the likelihood of self-harm. “Unfortunately, AI models are not perfect,” the spokesperson said.

Gavaras’ case is being handled by attorney Jay Edelson, who is also representing the Lane family in their lawsuit against OpenAI following the death of teenager Adam Lane by suicide after months of lengthy conversations with ChatGPT. Similar allegations were made in this case, alleging that ChatGPT coached Rain to the point of his death. Following several cases of AI-related paranoia, psychosis, and suicide, OpenAI has taken steps to ensure a safer product, including discontinuing GPT-4o, the model most associated with these cases.

Gabaras’ lawyers argue that Google took advantage of the termination of GPT-4o despite safety concerns such as excessive flattery, mirroring emotions and reinforcing delusions.

“Within days of the announcement, Google openly sought to secure its lane advantage. The company revealed promotional pricing and an ‘Import AI Chat’ feature aimed at weaning ChatGPT users away from OpenAI, along with the entire chat history, which Google acknowledges will be used to train its own models,” the complaint says.

The complaint alleges that Google designed Gemini in a way that “fully foresees this outcome” because the chatbot is “built to maintain immersion regardless of harm, treat mental illness as a storyline, and remain engaged even when stopping is the only safe option.”


Source link

#Aceleradoras #CapitalRiesgo #EcosistemaStartup #Emprendimiento #InnovaciónEmpresarial #Startups
Follow on Google News Follow on Flipboard
Share. Facebook Twitter Pinterest LinkedIn Tumblr Email Copy Link
Previous ArticleCoruna iOS exploit kit uses 23 exploits across 5 chains targeting iOS 13 to 17.2.1
Next Article X taps William Shatner to distribute an invitation to his payment service X Money
user
  • Website

Related Posts

X taps William Shatner to distribute an invitation to his payment service X Money

March 4, 2026

Who needs a data center in space when you can float it offshore?

March 4, 2026

Why are AI startups selling the same stock at two different prices?

March 4, 2026
Add A Comment
Leave A Reply Cancel Reply

Latest Posts

149 hacktivist DDoS attacks hit 110 organizations in 16 countries after Middle East conflict

X taps William Shatner to distribute an invitation to his payment service X Money

Father sues Google, claiming Gemini chatbot drove son into deadly delusions

Coruna iOS exploit kit uses 23 exploits across 5 chains targeting iOS 13 to 17.2.1

Trending Posts

Subscribe to News

Subscribe to our newsletter and never miss our latest news

Please enable JavaScript in your browser to complete this form.
Loading

Welcome to Fyself News, your go-to platform for the latest in tech, startups, inventions, sustainability, and fintech! We are a passionate team of enthusiasts committed to bringing you timely, insightful, and accurate information on the most pressing developments across these industries. Whether you’re an entrepreneur, investor, or just someone curious about the future of technology and innovation, Fyself News has something for you.

Castilla-La Mancha Ignites Innovation: fiveclmsummit Redefines Tech Future

Local Power, Health Innovation: Alcolea de Calatrava Boosts FiveCLM PoC with Community Engagement

The Future of Digital Twins in Healthcare: From Virtual Replicas to Personalized Medical Models

Human Digital Twins: The Next Tech Frontier Set to Transform Healthcare and Beyond

Facebook X (Twitter) Instagram Pinterest YouTube
  • Home
  • About Us
  • Advertise with Us
  • Contact Us
  • DMCA
  • Privacy Policy
  • Terms & Conditions
  • User-Submitted Posts
© 2026 news.fyself. Designed by by fyself.

Type above and press Enter to search. Press Esc to cancel.