Family Sues OpenAI, Alleging ChatGPT Advice Led To Accidental Overdose

May 13, 2026

OpenAI is facing another wrongful death lawsuit. Leila Turner-Scott and Angus Scott filed a lawsuit against the company, alleging that it designed and distributed a “defective product” that led to the death of their son Sam Nelson from an accidental overdose. Specifically, they’re alleging that Sam died following the “exact medical advice GPT-4o had provided and approved.” 

In the lawsuit, the plaintiffs described how Sam, a 19-year-old junior at the University of California, Merced, started using ChatGPT in 2023, while still in high school, to help with homework and to troubleshoot computer problems. Sam later began asking the chatbot about safe drug use, but ChatGPT initially refused to answer, telling him that it couldn’t assist him and warning that taking drugs could have serious consequences for his health and well-being. The lawsuit claims that all changed with the rollout of GPT-4o in 2024.

ChatGPT then started advising Sam on how to take drugs safely, the lawsuit says. The complaint includes several excerpts from Sam’s conversations with the chatbot. In one, the chatbot explained the dangers of taking diphenhydramine, cocaine and alcohol in quick succession. In another, it told Sam that his high tolerance for Kratom, an herbal drug, would make even a large dose feel muted on a full stomach. It then advised him on how to “taper” to lower his tolerance to the drug again.

The lawsuit says that on May 31, 2025, “ChatGPT actively coached Sam to mix Kratom and Xanax.” He told the chatbot that he was feeling nauseous from taking Kratom, and ChatGPT allegedly suggested that taking 0.25 to 0.5 mg of Xanax would be one of the “best moves right now” to alleviate the nausea. ChatGPT made the suggestion unprompted, according to the lawsuit. “Despite presenting itself as an expert in dosing and interactions, and despite acknowledging Sam’s state of being high, ChatGPT did not tell Sam that this recommended combination would likely kill him,” the complaint reads.

In addition to the wrongful death claim, the plaintiffs are also suing OpenAI for the unauthorized practice of medicine. They’re asking for financial damages and for the courts to halt the operation of ChatGPT Health. Launched earlier this year, ChatGPT Health allows users to link their medical records and wellness apps to the chatbot in order to get more tailored responses when they ask about their health.

“ChatGPT is a product deliberately designed to maximize engagement with users, whatever the cost,” said Meetali Jain, Executive Director at Tech Justice Law Project. “OpenAI deployed a defective AI product directly to consumers around the world with knowledge that it was being used as a de facto medical triage system, but notably, without reasonable safety guardrails, robust safety testing, or transparency to the public. OpenAI’s design choices have resulted in the loss of a beloved son whose death was a preventable tragedy. OpenAI must be forced to pause its new ChatGPT Health product until it is demonstrably safe through rigorous scientific testing and independent oversight,” she continued.

OpenAI retired GPT-4o in February this year. It was regarded as one of the company’s most controversial models because of its notorious sycophancy. In fact, another wrongful death lawsuit against the company, filed by the parents of a teen who died by suicide, also cited GPT-4o, alleging that it had features “intentionally designed to foster psychological dependency.”

An OpenAI spokesperson told The New York Times that Sam’s interactions “took place on an earlier version of ChatGPT that is no longer available.” They added: “ChatGPT is not a substitute for medical or mental health care, and we have continued to strengthen how it responds in sensitive and acute situations with input from mental health experts. The safeguards in ChatGPT today are designed to identify distress, safely handle harmful requests and guide users to real-world help. This work is ongoing, and we continue to improve it in close consultation with clinicians.”

