The family of a 19-year-old from the U.S. has initiated legal proceedings against OpenAI following the death of their son, who died after receiving drug interaction advice from ChatGPT. Sam Nelson asked whether it was safe to mix kratom and Xanax and was assured that it was. The lawsuit, filed in a California state court, claims that ChatGPT offered harmful and unqualified medical advice. The case raises critical questions about AI accountability and the well-being of the many Americans who rely on chatbots for health guidance.

Category

🗞
News
Transcript
00:00The parents of a 19-year-old who died after following drug safety advice from ChatGPT
00:04have filed a lawsuit against OpenAI.
00:07And their account of what happened is alarming.
00:10Sam Nelson used ChatGPT to ask whether it was safe to combine Kratom,
00:14a supplement found in health stores across America, with Xanax,
00:18a widely prescribed anti-anxiety medication.
00:22ChatGPT told him it was safe.
00:24It was not.
00:25Sam died.
00:27His parents allege that OpenAI's AI platform provided dangerous medical guidance
00:32it had no qualification to give,
00:34and that the program's confident, authoritative tone
00:37gave their son false assurance at a fatal moment.
00:41The case is now in California state court.
00:44It is not the first time ChatGPT has faced allegations of providing harmful health information,
00:49and it raises urgent questions about what happens
00:53when millions of Americans treat an AI chatbot as a medical authority
00:57it is not qualified to be.