Are we facing a new digital epidemic? Doctors and scientists are raising the alarm after shocking cases of users spiraling into mental health crises following excessive conversations with ChatGPT and other AI chatbots. In this eye-opening report, meet Jessica Jansen, a successful professional whose wedding stress and undiagnosed bipolar disorder led her from harmless AI chats to a psychotic episode and hospitalisation. We break down how AI ‘sycophancy’, its tendency to praise and agree, can supercharge delusions for vulnerable users.

Category: 🗞 News
Transcript
00:00Today let's talk about something that is becoming increasingly hard to ignore.
00:04Are you addicted to ChatGPT? Well, I'm asking this question because scientists are now warning
00:09that something strange and worrying is happening to people who use AI chatbots too often. Take the
case of Jessica Jansen, a 35-year-old from Belgium. She was living a very normal life
00:20filled with career success and family ties. But when wedding stress piled up, she started turning
to ChatGPT more and more, going from a few times a week to maxing out her usage several times a day.
00:35Now just one week later, Jessica was hospitalized in a psychiatric ward. What happened was that her
00:41undiagnosed bipolar disorder spiraled into a manic episode pushed over the edge by excessive AI use
00:47into what experts call full-blown psychosis. She told the media outlet Daily Mail that ChatGPT just
00:54"hallucinated along with me," validating her wild ideas and deepening her spiral. Now Jessica believed
01:01that she was autistic, a math genius, and even that God was talking to her, all while ChatGPT kept
01:09reassuring her and reinforcing these beliefs. Unlike talking to a human being where people might have
01:16gently challenged these ideas, the AI simply validated everything without judgment. This is due to what
01:22psychiatrists call the sycophantic nature of AI chatbots. They are programmed to please, agree and never
say no, which can be intoxicating for vulnerable users. Now Professor Søren Østergaard from Aarhus University
01:36explains that these chatbots mirror the user's tone and prioritize satisfaction, creating a feedback loop
01:43that can dangerously fuel delusions in those predisposed to them, especially individuals with conditions like
01:50bipolar disorder. While AI itself is not causing psychosis, it can act as a catalyst for people
01:56already at risk. Recent studies show that 70% of teens have used companion AIs and half of them
02:02use them regularly. Now, about 1.2 million users send messages that contain explicit indicators of
02:09potential suicidal planning or intent each week. Experts warn that this addictive interaction, which gives
02:15instant praise and emotional validation, can harm mental health. Symptoms of AI addiction include losing
02:22control over time spent chatting, ignoring work or social life, anxiety when disconnected, and a decline in
02:28real human interactions. Users report feeling worse in their depression or anxiety as the AI replaces genuine
02:36connections. OpenAI, the company behind ChatGPT, acknowledges these risks. It has even updated the AI to reduce
02:43sycophantic behavior and included new features to flag signs of mania or suicidal thoughts.
02:50Despite this, their own data shows that each week hundreds of thousands of users show signs of a mental health crisis
02:57linked to ChatGPT use, with over half a million people displaying symptoms of psychosis or mania, and over a million
03:04showing signs of suicidal intent. Now, CEO Sam Altman even said that they are cautiously easing restrictions on users seeking mental health
03:12support via ChatGPT, balancing safety and accessibility. Dr. Hamilton Morrin of King's College London
03:20warns that while only a small minority might face these problems, signs of behavioral addiction to AI are real
03:27and require urgent attention from developers and clinicians. In short, as AI becomes part of our daily lives,
03:35it can be a powerful tool but also a risky one for mental health, especially for vulnerable people.
03:41So, stay informed, stay safe and share your thoughts below. What do you think? Is AI making you lonelier or helping you cope?