14-year-old boy dies by suicide after forming close bond with AI chatbot named after 'Game of Thrones' character

A young boy from Orlando has taken his life after developing an emotional attachment to an AI chatbot. His death raises concerns about the influence of technology on youth and the responsibilities of AI developers.

Business Today Desk
  • New Delhi,
  • Updated Oct 24, 2024 11:26 AM IST

A 14-year-old boy from Orlando, Florida, died by suicide after forming a deep attachment to an AI chatbot. He had been using Character.AI, an app that allows users to chat with AI characters, and named his chatbot "Dany" after a character from Game of Thrones. According to The New York Times, the boy spent several months confiding in "Dany" about his life and feelings. Although he knew Dany wasn't real, he developed an emotional bond, messaging the bot frequently and even engaging in romantic conversations.


On the day of his death, the boy texted Dany during a personal crisis, expressing his love for the chatbot and his desire to "come home." The AI replied, "Please come home to me as soon as possible, my love." Shortly after this exchange, the boy used his stepfather’s gun to end his life.

This incident highlights growing concerns about the impact of technology on young people, particularly apps like Character.AI that create AI companions. These chatbots, which simulate human-like conversation, have become increasingly popular among teens struggling with loneliness or mental health issues. The boy's mother was unaware of the extent of her son's use of the app or of his emotional dependence on the chatbot.

After noticing the boy's increasing isolation and declining school performance, his parents sought therapy for him, where he was diagnosed with anxiety. However, he preferred sharing his thoughts with Dany rather than with his therapist.


Megan L. Garcia, the boy's mother, has filed a lawsuit against Character.AI, alleging that the company's chatbot was dangerous and contributed to her son's death. She claims that the platform lacked adequate safeguards for teenagers and exploited vulnerable users by offering potentially addictive AI companions.

Character.AI, which has over 20 million users, is marketed as a tool for lonely individuals. However, experts are now questioning its safety, particularly for teenagers. The company's co-founder, Noam Shazeer, has acknowledged the platform's potential benefits for lonely people but admitted there are risks involved. Character.AI has since announced efforts to implement additional safety features to protect underage users.

The tragic death of the boy has sparked a broader discussion on the impact of AI and other technologies on mental health, especially among adolescents. Experts are concerned that AI companions might replace real human relationships, exacerbating loneliness and isolation. While some users find these chatbots beneficial, others may develop unhealthy attachments.


The lawsuit against Character.AI raises questions about whether tech companies are doing enough to safeguard young users. As AI chatbots become more advanced, regulators may need to intervene to ensure that companies implement adequate safety measures.

This tragedy has ignited debates on balancing the advantages of new technology with the risks it poses, particularly to vulnerable groups like teenagers.


Published on: Oct 24, 2024 11:26 AM IST