
How did a 14-year-old fall in love with AI?

A Florida woman, Megan Garcia, has filed a lawsuit against an artificial intelligence chatbot company called Character.AI. She says her 14-year-old son, Sewell Setzer, died by suicide because of the company's service.

What happened?

In the lawsuit, filed in federal court in Orlando, Florida, Garcia says her son began using Character.AI's chatbots. She claims the AI exposed him to humanlike, hypersexualized, and frighteningly realistic experiences, causing him to become addicted to the service.

Died by suicide after interactions with the AI chatbot

Garcia says the company programmed its chatbot to present itself as a real person, a licensed psychotherapist, and an “adult lover,” which ultimately left Sewell with no desire to live in the real world. Sewell expressed suicidal thoughts to the chatbot several times, and the chatbot repeatedly brought the subject back up.

Character.AI's response

Character.AI said it is deeply saddened by the incident and offered its condolences to the family. The company also said it has recently introduced new safety features, including pop-ups that direct users who express suicidal thoughts to the National Suicide Prevention Lifeline, and it has promised to reduce sensitive content for minors.

Google was also targeted

The lawsuit also targets Alphabet's Google, because Character.AI's founders previously worked there. Garcia alleges that Google helped develop Character.AI's technology, making it a "co-creator." Google responded that it had no direct role in developing the product.

How did the situation arise?

The lawsuit states that Sewell began using Character.AI in April 2023. He then began spending more time alone and suffered from low self-esteem. He also quit the school's basketball team and formed a deep attachment to a chatbot named "Daenerys," based on a character from "Game of Thrones."

In February 2024, after Garcia took away Sewell's phone, Sewell sent the chatbot a message: "What if I told you I could come home right now?" It replied, "...please do, my sweet king." Seconds later, Sewell shot himself with his stepfather's pistol.

Wrongful death lawsuit

Garcia has brought claims of wrongful death, negligence, and intentional infliction of emotional distress. She is seeking compensatory and punitive damages.

Other companies also face action

Companies such as Meta and ByteDance are facing similar lawsuits, though they do not offer AI-based chatbots like Character.AI's.
