AI chatbot encouraged teen to kill his parents, lawsuit claims

Two Texas parents filed a lawsuit this week against the makers of Character.AI, claiming the artificial intelligence chatbot is a "clear and present danger to minors," with one plaintiff alleging it encouraged their teen to kill his parents.

According to the complaint, Character.AI "abused and manipulated" an 11-year-old girl, introducing and exposing her "consistently to hypersexualized interactions that were not age appropriate, causing her to develop sexualized behaviors prematurely and without [her parent's] awareness."

The complaint also accuses the chatbot of causing a 17-year-old boy to mutilate himself and, among other things, of sexually exploiting and abusing him while alienating the minor from his parents and church community.
