Two Texas parents filed a lawsuit this week against the makers of Character.AI, claiming the artificial intelligence chatbot is a "clear and present danger to minors," with one plaintiff alleging it encouraged their teen to kill his parents.
According to the complaint, Character.AI "abused and manipulated" an 11-year-old girl, introducing and exposing her "consistently to hypersexualized interactions that were not age appropriate, causing her to develop sexualized behaviors prematurely and without [her parent's] awareness."
The complaint also accuses the chatbot of causing a 17-year-old boy to mutilate himself, and, among other things, sexually exploiting and abusing him while alienating the minor from his parents and church community.
Source: www.foxbusiness.com