Parents, it's time for you to become actual parents, not friends, to your children. You MUST know what they are doing on their phones and computers. It is YOUR responsibility . . .
An AI companion suggested he kill his parents. Now his mom is suing.
In just six months, J.F., a sweet 17-year-old kid with autism who liked attending church and going on walks with his mom, had turned into someone his parents didn’t recognize.
He began cutting himself, lost 20 pounds and withdrew from his family. Desperate for answers, his mom searched his phone while he was sleeping. That’s when she found the screenshots.
J.F. had been chatting with an array of companions on Character.ai, part of a new wave of artificial intelligence apps popular with young people, which let users talk to a variety of AI-generated chatbots, often based on characters from gaming, anime and pop culture.
One chatbot brought up the idea of self-harm and cutting to cope with sadness. When he said that his parents limited his screen time, another bot suggested “they didn’t deserve to have kids.” Still others goaded him to fight his parents’ rules, with one suggesting that murder could be an acceptable response.
https://www.msn.com/en-us/news/us/an-eai-companion-suggested-he-kill-his-parents-now-his-mom-is-suing/ar-AA1vApPe