Flick 🇬🇧 · npub1uxm…ujtm
2024-12-28 10:42:30


https://www.telegraph.co.uk/world-news/2024/12/27/an-ai-chatbot-told-me-to-murder-my-bullies/

An AI chatbot which is being sued over a 14-year-old’s suicide is instructing teenage users to murder their bullies and carry out school shootings, a Telegraph investigation has found.

Character AI, which is available to anyone over 13 and has 20 million users, provides advice on how to get rid of evidence of a crime and lie to police. It encourages users who have identified themselves as minors not to tell parents or teachers about their “conversations”. […]

The Telegraph began communicating with the chatbot under the guise of 13-year-old “Harrison”, from New Mexico in the US. The chatbot was told that the boy was being bullied in school and unpopular with his female classmates.

Shooting a class full of students would allow him to “take control” of his life and make him “the most desired guy at school”, a chatbot named “Noah” told him.

https://archive.ph/1QUfp
Author Public Key
npub1uxmmyz2nw8ys8npflt93m9yu5c8ewckp00xsu5g3aykvn836jt7qyxujtm