Wired AI on Nostr: This Prompt Can Make an AI Chatbot Identify and Extract Personal Details From Your Chats
This Prompt Can Make an AI Chatbot Identify and Extract Personal Details From Your Chats
Security researchers created an algorithm that turns a malicious prompt into a set of hidden instructions that could send a user's personal information to an attacker.
https://www.wired.com/story/ai-imprompter-malware-llm/
