Cory Doctorow on Nostr
Here's a fun AI story: a security researcher noticed that large companies' AI-authored source-code repeatedly referenced a nonexistent library (an AI "hallucination"), so he created a (defanged) malicious library with that name and uploaded it, and thousands of developers automatically downloaded and incorporated it as they compiled the code:
https://www.theregister.com/2024/03/28/ai_bots_hallucinate_software_packages/

1/
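The mechanic behind the story is a plain registry lookup: a package name an AI invents is dangerous precisely because anyone can register it later. As a rough, hypothetical sketch (not the researcher's actual tooling, and limited to PyPI, whereas the article spans several package ecosystems), here is how one might check whether an AI-suggested dependency is actually registered before installing it, using PyPI's public JSON endpoint:

# Hypothetical sketch: verify an AI-suggested dependency is actually
# registered on PyPI before installing it. pypi.org serves package metadata
# at /pypi/<name>/json and returns HTTP 404 for names that were never uploaded.
import urllib.error
import urllib.request

def exists_on_pypi(package: str) -> bool:
    """Return True if `package` is registered on PyPI, False on a 404."""
    url = f"https://pypi.org/pypi/{package}/json"
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            return resp.status == 200
    except urllib.error.HTTPError as err:
        if err.code == 404:
            return False
        raise  # other failures (rate limits, outages) deserve a closer look

if __name__ == "__main__":
    # 'requests' is real; the second name is a made-up stand-in for an
    # AI-hallucinated dependency.
    for name in ("requests", "totally-hallucinated-helper-lib"):
        status = "registered" if exists_on_pypi(name) else "not on PyPI"
        print(f"{name}: {status}")

Note the limitation: once someone squats the hallucinated name, as the researcher did here, this check happily reports it as registered, which is exactly why the attack works. Existence on the registry is not the same as trustworthiness.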
Published at 2024-04-01 14:44:24

Event JSON
{
  "id": "423a24c2198bf03bfa32f5237b38bfc42b416aa0e7cc6ebe4872cf3de7d85281",
  "pubkey": "21856daf84c2e4e505290eb25e3083b0545b8c03ea97b89831117cff09fadf0d",
  "created_at": 1711982664,
  "kind": 1,
  "tags": [
    [
      "proxy",
      "https://mamot.fr/users/pluralistic/statuses/112196495887778626",
      "activitypub"
    ]
  ],
  "content": "Here's a fun AI story: a security researcher noticed that large companies' AI-authored source-code repeatedly referenced a nonexistent library (an AI \"hallucination\"), so he created a (defanged) malicious library with that name and uploaded it, and thousands of developers automatically downloaded and incorporated it as they compiled the code:\n\nhttps://www.theregister.com/2024/03/28/ai_bots_hallucinate_software_packages/\n\n1/",
  "sig": "1d42b338f0ef8dc7170f1c42da8a69ea4dec029f0607026eb432484ddc415b56396c5fdfeb05574294d45ae5388315fdc7d081026a6dfb2ac72df5cec8c2b9eb"
}