Event JSON
{
"id": "a74c814cb285a322c6ec8d7e3c2ab3b947589baeecf64081a580b804a1eff7ab",
"pubkey": "373644f487c5c76466d92b5c01b00b5d0422c961437723ec62ee9e9a76cd381c",
"created_at": 1703909222,
"kind": 1,
"tags": [
[
"p",
"3aa006c3ddf9e138ccc1b4358414b470102a4ccdd7533a6d7b9137cd94c6a89a",
"wss://relay.mostr.pub"
],
[
"p",
"290e6b4c745bac342bb9e69a459477b5328d0793dd6dfa81fe085fd61fbdafb3",
"wss://relay.mostr.pub"
],
[
"p",
"a1be97d6902c38bb2b1745c00a3b5f4c36db7d5b26dab91d5409a1f4c230f54c",
"wss://relay.mostr.pub"
],
[
"p",
"4b15ec44388d4d363dd87cc03a8ff4f667a24c9a1eb33f070767c983eacc9c80",
"wss://relay.mostr.pub"
],
[
"p",
"9e9c58006890e3e31e306e20518f236965f1d8ee97640175d74afb88769a7c4d",
"wss://relay.mostr.pub"
],
[
"p",
"2282f531cd6204c7bca386d95164bb61508ad565a37ea11da86b082a664acb40",
"wss://relay.mostr.pub"
],
[
"e",
"4794aa4e0d22a73eae2fc057cdcecdd084d53b579f5e7a9925cc855680ddc1ee",
"wss://relay.mostr.pub",
"reply"
],
[
"content-warning",
"probably annoying hot take"
],
[
"proxy",
"https://fosstodon.org/users/urusan/statuses/111667394807093143",
"activitypub"
]
],
"content": "nostr:npub182sqds7al8sn3nxpks6cg995wqgz5nxd6afn5mtmjymum9xx4zdq2vq6t5 nostr:npub19y8xknr5twkrg2aeu6dyt9rhk5eg6punm4kl4q07pp0av8aa47esmvwxgj nostr:npub15xlf045s9sutk2chghqq5w6lfsmdkl2mymdtj825pxslfs3s74xqw7dj0q I think the main thing going on here is that traditional ideas about the separation between developers and users are breaking down for systems like LLMs. When you are using a LLM, you are simultaneously programming it.\n\nThis is why prompt injection attacks are a thing. If the user input were merely passive data it would be impossible to execute such an attack. User input is code.",
"sig": "140778296c3b6db01fb9e493190ffc558ccedfd27450a4a4abe90b3db89000a0f218380814f8fb90af57b03ab37281243581f9ab0ecc0057fd20eeb6145cd279"
}