What is Nostr?
Urusan /
npub1xum…auka
2023-12-30 04:07:02
in reply to nevent1q…p9mu

Urusan on Nostr: npub182sqd…vq6t5 npub19y8xk…vwxgj npub15xlf0…7dj0q I think the main thing going ...

npub182sqds7al8sn3nxpks6cg995wqgz5nxd6afn5mtmjymum9xx4zdq2vq6t5 (npub182s…q6t5) npub19y8xknr5twkrg2aeu6dyt9rhk5eg6punm4kl4q07pp0av8aa47esmvwxgj (npub19y8…wxgj) npub15xlf045s9sutk2chghqq5w6lfsmdkl2mymdtj825pxslfs3s74xqw7dj0q (npub15xl…dj0q) I think the main thing going on here is that traditional ideas about the separation between developers and users are breaking down for systems like LLMs. When you are using a LLM, you are simultaneously programming it.

This is why prompt injection attacks are a thing. If the user input were merely passive data it would be impossible to execute such an attack. User input is code.
Author Public Key
npub1xumyfay8chrkgeke9dwqrvqtt5zz9jtpgdmj8mrza60f5akd8qwq72auka