NatroN on Nostr:
On your first point: it’s good to have the ability to selectively disclose information. For example, if you’re inputting health data to get a diagnosis, you may want to keep it private. Or you’re writing a book and don’t want your IP used for the benefit of others. But yeah, there’s plenty of info you don’t mind sharing with the world.
Another consideration is that once you feed personal data to a public LLM, you can’t take it back. It literally gets incorporated into the model, so there is no way of undoing it.
Published at 2025-02-21 14:45:19
Event JSON
{
  "id": "8b3a32ccd709124a9890551137885a9953568cf9e9746d67e390e30f2e74d1b2",
  "pubkey": "55d2c9343ba0aaefe44b8be1f158c99bac1bc5575de637bd9c426725e039d477",
  "created_at": 1740149119,
  "kind": 1,
  "tags": [
    [
      "e",
      "6d494a242358c39657864f7063bfcbdb8ac1e202393dd79467a33cdd3ff1f493",
      "wss://a.nos.lol",
      "root"
    ],
    [
      "e",
      "b286e32fc47567a888d48bfa5e9829387ebddbd0cb77b89f1d813b403291bb86",
      "",
      "reply"
    ],
    [
      "p",
      "e83b66a8ed2d37c07d1abea6e1b000a15549c69508fa4c5875556d52b0526c2b"
    ]
  ],
  "content": "On your first point: it’s good to have the ability to selectively disclose information. For example, if you’re inputting health data to get a diagnosis, you may want to keep it private. Or you’re writing a book and don’t want your IP used for the benefit of others. But yeah, there’s plenty of info you don’t mind sharing with the world.\n\nAnother consideration is that once you feed personal data to a public LLM, you can’t take it back. It literally gets incorporated into the model so there is no way of undoing it.",
  "sig": "218ce65830bc29b37ff496acff32f84512e7213f1d60aa7fd17f006960c4a299bfa864e140fb4c3a86b3e3a32be47a312a108230e0a1b87ded67ab6b0d648290"
}
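
For context on how the "id" field above relates to the rest of the event: under Nostr's NIP-01, the id is the SHA-256 hash of the compact JSON serialization of [0, pubkey, created_at, kind, tags, content]. The Python sketch below recomputes it from the fields shown; it assumes only the standard hashlib and json modules, and the "content" value here is an abbreviated placeholder, not the full note text.

import hashlib
import json

# Event fields copied from the JSON above; "content" must be the exact
# note text (including the blank line) for the hash to match the "id".
event = {
    "pubkey": "55d2c9343ba0aaefe44b8be1f158c99bac1bc5575de637bd9c426725e039d477",
    "created_at": 1740149119,
    "kind": 1,
    "tags": [
        ["e", "6d494a242358c39657864f7063bfcbdb8ac1e202393dd79467a33cdd3ff1f493", "wss://a.nos.lol", "root"],
        ["e", "b286e32fc47567a888d48bfa5e9829387ebddbd0cb77b89f1d813b403291bb86", "", "reply"],
        ["p", "e83b66a8ed2d37c07d1abea6e1b000a15549c69508fa4c5875556d52b0526c2b"],
    ],
    "content": "On your first point: ...",  # placeholder, not the full text
}

# NIP-01: serialize [0, pubkey, created_at, kind, tags, content] as compact
# UTF-8 JSON with no extra whitespace, then hash with SHA-256.
serialized = json.dumps(
    [0, event["pubkey"], event["created_at"], event["kind"], event["tags"], event["content"]],
    separators=(",", ":"),
    ensure_ascii=False,
)
event_id = hashlib.sha256(serialized.encode("utf-8")).hexdigest()
print(event_id)

With the exact content string, event_id should equal the published "id", and the "sig" field is the author's Schnorr signature over that id, verifiable against the "pubkey".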