Matthew Green
npub1tsr…4v4e
2024-03-05 14:49:07
in reply to nevent1q…hrw7

The critical point here is that the resulting models are a *massive* privacy threat to users. Your data might not reveal your embarrassing or deeply private preferences. But the resulting model might be able to determine exactly what you like. Its existence destroys your privacy.

This is happening today, with many tech firms deploying differential privacy and federated learning to dig deeper into user data and build models of their users’ behavior. It’s not all transparently “evil,” but arguably none of it is good for users’ privacy either.
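
For readers unfamiliar with the mechanism named above, here is a minimal sketch of differential privacy via the Laplace mechanism on a toy count query. The names (dp_count, flags, epsilon) and the dataset are illustrative assumptions, not any firm's actual deployment; the point is that the noise protects an individual's contribution to an aggregate, while the models trained from many such aggregates can still capture user preferences.

    # Illustrative sketch only: epsilon-differentially private count
    # of users with a sensitive flag, using the Laplace mechanism.
    import numpy as np

    def dp_count(flags: list[int], epsilon: float) -> float:
        """Return a noisy count of users with flag == 1.

        A count query has sensitivity 1 (adding or removing one user
        changes the result by at most 1), so Laplace noise with scale
        1/epsilon gives epsilon-differential privacy for this query.
        """
        true_count = sum(flags)
        noise = np.random.laplace(loc=0.0, scale=1.0 / epsilon)
        return true_count + noise

    # Hypothetical example: 1000 users, ~30% with the sensitive preference.
    users = [1 if np.random.rand() < 0.3 else 0 for _ in range(1000)]
    print(dp_count(users, epsilon=0.5))  # near 300; any one user is masked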
Author Public Key
npub1tsr0tzpcqxta5h0mut3jj29ekmvzcck6crrqy566p8hpet26sgss5g4v4e