Matthew Green on Nostr:
I’m referring to the rise of “private” federated machine learning and model-building work, where the end result is to give corporations new ways to build models from confidential user data. This data was previously inaccessible (by law or customer revulsion) but now is fair game.
A typical pitch here is that, by applying techniques like Differential Privacy, we can keep any individual user’s data “out of the model.” The claim: using your private data is harmless, since a model trained “with your data” will be statistically close to one trained without it.
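To make the “statistically close” claim concrete, here is a minimal sketch of the clip-and-noise step that DP federated learning pitches typically rest on: each user’s model update is clipped to a fixed norm (bounding one person’s influence) and Gaussian noise is added. The function name, parameters, and defaults are illustrative, not any vendor’s actual API, and real deployments usually add the noise to the server-side aggregate rather than per update.

```python
import numpy as np

def privatize_update(update, clip_norm=1.0, noise_multiplier=1.0, rng=None):
    """Illustrative clip-and-noise step (Gaussian mechanism), as used
    in differentially private federated averaging. Hypothetical API."""
    rng = rng or np.random.default_rng()
    update = np.asarray(update, dtype=float)
    # Clip to a fixed L2 norm so no single user can move the model too far.
    norm = np.linalg.norm(update)
    if norm > clip_norm:
        update = update * (clip_norm / norm)
    # Add Gaussian noise calibrated to the clipping bound; larger
    # noise_multiplier means stronger privacy and a noisier model.
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=update.shape)
    return update + noise
```

The privacy guarantee is that the distribution of outputs changes very little whether or not your (clipped) update is included, which is exactly the sense of “close” the pitch relies on.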