spmatich :blobcoffee: on Nostr:
npub1xr8gd35szene0r2yvfafr29tjwdedgwd89lnfpmh3pylnygnvd4qmf3q0g (npub1xr8…3q0g) npub1cay9cekft9qlmepdfrltf7syapgkr7pz2x5hlsjx2m2x4rmusplstxekld (npub1cay…ekld) it's not so much the models as the datasets used to train them? If it becomes law to disclose what and whose data is used to train models, then presumably there will be a financial incentive for models where the training data is not disclosed.
I'm thinking about deep fakes and how they might be used for coercion, extortion, or blackmail. In that case, if the model used to produce a deep fake was trained on the target's data without permission, to make it more convincing, that might have a market. Not just for criminal activity, but also for espionage.
Or maybe it would just be a dataset scraped from the socials of a company like Meta. So like Cambridge Analytica, but where the data is used to train models. It's not easy to prove where the data came from once a model has been trained, so black-market models might also be a way to hide data theft or unauthorized access.