Rabble on Nostr: Yeah, I think many countries would consider ML-generated CSAM to be the same thing as an actual picture or video taken of a sexualized child.
UK for example: https://www.bbc.com/news/uk-65932372
And in Australia it can be text, not just images or video: https://www.lexology.com/library/detail.aspx?g=be791d54-9165-4233-b55a-4b9dad5d178d
The risk for most relay operators is that people will use Nostr for the discovery / connection between people, who then go on to other apps / servers to actually exchange the content. Apparently it’s a kind of whack-a-mole with the different hashtags people use to search for this kind of content.