fishcake on Nostr:
Yeah, fairly simple indeed. Scan for nudity and then scan the same image for the age of the people in it. Then a human has to verify whether it’s CSAM and report it if it is. Fairly effective given the simplicity, and good at catching AI CSAM.
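To make the flow concrete, here is a minimal Python sketch of that two-pass scan. The detect_nudity and estimate_min_age wrappers and both thresholds are illustrative assumptions; the note does not say which models or cutoffs are used, and the final CSAM determination is always left to a human reviewer.

import sys

def detect_nudity(image_path):
    # Placeholder (assumed): return a 0..1 score that the image contains nudity.
    raise NotImplementedError("plug in a nudity-detection model")

def estimate_min_age(image_path):
    # Placeholder (assumed): return the estimated age of the youngest person detected.
    raise NotImplementedError("plug in an age-estimation model")

NUDITY_THRESHOLD = 0.8  # assumed cutoff, not stated in the note
AGE_THRESHOLD = 18      # flag when the youngest estimated age is below this

def needs_human_review(image_path):
    """Two automated passes as described in the note: a nudity scan first,
    then an age scan on the same image. Only flagged images are escalated."""
    if detect_nudity(image_path) < NUDITY_THRESHOLD:
        return False  # no nudity detected
    if estimate_min_age(image_path) >= AGE_THRESHOLD:
        return False  # nudity, but everyone appears to be an adult
    return True  # nudity plus a possible minor: send to a human reviewer

if __name__ == "__main__":
    for path in sys.argv[1:]:
        if needs_human_review(path):
            # The human reviewer makes the final CSAM call and files the
            # report; the automated scan only surfaces candidates.
            print(f"FLAGGED for human review: {path}")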
Published at 2024-10-28 22:09:22
Event JSON
{
  "id": "0c523f7a5179d06c85a1bed2dc7e2be05a571a3000b6dd77953231f992f6053d",
  "pubkey": "8fb140b4e8ddef97ce4b821d247278a1a4353362623f64021484b372f948000c",
  "created_at": 1730153362,
  "kind": 1,
  "tags": [
    [
      "e",
      "f6155315121d937ad83739de741bbe3464e993029ba547863a5aa7ba9fd4eaa7",
      "",
      "root"
    ],
    [
      "e",
      "cfc9510b54b29045398403533dafc3f84f301a378c5fd9a27faa5f670b8c033b",
      "",
      "reply"
    ],
    [
      "p",
      "32e1827635450ebb3c5a7d12c1f8e7b2b514439ac10a67eef3d9fd9c5c68e245"
    ]
  ],
  "content": "Yeah, fairly simple indeed. Scan for nudity and then scan the same image for the age of people in it. Then human has to verify if it’s CSAM and report it if it is. Fairly effective given the simplicity, and good at catching AI CSAM",
  "sig": "7fc7ea92a336f7fb61156d3d43310e9aa26261282f43ef27507acc1965a0a335d0062e0f6126037696d0477b99b374572e53e1f701032aebac4515643c9f0c61"
}