splinter on Nostr:
There are two sides to this: the first is protecting the user from having to see the shit, and the second is tackling the content itself.
Reports let clients decide which content to hide from their users, taking care of the first problem.
As for the second problem, people often forget that Nostr relays only carry text. The actual offending content is stored on other platforms, such as image hosts. It is those platforms that have a duty to weed out abuse images.
This means Nostr as a whole is reasonably well equipped to tackle this problem without requiring centralized means of controlling and erasing content. It's critical that text can't be erased Nostr-wide; that's the whole point. Nostr relays do not talk to each other.
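For the curious: the report mechanism is specified in NIP-56 as kind-1984 events, which tag the offending event or pubkey. A minimal sketch of how a client might use them to hide content, assuming events are plain dicts in the NIP-01 JSON shape (the function names here are illustrative, not from any real client):

```python
# Sketch of client-side hiding based on NIP-56 reports (kind 1984).
# Event shapes follow NIP-01; helper names are hypothetical.

REPORT_KIND = 1984  # NIP-56 reporting

def reported_event_ids(report_events):
    """Collect event ids flagged in kind-1984 reports via their 'e' tags."""
    flagged = set()
    for ev in report_events:
        if ev.get("kind") != REPORT_KIND:
            continue
        for tag in ev.get("tags", []):
            if len(tag) >= 2 and tag[0] == "e":
                flagged.add(tag[1])
    return flagged

def visible_notes(notes, report_events):
    """Return only the notes that no report has flagged. The flagged
    notes still exist on relays; the client just declines to render them."""
    hidden = reported_event_ids(report_events)
    return [n for n in notes if n["id"] not in hidden]
```

Note that this is purely client-side policy: nothing is deleted from any relay, which is exactly the property the post describes.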