IFTAS
2023-12-13 20:11:03


IFTAS intends to provide guidance, and to operate or facilitate services, to support electronic service providers (ESPs) who require assistance mitigating Child Sexual Abuse Material (CSAM) on their services.

Motivation

IFTAS serves the independent social media trust and safety community, and is driven in large part by the community Needs Assessment.

Support for CSAM issues is consistently ranked as one of the most requested needs, and as such IFTAS is seeking to mitigate the legal exposure and personal trauma faced by ESPs and content moderators who are tasked with moderating CSAM. 

Regulatory compliance requires ESPs either to actively scan for, or to respond to reports of, CSAM on their service. The regulatory requirements are confusing and jurisdictionally complex. Detection solutions can be costly, technically difficult to implement, and pose an additional regulatory burden. Moderating CSAM can be traumatic. The various bodies engaged in child safety are not open to working with thousands of ActivityPub service providers.

IFTAS wishes to:

Promote a healthier, safer Internet;
Reduce the regulatory burden and legal exposure for ESPs;
Minimise harm to content moderators;
Provide or facilitate the use of CSAM classification services while preserving privacy and security to the fullest extent possible;
Reduce duplicative effort;
Serve as a trusted voice for this issue in the open social web.

IFTAS Activities and Services

IFTAS intends to make various resources available, including but not limited to the following:

Content moderator trauma support

Moderators exposed to CSAM via their moderation workflows have expressed the need for post-trauma support. Working with the Centre for Abuse and Trauma Studies at Middlesex University London, IFTAS is reviewing self-help materials and trauma mitigation guidance, to be made available in the forthcoming IFTAS community library.

Legal and regulatory guidance

While we have published some guidance already (https://github.com/iftas-org/resources/tree/main/CSAM-CSE), IFTAS plans to consult with domain experts in relevant jurisdictions to provide guidance for ESPs, updating routinely to ensure accurate, actionable guidance from trustworthy sources.

IFTAS Media classification

Safer is an IFTAS-hosted enterprise deployment that performs hash-matching on images and videos securely transmitted from opted-in services to IFTAS for classification, and creates an automatic report to NCMEC if required. Shield is a hash-matching API from the Canadian Centre for Child Protection that can be called to examine locally hosted media (images and video) and provide a classification. 3-is is a similar service oriented to EU hosts. IFTAS is exploring methods to facilitate access to these services.

https://safer.io/
https://projectarachnid.ca/en/#shield
https://www.3-is.eu/
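
The flow behind each of these services is broadly the same: compute a fingerprint of each piece of uploaded media and compare it against a list of hashes of known CSAM maintained by a child-safety organisation. The sketch below illustrates that pattern with a plain SHA-256 digest checked against a locally held list; the file names and list location are hypothetical, and the services above use perceptual hashing (such as PhotoDNA or PDQ) and their own authenticated APIs rather than anything shown here.

    import hashlib
    from pathlib import Path

    def sha256_of_file(path: Path) -> str:
        """Return the hex SHA-256 digest of a file, read in chunks."""
        digest = hashlib.sha256()
        with path.open("rb") as fh:
            for chunk in iter(lambda: fh.read(1 << 20), b""):
                digest.update(chunk)
        return digest.hexdigest()

    def load_known_hashes(list_path: Path) -> set[str]:
        """Load one lowercase hex hash per line into a set for fast lookup."""
        return {line.strip().lower()
                for line in list_path.read_text().splitlines()
                if line.strip()}

    # Hypothetical local paths; a production deployment would instead call the
    # provider's API (Safer, Shield, 3-is) over an authenticated channel.
    known = load_known_hashes(Path("known_hashes.txt"))
    upload = Path("incoming/upload-1234.jpg")

    if sha256_of_file(upload) in known:
        print(f"{upload}: MATCH - quarantine and escalate per local policy")
    else:
        print(f"{upload}: no match against the current hash list")

Exact digests only match byte-identical copies, which is why the hosted services rely on perceptual hashes that still match after resizing or recompression.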

fedi-safety

fedi-safety is an open source CLIP interrogation tool that can help classify images. IFTAS is exploring methods to facilitate the use of fedi-safety locally, or as a third-party service.
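
CLIP interrogation here means scoring an image against a set of text prompts with a CLIP model and acting on the strongest match. The following is only a rough sketch of that zero-shot technique using the public Hugging Face transformers CLIP checkpoint and deliberately benign placeholder prompts; fedi-safety's own prompts, thresholds and models are not reproduced here, and the image path is hypothetical.

    from PIL import Image
    from transformers import CLIPModel, CLIPProcessor

    # Standard public CLIP checkpoint; fedi-safety may use a different model.
    model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
    processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

    # Benign placeholder prompts: the technique is the point, not these labels.
    labels = ["a photo of a landscape", "a photo of a person", "a screenshot of text"]
    image = Image.open("incoming/upload-1234.jpg")  # hypothetical path

    inputs = processor(text=labels, images=image, return_tensors="pt", padding=True)
    probs = model(**inputs).logits_per_image.softmax(dim=1)[0]

    for label, p in zip(labels, probs.tolist()):
        print(f"{label}: {p:.2f}")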

Known Hashtags

IFTAS plans to provide service administrators with a rolling list of known hashtags in use by sellers and sharers of CSAM, to support local service moderation decisions.

Known Hosts

IFTAS plans to provide service administrators with a rolling list of services seen to host CSAM with no intent or ability to moderate the content, to support local service moderation decisions.
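
Both rolling lists are intended to inform local moderation decisions rather than drive automatic removal. As a sketch of how an instance might consume them, assuming simple newline-delimited text files and a hypothetical review helper (the actual distribution format has not been specified):

    from pathlib import Path

    def load_list(path: Path) -> set[str]:
        """Read a newline-delimited list, ignoring blanks and '#' comments."""
        entries = set()
        for line in path.read_text().splitlines():
            line = line.strip().lower()
            if line and not line.startswith("#"):
                entries.add(line)
        return entries

    # Hypothetical local copies of the rolling lists.
    flagged_hashtags = load_list(Path("iftas-known-hashtags.txt"))
    flagged_hosts = load_list(Path("iftas-known-hosts.txt"))

    def review_needed(post_hashtags: list[str], origin_host: str) -> bool:
        """Return True if a post should be queued for human review."""
        tags = {t.lstrip("#").lower() for t in post_hashtags}
        return bool(tags & flagged_hashtags) or origin_host.lower() in flagged_hosts

    print(review_needed(["#example"], "social.example.org"))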

Best Practice

IFTAS is consulting with child safety experts including INHOPE, Arachnid, NCMEC, End Violence Against Children and others to source and share best practices for moderation workflow enhancements that minimise harm for moderators likely to be exposed to CSAM, for example blurring images, displaying them in monochrome, and using a dedicated browser profile for this work.
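
Two of those workflow changes, blurring and monochrome display, can be applied when generating the preview a moderator first sees. A minimal sketch with Pillow, using an arbitrary blur radius and hypothetical file paths; real tooling would typically do this in the moderation UI itself:

    from PIL import Image, ImageFilter

    def safe_preview(src: str, dst: str, radius: int = 12) -> None:
        """Write a greyscale, heavily blurred preview of a reported image."""
        img = Image.open(src)
        img = img.convert("L")                              # monochrome
        img = img.filter(ImageFilter.GaussianBlur(radius))  # blur; radius is arbitrary
        img.save(dst)

    # Hypothetical paths; a queue worker would call this before showing the item.
    safe_preview("incoming/report-5678.jpg", "previews/report-5678-blurred.jpg")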

Reference Material

IFTAS Moderator Needs Assessment Report (Q3 2023)
CSAM-CSE (IFTAS Guidance) 
About Child Safety on Federated Social Media – Fediversity – SocialHub
#140 – WIP: PhotoDNA Attestation extension (CW: mention of CSAM) – fep – Codeberg.org
2023-08-04 Special Topic Call – Social Web and CSAM: Liabilities and Tooling
Integrate PhotoDNA to scan for known CSAM · Issue #21027 · mastodon/mastodon · GitHub
https://github.com/mastodon/mastodon/issues?q=CSAM
Lemmyshitpost community closed until further notice – Lemmy.World
https://github.com/LemmyNet/lemmy/issues?q=CSAM
Stanford researchers find Mastodon has a massive child abuse material problem – The Verge
Child Safety on Federated Social Media 

https://about.iftas.org/2023/12/13/iftas-csam-roadmap/