Zachary Cutlip on Nostr
I created a GitHub repo for `robots.txt` entries to tell generative AI scrapers to fuck off
Please submit PRs (follow PR guidance in the README)
If there's another project like this, please let me know
https://github.com/zcutlip/gen-ai-robots.txt
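For reference, the entries the repo collects are ordinary robots.txt User-agent blocks. A minimal sketch is below; the crawler names shown (GPTBot for OpenAI, CCBot for Common Crawl, Google-Extended for Google's AI-training opt-out) are well-known tokens, but the repo's actual list may differ.

```
# Block known generative-AI crawlers from the whole site
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: Google-Extended
Disallow: /
```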
Published at 2024-07-02 04:30:53

Event JSON
{
  "id": "ac97909778b4e577448eadb532534dc381d2c072e6d166edd53a1e82618025c8",
  "pubkey": "6e189043fe53eb7a31d115b9c86c5fde5e7f066e1793e99bbe4f796bbbc79cb2",
  "created_at": 1719894653,
  "kind": 1,
  "tags": [
    [
      "proxy",
      "https://hachyderm.io/users/zcutlip/statuses/112715015980388720",
      "activitypub"
    ]
  ],
  "content": "I created a github repo for `robots.txt` entries to tell generative AI scrapers to fuck off\n\nPlease submit PRs (follow PR guidance in the README)\n\nIf there's another project like this please let me know\n\nhttps://github.com/zcutlip/gen-ai-robots.txt",
  "sig": "2e0bc7192fa5cdfc606cbb5ea4db826e9b38bdf685233c5c86adddabffde55196bc414d6423313a63a12662309e8478a1ef89b3bf2865f25b9025a8ffc3ebf4e"
}
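For anyone checking the dump against the protocol: NIP-01 defines the event id as the SHA-256 of the serialization [0, pubkey, created_at, kind, tags, content] with no extra whitespace. A minimal sketch in Python follows; the field values are copied from the event above, and it is assumed that json.dumps escaping matches NIP-01's rules for this particular content.

```python
import hashlib
import json

def nostr_event_id(event: dict) -> str:
    # NIP-01: id = sha256 of the JSON array
    # [0, pubkey, created_at, kind, tags, content], serialized without extra whitespace.
    payload = [
        0,
        event["pubkey"],
        event["created_at"],
        event["kind"],
        event["tags"],
        event["content"],
    ]
    serialized = json.dumps(payload, separators=(",", ":"), ensure_ascii=False)
    return hashlib.sha256(serialized.encode("utf-8")).hexdigest()

# Field values copied from the event JSON above.
event = {
    "pubkey": "6e189043fe53eb7a31d115b9c86c5fde5e7f066e1793e99bbe4f796bbbc79cb2",
    "created_at": 1719894653,
    "kind": 1,
    "tags": [
        [
            "proxy",
            "https://hachyderm.io/users/zcutlip/statuses/112715015980388720",
            "activitypub",
        ]
    ],
    "content": "I created a github repo for `robots.txt` entries to tell generative AI scrapers to fuck off\n\nPlease submit PRs (follow PR guidance in the README)\n\nIf there's another project like this please let me know\n\nhttps://github.com/zcutlip/gen-ai-robots.txt",
}

# Should print the event's "id" field if the serialization lines up with NIP-01.
print(nostr_event_id(event))
```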