dansup on Nostr:
I also want to mention that FediDB uses a well-defined User-Agent, and does not try to bypass limits with remote crawlers or by any other means.
I understand there was a disagreement between myself and GtS regarding robots.txt; however, I always meant to add support for it, so I am doing that now.
The crawler page will be updated with instructions on how to block the crawler once that support is ready.
https://fedidb.org/crawler.html
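
For anyone curious what honoring robots.txt looks like in practice, here is a minimal sketch using Python's standard urllib.robotparser. The "FediDBCrawler" User-Agent token, the example instance URL, and the endpoint path are placeholders for illustration only, not the actual values FediDB uses; the real details will be on the crawler page above.

import urllib.robotparser

# Placeholder values for illustration; the real User-Agent token
# will be documented at https://fedidb.org/crawler.html
USER_AGENT = "FediDBCrawler"
instance = "https://example.social"

# Fetch and parse the instance's robots.txt before crawling
parser = urllib.robotparser.RobotFileParser()
parser.set_url(instance + "/robots.txt")
parser.read()

# Only request endpoints the instance allows for this User-Agent
target = instance + "/api/v1/instance"
if parser.can_fetch(USER_AGENT, target):
    print("allowed to fetch", target)
else:
    print("robots.txt disallows", target, "- skipping")

With this in place, an instance admin can block the crawler simply by adding a Disallow rule for that User-Agent in their robots.txt.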