Mario Zechner on Nostr:
While I had caching and rate limits in place, which saved me from even higher costs, I didn't have a robots.txt. Lesson learned.
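For anyone in the same spot, a minimal robots.txt that asks well-behaved crawlers to stay away from expensive endpoints might look like this (the /api/ path is a hypothetical example, not my actual route; note that Googlebot ignores Crawl-delay, so rate limits are still needed):

```
User-agent: *
Disallow: /api/
Crawl-delay: 10
```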
I looked into the logs because my anonymized goaccess stats showed about 2k unique visitors but 250k requests on a single day. That didn't seem right. Neither did my OpenAI costs for the past two days.
So I dug into the access logs and saw Googlebot going crazy.
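The digging itself is a one-liner if your server writes logs in the common "combined" format, where the user agent is the sixth double-quoted field. Here's a sketch against a tiny made-up sample log (the data and the /tmp path are illustrative, not my real traffic):

```shell
# Create a tiny sample log in combined format (hypothetical data).
cat > /tmp/access.log <<'EOF'
1.2.3.4 - - [01/Jan/2024:00:00:00 +0000] "GET / HTTP/1.1" 200 123 "-" "Googlebot/2.1"
1.2.3.4 - - [01/Jan/2024:00:00:01 +0000] "GET /a HTTP/1.1" 200 123 "-" "Googlebot/2.1"
5.6.7.8 - - [01/Jan/2024:00:00:02 +0000] "GET / HTTP/1.1" 200 123 "-" "Mozilla/5.0"
EOF

# Count requests per user agent, busiest first: splitting on the
# double quote makes the user agent field $6.
awk -F'"' '{print $6}' /tmp/access.log | sort | uniq -c | sort -rn
```

Running the same pipeline against the real access log is what made the crawler traffic jump out immediately.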