Wednesday, Debtember 22, 2021
Schadenfreude at some badly written bots
Back in October, I removed a whole section of my Gemini site because I got fed up with badly written bots. In the best practices guide there is a section about redirection limits (something that should be in the specification but for some reason isn't). In the section of my site I removed, I had such a test: a link that would always redirect to a link that would itself redirect, ad nauseam. So I would see a new client being tested, perhaps a dozen or two attempts to resolve the link, and then it would stop.
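For the curious, the test was conceptually nothing more than this. Here's a minimal sketch (not my actual server code; the path and the counter scheme are made up for illustration): every request under the test path gets a Gemini "30" redirect pointing at yet another URL under the same path.

    # Sketch of an endless-redirect test endpoint.  The path
    # "/test/redirect/" and the counter are made up; my server is not
    # written in Python.
    import itertools

    _counter = itertools.count()

    def redirect_test(request_path: str) -> bytes:
        """Answer every request with a redirect that itself redirects."""
        target = f"gemini://example.com/test/redirect/{next(_counter)}"
        # Gemini status 30 is a temporary redirect; the rest of the
        # header line is the URL being redirected to.
        return f"30 {target}\r\n".encode("utf-8")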
But then there were the clients written for automated crawling of Gemini space that never bothered with redirection limits.
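The limit the guide asks for is trivial to implement. Something along these lines would have been enough (a sketch, assuming a hypothetical fetch_once() that makes one Gemini request and returns the status, meta field and body; it's not from any real library):

    # Sketch of a redirect cap in a Gemini client.  fetch_once() is a
    # hypothetical helper, shown only to make the loop concrete.
    MAX_REDIRECTS = 5  # a small, fixed cap is all the guide asks for

    def fetch(url, fetch_once):
        for _ in range(MAX_REDIRECTS):
            status, meta, body = fetch_once(url)
            if status in (30, 31):   # 30/31: temporary/permanent redirect
                url = meta           # the meta field holds the new URL
                continue
            return status, meta, body
        raise RuntimeError(f"too many redirects chasing {url}")

A handful of lines of bookkeeping, and yet here we are.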
It got so bad that there was one client stuck for a month, endlessly making requests for a resource that was forever out of reach.
I had even listed the redirection test as a place for bots NOT to go in the site's robots.txt file (robots.txt itself being covered by a companion specification).
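The entry was nothing exotic; it looked roughly like this (the path shown is made up, since the real one no longer exists):

    # robots.txt, served from the top level of the Gemini site.
    User-agent: *
    Disallow: /test/redirect/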
And because tracking down a contact address for a bot's owner is needlessly difficult, I was like, XXXX it. I removed the redirection test (along with the rest of the section, which included a general Gemini client test, because, well, XXXX it). Why waste my bandwidth on people who don't care?
I only bring this up now because I'm noticing two bots attempting to crawl the now non-existent redirection test, probably with a backlog of thousands of links because hey, who tests their bots anyway?