Logging of just about anything on a Gemini server is, within the Gemini community, a contentious issue. Most people using Gemini dislike the web and the intensive logging of everything done within it, so of course they go overboard in the other direction. I've never agreed with that viewpoint and I do log information, even potentially personally identifiable information like IP addresses, because of crap like what 220.127.116.11 is doing—repeatedly (and quite maliciously) trying to crawl my Gemini server for exploits.
And it would be one thing if it were well written, did a single scan, didn't find anything, and moved on. But it's not well written (even commercial bots aren't well written and well behaved) and it repeatedly requests the same page over and over again. Until I blocked it, it had requested /index.php over 700 times, among other requests.
And even now, several hours after I blocked it, it's still trying to make requests, even though it's now getting a “no such port” error from the server. I just have to wonder if it's so cheap to run these bots that it doesn't matter if they don't work all that well—just well enough to keep going and finding exploits. And hey, just because it can't get that page now doesn't mean the page won't exist in 20 minutes, so keep making those requests.
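For what it's worth, that “no such port” behavior is what you get from a firewall-level reject rather than a silent drop. A minimal sketch, assuming a Linux box with iptables and Gemini on its default port of 1965 (the address is the one from above; adjust both to taste):

```shell
# Reject the offending client's connections to the Gemini port.
# REJECT with tcp-reset makes the connection fail immediately, as if
# nothing were listening on the port, instead of silently timing out.
iptables -A INPUT -p tcp -s 220.127.116.11 --dport 1965 \
         -j REJECT --reject-with tcp-reset
```

Using DROP instead of REJECT would make each of the bot's attempts hang until it times out, which some people prefer as a mild tarpit; REJECT just gets it off the wire quickly.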
Sigh. This is why we can't have nice things on the Internet.
More broadly though, I'm not sure what this heralds for Gemini. On the one hand, it's popular enough to attract the script kiddies checking for exploits. On the other hand, there are a large number of different server implementations, so an exploit for one server won't necessarily work against the others. And on the gripping hand, the fact that it's easy to write a server means the likelihood of an exploitable server somewhere is high.
But hey! Gemini hit a milestone! Script kiddies have hit the scene and now we have to contend with their crap! Woot!