Monday, January 02, 2023
Some notes on working with old C code
Recent postings have concerned testing software, so how can we test trek? A goal is good to have, as that may influence what we test and how we go about it. Say we want to modernize the source code,

    int main(argc, argv)
    int argc;
    char **argv;
    {
      int ch;
      ...

perhaps on account of it using rather old C that modern C compilers might eventually come to hate. A code refactor may break the game in various ways; whether the code still compiles is a test in itself, but one that may not catch the logic bugs we may introduce: sorting a list the wrong way, doubling damage numbers, things like that.
The post is briefly about testing the old game Star Trek that was popular in the 70s, and in this case, it's been ported to C, probably sometime in the 80s. The game is interactive, so writing any form of tests (much less “unit tests”) will be challenging, to say the least. (The post does go on to say that using expect would probably be in order.)
I have a bit of experience here with older C code. At one point, I got interested in Viola, an early graphical web browser from 1992, and it is the type of C code that gives C a bad name. It was written in the transition period between K&R C and ANSI C and had to cater to both, so function prototypes were spotty at best. It also made several horrible assumptions about primitive data types, mainly that pointers, integers and long integers were all interchangeable, and there were plenty of cases where signed quantities were compared to unsigned quantities. Horrible stuff.
The first thing I did was rewrite the Makefile to simplify it. The original build system was a mess of scripts and over 7,000 lines of make across 48 files; I was able to get it down to just 50 lines of make in one file. Of course, I'm only interested in getting this to compile on POSIX systems with X Windows, so it's easy to simplify the build system.
Second step: crank the C compiler warnings to 11 and fix all the warnings. That's when I started finding all the buried bodies: longs and pointers interchanging, questionable casts, signed and unsigned comparisons, and much, much more. I got it to the point where it works on a 32-bit system; it compiles on a 64-bit system but promptly crashes. And by “works” I mean I can browse gopher with it, but trying to surf the modern web is laughably disastrous.
But I digress. The first step is to crank the compiler warnings to 11 and fix all the warnings, then convert K&R C function declarations to ANSI C. Does that count as “refactoring?” I personally don't think so; it's just mechanical changes and maybe fixing a few declarations from plain int to unsigned int (or size_t). And once that is done, then you can think about refactoring the code.
Discussions about this entry
It still surprises me what some find difficult to do
There have been ongoing discussions in Gemini about a webmention-like mechanism. So I was intrigued by this statement:
The problem here is, that this mechanism includes some script that adds some complexity to the maintenance of the gemini capsule. As bacardi55 writes:
I do know that asking capsule owners to deploy a script will be the biggest concern here, but I guess there is always a "price to pay" … Yes it will require a CGI script, but it should be doable even with a small bash script to not add too much complexity in maintaining a capsule.
I agree that some kind of programming and scripting will be necessary to get notified. However I think that we can do it at least without a CGI-script. Here is the way I think I have found.
Gemlog responses - bacardi55's concept without CGI
And he goes on to implement a scheme that adds complexity to the configuration of the server, plus the issues of scheduling a program to scan the logfiles for Gemini requests. I've done the logfile scanning for “Project: Wolowizard” and “Project: Lumbergh” and it was not an easy thing to set up.
Okay, in my case, it was checking the logs in real time to see if messages got logged as part of testing, but that aside, checking the logs for requests might not be straightforward. In this case, it sounds like he has easy access to the log files, but that is not always the case. There have been plenty of systems I've come across where normal users just don't have access to the logs (and I find that annoying, but that's a rant for another time).
Then there's scheduling a script to run on a regular schedule. In the past, this would be cron and the bizarre syntax it uses, but I'm not sure what the new hipster Linux systemd way is these days (which itself is a whole can of worms).
And it's not like the CGI script has to be all that difficult. Here's a script that should work (it's somewhat untested; I have the concept running on my Gemini server as an extension, and the CGI script below is based upon that extension):
    #!/usr/bin/env lua
    
    query = os.getenv("QUERY_STRING")
    
    if query == nil or query == "" then
      io.stdout:write("10 URL to send\r\n")
    else
      query = query:gsub("%%%x%x", -- decode URL encoded data
        function(c)
          return string.char(tonumber(c:sub(2),16))
        end)
    
      local mail = io.popen("/usr/sbin/sendmail me","w") -- send email
      if mail then
        mail:write(string.format([[
    From: <me> (replace with real email address)
    To: <me>
    Subject: Mention, sir!
    
    %s
    ]],query))
        mail:close()
        io.stdout:write("20 text/plain\r\nIt has been accepted.\r\n")
      end
    end
    
    os.exit(0)
Yes, this relies upon the email recipient to check that the URI has the proper link, but it's simple and to the point. The only issue here is getting the Gemini server to run this script when /.well-known/mention is requested, and I feel that is easier than dealing with scanning logfiles and running cron jobs, but that's me.
As for the actual proposal itself, I don't have much to say about it, except that I don't like the mandated text it uses in the link. I think just finding the link should be good enough. Even better, in my mind, would be two links, much like webmention uses, but that doesn't seem to be a popular opinion.