Tuesday, Debtember 31, 2024
A preference for deterministic tools over probabilistic tools
Last month,
I added code to my assembler to output BASIC code instead of binary to make
it easier to use assembly subroutines from BASIC.
But I've been working on a rather large program that assembles to nearly 2K of object code,
and it takes a bit of time to POKE
all that data into memory.
So I took a bit of time
(maybe an hour total)
to add a variation—instead of generating a bunch of DATA
statements and using POKE
to insert the code into memory,
generate a binary file,
and output BASIC code to load said file into memory.
No changes to the assembly code are required.
So the sample code from last month:
		.opt	basic defusr0 swapbyte
		.opt	basic defusr1 peekw

INTCVT		equ	$B3ED		; put argument into D
GIVABF		equ	$B4F4		; return D to BASIC

		org	$7F00
swapbyte	jsr	INTCVT		; get argument
		exg	a,b		; swap bytes
		jmp	GIVABF		; return to BASIC

peekw		jsr	INTCVT		; get address
		tfr	d,x		; transfer to X
		ldd	,x		; load word from given address
		jmp	GIVABF		; return to BASIC

		end
I can now generate the previous BASIC code:
10 DATA189,179,237,30,137,126,180,244,189,179,237,31,1,236,132,126,180,244
20 CLEAR200,32511:FORA=32512TO32529:READB:POKEA,B:NEXT:DEFUSR0=32512:DEFUSR1=32520
or now a binary version and the BASIC code to load it into memory:
10 CLEAR200,32511:LOADM"EXAMPLE/BIN":DEFUSR0=32512:DEFUSR1=32520
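The magic numbers in both versions aren't arbitrary; they fall out of the assembly listing. A quick sanity check (the instruction sizes are the standard 6809 encodings for extended jsr/jmp and exg):

ORG = 0x7F00                 # org $7F00 from the listing
print(ORG)                   # 32512 -- DEFUSR0 points here (swapbyte)
print(ORG - 1)               # 32511 -- CLEAR protects memory above this
swapbyte_len = 3 + 2 + 3     # jsr extended (3) + exg a,b (2) + jmp extended (3)
print(ORG + swapbyte_len)    # 32520 -- DEFUSR1 points here (peekw)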
For this small of a program,
it's probably a wash either way,
but when the assembly code gets large,
it not only takes a noticeable amount of time,
but it also takes a considerable amount of space, as the DATA
statements still exist in memory.
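The two output modes amount to a small formatting difference over the same object code. A rough Python sketch of what such a generator might do (the function names and the CLEAR size are my own illustration, not the assembler's actual code; the LOADM variant assumes the binary file is written separately):

def emit_data_loader(code, org):
    """Emit BASIC that POKEs the object code from DATA statements."""
    data = ",".join(str(b) for b in code)
    end = org + len(code) - 1
    return [
        f"10 DATA{data}",
        f"20 CLEAR200,{org-1}:FORA={org}TO{end}:READB:POKEA,B:NEXT",
    ]

def emit_loadm_loader(filename, org):
    """Emit BASIC that loads a separately saved binary file."""
    return [f'10 CLEAR200,{org-1}:LOADM"{filename}"']

# The 18 bytes of object code from the example above, org'ed at $7F00 (32512)
code = bytes([189,179,237,30,137,126,180,244,
              189,179,237,31,1,236,132,126,180,244])
for line in emit_data_loader(code, 32512):
    print(line)
for line in emit_loadm_loader("EXAMPLE/BIN", 32512):
    print(line)

(The DEFUSR0/DEFUSR1 assignments would be appended to line 20 the same way in either mode; I've left them out of the sketch to keep it short.)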
But as I was finishing up on this code, I had an epiphany on why I'm not so keen on AI. The features I added to my assembler are there to facilitate easier development. They do save time and effort, and sans any bugs, they just work. With AI like ChatGPT or Copilot, the output is not deterministic but probabilistic—it may be correct, it may be mostly correct, it may be complete and utter garbage, but you can't tell without going over the output. They just don't work one hundred percent of the time, and that just doesn't work for me. I prefer my tools to be reliable, not “mostly” reliable.
And the claim that it writes boilerplate code faster? Why are programmers writing boilerplate code in the first place? I recall IDEs of the past that would generate all the boilerplate code for a GUI-based application for the programmer, no AI required at the time. Automatic refactorings have been a thing in Java IDEs for a decade, maybe two now? No AI required there, and it's more reliable than AI too.
I don't even buy the “but it makes it faster to write software” excuse. I'm not sure why being the “first to market” is even a thing. Microsoft was not first to the market with the GUI—that was Apple. And no, the Macintosh computer wasn't the first system with a GUI, nor even the first system with a GUI from Apple (that was the Lisa). In fact, Microsoft Windows 1.0 wasn't even good (seriously—it's not pretty). Google wasn't the first web search engine (there were easily a dozen engines, maybe more, before Google even showed up). Facebook wasn't the first “social media” type site (MySpace and Friendster come to mind). Amazon wasn't the first on-line retailer.
And so on.
But hey, there are plenty of programmers who find them useful. I'm just not one of them. The use of AI for programming is totally alien to my way of thinking.