I just discovered, via CNN, that the Commodore 64 turned 25 last summer.

I received a Commodore 64 as a much-appreciated gift for my confirmation, when I was in my first year of high school (9th grade). It was named after its then remarkable 64 kB of memory; its operating system and its Basic interpreter fit into an additional 20 kB of ROM. It had a graphics chip and a sound chip that were not bad for the time, and it was endlessly fun to play with. Its Basic language had instructions to read (*peek*) and write (*poke*) directly to memory locations, and this was how pretty much everything was done. To draw on the screen, one would set a certain bit of a certain memory location to switch to a graphics mode, and then one would write directly onto the bitmap. Similarly, one could play a note on a certain channel, with a certain waveform, for a certain amount of time, by setting certain other memory locations. In 6th to 8th grade (middle school) we studied music, which consisted in part of learning to play the recorder. The least said about my recorder-playing skills the better, but I left 8th grade with a stack of very simplified scores of famous songs, which I then proceeded to translate into the numerical codes required by the C=64 sound chip so that I could make it play them. I also amused myself with more complicated projects, usually involving the drawing of three-dimensional objects on the screen.
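Playing a note worked roughly like this, a sketch from memory of the SID register map (the pokes from 54272 to 54296 address the sound chip; the particular values are illustrative, not a transcription of anything I actually wrote back then):

```basic
10 REM PLAY ONE NOTE ON SID VOICE 1 BY POKING REGISTERS
20 POKE 54296,15 : REM VOLUME TO MAXIMUM
30 POKE 54277,9  : REM ATTACK/DECAY ENVELOPE
40 POKE 54278,0  : REM SUSTAIN/RELEASE ENVELOPE
50 POKE 54273,17 : REM FREQUENCY, HIGH BYTE
60 POKE 54272,37 : REM FREQUENCY, LOW BYTE
70 POKE 54276,17 : REM TRIANGLE WAVEFORM, GATE BIT ON
80 FOR T=1 TO 500 : NEXT T : REM CRUDE DELAY WHILE NOTE SOUNDS
90 POKE 54276,16 : REM GATE BIT OFF, NOTE RELEASES
```

A song was just a long sequence of such pokes with different frequency bytes and delays, which is exactly what translating those recorder scores amounted to.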

People who have met me later in life may be surprised to learn that I spent long hours *programming for fun*. Not that I need to be defensive or anything, and I certainly did not know it then, but, at the time, programming, even in the very basic Basic that came with the computer, was the closest thing to math that I could do. Certainly, it was much closer than the “math” I was getting in school, which consisted of learning to run certain numerical and algebraic algorithms by hand. Indeed, I don’t think I encountered anything closer to math than programming until the first year of college, when the whole notion of axioms, theorems, proofs, and “playing a game with meaningless symbols” was unloaded on me in a course innocuously termed “Geometry.” (Nominally a course on linear algebra, it was a parody of Bourbakism as a teaching style. In the first class the professor came in and said, *a vector space is a set with two operations that satisfy the following nine axioms. Now I should like to prove the following proposition…* I am not joking when I say that the fact that the elements of a $k$-dimensional vector space are $k$-tuples of numbers came as a revelation near the very end of the course.)

The fact that the “type” of a program is similar to a statement, and the implementation of a program is similar to a proof, should be familiar to anybody who has written both. In both cases, one needs to break down an idea into basic steps and be very precise about how each step is realized; if a sequence of steps is repeated twice in a similar way, one should abstract the similarity, write the abstracted version separately, and then use it twice; and so on. The Curry–Howard isomorphism establishes this connection in a formal way, between a certain way of writing proofs (say, a cut-free Gentzen sequent calculus for intuitionistic logic) and a certain way of writing programs (say, the typed $\lambda$-calculus). I know because I once took a course based on the totally awesome book *Proofs and Types* by Girard, which is out of print but available for free on the web.
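A toy instance of the correspondence, written here in Lean (an anachronism in a post about the C=64, but it makes the point compactly): the program below has the type $A \land B \to B \land A$, and writing it is literally the same act as proving that conjunction commutes.

```lean
-- Under Curry–Howard, the type is the statement and the
-- program inhabiting it is the proof: this function takes
-- a proof of A ∧ B apart and reassembles it as B ∧ A.
theorem and_swap (A B : Prop) : A ∧ B → B ∧ A :=
  fun ⟨ha, hb⟩ => ⟨hb, ha⟩
```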

But we were talking about the Commodore 64. There was something amazing about a functional computer whose operating system fit into a few kilobytes and that many people could understand inside out. One could buy magazines devoted in good part to Basic programs, typically video games, that one could copy. Naturally, one would then be able to change the game, and to see what a reasonably non-trivial program looked like. The operating system I am using now has a source code that is probably millions of lines long, there is probably no person with a complete understanding of it, and it sometimes does mysterious things. It is, of course, also able to handle more than one program running at a time. It was fun to turn on a computer, instantly get a prompt, and type *10 PRINT “HELLO WORLD”* and then *RUN*, while now one has to do this. Of course, riding a bike is simpler than driving a car, which is simpler than piloting an airplane, but they have different ranges.

Under the Curry-Howard isomorphism, programming in the modern sense is more like Algebraic Geometry. One has to spend a lot of time learning how to use an expansive set of libraries, and in one’s lifetime it would be impossible to reconstruct how everything works from first principles, but then one has really powerful tools. I prefer the hands-on ethos of Combinatorics, where the big results are not general theorems, but rather principles, or ways of doing things, that one learns by reading other people’s papers, and replicating their arguments to apply them to new settings, changing them as needed.

And before I get distracted once more away from what is nominally the subject of this post, happy birthday to the Commodore 64 and to whoever is turning 30 tomorrow.

“I am not joking when I say that the fact that the elements of a k-dimensional vector space are k-tuples of numbers came as a revelation near the very end of the course.”

I believe you mean k-tuples of numbers all but finitely many of which are zero.

“to whoever is turning 30 tomorrow.” Happy Birthday to you?

“Under the Curry-Howard isomorphism, programming in the modern sense is more like Algebraic Geometry.”

I disagree in the sense that modern programming uses more powerful concepts, but once you get an interpreter set up you’re pretty much where you started as a youngster.

#3: I was thinking of something like programming in C++ under .NET, or in any setting where you have enormous libraries and there is a big overhead in doing simple things, but considerable savings in doing complicated things. I guess writing a script in Perl, which would count as “modern,” is still very simple.

wow, my one moment of fame in theory — thanks, luca!

Speaking of turning 30, the Apple II and Commodore PET also turned 30 this year. The original PET had a monochrome monitor in the same case as the keyboard (unlike the Apple II, which had a separate monitor and color capability), and, like the Apple II, its main storage device was a tape recorder, which also became the main storage device for the VIC-20, the Commodore 64’s very limited predecessor. (64KB of RAM was positively luxurious compared to the 4 or 5KB that the VIC-20 or original PET had.)

I got a summer job programming a rate quotation system on the 16KB Commodore PET to be used by group insurance agents in field offices of a large insurance company. The large floppy disks (in this later model) went bad so frequently we were constantly copying them to make sure we didn’t lose everything.

I give the company credit for seeing the potential benefits of personal computers, but there was some tension between the two of us who wrote BASIC and played games on the PET and the PL/I coders, who had to write out their neatly coded sheets for keypunchers to type up and enter into the IBM mainframe in another building.

Nice Post

re: #1

insert pedantic comment about choice of basis here.