What modern people can learn from old software

Did you know that you can now go, for free, to Archive.org, the great online library of all things, and load into your web browser an old, run-down emulated computer—a 1991 DOS box, a black-and-white Mac, a green-and-black Apple II—and run the WordPerfect of yore, boot old HyperCard stacks, or use VisiCalc from 1979 like God intended?

Maybe that doesn’t seem like a miracle to you. Fair enough. Moore’s Law has taken us from about 250 billion potential CPU cycles per year on the earliest Macs to a quintillion on a good gaming PC, a healthy 4,000,000x increase. Anyone in their right mind might legitimately ask: What? Why use a brand-new computer to run old spreadsheets? And I might nod and shrug, but inside I’m a translucent plastic iMac full of emotion. Because I think it’s important to emulate.

You can learn history by doing. Read books and visit museums; you can even walk across a battlefield. But you can’t understand software from screenshots any more than you can understand music from album reviews, or baseball from box scores, or Rome from gladiator movies, however much you enjoy gladiator movies. When you boot up a virtual version of a Macintosh from 30 years ago, you share the lived experience of millions of ancients. You can see how they spent their meager CPU budget to fill their low-resolution screens.

You learn their priorities. They started with batch processing, running programs as chunks of code, but as soon as the CPUs allowed, they made them interactive and alive, even if that meant just green numbers on a screen, à la VisiCalc. As soon as they could, early users went post-textual, pictorial—pointing at things, abandoning Spartan virtue in favor of Athenian excess. Later, in Moore’s affluence, we spent the new CPU cycles on color, networking, and sound, from beeps to playing CDs to MP3s.

Emulation also leads me to wonder whether the computing experience keeps getting better. I’m writing this in Google Docs so my editor’s little round avatar head can look in and make sure I don’t miss my deadline for once, but I would prefer to be writing it in WordPerfect 5.1 for DOS, the best word processor of all time—a blank screen lit only with letters and numbers, offering just enough bold and italics to keep things interesting. I remember WP51 the way a non-nerd might remember a vintage Mustang. You could just take that thing out and go, man.

But this is more than a museum trip for self-enrichment. Emulation forces me to get back to basics, to remember that for most people, computers are tools, not a lifestyle. When I buy a computer, one of the first things I do is set up my software emulation environments, which now amount to about a terabyte of old disk images and various operating systems. Keeping this history close helps me accept the horrifying truth that everything new in our industry was actually invented by a group of Californians sitting on beanbags during the Carter administration. What seems permanent today is as fleeting as Twitter’s fleets. GAFA becomes FAANG becomes MAMAA. There will be new acronyms soon.

I recently made the leap from software-based emulation to specialized hardware. I bought a small black metal box, about the size of three packs of playing cards, containing what’s called a field-programmable gate array—a shape-shifting circuit that takes on the characteristics of other devices. It exists solely to simulate retro machines, including Commodore’s Amiga and 64, Atari STs, 486s, and the various gaming platforms that are the main event for most people (Neo Geos, Game Boys, Atari Lynxes, all the way back to Spacewar! on the PDP-1).

The box is called MiSTer. It’s not a consumer product; it’s a community-built reference platform: you buy the parts, assemble them, download free software, plug in an HDMI cable, and it becomes an old machine. You pay around $600 for the privilege. It gives me the same joy I imagine in people who love expensive headphones or collect vintage vinyl—the feeling of getting at something more real. The cores simulate everything, all the little glitches and weirdnesses and timings that make a chip a chip, that make the mouse move the way you remember it. Watching old code run on a big, sharp modern screen is hyperreal. Like Proust’s madeleine, but from Cinnabon.
