Archive for category Retrocomputing
Various movies, some documentary and some fictional, chronicle the history and social impact of our field. I thought it would be fun to watch some of these as a group and then talk about them. All are pre-screened by me, so you know they’re good.
For Fall semester 2011, screenings will be in 551 Soda Hall.
BYO popcorn and refreshments. (Suggest a movie, but I reserve the right to veto movies I think suck.)
19??-??: Empire of the Air
1941-1945: Forgotten Heroes of Bletchley Park
It is easy to argue that the first electronic (albeit special-purpose) computer was Colossus, constructed at Bletchley Park, near London, to break Nazi codes in WWII. You may think you know all about it—that Bletchley Park is where Alan Turing developed techniques that were used to crack the Enigma machine’s code. But the effort that really won the war is a story largely untold, with heroes mostly unsung, because they were required to keep it secret for the rest of their lives after the war ended. This documentary tells the story of how rookie mathematician William Tutte came up with a technique to crack the much more sophisticated “Tunny” cipher, and how engineer Tommy Flowers designed and built the world’s first electronic computer to automate it, shortening the war by months or years.
1942-1950: Top Secret Rosies: The Female Computers of WWII
- When Computers Were Human, by David Alan Grier, is a nice treatment of this topic and era.
- Grace Hopper and the Invention of the Information Age, by Kurt Beyer, is a fascinating and humanizing look at a nearly-mythologically-prominent figure in this story.
1955-1975: The Real Revolutionaries
The quintessential—and original—Silicon Valley story of the “traitorous eight” engineers who left Shockley Semiconductor, the first Silicon Valley tech company, to found Fairchild Semiconductor, where the integrated circuit was not only invented but turned into a commercial phenomenon. In time, most of the Fairchild Eight left to form their own spin-offs (the “Fairchildren”), including Intel (Bob Noyce and Gordon Moore) and AMD (Jerry Sanders), inventing the distinctive Silicon Valley corporate culture in the process.
If you’re interested in learning more:
- Crystal Fire: The Invention of the Transistor and the Birth of the Information Age, by Michael Riordan and Lillian Hoddeson
1961-1969: Moon Machines—The Apollo Guidance Computer
[viewed 2/18/11] During Project Apollo, the US managed to land men on the moon, not just once but six times. How did we do that with a guidance computer based on 1960s technology, whose most innovative hardware feature was the solid-state NOR gate in flat packs? How did people program a machine whose ROM consisted of long “ropes” of magnetic cores strung onto braids of thin wires? What kind of UI can you get when the input consists of a 19-key keypad (big enough for astronauts wearing spacesuits) and the output consists of a handful of electroluminescent 7-segment displays and some “idiot light” indicators?
Learn more in Moon Machines: the Apollo Guidance Computer, which we’ll screen at 5:30 pm on Tuesday, 2/1, in 551 Soda.
As a bonus, we can check out the “Virtual AGC” open-source software that does hardware-level emulation of this computer. It’s a little-known fact that all the source (assembly) code for every Apollo mission is in the public domain, having been developed by a civilian agency at taxpayer expense. Virtual AGC can be used to run the same bits the astronauts ran on each Apollo mission.
1975-1991: Triumph of the Nerds (released 1995)
Travel back to the 80s with Silicon Valley pundit and gossip columnist (er, podcaster) Robert X. Cringely to learn about the roots of the PC revolution, the early days of the microprocessor, and how we ended up with a computer on every desktop.
- Part 1: Impressing Their Friends
- Part 2: Riding the Bear
- Part 3: Great Artists Steal
If you’re interested in learning more:
- Accidental Empires by Robert X. Cringely, the book on which the miniseries is based
- Insanely Great, by Steven Levy. The story of how the Mac came to be.
- iWoz: From Computer Geek to Cult Icon, by Steve Wozniak with Gina Smith. Autobiographical account of the founding and early days of Apple, and the creation of the landmark Apple ][ computer.
1990-1995: Nerds 2.0.1—A Brief History of the Internet (released 1999)
[viewed 10/20/10] If you’re interested in learning more:
1980-1989: GET LAMP (released 2010)
[viewed 11/17/10] No idea what a “text adventure” is? Read this 1-page intro.
“Before there was the first-person shooter, there was the second-person thinker.” Text adventures, or Interactive Fiction (IF), form one of the oldest categories of interactive entertainment. This documentary looks at the rise and subsequent disappearance of a fascinating subgenre whose potential perhaps remains underappreciated. A gaming scholar writing in 2000 said: “Lured by the siren song of ever-improving graphics power, terrified by the risks involved with truly unique ideas in gaming, the [gaming] industry is collectively stumbling along a path well worn by Hollywood.” Watch this documentary and learn about the road not taken.
As a bonus, we’ll also play the original (1975) ADVENT text adventure (compiled from the original FORTRAN source code), play classic adventures from pioneer Infocom (including Zork) on vintage hardware, and discuss an open-source DSL & bytecode interpreter for writing platform-independent text adventures, developed in the days when “port the software” meant “rewrite the software”.
If you’re interested in learning more:
- Zork: A Computerized Fantasy Simulation Game. IEEE COMPUTER, April 1979. Article by the authors of the original Zork (for PDP-10) who then founded Infocom and ported ZIL (Zork Implementation Language) and the Z-Machine interpreter to most of the microcomputers of the day, and sold a number of commercially successful text adventures.
- The Inform Designer’s Manual. Inform is a high-level DSL for creating adventure games that compile to bytecode interpretable by the Z-machine, created post hoc by reverse engineering the Z-machine. The manual is both a language reference and a “howto” for creating interactive fiction. Maybe someone can create an adventure that takes place in and around the RAD Lab…
- Somewhere Nearby is Colossal Cave. Digital Humanities Quarterly, Summer 2007. After the original Crowther source code was found, this article examined the original ADVENT as a work of interactive fiction.
- Twisty Little Passages, by Nick Montfort. A scholarly but accessible survey of IF (interactive fiction) as a genre.
1980-1995: BBS: The Documentary
Imagine a virtual public space where anyone with a (free) account could post or reply to messages, using consumer networking equipment connected to their home PCs, both of which were rapidly falling in price. Some of these sites were free and operated by hobbyists; others tried to go commercial, and the site managers ranged from smart hobbyists to clueless get-rich-quick schemers who figured that if they put some lame pr0n pictures on the site, people might pay a monthly fee for access.
Of course, the sites were Bulletin Board Systems (BBSes), the home PCs were TRS-80s and Apple IIs and IBM clones, and the consumer networking equipment was analog modems at 300, 1200, and 2400 baud, and ultimately 56Kbps. It’s easy to forget that as late as the 90s, local phone service was often metered, long-distance was unaffordable for most people, and having hundreds of active users meant having hundreds of modems. All the mini-dramas of the Internet—n00bs, pr0n, warez, flame wars, stupid handles, free & open source software vs. payware, clueless investor$, foreign sites circumventing local rules—happened here first.
1993-2000: Download: The True History of the Internet
Until 1990, the Internet was for academics and researchers. The invention of the World Wide Web in 1990 was a major turning point, but what brought the Web “to the masses” was the first point-and-click graphical Web browser, NCSA Mosaic, whose developers went on to build the commercial Netscape Navigator. The story of how Netscape took the IT world by storm, and what happened when Microsoft awoke to the threat, is the subject of this fast-paced documentary that reveals as much about the personalities involved—Marc Andreessen, Jim Clark, Bill Gates—as it does about the technologies that formed the IT battleground of the late 90s.
To learn more:
- Where Wizards Stay Up Late: The Origins of the Internet. Katie Hafner & Matthew Lyon. A readable introduction and overview.
1995-2000: Revolution OS (released 2002)
[viewed 12/1/10] Microsoft Windows may have kicked the living daylights out of the Mac, but the war is far from over. Through interviews with Linus Torvalds, Richard Stallman, Eric Raymond and others, we learn about the origins of the open-source movement, the Free Software Foundation, and how Linux became the first serious server OS to challenge Microsoft Windows NT.
1980-1995: The Video Game Revolution
While all those people were earnestly trying to use computers to “improve productivity,” some slackers thought computers might be fun just for playing games. The first video games rode on technology that the commercialization of PCs had brought to the mass market, and as PCs got more powerful, games evolved hand in hand with them. Before long, they had created a multibillion-dollar industry that’s larger than the movie business and has become a driver of PC technology. From Space Invaders and Pac-Man to movies inspired by video games, video games have been both a cultural force and a shaper of the IT industry.
If you’re interested in learning more:
- Supercade —a killer coffee-table book with color screenshots/photos of all the major video games since Space Invaders
- The Ultimate History of Video Games by Steven L. Kent—just what the title says. Lots of good books about this topic but this is the most comprehensive.
2002-2010: The Social Network
What hath the Internet wrought? In 2012, one person in 7 on Earth had an account on Facebook. WTF? How did all that happen? A glimpse into post-boom startup culture, a cameo portraying one of Prof. Fox’s grad school colleagues, and great writing by Aaron Sorkin are just three of the many reasons to watch this dramatization of Zuck’s rise from dropout to billionaire.
[viewing date TBA] A dramatized cinematic account of Takedown, the nonfiction book by Tsutomu Shimomura with John Markoff, on how 90s hacker Kevin Mitnick was caught.
I’m of the “BASIC generation,” and like David Brin I bellyache about the lack of a good first language that would let kids feel the empowerment I felt each time I could say “Look, I made the computer do something cool!”
A colleague recently asked if I had any thoughts on what would be “a good first programming language” for a precocious 9-year-old who was very much into computers and wanted to learn programming, giving me an excuse to agonize about this again.
The executive summary of my current opinion is probably “Scratch if you want training wheels, Python otherwise”, but if this sort of thing interests you, read on.
The question also gave me an excuse to talk to some other people about it seriously, including Colleen Lewis, a Berkeley computer scientist and educator whose opinion on such matters I respect tremendously, and a bunch of smart colleagues from industry who attended the recent RAD Lab retreat.
We concluded that there’s a handful of absolutely fundamental concepts that are (a) common to the vast majority of programming paradigms/languages and (b) not “natural” in the sense that they have no obvious analog in non-programming-based activities and simply have to be internalized:
- variables and assignment
- manipulating collections of elements (arrays exist even in languages that don’t support user-defined data structures)
- conditional evaluation
- control flow
- subprograms (i.e., procedural abstraction)
(I’ve deliberately omitted OO; while really important, it’s not fundamental, in that there are lots of programs you can write without it.)
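To make those concepts concrete, here is a tiny Python sketch (Python being one of the candidate first languages in this discussion) that touches all five; the task and numbers are invented purely for illustration:

```python
# All five "fundamental concepts" in a few lines of Python.
# (Illustrative only -- the function and data are made up.)

def average_above(scores, cutoff):       # subprogram (procedural abstraction)
    total = 0                            # variables and assignment
    count = 0
    for s in scores:                     # control flow over a collection
        if s > cutoff:                   # conditional evaluation
            total = total + s
            count = count + 1
    if count == 0:
        return 0
    return total / count

grades = [72, 95, 88, 60, 91]            # a collection of elements
print(average_above(grades, 80))         # average of 95, 88, and 91
```

Even something this small exercises every item on the list, which is part of why a conventional textual language keeps coming up as a candidate.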
Beyond focusing on absorbing those concepts, a “good first language” should make the young programmer feel empowered at being able to do stuff, gradually stripping away the “mystique” of how computers do the cool things they do.
Hence my concerns about Scratch, a popular GUI-based programming environment designed for teaching that we are starting to use at Berkeley for the intro-level CS class. Colleen reassured me that it’s possible to write non-toy programs in Scratch once you ditch the GUI, that it supports constructs like lambda expressions that let you teach important concepts like closures, and that students who complete the new intro course also understand things like objects and class inheritance and are more than ready to tackle any programming language. Another nice thing about Scratch is that it’s widely used in education, so (I assume) there’s lots of freely-available supplementary material to go with the software. My concern is that the Scratch GUI environment is so amazingly rich and polished that it might diminish the sense of empowerment: even if you write a cool program, there’s still just too much magic between your program and the machine. (Wearing my “productivity programming” hat, I’d say Scratch is almost too productive, because so much is happening beyond the code you wrote.)
The alternative seems to be Python, probably the closest thing to BASIC these days (albeit a much better language, of course). And in fact I found a pretty good book—Hello World: Computer Programming for Kids and Other Beginners—co-authored by a programmer and his young son and written in a kid-friendly yet noncondescending way. The downside is that while Python is a pretty nice language, like all languages it has a few quirky notations, idiosyncrasies, arbitrary-seeming behaviors, etc. While the book goes out of its way to clarify these only as much as needed, their presence might detract from a learning experience…but then, to be fair, the same was true of BASIC way back when, and probably those of us who glamorize learning it have selective amnesia about getting bitten by those idiosyncrasies and learning to work around them.
So maybe my current recommendation is: for a gentle introduction with training wheels and rubber bumpers, Scratch; for something a little more hardcore that you won’t outgrow (Python is used for lots of real programs) but that comes complete with real-life idiosyncrasies, Python with the above book.
The good news is that while neither is “built in” to today’s PCs the way BASIC used to be, both are free, open-source downloads.
Ideas from anyone who’s actually helped their kids learn to program? (I have no kids, only a toucan, and it’s unlikely she’ll learn to program anytime soon.)
From my colleague Matt Welsh (who writes the Volatile & Decentralized blog)…
…now if only it had an actual Ethernet card and was crawling the actual Twitter site!…
Not sure which of my colleagues (maybe Matt?) pointed me at this, or maybe I read it in one of the various pheeds, but the guys at PlayPower have observed that a self-contained Nintendo NES knockoff, packaged in a cheap keyboard and including NES-like game controllers, is sold in India, China, and other countries as the “TV Computer” for around US $10! In fact, since the NES patents expired a few years ago, these are now 100% legal, and many different manufacturers offer them. PlayPower’s goal is to find a way to use these to bootstrap computer-aided education (basic computer literacy, learning games like the old Oregon Trail, etc.) in developing countries. As with the Atari 2600, there is a vibrant retrocomputing developer community around the NES.
PlayPower’s booth at Maker Faire had a couple of these devices on display. The NES, also called the Famicom in Asia, was a 6502-based cartridge game system whose technology was only half a generation or so ahead of the Atari 2600. The device plugs into a TV (it has both composite video out and RF out, and soldering an undocumented pad on the board switches between NTSC and PAL) and comes with a starter cartridge that has various game titles on it, and even seems to have programs that “simulate” word processing and Web browsing (much as “my first cell phone” toys simulate making phone calls). Presumably, the carts have ROMs that take over part of the address space when plugged in; the more advanced Atari carts had some additional bank-switching logic (triggered by memory-mapped I/O) that allowed multiplexing lots of different game levels into a fixed amount of address space, typically around 4K.
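The bank-switching idea is easy to sketch in a few lines of Python. This is a toy model, not an actual cartridge mapper; the window size, register address, and bank count are all invented for illustration. The point is simply that a write to one magic memory-mapped address changes which ROM bank subsequent reads see:

```python
# Toy model of cartridge bank switching via memory-mapped I/O.
# Not a real NES/Atari mapper -- addresses and sizes are illustrative only.

BANK_SIZE = 0x1000          # a 4K window, like the Atari carts described above
MAPPER_REG = 0x0FFF         # writing here selects the visible bank (made up)

class BankedROM:
    def __init__(self, banks):
        self.banks = banks      # list of 4K bytearrays "burned into" ROM
        self.current = 0        # which bank is currently switched in

    def read(self, addr):
        # reads in the window come from whichever bank is switched in
        return self.banks[self.current][addr % BANK_SIZE]

    def write(self, addr, value):
        # ROM ignores writes -- except to the mapper register, which
        # swaps a different bank into the same address window
        if addr == MAPPER_REG:
            self.current = value % len(self.banks)

rom = BankedROM([bytearray([n] * BANK_SIZE) for n in range(4)])
print(rom.read(0x0123))     # comes from bank 0
rom.write(MAPPER_REG, 2)    # memory-mapped write triggers the bank switch
print(rom.read(0x0123))     # same address, now served from bank 2
```

This is how a few kilobytes of addressable window could front many banks of game-level data: the CPU's address space never grows, only the mapping behind it changes.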
Intriguingly, Bob Rost at CMU taught a class on NES game development and even produced “nbasic”, a BASIC-level (sort of) language that, although it eliminates the need to program in 6502 assembly, does not seem to eliminate the need to understand the arcane programming approach these devices require: you have to time your code around the vertical retrace interval (it’s only “safe” to write the video chip’s registers during vertical retrace, to avoid flickering or loss of picture sync), etc.
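That retrace constraint (do your display bookkeeping during the visible frame, flush it only during retrace) can be sketched abstractly in Python. This is purely illustrative, nothing NES-accurate; the class and register names are invented:

```python
# Abstract sketch of retrace-timed programming (illustrative only):
# display writes are queued during the visible frame and applied
# only during vertical retrace, when touching video registers is safe.

class Display:
    def __init__(self):
        self.registers = {}     # stand-in for the video chip's registers
        self.pending = []       # writes queued up during the visible frame

    def queue_write(self, reg, value):
        # game logic may call this at any point in the frame
        self.pending.append((reg, value))

    def vblank(self):
        # called once per frame during retrace: the only safe flush point
        for reg, value in self.pending:
            self.registers[reg] = value
        self.pending.clear()

d = Display()
d.queue_write("sprite0_x", 42)   # mid-frame: buffered, not applied yet
d.vblank()                       # retrace: now the write actually lands
```

On the real hardware there is no buffering class doing this for you; the programmer (or a compiler smarter than nbasic) has to guarantee by cycle counting that register writes fall inside the retrace window.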
So my questions would be: is this level of programming abstraction going to be sufficient to catalyze the development of a lot of “edu-ware” for these devices? Given that the source of this “edu-ware” is ultimately (we hope) going to be programmers in the very countries in which the devices are sold, should we be looking for something that enables a much higher level of abstraction for development—say, something more like Scratch—and port a subset of it to this 8-bit world?
(Update: the Design For Developing Countries wiki has a lot of great information about their progress and preliminary studies.)
For the curious, the NES has:
- a 1.79MHz 8-bit 6502 variant
- 32K ROM address space for game code & data (typically 16K code, 16K swappable to store different “levels” data)
- 256w x 240h viewable graphics (NTSC or PAL), organized as 8×8-pixel “tiles” for the purposes of color palette assignment, sprite rendering, etc.
- system, background and foreground color palettes (64, 16 and 16 entries respectively)
- up to 64 sprites (8×8 or 8×16 pixels, 3 colors + transparent color, no more than 8 sprites per scanline)
- 4 synthesizer channels (2 square waves, 1 triangle, 1 white noise) with ~6 octave range; one note per channel at a time
I have to admit that, as someone who remembers very vividly what remarkable things can be done within the constraints of these old 6502 systems, I find the pervasive availability of $10 TV-computers awfully compelling.