Tuesday, March 7, 2017

The CRT is dead, long live the CRT

I am a child of the 80s (and a little bit the 70s), and as a youngster I spent many, many quarters in arcade video games. (Tempest was one of the favorites I was actually good at.) It might be hard for today’s young adults to imagine the appeal of paying per game to play a game that lasted only a few minutes, had to be played standing up (usually), and was located in a pizzeria, bar, movie theater, or video arcade. But the first highly successful home gaming console (the Atari 2600, which sold roughly 30 million units during its 14-year lifetime) didn’t arrive until 1977, and while arcade games started rapidly improving after the release of Taito’s Space Invaders (1978), home games’ graphics and sound lagged far behind arcade hardware well into the late 1980s, even though Atari and others aggressively licensed the rights to produce home versions of popular arcade games. A typical arcade cabinet game might retail for $4,000, vs. around $200 for a home console. (Not to mention that going to the arcade was a social event.)

Today arcade cabinet games have an ardent following among retrocomputists (e.g. me), collectors, and nostalgists. But perhaps not for long: outside this niche market there is virtually no demand for new CRT displays, and CRTs are surprisingly labor-intensive to manufacture, as this 5-minute video shows. In particular, few 29-inch “arcade grade” CRTs remain in the world, and the capacity to make or repair them is essentially gone.

Without arguing whether new display technologies (plasma, LCD, LED) are better or worse than analog CRTs, it is certainly true that authors of older games had to work against (or, more creatively, work with) the color-mixing and display constraints of analog CRTs, which are quite different from those of true discrete-pixel displays. This was especially true when designing games for consoles meant to connect to TV sets, which faced the additional constraint that the video signal fed to the TV had to follow the somewhat quirky NTSC standard for analog color video. Famously, the Apple II video circuitry exploits idiosyncrasies of NTSC to produce high-resolution (at the time) graphics at a low (at the time) cost, at the expense of being very tricky to program.

The fascinating book Racing the Beam recounts how both the console designers and the game designers for the Atari 2600 leveraged the physical and electrical properties of NTSC color to create appealing games on exceedingly low-cost (for its time) hardware, even creating a custom chip to deal with some of the quirks of NTSC (the TIA, or Television Interface Adapter; the console project as a whole was code-named “Stella”). And indeed, while Atari 2600 emulators are still popular and original 2600 hardware can be connected to modern LCD and plasma screens, the color effect is subjectively different from viewing it on an old-school analog set. In contrast (get it?), although arcade video games also used large (29”) CRT displays, they weren’t bound by the signal limitations of NTSC, so they could produce graphics far superior to what home gamers could see even on comparably sized TV sets.
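To make the NTSC trick concrete, here is a minimal Python sketch, under stated assumptions: a 1-bit pixel stream clocked at four times the color subcarrier (roughly the Apple II double hi-res pixel rate), and a decoder that does textbook quadrature demodulation; the constant and function names are mine for illustration, not anything from real hardware or a real emulator. An NTSC receiver treats signal energy near 3.58 MHz as chroma, so the phase of a repeating pixel pattern is read back as a hue.

    import math

    # Illustrative assumption: pixel clock = 4 samples per NTSC color
    # subcarrier cycle (14.318 MHz pixels against a 3.5795 MHz subcarrier).
    SAMPLES_PER_CYCLE = 4

    def decode_hue(bits):
        """Quadrature-demodulate a repeating pixel pattern, the way an
        NTSC receiver recovers chroma. Returns (saturation, hue_degrees);
        hue is only meaningful when saturation is nonzero."""
        i = q = 0.0
        for n, bit in enumerate(bits):
            phase = 2 * math.pi * n / SAMPLES_PER_CYCLE
            i += bit * math.cos(phase)
            q += bit * math.sin(phase)
        sat = math.hypot(i, q) / len(bits)
        hue = math.degrees(math.atan2(q, i)) % 360
        return sat, hue

    # Four patterns with the SAME average brightness (two of four pixels
    # lit) decode to four different hues, 90 degrees apart:
    for pattern in [(1, 1, 0, 0), (0, 1, 1, 0), (0, 0, 1, 1), (1, 0, 0, 1)]:
        sat, hue = decode_hue(pattern)
        print(pattern, f"saturation={sat:.2f} hue={hue:5.1f} deg")

    # A solid run of pixels has no energy at the subcarrier frequency,
    # so it decodes as colorless (pure luminance):
    print((1, 1, 1, 1), decode_hue((1, 1, 1, 1)))

Running it shows equal-brightness bit patterns landing at different points around the hue circle while a solid run stays colorless; that phase-to-hue mapping is the “artifact color” effect the Apple II exploits, and it is also why the same pixels look different once the display path stops being an analog NTSC decoder.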

June 12, 2009, was the deadline for all full-power US broadcast television stations to switch from analog (NTSC-encoded) broadcasting to digital broadcasting. On that day, NTSC effectively became a dead standard. Now the hardware so ubiquitously associated with it, the CRT, is on a path to meet the same fate. Before it’s gone, get yourself to a “classic games” arcade and take a step back to when the best gaming graphics and sound were found in pizzerias, bars, and candy stores.

