10/19/2004 Archived Entry: "computer architecture for fun and no profit"
I'm finally excavating myself from a mountain of work, and finding the time to write blog entries again. However, what's on my mind this 3 a.m. is neither politically relevant nor aimed at helping you with your computer. It's pure geekish musing, so feel free to skip it if you're not technically inclined.
I'm one of that peculiar breed of people who study computer architecture for fun. Thanks to a good friend who let me cherry-pick his library before he carted it off for recycling, I've lately been reading books such as Architecture and Programming of the B1700/B1800 Series and Assembly Language Programming for the Control Data 6000 Series while sitting in the loo. Seriously.
The latter book led me to revisit the architecture of the famed CDC 6600, the first "supercomputer." The first time I studied this machine, I wasn't in a position to appreciate how pathbreaking it was. Some have called it the first RISC (Reduced Instruction Set Computer) architecture; certainly, with its ten parallel functional units, it pioneered the "superscalar" approach that Intel touted for the Pentium chip three decades later.
While reading about the 6600, I was shocked to learn that its architect, the legendary Seymour Cray, died several years ago from injuries suffered in a car accident. That set me to thinking about other famous names in computer architecture, like Gene Amdahl and Gordon Bell. These were architects who made their names in the '60s and '70s. Why can't I name a single famous computer architect from the last 20 years?
I blame the microprocessor, for several reasons. When the large-scale integrated circuit lowered the cost of a computer's central processing unit from $10,000+ to $10, the focus shifted away from "supercomputing" and toward small-scale computing (personal computing and embedded computing). This also changed the economics of processor design: when you're designing with individual transistors, you must devise elegant, clever architectures that use them efficiently. But when you can put 300 million transistors on a chip, you can apply "brute force" solutions to your design problems. And when processors reach that complexity, they're beyond the reach of individual architects -- they're designed by teams and committees, and much of the actual detail work is done by automated tools.
For aspiring computer architects, this is bad and good, depending on your motivation. Unless you work for a major chip manufacturer, there's no profit in designing a new computer architecture. Why spend a year devising a faster way to compute, when Intel or AMD will bring out a faster chip next year that you can buy for $100? There was a brief surge of interest in new RISC architectures in the '80s, but these days most people want faster versions of what they already own. (Indeed, one of the latest microprocessors I'm using is an enhanced version of a chip designed in 1975.)
But for those who study computer architecture for personal interest, times have never been better. Whereas in the '60s you needed hundreds of thousands of dollars to prototype a new processor, today, for a few thousand dollars, you can have custom chips made to your own design (and the design tools run on an ordinary PC). Or you can use a "field programmable gate array" (FPGA) -- a single reconfigurable chip -- to design and test your processor, and I've seen FPGA evaluation kits priced in the hundreds of dollars. (Better still, many of these chips can be erased and reprogrammed, so you can try several different ideas.)
I've designed and built one processor of my own -- the "hard way," with TTL logic chips and a wirewrap tool -- and I have ideas for a few more. When I get some free time -- this decade, I hope -- I intend to get one of those FPGA evaluation kits and try a few more designs. For fun, not money. Other people assemble jigsaw puzzles; I tinker with computer designs.
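If you're curious what that tinkering looks like before any soldering happens, here's a minimal sketch: a made-up accumulator machine with just five instructions, simulated in a few lines of Python. (The opcodes and the little test program are invented for illustration -- this isn't my TTL design, just the flavor of the exercise.)

    # A made-up accumulator machine: five opcodes, word-addressed memory.
    LOAD, ADD, STORE, JNZ, HALT = range(5)

    def run(program, memory):
        """Fetch-decode-execute loop for the toy machine."""
        acc = 0  # accumulator
        pc = 0   # program counter
        while True:
            op, operand = program[pc]  # fetch and decode
            pc += 1
            if op == LOAD:       # acc <- memory[operand]
                acc = memory[operand]
            elif op == ADD:      # acc <- acc + memory[operand]
                acc += memory[operand]
            elif op == STORE:    # memory[operand] <- acc
                memory[operand] = acc
            elif op == JNZ:      # branch to operand if acc is nonzero
                if acc != 0:
                    pc = operand
            elif op == HALT:
                return memory

    # Test program: add memory[0] and memory[1], leaving the sum in memory[2].
    program = [(LOAD, 0), (ADD, 1), (STORE, 2), (HALT, 0)]
    print(run(program, [3, 4, 0]))  # prints [3, 4, 7]

Once something like that behaves correctly in simulation, the real architecture work begins: deciding how to realize each piece in gates, or in an FPGA's logic blocks.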
A final musing... I was amazed to discover that the architecture of the equally legendary IBM System/360 lives on in the current IBM zSeries. I learned programming on S/360 and S/370 machines, and, again, I didn't fully appreciate them at the time. Arguably this makes the S/360 the most enduring computer architecture of all time.
brad