With all the talk about how prices on this, that, and the other thing are always going up, let’s stop a moment and bow, or at least give a polite nod, toward Silicon Valley, Poughkeepsie, Boston’s Route 128, and Research Triangle Park. In the 40 years, or two generations, between 1970 and 2010, the tech titans of the Desktop PC Era (DPCE) gave us the greatest ongoing upgrade at the biggest continuing discount ever.
The saga of the personal computer is as fantastic a tale as any sci-fi story ever. Truth isn’t just stranger than fiction—it’s often got more magic and miracles in it, too. And that magic, those miracles, continue apace: In 2010, we began the Tablet Computing Era (TCE), as tablets are now the “new PC” and appear unstoppable in their quest for world domination. Wait. That’s Google. Anyway…
Why You Should Care
The longer you’ve been using (and let’s be proper for once) microcomputers—and some of us had the original Apple, Tandy (Radio Shack) and Timex Sinclair models in the late 1970s and early 1980s—the more you can appreciate the astonishing speed of progress. This is a tale that everyone working with computers really should know, and it uses terms that everyone really should understand. If you don’t know a kilo-this from a mega-that, you will never get the full impact of this amazing tale. So read on—you’ll be glad you did.
You can consult the Glossary at the end whenever you see a new term, but there will be no needless “techie talk” and you might very well understand much of it in context. Some of you, of course, are true experts, so don’t hesitate to make corrections in the comments or an email. Now to the story of just how much technological progress has been made in two generations of “personal computing.” It really is an awe-inspiring tale.
Personal Computers Became “PCs” in 1981
For computer users, the 1970s started with time-sharing and ended with a number of companies—Apple, Commodore, Atari, and others—making totally incompatible systems whose major advantage was that they were not kits. For better or worse, “PC consciousness” dates from IBM’s introduction of its first consumer-level personal computer in August of 1981.
Running on an Intel 8088 CPU with a clock speed of 4.77MHz, the IBM-PC came with either 16 or 64KB of RAM, expandable to a whopping 256KB. It connected to a TV or a monitor, and gave you storage options that included one or two 5¼-inch floppy drives, an optional 10MB external hard drive, or your own cassette recorder. The software bundle? It came with an operating system. Nothing else.
With a monitor and a single floppy drive (giving you 180KB storage per single-sided disk) the IBM-PC cost $3005 in 1981 dollars. Depending on how you figure it—Consumer Price Index (CPI) is one common method—in 2010 it would have taken about $2.50 to buy what a dollar bought in 1981. Quick calculation: That IBM-PC would have cost $7,512.50 in 2010 dollars. Now let’s see what type of desktop computer those 2010 dollars bought in their own day.
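The arithmetic is quick to check; here is a minimal sketch, assuming the rounded ~2.5x CPI multiplier cited above:

```python
# Rough CPI-style inflation adjustment: 1981 dollars to 2010 dollars.
# The 2.5 multiplier is the rounded ratio cited in the text.
CPI_MULTIPLIER = 2.5

price_1981 = 3005  # IBM-PC with monitor and one floppy drive
price_2010 = price_1981 * CPI_MULTIPLIER

print(f"${price_2010:,.2f}")  # → $7,512.50
```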
Standard Computers of 2010
Entry-level computers in 2010 were thousands of times faster and more productive than 1981’s IBM-PC, released a bit more than half a generation into the DPCE. The H-P xw8400 was a high-end model when it debuted in 2006, and still a strong performer in 2010, at which time it was a mainstay in the inventory of America’s big computer rental firms. It has dual 2.66GHz quad-core Xeon processors—what Apple’s aging Mac Pro line still uses in 2013—meaning eight separate processor cores.
Now the comparison: A single core runs almost 600 times faster than the IBM CPU, so with eight of them we’re talking almost 5,000 times as fast in a rough clock-speed comparison. Its 160GB hard drive holds close to a million (932,000) times as much data as that single floppy. At this writing in mid-May 2013, 4TB desktop hard drives are selling for just over $100—a cost per MB of roughly 1/400th of a cent, versus the floppy’s $30 per MB. That’s about 1,200,000 times less expensive.
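Those ratios are easy to reproduce. A back-of-the-envelope sketch (exact figures shift slightly with rounding):

```python
# Back-of-the-envelope ratios from the comparison above.
KB, MB, GB, TB = 1024, 1024**2, 1024**3, 1024**4

# Clock speed: one 2.66GHz Xeon core vs. the 4.77MHz Intel 8088.
per_core = 2.66e9 / 4.77e6   # ≈ 558x per core
all_cores = per_core * 8     # ≈ 4,460x across eight cores

# Storage: a 160GB drive vs. a 180KB single-sided floppy.
storage_ratio = (160 * GB) / (180 * KB)   # ≈ 932,000x

# Cost per MB: $100 for 4TB vs. roughly $30 per MB on floppy disks.
cents_per_mb = (100 * 100) / (4 * TB / MB)  # ≈ 0.0024 cents, about 1/400th of a cent
cost_ratio = (30 * 100) / cents_per_mb      # ≈ 1.26 million times cheaper

print(round(per_core), round(storage_ratio))  # → 558 932068
```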
From CRT to LCD
For monitors, the comparison is between 2010’s 16 million crisp, clear colors, precisely displayed by about 2.3 million pixels at about 9,700 pixels per square inch—and, at Day One in 1970, a black-and-white or primitive color TV with 480 wiggly lines for the entire screen. In the literary landmark year of 1984, color monitor systems (yes, multiple pieces) cost thousands of dollars, and were PC-only. Macsters had to wait until 1987 for the Macintosh II so they could spend even more on Apple-branded color cards and monitors.
Anyway, over halfway through the first generation of personal computing you still had to spend up to a grand on the “color card” for your PC, install it, then buy a 12- or 13-inch RGB CRT for, say, $3,000. (Dang, that’s $500 per letter, a truly expensive acronym.) In 2010, a good 20-inch monitor cost more than it does now, but certainly under $150. If you want a “fairer” comparison, a 13-inch LCD with full HD resolution would have been $79 at Fry’s Electronics, the Silicon Deli.
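For the curious, those pixel figures can be reproduced from a plausible 2010 panel. The 1920x1200 resolution and 23-inch diagonal below are assumptions chosen to match the numbers above, not specs from the article:

```python
import math

# Hypothetical 2010 monitor: 1920x1200 panel on a 23-inch diagonal.
# (Assumed values; they reproduce the figures quoted in the text.)
w_px, h_px = 1920, 1200
diagonal_in = 23

total_pixels = w_px * h_px                   # 2,304,000 ≈ "2.3 million pixels"
ppi = math.hypot(w_px, h_px) / diagonal_in   # ≈ 98 pixels per linear inch
pixels_per_sq_in = ppi ** 2                  # ≈ 9,700 pixels per square inch

print(total_pixels, round(pixels_per_sq_in))
```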
A New Generation
Today, for less money than in 2010, any number of affordable desktop setups can leave the original DPCE’ers in the dust, storing a few million times as much, crunching numbers thousands of times faster, and displaying video in huge, beautiful, high-definition color. For a hundred bucks you can buy a decent pocket-sized WiFi tablet incalculably more powerful than the room-sized, air-conditioned behemoth that helped send Apollo 11 to the moon—and you don’t have to be a programmer or astronaut to use it, either.
Feedback/Comments follows the Glossary.
. . . . . . . . . . . . . . . . . . . .
bit: abbreviated lower case “b”; the smallest unit of data, used both where data is processed (RAM) and where it is stored (“media” such as tape, floppies, hard drives, SecureDigital [SD] and other flash memory, etc.); 8 bits = 1 Byte
Byte: abbreviated upper case “B”; 8 bits = 1 Byte; 1024 Bytes is a kilobyte (kB, see below)
clock speed: CPU speed as measured in hertz (Hz), or cycles per second
CPU: Central Processing Unit, a computer’s “brains,” the fancy calculator
DPCE: Desktop PC Era, a name and acronym for the years circa 1970-2010; just made this one up, how do you like it? 2010 began the TCE (Tablet Computing Era)
GB: Gigabyte, 1024MB, or 1024 x 1024kB (1,073,741,824 Bytes); often considered “a billion” Bytes
k or K: lower/upper case “k/K” means “kilo”; often treated as a thousand, though in computing it is more precisely 1024
kB, K, or KB: kilobyte, or 1024 Bytes; often considered “a thousand” Bytes
MB: Megabyte, 1024kB, or 1024 x 1024 Bytes (1,048,576 Bytes); often considered “a million” Bytes
medium/media: a substance used for electronic storage of audio, video or data, from wire in early wire audio recorders to such magnetic media as recording tape; computer media progressed from soft-sided to hard-sided floppy disks, then to hard drives with multiple platters, Compact Disc (CD), DVD and, now, Blu-ray
memory: a term for both RAM and storage media, measured in Bytes
pixel(s): term created from “picture element” to describe the basic unit of programmable color in a computer image or display
RAM: Random Access Memory, the “head” or space where the CPU “brain” does its calculations
TB: Terabyte, 1024GB, or 1024 x 1024MB (1,099,511,627,776 Bytes); often considered “a trillion” Bytes
TCE: Tablet Computing Era, 2010-?
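The binary unit ladder in the entries above can be checked in a few lines; a small sketch:

```python
# Binary (1024-based) units from the glossary entries.
KB = 1024        # kilobyte
MB = KB * 1024   # megabyte = 1,048,576 Bytes
GB = MB * 1024   # gigabyte = 1,073,741,824 Bytes
TB = GB * 1024   # terabyte = 1,099,511,627,776 Bytes

print(MB, GB, TB)  # → 1048576 1073741824 1099511627776
```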