
That Was Then, This Is Now

With all the talk about how prices on this, that, and the other thing are always going up, let’s stop a moment and bow, or at least give a polite nod, toward Silicon Valley, Poughkeepsie, and Boston’s Route 128. A rather more reverent appreciation is due entrepreneurialism, capitalism, and the pursuit of happiness. Over the past 40 years, the tech titans of the FPCE (Founding PC Era) have given us the greatest ongoing upgrade at the biggest continuing discount ever. The saga of the personal computer is as fantastic a tale as any sci-fi story ever.

Progress through self-seeking

In fact, truth isn’t just stranger than fiction—it’s often got more magic and miracles in it, too. And, I hasten to add, the progress whose techie little handiwork you enjoy daily is brought to you by a whole parade of people, groups, companies, and cabals all pursuing their own ends, competing more often than cooperating, looking to make a buck, and generally proving Adam Smith right.

PC - Going Back in Time

The longer you’ve been using computers—and some of us had the original Apple, Tandy (Radio Shack), and Timex Sinclair models in the 1970s—the more you can appreciate the astonishing speed of progress. This is a tale that everyone working with computers really should know, and uses terms that everyone really should understand. If you don’t understand a kilo-this from a mega-that, you will never get the full impact of this amazing tale. So read on—you’ll be glad you did.

You can visit PC.net or one of the other great online tech glossaries when you see a new term, but I’ve written in such a way that you should understand much of it in context. Some of you, of course, are true experts, so if I’ve erred in any way, by commission or omission, let me know. I’m going to demonstrate just how much technological progress has been made in “personal computing.” It really is an awe-inspiring tale.

Basic computers in 1981

IBM introduced its first consumer-level personal computer in August of 1981, running on an Intel 8088 CPU with a clock speed of 4.77MHz, or 4.77 million cycles per second. It came with either 16 or 64kB of RAM, expandable to a whopping 256kB. It connected to a TV or a monitor, and gave you storage options that included one or two 5¼-inch floppy drives, an optional 10MB external hard drive, or your own cassette recorder. The software bundle? It came with an operating system. Nothing else.

With a monitor and a single floppy drive (giving you 180kB storage per single-sided disk) it cost $3,005 in 1981 dollars. Depending on how you figure it—Consumer Price Index (CPI) is one common method—today it would take about $2.57 to buy what a dollar bought in 1981. Translation: That IBM-PC computer would cost $7,722.85 in today’s dollars. Now let’s see what type of desktop computer you can get today.
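If you want to check the arithmetic yourself, here’s a minimal Python sketch of that back-of-the-envelope conversion (the 2.57 multiplier is the rough CPI figure used above, not an official table lookup):

```python
# Rough CPI-style adjustment of the 1981 IBM-PC price.
price_1981 = 3005.00       # IBM-PC with monitor and one floppy drive
cpi_multiplier = 2.57      # approx. dollars today per 1981 dollar (article's figure)
price_today = price_1981 * cpi_multiplier
print(f"${price_today:,.2f}")   # -> $7,722.85
```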

High-end computers of today

Entry-level computers today are thousands of times faster and more productive than the IBM-PC. The H-P xw8400 was a high-end model in 2006, but it’s still a decent workhorse today and, arguably, better than many newer models as an entry-level workstation. It features dual 2.66GHz quad-core Xeon processors—eight CPU cores in all. A single core runs almost 600 times faster than the IBM-PC’s CPU, so with a rough clock-speed comparison we’re talking almost 5,000 times as fast.
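For the curious, here’s the clock-speed arithmetic behind “almost 600” and “almost 5,000” (clock speed being only a crude proxy for real-world performance):

```python
# Rough clock-speed comparison, per the paragraph above.
ibm_pc_hz = 4.77e6        # Intel 8088, 4.77 MHz
xeon_hz   = 2.66e9        # one 2.66 GHz Xeon core
cores     = 8             # dual quad-core Xeons
per_core  = xeon_hz / ibm_pc_hz
print(round(per_core))            # ~558  -> "almost 600 times faster"
print(round(per_core * cores))    # ~4461 -> "almost 5,000 times as fast"
```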

The xw8400’s 160GB hard drive, one-sixth the size of most desktop internal drives these days, holds close to a million (932,000) times as much data as that single floppy. There are now hard drives 2TB in size selling for $80—that’s 250MB for a penny, versus the floppy-era price of $30 per MB, or $7,500 for those same 250MB. That’s 750,000 times less expensive.
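The storage figures check out the same way; a small sketch (exact binary units land a bit above the rounder numbers quoted above, but in the same ballpark):

```python
# Capacity and cost-per-MB comparisons, using binary (powers-of-1024) units.
floppy_bytes = 180 * 1024                 # 180kB single-sided floppy
hdd_bytes    = 160 * 1024**3              # xw8400's 160GB drive
print(round(hdd_bytes / floppy_bytes))    # ~932,068 -> "close to a million"

two_tb_mb   = 2 * 1024**2                 # 2TB expressed in MB
mb_per_cent = two_tb_mb / 8000            # $80 drive = 8,000 cents
print(round(mb_per_cent))                 # ~262 -> "250MB for a penny"

old_per_mb = 30.0                         # $30 per MB in the floppy's day
new_per_mb = 80.0 / two_tb_mb
print(round(old_per_mb / new_per_mb))     # ~786,432; the article's rounder
                                          # figures give 750,000
```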

For the monitor, the comparison is between today’s 16 million crisp, clear colors, precisely displayed by about 2.3 million pixels at roughly 9,700 pixels per square inch—and a black-and-white TV with 480 wiggly lines for the entire screen. Today a 20-to-24-inch flat-panel display, bargain-basement variety (which is darn good), would set you back as little as $100.
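A short sketch of the pixel arithmetic, assuming a 23-inch panel at 1920×1200—roughly the display those figures imply:

```python
import math

# Pixel count and density for an assumed 23-inch, 1920x1200 panel.
w_px, h_px, diag_in = 1920, 1200, 23.0
total_px = w_px * h_px                               # ~2.3 million pixels
aspect   = w_px / h_px
width_in = diag_in / math.sqrt(1 + 1 / aspect**2)    # panel width in inches
ppi      = w_px / width_in                           # pixels per linear inch
print(f"{total_px:,} px, ~{ppi**2:,.0f} px per square inch")  # ~9,700
```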

In 1987, when you could get a color-capable Macintosh II for $3,898 without a hard drive, or $5,498 with an internal 40MB hard drive, you still had to buy a video card and a monitor. That would come to an additional five grand or so. Color system with 40MB hard drive: over $10,000. Today?

How about we just say, “Infinitely more for infinitely less” and leave it at that?

Bottom line

Today, you can store a million times as much, crunch numbers thousands of times faster, and watch videos in beautiful, high-definition color. For a few hundred bucks you can buy a pocket-sized tablet incalculably more powerful than the room-sized, air-conditioned behemoth that helped send Apollo 11 to the moon—and you don’t have to be a programmer to use it, either.


End of the Desktop PC Era (DPCE)

With all the talk about how prices on this, that, and the other thing are always going up, let’s stop a moment and bow, or at least give a polite nod, toward Silicon Valley, Poughkeepsie, Boston’s Route 128, and Research Triangle Park. In the 40 years, or two generations, between 1970 and 2010, the tech titans of the Desktop PC Era (DPCE) gave us the greatest ongoing upgrade at the biggest continuing discount ever.

The saga of the personal computer is as fantastic a tale as any sci-fi story ever. Truth isn’t just stranger than fiction—it’s often got more magic and miracles in it, too. And that magic, those miracles, continue apace: In 2010, we began the Tablet Computing Era (TCE), as tablets are now the “new PC” and appear unstoppable in their quest for world domination. Wait. That’s Google. Anyway…

Why You Should Care

The longer you’ve been using (and let’s be proper for once) microcomputers—and some of us had the original Apple, Tandy (Radio Shack) and Timex Sinclair models in the 1970s—the more you can appreciate the astonishing speed of progress. This is a tale that everyone working with computers really should know, and uses terms that everyone really should understand. If you don’t understand a kilo-this from a mega-that, you will never get the full impact of this amazing tale. So read on—you’ll be glad you did.

You can consult the Glossary at the end whenever you see a new term, but there will be no needless “techie talk” and you might very well understand much of it in context. Some of you, of course, are true experts, so don’t hesitate to make corrections in the comments or an email. Now to the story of just how much technological progress has been made in two generations of “personal computing.” It really is an awe-inspiring tale.

Personal Computers Became “PCs” in 1981

For computer users, the 1970s started with time-sharing and ended with a number of companies—Apple, Commodore, Atari, and others—making totally incompatible systems whose major advantage was that they were not kits. For better or worse, “PC consciousness” dates from IBM’s introduction of its first consumer-level personal computer in August of 1981.

Running on an Intel 8088 CPU with a clock speed of 4.77MHz, the IBM-PC came with either 16 or 64KB of RAM, expandable to a whopping 256KB. It connected to a TV or a monitor, and gave you storage options that included one or two 5¼-inch floppy drives, an optional 10MB external hard drive, or your own cassette recorder. The software bundle? It came with an operating system. Nothing else.

With a monitor and a single floppy drive (giving you 180KB storage per single-sided disk) the IBM-PC cost $3,005 in 1981 dollars. Depending on how you figure it—Consumer Price Index (CPI) is one common method—in 2010 it would have taken about $2.50 to buy what a dollar bought in 1981. Quick calculation: That IBM-PC computer would have cost $7,512.50 in 2010 dollars. Now let’s see what type of desktop computer those 2010 dollars bought in their own day.

Standard Computers of 2010

Entry-level computers in 2010 were thousands of times faster and more productive than 1981’s IBM-PC, released a bit more than half a generation into the DPCE. The H-P xw8400 was a high-end model when it debuted in 2006, and still a strong performer in 2010, at which time it was a mainstay in the inventory of America’s big computer rental firms. It has dual 2.66GHz quad-core Xeon processors—what Apple’s aging Mac Pro line still uses in 2013—eight CPU cores in all.

Now the comparison: A single core runs almost 600 times faster than the IBM-PC’s CPU, so we’re talking almost 5,000 times as fast with a rough clock-speed comparison. Its 160GB hard drive holds close to a million (932,000) times as much data as that single floppy. At this writing in mid-May 2013, there are desktop hard drives 4TB in size selling for just over $100, a cost per MB of about 1/400th of a cent, versus the floppy’s $30 per MB. That’s 1,200,000 times less expensive.
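Checked the same way as before, and taking that 4TB drive at an even $100, the mid-2013 numbers come out like this:

```python
# Mid-2013 update of the cost-per-MB arithmetic ($100 is a round-number
# assumption; "just over $100" would nudge these figures only slightly).
four_tb_mb   = 4 * 1024**2           # 4TB expressed in MB
cents_per_mb = 10000 / four_tb_mb    # $100 drive = 10,000 cents
print(f"{cents_per_mb:.5f}")         # ~0.00238 cents/MB, roughly 1/400th of a cent
print(round(3000 / cents_per_mb))    # vs. the floppy's 3,000 cents/MB: ~1,258,291
                                     # (the article rounds to 1,200,000)
```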

From CRT to LCD

For monitors, the comparison is between 2010’s 16 million crisp, clear colors, precisely displayed by about 2.3 million pixels, with about 9,700 pixels per square inch—and, at Day One in 1970, a black-and-white or primitive color TV with 480 wiggly lines for the entire screen. In the literary landmark year of 1984, color monitor systems (yes, multiple pieces) were thousands of dollars, and PC-only. Macsters had to wait a few more years for the Macintosh II so they could spend even more on Apple-branded color cards and monitors.

Anyway, over halfway through the first generation of personal computing you still had to spend up to a grand on the “color card” for your PC, install it, then buy a 12- or 13-inch RGB CRT for, say, $3,000. (Dang, that’s $500 per letter, a truly expensive acronym.) In 2010, a good 20-inch monitor cost more than it does now, but certainly under $150. If you want a “fairer” comparison, a 13-inch LCD with full HD resolution would have been $79 at Fry’s Electronics, the Silicon Deli.

A New Generation

Today, for less money than in 2010, you and any number of affordable desktop setups can leave the original DPCE’ers in the dust, storing a few million times as much, crunching numbers thousands of times faster, and watching videos in huge, beautiful, high-definition color. For a hundred bucks you can buy a decent pocket-sized WiFi tablet incalculably more powerful than the room-sized, air-conditioned behemoth that helped send Apollo 11 to the moon—and you don’t have to be a programmer or astronaut to use it, either.


. . . . . . . . . . . . . . . . . . . .

GLOSSARY

bit: abbreviated lower case “b”; the smallest unit of measure for data, used both where data is processed (RAM) and where it is stored (“media” such as tape, floppies, hard drives, SecureDigital [SD] and other flash memory, etc.); 8 bits = 1 Byte

Byte: abbreviated upper case “B”; 8 bits = 1 Byte; 1024 Bytes make a kilobyte (kB, see below)

clock speed: CPU speed as measured in hertz (Hz), or cycles per second

CPU: Central Processing Unit, a computer’s “brains,” the fancy calculator

DPCE: Desktop PC Era, a name and acronym for the years circa 1970-2010; just made this one up, how do you like it? 2010 began the TCE (Tablet Computing Era)

GB: Gigabyte, 1024MB, or 1024 x 1024kB (1,073,741,824 Bytes); often considered “a billion” Bytes

k or K: lower/upper case “k/K” means “kilo”; often considered a thousand (more precisely, it’s 1024)

kB, K, or KB: kilobyte, or 1024 Bytes; often considered “a thousand” Bytes

MB: Megabyte, 1024kB, or 1024 x 1024 Bytes (1,048,576 Bytes); often considered “a million” Bytes

medium/media: a substance used for electronic storage of audio, video or data, from wire in early wire audio recorders to such magnetic media as recording tape; computer media progressed from soft-sided to hard-sided floppy disks, then to hard drives with multiple platters, Compact Disc (CD), DVD and, now, Blu-ray

memory: a term for both RAM and storage media, measured in Bytes

pixel(s): term created from “picture element” to describe the basic unit of programmable color in a computer image or display

RAM: Random Access Memory, the “head” or space where the CPU “brain” does its calculations

TB: Terabyte, 1024GB, or 1024 x 1024MB (1,099,511,627,776 Bytes); often considered “a trillion” Bytes (see the quick sketch after this glossary)

TCE: Tablet Computing Era, 2010-?
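As a quick sanity check on the byte multiples above, here is a minimal Python sketch spelling them out as powers of 1024:

```python
# The glossary's byte multiples, spelled out as powers of 1024.
KB = 1024
MB = 1024 * KB          # 1,048,576 Bytes
GB = 1024 * MB          # 1,073,741,824 Bytes
TB = 1024 * GB          # 1,099,511,627,776 Bytes
for name, value in (("kB", KB), ("MB", MB), ("GB", GB), ("TB", TB)):
    print(f"1 {name} = {value:,} Bytes")
```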

LifeTech: Cultural Issues in Tech Adoption

iPad with flags of the world

The Apple iPad is not just translated into other languages, but other cultures, as well.

Tech users worldwide bring their cultural biases to the use of new technologies. Even product safety standards are largely based on a nation or region’s reigning sociocultural values. For global companies—what firm isn’t, what with the Internet, FedEx, and consumer electronics with components from different continents?—understanding these cultural issues is essential. Working toward that understanding along with a growing number of consumer-tech firms are researchers, anthropologists, and (yes) philosophers.

Formed from culture and tradition, a people’s collective mental model defines everything, including color. In China, black borders mean a pictured person is deceased, so the first digital photo frames with thick black bezels—as on the original plasma displays—did not do well there. (White has other issues.) Everything from design and production through marketing and sales must pass the culture test, lest a product fail because people don’t “get it”—or worse, because something is silly or offensive. The classic, possibly apocryphal example? Citroën sold few cars in Holland because its name in Dutch means “lemon.”

IBM, True Blue Trailblazer

Making sure to consider all “society-based cultural factors…in the design of technology” is difficult, according to Geert Hofstede. Hofstede first studied, then strategized the international spread of IBM’s business in the 1960s and 1970s, back when “computer rental” meant paying by the hour to use what was essentially a refrigerator-sized tape deck (with no spell-check). At least IBM’s management team was smart enough to put even smarter academic researchers on the job. Hofstede developed the landmark four-dimensional framework for adapting technology to particular cultures (later expanded to six dimensions with long-term orientation and indulgence).

Writing in 1980, Hofstede posited four pairs of opposing values at work across human cultures:

  • Weak vs. strong uncertainty avoidance. Some cultures, such as Greece and Japan, place great importance on avoiding ambiguity, especially in interpersonal relations. This explains the Japanese preference for video-calls, which they make in the billions on every phone, PC, and tablet available. Video-calls require being seen, but also positively ID the caller. In Scandinavia and Hong Kong, on the other hand, more ambiguity is tolerated and video-calls are less numerous.
  • Individualism vs. collectivism. The UK and U.S. cultures idealize self-sufficiency and independence, whereas Venezuela and Colombia are proudly collectivist. While people use laptops in the U.S. for a variety of personal and/or corporate reasons, a marketing campaign in Colombia would focus on group collaboration. Traits such as confidence and creativity develop in individualist cultures, while cooperation and conformity are strongly encouraged in collectivist ones.
  • Small vs. large power distance. A large “power distance” exists in cultures like India and the Philippines, where the privileged classes use all the latest tech while the powerless remain “unplugged” on the bottom rung of the socioeconomic ladder. Austria, Sweden, and other Western nations—where high-tech devices are commodities that even “the poor” can afford—have “small” power distances.
  • Masculinity vs. femininity. Cultures that are task-oriented, and emphasize material success, are called “masculine.” Ones that are people-oriented, and value quality of life? They’re “feminine.” Such previous markers as the American female’s mythical affinity for frilly pink things are in flux, however: Apple’s MacBook line now includes “girly” light-as-Air models that guys seem to like just fine.

It’s James T. Kirk’s World, Not Luke Skywalker’s

If you’re keeping track, the world is turning out a lot more like Gene Roddenberry visualized it in Star Trek than how George Lucas did in Star Wars. (And isn’t trekking better than warring anyway?) While we’re still waiting on Death Stars, light sabres, and ’droids with English accents, the technology of the United Federation of Planets has been showing up for some time now—their communicator became our clamshell cellphone, their data cards our flash memory, their tricorder our iPad.

The latest time-warped delivery from the crew of the U.S.S. Enterprise? Mobile, universal speech translators. Yow!

Developers have worked on universal translation for years, and a slew of apps are already running on common digital appliances—Apple products from iMac to iPhone, plus Android smartphones, tablets, etc. The goal: instant spoken translations for natural, seamless conversation, with optional onscreen text display, too.

Some voice-translation apps may offer versions for every flavor of desktop operating system (OS), too—Windows, Linux, OS X—so as to ensure desktop functionality. But Skype and FaceTime videophone calling with built-in, real-time universal translation remains the Holy Grail, so there are major areas of development for both “static” environments like the office (relatively settled and slower-changing) and “dynamic” ones like the mobile market (experimental and faster-changing). You’re going to see better and better apps, for all kinds of devices, starting… well, yesterday.

Jibbigo

Life moves so fast now that it does seem like “just yesterday” that the first version of Jibbigo, Spanish-English, debuted in September 2009. Many translation apps require a constant Internet connection to access online databases, but Jibbigo is an offline app that needs no phone or data connectivity to function. It now offers 20+ language pairs, available from both Google Play and Apple’s App Store.

Developed by Carnegie Mellon professor Dr. Alex Waibel and Mobile Technologies, LLC, Jibbigo was among the first “mobile language translation apps” and is quite simple to use. Say a phrase in your language and the words appear as text in both languages on the screens of phones or tablets—while being spoken aloud in the target tongue. Lag is apparent, and varies, but is acceptable.

Multitasking language app

Jibbigo has other features that keep it in the leading-edge position, such as free and unlimited online use, an “add name” function, so-called “Regional Bundles” for travel to various nations with neighbors you’d also like to visit, and the ability to translate both written text and speech. The iOS version plays nice with VoiceOver, so vision-impaired users can still use small devices with small screens.

Featured in an episode of Popular Science on the Science Channel in 2010, and a Nova episode dubbed “The Smartest Machine on Earth” that aired in 2011, Jibbigo has attracted plenty of notice. So has another firm with its own first-rate universal translator, a company whose name we hear quite a bit…

Google Translate

Google actually does want to dominate the planet—really, you can read about it on the web!—so it upgraded its Translate app, which began life as a standard, online-only translator. As the number of superior offline apps grew, however, Google got the message: The current version of Google Translate has more than 60 offline language packs.

While it is true that you can access most of Google’s services via any browser, whether it’s running on a MacBook Pro with that dazzling display named after your eyeball or a Google Nexus 7, translation apps are much more useful on smaller digital devices. This is why the “app model” succeeds.

Domination through, er, popularity?

Google Translate’s menu shows you every available language pack. You only download the language pair(s) you want to translate between. Although the company calls them “less comprehensive than their online equivalents,” the smaller dictionaries are still useful and will doubtless be continuously refined. Google Translate also deciphers camera input, including vertical text in Korean, Japanese, and Chinese.

The app has a version for iOS, too, but as yet it has no offline mode and the development roadmap is not being shared with the public. But this—like, oh, everything else—will likely change as we continue to replicate the Star Trek tech environment. Speaking of replication, 3D printers are already producing food, so the day is coming when you just might tell your “kitchen app” that you’re ready for a cup of “Earl Grey, hot.”