Tuesday, July 21, 2009

Houston, give us a reading on the 1202 program alarm

Like many of you who grew up in the 1960s, I have been spending a lot of time online looking at the various commemorative links to the Apollo 11 moon landing, which happened 40 years ago this week. I found it fascinating, not just because the event was such a key moment in my teenaged nerd life, but also because it shows how we managed to triumph with technology that wouldn't even be found inside your average watch today, let alone a cell phone or computer. Rather than pepper this email with a lot of links and run afoul of the sp*m gods, please go to strominator.com, where you can click on whatever you want to follow up on more conveniently.

The Apollo spacecraft had three display units onboard, driven by two computers: one in the command module and one in the lunar module. Each computer weighed about 70 pounds, ran at roughly 1 MHz, and had about 72 KB of memory, most of it read-only.

To get an idea of how primitive the guidance computer was: there was no typewriter-style keyboard or display screen, just a box with a mostly numeric keypad into which you keyed commands as "verbs" (the action to take) and "nouns" (the data to act on). There is an online simulator, linked from my site, that you can try.

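As a rough illustration, the verb/noun scheme worked something like the toy sketch below. The verb and noun tables here are hypothetical stand-ins, not actual AGC software (only verb 16 with noun 36, "monitor the mission clock," mirrors a real pairing):

```python
# Toy sketch of the DSKY "verb/noun" idea -- illustrative, not real AGC code.
# A verb names the action; a noun names the data it acts on.

VERBS = {16: "monitor", 5: "display"}   # hypothetical subset of verb codes
NOUNS = {36: "mission clock", 62: "velocity and altitude"}

def dsky(verb, noun):
    """Return what a given verb/noun key-in would ask the computer to do."""
    if verb not in VERBS or noun not in NOUNS:
        return "OPR ERR"    # the DSKY lit an operator-error lamp on bad input
    return f"{VERBS[verb]} {NOUNS[noun]}"

print(dsky(16, 36))   # -> monitor mission clock
print(dsky(99, 1))    # -> OPR ERR
```

The point of the design was economy: a handful of two-digit codes replaced an entire keyboard and command language on a machine with almost no memory to spare.
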
The first moon landing was beset with problems. Armstrong had 17 seconds of fuel remaining after having to take manual control of the lunar module and fly past some obstacles. The landing site was four miles off course because the module wasn't completely depressurized when it separated from the command module, and the small amount of escaping gas pushed it off course. And during the descent, as several people have documented, the guidance computer repeatedly got overwhelmed with data and restarted itself, because Aldrin had not set one of the radar switches properly and it was flooding the computer with too much data. A young engineer, Stephen Bales, made the critical decision to ignore these warnings. There is a great video segment about it from CBS News that ran this week.

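A hedged sketch of what was going on inside: the AGC's executive kept a fixed number of job slots ("core sets"), and when the stray radar data scheduled more jobs than it had slots for, the software raised the 1202 alarm and restarted with only the highest-priority work. The slot count, job names, and priorities below are illustrative, not the real flight software:

```python
# Toy model of the 1202 "executive overflow" alarm -- not actual AGC code.

CORE_SETS = 7   # illustrative: the executive had only a handful of job slots

def schedule(jobs):
    """jobs is a list of (priority, name) tuples.
    Returns (alarm, jobs_kept): alarm is None if everything fits,
    or 1202 if the job list overflowed and had to be rebuilt."""
    if len(jobs) <= CORE_SETS:
        return None, jobs
    # Overflow: signal the alarm and keep only the highest-priority jobs,
    # mimicking the restart that reloaded just the vital tasks.
    vital = sorted(jobs, reverse=True)[:CORE_SETS]
    return 1202, vital

# The descent-guidance job survives; the flood of radar jobs gets shed.
jobs = [(3, "servicer"), (2, "dsky-update")] + \
       [(1, f"radar-{i}") for i in range(8)]
alarm, kept = schedule(jobs)
print(alarm)                        # -> 1202
print((3, "servicer") in kept)      # -> True
```

The key design decision, which is why Bales could safely wave the alarms off, was that a restart shed the low-priority work instead of crashing: the landing-critical jobs always came back.
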
There are probably hundreds of Web sites with various tributes to the space program; I will mention just two places that I enjoyed reading. First is a special report compiled by EE Times, which has eyewitness accounts from a few of the engineers who worked at NASA, along with a teardown of the space suits used and other technical information about the program.

The other is a list of numerous technological achievements from the space program that have found their way into our lives. And while Tang isn't on the list (and it is dubious whether it should be), there are lots of other things showing just how much innovation NASA had to do to put two men on the moon and bring them back home safely.

Tuesday, July 14, 2009

When did the browser become the next OS?

"We view the Internet as the fourth desktop operating system we have to support after Windows, MacOS, and DOS." That quote was from an executive at McAfee, and DOS gives it away that it was spoken back in 1996.

With the announcement that Google will develop a quick-start operating system for instant-on netbooks by next year, I thought it might be interesting to take a trip down memory lane and recall how we got to the point where the browser has become the next OS, one that is probably now moving into first place rather than fourth.

Of course, the smarmy retort to Google's announcement is that we already have a quick-start, ultra-reliable Web OS: it is called OS X, and my MacBook takes about five seconds from when I open the lid to when I can be surfing the Web. Unlike with many Windows PCs, I don't need a degree in advanced power management techniques, with a minor in spam and virus prevention, to get this to work.

But let's go into the WayBack Machine to the early 1990s and see the context of that McAfee quote.

The first Web browsers literally weren't much to look at, because they displayed only characters: basically just a page of hyperlinked text. One example was the then-popular Lynx, initially designed back in 1992 for Unix and VMS terminal users (that was back when we called them that). Think about this for a moment: this was pre-iporn, pre-IPO Netscape, pre-real Windows, when the number of Web servers was fewer than a few hundred. Not very exciting by today's standards.

Then Microsoft got into the game, and things started changing. With the introduction of Windows 95 we had the beginnings of a graphical Internet Explorer, which ironically was licensed from Spyglass Mosaic, descended from the same NCSA Mosaic code whose authors went on to create Netscape (and eventually Firefox). Windows 95 came with both IE and Windows Explorer, and the two were similarly named for a reason: browsing pages of the Web was the beginning of something similar to browsing files on your desktop. Things didn't really get integrated until IE v4, which came out about the same time as Windows 98, and they were so integrated that they begat a lawsuit from the Justice Department. Microsoft was legally declared a monopolist, and under the settlement finalized at the end of 2002 it had to offer ways to extract IE from Windows going forward for users who wanted to install a different browser.

During the mid-1990s, we began to see better support for the TCP/IP protocols inside Windows, although it really wasn't until Windows 98 Second Edition that Microsoft had improved the browser enough to include it as part of its Office 2000 product. Before then, we had separate drivers and add-on utilities that required all sorts of care and feeding to get online, in addition to the AOL and CompuServe dial-up programs.

As an example of how tightly IE was integrated with Windows: when Microsoft released IE v7 along with Vista, you initially needed to verify that your Windows license was legitimate before you could install the latest version of IE on earlier operating systems. That restriction was later removed.

And lately Microsoft has announced that its next version, Office 2010, will have even further Web integration and the ability to create online documents similar to the way Google Docs works. Google Docs is an interesting development in itself, because now documents are stored outside the desktop and managed through a Web browser. As long as I have an Internet connection, I don't need any software on my local machine to edit a document or calculate a spreadsheet.

So what is the real purpose of an operating system? Originally, it was to manage the various pieces of your PC so that your applications could talk to your printer or your hard drive or display characters on your screen without having to write low-level programs to do these tasks. Three things have happened since the early PC era:

First, as the Web and cloud computing became more powerful, we stopped caring where our information was located. In some sense, having documents in the cloud makes it easier to share them across the planet, without having to worry about VPNs, local area network file shares, and other things that get in the way. We even have cellphones like the Palm Pre with a Web-based OS built in, so applications don't have to be downloaded to the phone but can run in the cloud. At least, that will be the case once developers finally get their kits to build these Pre apps later this summer.

Second, as the desktop OS has matured, we don't have to worry about the underlying hardware as much, because the hardware has gotten more generic and the OS has taken on a bigger role (to match its bigger footprint, too). Although printer drivers are still scarce for Vista and 64-bit apps aren't as plentiful, for the most part we don't need a "thick" desktop OS. Yes, there are enterprise apps that need the OS, and some that need a specific version of Windows too, but most of our computing can be done without really touching much of the OS.

Finally, the browser is the de facto Windows user interface. Perhaps I should say the browser plus Ajax, or the browser plus Flash. But most applications that were formerly client/server now just use browser clients, or run inside a browser with minimal desktop downloads. This has been a long time coming, but Rich Internet Applications can now be considered on par with local Windows and Mac ones.

So here we are, at the dawn of the new Google OS. We have come full circle: from the green-screen character mode terminals of the mainframe and Unix era to the browser-based Webtops of the modern era. This doesn't mean that Windows 7 or 8 or whatever will become obsolete. Just less important. And given the multiple billions of dollars that Microsoft has made over the years from Windows (and let's not forget dear old DOS), you can imagine that there are some nervous folks up in Redmond these days.

Tuesday, July 7, 2009

How proudly we fail: how 25 innovative tech companies die

I recently wrote a story for Datamation.com that looked at 25 companies that are no longer with us but were ahead of their time with innovative products. Before you write in and say that I missed your favorite, I wanted to take a few moments here and talk about some of the interesting trends that I saw from this list. The reasons for failure could be broken down into five general categories:

Corporate hubris and hijinks. Tech companies don't have the best record when it comes to staying on task, and this is especially true when they merge or start to bleed their best people. Look at Ashton-Tate's dBase. When they were at the height of their powers in the 1980s, thousands of people around the world studied their programming language and built databases on PCs (I was one of them). Then they lost their way and were sold to Borland in 1991, and that was the beginning of the end for both the company and its flagship product: Borland had a competing database product and couldn't sustain dBase. Or Banyan's VINES network operating system, which also had a loyal customer base and innovative directory services long before they were implemented by Novell and Microsoft. How about Digital Communications Associates, maker of the Irma 3270 boards? They quickly disappeared after Attachmate acquired them in 1994.

The market evolved past them. Columbia Data Products made the first clone PCs back in 1982, not long after IBM came out with its model. They lasted five years before the market moved on to more efficient suppliers like Dell and HP. Ironically, we got some other innovation from Columbia that they were less known for: the SCSI storage interface that was used for many years to connect hard drives to PCs. AST Research was another company that had a dominant share of the peripheral expansion market in the 1980s, only to see many of those peripherals integrated onto PC motherboards.

Bright people working in the wrong company. Just because you have a collective brain trust doesn't mean you are going to live long and prosper. Sometimes the chemistry is wrong, or the circumstances aren't quite right. Take First Virtual Holdings, one of the pioneers of Internet payment systems: its founders went on to develop key products for PayPal. General Magic's founding fathers went on to develop key parts of several phones, including the iPhone for Apple and Android for Google, and to help start eBay.

The Osborne effect. One company is even the namesake of a failed strategy: pre-announcing products so far ahead that the announcement kills demand for what you are currently selling. Osborne Computers was the early leader in portable PCs that weren't all that portable: at close to 30 pounds and a few inches too big to fit under an airline seat, they were a bear to fly with. Nevertheless, when Osborne announced a new version in 1983, everyone stopped buying the current models.

Engage lawyers. Sue everyone. Repeat as needed. Research in Motion has used this tactic up to the present day, even though it has lost its share of suits, and millions of dollars, over the creation of the BlackBerry smart phone. SCO/Caldera Systems has done something similar on behalf of the early Unix inventors. Sometimes winning a lawsuit can be the death of a company, too: witness Stac Electronics, which won $120 million from Microsoft over its disk compression technology, something that is now part and parcel of just about every operating system.

Take a look at my trip down memory lane here: http://tr.im/rcro

About Me

My photo
David Strom has looked at hundreds of computer products over a more than 20-year career in IT and computer journalism. He was the founding editor-in-chief of Network Computing magazine and now writes for Baseline, Information Security, Tom's Hardware, and the New York Times.