Wednesday, October 31, 2007

A new kind of data center testing facility

I'll let you in on one of my big passions – I have a certain fondness for visiting data centers. Maybe it is the feeling of power coursing through all those racks of servers, or gaining access to the inner sanctum of IT after passing through a series of security checkpoints. Or it could be just seeing how all this gear has been wired up. I was always big on looking at the backs of equipment and checking the cables whenever I got a demo from some vendor.

So when the folks at Schneider Electric and its American Power Conversion subsidiary asked me if I wanted to come to their open house of a new kind of data center, they were talking to the right guy, and I jumped at the chance.

The place is an oddity for several reasons. First, it is built like an actual working data center but with one key difference: there is literally nothing inside it. Instead, the mostly empty building has lots of HVAC equipment, electrical power, and plenty of monitoring and modeling tools. The idea is to have "a facility dedicated to practical solutions, not a lot of hype," says Aaron Davis, the chief marketing officer for the subsidiary.

Schneider built its data center, which it calls its Electric Technology Center, to serve as a test bed for its customers – to show IT managers how to reconfigure their own data centers, which have evolved from mainframe-centric rooms to housing more distributed systems. It is a great idea and overdue. As IT shops outgrow their data center infrastructures, they want to be able to figure out the power and cooling issues and how to retool their data centers appropriately.

If you run a data center, chances are you have some pretty old equipment that you'd like to replace but literally don't have the energy to do it. Your raised floors are probably filled with outdated cabling that is so thick you have lost much of your airflow capacity and cooling ducts. Your air conditioning is on overload because it was never designed to cool racks of gear, and the temperature varies greatly from one aisle to another as a result. Your backup generators and power conditioning equipment are probably not matched to the gear they are backing up, and you have no idea what should be upgraded first.

Wouldn't it be great to model what you need to do, before you actually have to bring servers down and remodel? That is the essence of the idea behind what Schneider is trying to do with its new testing facility, located outside of St. Louis. Think of it as one big (more than 100,000 square feet) playroom where you can bring in gear and move it around and test various situations before you have to deploy it in your own shop.

Some companies are fortunate and able to rebuild or relocate their entire data center, something that I got to witness first-hand when the data center at the end of my block was rebuilt to new specs. (See the article here on my night at Rejis when they moved their facility just a few feet.)

But not everyone can just take a former parking lot and erect a new building to serve modern needs. Some IT shops have to do a fair amount of retrofitting, and that's where the St. Louis test bed comes in handy. Firms can build racks and lay them out on the floor, and try out different scenarios to measure airflow, power consumption, and temperature gradients for their gear. There are also two huge temperature-controlled testing rooms that can rapidly heat up or cool down and be used to see what happens to particular gear.

I am glad that the company picked St. Louis for its facility, because being the data center groupie that I am, I hope to visit often and see what they are doing with their customers. Plus, it is a really neat-looking building that also serves as a showroom for some of the company's product lines. Schneider bought APC earlier this year and merged it with its MGE division, which sells electric power control equipment. While most of us know APC from its battery backup boxes (or we should), the company also makes large-scale rack power and cooling gear designed for data center use.

Their push has been to isolate airflow to the immediate vicinity of the racks, so you are cooling the smallest air volumes and reducing the amount of power needed for cooling. This has lots of appeal, particularly these days when everyone is going green and when oil prices continue to reach new highs. At the launch event last week, representatives from the US Department of Energy spoke about how they are working together to reduce the energy usage of data centers. "This is real low-hanging fruit," said Douglas Kaempf, who runs the Industrial Technologies Program at DOE. The Schneider facility has 7 MW of power supplied by the local utility, which is enough to power a reasonable suburb.

Ironically, the Schneider facility is located in between two massive data centers of Mastercard and Citibank, just the other side of the Missouri River from where one of the worst floods happened about 15 years ago. Don't worry – all three are on high ground and have plenty of backup resources too.

If you are looking at a data center remodel, keep this place in mind. The daily rental fee starts at $5,000, depending on customer needs.

Wednesday, October 17, 2007

Choosing a toll-free number

When was the last time you had a business with a toll-free number? Was it back in the mid-1990s, when the "new" area codes 888, 866 and 877 started showing up?

My step-daughter recently asked me "what the deal was with the 877 area code?" She grew up in an era when long distance was always free on her cell phone. It got me thinking about how things have changed with Ma Bell (even saying that will date me, I am sure).

I had an 800 number back in the day when I thought it was important for people to be able to call me easily. This was when I had 128 kbps ISDN "broadband" Internet, and had to pay something like two cents a minute for each call to my ISP (that cut down on my surfing time, to be sure). Most of the time I got wrong numbers, for which I think I paid regular long-distance charges. My business phone bill was around $300 a month, including the ISDN access.

Fast forward to today, when my personal phone bill is around $200 a month and the thought of having a "dial-up ISP" and ISDN puts you back in the Cretaceous period. Of course that includes several cellular lines, DSL, and unlimited wireline long distance. But there are still some situations where you might want a toll-free number for your business, or even personal needs. So what do you do?

The easiest and cheapest way to get a toll-free number is if you are already a Vonage VOIP customer (there are still a few of us diehards around). It costs an extra $5 a month with a $10 activation fee, and you have your choice of 877 or 866 numbers with 100 minutes of inbound calling included. The number is tied to your existing Vonage line, of course, and it takes seconds to sign up via their Web site. Clearly, these guys get how to do self-service features.

If you aren't a Vonage customer and don't expect a lot of calls, you can get a toll-free number from one provider for $10 a month that includes 30 minutes of inbound calls on one of its messaging plans. After that, the price is 7 cents per minute, which can add up. A better deal is a plan from another provider, where the same $10 a month gets you 200 minutes, and then 4.9 cents per minute after that.
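To see where the two plans diverge, here is a minimal back-of-the-envelope sketch in Python. The rates are the ones quoted above ($10 with 30 included minutes at 7 cents overage, versus $10 with 200 included minutes at 4.9 cents overage); the function itself is purely illustrative, not anything the providers publish.

```python
def monthly_cost(minutes_used, base_fee, included_minutes, overage_per_minute):
    """Return the total monthly bill in dollars for a given usage level."""
    overage = max(0, minutes_used - included_minutes)
    return base_fee + overage * overage_per_minute

# Plan A: $10 for 30 included minutes, then 7 cents per minute.
# Plan B: $10 for 200 included minutes, then 4.9 cents per minute.
for minutes in (30, 100, 200, 500):
    plan_a = monthly_cost(minutes, 10.00, 30, 0.07)
    plan_b = monthly_cost(minutes, 10.00, 200, 0.049)
    print(f"{minutes:>3} min: plan A ${plan_a:.2f}, plan B ${plan_b:.2f}")
```

Run it and you can see the gap: at 100 minutes the cheaper-sounding plan already costs $14.90 against the flat $10, and at 500 minutes it is $42.90 versus $24.70. If you expect more than a handful of calls, the bigger bucket of included minutes wins quickly.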

That service is just one of a number of Web sites that let you type in your name or some catchy seven-letter phrase and see if you can match it to a particular toll-free number, where the first 100 minutes will cost – you got it – $10 a month, plus a $29 activation fee. They offer all the various toll-free prefixes too.

There are a number of differentiating features among these plans: some will send your voice mail to email as notifications with audio attachments, some will allow you multiple "extensions" on your line for different users, some can forward to different numbers or offer a "follow-me" type of service, and some tie toll-free fax into the voice line too.

Speaking of Ma Bell, I tried to get information from AT&T's various Web sites about toll-free numbers, but wasn't able to find anything even after I entered my login information as one of their customers. That is shameful, and just goes to show you how far we still have to go with toll-free calling.

Wednesday, October 10, 2007

Learning from NASA

I spent a day at the Kennedy Space Center in Florida this week while I was attending a user group conference nearby in Orlando. As my wife and I rode around the vast complex, I thought about the many things that public relations and tech marketing folks can learn from the way NASA tells its story to the public.

NASA has always spent a lot of money on PR. Sending people into space is dangerous and expensive, and continues to be so. And while the agency's reputation isn't what it used to be compared to the "Right Stuff" 1960s when we were trying to land on the moon, they still do a lot of things right when it comes to getting their message across and educating the public about what it takes to work in space.

I admit it – I am a space junkie and grew up fascinated with astronauts and the whole lunar landing thing. Growing up on Long Island, I knew that the local space contractor was Grumman and that they built the spidery lunar lander, and I even had a model of it in my bedroom. Later, I spent some time at the Cradle of Aviation Museum, which isn't far from where Lindbergh took off for Paris, and got to meet Fred Haise, one of the Apollo 13 astronauts (Bill Paxton plays him in the film version), when he gave a lecture there several years ago. (Just to complete the connections with Lindbergh, now I am living in St. Louis. He also wrote his memoirs while he was living in Port Washington, NY, where I lived for many years too.)

What made the visit to the space center memorable were the testimonial videos from long-time NASA employees – short, YouTube-like segments about people who had rather odd jobs but took pride in doing them, for decades in some cases. As an example, the various pieces of each shuttle need to be put together in a special building called the Vehicle Assembly Building and then towed over to the launch pad. The guy who drives the tractor that tows this multi-million-pound rig talked about how he isn't exactly NASCAR material – the tractor's maximum speed is one mph – but when he gets to the pad he has to position the rocket within a sixteenth of an inch for it to be properly launched. You could see him positioning the rocket with a joystick in the video and wonder: how cool is that?

Another video was about the guy who runs the recovery operations to pick up the booster rockets once they are ditched in the ocean. NASA recycles them, but first they have to track them down after the launch, and the process isn't easy. We saw videos of divers wrangling the boosters – everything is bobbing up and down in the ocean while the divers try to attach the lines to tow the rockets back to shore.

The best part of the complex for me was the actual firing control room that has been reassembled to show you what happened in the moments before and after one of the Apollo launches. Those of us who grew up glued to our black-and-white TVs watching the moon launches will find this iconic techno stage setting fascinating, and it was great to see the attention to detail – the various instrument panels lit up as they came into play during the countdown. Now we have space entrepreneurs who can run their launches remotely over the Internet with just a small staff.

Even though I visited the space center when I was very young, I wasn't prepared for how vast the place is – you need to take a series of bus rides from one site to another, and of course much of it is very much a working industrial site that is off limits to the general public. My wife and I got to go on a simulator ride that shows you what liftoff in the shuttle feels like. And we ate lunch literally underneath the huge Saturn V/Apollo rocket that is lying horizontally and stretches close to 400 feet.

What was special about the space program, then and now, is that it takes the right mix of teamwork and selflessness and ingenuity to pull all this technology off. And while the shuttle fleet is aging, it is a testimonial to how many missions we have launched successfully and how well the fleet really works. While some might argue that sending people into space is a luxury we can't afford, I like to think that the innovations and sense of discovery continue to inspire many of us in the high-tech field. I give NASA a lot of points for doing such a great job, and the next time you find yourself in the area, do plan on spending some time at the space center – and maybe skip a day at the theme parks that ring Orlando.

About Me

My photo
David Strom has looked at hundreds of computer products over a more than 20 year career in IT and computer journalism. He was the founding editor-in-chief of Network Computing magazine, and now writes for Baseline, Information Security, Tom's Hardware, and the New York Times.