Data centers embrace The Great Outdoors

Why your big iron needs Windows windows

Some years ago, when power and cooling issues first started coming to the fore in data center conversations, some of us joked that the smart thing to do would be to move big data centers to the polar caps and use the cool air to keep the iron from overheating.

Of course, the polar caps are melting now, thanks to global warming, not supercomputers, so there goes that idea. But the next best thing is now coming into vogue: Some outfits are opening up the walls of their data centers when the sun goes down.

This idea is pretty obvious, particularly if you have been living with a Windows-Linux cluster as I do. In the winter, I use the cluster not only to support a bunch of websites, but also to dry wet clothes and keep my apartment warm, and when it gets too hot, I just open up the windows and let the cool air in. I figured this out in the first week of operating my own data closet, and it is a bit of a wonder that big data centers haven't figured it out sooner.

But if you think about it, when energy was cheap and compute densities were fairly low, it wasn't that big a deal to keep paying for chillers and AC units in the data center. Nowadays, with energy costs skyrocketing and the cost of building a large data center hitting $10 million, companies (particularly large ones with big computing needs) are scrambling to cut expenses. And the answer - at least in some geographical locations blessed with cheap power and natural cool - might be as simple as opening up the walls and letting the air in at night.

Expect a lot of people to try to take credit for this idea, which Intel is calling the Air Economizer rather than something obvious like Direct Global Warming or Natural Cooling.

Back in early August, Advanced Data Centers, a San Francisco-headquartered owner and operator of data centers, announced that it is opening a 41,000 square foot data center outside Sacramento, where the nights are cool and where the Sacramento Municipal Utility District is bringing a 45 megawatt substation online and charging 40 to 50 per cent less for juice than Pacific Gas & Electric does in the Bay Area.

The data center also has 30,000 sq ft of related office space and is located on a brownfield site, formerly McClellan Air Force Base. The base was closed in 2001 and a portion was converted into a business park called, appropriately enough, McClellan Business Park. (Think of it as The Office with an Air Force-class runway).

Ambient temperatures

Here's the interesting bit about ADC's McClellan data center. A 15-foot high, 100-foot long section of the data center wall opens up and lets outside air into the data center to cool it when the temperature conditions are right. And because of the temperate weather in the Sacramento area, outside air can cool this data center 75 per cent of the time.

When you add in the already lower electricity costs and the fact that the data center will have a Power Usage Effectiveness (PUE) rating of 1.1 (compared with anywhere from 1.8 to 3.0 for a typical data center), it stands to reason that companies will be knocking on ADC's door to move their computers out of San Francisco and into the McClellan center.

PUE is a metric created by the Green Grid industry consortium: the ratio of total data center power to total IT equipment power. Getting a PUE of 1.1 is remarkable, and it is on par with what the containerized data centers sold by Sun Microsystems, Rackable Systems, and Hewlett-Packard achieve by using containers to house IT gear compactly and water jackets to quickly remove heat from that gear.
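To make the metric concrete, here is a minimal sketch of the arithmetic, assuming a 2 megawatt IT load (roughly one of ADC's planned suites) and made-up facility power figures for the two PUE levels:

    def pue(total_facility_kw, it_equipment_kw):
        # Power Usage Effectiveness = total facility power / IT equipment power
        return total_facility_kw / it_equipment_kw

    it_load_kw = 2000  # hypothetical 2 MW of servers, storage, and switches

    # At PUE 1.1 the facility burns only 200 kW on cooling, power distribution,
    # and lighting; at a typical PUE of 2.0 it burns as much on overhead as it
    # delivers to the IT gear.
    print(pue(total_facility_kw=2200, it_equipment_kw=it_load_kw))  # 1.1
    print(pue(total_facility_kw=4000, it_equipment_kw=it_load_kw))  # 2.0

Every kilowatt shaved off that overhead is a kilowatt you stop buying, which is why the number matters so much to the electricity bill.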

The ADC data center in McClellan Business Park will open in the spring of 2009, and ADC is carving it up into 10,000 square foot suites that can deliver about 2 megawatts of juice to computing gear. Michael Cohen, president at ADC, says the facility can be expanded to about 220,000 square feet of data center space, and given these numbers, you can bet that expansion is part of the plan.

McClellan is only an hour and fifteen minutes outside San Francisco - meaning it is outside the earthquake zone and flood plain - and it has its own airport. So it has more appeal than just a good power rating.

Cool running

Intel has also done its own proof of concept on the outside air cooling idea, and in that test the company was able to cool the data center almost exclusively with outside air at 90 degrees or cooler. Intel thinks that a correctly placed 10-megawatt data center could save $2.87 million annually in power costs. (ADC reckons its McClellan data center will save between $1 million and $2 million a year).
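Intel did not publish its arithmetic, but a back-of-the-envelope sketch shows how a number in that ballpark falls out. Every input below (the IT load, the two PUE figures, and the electricity price) is an assumption for illustration, not Intel's data:

    HOURS_PER_YEAR = 8760

    it_load_kw = 10_000       # a "10 megawatt" data center, read as IT load
    pue_chilled = 1.8         # assumed PUE with conventional chillers and AC
    pue_economizer = 1.1      # assumed PUE with the air economizer
    price_per_kwh = 0.05      # hypothetical electricity price, in dollars

    overhead_saved_kw = it_load_kw * (pue_chilled - pue_economizer)
    annual_savings = overhead_saved_kw * HOURS_PER_YEAR * price_per_kwh
    print(f"${annual_savings:,.0f} per year")  # about $3.1m with these inputs

Nudge the PUE gap or the electricity price and you land right on Intel's $2.87m; the point is simply that the claim is the right order of magnitude.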

In Intel's proof of concept, a data center with 900 blade servers was set up in a temperate climate, half with a traditional air conditioner-chiller setup, half with the air economizer. Intel did not attempt to control the humidity in the air economizer setup and did minimal air filtering.

The test ran over 10 months, and the air economizer setup was used to cool the 450 blade servers on its side of the data center 91 per cent of the time, resulting in a 67 per cent saving in electricity costs compared to the other half of the data center, which used normal AC. The test was done on two 500 square foot data centers - by no means large - but Intel's techies think the idea can scale. (Start buying up land now, speculators).

Intel also tested how running the gear at higher temperatures and with more dust and humidity affects failure rates. (Believe me, this is a problem in an Upstate Manhattan window-cooled data center looking out over a bus depot, the A train depot, and a garbage truck depot, with lots of dust coming in the windows from across Broadway and a resident Border Collie with long hair).

Intel's test - which only spanned 10 months, remember - showed a 4.46 per cent failure rate for gear in the air economizer setup compared to 3.83 per cent in the AC-cooled data center right next to it. My advice: run this test for four years and get back to me. Commercial-grade servers can take the heat and the dust too - at least in my experience. But I think you'll see more of a divergence in failure rates over time.

If you really want to go crazy, and you are not worried about security, you could try what Microsoft did: build a tent city for your data center.

Microsoft's data center managers wanted to test how computers could handle the Washington weather, so they put a rack of machines under a metal-framed tent out in the fuel storage depot behind the real Microsoft data center. The tent city had five HP DL585 Opteron boxes running workloads from November 2007 through June 2008, and even with water dripping on the racks, a tent collapse during a storm, and a leaf getting sucked into the fans, the servers just kept going. ®
