Cool technology: Submerged blade servers escape the heat

The beauty of Apollo

Keeping servers cool is a challenge, even in a purpose-built data centre. Imagine for a moment the difficulty of doing so as part of an oil pipeline in the Australian outback, or as part of a military command post in the deserts of Afghanistan.

I can tell you from experience that cooling is a serious issue even during a Canadian summer. My home town of Edmonton averages 25°C during the summer but gets above 30°C for three or four days a year.

This is more than enough to drive me indoors to cower in front of the air conditioner, but it is far worse for a lot of the equipment I have installed outside.

Outdoor equipment in Canada often lives in horrible little metal boxes ironically called "sheds" that bear no resemblance to any structure so spacious.

They are basically a full-depth 12U 19in rack bolted to the side of some steel monstrosity made up of nightmares and solar absorption. Inside the box sit 4U of server, 4U of networking and 4U of heating, ventilation and air conditioning.

With the chiller going flat out, the temperature doesn't drop below 60°C on the hot days and stays around 50°C for about four months of the year.

Meeting my needs in my local area is reasonably easy. Go just a few hundred kilometres to the south-west and you run into the unremitting hell known as Kamloops. This is a miserable place bereft of moisture, where temperatures gleefully push 40°C for weeks on end and sit north of 30°C for months.

Australians and other desert-dwelling madmen, of course, laugh with unrestrained mirth at these temperatures.

None like it hot

At first glance it would seem that the obvious solution to these problems is to replace the horrible little shed things with something better. That is far easier said than done.

For reasons involving bureaucrats (and, I am convinced, demonic pacts) getting external enclosures certified for use with telecoms equipment, on oil pipelines or with various other utilities is way harder than it should be.

The other part of the equation is that the instant we get a new enclosure that reflects heat better, has a better chiller or a more efficient exhaust, some yutz decides we can cram more wattage of computing into this next-generation shed than we did before.

Networking along 6,000 kilometres of oil pipeline is spotty at best, so cloud computing is not an option, and there is a truly unholy amount of sensor data to capture, crunch, compress and forward back to base.

In the meantime, something has to look at all those sensors and make decisions about whether or not to freak out, and we have learned from experience to keep the decision-making widgets as close as possible to the potential points of failure.

Even satellite links go down, and when you are 1,000 kilometres from the nearest major settlement response times are measured in hectares of the environment lost.
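
To make the job concrete, here is a minimal sketch of that kind of local decision loop. The sensor names, thresholds and pretend uplink are all my own inventions standing in for the real thing: check the readings on the spot, act immediately if something looks wrong, and queue everything for forwarding whenever the link back to base happens to be up.

# Illustrative sketch only: sensor names, thresholds and the "uplink"
# below are assumptions, not anything from a real pipeline system.
import random
import time
from collections import deque

PRESSURE_LIMIT_KPA = 9000          # assumed alarm threshold
TEMP_LIMIT_C = 85                  # assumed alarm threshold
backlog = deque(maxlen=100000)     # readings waiting for a working uplink

def read_sensors():
    """Stand-in for the real sensor bus; returns one sample."""
    return {
        "t": time.time(),
        "pressure_kpa": random.gauss(7500, 400),
        "temp_c": random.gauss(60, 5),
    }

def act_locally(sample):
    """Decide on the spot whether to freak out; no round trip to base."""
    if sample["pressure_kpa"] > PRESSURE_LIMIT_KPA or sample["temp_c"] > TEMP_LIMIT_C:
        print("ALARM: local shutdown/valve logic would fire here", sample)

def forward_if_link_up(batch):
    """Pretend satellite uplink that, like the real one, is often down."""
    if random.random() > 0.5 and batch:
        print(f"forwarded {len(batch)} readings to base")
        batch.clear()

if __name__ == "__main__":
    for _ in range(10):              # a handful of polling cycles for the demo
        sample = read_sensors()
        act_locally(sample)          # the decision never leaves the shed
        backlog.append(sample)
        forward_if_link_up(backlog)
        time.sleep(0.1)

The point is not the code; it is that the decision stays right next to the point of failure, link or no link.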

Oil on troubled servers

So, if the ovens we kindly refer to as shelters are not likely to get any cooler, we need servers that can handle higher temperatures.

The first answer that comes to mind is from LiquidCool Solutions, which does not manufacture IT equipment but licenses patents.

LiquidCool claims to be able to run servers "in ambient temperatures of 50°C or higher while maintaining core junction temperatures 30°C cooler than fan/air based cooling". That has got my attention, yes indeedy.

LiquidCool pitches "harsh environments" as a major use case. It claims to be able to reduce power consumption by 40 per cent or more, while cutting the power needed for cooling by an order of magnitude. In view of the number of solar-powered systems I work with, I am intrigued.
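
Back-of-the-envelope, and using illustrative figures of my own rather than anything from LiquidCool's spec sheets, here is what those two claims would mean for one of my solar-powered sheds if you read them as separate line items:

# Illustrative arithmetic only: the loads below are my assumptions,
# not LiquidCool's figures.
it_load_w = 1000        # assumed IT draw of one shed, fan/air cooled
cooling_w = 800         # assumed chiller draw keeping that shed alive

it_after_w = it_load_w * (1 - 0.40)   # "40 per cent or more" claim
cooling_after_w = cooling_w / 10      # "order of magnitude" claim

before_w = it_load_w + cooling_w
after_w = it_after_w + cooling_after_w
print(f"{before_w:.0f} W before, {after_w:.0f} W after "
      f"({1 - after_w / before_w:.0%} less for the solar array to supply)")

On those made-up numbers the array has to supply roughly a third of what it did before, which is exactly why the claim has my attention.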

LiquidCool licenses patents to build fully enclosed submerged server technology. Put your server in a box of oil, seal it and pump the oil out to a radiator. It is a great concept, but I have a few issues.

LiquidCool's website talks about resistance to "sand, salt, gases, corrosion and humidity". OK, that's neat, but I need to be honest and say that none of those have ever really posed much of a problem for me. Not that they aren't problems, but the real problem is metal dust.

Driller killer

My enemy is a welder. Actually, he is a welder, steamfitter and electrician. He drills holes into my hell-sheds. Little tiny holes with his evil little drills. And he attaches things to those sheds.

You would be truly shocked at how much sand you can get in a computer and it still works fine. Salt, urea (used in some road de-icing mixes) and even corrosion can occur in stupefying amounts and modern servers just keep working.

I am appalled, amazed and flabbergasted at the stuff I pull out of those sheds at maintenance cycles. I have seen PCBs I cannot unsee.

But metal dust is death. That irritating little welder and his irritating drill create little particles of metal dust that get everywhere. And this stuff kills computers.

Even worse than the welder are explosions. Pipelines have problems. When they go boom – and it happens rather more than we'd like – little bits of metal dust get everywhere.

Metal dust is digital cancer. It doesn't just waltz up to your server and shoot it. It gets sucked in by fans and settles exactly where you don't want it to settle because of the various electromagnetic and electrostatic fields in your chassis.

You can't put filters on the fan intakes: they clog solid within a month.

Assuming there is absolutely no possible way for the metal dust to get into the oil, a completely sealed, submerged computer might be a solution. Except for the part where swapping computer components out is a job likely to be given to Welder McMetaldust.

The concept of talking him through removing a box of electricity from a vat of oil on the side of a busy highway over a flaky satellite phone is like the start of a lost Lovecraft novel.

Surely there is a better solution?

Water's lovely

In June, HP announced its Apollo 8000 system at the HP Discover conference. What is on the HP website is mostly marketing fluff, but I have been able to put together a few details.

Apollo appears to use enclosed liquid cooling rather than oil submersion, with removable dual-CPU server sleds.

Contemplate this for a moment. HP's "dry disconnect" technology offers the efficiency of liquid cooling, and also the ability to hot-swap nodes from a chassis.

With a bit of work a system like this could be engineered to keep the thrice-damned metal dust out, while even the most non-technical among us could be walked through swapping a node over the phone. I think I'm in love.

Alas, all is not quite perfect. Apollo weighs in only at full-rack scale. No 8U wonders for me yet.

Still, it proves that the technology exists, and that means it is only a matter of time before it comes down from the lofty heights of high-performance computing and works its way into my awful little metal sheds.

Send in the army

Of course, there is another group that buys computers which need some ruggedisation: the military.

Here, HP's Apollo looks nearly perfect. HP claims to be able to exceed Energy Star Platinum certification levels for power efficiency. A military deployment constantly strapped for power will definitely be grateful for that.

HP also claims that thanks to modular plumbing it "can be operational quickly" rather than in days or weeks. Again, this is important if your job is to set up camp, then take it down and move it in short order.

If my conversations with Christopher Kusek at VMworld were any indication, real-world military deployments have even more problems with that dreaded metal dust than my shed systems.

Project Apollo is aimed at the high-performance computing market. It is designed to get more computers into a smaller space while using less power than the competition.

But I can see a much wider market for this: ease of use in places where 20 years ago people wouldn't have thought computers could go.

Never let it be said that HP doesn't innovate. Now let's hope it can bring this technology to the wider world, and fast. I have sheds to fill. ®
