Oil bath, anyone? Intel servers take a dip

In Eastern cultures, taking an oil bath is part of a weekly ritual meant to relax, restore and cool the body.

But what about applying the same concept to data centers? Could servers be cooled with mineral oil?

That's essentially what Intel has done at its Rio Rancho facility in New Mexico -- dunked servers in mineral oil and kept them there.

After some experimentation, the company is recommending it as a viable option for data center cooling, since energy bills can comprise a big portion of a data center's overhead. It says the mineral oil's cooling effect improves energy efficiency and reduces energy costs.

According to Intel, the practice also improves server performance, since better cooling lets the chips run faster.

The chip maker's Rio Rancho data center hosts large supercomputers for the state of New Mexico. With the necessary space and experienced personnel on hand, it was an ideal location for the experiment.

The oil bath experiment

Seven servers were placed in large vats of mineral oil and stood up on end, with the oil circulating through the servers to remove heat. They were compared for a year with seven other servers that had air cooling, with the same workloads running through both batches.

Intel found that cooling the immersed servers took only 2 to 3 percent more energy on top of what was needed to run them, versus the roughly 60 percent additional energy typically required for air-cooling servers.
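Those overhead figures map directly onto power usage effectiveness (PUE), the standard data-center efficiency metric. The sketch below is an illustrative back-of-the-envelope calculation, not Intel's published methodology:

```python
# Illustrative only: framing the article's cooling-overhead figures as PUE,
# the ratio of total facility energy to the energy used by the IT gear itself.

def pue(cooling_overhead_fraction):
    """PUE = (IT energy + cooling energy) / IT energy."""
    return 1.0 + cooling_overhead_fraction

air_cooled = pue(0.60)  # ~60% extra energy for traditional air cooling
oil_cooled = pue(0.03)  # ~2-3% extra energy observed in the oil-bath trial

print(f"air-cooled PUE: {air_cooled:.2f}")  # 1.60
print(f"oil-cooled PUE: {oil_cooled:.2f}")  # 1.03
```

On these numbers, oil immersion brings the facility close to the theoretical ideal of 1.0, where nearly every watt drawn goes to computation rather than cooling.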

At the end of the yearlong experiment, the company took the servers apart and conducted failure analysis. The oil did not damage the servers or cause any problems.

Photo of server in mineral oil bath provided by Intel

"It's been around for a long time for industrial transformers, so it was a natural extension for data centers," said Mike Patterson, Intel's senior power and thermal architect.

Intel collaborated with Green Revolution Cooling, an Austin, Tex.-based company that supplied the oil and the special vats and helped modify the servers for oil immersion. It also helped Intel analyze their performance.

Instead of a rack configuration that might have 40 servers stacked up, the form factor for the oil cooling system was a rack lying down. The configuration was set up "so you can see the front panels, switches and lights, but the servers are immersed in the oil itself," Patterson said.

As an extra energy saving step, Intel took the fans out of the servers, which resulted in some jerry-rigging to prevent a shutdown. Typically, if a fan stops working in a server, the server might decide to shut down because it cannot cool down. Removing the fan might lead the server to make the same assumption, so technicians put a jumper wire in that makes it look like the fan is there.

Aside from this, they did not have to make major alterations other than plugging the vent, since the majority of components inside the server worked just fine in oil.

"The oil flows through the server, wires, chips and components and does a very nice job of removing heat from the server," Patterson said.

The oil is pumped out of the vats through a radiator, much like a car's, where circulating air cools it. It's then sent back into the oil vats.

Servicing the servers is no different than usual, although Patterson admitted technicians had to get used to first draining the oil from the servers before commencing work.

He expects there will be some hesitation before the market adopts this cooling technique, which, although cutting-edge right now, is poised to take off once early adopters get over their reservations.

So is the biggest roadblock the challenge of adapting to the mindset that servers can be cooled while immersed in oil?

"It's getting used to doing things so drastically differently," Patterson said.

In terms of cost, the oil cooling process comes out ahead of a high-end containment system, running about 10 to 20 percent cheaper, he said.

Intel itself has no plans to convert its data centers to this method of cooling, although it will be one of several options it reviews when it plans future centers.

It conducted this experiment because it wanted to ascertain how it worked and be able to tell customers about the different options available, so they would fully "explore for themselves what is the lowest cost of ownership and make informed decisions," Patterson said.
