Five Top Ways to Cut Your Data Center's Energy Bills

The quickest way to green IT is to cut your data center's energy bills. That's easier said than done. But Kenneth Brill, executive director of the Uptime Institute, has some free advice on how you can cut your use of electricity.

For the full story, check out his article at Forbes. Here are his top five tips:

Correctly set the temperature and relative humidity set control points on the cooling units
Brill has some counterintuitive advice here: using the coldest air is not best. He warns that "Cold intake air is actually bad for reliability because it causes water to condensate inside the hardware." Air colder than 59 degrees Fahrenheit can cause water to condense inside the equipment. His conclusion: "The temperature of the air leaving the cooling unit should be within a few degrees of the desired hardware air-intake temperature. A good equipment intake temperature is 72°F."
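As a rough sanity check, the two numbers from the article (the roughly 59°F condensation floor and the 72°F intake target) can be turned into a simple classifier for discharge temperatures. This is an illustrative sketch, not a procedure from Brill; the 3°F tolerance for "within a few degrees" is an assumption:

```python
# Illustrative check of cooling-unit discharge temperatures against
# Brill's guidance: discharge air should sit within a few degrees of the
# desired 72°F intake temperature, and never below ~59°F, where
# condensation inside the hardware becomes a risk.

CONDENSATION_FLOOR_F = 59.0   # below this, water may condense in hardware
TARGET_INTAKE_F = 72.0        # recommended equipment intake temperature
TOLERANCE_F = 3.0             # "within a few degrees" -- assumed value

def check_discharge_temp(discharge_f: float) -> str:
    """Classify a cooling unit's discharge temperature."""
    if discharge_f < CONDENSATION_FLOOR_F:
        return "too cold: condensation risk"
    if abs(discharge_f - TARGET_INTAKE_F) <= TOLERANCE_F:
        return "ok"
    return "off target: adjust set point"

if __name__ == "__main__":
    for temp in (55.0, 71.0, 80.0):
        print(f"{temp}°F -> {check_discharge_temp(temp)}")
```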

Determine the number of cooling units running as compared to the actual heat load
Most data centers have far more cooling units running than they need, he says. In fact, he concludes that "the typical computer room on average has three times more cooling running than is required by the actual heat load." Not only that, but "the computer rooms with the most excessive cooling had the highest percentage of hot spots." His advice: match the amount of cooling you run to the actual heat load.
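A back-of-the-envelope way to quantify that overprovisioning is to compare the total rated capacity of the running units with the measured heat load. The numbers below are invented for illustration; a ratio near 3.0 would match Brill's typical computer room:

```python
# Illustrative comparison of running cooling capacity vs. actual heat load.
# All capacities and loads here are made up for the example.

def cooling_ratio(running_capacities_kw, heat_load_kw):
    """Ratio of total running cooling capacity to the actual heat load."""
    return sum(running_capacities_kw) / heat_load_kw

units_kw = [70.0] * 6   # six cooling units running, 70 kW rated each
load_kw = 140.0         # measured IT heat load

ratio = cooling_ratio(units_kw, load_kw)
print(f"Cooling running is {ratio:.1f}x the heat load")  # 3.0x
```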

Validate that all cooling units are capable of delivering rated capacity
Just because a cooling unit is rated to deliver a certain amount of cooling doesn't mean it actually does. Poor maintenance or installation is commonly the problem: incorrect piping, plugged filters, and compressors undercharged with refrigerant are all common issues. His solution: contract with a third party, independent of your current contractor, to do a complete survey and show you the results.

Deliver cold air where it is most needed
Sounds obvious, doesn't it? But knowing this and doing it are two different things. In most data centers, cooling amounts to randomly mixing cool and hot air. Among his recommendations: set up separate hot and cold aisles, and keep the temperature difference between them at least 10 degrees.
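That 10-degree separation is easy to verify from aisle temperature readings. A minimal sketch, with invented readings (the 10°F threshold is the only figure taken from the article):

```python
# Illustrative check that each hot/cold aisle pair maintains at least the
# 10°F separation the article recommends. Readings are invented.

MIN_DELTA_F = 10.0

def aisle_separation_ok(cold_f: float, hot_f: float) -> bool:
    """True if the hot aisle is at least MIN_DELTA_F warmer than the cold aisle."""
    return (hot_f - cold_f) >= MIN_DELTA_F

# (cold aisle °F, hot aisle °F) readings for two aisle pairs
readings = [(72.0, 85.0), (73.0, 80.0)]
for cold, hot in readings:
    status = "ok" if aisle_separation_ok(cold, hot) else "mixing too much"
    print(f"cold {cold}°F / hot {hot}°F -> {status}")
```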

Eliminate dehumidification and humidification
Brill says that because computer rooms should contain only hardware, not people, you should restrict the amount of outside air coming into them. As a result, you shouldn't need dehumidification or humidification.

He recommends: "As a simple diagnostic, have someone do an inventory of the dehumidification or humidification indicator lights on every cooling unit and record the physical location of each unit relative to each other."

Doing this, he says, will probably take only 30 minutes, and "is likely to indicate humidification on one cooling unit with the immediately adjacent unit simultaneously de-humidifying." Clearly, that's a massive waste of energy.
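Brill's indicator-light inventory lends itself to a simple tabulation. The sketch below (unit names, positions, and modes all invented) flags adjacent units working against each other, the pattern he says the walk-through is likely to reveal:

```python
# Sketch of the diagnostic Brill describes: record each cooling unit's
# humidity indicator and physical position, then flag adjacent units where
# one humidifies while its neighbor dehumidifies -- energy wasted twice.
# All data below is invented for illustration.

units = [
    ("CRAC-1", 1, "humidify"),
    ("CRAC-2", 2, "dehumidify"),
    ("CRAC-3", 3, "off"),
    ("CRAC-4", 4, "dehumidify"),
]

def find_fighting_pairs(units):
    """Return adjacent unit pairs where one humidifies and the other dehumidifies."""
    ordered = sorted(units, key=lambda u: u[1])  # sort by physical position
    pairs = []
    for (a, _, mode_a), (b, _, mode_b) in zip(ordered, ordered[1:]):
        if {mode_a, mode_b} == {"humidify", "dehumidify"}:
            pairs.append((a, b))
    return pairs

print(find_fighting_pairs(units))  # [('CRAC-1', 'CRAC-2')]
```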

Brill's bottom line
Taking these five steps, Brill says, can save substantial amounts of energy. He notes: "One lightly loaded data center cut total energy consumption by 25% and chronic hot spots magically disappeared."

He warns, though, that you'll need serious engineering oversight when putting his advice into practice.