Green Computing at Google

The Dalles is an ideal location for Google's "Project 02" -- the company's code name for its not-so-secret new data center. The 30-acre site in Oregon was chosen for its ready access to cheap electricity from the 1.8 million kilowatt hydroelectric dam just upriver, as well as cheap land and an accommodating local government. The same factors influenced its purchase of 215 acres in Lenoir, N.C. earlier this year: cheap land, cheap energy and room to grow.

Google's recent land grab results from a sobering reality: Large server farms consume voracious quantities of energy, and Google wants to meet that demand in the most environmentally responsible manner possible. Despite the environmental impacts of dams, using renewable energy from an already-existing dam is the best alternative for a company with Google's energy needs and a credo of "Don't Be Evil."

"All else being equal, we would much prefer greener sources of power," said Bill Weihl, head of Energy Strategy at Google. But as he was quick to point out, Google's thirst for power must be balanced against its business realities -- in this case, the significant costs of its energy choices.

Part of Weihl's job is to take these costs into account, and in doing so he brings a wider view of the total cost of ownership to IT decisions.

"The [IT] industry has been driven by price and performance for 25 years, not as much by total cost, and certainly not by energy efficiency or environmental concerns," Weihl said. He believes that's begun to change in the last few years, in part due to Google's influence.

Google has embraced a wide range of environmentally friendly practices. Some of them, like the company's fleet of 32 Wi-Fi-enabled, biodiesel-powered employee shuttle buses, are not options for many smaller companies. Other projects, like the 1.6-megawatt rooftop solar installation just being completed on Google's Mountain View, Calif., headquarters, may pose daunting up-front costs, but will pay for themselves within several years.

The True Cost of Ownership

Many of Google's green-IT initiatives are readily replicable by others. Weihl singled out a couple of ways that just about any business can quickly start reducing its IT costs and environmental impacts at the same time. For example, looking at total cost of ownership -- of large-scale purchases like solar power or smaller ones like server power supplies -- can bring long-term savings to a company.

Take power-supply efficiency. In the U.S., power comes out of the wall at 120 volts of alternating-current (AC) electricity. The computer's power supply converts that to lower voltages of direct-current (DC) electricity for use in other parts of the machine: from 1.25 to 12 volts for components like the disk drives, memory, processors and graphics cards.

The power supplies in most computer systems, Weihl said, are still being built to standards set in 1981, the year of the first IBM PC. But over the years, the voltage needs of computers have steadily shifted, and now computer power supplies must output power at several different voltages for all these components -- like having four power supplies instead of one.

Last fall, at Intel's Developer Forum, Weihl and colleague Urs Hölzle presented a white paper on that topic. They explained that while the typical power-supply efficiency of personal computers and servers ranges from 50 to 70 percent, spending a little more per machine has enabled Google to make its servers' power supplies as high as 92 percent efficient.
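The efficiency gap described here is easy to quantify: a 50-percent-efficient supply draws twice the wall power of the load it actually serves, with the rest shed as heat. A minimal sketch of that arithmetic, where the 200-watt server load is an illustrative assumption and only the efficiency figures come from the article:

```python
# Wall power drawn for a given DC load at different power-supply efficiencies.
# The 200 W load is hypothetical; the efficiencies are the ones quoted above.

def wall_power(dc_load_watts, efficiency):
    """Watts drawn from the wall to deliver dc_load_watts to the components."""
    return dc_load_watts / efficiency

load = 200.0  # assumed DC load of one server, in watts

for eff in (0.50, 0.70, 0.92):
    draw = wall_power(load, eff)
    print(f"{eff:.0%} efficient supply: draws {draw:.0f} W, wastes {draw - load:.0f} W as heat")
```

At 50 percent efficiency the supply wastes as much power as the server uses; at 92 percent the waste falls to roughly 17 watts on the same load.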

Weihl and Hölzle explained how the company has worked with its vendors to build servers whose power supply has a single 12-volt output. Voltage regulator modules on the motherboard do the conversions to lower voltages, eliminating the need for multiple supply rails.

Weihl said the changes Google has adopted in its servers are small and low-risk, and they're now working with Intel and other manufacturers to make these more-efficient power supplies available to the entire industry.

Keeping Their Cool

Increasing the efficiency of its servers has the added benefit of reducing the heat output of each machine. In the last five years, the amount of heat generated by denser, more powerful computers has climbed dramatically. Data from Sun shows that between 2004 and 2005 alone, data centers' energy density tripled, from 40 to 120 watts per square foot.

Filling a data center with more computers using more power can make cooling costs higher than the cost of the building itself. A recent article in Computer magazine found the cost of powering and cooling computers worldwide in 2005 was $26.1 billion, half the total amount spent on buying new servers that year. And escalating energy prices coupled with steadily increasing computing power mean that soon -- some estimates say within the next five years -- cooling your data center will be more expensive than filling it with computers.

All the more reason to find the most effective -- and cheapest -- ways to reduce heat output from your computers. Google won't discuss the specifics of how it designs or cools its data centers (Google's secrecy on the specifics of its operations rivals that of the National Security Agency), but at least part of the cooling at The Dalles data center comes from evaporative cooling, with water from the Columbia River pumped through the center.

Weihl said that although this kind of cooling is not an option for smaller companies, evaporative cooling is saving Google about a quarter of its cooling costs, and possibly more. "I don't have the numbers for how much we save, but it's well worth doing, from an energy and a dollar point of view," he added.

The Power of Management Software

Moving from the large scale to much smaller -- but no less effective -- approaches to green computing, Bill Weihl said Google is looking very closely at a resource that most companies already have, but may not be using: a computer's power-management software.

Simply activating the power-save features on a fleet of corporate desktops -- which in Google's case could be as many as 12,000 computers for its employees worldwide -- can save 50 or 60 percent of the energy wasted when computers are left on, and idle, for 16 hours a day.
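A back-of-the-envelope version of that claim can be sketched in a few lines. The fleet size, idle hours and savings fraction come from the article; the per-desktop idle wattage and electricity price are illustrative assumptions, not Google figures:

```python
# Rough estimate of fleet-wide savings from enabling power management.
# IDLE_POWER_W and PRICE_PER_KWH are assumed values for illustration only.

IDLE_HOURS_PER_DAY = 16      # idle time per desktop, per the article
FLEET_SIZE = 12_000          # desktops worldwide, per the article
IDLE_POWER_W = 80.0          # assumed draw of an idle desktop plus monitor
SAVINGS_FRACTION = 0.55      # midpoint of the 50-60 percent figure quoted
PRICE_PER_KWH = 0.10         # assumed electricity price, USD

idle_kwh_per_year = FLEET_SIZE * IDLE_POWER_W * IDLE_HOURS_PER_DAY * 365 / 1000
saved_kwh = idle_kwh_per_year * SAVINGS_FRACTION

print(f"Idle energy: {idle_kwh_per_year:,.0f} kWh/yr")
print(f"Saved by power management: {saved_kwh:,.0f} kWh/yr "
      f"(~${saved_kwh * PRICE_PER_KWH:,.0f}/yr)")
```

Even under these modest assumptions the savings run to millions of kilowatt-hours a year, which is why the GE case study below is plausible at larger scale.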

These features -- dimming the monitor, spinning down the hard disks, putting the machine to sleep -- have been standard on laptops from the beginning, and they're also built into most desktop computers made in the last few years.

But the U.S. Environmental Protection Agency has found that only a tiny fraction of computers -- as low as 5 percent -- actually use these power management features. When they do, it's worth it. In one case study, General Electric saved $6.5 million in electricity costs a year simply by changing computers' settings.

On July 20, the EPA will implement its Energy Star 4.0 standards, the first major overhaul of the energy-efficiency standards since 2000. In addition to requiring 80 percent efficiency in the power supply (a big step up from the 65 percent most computers currently get), one of the new requirements for Energy Star certification is having these power-management systems turned on in new computers when they are shipped. Displays must be set to go to sleep after 15 minutes of idling and the whole computer must sleep after 30 idle minutes.

Although Hewlett-Packard has already rolled out Energy Star 4.0-compliant computers, and other manufacturers will soon join the list, these requirements apply only to new computers.

Power management is one of Weihl's top strategies for saving energy -- and money. To that end, Google has already begun activating power management on the existing computers at its headquarters, and Weihl said the company will expand the practice to all its facilities worldwide over the next couple of months.

Looking at the bottom line, both for short-term and long-term energy efficiency, Weihl says there is a fundamentally more important step companies should consider.

"The first thing is to really ask what is the performance per watt of the IT equipment -- not just what's its price and what's its performance, but what's its performance per watt. Ask vendors: 'How efficient is the power supply, how efficient is the motherboard and the energy conversion there, what can you tell me about the performance per watt?'"

Because there aren't standard benchmarks yet for performance per watt, Weihl said it can be difficult to compare products, but even asking the question of vendors is a good start, and can help make a big difference across the industry.
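The comparison Weihl suggests reduces to a simple ratio. A sketch with made-up vendor numbers -- the article notes there is no standard benchmark yet, so both the throughput metric and the figures below are hypothetical:

```python
# Performance per watt as a purchasing metric. All vendor names and
# numbers are invented for illustration; throughput here is requests/sec.

def perf_per_watt(throughput, watts):
    """Units of work delivered per watt of wall power."""
    return throughput / watts

servers = {
    "Vendor A": {"throughput": 1000.0, "watts": 400.0},
    "Vendor B": {"throughput": 1100.0, "watts": 550.0},
}

for name, s in servers.items():
    ratio = perf_per_watt(s["throughput"], s["watts"])
    print(f"{name}: {ratio:.2f} requests/sec per watt")
```

Under these invented numbers the machine with the higher raw throughput loses on performance per watt, which is exactly the distinction Weihl wants buyers to press vendors on.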

"The more that vendors understand that customers care about efficiency, and are willing to pay a little extra [up-front] for it, the more they will innovate, and the cost of that extra efficiency will come down."


Mathew Wheeland is Managing Editor of Greener World Media, publisher of Greener Computing News.