

Data centers and distributed renewables: two peas in a POD

How can the Internet and mobile networks use clean energy efficiently? Performance-optimized data centers are one place to start.

Last month Google hosted the event “How green is the Internet?”, which brought together 100 experts from industry, academia, government and NGOs to explore emerging questions around the environmental impacts and benefits of the information and communications technology (ICT) sector. With the growing electrification and connectedness of our society -- including smartphones, laptop computers, and cellular and Wi-Fi networks -- it’s a timely topic.

It’s commonly reported that 1 to 3 percent of U.S. electricity use (PDF) goes to data centers alone. In fact, the sector’s total is much larger once energy use by access networks -- the fixed and wireless infrastructure that connects users and their devices to data centers -- is added.

No one can say for sure (yet) exactly how much energy is consumed by access networks, but Bruce Nordman from Lawrence Berkeley National Laboratory took a shot at estimating U.S. network energy use in 2008 (PDF) and the Centre for Energy-Efficient Telecommunications recently estimated the energy use of mobile access networks (PDF). Their findings indicate that access networks may use an amount of energy on the same order of magnitude as data centers, and that this consumption is rapidly growing.

The Internet and mobile and broadband networks are here to stay. How can we make sure that the ICT sector uses clean, green energy, and uses it as efficiently as possible?

Data centers going green

Last year, the New York State Energy Research and Development Authority (NYSERDA) partnered with Clarkson University, AMD, HP, GE and others to undertake a project that will demonstrate the viability of renewables-powered data centers. Rocky Mountain Institute (RMI) has recently joined Clarkson researchers on this project, which envisions a network of distributed green data centers (DGDCs) co-located with renewable energy resources. Imagine many small, geographically distributed performance-optimized data centers (PODs) that can operate either with the electricity grid as an interconnected resource or as an off-grid electrical island.

At first, this might appear rather pedestrian; people in remote areas have operated off-grid for decades, right? But a major paradigm shift is at work: The PODs are designed to shift “load” by migrating server workloads to other PODs via fiber-optic connections when local, inherently variable renewables aren’t producing enough power, making much more effective use of those variable resources. The PODs would be completely self-contained, with their only external interfaces being power input, network connections and potentially a connection for cooling and/or waste heat recovery.
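To make the load-shifting idea concrete, here is a minimal sketch of how a scheduler might migrate jobs from a power-starved POD to one with surplus renewable generation. The Pod class, the rebalance function and all the numbers are hypothetical illustrations, not part of the NYSERDA/Clarkson project’s actual control software.

```python
# Hypothetical sketch of POD-to-POD workload migration under variable
# renewable supply. Names and numbers are illustrative assumptions.
from dataclasses import dataclass, field


@dataclass
class Pod:
    name: str
    renewable_kw: float                          # current local renewable output
    jobs: list = field(default_factory=list)     # (job_id, power_kw) tuples

    def load_kw(self) -> float:
        return sum(kw for _, kw in self.jobs)

    def surplus_kw(self) -> float:
        return self.renewable_kw - self.load_kw()


def rebalance(pods: list[Pod]) -> None:
    """Greedily move jobs from power-deficit PODs to the POD with the
    largest renewable surplus, so each site runs within local supply."""
    for pod in pods:
        while pod.surplus_kw() < 0 and pod.jobs:
            target = max((p for p in pods if p is not pod), key=Pod.surplus_kw)
            job = pod.jobs.pop()                 # try to migrate the last job
            if target.surplus_kw() >= job[1]:
                target.jobs.append(job)          # target can absorb it
            else:
                pod.jobs.append(job)             # nowhere to send it; keep running
                break


# Example: a cloudy solar site hands work to a windy site with headroom.
a = Pod("solar-site", renewable_kw=40, jobs=[("j1", 30), ("j2", 30)])
b = Pod("wind-site", renewable_kw=120, jobs=[("j3", 20)])
rebalance([a, b])
print(a.surplus_kw(), b.surplus_kw())            # both PODs now within local supply
```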

The DGDC concept ties into RMI’s energy vision in these key ways:

1. Optimal use of distributed, renewable energy resources: Distributed PODs would allow renewable energy to be used at its source, reducing or eliminating the need for costly transmission infrastructure (and associated line losses and other inefficiencies), ultimately making viable the development of additional (and otherwise uneconomic) renewables. In a nutshell, this is the Small is Profitable logic at work in the ICT sector.

2. Managing the variability of renewable resources: The PODs approach would provide a flexible load that could help manage the variability of certain renewables, such as wind and solar. This ability to manage and stabilize the variability of wind and solar power is a core concept of RMI’s 2050 vision for the U.S. electricity system. The PODs approach gains this flexibility from the ability to transfer computing load from one POD to another as renewable supply at a given site fluctuates, essentially providing firming capacity for these renewables.

3. Increasing demand-side flexibility: If PODs are interconnected with the broader electricity system, their flexibility can also enable the provision of ancillary services to the grid. For example, a POD’s ability to quickly come on- or off-line would allow a power system operator to use the POD as a dispatchable load. This capability -- termed demand response -- can help address grid stability problems by improving the balance between generation and load (a simple sketch of this dispatch logic follows this list). If sited in areas where electric power markets exist, PODs could use their flexibility and controllability to provide (and be compensated for) ancillary services as market dynamics dictate.

4. Enabling microgrids: In a microgrid, controllability is king. To function, any electricity grid -- be it macro or micro -- must balance instantaneous power consumption and generation at all times. The challenge for a microgrid operator is doing this with a more limited set of energy resources than typical power systems, which usually span enormous geographic areas. With this in mind, it’s easy to see the potential for a symbiotic relationship between PODs and microgrids, with PODs providing tremendous value as an ultra-flexible, adaptable load, while also benefiting from the increased reliability provided by the microgrid.

5. Clean energy for the developing world: The distributed POD concept has particular promise for the developing world. Almost 4.7 billion people still lack computers and Internet access -- but that won’t last long, and in regions with unreliable electric grids, diesel generators are often the power source of choice for new access networks. With developing nations already considering clean renewables as an alternative energy source for cell towers, the POD concept appears very promising.
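As a rough illustration of point 3, the sketch below shows a POD acting as a dispatchable load for demand response: when grid frequency sags, the POD sheds flexible computing load. The frequency thresholds, setpoints and function names are assumptions for illustration only, not an actual grid-operator interface.

```python
# Hypothetical demand-response logic for a POD acting as a dispatchable load.
NOMINAL_HZ = 60.0       # U.S. grid nominal frequency
DEADBAND_HZ = 0.05      # ignore small fluctuations


def pod_setpoint_kw(grid_hz: float, full_load_kw: float, min_load_kw: float) -> float:
    """Return the POD's computing-load setpoint for the current grid frequency.

    Under-frequency means the grid is short on generation, so the POD sheds
    computing load (by migrating or pausing jobs); otherwise it runs normally."""
    error = grid_hz - NOMINAL_HZ
    if abs(error) <= DEADBAND_HZ:
        return full_load_kw                       # normal operation
    if error < 0:
        # Shed load proportionally to the frequency sag, down to a floor
        # that keeps critical services running.
        shed_fraction = min(1.0, (-error - DEADBAND_HZ) / 0.5)
        return max(min_load_kw, full_load_kw * (1 - shed_fraction))
    # Over-frequency: run flat out (e.g., schedule deferred batch jobs).
    return full_load_kw


# Example: a 0.2 Hz sag asks the POD to drop part of its flexible load.
print(pod_setpoint_kw(59.8, full_load_kw=500, min_load_kw=100))   # 350.0 kW
```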

At the end of the day, RMI’s primary focus is on reducing fossil fuel energy use, and the obvious low-hanging fruit are the kilowatt-hours wasted by system inefficiencies. The DGDC concept -- as realized via PODs -- can eliminate a significant amount of wasted energy, primarily by reducing transmission and distribution losses, but also through best-practice cooling methods and other efficiency gains. Under a conventional scenario, only 12 percent of the energy in a chunk of coal burned by a coal-fired power plant reaches final computer processing: you need 860 W of coal power to drive 100 W of computing. By comparison, you’d need just 690 W of renewable power to do the same 100 W of computing with PODs.
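A quick back-of-the-envelope check of those figures, using only the 860 W, 690 W and 100 W numbers quoted above (the implied efficiencies and savings are computed, not quoted):

```python
# Source-to-compute efficiency implied by the article's figures.
COMPUTING_W = 100
coal_source_w = 860          # conventional coal-fired grid path
pod_source_w = 690           # renewables-powered POD path

coal_efficiency = COMPUTING_W / coal_source_w    # ~0.116, i.e. roughly 12 percent
pod_efficiency = COMPUTING_W / pod_source_w      # ~0.145
savings = 1 - pod_source_w / coal_source_w       # ~20 percent less source energy

print(f"Grid path: {coal_efficiency:.1%} source-to-compute efficiency")
print(f"POD path:  {pod_efficiency:.1%} source-to-compute efficiency")
print(f"Source energy avoided per unit of computing: {savings:.0%}")
```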

Data centers and the ICT industry consume a lot of energy, and that energy use is rapidly increasing. The industry is heavily reliant on the electricity grid for its power needs because, with the exception of large-scale hydroelectric power, the large geographic footprint of distributed renewables makes it difficult to integrate these sources at the site of conventional, large-scale data centers. The DGDC concept could change all of that. By directly powering PODs with clean, renewable energy, this idea offers a potential solution to the data center energy problem faced by our increasingly digital society. While there are many issues to work out in the pilot projects to come, we are excited by the potential of the DGDC concept and by the many ways it may complement and amplify our existing efforts.

James Sherwood, an analyst at Rocky Mountain Institute, and Pier Marzocca, associate professor at Clarkson University, contributed to this post.

Solar panel image by jarnoslava V via Shutterstock
