What Intel and VMware know about your data center

Practical Magic

[Photo: An engineer amid active server racks in a data center. Shutterstock/Gorodenkoff]
Annual global IP traffic will reach 3.3 zettabytes by 2021, and data centers are critical to routing internet traffic.

This article was adapted from the newsletter VERGE Weekly, running Wednesdays.

There are data centers, and then there is the data center industry.

The latter has become epitomized by the big cloud services companies such as Amazon, Google, IBM and Microsoft, as well as global colocation and hosting companies such as Equinix and Digital Realty. Many of those companies, and their frenemies, are investing deeply in energy efficiency measures and renewable energy. Wisely so.

But plenty of companies still manage their data center activities — email servers, customer service databases and other applications — within their own facilities, or "on premises," to use the term preferred by those in the IT world.

Those installations aren’t shrinking: Two surveys from 2018 point to a heightened pace of computer server purchases that are being driven by corporate investments in artificial intelligence, blockchain and other emerging technologies. The financial services industry alone, for example, could spend up to $14 billion on big data technologies between 2018 and 2021.

Newsflash: The people running onsite data centers are just as concerned with how much power their IT infrastructure consumes — and where it comes from — as those in the business of selling data center services. That’s because power just happens to be the largest operational cost associated with using these technologies. And that’s why at least two prominent companies — virtualization software company VMware and tech architecture behemoth Intel — are putting considerable energy (if you will) into developing better ways for companies to keep tabs on that sort of information and to manage it accordingly.

This week during its annual research and development conference, VMware introduced a service called the Carbon Avoidance Meter (CAM) — developed in collaboration with the company’s sustainability team — that uses telemetry and other data generated by its software to calculate "carbon scores" related to data center operations in real time. CAM will be sold by the company's professional services organization to VMware customers.

The idea is to help companies not just better understand energy consumption patterns but also to get a better grip on what sort of energy is running the servers where most of their processing loads are being managed — be it generated by coal, nuclear or wind power plants. Using that information, companies might opt to move those loads from "dirty" locations to places where the electricity comes from more sustainable sources. "We will be able to give much more visibility to the folks who are in charge of this," said Mornay van der Walt, vice president of research and development for VMware.
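The placement logic described above can be sketched in a few lines. This is a hypothetical illustration, not VMware's actual CAM implementation; the region names and carbon-intensity figures are invented for the example.

```python
# Illustrative sketch of carbon-aware workload placement: given a grid
# carbon-intensity figure for each candidate location, pick the cleanest
# one for a movable processing load. All names and numbers are made up.

CARBON_INTENSITY = {        # grams of CO2 per kWh (hypothetical values)
    "region-coal": 900,
    "region-mixed": 450,
    "region-wind": 120,
}

def greenest_region(intensities):
    """Return the location whose electricity has the lowest carbon intensity."""
    return min(intensities, key=intensities.get)

print(greenest_region(CARBON_INTENSITY))  # -> region-wind
```

In practice the intensity numbers would come from a live signal such as WattTime's data rather than a static table, and a real scheduler would also weigh latency, cost and capacity before moving a load.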

The new offering was cooked up by VMware’s engineers, but the original catalyst came from the sustainability team, managed by vice president of sustainability strategy Nicola Acutt, who happens to report to the software company’s chief technology officer. Among other things, CAM will use data gathered by WattTime, which uses sensors and artificial intelligence to help companies understand the generation sources behind the electricity they actually use at a given physical location. WattTime just last week snagged a grant from Google.org that will go toward funding a satellite network capable of tracking power plant emissions from space.

Technically speaking, the product should be in customers’ hands by the end of 2019. And, yes, this is a revenue-generating service, even though VMware wasn’t ready to discuss pricing. "CAM is a proof point of what happens when you set a challenge to an engineering community," Acutt said.

VMware isn’t the only company putting considerable resources behind embedding sustainability metrics and concerns into dashboards originally designed to handle more traditional operational and efficiency measures.

Intel’s data center software group, led by general manager Jeff Klaus, is sharply focused on helping companies better manage their data center environments by collecting data about minimum and maximum power usage, heat levels, server use and so on. The effort is important because Intel’s technology can gather information across diverse computer hardware platforms — most data centers have at least three brands represented in the racks housed within their facilities, Klaus reminded me when we spoke earlier this spring. "All the [original equipment manufacturers] have their own language and method of reporting that information," he said. (Some of them, by the way, license what Intel develops or at least parts of it.)
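The normalization problem Klaus describes — every OEM reporting power data in its own "language" — can be pictured as a thin translation layer. The vendors, field names and units below are invented for illustration; they are not Intel's actual schema.

```python
# Hypothetical sketch: map vendor-specific power telemetry into one common
# schema so mixed-hardware racks can be monitored together. The vendor
# names and raw field formats here are made up.

def normalize(vendor, raw):
    """Translate a vendor-specific power reading into {'watts': float}."""
    if vendor == "vendor_a":               # reports watts directly
        return {"watts": float(raw["power_w"])}
    if vendor == "vendor_b":               # reports milliwatts
        return {"watts": raw["pwr_mw"] / 1000.0}
    if vendor == "vendor_c":               # reports kilowatts as a string
        return {"watts": float(raw["kw"]) * 1000.0}
    raise ValueError(f"unknown vendor: {vendor}")

readings = [
    ("vendor_a", {"power_w": 310}),
    ("vendor_b", {"pwr_mw": 295000}),
    ("vendor_c", {"kw": "0.5"}),
]
print([normalize(v, r)["watts"] for v, r in readings])  # [310.0, 295.0, 500.0]
```

A real management layer would handle many more metrics (heat, utilization, min/max power) and far messier vendor formats, but the shape of the problem is the same: one schema in the dashboard, many dialects in the racks.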

Artificial intelligence will be increasingly central to automating energy usage, based on thresholds the company defines. Specific servers could be powered down or switched on, dynamically or on a schedule, Klaus suggested. Google has been testing these principles (although not necessarily with Intel’s technology) in its own data centers. Increasingly, more traditional businesses will have that option, too.
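The threshold-driven approach Klaus suggests might look something like the sketch below. It is not Intel's (or Google's) product logic; the utilization thresholds and the minimum-capacity floor are hypothetical.

```python
# Sketch of threshold-based server power management: add capacity when the
# fleet runs hot, shed idle capacity when it runs cold, never drop below a
# safety floor. All thresholds here are invented for illustration.

MIN_ACTIVE = 2                       # never power down below this many servers
UTIL_LOW, UTIL_HIGH = 0.25, 0.80     # hypothetical utilization thresholds

def plan_capacity(active_servers, avg_utilization):
    """Suggest how many servers should be powered on next interval."""
    if avg_utilization > UTIL_HIGH:
        return active_servers + 1                        # switch one on
    if avg_utilization < UTIL_LOW and active_servers > MIN_ACTIVE:
        return active_servers - 1                        # power one down
    return active_servers                                # hold steady

print(plan_capacity(4, 0.15))  # -> 3 (underutilized, shed a server)
print(plan_capacity(4, 0.90))  # -> 5 (running hot, add a server)
```

The "AI" part enters when the thresholds themselves are learned from load forecasts rather than fixed by hand, which is the direction the article says this automation is heading.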