I'm gearing up for the IBM Smarter Buildings Forum today in New York City (disclosure: I am a paid speaker at the event) and I've just spent two days with some of the most brilliant people in the world of building diagnostics, so information and intelligence are on my mind.
I'm sure many of us have seen one of the online flash videos about the exponential explosion of information flow worldwide in the past decade. There is no doubt that the amount of information we absorb every day is orders of magnitude larger than what our forebears were required to process.
Rhetorical question: Does all of this extra information make us smarter? Not-so-rhetorical answer: Not necessarily. As with any kind of simulation modeling exercise, the "garbage in, garbage out" problem exists. By itself, raw information serves to paralyze rather than empower.
Making good decisions in buildings, for example, could require processing gigabytes of information each day -- far more than a normal human being can process effectively.
Building automation/management systems (BAS/BMS) provide real-time data on a wide range of operational and environmental conditions inside modern buildings. Originally created to manage energy, these systems now track indoor environmental conditions -- CO2 levels, humidity, temperature, etc. -- as well as building security. As the number of desired metrics has expanded, so has the quantity of information, not all of it necessarily useful.
Into this breach have stepped a number of energy analytics platforms that reveal patterns of energy intensity by hour and by day. These dashboards can now follow trends in key metrics that allow building managers to detect anomalous behavior in building systems and, in some cases, in individual pieces of equipment. The ability to automatically detect and diagnose faults is a huge breakthrough in moving from building information to building intelligence.
There's no doubt that tomorrow's buildings -- more complicated buildings -- will all have to be significantly smarter than they are today. Because saving 20, 30, even 50 percent ain't gonna cut it anymore. Starting pretty much now, every new building needs to be net-zero energy consuming, if we are going to stand a chance at hitting the 80 percent carbon emissions reductions we need to hit by 2050.
At the same time, these significantly more complex buildings also must be a lot simpler to operate. The reality is that most of the people managing buildings today are too busy and too stretched to synthesize large volumes of information. What these unsung heroes need is a hand in making multiple difficult decisions. This is where smarter buildings come in.
At today's IBM Smarter Buildings Forum, we will be looking at three case studies about how building-level analytics can help building managers make decisions that will reduce energy use -- even in 100-year-old buildings -- by up to 40 percent and dramatically improve building performance in the most challenging situations.
This is a great place to start; it may be what many of our older buildings are capable of saving cost-effectively under our antiquated Ego-nomic system, but it can only be considered a starting point: a place from which we go significantly beyond.
Much like the Communist theoreticians wanted to use capitalist hands to build a worker's paradise, Ego-nomic tools are beginning to set the stage for an eco-nomic future. In one of this week's featured blogs, USGBC's Lane Burt discusses improvements in the tax code affectionately known as Section 179D, which provides tax deductions for buildings that reduce energy use by 50 percent compared to the 2001 energy code.
Eagle-eyed readers will note one of the issues that USGBC, NRDC and the Real Estate Roundtable are trying to correct in this legislation -- that the baseline benchmark is already antiquated, and expected savings are significantly eroded when compared to a building built to the 2010 energy code. Other proposed fixes include dividing the tax break into a prescriptive element, where a portion of the rebate is paid for installing better equipment, and a performance element, where the remainder of the potential benefit is paid after field performance verifies the expected savings.
Anyone confused about the difference between fake geothermal (i.e. ground-source heat pumps) and real geothermal, which takes the earth's heat directly from the ground and employs it for heating water and other productive uses, should check out Adam Aston's piece on the Peppermill resort-casino in Reno, Nevada. Relax. I love ground-source heat pumps, but using the ground as a heat sink is no more "free" energy than using the air. Both require electricity to create the mechanical energy that exploits the temperature differential between the coils and the surrounding heat-transfer media to heat and/or cool.
This week's Look-Grandpa-I-picked-up-the-$20-bill-you-said-was-fake-but-it's-real! Award goes to Canada's National Research Council Institute for Research in Construction for developing a new, more durable form of concrete. By essentially tripling the durability of ordinary concrete and increasing the durability of high-strength concrete by 50 percent, this new material can significantly reduce the lifecycle-embodied energy of building and maintaining basic infrastructure.
Given that well over $1 trillion of water and transportation infrastructure improvements are necessary over the next decade or so, if the field-testing of this new material proves out, it could result in very significant savings -- not only environmental but also economic. Now that's building intelligence.
Image CC licensed by Flickr user Gideon Tsang.