The power-efficiency gains we have come to expect from microchips under Moore's Law are slowing, leading to growing power demand from computing technology. At the same time, we are gaining energy efficiency from smart technology in buildings, cities, and elsewhere.

That smart tech generates massive amounts of data, which in turn demands huge amounts of power. We can, and must, build better data infrastructure if we are to maintain the balance between energy and data.

In April 1965, Gordon Moore, microchip pioneer and co-founder of Intel, published his famous observation that the number of transistors in a dense integrated circuit doubles approximately every two years. The rate of advancement has fluctuated over the years, but even at its fastest, roughly every 18 months, it stayed within range of Moore's calculations. Now the chip industry admits that its guiding principle is set for a downward trend. The decline is not equal for all chip-makers, however.

“Moore’s Law is only for the rich,” says Linley Gwennap, principal analyst at The Linley Group. “Big chip makers like Intel can afford the multibillion-dollar investments in chip factories to stay on the edge in manufacturing, which yields faster, cheaper, and smaller chips with each generation of chip-making equipment. Costs are rising rapidly, and only those who can charge a lot of money for chips will move to 14-nanometer or 10-nanometer manufacturing from the current 28-nanometer mainstream technology today.”

Intel remains the dominant force in the chip industry, serving approximately 80% of the $500 billion market. Other leading chip-makers, such as ARM, AMD, and IBM, simply cannot compete on cost, given Intel's vastly greater economies of scale. Markets evolve, however, and trends such as the rise of energy efficiency change priorities. Recently, the chasing pack of chip-makers has turned to power-efficient server chip designs in an attempt to snatch a share of a growing market: not in power-hungry computing, but in the billions of small devices that make up the Internet of Things (IoT).

The IoT is taking data to new levels; many single applications are soon expected to generate as much data as the entire worldwide web does today. One billion drop cams, for example, would generate approximately 500 exabytes (10^18 bytes) of data per month, according to analysis by ARM. That's more than double all data transmitted on the entire internet during a typical month: an average of 137 exabytes per month of fixed internet traffic and 29 exabytes of mobile internet traffic in 2019, according to estimates from Cisco.
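A quick back-of-the-envelope check shows how these figures compare. The numbers below are the ones quoted above (ARM's drop-cam estimate and Cisco's 2019 traffic estimates); the script only does the comparison arithmetic:

```python
# Comparing ARM's drop-cam data estimate with Cisco's 2019 internet traffic figures.
DROPCAM_EB_PER_MONTH = 500   # one billion drop cams (ARM estimate)
FIXED_EB_PER_MONTH = 137     # fixed internet traffic (Cisco, 2019)
MOBILE_EB_PER_MONTH = 29     # mobile internet traffic (Cisco, 2019)

total_internet = FIXED_EB_PER_MONTH + MOBILE_EB_PER_MONTH  # 166 EB/month
ratio = DROPCAM_EB_PER_MONTH / total_internet

print(f"Total internet traffic: {total_internet} EB/month")
print(f"Drop cams vs. total internet: {ratio:.1f}x")
```

On these figures, a single IoT application would carry roughly three times the monthly traffic of the entire 2019 internet.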

In the future we often talk about, when most buildings and cities will be smart, the world will have an unfathomable amount of data to deal with. This will create a variety of challenges, not least increasing energy consumption and its knock-on impact on climate change. At the heart of this debate are our growing and multiplying data centers: huge facilities filled with towers of servers, cooled 24 hours a day, 365 days a year.

“This is by far the biggest lever in the industry where the carbon cost to produce servers, storage, and networking equipment and the carbon impact to power it all is largely wasted. Common server utilization rates average between 10 and 20% across the industry,” writes James Hamilton, Amazon’s Data Infrastructure Guru, as he sails around the world, blogging and running data centers from his boat.

“Turned around, that’s 80% to 90% wastage and there is no topic more important to address around data center environmental impact than utilization. The industry could easily deliver improvements in the 2x to 4x range, and this is where I focus a large part of my day job,” adds Hamilton. “The greenest power is that which is not consumed.”
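Hamilton's utilization arithmetic is simple but worth making explicit. This sketch uses the figures from the quote (10-20% utilization, 2x-4x potential improvement); the numbers are illustrative, not a measurement:

```python
# Server utilization vs. wastage, using the figures quoted above.
def wastage(utilization):
    """Fraction of provisioned capacity (and its embodied carbon) left idle."""
    return 1.0 - utilization

for u in (0.10, 0.20):
    print(f"{u:.0%} utilization -> {wastage(u):.0%} of capacity wasted")

# A 2x improvement on a 20% baseline, or a 4x improvement on a 10% baseline,
# both land at 40% utilization -- still well short of full use.
improved = 0.10 * 4
print(f"After a 4x improvement: {improved:.0%} utilization")
```

Even the "easy" 2x-4x gains Hamilton describes leave most capacity idle, which is why he calls utilization the biggest lever in the industry.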

The IoT is flooding buildings and cities with sensors and other devices to increase energy efficiency. The resulting torrent of data requires power-hungry data infrastructure, such as data centers and connectivity. Data centers can themselves use Building Internet of Things (BIoT) technology to improve the efficiency of their facilities, however. So where does the balance land between a declining Moore's Law, growing data rates, and the other strategies being developed?

“Encouragingly, typical-use efficiency seems to be going strong, based on tests performed since 2008 on AMD’s chip line. Through 2020, by our calculations for an AMD initiative, typical-use efficiency will double every 1.5 years or so, putting it back to the same rate seen during the heyday of Moore’s Law,” says Hamilton in a blog on power and water. “These gains come from aggressive improvements to circuit design, component integration, and software, as well as power-management schemes that put unused circuits into low-power states whenever possible.”
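The quoted trend, doubling every 1.5 years or so, compounds quickly. A minimal model, assuming the rate holds constant over the 2008-2020 span the quote refers to:

```python
# Illustrative model of typical-use efficiency doubling every ~1.5 years.
DOUBLING_PERIOD_YEARS = 1.5

def efficiency_gain(years):
    """Multiplier on typical-use efficiency after `years` at the quoted rate."""
    return 2 ** (years / DOUBLING_PERIOD_YEARS)

gain = efficiency_gain(2020 - 2008)  # 12 years -> 8 doublings
print(f"Projected typical-use efficiency gain over 12 years: {gain:.0f}x")
```

Eight doublings over twelve years is a 256-fold improvement, which is why this trend matters so much against rising data volumes.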

Efficiency is the power station you don't need to build, even if achieving that efficiency demands power of its own. The IoT goes further than energy efficiency alone, and as data rates spiral out of control across a wide range of data-rich applications, our data infrastructure will need to be at the cutting edge of data-driven energy efficiency. Renewable energy is also fundamental to the solution, but in our smart future we will depend on the energy efficiency of data.