A Blog by Jonathan Low


Aug 22, 2019

The Growth In AI Usage Comes With Exponential Increases In Demand For Energy

There is a cost to every benefit. The question is whether anyone is really looking at these costs - and whether they are being calculated accurately. JL

Karishma Vanjani reports in MarketWatch:

A neural network can emit 17 times more carbon dioxide than an average American does in a year, and five times the lifetime emissions of a car. The energy efficiency of the average data center has deteriorated for the first time in a decade, underscoring the effect that AI usage is having on the environment. Neural networks require custom-built hardware for training. GPUs and TPUs are accelerators that deliver an improvement in the amount of time it takes to train a neural network. Though newer processors can achieve similar results in a shorter time, they do so by consuming more power than legacy hardware.
Advances in technology can allow you to order food by voice or unlock your phone with your face, but those new capabilities could take a toll on the environment.
Enhanced tech capabilities are being developed through the use of artificial-intelligence approaches like neural networks, which detect patterns in speech and images by training programs across countless data points. That process constantly crunches reams of information on power-hungry servers in data centers that use a substantial amount of energy to power, cool and monitor the servers.
The result: Training a neural network can emit 17 times more carbon dioxide than an average American does in a year, and five times the lifetime emissions of an average car.
Those are the findings of a recent paper by researchers at the University of Massachusetts, Amherst, which highlighted the substantial power consumed by AI technologies. An annual survey of power use in data centers showed that the energy efficiency of the average data center deteriorated this year for the first time in more than a decade of measurement, underscoring the effect that increasing AI usage is having on the environment.
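The kind of estimate behind these figures can be sketched as hardware power draw, scaled up by the data center's overhead, multiplied by the grid's carbon intensity. The function name and all the numbers below are illustrative assumptions, not figures from the UMass paper:

```python
def training_emissions_kg(avg_power_kw: float, hours: float,
                          pue: float, kg_co2_per_kwh: float) -> float:
    """Rough CO2 estimate for a training run: hardware energy use,
    scaled by the facility's PUE overhead, times grid carbon intensity."""
    return avg_power_kw * hours * pue * kg_co2_per_kwh

# Illustrative only: 10 kW of accelerators running for 30 days in a
# PUE-1.67 facility, on a grid emitting 0.45 kg CO2 per kWh.
print(round(training_emissions_kg(10, 30 * 24, 1.67, 0.45)))  # ~5411 kg
```

The point of the sketch is that every term multiplies: longer training runs, hungrier hardware, less efficient facilities, and dirtier grids all compound one another.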


While common models could be trained or developed on a server or laptop, neural networks now require custom-built hardware for training. Graphics processing units, or GPUs, and Tensor processing units, or TPUs, are common accelerators that deliver an improvement in the amount of time it takes to train a neural network to complete a desired task.
Though the newer processors can achieve similar results in a shorter time frame, they do so by consuming more power than legacy hardware. Researchers at Alphabet Inc.’s Google have highlighted that central processing units, or CPUs, are more “energy proportional” than GPUs and TPUs because they use less wattage to do the same amount of work.
The emission rates calculated by the UMass researchers are based on workloads performed in on-premises data centers, but most developers don’t have their own AI-powered servers sitting around. Instead, they use cloud data centers owned by some of the largest tech companies in the U.S., like cloud-computing stalwarts Amazon.com Inc., Microsoft Corp. and Google. Some large tech companies, such as Facebook Inc. and Apple Inc., run their own data centers to host popular services, viewing in-house data centers as a way to protect their competitive positioning and gain more control over product development.


Forrester analyst Chris Gardner said companies tend to prioritize technological development over energy awareness, just as consumers don’t often think of the environmental impact of their favorite apps.
“Average enterprises building data centers are racking up servers,” he said. “They look at it from the perspective of the algorithms they are going to run. They don’t really care much about the hardware used.”
A Microsoft spokesperson said that the company provides cloud services that are 93% more energy efficient and 98% more carbon efficient than traditional data centers because of renewable energy use. Apple uses 100% renewable energy for all of its data centers and Facebook supports all of its new data centers with 100% renewable sources, according to sustainability reports put out by the companies.


Amazon is currently supplying 50% of energy needs for its global infrastructure from renewable sources, with “a long-term commitment” to get that to 100%, according to a spokeswoman. That was not enough for an employee group that sought a full climate-change resolution from Amazon investors at this year’s shareholder meeting.
“Sustainability initiatives are now quite important to investors,” Jennifer Cooke, a research director at IDC, said in an email. “I believe that more organizations will turn to data-center partners (larger multi-tenant data centers, service providers) to help them achieve their sustainability goals.”


Today, there are more than 7,000 mega data centers worldwide, IDC reports, and the energy efficiency of these centers is gauged by a metric known as power usage effectiveness, or PUE: the ratio of a facility's total energy use to the energy delivered to its computing equipment, where a value closer to 1.0 is better.
IDC considers a PUE score of 1.2 to be “very efficient,” and Microsoft says on its website that all new data centers have a 1.125 PUE, meaning they are even more efficient. But the average data center still has room for improvement, notching a PUE of 1.67 this year, according to an Uptime Institute survey, up a tenth from a year earlier. That outcome marks the first time PUE values have deteriorated since at least 2007.
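The PUE arithmetic is simple enough to show directly. The facility loads below are made-up figures chosen to reproduce the survey's averages, not data from any real data center:

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power usage effectiveness: total facility power divided by the
    power reaching IT equipment. 1.0 is the theoretical ideal."""
    if it_equipment_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_equipment_kw

# A facility drawing 1,670 kW overall to power 1,000 kW of servers
# matches the survey's average PUE of 1.67.
print(round(pue(1670, 1000), 2))   # 1.67
# Microsoft's stated 1.125 PUE implies just 125 kW of cooling and
# other overhead per 1,000 kW of IT load.
print(round(pue(1125, 1000), 3))   # 1.125
```

Read this way, the one-tenth rise in average PUE means facilities are now spending roughly two-thirds of a watt on overhead for every watt of computing.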
In general, cash-rich tech giants have found incentive to work toward carbon neutrality, but smaller enterprises are lagging. They tend to be more concerned about expediting the development cycle of projects, regardless of the environmental cost.
According to the Global e-Sustainability Initiative, a Brussels-based communications industry group, the carbon footprint of the information and communications technology sector is projected to be 2.3% of global emissions by 2020, though sectors like transportation account for considerably more.
The public cloud services market is expected to grow to $214.3 billion this year, according to Gartner, up from $182.4 billion in 2018. This could open up more opportunities for smaller enterprises to leverage the cloud, which may lessen the need for small data centers that aren’t as energy efficient.
