Ever since I published an essay exploring the relationship between climate change and the Internet, I have endeavored to bring this subject to the fore as often as possible, and in relevant fora and discussions, since the responsibility of creating a more sustainable world falls on all communities and stakeholder groups. The subject is particularly pressing now, when strengthening international interest in curbing climate change stands in contrast to the receding commitments of the United States government on climate and the environment under the Trump administration, as reflected in the administration's first official budget proposal.
I have highlighted this topic in several venues. At the global Internet Governance Forum (IGF), held in Guadalajara, Mexico, in December 2016, I advocated for more environmentally friendly practices, such as reducing energy use and transitioning to renewable energy sources like solar and wind. The Dynamic Coalition on the Internet and Climate Change (DCICC), which was a focus of the aforementioned essay, submitted its annual report in the lead-up to the IGF and was represented at the Dynamic Coalition (DC) main session, where we updated the IGF community on our work and the progress made in 2016. I also facilitated two breakout sessions at the Internet Society (ISOC)-sponsored Collaborative Leadership Exchange (CLX): one discussing the Sustainable Development Goals (SDGs), and another focused solely on the Internet, information and communications technologies (ICTs), and the environment. The work has only just begun, however, and is continuing in earnest. For instance, I was appointed the focal point for a European Dialogue on Internet Governance (EuroDIG) workshop examining digital pollution and its effects on the environment (such as electronic waste (e-waste) and energy consumption), and I am co-organizing the DCICC annual session at the 2017 WSIS Forum.
So far, most of the feedback I have received from individuals across the Internet governance community about raising this issue has been positive. I greatly appreciate the support that has been shown, and the relevance of maintaining this discussion was further reinforced by a World Health Organization (WHO) publication that was released earlier this month (March) regarding technology, e-waste, and the environment:
“The WHO also noted [in their Inheriting a Sustainable World: Atlas on Children’s Health and the Environment report [PDF]] the importance of properly managing emerging environmental hazards like electronic and electrical waste. Without proper recycling, this can lead to children being exposed to dangerous toxins known to harm intellectual development and cause attention deficits, as well as more serious conditions like lung disease and cancer.”
With the proliferation of the Internet of Things (IoT), the dangers raised in the WHO’s report are even more pressing. Yet e-waste is only one part of the problem. As more and more people come online, more devices will come online as well, further increasing the power consumed by the Internet and ICTs. This point was raised explicitly in a personal email exchange between Vint Cerf, one of the “fathers of the Internet” who co-invented TCP/IP, and me. We were discussing Google’s transition to fully renewable energy for its data centers, and he posed two questions. After Vint gave me his consent to share the information from our exchange, I decided to publish it here as a follow-up to my October 2016 essay. What follows are his questions (quoted and numbered below) and my answers. For full disclosure, note that I often refer to Google as a case study because (1) Vint is vice president and chief Internet evangelist at Google, (2) his inquiry regarding Google’s data center efficiency is specifically what prompted the discussion, and (3) Google has been committed for years to reducing its carbon footprint and to sharing that insight with other stakeholders, specifically in the private sector and technical community.
1. “Do you know whether the aggregate power requirements for the data centers exceed the power requirements for all the laptops, desktops, mobiles, tablets, home routers and Wi-Fi units, etc.?”
I do not have this information, but I imagine the total is substantial when multiplied across the billions of devices in use. I found two articles that list the wattage of various electronics (one from Daft Logic, the other from the American Council for an Energy-Efficient Economy). I am not sure, though, whether those numbers reflect the realities (and policy environments) of electronics outside the United States.
2. “What fraction of the power consumption does the Internet (and its access devices) take?”
I wish there were an easy number to cite, but unfortunately the figures are constantly in flux. They depend on myriad factors taken into account during analysis, as well as on the number of devices and on various optimizations to infrastructure like data centers (e.g., using renewable/green energy, using artificial intelligence (AI) to help increase efficiency, etc.). The estimates also often do not account for global numbers (as doing so would likely be much more difficult). Having said that, I found many sources that help shed light on this question (while also shedding light on the first question he posed above):
To begin, the 2008 Global e-Sustainability Initiative (GeSI) SMART2020 report, which examined how to enable the low-carbon economy in the information age, indicated: “ICTs currently contribute 2 percent to 3 percent of global greenhouse gas (GHG) emissions.” To put this into perspective, even based on 2008 numbers, “If the Internet were a country, it would rank as the fifth-largest for energy consumption.” Note, however, that the 2015 GeSI SMARTer2030 report stressed, “ICT emissions as a percentage of global emissions will decrease over time,” and GeSI revised the percentage of total global carbon emissions predicted in its 2008 report “due to a range of investments companies in the sector have been making to reduce their emissions and to the expected improvements in the efficiency of ICT devices ... [Therefore,] the ICT sector’s emissions ‘footprint’ is expected to decrease to 1.97 percent of global emissions by 2030, compared to 2.3 percent in 2020.”
Bear in mind as well that the numbers describing the Internet’s environmental impact are constantly changing. For instance, as reported in The Verge, Google “used some 4,402,836 megawatt-hours (MWh) of electricity in 2014 (equivalent to the amount of energy consumed by 366,903 U.S. households),” but that figure is increasingly offset by the renewable energy and other innovations powering its infrastructure. Furthermore, according to CCCB Lab:
“The first thing that emerges after surveying various sources is that nobody knows for sure. In 2010, The Guardian came up with the figure of 300 million tons of [carbon dioxide (CO2)] per year, ‘as much as all the coal, oil and gas burned in Turkey or Poland in one year.’ A controversial article titled “Power, Pollution, and the Internet” in The New York Times put the figure at 30 billion watts of electricity in 2011, ‘roughly equivalent to the output of 30 nuclear power plants.’ And according to Gartner consultants, the Internet was responsible for 2 percent of global emissions in 2007, outstripping the carbon footprint of the aviation industry. A more recent study by the Melbourne, Australia-based Centre for Energy-Efficient Telecommunications (CEET) estimated in 2013 that the telecommunications industry as a whole emits 830 million tons of carbon dioxide a year—[accounting for 1.5 percent to 2 percent of the world’s energy consumption]—and that the energy demands of the internet could double by 2020. Jon Koomey—[a research fellow at Stanford University’s Steyer-Taylor Center for Energy Policy who has been studying Internet energy effects since 2000 and identified a long-term trend in energy-efficiency of computing that has come to be known as Koomey’s Law]—estimates that the direct electricity use of all the elements that make up the Internet is probably around 10 percent of total electricity consumption, but he emphasizes that it is very difficult to calculate exact figures: ‘You can use a computer to play video games or write a text and not be online, and this energy use is often counted as part of the Internet even though it isn’t actually the case.’”
Additionally, a 2015 article published in The Atlantic reported the following:
“According to the U.S. Energy Information Administration, in 2012 global electricity consumption was 19,710 billion kilowatt-hours (kWh). Using Google’s estimate [of its data center’s energy use] and electricity-consumption data from the CIA World Factbook, they’re using about as much electricity annually as the entire country of Turkey. (Honestly, that number seems impossibly high considering that in 2011 Google disclosed that it used merely 260 million watts of power, at the time noted for being slightly more than the entire electricity consumption of Salt Lake City.) In its 2013 sustainability report, Facebook stated its data centers used 986 million kWh of electricity—around the same amount consumed by Burkina Faso in 2012 ... The impact of data centers—really, of computation in general—isn’t something that really galvanizes the public, partly because that impact typically happens at a remove from everyday life. The average amount of power to charge a phone or a laptop is negligible, but the amount of power required to stream a video or use an app on either device invokes services from data centers distributed across the globe, each of which uses energy to perform various processes that travel through the network to the device. One study ... estimated that a smartphone streaming an hour of video on a weekly basis uses more power annually than a new refrigerator” [emphasis mine].
Another perspective to consider is how growth affects the numbers. For example, after interviewing Dr. Mike Hazas, one of the researchers from Lancaster University’s School of Computing and Communications involved in a study warning that “the rapid growth of remote digital sensors and devices connected to the Internet [and the IoT] has the potential to bring unprecedented and, in principle, almost unlimited rises in energy consumed by smart technologies,” the article’s author shared the following data:
“The increase in data use has brought with it an associated rise in energy use, despite improvements in energy efficiencies. Current estimates suggest the Internet accounts for 5 percent of global electricity use but is growing faster, at 7 percent a year, than total global energy consumption at 3 percent. Some predictions claim information technologies could account for as much as 20 percent of total energy use by 2030.”
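To make the compounding behind those figures concrete, here is a minimal back-of-the-envelope sketch (in Python) using only the numbers quoted above: a 5 percent share of global electricity today, Internet energy use growing at 7 percent per year, and total consumption growing at 3 percent per year. The base year of 2017 and the projection horizon are my own illustrative assumptions; this is not the model behind the 20 percent prediction, which rests on different inputs.

```python
# Back-of-the-envelope projection of the Internet's share of global
# electricity use, using only the growth rates quoted above.
# Illustrative assumptions: 5% share in 2017, Internet energy use growing
# 7%/year, total electricity consumption growing 3%/year.

INTERNET_SHARE_2017 = 0.05   # fraction of global electricity use
INTERNET_GROWTH = 0.07       # annual growth of Internet energy use
TOTAL_GROWTH = 0.03          # annual growth of total electricity use

def internet_share(year, base_year=2017, base_share=INTERNET_SHARE_2017):
    """Project the Internet's share of global electricity use in a given year."""
    years = year - base_year
    # The share grows each year by the ratio of the two growth factors.
    return base_share * ((1 + INTERNET_GROWTH) / (1 + TOTAL_GROWTH)) ** years

if __name__ == "__main__":
    for year in (2017, 2020, 2025, 2030):
        print(f"{year}: ~{internet_share(year):.1%} of global electricity use")
```

Under these inputs the share reaches only about 8 percent by 2030, which shows that forecasts approaching 20 percent must assume considerably faster demand growth (or slower efficiency gains) than the headline rates quoted here.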
Conversely, in 2013, The Register reported: “The information and technology ecosystem now represents around 10 percent of the world’s electricity generation.” It based this figure on an August 2013 report written by Digital Power Group (DPG) CEO Mark P. Mills titled The Cloud Begins With Coal: Big Data, Big Networks, Big Infrastructure, and Big Power (disclosure: it was sponsored by the American Coal Association, a pro-coal lobbying group). He wrote:
“Based on a mid-range estimate, the world’s [ICT] ecosystem uses about 1,500 terawatt-hours (TWh) of electricity annually, equal to all the electric generation of Japan and Germany combined—as much electricity as was used for global illumination in 1985. The ICT ecosystem now approaches 10 percent of world electricity generation. Or in other energy terms—the zettabyte era already uses about 50 percent more energy than global aviation ... Hourly Internet traffic will soon exceed the annual traffic of the year 2000. And demand for data and bandwidth and the associated infrastructure are growing rapidly not just to enable new consumer products and video, but also to drive revolutions in everything from healthcare to cars, and from factories to farms. Historically, demand for bits has grown faster than the energy efficiency of using them. In order for worldwide ICT electric demand to merely double in a decade, unprecedented improvements in efficiency will be needed now” [emphasis theirs].
The Register’s report also emphasized the following about the power consumption of personal devices: “Reduced to personal terms, although charging up a single tablet or smartphone requires a negligible amount of electricity, using either to watch an hour of video weekly consumes annually more electricity in the remote networks than two new refrigerators use in a year. And as the world continues to electrify, migrating towards one refrigerator per household, it also evolves towards several smartphones and equivalent per person” [emphasis theirs]. (A methodology note from The Register: “This example used publicly available data on the average power utilization of a telecom network, the cost of wireless network infrastructure, and the energy that goes into making a tablet, although it ignored the data centers the video is served out of, and tablet charging” (in other words, though Google has stated that a Google search costs 0.0003 kilowatt-hours (kWh) of energy, the likely cost is higher due to the power cost lurking in the non-Google systems used to deliver the data and perform the search). “[Furthermore,] the report’s figure reflects not just the cost of data centers—according to a 2007 report by the Environmental Protection Agency (EPA), U.S. data centers consumed 1.5 percent of U.S. electricity production, and was projected to rise to 3 percent by 2011—but also the power involved in fabbing chips and the power consumption of digital devices and the networks they hang off.”)
It is important to highlight, however, that regarding the claim that the direct electricity use of the Internet is probably around 10 percent of total electricity consumption, Koomey said the same thing during his keynote address at Google’s How Green is the Internet Summit in June 2013, but he immediately added that “the number does not tell us very much” (source). His words were further reinforced by the slides he presented at the event. On slide 7, he shared a graph based on a 2013 study using data collected for Sweden circa 2010, showing annual electricity use (GWh/year) across various categories of equipment. User PCs accounted for approximately 1,800 GWh/year, compared with the second-largest category, data centers and third-party local-area networks (LANs), at close to 1,300 GWh/year. Other user equipment accounted for around 700 GWh/year, while the lowest-ranked category, the Internet Protocol (IP) core network, was responsible for around 250 GWh/year. Whether this pattern has held since 2010, though, is unclear. (The Google event itself was bolstered by a blog post written that same month, which corresponded with the release of a report by the Lawrence Berkeley National Laboratory (Berkeley Lab) titled The Energy Efficiency Potential of Cloud-based Software: A U.S. Case Study. It showed that “migrating all U.S. office workers to the cloud could save up to 87 percent of information technology (IT) energy use—about 23 billion kilowatt-hours (kWh) of electricity annually, or enough to power the city of Los Angeles for a year.” Berkeley Lab also made its model publicly available “so other researchers and experts can plug in their own assumptions and help refine and improve the results.” Bear in mind that, ultimately, the goal in this case was not to emphasize the effects of personal electronics, but the energy efficiency and management of larger technical infrastructure overall.)
There is also information available from a 2013 Time article that directly addresses some of the specifics regarding Vint’s second question and criticizes Mills’ study:
“It’s important to note that the amount of energy used by any smartphone will vary widely depending on how much wireless data the device is using, as well as the amount of power consumed in making those wireless connections—estimates for which vary. The above examples assume a relatively heavy use of 1.58 GB a month—a figure taken from a survey of Verizon iPhone users last year. That accounts for the high-end estimate of the total power the phone would be consuming over the course of a year. NPD Connected Intelligence, by contrast, estimates that the average smartphone is using about 1 gigabyte (GB) of cellular data a month, and in the same survey that reported high data use from Verizon iPhone users, T-Mobile iPhone users reported just 0.19 GB of data use a month—though that’s much lower than any other service. Beyond the amount of wireless data being streamed, total energy consumption also depends on estimates of how much energy is consumed per GB of data. The top example assumes that every GB burns through 19 kilowatts (kW) of electricity. That would be close to a worst-case model. The CEET assumes a much lower estimate of 2 kWh per GB of wireless data, which would lead to a much lower electricity consumption estimate as well—as little as 4.6 kWh a year with the low T-Mobile data use. In the original version of the post, I should have noted that there is a significant range in estimates of power use by wireless networks, and that this study goes with the very high end.”
A note on the smartphone energy calculations: these come from an email by Max Luke, a policy associate at the Breakthrough Institute, which posted about Mills’ study. He wrote:
“Last year [in 2012], the average iPhone customer used 1.58 GB of data a month, which times 12 is 19 GB per year. The most recent data put out by ATKearney for the mobile industry association GSMA (p. 69) says that each GB requires 19 kW. That means the average iPhone uses (19kW X 19 GB) 361 kWh of electricity per year. In addition, ATKearney calculates each connection at 23.4 kWh. That brings the total to 384.4 kWh. The electricity used annually to charge the iPhone is 3.5 kWh, raising the total to 388 kWh per year. The EPA’s Energy Star shows refrigerators with efficiency as low as 322 kWh annually.”
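To make that arithmetic easy to trace, and to stress-test it, here is a minimal sketch that reproduces the Breakthrough Institute calculation alongside the low-end assumptions mentioned in the Time excerpt above (2 kWh per GB and the 0.19 GB per month reported for T-Mobile users). The “19 kW” in the quoted email is treated here as 19 kWh per GB, and, following the original calculation, the per-connection and charging overheads are applied only to the high-end case.

```python
# Reproduction of the smartphone-energy arithmetic quoted above.
# High end (Breakthrough Institute / Mills): 1.58 GB/month at 19 kWh per GB
# (written as "19 kW" in the quoted email), plus a fixed per-connection
# overhead and the handset's own charging energy.
# Low end (CEET / T-Mobile figures from the Time excerpt): 0.19 GB/month
# at 2 kWh per GB, with no overheads added, as in the article.

def annual_energy_kwh(gb_per_month, kwh_per_gb,
                      connection_kwh=0.0, charging_kwh=0.0):
    """Annual electricity attributed to one smartphone, in kWh."""
    annual_gb = gb_per_month * 12
    return annual_gb * kwh_per_gb + connection_kwh + charging_kwh

if __name__ == "__main__":
    high = annual_energy_kwh(1.58, 19, connection_kwh=23.4, charging_kwh=3.5)
    low = annual_energy_kwh(0.19, 2)
    # ~387 kWh/year (the quoted email rounds to 19 GB/year and gets 388)
    print(f"High-end estimate: ~{high:.0f} kWh/year")
    # ~4.6 kWh/year, matching the low figure cited in the Time excerpt
    print(f"Low-end estimate:  ~{low:.1f} kWh/year")
    # An efficient refrigerator cited in the quote uses ~322 kWh/year,
    # so the comparison flips entirely depending on the assumptions.
```

The two estimates differ by roughly two orders of magnitude, which is precisely the critics’ point below: the result is driven almost entirely by the assumed energy cost per gigabyte.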
The Time article continued: “Breakthrough ran the numbers on the iPhone specifically—Mills’ endnotes (see page 44 in the report) refer to smartphones and tablets more generally—but Luke notes that Mills confirmed the calculations. These estimates are at the very high end—other researchers have argued that power use by smartphones is much lower. And the Mills study itself has come in for strong criticism from other experts.”
As this Forbes article noted:
“[Koomey said] he ‘spent years debunking’ Mills’ claims and published a paper in 2000 that directly contradicted his findings. Koomey [added] he was shocked to see Mills ‘rehashing’ his ideas now. ‘If he is making this claim again, that would be just crazy, outrageous,’ Koomey said. ‘What we found in 2000 is that a refrigerator used 2,000 times more electricity than the networking electricity of a wireless Palm Pilot. He is not a credible source of information.’ [Moreover,] Gernot Heiser, a professor at the University of New South Wales in Sydney and co-author of a 2010 study on power consumption in smartphones, echoed Koomey’s sentiments [that Mills’ work was flawed]. Heiser said Mills’ work ‘seems blatantly wrong.’ He said Mills overestimates the amount of power used by a modern smartphone, in this case a Galaxy S III, by more than four times. ‘I’d have to have a quick look to see how they arrive at this figure, but it certainly looks like baloney to me,’ Heiser said.”
Quoting from the Time article, “Gang Zhou, an associate professor of computer science at the College of William and Mary, was less direct in attacking Mills’ claims, but nonetheless said his measurements for the power consumption of smartphones was at least ‘one or two magnitude’ higher than they should be. Nonetheless, Zhou added that the subject of data center electricity usage is an important issue and it ‘should raise concern.’”
Koomey also reinforced the aforementioned criticism. In a 2013 article titled “Jonathan Koomey: Stop worrying about IT power consumption,” the author wrote:
“By 2010, for example, data centers accounted for approximately 1.3 percent of worldwide electricity use and 2 percent of U.S. electricity use, according to Koomey’s August 2011 paper, “Growth in Data Center Electricity Use, 2005 to 2010.” This amount is growing, certainly, but at a far slower rate than we previously imagined. Still, that article helped inspire an industry-wide interest in the nexus of technology and energy efficiency that might otherwise have taken years to develop. ‘It was the process of debunking those claims that led me to spend a lot more time on data center electricity use and also on the electricity use of all sorts of computing devices,’ Koomey recalled. As he dug into the numbers, he actually discovered that efficiency has been improving since the days of vacuum tubes, a thesis he explored in his ‘One Great Idea’ presentation at the 2012 VERGE conference in Washington, D.C. This is one thing making the explosion of mobile devices such as smartphones and tablet computers viable, along with the associated reductions in the power consumption associated with client computing devices. Consider that a desktop computer uses roughly 150 kWh to 200 kWh of electricity annually, compared with 50 to 70 kWh for a notebook PC, 12 kWh for a tablet or 2 kWh for a smartphone. It’s also a very important development for the so-called Internet of Things, the vast network of sensors emerging to support a huge array of applications related to green buildings, intelligent transportation systems and so on. Despite suggestions otherwise, these applications should have very little impact on overall IT power consumption.”
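To illustrate why that shift in device mix matters for client-side consumption, here is a small sketch using the per-device annual figures quoted above. The fleet sizes are hypothetical and serve only to show the scale of the difference, and they cover the devices themselves, not the network and data center energy discussed earlier.

```python
# Annual client-side electricity for hypothetical fleets of devices,
# using the per-device figures quoted above (kWh/year). The device counts
# below are illustrative assumptions, not real market data.

KWH_PER_YEAR = {
    "desktop": 175,     # midpoint of the 150-200 kWh/year range
    "notebook": 60,     # midpoint of the 50-70 kWh/year range
    "tablet": 12,
    "smartphone": 2,
}

def fleet_twh(counts):
    """Total annual electricity (TWh) for a fleet of devices."""
    kwh = sum(KWH_PER_YEAR[device] * n for device, n in counts.items())
    return kwh / 1e9  # kWh -> TWh

if __name__ == "__main__":
    # Hypothetical fleets: one billion users on desktops vs. on smartphones.
    desktops = {"desktop": 1_000_000_000}
    smartphones = {"smartphone": 1_000_000_000}
    print(f"1B desktops:    {fleet_twh(desktops):.0f} TWh/year")    # ~175 TWh
    print(f"1B smartphones: {fleet_twh(smartphones):.0f} TWh/year") # ~2 TWh
```

On those figures, a billion desktop users draw roughly 175 TWh per year on the client side, while a billion smartphone users draw about 2 TWh, a gap that helps explain why Koomey sees the explosion of mobile and IoT devices as compatible with modest growth in overall IT power consumption.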
Conclusion
Based on the outdated and often contradictory information available, I would stress that the answer to Vint’s question is, unfortunately, inconclusive. Even a follow-up question Vint posed about the merits of switching to LED lighting to offset the power consumption of ICTs was undermined by a New Republic story that argued (according to the aforementioned Time article):
“The greenest building in New York City [at the time]—the Bank of America Tower, which earned the Leadership in Energy and Environmental Design’s (LEED) highest Platinum rating—was actually one of the city’s biggest energy hogs. Author Sam Roudman argued that all the skyscraper’s environmentally friendly add-ons—the waterless urinals, the daylight dimming controls, the rainwater harvesting—were outweighed by the fact that the building used ‘more energy per square foot than any comparably sized office building in Manhattan,’ consuming more than twice as much energy per square foot as the 80-year-old (though recently renovated) Empire State Building.”
What is not undermined, however, is my rationale for exploring this topic further within the Internet community. While the Internet and ICTs are not the main contributors to climate change (compared with, say, energy production in general), there are a few considerations to keep in mind:
1. The energy needed for infrastructure such as data centers and the energy needed for electronic devices (regardless of size or scope) are essentially two sides of the same coin, but data center/server operators generally have much more centralized control over how those centers/servers are powered than end users do.
2. Private sector data centers are becoming more efficient and are increasingly run on renewable energy, but many Internet exchange points (IXPs), for instance, as well as other critical infrastructure and non-private sector facilities (such as government servers), are not. (See, for example, the abovementioned Atlantic article: “But that’s leverage available to companies operating at the scale of Facebook and Google [to galvanize states to cut non-renewable/fossil fuel energy sources]. It’s not really something that smaller colocation services can pull off. Relative to the entire data-center industry—data centers run on university campuses, enterprise colocation providers, hospitals, government agencies, banks—companies like Facebook and Google are a pronounced, but still minor piece of the larger data-center landscape. Some smaller companies have been able to push for changes, but they tend to need one of the heavy-hitter companies to act as muscle first” [emphasis mine]).
3. As more people come online, more and more data will be generated—to the point where the amount of energy needed to power the infrastructure that supports such data could grow exponentially. As Mills’ report stressed:
“Future growth in electricity to power the global ICT ecosystem is anchored in just two variables: demand (how fast traffic grows) and supply (how fast technology efficiency improves). As costs keep plummeting, how fast do another billion people buy smartphones and join wireless broadband networks where they will use 1,000 times more data per person than they do today; how fast do another billion, or more, join the Internet at all; how fast do a trillion machines and devices join the Internet to fuel the information appetite of Big Data? Can engineers invent, and companies deploy, more efficient ICT hardware faster than data traffic grows?”
Addressing each of these points, and what the Internet governance community can do about them, is critical. Given the inconclusive nature of the available data, it is better to err on the side of caution; that is, to address concerns related to energy and the environment within our domain, especially when investing in infrastructure upgrades. For instance, Koomey argued, “For in-house data centers that are standard business facilities, there is a strong case from both a cost and environmental perspective for going to the cloud.”
This also involves sharing best practices and solutions, working collaboratively to make current infrastructure more efficient and sustainable, planning better for the future (which of course includes policy discussions), and examining our entire production process with a view to a more circular economy. Extended to ICTs, this covers not merely the infrastructure and processes governing the Internet, but also other aspects of the information society: wireless infrastructure (e.g., towers and routers), wired infrastructure (e.g., manufacturing and laying fiber, including underwater cable), the recyclability and sustainability of Internet-connected devices (e.g., manufacturing processes, recycling, and resource acquisition), and where the materials for such devices will come from as we help the next billion(s) get online.