It’s a trope among internet users that all manner of data is ‘stored in the cloud’ – an apt metaphor for the vast interconnected storage infrastructure that delivers the results of our Google searches in seconds. But how many understand that each of those searches – or any other use, including playing games, or downloading music, apps or even lectures – entails a massive waste of energy?
Indeed, the energy-consuming ‘cloud’ is something that has to be tamed if we are to have any chance of an energy future once we reach twenty years past Peak Oil (which occurred in 2005) and our existing oil stores are reduced to what they were in 1985. To put it another way: can a world of some 8 billion live on the same oil as the roughly 4.9 billion living in 1985? The answer would seem to be ‘no’, given we are talking about high-EROEI (energy returned on energy invested) oil – oil yielding 15 to 16 gallons for every gallon’s worth of energy used to drill. The other side, apart from Peak Oil concerns, is the climate science issue: if we continue to burn so much fossil fuel (including coal) we will speed the onset of a runaway greenhouse planet.
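The EROEI arithmetic can be sketched in a few lines. The 15-to-16 ratio comes from the post; the lower comparison values are illustrative assumptions of mine, not figures from the article:

```python
# Back-of-envelope EROEI (energy returned on energy invested) arithmetic.
# The post cites roughly 15-16 units of oil energy returned per unit
# invested in drilling; the other ratios below are illustrative only.

def net_energy_fraction(eroei):
    """Fraction of gross energy output left over after paying the
    energy cost of extraction."""
    return 1.0 - 1.0 / eroei

for eroei in (16, 15, 5, 1.5):
    print(f"EROEI {eroei:>4}: {net_energy_fraction(eroei):.1%} of output is net energy")
```

The point of the comparison: at an EROEI of 16 nearly 94% of the gross output is usable net energy, but as the ratio falls toward 1 the net fraction collapses, which is why low-EROEI sources can’t simply substitute barrel-for-barrel.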
Consider this: across all the primary net users – from banks to defense contractors, government agencies, media companies and the billion global Facebook users – barely 6-12% of the available electricity is used to perform computations, run security apps or make friend-to-friend connections. All the rest is wasted, by which I mean the remaining 88-94% goes merely to backup and redundancy in case servers fail. (Many large data centers actually include thousands of linked lead-acid batteries – in addition to generators – to power computers through a grid failure as brief as a few hundredths of a second.)
This extraneous consumption to support redundancy and keep servers running has also led to violations, as reported in the NY Times (Sept. 23, p. 20). Amazon.com, for example, was cited for 20 violations in Northern Virginia over 3 years for running generators without an environmental permit. Meanwhile, a few companies, e.g. Google and Facebook, are using re-engineered software and cooling systems to reduce wasted power.
Some readers may have caught a glimpse of Google’s ‘innards’ featured on ABC News three nights ago, where somewhat higher ambient temperatures are now used to reduce the need for cooling. Still, Google’s data centers draw nearly 300 million watts of power around the clock (ibid.) while Facebook’s draw about 60 million watts. According to Mark Bramfitt, a former utility company executive quoted in the Times piece:
“It’s just not sustainable. They’re going to hit a brick wall.”
Just keeping millions of users’ emails (which they refuse to delete) stored on servers takes a hell of a lot of energy. A single client, e.g. the NYSE, generates up to 2,000 gigabytes of data per day that must be stored for years. It is estimated that some 1.8 trillion gigabytes of digital information were created globally last year – drawing nearly 90 billion kilowatt-hours from the grid. As long as that information needs storage or re-use it remains a continual energy drain.
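It’s worth checking what those two figures imply per gigabyte. A quick sketch, using only the numbers cited above:

```python
# Rough check of the cited figures: ~1.8 trillion gigabytes of digital
# information created in a year, drawing ~90 billion kWh from the grid.
data_created_gb = 1.8e12    # gigabytes per year (from the post)
grid_energy_kwh = 90e9      # kilowatt-hours per year (from the post)

kwh_per_gb = grid_energy_kwh / data_created_gb
print(f"Implied energy cost: {kwh_per_gb:.3f} kWh per gigabyte")

# At that rate, an NYSE-scale client generating 2,000 GB/day implies:
nyse_gb_per_day = 2000
print(f"NYSE-scale client: ~{nyse_gb_per_day * kwh_per_gb:.0f} kWh per day")
```

That works out to about 0.05 kWh per gigabyte – a crude average, since the grid figure bundles one-time creation with years of ongoing storage, but it makes the per-client scale concrete.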
Meanwhile, consumers haven’t processed how much they contribute to the drain simply by sending enormous audio or video files back and forth, or playing virtual games. It wouldn’t be so bad if these uses required only the limited amounts of energy needed for the real-time actions – uploading, downloading, sharing audio files or whatever – but they also demand enormous backup energy held in reserve. Once this energy is tied up it’s not available for other uses, including the rest of the grid.
According to David Cappuccio, a managing vice president and chief of research at Gartner, a technology research firm, typical server utilization reaches 7-12%; the rest is redundant backup waste. As he puts it (ibid.):
“That’s how we’ve over-provisioned and run data centers for years. ‘Let’s overbuild just in case we need it’. That level of comfort costs a lot of money. It costs a lot of energy.”
In the latter regard, many users forget or don’t process that servers aren’t the only components in data centers consuming energy. There are also: industrial cooling systems to keep the hardware temperature controlled, circuitry to keep backup batteries charged, and simple ohmic dissipation in the extensive wiring. All of these ‘eat’ electrical energy. In fact, the energy dissipated in this supporting infrastructure can often amount to as much as 30 times the electricity used to carry out the basic purposes of the data center.
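A one-line calculation shows what a 30-to-1 overhead ratio means for the useful share of a facility’s total power draw:

```python
# If supporting infrastructure (cooling, battery charging, resistive
# losses in wiring) dissipates up to 30x the electricity used for the
# data center's basic purpose, the useful fraction of total draw is:
overhead_ratio = 30                       # overhead : useful (from the post)
useful_fraction = 1 / (1 + overhead_ratio)
print(f"Useful share of total power: {useful_fraction:.1%}")
```

That is, at the worst-case figure cited, only about 3% of the electricity entering the building does the actual computing – consistent with the 6-12% utilization range quoted earlier for less extreme cases.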
More recently, to check the extent of all this, a company called Viridity was brought on board to conduct diagnostic testing at data centers. Its engineers found that the facility they were hired to test – like dozens of others surveyed – used most of its power on servers that were doing little except consuming electricity. In one sample of 333 servers monitored in 2010, more than half were found to be “comatose” in the monitors’ jargon. Nearly three fourths of the servers in the sample were using less than 10% of their computational power, on average, to process data; the other 90% of their capacity simply sat in reserve against failure.
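To see how a sample like that drags down fleet-wide utilization, here is an illustrative model. The sample size and the “more than half” / “nearly three fourths” / “under 10%” figures come from the post; the exact head-counts and mean utilizations below are my assumptions, not Viridity’s raw data:

```python
# Illustrative model of the Viridity-style sample described above.
# Counts and mean utilizations are assumed for the sketch.
servers = 333
low_util = 250           # "nearly three fourths" running under 10% utilization
avg_util_low = 0.05      # assumed mean utilization of that low group
busy = servers - low_util
avg_util_busy = 0.40     # assumed mean utilization of the remainder

effective = (low_util * avg_util_low + busy * avg_util_busy) / servers
print(f"Fleet-wide effective utilization: {effective:.1%}")
```

Even granting the minority of servers a generous 40% average load, the fleet as a whole delivers well under 15% of its rated capacity – the rest is powered-up idle hardware.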
Who is to blame for a lot of these problems? We users! According to Bruce Taylor, vice president of the Uptime Institute:
“If you tell somebody they can’t access YouTube or download Netflix, they tell you it’s a God-given right.”
That attitude permeates not only ordinary netizens but those operating large data processing centers for banks, industry, the New York Stock Exchange and government. No one wants even to contemplate loss of data or interruption of a transaction because of a fractional-second glitch in available energy. Hence all that wasted energy in ‘zombie’ servers, which do little other than stand behind the main servers in case they go down.
Perhaps what we all need to do is make more conscious choices and trade-offs in our usage. If we send photos to a friend or relative on a particular day, we agree not to use YouTube to view or download videos that day. If we download from YouTube, we agree not to spend hours online in virtual games or whatever. If all net users make such trade-offs, total usage can come down – and hopefully the main server centers will learn they can release some of that backup power.
Of course, industry and government also have to be willing to do their share, and we need a more efficient grid system overall. The one we have is decades overdue for revamping, as I noted in assorted blog posts about the possible effects of coronal mass ejections.
Perhaps if Mitt Romney volunteered to put that extra $2 trillion he wants to waste on the Pentagon toward grid revamping and infrastructure repair, he might be taken more seriously. But so far nothing Romney has said makes any sense, and it often contradicts his earlier statements. In this sense, one has to go along with Barack Obama’s perceptive “Romnesia” coinage!
Alas, we citizens – faced with an ever more frugal energy future as oil supplies wind down – can’t afford to put a guy in the White House who promises to consume even more fossil fuels at an even faster rate. That path might leave us, within a decade, with as few working computers as the characters on the new series ‘Revolution’!