March 25, 2012
As incredible as it sounds, and as I wrote in The IT Energy Efficiency Imperative, a typical data center puts less than 5% of the energy it consumes toward actual computing; the rest is lost to various overheads and inefficiencies. If that wasn’t bad enough, data centers also generate huge piles of e-waste each year as computer servers are replaced by more powerful and, ironically, more energy-efficient models.
While this may sound terrible, keep in mind that it usually takes less energy and fewer natural resources to communicate and consume goods digitally than physically, so more investment in IT is often a net positive for the environment.
However, underlying all those boasts about new, “energy efficient” data centers, there is usually a dirty little secret: the servers are often woefully under-utilized. Most work at less than 10 percent of their capacity, but suck up about half of the energy that they would when running at full capacity. So all over the world, millions of servers operate like mostly empty delivery trucks, consuming lots of resources, but delivering very little.
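A rough back-of-the-envelope calculation shows what this means for energy. Using the figures above (roughly 10 percent utilization while drawing about half of peak power), a lightly loaded server spends several times more energy per unit of useful work than a fully loaded one; the wattage in the sketch below is a purely illustrative assumption:

# Back-of-the-envelope estimate using the figures above; the 400 W peak draw
# is an illustrative assumption, not a measured value.
peak_power_watts = 400.0                         # hypothetical server at full load
light_load_power_watts = 0.5 * peak_power_watts  # ~half of peak power when lightly loaded
utilization = 0.10                               # doing only 10% of the work it could

# Energy used per unit of useful work, relative to the same server at full load:
relative_energy_per_unit_of_work = (light_load_power_watts / utilization) / peak_power_watts
print(relative_energy_per_unit_of_work)          # -> 5.0, i.e. ~5x the energy per unit of work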
True, most IT departments have tried to tackle poor server utilization by consolidating applications onto fewer servers through the use of virtualization technology. Unfortunately, many consolidation projects have stalled due to financial constraints, organizational politics and staffing shortages. As such, server utilization across the enterprise remains very low – often less than 10 percent. Even worse, as each generation of server hardware has become more powerful, utilization has tended to decrease even further as applications and IT operational practices have been unable to take advantage of it.
Without significant changes, utilization will continue to get worse, and more money and opportunities for computing will be wasted as an even larger percentage of the world’s IT capacity sits idle. Particularly troubling is that the IT professionals I talk to seem resigned to low levels of utilization; they say it’s the only way they can confidently ensure good performance and reliability when demand spikes.
So what’s an IT pro who wants to improve utilization to do – without breaking applications or getting fired? While virtualization management software can help, real progress will require the cooperation of a contingent not normally associated with IT energy efficiency: architects and developers of software applications.
Currently, most applications are developed with little thought for how much energy and IT resources they consume. IT pros don’t know how much hardware an application will need to function properly, so, under the maxim “better safe than sorry,” applications are given far more than they need. Like someone who buys a mansion just because they might throw a large party one day, applications are often permanently allocated energy-hungry computing resources to support a peak load that occurs rarely, if ever.
Unfortunately, incentives for developers to help improve utilization to respectable levels are typically weak and often non-existent. Once IT resources have been allocated to an application, the costs to the application owner remain relatively constant regardless of utilization levels, even if servers are turned off.
However, the emergence of public cloud computing platforms is demonstrating that the right incentives can drive more effective use of IT resources. Because organizations pay for cloud resources by the hour, developers are naturally inclined to scale applications up and down in response to demand, to avoid incurring costs for resources they don’t need. Vendors are taking notice of this trend and are creating auto-scaling solutions and tools (such as Microsoft’s WASABi) that make it easier for developers to reduce the costs of hosting applications in the cloud.
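To make that incentive concrete, here is a minimal sketch of the kind of threshold-based rule such auto-scaling tools automate; the function name, thresholds and instance limits below are hypothetical and not WASABi’s actual API:

# Illustrative sketch of a threshold-based scaling rule; names, thresholds and
# limits are hypothetical assumptions, not part of WASABi.
def desired_instance_count(current_instances, avg_cpu_utilization,
                           min_instances=2, max_instances=20,
                           scale_out_above=0.70, scale_in_below=0.30):
    """Decide how many instances to run, given recent average CPU utilization."""
    if avg_cpu_utilization > scale_out_above:
        # Demand spike: add capacity rather than let performance degrade.
        return min(current_instances + 1, max_instances)
    if avg_cpu_utilization < scale_in_below:
        # Demand lull: release capacity so idle instances stop accruing hourly charges.
        return max(current_instances - 1, min_instances)
    return current_instances  # comfortable band: hold steady

# Example: 8 instances averaging 22% CPU over the last interval -> scale in to 7.
print(desired_instance_count(8, 0.22))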
If CFOs knew that the expensive servers in their data centers were less than 10 percent utilized, it’s likely that some additional justification would be required before more servers were approved for purchase. IT departments would demand solutions from vendors that help them improve server utilization. In turn, the IT department would benefit from having cloud-like incentives in place to encourage developers to create lean, resource-efficient applications that don’t act like they’re the only thing in the data center.
Given the increasingly digital and connected nature of the world’s societies and economies, the demand for data center-dependent applications is growing at an accelerating pace. As such, maximizing the utilization of data center resources with the help of software developers will be an imperative for any organization that wants to contain costs and reduce energy use over the long term. It’s perhaps an even greater imperative for the world and its energy-strapped economies as the IT sector continues to grow its power footprint, carbon emissions and e-waste streams.
Mark Aggar is the Senior Director of Technology Strategy for Environmental Sustainability at Microsoft. He can be reached via his Technology Treading Lightly blog at http://blogs.technet.com/markaggar. Learn more about Microsoft’s sustainability efforts at www.microsoft.com/environment.