Most agree that cloud computing is inherently more efficient than on-premises computing in each of several dimensions. Last November, I went after two of the easiest gains to argue: utilization and the ability to sell excess capacity (Datacenter Renewable Power Done Right):
Cloud computing is a fundamentally more efficient way to operate compute infrastructure. The increases in efficiency driven by the cloud are many, but a strong primary driver is increased utilization. All companies have to provision their compute infrastructure for peak usage. But they only monetize the actual usage, which goes up and down over time. This leads to incredibly low average utilization levels, with 30% being extraordinarily high and 10 to 20% the norm. Cloud computing gets an easy win on this dimension. When non-correlated workloads from a diverse set of customers are hosted in the cloud, the peak-to-average ratio flattens dramatically and effective utilization skyrockets. Where 70 to 80% of resources are typically wasted in private deployments, that waste falls rapidly as cloud computing services scale and the peak-to-average ratio flattens.
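The flattening effect is straightforward to see in a toy simulation. The sketch below (hypothetical numbers, not data from the paper) compares the peak-to-average ratio of a single bursty workload against the aggregate of 100 non-correlated workloads; because independent peaks rarely coincide, the aggregate runs much closer to its average:

```python
import random

random.seed(42)

def peak_to_average(load):
    """Ratio of peak demand to average demand over the trace."""
    return max(load) / (sum(load) / len(load))

def make_workload(steps=1000):
    # A single customer's demand: mostly near-idle with occasional peaks.
    # Beta(2, 8) gives a mean around 0.2 of provisioned capacity.
    return [random.betavariate(2, 8) for _ in range(steps)]

single = make_workload()

# Aggregate 100 independent workloads time-step by time-step.
aggregate = [sum(w) for w in zip(*[make_workload() for _ in range(100)])]

print(f"single workload peak-to-average:  {peak_to_average(single):.2f}")
print(f"aggregate of 100 peak-to-average: {peak_to_average(aggregate):.2f}")
```

Since each workload must be provisioned for its own peak, a peak-to-average ratio near 1 for the aggregate is exactly the high effective utilization described above.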
To further increase the efficiency of the cloud, Amazon Web Services added an interesting innovation where they sell the remaining capacity not fully consumed by this natural flattening of the peak to average. These troughs are sold on a spot market, and customers are often able to buy computing at less than the amortized cost of the equipment they are using (Amazon EC2 Spot Instances). Customers get clear benefit. And, it turns out, it’s profitable to sell unused capacity at any price over the marginal cost of power, so the provider gets clear benefit as well. And, with higher utilization, so does the environment.
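The economics of the trough sale can be made concrete with some quick arithmetic. All numbers below are hypothetical, chosen only to illustrate the argument: the server is a sunk cost, so any spot price above the marginal cost of power contributes positive margin even when it sits well below the amortized capital cost:

```python
# Hypothetical illustration: capital cost of one server, amortized hourly.
server_cost = 2000.0                 # USD, assumed purchase price
amortization_years = 3
hours = amortization_years * 365 * 24
amortized_per_hour = server_cost / hours          # ~$0.076/hour

# Marginal cost of keeping that server busy is essentially just power.
power_kw = 0.3                       # assumed server draw, kW
power_price = 0.07                   # assumed USD per kWh
marginal_cost_per_hour = power_kw * power_price   # ~$0.021/hour

# A spot price below the amortized capital cost...
spot_price = 0.04                    # USD/hour

# ...still yields positive margin, because the hardware is already paid for.
margin = spot_price - marginal_cost_per_hour

print(f"amortized capital cost/hour: ${amortized_per_hour:.3f}")
print(f"marginal power cost/hour:    ${marginal_cost_per_hour:.3f}")
print(f"spot margin/hour:            ${margin:+.3f}")
```

The customer pays less than the equipment costs to own, and the provider still comes out ahead on every trough hour sold.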
Back in June, Lawrence Berkeley National Labs released a study that went after the same question quantitatively and across a much broader set of dimensions. I first came across the report via coverage in Network Computing: Cloud Data Centers: Power Savings or Power Drain? The paper was funded by Google, which admittedly has an interest in cloud computing and high-scale computing in general. But, even understanding that possible bias or influence, the paper is of interest. From Google’s summary of the findings (How Green is the Internet?):
Funded by Google, Lawrence Berkeley National Laboratory investigated the energy impact of cloud computing. Their research indicates that moving all office workers in the United States to the cloud could reduce the energy used by information technology by up to 87%.
These energy savings are mainly driven by increased data center efficiency when using cloud services (email, calendars, and more). The cloud supports many products at a time, so it can more efficiently distribute resources among many users. That means we can do more with less energy.
The paper attempts to quantify the gains achieved by moving workloads to the cloud by looking at all relevant dimensions of savings, some fairly small and some quite substantial. From my perspective, there is room to debate any of the data one way or the other, but the case is sufficiently clear that it’s hard to argue there aren’t substantial environmental gains.
Now, what I would really love to see is an analysis of an inefficient, poorly utilized private datacenter whose operators want to be green and need to compare the installation of a fossil-fuel-consuming fuel cell power system with dropping the same workload onto one of the major cloud computing platforms :-).
The full paper is available at: The Energy Efficiency Potential of Cloud-Based Software: A US Case Study.