Sunday, November 21, 2010

Achieving a PUE of 1.10 is a challenge under any circumstances, but the vast majority of facilities that approach this mark use air-side economization: essentially, using outside air to cool the facility. Air-side economization brings complexities, such as requiring particulate filters, and it is less effective in climates that are both hot and humid. Nonetheless, even with these challenges, air-side economization is one of the best techniques, if not the best, for improving datacenter efficiency.
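As a reminder of what that 1.10 means: PUE is simply total facility power divided by IT load. A minimal sketch, with illustrative numbers rather than measurements from any particular facility:

```python
# Back-of-envelope PUE calculation. PUE = total facility power / IT load.
# A PUE of 1.10 means only 10% of power goes to cooling, power
# distribution losses, and other overhead. Numbers below are illustrative.
def pue(total_facility_kw: float, it_load_kw: float) -> float:
    return total_facility_kw / it_load_kw

# A facility drawing 46,000 kW total with 41,800 kW of IT load:
print(round(pue(46_000, 41_800), 2))  # 1.1
```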

 

As a heat transport, water is both effective and efficient. The challenges of using water in open-circuit datacenter cooling designs are largely social and regulatory. Most of the planet is enshrouded by two fluids, air and water. It’s a bit ironic that emitting heat into one of the two is considered uninteresting while emitting heat into the other brings much more scrutiny. Dumping heat into air concerns few and is the cooling technique of choice for nearly all datacenters (Google Saint-Ghislain Belgium being a notable exception). However, if the same heat in the same quantities is released into water, it draws considerably more environmental concern, even though the same amount of heat reaches the environment either way. Thermal pollution is the primary concern.

 

The obvious technique to avoid some of the impact of thermal pollution is massive dilution: use seawater, or very high volumes of river water, such that the local change is immeasurably small. Seawater has long been used in industrial and building cooling applications. It brings challenges in that it is incredibly corrosive, which drives up build and maintenance costs, but it has been used successfully in a wide range of high-scale applications. Freshwater cooling relieves some of the corrosion concerns and has been used effectively for many large-scale cooling requirements, including nuclear power plants. I’ve noticed there is often excellent fishing downstream of these facilities, so there clearly is substantial environmental impact caused by these thermal emissions, but this need not be the case. There exist water cooling techniques with far less environmental impact.

 

For example, the cities of Zurich and Geneva, as well as ETH Zurich and Cornell University, use lake water for some of their heating and cooling requirements. This technique is effective, and its impact on the environment can be made arbitrarily small. In a slightly different approach, the city of Toronto employs deep lake water cooling to cool buildings in its downtown core. In this design, the cold water intake is drawn 3.1 miles offshore at a depth of 272’. Toronto avoids any concern about thermal pollution by using the exhaust water from the cooling system as its utility water intake, so the slightly warmed water is not released directly back into the environment.

 

Given the advantages of water over air in cooling applications, and given that the environmental concerns can be mitigated, why not use the technique more broadly in datacenters? One of the prime reasons is that water is not always available. Another is that regulatory concerns bring more scrutiny, and even excellent designs without measurable environmental impact will still take longer to get approved than a conventional air-cooled approach. However, it can be done, and it does produce a very power-efficient facility. The Deepgreen datacenter project in Switzerland is perhaps the best example I’ve seen so far.

Before looking at Deepgreen’s innovative mechanical systems, consider the summary statistics, which look excellent:

· 46MW with 129k sq ft of raised floor (with upgrade to 70MW possible)

· Estimated PUE of 1.1

· Hydro- and nuclear-sourced power

· 356 W/sq ft average

· 5,200 racks with average rack power of 8.8kW and maximum rack power of 20kW

· Power cost: $0.094/kWh (compares well across the EU)

· 28 medium-voltage 2.5 MW generators with 48 hours of onsite diesel
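The headline numbers above are internally consistent, which is always worth checking. A quick sanity check using only figures from the list:

```python
# Cross-check the summary statistics against one another.
total_power_w = 46_000_000       # 46 MW facility
raised_floor_sqft = 129_000      # 129k sq ft of raised floor
racks = 5_200

# Power density: should land near the stated 356 W/sq ft.
power_density = total_power_w / raised_floor_sqft
print(round(power_density))      # 357

# Average rack power: should land near the stated 8.8 kW.
avg_rack_kw = total_power_w / racks / 1000
print(round(avg_rack_kw, 1))     # 8.8
```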

 

The 46MW facility is located in the heart of Switzerland on Lake Walensee:

 


Google Maps: http://maps.google.com/maps?f=q&source=s_q&hl=de&geocode=&q=Werkhof+Bi%C3%A4sche,+Mollis,+Schweiz&sll=37.0625,-95.677068&sspn=45.601981,84.990234&ie=UTF8&hq=&hnear=Werkhof,+8872+Mollis,+Glarus,+Schweiz&ll=47.129016,9.09462&spn=0.307857,0.663986&t=p&z=11

The overall datacenter is well engineered, but it is the mechanical systems that are most worthy of note. Following the diagram below, this facility is cooled using 43F source water drawn from 197’ below the surface. The source water is brought in through dual redundant intake pipes to a pumping station with six high-capacity pumps in a 2*(N+1) configuration. The pumps move 668,000 gallons per hour at full cooling load.
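Those flow numbers can be checked against the energy balance Q = ṁ·c_p·ΔT, which gives the temperature rise of the lake water. A minimal sketch, assuming the full 46 MW facility load is rejected into the stated water flow; at typical partial loads both the rejected heat and the resulting rise would be smaller:

```python
# Estimate the lake-water temperature rise at full load: dT = Q / (m_dot * c_p).
# Assumption (not a Deepgreen figure): all 46 MW ends up as heat in the water.
GAL_TO_KG = 3.785            # 1 US gallon of water is roughly 3.785 kg
C_P = 4186                   # specific heat of water, J/(kg*K)

q_watts = 46_000_000                       # assumed heat rejected at full load
flow_kg_s = 668_000 * GAL_TO_KG / 3600     # 668,000 gal/hr -> ~702 kg/s

delta_t_c = q_watts / (flow_kg_s * C_P)
print(round(delta_t_c, 1))   # 15.6 (degrees C rise at the assumed full load)
```

That worst-case rise, diluted into a deep lake, is the quantity the thermal-pollution mitigation discussed below has to manage.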

 

The fairly clean lake water is run through a heat exchanger (not shown in the diagram below) to cool the closed-circuit chilled water loop used in the datacenter. The heat exchanger avoids bringing impurities or life forms into the datacenter chilled water loop. The chilled water loop forms part of an otherwise conventional within-the-datacenter cooling system design. The difference is that they have completely eliminated process-based cooling (air conditioning) and cooling towers, avoiding both the purchase cost and the power this equipment would have consumed. In the diagrams below you’ll see the Deepgreen design followed by a conventional datacenter cooling system for comparison purposes:
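The reason no chillers are needed falls out of the temperatures: the chilled-water supply is roughly the lake inlet temperature plus the heat exchanger’s approach temperature. A minimal sketch; the 4F approach is an assumed, illustrative figure, not a Deepgreen specification:

```python
# Chilled-water supply temperature across a plate heat exchanger:
# closed-loop supply ~= open-loop (lake) inlet + approach temperature.
def chilled_supply_f(lake_inlet_f: float, approach_f: float) -> float:
    return lake_inlet_f + approach_f

# 43 F lake water with an assumed 4 F approach gives ~47 F chilled supply,
# comfortably cold enough for datacenter cooling with no chillers in the loop.
print(chilled_supply_f(43, 4))  # 47
```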

 

Conventional datacenter mechanical system for comparison:

 

The Deepgreen facility mitigates the impact of thermal pollution through a combination of dilution, a low delta-T, and deep-water release.

 

I’ve been conversing with Andre Oppermann, the CTO of Deepgreen, for nearly a year on this project. Early on, I was skeptical they would be able to work through the environmental concerns in any reasonable time frame. I wasn’t worried about the design; it’s well engineered. My concerns were primarily centered on slowdowns in permitting and environmental impact studies. They have done a good job of working through those issues, and I really like the resultant design. Thanks to Andre for sending all this detail my way. It’s a super interesting project, and I’m glad we can now talk about it publicly.

 

If you are interested in this state-of-the-art facility in Switzerland, I recommend you contact Andre, the CTO of Deepgreen, at oppermann@deepgreen.ch.

 

                                                                                --jrh

 

James Hamilton

e: jrh@mvdirona.com

w: http://www.mvdirona.com

b: http://blog.mvdirona.com / http://perspectives.mvdirona.com

 
