Busting Liquid Cooling Myths!

Liquid cooling is taking the market by storm, making air cooling look not only outdated but also uneconomical and wasteful by comparison. With the power density of racks continuously increasing, and with rising public awareness of humanity's impact on the environment, it seems to be the only option, with no real alternative. However, as with every new technology, there are those who remain sceptical. Let's take a look at the most common myths and misunderstandings.

Liquid cooling is a technology for removing heat from components and equipment that uses a heat-transfer fluid instead of air (gas) to carry away the heat those components produce, preventing heat from damaging them and providing good conditions for the equipment in the facility to function properly. Liquid cooling is significantly more effective than air cooling and consumes far fewer resources. It is not only cheaper to maintain, but also leaves a much smaller carbon footprint, reducing the strain on the environment. This type of cooling has been in use for a very long time, and in recent years it has made its way into data centers, server rooms and cryptomining facilities. Currently available liquid cooling solutions are reliable, affordable and safe, and can provide an 80-100% heat removal rate with as much as a 10 times smaller carbon footprint. On August 7th, 2018, an article titled "Five Barriers to Adoption of Liquid Cooling in Data Centers" was published on datacenterknowledge.com. The author recalls arguments against liquid cooling technology, claiming that it is expensive, complicated to introduce and maintain, dangerous and unreliable. This is simply not true, and this article will try to explain why.
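
To put the resource claims in perspective, here is a minimal back-of-the-envelope sketch in Python. The 1 MW IT load and the PUE values (roughly 1.6 for a typical air-cooled facility, 1.05 for immersion) are illustrative assumptions, not measurements:

    # Annual facility energy at an assumed PUE (Power Usage Effectiveness).
    IT_LOAD_KW = 1000            # assumed IT load
    HOURS_PER_YEAR = 8760

    def annual_energy_mwh(pue):
        return IT_LOAD_KW * pue * HOURS_PER_YEAR / 1000

    air = annual_energy_mwh(1.6)         # assumed air-cooled PUE
    immersion = annual_energy_mwh(1.05)  # assumed immersion PUE
    print(f"air-cooled: {air:,.0f} MWh/yr")
    print(f"immersion:  {immersion:,.0f} MWh/yr")
    print(f"savings:    {air - immersion:,.0f} MWh/yr ({1 - immersion/air:.0%})")

Under these assumptions, total facility energy drops by about a third, which is where the resource and carbon-footprint savings come from.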

Myth 1: "Two cooling systems to manage instead of one"

Immersion cooling makes it possible to get rid of HVAC (Heating, Ventilation, Air Conditioning) entirely. As said before, its effectiveness is extremely high and air cooling is no longer needed; liquid cooling replaces air cooling, it does not supplement it. When it comes to management, liquid cooling systems require roughly three to four times less maintenance than HVAC, which cuts costs and simplifies operations. Air conditioning demands a lot of maintenance and checking to be sure it runs flawlessly. It produces large temperature swings that result in condensation, creating a moist environment (ironically) in which bacteria and mold thrive, so the whole system needs constant inspection, or else it becomes an outright health hazard. It also needs quarterly filter replacements and service procedures: checking for refrigerant leaks, frozen coils and clogged drains. All of this goes away with liquid cooling, because it works on the simple principle of an enclosed, isolated system. Even where liquid cooling doesn't remove 100% of the heat but, let's say, 90%, the required air-handling capacity shrinks dramatically, even with racks packed far more densely (see the sketch below).

Another factor that reduces the required amount of maintenance is the higher overall efficiency of a liquid cooling system. Better operating temperatures, less dust, fewer vibrations and lower humidity improve performance and reduce wear on the machines. The result is an improved MTBF (Mean Time Between Failures) for equipment and components, indirectly lowering upkeep costs through less required maintenance, less hardware replacement and fewer work-hours needed to keep up with it.
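
As a quick illustration of the partial-capture point, here is a minimal sketch (the rack count, rack power and 90% capture ratio are assumed values):

    # Residual air-cooling load when direct liquid cooling captures
    # most of the heat at the rack.
    RACKS = 20
    RACK_POWER_KW = 50
    CAPTURE = 0.90             # assumed liquid heat-capture ratio

    total_kw = RACKS * RACK_POWER_KW
    residual_kw = total_kw * (1 - CAPTURE)
    print(f"Total heat load: {total_kw} kW")
    print(f"Left for air:    {residual_kw:.0f} kW")

A 1 MW row leaves only 100 kW for the air side, which is why the remaining air-handling plant can be so much smaller.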

Myth 2: "No standards"

The American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) introduced its first "Thermal Guidelines" as early as 2011. LBL.GOV and OCP have started a liquid cooling standardisation effort to support wider adoption. IBM's conservative standards are already available. These standards are out there, available to anybody willing to spend a few minutes acquiring the knowledge. Liquid cooling has been used in IT since the 1960s; it is not a new and untested idea at all. The only difference is that it used to appear only in very large undertakings, whereas now it is available to everybody. Being open to innovation is the most important part of the IT business, and the cloud sector is no different.

ASHRAE introduced Thermal Guidelines for Liquid-Cooled Data-Processing Environments in 2011, and then Liquid Cooling Guidelines for Datacom Equipment Centers, 2nd Edition, in 2014. Thermal Guidelines for Data Processing Environments, 3rd Edition, also included insight into other considerations for liquid cooling, including condensation, operation, water-flow rate, pressure, velocity and quality, as well as information on interface connections and the infrastructure of heat-rejection devices. More information is available at:

ASHRAE Thermal Guidelines:
https://datacenters.lbl.gov/sites/all/files/ASHRAE%20Thermal%20Guidelines_%20SVLG%202015.pdf

Conservative IBM guidelines:
https://www.ibm.com/support/knowledgecenter/en/POWER8/p8had/p8had_wc_overview.htm

Open Specification for a Liquid Cooled Server Rack:
https://datacenters.lbl.gov/sites/default/files/Open%20Specification%20Presentation%20DCD.pdf

Direct Liquid Cooling for Electronic Equipment:
https://eta.lbl.gov/sites/all/files/publications/direct_liquid_cooling.pdf

Myth 3: "The customer, first of all, has to come with their own IT equipment ready for liquid cooling, and it's not very standardized - we can't simply connect it and let it run."

Well, in most cases you can. Of course, DLC / ILC is a given for hosting and cloud services, but with colocation it works exactly the same way: the customer brings the IT equipment, and the facility provides the cooling infrastructure. In the case of DLC, you unscrew the heatsinks and replace them with cooling modules - for example an LGA3647 next-generation DLC module. Sockets are standard, and all you add is rack-level distribution infrastructure: a manifold or an LDU (Liquid Distribution Unit) with CPC dry-break quick-disconnect couplings.

In the case of immersion cooling, you simply dunk the system into an ILC enclosure - and that's it. If the systems ever need to be returned, a few minutes in a cleaning bath leaves the server shiny as new, ready to go back to the customer or into maintenance.
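
On the distribution side, sizing the manifold or LDU branch for a cold plate is straightforward heat-balance arithmetic (Q = m_dot * c_p * dT). A minimal sketch, assuming a 300 W socket, plain water as the coolant and a 10 K allowed temperature rise (all illustrative values):

    # Required coolant flow per socket from Q = m_dot * c_p * dT.
    CPU_POWER_W = 300          # assumed per-socket heat load
    DELTA_T_K = 10.0           # assumed coolant temperature rise
    CP_WATER = 4186.0          # specific heat of water, J/(kg*K)
    RHO_WATER = 998.0          # density of water, kg/m^3

    m_dot = CPU_POWER_W / (CP_WATER * DELTA_T_K)       # kg/s
    litres_per_min = m_dot / RHO_WATER * 1000 * 60
    print(f"Required flow per socket: {litres_per_min:.2f} L/min")

That works out to well under half a litre per minute per socket, comfortably within what a rack-level manifold with quick-disconnect couplings can deliver.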

Myth 4: "Electrocution"

The probability of electrocution is no higher than with an air cooling system or, for that matter, any other electrical system. The liquid is contained in ducts, does not come into contact with the equipment or the person handling it, and does not pose a safety threat; even if a leak happens (and it can happen with air conditioning too), the risk of electrocution is very low. There are also negative-pressure systems available that practically eliminate the risk of a leak: if the circuit is cut, air fills the interior of the duct instead of letting the liquid escape.

Myth 5: "As with any system that involves water flowing through pipes, corrosion is an issue in liquid cooling."

Liquid cooling actually mitigates the corrosion risk that occurs in air-cooled environments. There are many ways to reduce the probability of corrosion, and it does not happen in the pipes as described.

Quality engineering and design require an understanding of material compatibility. Galvanic corrosion (sometimes called dissimilar-metal corrosion) is the process by which materials in contact with each other oxidize or corrode. However, three conditions must exist for galvanic corrosion to occur. First, two electrochemically dissimilar metals must be present. Second, there must be an electrically conductive path between the two metals. Third, there must be a conductive path for metal ions to move from the more anodic metal to the more cathodic one. If any of these three conditions is absent, galvanic corrosion will not happen.

When a design requires that dissimilar metals come into contact, galvanic compatibility is often managed by finishing and plating, which allow the dissimilar materials to touch while protecting the base materials from corrosion.
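
A minimal sketch of the screening step designers use for the first condition, based on anodic-index differences (the index values are approximate figures after MIL-STD-889, and the 0.25 V threshold is a commonly cited rule of thumb; both are assumptions for illustration):

    # Flag metal pairs whose anodic-index difference suggests a
    # galvanic corrosion risk if they are placed in direct contact.
    ANODIC_INDEX_V = {         # approximate values, volts
        "gold": 0.00, "silver": 0.15, "nickel": 0.30,
        "copper": 0.35, "brass": 0.40, "tin": 0.65,
        "aluminum": 0.95, "zinc": 1.25,
    }

    def galvanic_risk(metal_a, metal_b, threshold_v=0.25):
        diff = abs(ANODIC_INDEX_V[metal_a] - ANODIC_INDEX_V[metal_b])
        return diff, diff > threshold_v

    for pair in [("copper", "brass"), ("copper", "aluminum")]:
        diff, risky = galvanic_risk(*pair)
        print(f"{pair[0]}/{pair[1]}: {diff:.2f} V -> {'avoid' if risky else 'ok'}")

Copper against brass passes; copper against bare aluminum does not, which is exactly why such joints get plated or isolated.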
There are easy ways to avoid all of these problems. In the controlled and mostly sterile environment of a data center, we will not observe liquid cooling corrosion issues. Even in harsh, uncontrolled environments, engineers usually manage to mitigate any potential corrosion successfully. Designers simply pay attention to material compatibility, and with engineering fluids in use there is no electrically conductive path between the metals.

Corrosion can, however, be observed daily in IT environments. It is not spectacular, but it occurs nonetheless, most commonly in the form of "tin whiskers". This microscopic, crystalline metallurgical phenomenon, the spontaneous growth of tiny hairs from a metallic surface, occurs in electrical devices when metals form long whisker-like projections over time. This is the corrosion that kills PSUs, motherboards, cards and fans. We can avoid it with less air handling, stress and dust - by implementing liquid cooling, which does not cause these issues.

Myth 6: "Operational Complexity - The more components you have, the more likely you are to have failure. When you have chip cooling, with water going to every CPU or GPU in a server, you're adding a lot of components."
One of the most important advantages of liquid cooling over air cooling, right after the operating expenditure (OPEX) savings, is the reduction in facility area (liquid cooling allows components to be packed much more tightly) and the simplification of maintenance. The physical size of the data center can shrink by as much as a factor of six compared to an air-cooled data center of equal computing power. Put the other way around, up to six times more equipment can be packed into the same area, simply by switching the cooling system (see the sketch after this section). Go-to-market is also faster: the construction schedule shortens by 30%, as shown in a 3M case study.

Site and structural construction are reduced compared to a traditional build of equal computing power; there is no need for PDUs, RPPs, busway or CRAH/CRAC units in the data center space, and there are fewer pieces of critical equipment in the data hall: fewer server fans, air handlers, chillers, dehumidifiers and filters. Fewer racks mean less fiber and copper cabling. Better utilisation and efficiency of the electrical and mechanical systems reduce the amount of equipment without sacrificing redundancy. Intel processors running at 55°C deliver up to 20% more efficiency. And, as discussed in the first section, an increased Mean Time Between Failures reduces the amount of scheduled maintenance needed. All of this means a practical reduction in complexity and lower CAPEX/OPEX.
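
A minimal footprint sketch, using the 20 kW and 100 kW rack densities mentioned in this article (the IT load and area per rack are illustrative assumptions):

    # Racks and floor area needed for a given IT load at air-cooled
    # vs liquid-cooled rack power density.
    IT_LOAD_KW = 2000
    AREA_PER_RACK_M2 = 3.0     # assumed, including aisle space

    for name, kw_per_rack in [("air", 20), ("liquid", 100)]:
        racks = -(-IT_LOAD_KW // kw_per_rack)   # ceiling division
        print(f"{name:>6}: {racks} racks, ~{racks * AREA_PER_RACK_M2:.0f} m^2")

At these densities a 2 MW load shrinks from 100 racks to 20, a five-fold reduction in white space before even counting the CRAH and PDU floor area that disappears with them.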

Summary:

The benefits of liquid cooling are well known:
- Increased rack power density (from 20 kW to 100 kW+)
- Lower data center footprint, fewer server racks and interconnects
- 30-50% less energy use and cost
- 10-20% increase in computing power of liquid-cooled processors and GPUs
- Increased equipment reliability
- Higher processor power density
- Fewer pieces of critical equipment in the data hall
- Simplified electrical and mechanical topology
- Faster go-to-market: reduced site and structural construction compared to a traditional build
- Reduction or elimination of fan vibration
- Smaller CAPEX, and only a fraction of a traditional data center's OPEX
- Decreased Total Cost of Ownership (TCO)
- Reuse of otherwise wasted heat
There is certainly a lot to improve in terms of general knowledge about liquid cooling technologies. I would appeal to the respectable senior managers expressing their opinions in the article: we are at your service. Liquid cooling is a known and mature technology, but it was tied to very specific systems (HPC and mainframe systems), so it was never widely recognised - until now. With more GPUs, and with today's 300 W and tomorrow's 400 W chips, we are watching a "back to the future" moment for liquid cooling. Yes, there are new, open, safe, flexible liquid cooling systems, and they are worth considering! The best way to form an educated opinion on the current state of the technology is to get hands-on with it through POC systems. I am personally interested in any specific project and would love to discuss your experiences and requirements. If OVH can fit 30-50 thousand servers in a 25x25 m space, you can too. Example: the OVH Cubic data centre, located near the Roubaix centre, has 6 levels with 96 racks on each level, 576 racks in total. Depending on infrastructure density - 60 to 90 servers per rack - that gives between 30 and 50 thousand servers. This is why you'll be cooling your servers with liquid. Soon.
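
For completeness, a quick check of the OVH figures quoted above:

    # Verify the quoted server counts for the OVH Cubic example.
    LEVELS = 6
    RACKS_PER_LEVEL = 96
    racks = LEVELS * RACKS_PER_LEVEL            # 576 racks
    low, high = racks * 60, racks * 90          # 60-90 servers per rack
    print(f"{racks} racks -> {low:,} to {high:,} servers")

That gives 34,560 to 51,840 servers, matching the quoted 30-50 thousand range.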