Discussing Liquid Cooling in Data Centres with David O’Reilly, Vice President and General Manager, Secure Power, Schneider Electric
September 20, 2019
By Owen Hurst
As Vice President and General Manager of Schneider Electric’s Secure Power division, David O’Reilly draws on his many years of executive experience to lead a national team focused on delivering unique energy management solutions.
David’s experience with companies ranging from the Fortune 100 to the Fortune 1000 uniquely positions him to work closely with clients across many fields to ensure they achieve their desired outcomes. Panel Builder & Systems Integrator had the opportunity to ask David a few questions about the potential of liquid cooling in data centres, which are rapidly increasing in both size and number.
What advantages do liquid cooling options offer, particularly for the large-scale data centres that are becoming the norm?
Computing is rapidly becoming more complex, and chip power densities in both GPUs and CPUs are increasing accordingly. Higher power densities translate to greater power draw and heat production, which demand a larger volume of space for air-cooled heatsinks, an inefficient use of already limited space in a data centre.
Liquid cooling solutions offer a more compact way to manage increased heat in advancing data centres. This allows users to simplify their physical infrastructure, moving away from complex, large air-cooling layouts to simple case-based liquid coolers and radiators. Further, even basic liquid cooling solutions rival, and often outperform, the best air-cooling technology in effectiveness and efficiency, offering long-term savings for operators of both large and small facilities.
Are liquid cooling options available to large-scale centres as well as smaller data centre operations?
There are several liquid cooling options available today that address both ends of the IT spectrum – from Hyperscale to Edge. These range across several technologies, from immersion cooling to direct-to-chip systems.
Standard liquid cooling systems fit within two fundamental classifications: single-phase, in which the fluid remains in a liquid state, and two-phase, in which the fluid partially or fully boils on the hot side and condenses back to liquid on the cool side. Adding further flexibility, several types of fluids are available, including water, oils, and engineered dielectric fluids.
This white paper, Liquid Cooling Technologies for Data Centres and Edge Applications, offers a more in-depth look at the main technologies available today.
What level of maintenance do liquid cooling options require?
The level of maintenance required for a liquid cooling system varies depending on how it is set up, but compared with an air-cooling system, it introduces several additional tasks, some more complex than others. If water is used, it’s important to periodically check fluid chemistry and perform other tasks typical of water-loop maintenance, such as checking strainers and verifying correct fluid levels.
If a solution uses dielectrics, where the IT equipment is fully submerged, it will require its own maintenance tasks, such as checking that submerged components remain properly sealed against the liquid.
Are Schneider’s liquid cooling options available for data centres that use enclosures by different manufacturers? How compatible or flexible are the liquid cooling options?
Schneider Electric is working to provide solutions that enable all types of liquid cooling at the edge and in large data centres, both retrofitted and new. For example, one Schneider Electric partner, Iceotope, is a start-up out of the United Kingdom that provides a chassis-based immersion solution allowing IT equipment to be integrated into any compatible rack.
Are there any particular industry segments (healthcare, institutional, manufacturing, etc.) that utilize data centres that have embraced the technology more than others? Are there any segments that are lagging behind?
High-performance computing has used different forms of liquid cooling for many years. For example, blockchain mining has recently deployed liquid cooling to improve efficiency, while hyperscale operators are deploying liquid cooling at small scale for GPU-accelerated applications and are now investigating deployment at a larger scale.
Unique applications such as harsh-environment and light-industrial settings are expected to leverage liquid cooling, as immersion technology removes the delicate fans of an air-cooled system and seals the rack completely.
Finally, any segment that will use AI or GPU-intensive applications like machine learning will likely look to adopt liquid cooling due to GPUs’ higher power densities. Meanwhile, we expect to see enterprise data centres lag in adoption, as enterprise data centre operators are generally slower to adopt new technologies until their success has been proven over time.
What type of physical space requirements are necessary for liquid cooling options?
With liquid cooling, users can achieve compaction within the white space, where the IT equipment is installed, along with modest reductions in the gray space, or back end. Compact liquid cooling systems reduce the space taken up by the visible racks, the supporting infrastructure, and the outdoor heat rejection equipment, shrinking the necessary footprint of the entire facility and allowing for more efficient use of space.
What type of liquid is primarily utilized for liquid cooling and is this an environmentally friendly option?
At a high level, for immersed technology, there are two main categories of fluids: hydrocarbon-based (oil-based) fluids and PFC-based engineered fluids, both coming in many variations depending on the manufacturer. All are typically non-toxic, but some oil-based fluids can have flash-point and flammability issues. Meanwhile, some engineered fluids can have a high Global Warming Potential (GWP), but this is typically mitigated by minimizing release to the atmosphere during operation and maintenance.
In both cases, the cooling may be broken down into two segments: smaller volumes of engineered fluids or oils used in proximity to the IT, and larger volumes of water used to transport thermal energy outdoors.
For direct-to-chip technology, many systems deliver water directly to a cold plate on the chip. Water circulates through the plate in contact with the chip, drawing heat away as it flows. Water is the best option for cost and environmental purposes; however, cold-plate applications suffer a slow adoption rate due to fear of system leaks, and they often remove only 50 – 80% of the IT heat, compared with nearly 100% for a submerged system.
How have liquid cooling options for data centres progressed since they were first introduced? As technology in general becomes smaller in size, including data storage technology, are liquid cooling options following a similar trend?
Liquid cooling has been around for decades. IBM mainframes in the late 1970s through early 1990s used dedicated precision chillers to support cooling during compute loads. Looking at current generations, we’ve become much more efficient, using warmer water that is cooled by simply using water-to-air radiators ejecting heat directly to ambient air, eliminating the cost-intensive cooling equipment of the past.
The concept has evolved over time as well, introducing immersion cooling. This practice submerges the entire server, capturing almost all heat released by the electronics in the liquid circuit, keeping the facility much cooler and more efficient overall.
Overall, liquid cooling systems have followed the technology they are applied to. We expect this trend to continue as IT technology continues to shrink, as liquid cooling systems have proven more than capable of handling the compaction happening with IT equipment and storage.