Liquid Cooling Technologies Driving Data Centre Efficiency


Dr Kelley Mullick joins Iceotope Technologies
Iceotope Technologies has announced the appointment of Kelley A. Mullick, PhD, as Vice President, Technology Advancement and Alliances. Recognised for her expertise in immersion and cold plate liquid cooling, Kelley joins the company from Intel Corporation, where she worked in product management and strategy for the data centre and AI group and developed Intel’s first immersion cooling warranty, announced at Open Compute Project (OCP) 2022. She holds a BSc in Chemistry and Biology from Walsh University, an MSc in Chemical Engineering from the University of Akron and a PhD in Chemical Engineering from Ohio State University.

David Craig, CEO, Iceotope Technologies, says, “Kelley is a welcome addition to the Iceotope team. She joins us as the market is turning increasingly to liquid cooling to solve a range of challenges, from increasing processor output and efficiency to delivering greater data centre space optimisation and reducing the energy waste and inefficiencies associated with air-cooling for greater data centre sustainability. Kelley is a dynamic and results-oriented problem solver who brings solid systems engineering know-how. With many industry accolades, she is also a champion for diversity and inclusion, having personally developed initiatives for women and under-represented minorities.”

Kelley says, “As a systems engineer I fixate on technical requirements in tandem with business requirements to drive solutions. Today, existing challenges to mitigate against the climate emergency are joined by the technological expedients of AI applications such as ChatGPT. These compute-intensive operations need the support of compute-intensive infrastructure. The limitations and inefficiencies of air cooling are well known. Only precision immersion liquid cooling can meet the environmental needs of all processor board components in a familiar form factor that fits with the way we design data centres and carry out moves, adds and changes.

“With the focus on sustainability at Intel, I became familiar with all types of liquid cooling. When I appraised Iceotope’s technology, I saw complete differentiation from anything else in the market. In addition to all the benefits of liquid cooling, it offers high levels of heat reuse, almost completely eliminates the use of water, and offers greater compute density and scalability than other solutions like cold-plate and tank immersion. It is the technology of the future that I want to invest my calories in.”

Kelley to build out Iceotope’s ecosystem

With responsibility for building and maintaining alliances with OEMs and technology partners, Kelley’s role will also make Iceotope technology more accessible to the wider market. The company currently has alliances with leading global vendors including IT giants HPE and Lenovo, physical infrastructure manufacturers nVent and Schneider Electric, and technology supply chain specialist Avnet. As things stand, Iceotope precision liquid cooling solutions can be supplied with a warranty almost anywhere around the globe. By augmenting its ecosystem with additional technology and channel partners, Iceotope can build upon its ease of installation and use to make precision liquid cooling the first choice for new data centre developments as well as for upgrading existing facilities, as operators strive for greater cooling efficiency and reliability and increased operational sustainability.
Engineered to cool the whole IT stack from hyperscale to the extreme edge, Iceotope’s patented chassis-level precision liquid cooling offers up to 96% water reduction, up to 40% power reduction, and up to 40% carbon emissions reduction per kW of ITE.

Kelley, a champion for minorities in tech

Kelley is passionate about diversity and inclusion. She has worked throughout her career to help prepare and resource women, as well as other under-represented minorities, to be confident and successful in their own careers. In addition to creating programmes in the workplace, she has also invested her personal time in developing free-to-access online materials in support of greater equality in the workforce.

Showcase the next generation in modular data centres
Mission Critical Facilities International (MCFI) is collaborating with Iceotope and nVent at Supercomputing 2022 (SC22), held November 13-18 in Dallas at the Kay Bailey Hutchison Convention Center. Together, the companies will showcase the features and benefits of their prefabricated all-in-one data centre solutions, highlighting fully-integrated, precision immersion liquid cooling solutions. MCFI’s liquid-cooled containers allow precision immersion liquid cooling to be deployed as a stand-alone solution in any location and climate - even at the far edge.

MCFI is leading the next generation of modular/prefabricated data centres with its customisable GENIUS solutions as well as MicroGENIUS, a sustainable microgrid communications shelter that delivers efficient, grid-independent energy solutions. Both solutions provide for reduced CapEx and OpEx costs, enhanced speed to market, global repeatability and scale, sustainable designs, reduced-carbon building materials and zero-emission technology.

MCFI’s energy-efficient, scalable and cost-effective containerised/prefabricated data centre solution features innovative integrations with Iceotope’s precision immersion technology and nVent’s electrical connection and protection solutions. The MCFI solution allows for high-density computing anywhere, combining high-density loads alongside standard IT loads. It also eliminates mechanical cooling in the data centre while maximising free cooling to reduce energy consumption and cost, by applying a hybrid water cooling technique that utilises the return water from the rear door heat exchangers to feed the Iceotope precision immersion technology.

The alliance is beneficial to enterprise data centres, high-performance and edge computing, smart manufacturing, content delivery, telemedicine, AI and virtual reality. The combined solution reduces execution complexities, lowers costs and eases the implementation of liquid cooling in retrofit and new build environments. “The MCFI, Iceotope and nVent relationships further exemplify the importance of collaborative commitments in developing innovative and sustainable solutions for the future of digital infrastructure and our planet,” says Patrick Giangrosso, Vice President at MCFI.

Visit MCFI, Iceotope and nVent at SC22, Booth 427, for a deeper dive into modular data centre solutions and the latest innovations in liquid cooling technologies.

Rising temperatures highlight need for liquid cooling systems
The rising frequency of extreme weather periods in Europe necessitates a move towards liquid cooling systems, suggests a sector expert. This warning follows record-breaking temperatures in the UK last month, with some locations exceeding 40°C. As a result, a number of high-profile service providers in the country experienced outages that impacted customer services, the effects of which were felt as far away as the US. One operator attributed the failure to ‘unseasonal temperatures’. However, with the UK Met Office warning that heatwaves are set to become more frequent, more intense and longer-lasting, Gemma Reeves, Data Centre Specialist at Alfa Laval, believes that data centres will need to transition to liquid cooling systems in order to cope.

She says: “The temperatures observed last month are a sign of what is to come. Summers are continuing to get hotter by the year, so it’s important that data centres are able to manage the heat effectively.

“Mechanical cooling methods have long been growing unfit for the needs of the modern data centre, with last month’s weather only serving to highlight this. As both outside temperatures and rack densities continue to rise, more efficient approaches to cooling will clearly be necessary.”

Traditional mechanical cooling systems make use of an electrically powered chiller, which creates cold air to be distributed by a ventilation system. However, most mechanical cooling systems in the UK are designed for a maximum outdoor temperature of 32°C - a figure which continues to be regularly exceeded.

Gemma believes that liquid cooling can solve this challenge. Cooling with dielectric fluid rather than air means that the cooling systems can be run at much higher temperatures. Liquid cooling approaches such as direct-to-chip, single-phase immersive IT chassis or single-phase immersive tub allow the servers to remain cool despite much higher outdoor air temperatures, while maintaining lower energy consumption and providing options for onward heat reuse. In studies, this has also been shown to increase the lifetime of servers thanks to the more stable thermal conditions.

Gemma concludes: “The data centre sector remains in an era of air-based cooling. That said, July’s heatwave may be the stark reminder the sector needs that these systems are not sustainable in the long term.

“Liquid cooling is truly the future of data centres. This technique allows us to cool quicker and more efficiently than ever before, which will be a key consideration with temperatures on the rise.”

DataQube Global has the rights to market products of LiquidCool Solutions
DataQube Global has announced that it has obtained exclusive rights to market products of LiquidCool Solutions (LCS) in various markets around the globe. In addition, DataQube has agreed to make an investment in LCS.

The agreement covers LCS’ ZPServer as well as the newly launched miniNODE, a next-generation sealed cooling solution for harsh environments, developed using eco-friendly dielectric fluids and intended for mission-critical infrastructures where reliability, low maintenance and equipment longevity are key. DataQube Global is planning to deploy LCS’ miniNODE across its portfolio of edge data centre solutions by the end of 2022, to assist clients in deploying edge technology.

Incorporating LCS’ immersive cooling technology into the design architecture of DataQube Global’s edge data centre products delivers a range of operational and performance advantages, including low maintenance, reduced downtime and extended component shelf life, along with the ability of LCS’ novel miniNODE cooling solution to deliver 1,400 times more cooling power than air. The LCS technology fully supports DataQube in its mission to deploy edge data centre systems that are eco-friendly. Unlike other solutions, DataQube Global’s unique person-free layout reduces power consumption and CO2 emissions by up to 50%, as the energy transfer is primarily dedicated to powering computers. Exploiting next-generation cooling technologies such as those developed by LCS offers the potential to reduce these figures further.

“We have already secured a major deal in the US to augment our presence in North America,” says David Keegan, Group CEO of DataQube Global. “Investing in LiquidCool Solutions cements our position as a serious player in the data centre industry and a force to be reckoned with.”

“We are extremely happy to formalise our relationship with DataQube Global. Their rapidly expanding presence in edge computing and harsh environment markets provides LCS new opportunities and complements the growth plans of DataQube. The relationship with DataQube is a key element for introducing our patented chassis-based single-phase immersion technology to the burgeoning edge and data centre markets,” concludes Ken Krei, CEO of LiquidCool Solutions.

Inspur Information and JD Cloud launch liquid-cooled server
Inspur Information and JD Cloud have announced they have jointly launched the liquid-cooled rack server ORS3000S. The server utilises cold-plate liquid cooling to reduce data centre power consumption by 45% compared to traditional air-cooled rack servers, making it a green solution that dramatically reduces total cost of ownership (TCO).

Cold-plate liquid cooling technology allows the ORS3000S to improve heat dissipation efficiency by 40%. It adopts a centralised power supply design with N+N redundancy that is capable of meeting the demands of whole-rack power supply, and can function at the highest efficiency throughout operation due to power balance optimisation. This results in an overall efficiency increase of 10% when compared to a distributed power supply. Pre-installation at the factory, plus efficient operations and maintenance (O&M), allows for 5-10x faster delivery and deployment. The ORS3000S has been widely deployed in JD Cloud data centres, providing computing power support for JD during major shopping events. It brings a performance increase of 34–56% while minimising power usage effectiveness (PUE), carbon emissions and energy consumption.

Inspur Information has been a pioneer in direct and indirect cooling. With new heat conduction technologies such as phase-change temperature uniformity, micro-channel cooling and immersion cooling, Inspur achieves a 30–50% optimisation in the comprehensive energy efficiency of the cooling system. This is achieved via cooling improvements throughout the server design, including a micro/nano-cavity, phase-change, and uniform temperature design for high-power components such as the CPU and GPU. This improves heat dissipation performance by 150% compared to traditional air cooling technologies.

Experienced in the industrial application of liquid cooling, Inspur has built one of the world’s largest liquid-cooled data centre production facilities with an annual manufacturing capacity of 100,000 servers. This includes a full-chain liquid-cooling smart manufacturing solution covering R&D, testing, and delivery for the mass production of cold-plate liquid-cooled rack servers. As a result, the PUE for data centres is less than 1.1, and the entire delivery cycle takes five to seven days.

Inspur Information’s cold-plate, heat-pipe, and immersion liquid-cooled products have been deployed at a large scale. In addition, Inspur offers complete solutions for liquid-cooled data centres, including primary and secondary liquid cooling circulation and the coolant distribution unit (CDU). This total solution enables full-path liquid cooling circulation for data centres, with the overall PUE reaching the design limit of less than 1.1. Inspur holds more than 100 core patents in liquid cooling and has participated in the formulation of technical standards and test specifications for cold-plate and immersion liquid-cooled products in data centres. The company is committed to and will continue to lead the rapid development of the liquid cooling industry and the large-scale application of innovative liquid cooling technology.
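Power usage effectiveness (PUE) is the ratio of total facility energy to the energy delivered to the IT equipment, so the sub-1.1 figure cited above implies less than 10% overhead on top of the IT load. The short sketch below is a minimal illustration of that arithmetic; the 1,000 kW IT load is an assumed example, not a figure from the announcement.

```python
def pue(total_facility_kw: float, it_kw: float) -> float:
    """Power usage effectiveness: total facility power divided by IT power."""
    return total_facility_kw / it_kw

# Assumed example IT load (not from the announcement).
it_load_kw = 1000.0
# A PUE below 1.1 caps total facility power at 1.1x the IT load.
max_total_kw = it_load_kw * 1.1
overhead_kw = max_total_kw - it_load_kw  # budget for cooling and power distribution

print(f"PUE at the 1.1 ceiling: {pue(max_total_kw, it_load_kw):.2f}")
print(f"Overhead budget for cooling and power distribution: {overhead_kw:.0f} kW")
```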

DCNN Exclusive: Making sustainability gains with liquid cooling
This piece was written by Stuart Crump, Director of Sales at Iceotope Technologies Limited, on how liquid cooling could be vital in the race to net zero.

Environmental, Social and Governance (ESG) objectives have started to drive data centre business goals as the world transitions to a low carbon economy. Sustainability is no longer viewed simply as a cost on business; indeed, many customers are now using sustainability as a criterion for vendor selection. Positive action to reduce emissions is not only good for the planet, it’s also good for business, and it will signpost efficient data centres to an enlightened market.

New developments in liquid cooling can assist data centre sustainability targets by significantly reducing facility energy consumption for mechanical services, decreasing water use, and providing a platform for high-grade reusable heat. Together, the characteristics of liquid cooling add up to bottom-line benefits as well as ecological advantages for data centre operators, helping deliver competitive advantage in this highly commercialised sector.

According to the IEA, data centres account for around 1% of global electricity demand. While data centre workloads and internet traffic have multiplied dramatically since 2015, energy use has remained relatively flat. However, demand for more digital services is growing at an astounding rate. For every bit of data that travels the network, a further five bits are transmitted within and among data centres. Immersion liquid cooling can greatly benefit data centre sustainability by reducing overall cooling energy requirements by up to 80%.

Data centre operators and customers now understand that air-cooled ITE environments are reaching the limits of their effectiveness. As compute densities increase, the energy demands of individual servers and racks spiral upwards. Legacy air-cooled data halls cannot move the volume of cool air through the racks required by the latest CPU and GPU systems to maintain operating temperature. This means operators must have a plan that includes liquid cooling if these sites are to remain viable.

Liquid cooling techniques such as precision immersion cooling circulate small volumes of a harmless dielectric compound across the surface of the server, removing almost 100% of the heat generated by the electronic components. The latest solutions use a sealed chassis that enables IT equipment, including servers and storage devices, to be easily added or removed from racks with minimal disruption and no mess. Precision liquid cooling removes the requirement for server fans by eliminating the need to blow cool air over the IT components.

Removing air cooling infrastructure from data centres also removes the capital expense of some cooling plant, as well as the operational costs of installation, power, servicing and maintenance. Removal of fans and plant not only produces an immediate benefit in terms of reducing noise in the technical area, it also frees up useful space in racks and cabinets as well as in plant rooms. Space efficiency equates either to facilities which are smaller in physical footprint, or to the ability to host larger numbers of high density racks. Importantly, precision liquid cooling provides futureproof, scalable infrastructure to meet the provisioning requirements of tomorrow’s workloads and storage needs.

Precision cooling and data centre water use

The media reports widely on the lack of clean water for irrigation and consumption in drought-hit areas around the world.
However, what has sometimes been called the data centre’s ‘dirty little secret’ is the volume of potable water required to operate certain data centres. Many air-cooled data centres need water, and lots of it. A small 1MW data centre using a conventional air-cooling process can use around 25.5 million litres of water every year. With mainly air-cooled processes, the data centre industry is currently consuming billions of litres of water each year. On the other hand, precision immersion liquid cooling consumes zero water in most cases and can be installed anywhere – including many existing data centres. Even allowing for maintenance and water loop refreshes, a switch to precision liquid cooling can easily reduce data centre water use by more than 95%.

The benefit of all this hot air…

Creating a revenue generator from a cost item on the balance sheet is the ultimate dream come true. Currently, air-cooled data centres eject heat into the atmosphere in the vast majority of cases. Liquid cooling techniques which capture and remove high-grade heat from the servers offer the capability to redirect this heat to district heating networks, industrial applications and other schemes. Using well-established techniques, this revenue stream, or sustainability project, could help to heat industrial sites and local facilities such as schools and hospitals.

Climate change, government intervention with emission standards, and public and investor pressure have helped drive change in the wider data centre business outlook. Savings and new revenue streams that benefit an organisation’s sustainability credentials warrant a critical review of their cost/benefit. There is the opportunity for data centres to move away from previous notions of how data centres operate towards much greater efficiency and sustainable operations.
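As a rough back-of-envelope check of the figures above, the sketch below converts the quoted 25.5 million litres per year for a 1MW air-cooled site into a water-usage-effectiveness (WUE) style figure and applies the quoted 95% saving. The assumption of 8,760 full-load hours per year is a simplification for illustration only.

```python
# Back-of-envelope check of the water figures quoted above.
annual_water_l = 25.5e6     # litres/year for a 1 MW air-cooled site (quoted figure)
it_load_kw = 1000.0         # 1 MW IT load
hours_per_year = 8760       # assumes full load all year round (simplification)

annual_it_kwh = it_load_kw * hours_per_year
wue_l_per_kwh = annual_water_l / annual_it_kwh      # litres per kWh of IT energy
water_after_saving_l = annual_water_l * (1 - 0.95)  # the quoted >95% reduction

print(f"Implied WUE: {wue_l_per_kwh:.2f} L/kWh")    # roughly 2.9 L/kWh
print(f"Water use after a 95% cut: {water_after_saving_l / 1e6:.2f} million litres/year")
```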

EuroEXA reports European Exascale programme innovation
At launch, one of the largest projects ever funded by EU Horizon 2020, EuroEXA aimed to develop technologies to meet the demands of Exascale high-performance computing (HPC) requirements and provide a ground-breaking platform for breakthrough processor-intensive applications. EuroEXA brought together expertise from a range of disciplines across Europe, from leading technologists to end-user companies and academic organisations, to design a solution capable of scaling to a peak performance of 400 PetaFLOPS, with a peak power system envelope of 30MW that approaches PUE parity using renewables and chassis-level precision liquid cooling.

Dr Georgios Goumas, EuroEXA Project Coordinator, says, “Today, High-Performance Computing is ubiquitous and touches every aspect of human life. The need for massively scalable platforms to support AI-driven technology is critical to facilitate advances in every sector, from enabling more predictive medical diagnoses, treatment and outcomes to providing more accurate weather modelling so that, e.g., agriculture can manage the effects of climate change on food production.”

EuroEXA demonstrates EU innovation on an equal footing with RoW

Meeting the need for a platform that answers the call for increased sustainability and a lower operational carbon footprint, the 16-partner strong coalition delivered an energy-efficient solution. To do so, the partners overcame challenges throughout the development stack, including energy efficiency, resilience, performance and scalability, programmability, and practicality. The resulting innovations enable a system that is more compact and cooler, reducing both the cost per PetaFLOPS and its environmental impact; is robust and resilient across every component, managing faults without extended downtime; provides a manageable platform that will continue to deliver Exascale performance as it grows in size and complexity; and harnesses open-source systems to ensure the widest possible range of applications, keeping it relevant and able to impact real-world applications.

The project extended and matured leading European software stack components and productive programming model support for FPGA and exascale platforms, with advances in Maxeler MaxJ, OmpSs, GPI and BeeGFS. It built expertise in state-of-the-art FPGA programming through the porting and optimisation of 13 FPGA-accelerated applications, in the HPC domains of Climate and Weather, Physics and Energy, and Life Sciences and Bioinformatics, at multiple centres across Europe.

EuroEXA innovation being applied today at ECMWF and Neurasmus

A prototype of a weather prediction model component extracted from ECMWF’s IFS suite demonstrated significantly better energy-to-solution than current HPC nodes, achieving a 3x improvement over an optimised GPU version running on an NVIDIA Volta GPU. Such an improvement in execution efficiency provides an exciting avenue for more power-efficient weather prediction in the future. Further successful outcomes were achieved through the healthcare partnership with the Neurasmus programme at the Amsterdam UMC, where brain activity is being investigated. The platform was used to generate more accurate neuron simulations than was previously possible, helping to predict healthcare outcomes for patients more accurately.
EuroEXA legacy – extensive FPGA testbed an aid to further developments

Outcomes generated include deploying what is believed to be the world’s largest network cluster of FPGA (Field Programmable Gate Array) testbeds, configured to drive high-speed multi-protocol interconnect, with Ethernet switches providing low latency and high switching bandwidth. The original proposal was for three FPGA clusters across the European partnership. However, COVID-19 travel restrictions necessitated an increased resource of 22 testbeds, developed in various partner locations. This benefited the project by accelerating development through the massively increased permutations and iterations available, and has also provided a blueprint for several partners to develop high-performance FPGA-based technologies. Partners in the programme have committed to further technology developments to support the advances made by the EuroEXA project, which are now targeted at other applications.

Leading partners join forces with Equinix to test sustainable data centre innovations
Equinix has announced the opening of its first Co-Innovation Facility (CIF), located in its DC15 International Business Exchange (IBX) data centre at the Equinix Ashburn Campus in the Washington, D.C. area. A component of Equinix’s Data Centre of the Future initiative, the CIF is a new capability that enables partners to work with Equinix on trialling and developing innovations. These innovations, such as identifying a path to clean hydrogen-enabled fuel cells or deploying more capable battery solutions, will be used to help define the future of sustainable digital infrastructure and services globally.

Sustainable innovations, including liquid cooling, high-density cooling, intelligent power management and on-site prime power generation, will be incubated in the CIF in partnership with leading data centre technology innovators including Bloom Energy, ZutaCore, Virtual Power Systems (VPS) and Natron. In collaboration with Equinix, these partners will test core and edge technologies with a focus on proving reliability, efficiency and cost to build. These include:

Generator-less and UPS-less Data Centres (Bloom Energy) – utilising on-site solid oxide fuel cells enables the data centre to generate redundant cleaner energy on-grid, and potentially eliminates the need for fossil fuel-powered generators and power-consuming Uninterruptible Power Supply (UPS) systems.

High-Density Liquid Cooling (ZutaCore) – highly efficient, direct-on-chip, waterless, two-phase liquid cooled rack systems, capable of cooling upwards of 100kW per rack in a light, compact design. Eliminates the risk of IT meltdown, minimises the use of scarce resources including energy, land, construction and water, and dramatically shrinks the data centre footprint.

Software-Defined Power (VPS) with cabinet-mounted Battery Energy Storage (Natron Energy) – cabinet power management and a battery energy storage system manage power draw and minimise power stranding to near zero per cent, leading to a potential 30-50% improvement in power efficiency.

“ZutaCore is honoured to be featured at the CIF and to partner with Equinix to advance the proliferation of liquid cooling on a global scale,” says Udi Paret, President of ZutaCore. “Together we aim to prove that liquid cooling is an essential technology in realising fundamental business objectives for data centres of today and into the future. HyperCool liquid cooling solutions deliver unparalleled performance and sustainability benefits to directly address sustainability imperatives. With little to no infrastructure change, it consistently provides easy to deploy and maintain, environmentally friendly, economically attractive liquid cooling to support the highest core-count, high power and most dense requirements for a range of customer needs from the cloud to the edge.”

The adoption of alternative data centre cooling to keep climate change in check
DataQube, together with Primaria, is championing the adoption of alternative data centre cooling refrigerants in response to European regulations to phase out greenhouse gases. Field trials are currently underway to establish the feasibility of replacing legacy HFC (hydrofluorocarbon) coolants with a next-generation refrigerant that efficiently carries heat and delivers a lower environmental impact.

The two main refrigerants currently used in data centre cooling systems are R134a and, especially, R410a. Whilst both have an ozone depletion potential (ODP) of zero, their global warming potential (GWP) ratings of 1430 and 2088 respectively are over a thousand times higher than that of carbon dioxide. R-32, on the other hand, has efficient heat-conveying capabilities that can reduce total energy usage by up to 10%, and due to its chemical structure has a GWP rating that is up to 68% lower, at just 675.

“The environmental impact of the data centre industry is significant, estimated at between 5-9% of global electricity usage and more than 2% of all CO2 emissions,” says David Keegan, CEO of DataQube. “In light of COP 26 targets, the industry as a whole needs to rethink its overall energy usage if it is to become climate neutral by 2030, and our novel system is set to play a major part in green initiatives.”

“For data centre service providers it’s important that their operations are state of the art when it comes to energy efficiency and the GWP of the refrigerants used, since it impacts both their balance sheet and their sustainability,” comments Henrik Abrink, Managing Director of Primaria. “With the development and implementation of R-32 in the DataQube cooling units we have taken a step further to deliver high added value on both counts, in a solution that is already proving to be the most energy-efficient edge data centre system on the market.”

Unlike conventional data centre infrastructure, DataQube’s unique person-free layout reduces power consumption and CO2 emissions by as much as 56%, as the energy transfer is primarily dedicated to powering computers. Exploiting next-generation cooling products such as R-32, together with immersive cooling in its core infrastructure, offers the potential to reduce these figures further. DataQube’s efficient use of space, combined with optimised IT capacity, makes for a smaller physical footprint because less land, raw materials and power are needed from the outset. Moreover, any surplus energy may be reused for district heating, making the system truly sustainable.
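The percentage comparison above can be reproduced directly from the quoted GWP ratings (carbon dioxide has a GWP of 1 by definition). A minimal check:

```python
# GWP ratings quoted above; CO2 has a GWP of 1 by definition.
gwp = {"R134a": 1430, "R410a": 2088, "R-32": 675}

for legacy in ("R134a", "R410a"):
    reduction = 1 - gwp["R-32"] / gwp[legacy]
    print(f"R-32 vs {legacy}: GWP is {reduction:.0%} lower")
# R-32 vs R410a works out at roughly 68%, matching the 'up to 68% lower' claim.
```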

The latest trends and developments in cooling solutions
This article was contributed to DCNN by nVent, on the latest trends and developments in cooling.

For facility planners, thermal engineers, architects and managers responsible for implementing powerful IT Equipment (ITE), the goal is to maintain high availability at minimal operational cost, while minimising energy consumption. To do so, all equipment must be kept below a specified temperature range. However, legacy cooling in data centres, server rooms and other IT environments uses technology based on traditional air conditioning systems that alone struggle to keep up with the rising heat load demands of today’s high-density, high-performance connected technologies. Consequently, sustainability gets sacrificed, and facilities can experience equipment failures, unplanned downtime and soaring energy costs.

To run the latest and greatest IT equipment, liquid cooling has become the standard, and high-performance servers are designed with liquid cooling installed. In the right applications, these solutions offer a strong return on investment and total cost of ownership when compared to non-liquid approaches. In recent years, the range of liquid cooling solutions has advanced to help meet the unique protection needs of applications in virtually any environment. Whether for smaller decentralised edge computing, harsh environments or large data centre installations, no one-size-fits-all approach to thermal management exists. This article shares key considerations for choosing among the comprehensive range of standard and customised air, indirect and direct water-cooling solutions. It also provides guidance on identifying which solution will best protect your ITE assets and your bottom line.

Understanding the range of cooling solutions

The advanced thermal management solutions that exist today offer the breadth, flexibility and modularity needed to meet unique application needs and address a range of ITE heat-load challenges. For reference, the primary cooling approaches include:

Air cooled – heat is transferred directly to the room air and cooled via traditional data centre cooling.

Indirect water-cooled – heat is transferred indirectly to water through an air-to-water heat exchanger located within the row or a single cabinet.

Direct water-cooled – heat is transferred directly to an attached heat transfer component, such as a cold plate.

Hybrid direct and indirect water-cooled – selective cooling of the highest energy-consuming components with direct contact liquid cooling, while the balance of the cabinet is cooled via a secondary air-to-water cooling device, such as a Rear Door Cooler (RDC).

Air cooling is becoming less feasible in high-density data centres, as heat loads increase and server racks become so densely configured that air circulation is impeded. Today’s air cooling solutions generally can manage 10 kilowatts or less cost-effectively. Data centres that try to cope by increasing air velocity can quickly become a wind-tunnel-like environment that is difficult to work in. With this in mind, as energy needs increase, so does the likelihood of needing a liquid cooling component in your thermal management strategy. Liquid cooling systems offer effective solutions for achieving required temperature parameters and lowering the energy consumption of the cooling system, thus lowering operating costs. Liquid provides a much greater heat transfer capacity – around 3,500 times higher than that of air – because it is far denser than air.
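The "roughly 3,500 times" figure can be sanity-checked from standard property values for water and air at room temperature; the densities and specific heat capacities below are textbook approximations (assumptions, not figures from the article).

```python
# Volumetric heat capacity comparison, water vs air at roughly 20 degrees C.
# Property values are standard textbook approximations (assumed).
water_density_kg_m3 = 998.0
water_cp_j_kgk = 4186.0
air_density_kg_m3 = 1.2
air_cp_j_kgk = 1006.0

water_vol_heat = water_density_kg_m3 * water_cp_j_kgk  # J/(m^3*K)
air_vol_heat = air_density_kg_m3 * air_cp_j_kgk        # J/(m^3*K)

print(f"Water: {water_vol_heat / 1e6:.2f} MJ per m^3 per K")
print(f"Air:   {air_vol_heat / 1e3:.2f} kJ per m^3 per K")
print(f"Ratio: roughly {water_vol_heat / air_vol_heat:,.0f}x")  # on the order of 3,500x
```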
Consequently, direct-contact liquid cooling (direct-to-chip), in which a coldplate is placed directly on processors inside the server, offers the highest efficiencies. The coldplate has internal micro channels and an inlet and outlet through which liquid is circulated to carry away heat. Low-profile coldplates used in direct-contact liquid cooling also have the advantage of taking up much less rack space than traditional heat sinks. However, while liquid cooling offers huge advantages in moving heat, managing the entire heat load of the rack with liquid cooling methods can be unnecessary and cost-prohibitive for some applications. In many cases, a hybrid solution – combining both liquid and air cooling – is a more accessible and scalable deployment option that effectively leverages the highly efficient heat transfer of liquids. For example, more current cooling designs include aisle containment and rack-based cooling. These models increase efficiency and often incorporate an air-to-liquid heat transfer to leverage the higher heat transfer qualities of liquids.

Choosing the right solution for your application

To determine the most efficient and effective cooling technologies and layouts for your specific IT hardware, it is important to evaluate the current and future thermal profile of the environment, and to model the necessary infrastructure modifications or layout changes. As you do, also consider these important factors:

Existing equipment capabilities. The placement of IT equipment, air handlers, close-coupled cooling and direct liquid cooling technologies within the data centre, server room or other ITE environment is critical to the efficient use of available space and cooling capacity.

Current cooling needs and anticipated future ones, taking into account potential technological advances, business growth and other objectives.

Current, pending and trending global standards as well as relevant regulations.

Resources versus return on investment and total cost of ownership.

Lead time for facility upgrades and capital planning requirements mandates comprehensive planning. For example, a university or research facility may only have 6-10 racks, but require liquid cooling because of its high-performance computing power.

In one recent case, a global IT original equipment manufacturer needed robust cooling for its data centre. In comparing an air cooling solution and a liquid one, it found some critical differences. Both solutions work, but the liquid cooling solution offered significant footprint and energy efficiency advantages. It also set the OEM up for future upgrades as advances in technology increase performance power and density needs.

Depending on your specific application and environmental needs, myriad solutions exist, so you need to weigh the options to determine the most appropriate cooling solution. Once you have determined the right cooling solution for your application, you will need to gather relevant data on your facility requirements to help ensure a smooth, streamlined installation. This will entail providing detailed information on the equipment, configuration, thermal attributes and design needs of your primary loop, secondary loop and architecture, as well as any unique requirements.
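Drawing on the cooling approaches listed earlier and the roughly 10kW guideline for cost-effective air cooling, the hypothetical helper below sketches how rack density might map to an approach. Only the 10kW figure comes from the article; the other thresholds and the function itself are illustrative assumptions, not nVent guidance.

```python
def suggest_cooling_approach(rack_kw: float) -> str:
    """Hypothetical rack-density-to-cooling-approach mapping, for illustration only."""
    if rack_kw <= 10:
        # ~10 kW is the article's stated practical limit for cost-effective air cooling.
        return "air cooled"
    if rack_kw <= 30:
        # Threshold is an illustrative assumption, not a figure from the article.
        return "indirect water-cooled (air-to-water heat exchanger in the row or cabinet)"
    if rack_kw <= 60:
        # Threshold is an illustrative assumption.
        return "hybrid direct and indirect water-cooled (cold plates plus rear door cooler)"
    return "direct water-cooled (cold plate on the highest-power components)"

for density in (5, 20, 45, 80):
    print(f"{density} kW/rack -> {suggest_cooling_approach(density)}")
```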
You will also want to create a routine maintenance plan. For liquid cooling solutions, this will include scheduling required inspections and ordering replacement parts in advance to make sure the fluid in the system stays within its safe operating range.


