
Liquid Cooling


Redefining liquid cooling from the server to the switch
By Nathan Blom, CCO, Iceotope

Liquid cooling has long been a focal point in discussions surrounding data centres, and rightfully so, as these facilities are at the epicentre of an unprecedented data explosion. The growth of the internet, cloud services, IoT devices, social media and AI has fuelled an unparalleled surge in data generation, intensifying the strain on rack densities and placing substantial demands on data centre cooling systems. In fact, cooling alone accounts for a staggering 40% of a data centre's total energy consumption.

However, the need for efficient IT infrastructure cooling extends beyond data centres. Enterprise organisations are also looking for ways to reduce costs, maximise revenue and accelerate sustainability objectives. Reducing energy consumption is also rapidly becoming a top priority for telcos, whose thousands of sites in remote locations make cutting maintenance costs key as well.

Liquid cooling technologies have emerged as a highly efficient solution for dissipating heat from IT equipment, regardless of the setting. Whether it's within a data centre, on-premises data hall, cloud environment, or at the edge, liquid cooling is proving its versatility. While most applications have centred on cooling server components, new applications are rapidly materialising across the entire IT infrastructure spectrum.

BT Group, in a ground-breaking move, initiated trials of liquid cooling technologies across its networks to enhance energy efficiency and reduce consumption as part of its commitment to achieving net zero status. BT kicked off the trials with a network switch cooled using Iceotope's Precision Liquid Cooling technology and Juniper Networks QFX Series Switches. With 90% of its overall energy consumption coming from networks, it's easy to see why reducing that consumption is such a high priority.
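As an illustrative sanity check (not a figure from the article), a cooling share of 40% of total energy already implies a power usage effectiveness (PUE) of roughly 1.67, under the simplifying assumption that the remaining 60% is all IT load:

```python
def pue_lower_bound(cooling_fraction: float) -> float:
    """Estimate PUE assuming cooling is the only overhead.

    PUE = total energy / IT energy. If cooling consumes `cooling_fraction`
    of the total and everything else is IT load, then the IT fraction is
    (1 - cooling_fraction), so PUE = 1 / (1 - cooling_fraction).
    """
    if not 0 <= cooling_fraction < 1:
        raise ValueError("cooling_fraction must be in [0, 1)")
    return 1.0 / (1.0 - cooling_fraction)

# Cooling at 40% of total energy implies a PUE of about 1.67,
# before counting power distribution losses, lighting, etc.
print(round(pue_lower_bound(0.40), 2))  # → 1.67
```

In practice other overheads push PUE higher still, which is why the cooling share is such a large lever.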
In a similar vein, Meta released a study last year confirming the practicality, efficiency and effectiveness of precision liquid cooling technology in meeting the cooling requirements of high-density storage disks. Global data storage is growing at such a rate that there is an increased need for improved thermal solutions. Liquid cooling for high-density storage is proving a viable alternative, as it can mitigate variances and improve consistency. Ultimately, it lowers overall power consumption and improves ESG compliance.

Liquid cooling technologies are changing the game when it comes to removing heat from the IT stack. While each of the technologies on the market today has its time and place, there is a reason we are seeing precision liquid cooling in trials that are broadening the use case for liquid cooling. It ensures maximum efficiency and reliability by using a small amount of dielectric coolant to precisely target and remove heat from the hottest components of the server. This approach not only eliminates the need for traditional air-cooling systems, but also allows for greater flexibility in designing IT solutions than any other solution on the market today. There are no hotspots that can slow down performance, no physical space wasted on unnecessary cooling infrastructure, and minimal need for water consumption.

As the demand for data increases, the importance of efficient and sustainable IT infrastructure cooling cannot be overstated. Liquid cooling, and precision liquid cooling in particular, is at the forefront of this journey. Whether it's reducing the environmental footprint of data centres, enhancing the energy efficiency of telecommunication networks, or meeting the ever-increasing demands of high-density storage, liquid cooling offers versatile and effective solutions.
These trials and applications are not just milestones; they represent a pivotal shift toward a future where cooling is smarter, greener, and more adaptable, empowering businesses to meet their evolving IT demands while contributing to a more sustainable world.

Vertiv enhances manufacturing capacity for chilled water solutions
Vertiv has unveiled an upgraded testing room at its thermal management centre near Tognana, Italy. This sizeable investment significantly increases the facility's testing capacity and manufacturing capabilities within the existing space and demonstrates the company's ongoing commitment to advancing chilled water systems to help drive liquid cooling adoption. It also shows Vertiv's support for the increasing demands on data centres, including high-performance computing, artificial intelligence (AI) and generative AI (GenAI).

The upgraded testing room will allow Vertiv to run standard and tailored tests of customer equipment spanning all of the cooling solutions in its product portfolio, both air- and water-cooled, balancing a thermal load greater than 2MW with a chamber air temperature up to 55°C. It will also be able to test units equipped with low global warming potential (GWP) refrigerants.

This upgrade comes at a critical time for the data centre industry. Operators are expanding rapidly to meet increasing capacity needs, whilst at the same time seeking to minimise their environmental impact. More than 100 European data centre operators and trade associations have signed the Climate Neutral Data Centre Pact, committing to climate neutrality by 2030. Chilled water systems play a key role in reaching this goal by enabling operators to upscale data centre capacity whilst limiting direct and indirect emissions. These systems use low-GWP refrigerants that significantly reduce direct and indirect CO2 emissions, decreasing a data centre's carbon footprint.

"Resource-efficient chilled water solutions are important to the sustainable growth of the data centre industry, and we must continue to focus on how we can evolve and improve the technologies for operators and the environment," says Karsten Winther, President EMEA at Vertiv. "We are proud of the market-leading work we have achieved for our customers.
For example, we worked with sustainability-focused colocation provider, Green Mountain, to deploy 5MW of high-efficiency chilled water cooling systems. Enhancing the testing and manufacturing capacity at our thermal management facility allows us to continue to innovate in this space and deliver even more value to the industry and our customers."

"In December, we will introduce our new Vertiv Liebert AFC high-capacity inverter screw chiller with low-GWP refrigerant, up to 2,200kW, to the EMEA market. This new testing room will enable us to test these larger capacity units," says Roberto Felisi, Senior Director, Thermal Global Core Offering and EMEA Business Leader at Vertiv. "We continue to explore opportunities to further invest in our capabilities to support projected growth and demand for thermal management systems, particularly liquid cooling solutions."

To celebrate the latest expansion, Vertiv welcomed employees' families for a special visit to the facility, including the Thermal Management Customer Experience Centre. The open day featured activities designed specifically to engage children and young people, offering insights into data centres and the significance of thermal management. Highlights included guided tours of production lines and primary laboratories, as well as Vertiv's own data centre, which provided a first-hand look at the machines in action. Depending on age, the young visitors could take part in thermodynamics workshops and explore topics such as cold generation and the behaviour of hot and cold air particles. They also got the chance to experience Vertiv's cutting-edge augmented reality applications, like the Vertiv XR app, and navigate a virtual data centre.

BT takes the plunge with new liquid cooling trials
BT Group has announced that it is trialling several liquid cooling technologies that could substantially improve energy consumption and efficiency metrics across its networks and IT infrastructure, in pursuit of its commitment to becoming a net zero business by the end of March 2031.

The group will trial precision liquid cooled network switches using a solution provided by Iceotope and Juniper Networks QFX Series switches, which are widely used in existing network cloud architectures. Ahead of the trial, the partners together demonstrated a replica 'set-up' using an HP x86 server at BT's Sustainability Festival. The demonstration showed how the power used to cool a network switch typically deployed in a data centre could be significantly reduced.

All electronic and electrical systems generate heat during operation that must be dissipated to maintain working capability. As in most large data centres, network and IT equipment across BT's estate is currently cooled using air-based systems. As network capacity and demands increase, next generation IT and network hardware will have to work harder and will become hotter. Consequently, the power needed to cool it will increase, driving up energy consumption and operational cost. BT Group is therefore exploring numerous alternative cooling techniques, and in addition to its trial with Iceotope and Juniper, the company will trial the following liquid cooling systems:

- Precision liquid cooled networking servers and data centre equipment, with Iceotope and Juniper
- Full immersion of networking servers in an immersion tank, with Immersion4
- Liquid-cooled cold plates for networking equipment in a cooling enclosure, with Nexalus
- Cooling using sprayed-on partial immersion of data centre equipment, with Airsys
Typically, these techniques bring several benefits: a 40-50% reduction in the power needed to cool systems versus air cooling; higher equipment density, saving on real estate footprint and therefore enabling further power usage reductions; and reduced material usage, cutting carbon footprint. Further, rather than being dissipated into the air, heat exhausted from liquid cooling systems can be channelled and reused to heat other parts of a building. Liquid-cooled equipment can also be deployed in more environmentally challenging settings, such as areas with more airborne contaminants.

Maria Cuevas, Networks Research Director, BT Group, says, "As the UK's largest provider of fixed-line broadband and mobile services, it isn't a surprise that over 90% of our overall energy consumption – and nearly 95% of our electricity – comes from our networks. In a world of advancing technology and growing data demands, it's critical that we continue to innovate for energy efficiency solutions. Liquid cooling for network and IT infrastructure is one part of a much bigger jigsaw but is an area we're very excited to explore with our technology partners."

Omdia: The data centre market is healthy and ready for AI demand
The recent explosion of high-profile AI successes and investment announcements has captured the attention and imagination of the business world. In light of the latest AI media frenzy, new research from Omdia reveals that the data centre market has a heightened awareness of practical applications for AI that promise to improve productivity and lower costs. The collective evidence so far suggests this will not be just another flash in the pan.

Colocation businesses, including both multi-tenant and single-tenant data centre providers, are expected to ride this wave of new AI growth. Some of these companies have adapted their data centre designs to enable higher rack power density. The power consumption of servers configured for AI training is akin to that of high-performance computing (HPC) clusters for scientific research. "The colocation providers able to provide the highest rack densities and access to liquid cooling will now have the upper hand in the market for data centre space," says Alan Howard, Principal Analyst at Omdia.

Research from Omdia projects continued strong growth in the colocation market, and it's likely the proliferation of AI hardware will be an added tailwind. The colocation industry is quite healthy and is expected to reach $65.2bn in 2027, a five-year CAGR of 9.4%, according to Omdia's Colocation Services Tracker – 2023. Depending on how the acceleration in AI hardware deployments materialises, colocation data centre revenue could get a significant boost over the next few years.

The top three colocation service providers in the world are Equinix, Digital Realty, and NTT Global Data Centres (NTT GDC). Between them, they operate over 700 data centres and have over 100 construction projects underway, as covered in Omdia's Data Centre Building Tracker – 1H23. These three companies represent 33% of the total 2022 revenue of $41.6bn, according to Omdia's Colocation Services Tracker – 2023.
Not all data centres can handle AI or HPC equipment, but these companies and numerous other noteworthy colocation service providers have been anticipating this emerging growth trend. Data centres built over the last couple of years, and many of those under construction, have been designed and architected to accommodate these high-power-density equipment racks. These design and architecture properties include high-density power distribution management and precision cooling for thermal management to protect servers. In some cases, colocation customers require direct-to-chip liquid cooling, which requires special data centre plumbing to give customers access to a liquid cooling loop, or the option to install immersion cooling tanks in which the hottest servers are sunk into a bath of non-conductive fluid.

Alan concludes, "Achieving these advanced data centre operating characteristics is not for the faint of heart or for companies with an aversion to high capital expenditures (capex). Colocation companies like Equinix, Digital Realty, NTT GDC, Flexential, DataBank, Compass, Aligned, Iron Mountain, and a host of others are in the business of taking that capital risk to build data centres so that enterprises and cloud service providers don't have to."
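The growth figures quoted above are internally consistent, as a quick compound-growth check shows (using the revenue numbers cited from Omdia's tracker):

```python
def cagr(start_value: float, end_value: float, years: int) -> float:
    """Compound annual growth rate between two values `years` apart."""
    return (end_value / start_value) ** (1 / years) - 1

# Omdia figures: $41.6bn in 2022 growing to $65.2bn in 2027 (5 years).
rate = cagr(41.6, 65.2, 2027 - 2022)
print(f"{rate:.1%}")  # → 9.4%, matching the tracker's stated CAGR
```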

Why hybrid cooling is the future for data centres
By Gordon Johnson, Senior CFD Manager, Subzero Engineering

Rising rack and power densities are driving significant interest in liquid cooling for many reasons. Yet the suggestion that one size fits all ignores a fundamental point that could hinder adoption: many data centre applications will continue to utilise air as the most efficient and cost-effective solution for their cooling requirements. The future is undoubtedly hybrid, and by using air cooling, containment, and liquid cooling together, owners and operators can optimise and future-proof their data centre environments.

Today, many data centres are experiencing increasing power density per IT rack, rising to levels that just a few years ago seemed extreme and out of reach but are now considered common and typical, even while air cooling is still deployed. In 2020, for example, the Uptime Institute found that due to compute-intensive workloads, racks with densities of 20kW and higher are becoming a reality for many data centres. This increase has left data centre stakeholders wondering whether air-cooled IT equipment (ITE), along with containment used to separate the cold supply air from the hot exhaust air, has finally reached its limits, and whether liquid cooling is the long-term solution. However, the answer is not a simple yes or no.

Moving forward, it's expected that data centres will transition from 100% air cooling to a hybrid model encompassing air- and liquid-cooled solutions, with all new and existing air-cooled data centres requiring containment to improve efficiency, performance, and sustainability. Additionally, those moving to liquid cooling may still require containment to support their mission-critical applications, depending on the type of server technology deployed. Why, then, is the debate over air versus liquid cooling such a hot topic in the industry right now?
To answer this question, we need to understand what's driving the need for liquid cooling, what the other options are, and how we can evaluate these options while continuing to utilise air as the primary cooling mechanism.

Can air and liquid cooling coexist?

For those who are newer to the industry, this is a position we've been in before, with air and liquid cooling successfully coexisting while removing substantial amounts of heat via intra-board air-to-water heat exchangers. This continued until the industry shifted primarily to CMOS technology in the 1990s, and we've been using air cooling in our data centres ever since. With air the primary source used to cool data centres, ASHRAE (the American Society of Heating, Refrigerating and Air-Conditioning Engineers) has worked towards making this technology as efficient and sustainable as possible. Since 2004, with the participation of ITE and cooling system manufacturers, it has published a common set of criteria for cooling IT servers entitled 'TC9.9 Thermal Guidelines for Data Processing Environments', focusing on the efficiency and reliability of cooling ITE in the data centre. Several revisions have been published, the latest released in 2021 (revision 5). This latest generation of TC9.9 highlights a new class of high-density air-cooled ITE (the H1 class), which focuses on cooling high-density servers and racks with a trade-off in energy efficiency, due to the lower cooling supply air temperatures recommended to cool the ITE. As to whether air and liquid cooling can coexist in the data centre white space: they have done so for decades already, and moving forward, many experts expect to see these two cooling technologies coexisting for years to come.

What do server power trends reveal?

It's easy to assume that when it comes to cooling, one size fits all in terms of power and cooling consumption, both now and in the future, but that's not accurate.
It's more important to focus on the actual workload of the data centre we're designing or operating. In the past, a common assumption with air cooling was that once you went above 25kW per rack, it was time to transition to liquid cooling. But the industry has made advances that enable data centres to cool up to, and even beyond, 35kW per rack with traditional air cooling. Scientific data centres, with largely GPU-driven applications like machine learning, AI, and heavy analytics such as crypto mining, are the areas of the industry typically transitioning towards liquid cooling. But for other workloads, like cloud and most business applications, although the growth rate is rising, air cooling still makes sense in terms of cost. The key is to look at the issue from a business perspective: what are we trying to accomplish with each data centre?

What's driving server power growth?

Up to around 2010, businesses utilised single-core processors; once available, they transitioned to multi-core processors, yet power consumption remained relatively flat with these dual- and quad-core chips. This enabled server manufacturers to concentrate on lower airflow rates for cooling ITE, which resulted in better overall efficiency. Around 2018, with processor geometries continually shrinking, higher multi-core processors became the norm, and with these reaching their performance limits, the only way to achieve the new levels of performance demanded by compute-intensive applications was to increase power consumption. Server manufacturers have been packing as much as they can into servers, but because of CPU power consumption, data centres in some cases were having difficulty removing the heat with air cooling, creating a need for alternative cooling solutions such as liquid.
Server manufacturers have also been increasing the temperature delta across servers for several years now, which has been great for efficiency, since the higher the temperature delta, the less airflow is needed to remove the heat. However, server manufacturers are in turn reaching their limits, resulting in data centre operators having to increase airflow to cool high-density servers and keep up with rising power consumption.

Additional options for air cooling

Thankfully, there are several approaches the industry is embracing to successfully cool power densities up to, and even greater than, 35kW per rack, often with traditional air cooling. These options start with deploying either cold or hot aisle containment. If no containment is used, rack densities should typically be no higher than 5kW per rack, with additional supply airflow needed to compensate for recirculated air and hot spots.

What about lowering temperatures?

In 2021, ASHRAE released its fifth-generation TC9.9, which highlighted a new class of high-density air-cooled IT equipment that will need to use more restrictive supply temperatures than the previous class of servers. At some point, high-density servers and racks will also need to transition from air to liquid cooling, especially with CPUs and GPUs expected to exceed 500W per processor in the next few years. But this transition is not automatic and isn't going to be for everyone. Liquid cooling is not going to be the ideal solution or remedy for all future cooling requirements. Instead, the choice of liquid cooling over air cooling depends on a variety of factors, including specific location, climate (temperature/humidity), power densities, workloads, efficiency, performance, heat reuse, and the physical space available. This highlights the need for data centre stakeholders to take a holistic approach to cooling their critical systems.
It will not, and should not, be a case of considering only air or only liquid cooling moving forward. Instead, the key is to understand the trade-offs of each cooling technology and deploy only what makes the most sense for the application.
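The airflow/temperature-delta trade-off discussed above follows directly from the sensible-heat relation Q = ρ·V·cp·ΔT. A minimal sketch (assuming standard air properties at roughly sea level; the rack power and deltas are illustrative):

```python
AIR_DENSITY = 1.2         # kg/m^3, approximate for air at ~20°C, sea level
AIR_SPECIFIC_HEAT = 1005  # J/(kg·K), dry air

def required_airflow_m3s(heat_load_w: float, delta_t_k: float) -> float:
    """Volumetric airflow needed to remove a heat load at a given
    server temperature delta: Q = rho * V * cp * dT, so
    V = Q / (rho * cp * dT)."""
    return heat_load_w / (AIR_DENSITY * AIR_SPECIFIC_HEAT * delta_t_k)

# A 35kW rack at a 10K delta needs roughly 2.9 m^3/s of air; widening
# the delta to 15K cuts the required airflow by a third, which is why
# higher server delta-T has been good for cooling efficiency.
print(round(required_airflow_m3s(35_000, 10), 2))  # → 2.9
print(round(required_airflow_m3s(35_000, 15), 2))  # → 1.93
```

The same arithmetic shows why ever-denser racks eventually outrun air: at some point the required airflow becomes impractical to move through the white space.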

Supermicro launches NVIDIA HGX H100 servers with liquid cooling
Supermicro continues to expand its data centre offering with liquid-cooled NVIDIA HGX H100 rack scale solutions. Advanced liquid cooling technologies reduce the lead time for a complete installation, increase performance, and lower operating expenses while significantly reducing the PUE of data centres. Power savings of around 40% are estimated when using Supermicro liquid cooling solutions compared with an air-cooled data centre. In addition, a reduction of up to 86% in direct cooling costs compared to existing data centres may be realised.

"Supermicro continues to lead the industry supporting the demanding needs of AI workloads and modern data centres worldwide," says Charles Liang, President and CEO of Supermicro. "Our innovative GPU servers that use our liquid cooling technology significantly lower the power requirements of data centres. With the amount of power required to enable today's rapidly evolving large scale AI models, optimising TCO and the Total Cost to Environment (TCE) is crucial to data centre operators. We have proven expertise in designing and building entire racks of high-performance servers. These GPU systems are designed from the ground up for rack scale integration with liquid cooling to provide superior performance, efficiency, and ease of deployment, allowing us to meet our customers' requirements with a short lead time."

AI-optimised racks with the latest Supermicro product families, including the Intel and AMD server product lines, can be quickly delivered from standard engineering templates or easily customised based on the user's unique requirements. Supermicro continues to offer the industry's broadest product line with the highest-performing servers and storage systems to tackle complex compute-intensive projects. Rack scale integrated solutions give customers the confidence and ability to plug the racks in, connect to the network and become productive sooner than if they managed the technology themselves.
The top-of-the-line liquid-cooled GPU server contains dual Intel or AMD CPUs and eight or four interconnected NVIDIA HGX H100 Tensor Core GPUs. Using liquid cooling reduces the power consumption of data centres by up to 40%, resulting in lower operating costs. In addition, both systems significantly surpass the previous generation of NVIDIA HGX GPU-equipped systems, delivering up to 30 times the performance and efficiency for today's large transformer models, with faster GPU-GPU interconnect speed and PCIe 5.0-based networking and storage.

Supermicro's rack-level liquid cooling solution includes a Coolant Distribution Unit (CDU) that provides up to 80kW of direct-to-chip (D2C) cooling for today's highest-TDP CPUs and GPUs across a wide range of Supermicro servers. The redundant and hot-swappable power supplies and liquid cooling pumps ensure that the servers remain continuously cooled, even in the event of a power supply or pump failure. Leak-proof connectors give customers the added confidence of uninterrupted liquid cooling for all systems.

Rack scale design and integration has become a critical service for systems suppliers. As AI and HPC have become increasingly critical technologies within organisations, configurations from the server level to the entire data centre must be optimised and configured for maximum performance. Supermicro's system and rack scale experts work closely with customers to explore their requirements, and have the knowledge and manufacturing capacity to deliver significant numbers of racks to customers worldwide.
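As a back-of-the-envelope illustration of what a 40% cut in cooling power can mean in practice (all input figures here are assumptions for the example, not Supermicro's numbers), annual energy savings can be estimated from the IT load and the cooling overhead:

```python
def annual_cooling_savings_kwh(it_load_kw: float,
                               cooling_overhead: float,
                               reduction: float) -> float:
    """Annual cooling energy saved, in kWh.

    cooling_overhead: cooling power as a fraction of IT power
    reduction: fraction of cooling power eliminated (e.g. 0.40)
    """
    cooling_kw = it_load_kw * cooling_overhead
    return cooling_kw * reduction * 8760  # hours per year

# Assumed example: 1MW IT load, cooling drawing 50% of IT power,
# and a 40% cooling-power reduction from liquid cooling.
saved = annual_cooling_savings_kwh(1000, 0.5, 0.40)
print(f"{saved:,.0f} kWh/year")  # → 1,752,000 kWh/year
```

Multiplying by a local electricity price turns this into an opex figure, which is where the TCO argument in the quote above comes from.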

Dr Kelley Mullick joins Iceotope Technologies
Iceotope Technologies has announced the appointment of Kelley A. Mullick PhD as Vice President, Technology Advancement and Alliances. Recognised for her expertise in immersion and cold plate liquid cooling, Kelley joins the company from Intel Corporation, where she worked in product management and strategy for the data centre and AI group and developed Intel's first immersion cooling warranty, announced at Open Compute Project (OCP) 2022. Kelley holds a BSc in Chemistry and Biology from Walsh University, an MSc in Chemical Engineering from the University of Akron and a PhD in Chemical Engineering from Ohio State University.

David Craig, CEO, Iceotope Technologies, says, "Kelley is a welcome addition to the Iceotope team. She joins us as the market is turning increasingly to liquid cooling to solve a range of challenges, from increasing processor output and efficiency to delivering greater data centre space optimisation and reducing the energy waste and inefficiencies associated with air cooling for greater data centre sustainability. Kelley is a dynamic and results-oriented problem solver who brings solid systems engineering know-how. With many industry accolades, she is also a champion for diversity and inclusion, having personally developed initiatives for women and under-represented minorities."

Kelley says, "As a systems engineer, I fixate on technical requirements in tandem with business requirements to drive solutions. Today, existing challenges to mitigate against the climate emergency are joined by the technological expedients of AI applications such as ChatGPT. These compute-intensive operations need the support of compute-intensive infrastructure. The limitations and inefficiencies of air cooling are well known. Only precision immersion liquid cooling can meet the environmental needs of all processor board components in a familiar form factor that fits with the way we design data centres and carry out moves, adds and changes.
"With the focus on sustainability at Intel, I became familiar with all types of liquid cooling. When I appraised Iceotope's technology, I saw complete differentiation from anything else in the market. In addition to all the benefits of liquid cooling, it offers high levels of heat reuse, almost completely eliminates the use of water, and offers greater compute density and scalability than other solutions like cold plate and tank immersion. It is the technology of the future that I want to invest my calories in."

Kelley to build out Iceotope's ecosystem

With responsibility for building and maintaining alliances with OEMs and technology partnerships, Kelley's role will also make Iceotope technology more accessible to the wider market. The company currently has alliances with leading global vendors, including IT giants HPE and Lenovo, physical infrastructure manufacturers nVent and Schneider Electric, and technology supply chain specialist Avnet. As things stand, Iceotope precision liquid cooling solutions can be supplied with a warranty almost anywhere around the globe. By augmenting its ecosystem with additional technology and channel partners, Iceotope can build upon its ease of installation and use to make precision liquid cooling the first choice for new data centre developments, as well as for upgrading existing facilities as operators strive for greater cooling efficiency and reliability, and increased operational sustainability. Engineered to cool the whole IT stack from hyperscale to the extreme edge, Iceotope's patented chassis-level precision liquid cooling offers up to 96% water reduction, up to 40% power reduction, and up to 40% carbon emissions reduction per kW of ITE.

Kelley, a champion for minorities in tech

Kelley is passionate about diversity and inclusion. She has worked throughout her career to help prepare and resource women and other underrepresented minorities to be confident and successful in their own careers.
In addition to creating programmes in the workplace, she has also invested her personal time in developing free-to-access online materials in support of greater equality in the workforce.

Showcase the next generation in modular data centres
Mission Critical Facilities International (MCFI) is collaborating with Iceotope and nVent at Supercomputing 2022 (SC22), held November 13-18 in Dallas at the Kay Bailey Hutchison Convention Center. Together, the companies will showcase the features and benefits of their prefabricated all-in-one data centre solutions, highlighting fully-integrated, precision immersion liquid cooling. MCFI's liquid-cooled containers allow precision immersion liquid cooling to be deployed as a stand-alone solution in any location and climate, even at the far edge.

MCFI is leading the next generation of modular/prefabricated data centres with its customisable GENIUS solutions, as well as MicroGENIUS, a sustainable microgrid communications shelter that delivers efficient, grid-independent energy. Both provide reduced CapEx and OpEx, enhanced speed to market, global repeatability and scale, sustainable designs, reduced-carbon building materials and zero-emission technology.

MCFI's energy-efficient, scalable and cost-effective containerised/prefabricated data centre solution features innovative integrations with Iceotope's precision immersion technology and nVent's electrical connection and protection solutions. The MCFI solution allows for high-density computing anywhere, combining high-density loads alongside standard IT loads. It also eliminates mechanical cooling in the data centre while maximising free cooling, applying a hybrid water cooling technique that uses the return water from the rear door heat exchangers to feed the Iceotope precision immersion technology, reducing energy consumption and cost.

The alliance is beneficial to enterprise data centres, high-performance and edge computing, smart manufacturing, content delivery, telemedicine, AI and virtual reality. The combined solution reduces execution complexities, lowers costs and eases the implementation of liquid cooling in retrofit and new build environments.
“The MCFI, Iceotope and nVent relationships further exemplify the importance of collaborative commitments in developing innovative and sustainable solutions for the future of digital infrastructure and our planet,” says Patrick Giangrosso, Vice President at MCFI.  Visit MCFI, Iceotope and nVent at SC22, Booth 427 for a deeper dive into modular data centre solutions and the latest innovations in liquid cooling technologies.

Rising temperatures highlight need for liquid cooling systems
The rising frequency of extreme weather in Europe necessitates a move towards liquid cooling systems, suggests a sector expert. The warning follows record-breaking temperatures in the UK last month, with some locations exceeding 40°C. As a result, a number of high-profile service providers in the country experienced outages that impacted customer services, the effects of which were felt as far away as the US. One operator attributed the failure to ‘unseasonal temperatures’.

However, with the UK Met Office warning that heatwaves are set to become more frequent, more intense and longer-lasting, Gemma Reeves, Data Centre Specialist at Alfa Laval, believes that data centres will need to transition to liquid cooling systems in order to cope.

She says: “The temperatures observed last month are a sign of what is to come. Summers are continuing to get hotter by the year, so it’s important that data centres are able to manage the heat effectively.

“Mechanical cooling methods have long been growing unfit for the needs of the modern data centre, with last month’s weather only serving to highlight this. As both outside temperatures and rack densities continue to rise, more efficient approaches to cooling will clearly be necessary.”

Traditional mechanical cooling systems use an electrically powered chiller, which creates cold air to be distributed by a ventilation system. However, most mechanical cooling systems in the UK are designed for a maximum outdoor temperature of 32°C - a figure now regularly exceeded. Gemma believes that liquid cooling can solve this challenge: cooling with dielectric fluid rather than air means the cooling systems can run at much higher temperatures.
Liquid cooling approaches such as direct-to-chip, single-phase immersive IT chassis, or single-phase immersive tub allow servers to remain cool despite much higher outdoor air temperatures, while maintaining lower energy consumption and providing options for onward heat reuse. Studies have also shown this to increase the lifetime of servers, thanks to the stable thermal environment it maintains.

Gemma concludes: “The data centre sector remains in an era of air-based cooling. That said, July’s heatwave may be the stark reminder the sector needs that these systems are not sustainable in the long term.

“Liquid cooling is truly the future of data centres. This technique allows us to cool more quickly and efficiently than ever before, which will be a key consideration with temperatures on the rise.”
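The point about the 32°C design limit can be made concrete. A cooling plant can reject heat without running compressors ("free cooling") only while outdoor air is sufficiently cooler than the coolant supply setpoint; raising that setpoint, as warm-water liquid cooling allows, widens the window. The sketch below uses entirely hypothetical temperature and setpoint figures, not Alfa Laval data, purely to illustrate the relationship:

```python
# Illustrative sketch (hypothetical figures): how a higher permitted coolant
# supply temperature expands the window for compressor-less "free" cooling.
# A dry cooler can only hold the setpoint when outdoor air is cooler than
# the setpoint by at least its approach temperature.

def free_cooling_fraction(outdoor_temps_c, supply_setpoint_c, approach_c=5.0):
    """Fraction of hours in which outdoor air alone can hold the setpoint."""
    ok = [t for t in outdoor_temps_c if t + approach_c <= supply_setpoint_c]
    return len(ok) / len(outdoor_temps_c)

# Hypothetical hourly profile for a hot week (°C), eight samples per day.
temps = [18, 22, 27, 33, 38, 40, 36, 29] * 7

chilled_air = free_cooling_fraction(temps, supply_setpoint_c=25)  # cold-air plant
warm_liquid = free_cooling_fraction(temps, supply_setpoint_c=45)  # warm-water loop

print(f"free cooling at 25°C setpoint: {chilled_air:.0%}")
print(f"free cooling at 45°C setpoint: {warm_liquid:.0%}")
```

With these made-up numbers, the low setpoint qualifies for free cooling only in the coolest hours, while the 45°C liquid loop qualifies throughout, which is the mechanism behind the claim that liquid-cooled plants cope with outdoor temperatures that defeat chillers sized for 32°C.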

DataQube Global secures exclusive rights to market LiquidCool Solutions products
DataQube Global has announced that it has obtained exclusive rights to market the products of LiquidCool Solutions (LCS) in various markets around the globe. In addition, DataQube has agreed to make an investment in LCS.

The agreement covers LCS’ ZPServer and the newly launched miniNODE, a next-generation sealed cooling solution for harsh environments, developed using eco-friendly dielectric fluids and intended for mission-critical infrastructure where reliability, low maintenance and equipment longevity are key. DataQube Global plans to deploy LCS’ miniNODE across its portfolio of edge data centre solutions by the end of 2022, to assist clients in deploying edge technology.

Incorporating LCS’ immersive cooling technology into the design architecture of DataQube Global’s edge data centre products delivers a range of operational and performance advantages, including low maintenance, reduced downtime and extended component shelf life, along with the ability to deliver 1,400 times more cooling power than air.

The LCS technology fully supports DataQube in its mission to deploy edge data centre systems that are eco-friendly. Unlike other solutions, DataQube Global’s unique person-free layout reduces power consumption and CO2 emissions by up to 50%, as the energy transfer is primarily dedicated to powering computers. Exploiting next-generation cooling technologies such as those developed by LCS offers the potential to reduce these figures further.

“We have already secured a major deal in the US to augment our presence in North America,” says David Keegan, Group CEO of DataQube Global. “Investing in LiquidCool Solutions cements our position as a serious player in the data centre industry and a force to be reckoned with.”

“We are extremely happy to formalise our relationship with DataQube Global. Their rapidly expanding presence in edge computing and harsh environment markets provides LCS with new opportunities and complements the growth plans of DataQube. The relationship with DataQube is a key element in introducing our patented chassis-based single-phase immersion technology to the burgeoning edge and data centre markets,” concludes Ken Krei, CEO of LiquidCool Solutions.
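The "1,400 times more cooling power than air" figure above is a vendor claim, but it is in the ballpark of the ratio of volumetric heat capacities between a liquid dielectric coolant and air: per unit volume of coolant moved, the liquid absorbs that much more heat for the same temperature rise. A back-of-envelope check, using generic textbook property values (not LCS fluid data):

```python
# Back-of-envelope check (generic textbook property values, not LCS data):
# volumetric heat capacity = density * specific heat capacity. Its ratio
# between a dielectric coolant and air approximates the "times more cooling"
# per unit volume for the same flow rate and temperature rise.

def volumetric_heat_capacity(density_kg_m3, cp_j_kg_k):
    """Heat absorbed per cubic metre per kelvin of temperature rise, J/(m^3*K)."""
    return density_kg_m3 * cp_j_kg_k

air = volumetric_heat_capacity(1.2, 1005)     # dry air at roughly 20°C
fluid = volumetric_heat_capacity(1600, 1100)  # typical engineered dielectric

print(f"ratio: {fluid / air:,.0f}x")  # lands in the low thousands
```

With these assumed properties the ratio comes out near the quoted order of magnitude, which is why even modest coolant flow rates can replace very large volumes of moving air.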


