Sunday, June 15, 2025

Cooling


Redefining liquid cooling from the server to the switch
By Nathan Blom, CCO, Iceotope

Liquid cooling has long been a focal point in discussions surrounding data centres, and rightfully so, as these facilities are at the epicentre of an unprecedented data explosion. The explosive growth of the internet, cloud services, IoT devices, social media, and AI has fuelled an unparalleled surge in data generation, intensifying the strain on rack densities and placing substantial demands on data centre cooling systems. In fact, cooling power alone accounts for a staggering 40% of a data centre's total energy consumption.

However, the need for efficient IT infrastructure cooling extends beyond data centres. Enterprise organisations are also looking for ways to reduce costs, maximise revenue and accelerate sustainability objectives. Reducing energy consumption is also rapidly becoming a top priority for telcos, whose thousands of sites in remote locations make cutting maintenance costs key as well.

Liquid cooling technologies have emerged as a highly efficient solution for dissipating heat from IT equipment, regardless of the setting. Whether it's within a data centre, an on-premises data hall, a cloud environment, or at the edge, liquid cooling is proving its versatility. While most applications have centred on cooling server components, new applications are rapidly materialising across the entire IT infrastructure spectrum.

BT Group, in a ground-breaking move, initiated trials of liquid cooling technologies across its networks to enhance energy efficiency and reduce consumption as part of its commitment to achieving net zero status. BT kicked off the trials with a network switch cooled using Iceotope's Precision Liquid Cooling technology and Juniper Networks QFX Series Switches. With 90% of BT's overall energy consumption coming from its networks, it's easy to see why reducing that consumption is such a high priority.
In a similar vein, Meta released a study last year confirming the practicality, efficiency and effectiveness of precision liquid cooling technology in meeting the cooling requirements of high-density storage disks. Global data storage is growing at such a rate that there is a mounting need for improved thermal cooling solutions. Liquid cooling for high-density storage is proving to be a viable alternative, as it can mitigate variances and improve consistency. Ultimately, it lowers overall power consumption and improves ESG compliance.

Liquid cooling technologies are changing the game when it comes to removing heat from the IT stack. While each of the technologies on the market today has its time and place, there is a reason we are seeing precision liquid cooling in trials that are broadening the use case for liquid cooling. It also ensures maximum efficiency and reliability, as it uses a small amount of dielectric coolant to precisely target and remove heat from the hottest components of the server. This approach not only eliminates the need for traditional air-cooling systems, but also allows for greater flexibility in designing IT solutions than any other solution on the market today. There are no hotspots that can slow down performance, no wasted physical space on unnecessary cooling infrastructure, and minimal need for water consumption.

As the demand for data increases, the importance of efficient and sustainable IT infrastructure cooling cannot be overstated. Liquid cooling, and precision liquid cooling in particular, is at the forefront of this journey. Whether it's reducing the environmental footprint of data centres, enhancing the energy efficiency of telecommunication networks, or meeting the ever-increasing demands of high-density storage, liquid cooling offers versatile and effective solutions.
These trials and applications are not just milestones, they represent a pivotal shift toward a future where cooling is smarter, greener, and more adaptable, empowering businesses to meet their evolving IT demands while contributing to a more sustainable world.

ICS Cool Energy expands with a cold store dedicated team
ICS Cool Energy has announced the expansion of its hire division with a management team dedicated to cold stores. The team of engineering specialists that will manage the UK fleet of low to ultra-low temperature containerised cold store solutions is:

Ralph Howes, Cold Store Major Accounts Manager
Lisa Townsley, Cold Store Business Development Manager, South
Kayla Shaw, Cold Store Business Development Manager, North
Mike Elver, Cold Store Senior Sales Engineer

The company's cold store container units can be used where raw or finished products require temporary or long-term temperature-controlled storage to preserve or increase shelf life. Cold stores can add storage space and deliver high cooling capacity combined with precise temperature control, from fresh to deep frozen, even in the most severe applications with high ambient temperatures, frequent door openings and long running hours. The units can also be applied in R&D applications, where a temperature-controlled environment is required to enable Accelerated Life Testing (ALT) of critical components.

ICS Cool Energy temperature-controlled containers are available in 10ft, 20ft and 40ft lengths and feature tried and tested refrigeration technology from Thermo King. The units can be plugged into a 360-500V, 50 or 60Hz power supply to ensure cold or frozen temperatures in the container. Designed originally for global, seagoing reefer applications, the containers are equipped with features that make them suitable as static cold stores: they are washable with wash-down drains, have trapped-person alarms, internal lighting and an emergency escape release, and conform to BRC audit standards. The units can also be adapted to meet customer needs with options including telematics, remote monitoring, and controlled atmosphere.
Customers can also benefit from a modular approach, where multiple cold stores, joined together without connecting walls, can be linked with buildings without time-consuming engineering, ground works, or a lengthy planning permission process.

Vertiv enhances manufacturing capacity for chilled water solutions
Vertiv has unveiled an upgraded testing room at its thermal management centre near Tognana, Italy. This sizeable investment significantly increases the facility's testing capacity and manufacturing capabilities within the existing space and demonstrates the company's ongoing commitment to advancing chilled water systems to help drive liquid cooling adoption. It also shows Vertiv's support for the increasing demands on data centres, including high-performance computing, artificial intelligence (AI) and generative AI (GenAI).

The upgraded testing room will allow Vertiv to carry out standard and tailored tests of customer equipment, spanning all of the cooling solutions in its product portfolio - both air and water-cooled - balancing a thermal load greater than 2MW with a chamber air temperature up to 55°C. It will also be able to test units equipped with low global warming potential (GWP) refrigerants.

This upgrade comes at a critical time for the data centre industry. Operators are expanding rapidly to meet increasing capacity needs, whilst at the same time seeking to minimise their environmental impact. More than 100 European data centre operators and trade associations have signed The Climate Neutral Data Centre Pact, committing to climate neutrality by 2030. Chilled water systems play a key role in helping to reach this goal by enabling operators to upscale data centre capacity whilst simultaneously limiting direct and indirect emissions. These systems apply low-GWP refrigerants that enable a significant reduction of direct and indirect CO2 emissions, decreasing a data centre's carbon footprint.

“Resource-efficient chilled water solutions are important to the sustainable growth of the data centre industry, and we must continue to focus on how we can evolve and improve the technologies for operators and the environment,” says Karsten Winther, President EMEA at Vertiv. “We are proud of the market-leading work we have achieved for our customers.
For example, we worked with sustainability-focused colocation provider, Green Mountain, to deploy 5MW of high-efficiency chilled water cooling systems. Enhancing the testing and manufacturing capacity at our thermal management facility allows us to continue to innovate in this space and deliver even more value to the industry and our customers.”

“In December, we will introduce our new Vertiv Liebert AFC high capacity, inverter screw with low-GWP refrigerant chiller up to 2200kW to the EMEA market. This new testing room will enable us to test these larger capacity units,” says Roberto Felisi, Senior Director, Thermal Global Core Offering and EMEA Business Leader at Vertiv. “We continue to explore opportunities to further invest in our capabilities to support projected growth and demand for thermal management systems, particularly liquid cooling solutions.”

To celebrate the latest expansion, Vertiv welcomed employees’ families for a special visit to the facility, including the Thermal Management Customer Experience Centre. The open day featured activities designed specifically to engage children and young people, offering insights into data centres and the significance of thermal management. Highlights included guided tours of production lines and primary laboratories, as well as Vertiv’s own data centre, which provided a first-hand look at the machines in action. Depending on age, the young visitors could partake in thermodynamics workshops and explore topics such as cold generation and the behaviour of hot and cold air particles. They also got the chance to experience Vertiv’s cutting-edge augmented reality applications, like the Vertiv XR app, and navigate a virtual data centre.

BT takes the plunge with new liquid cooling trials
BT Group has announced that it is trialling several liquid cooling technologies that could substantially improve energy consumption and efficiency metrics in its networks and IT infrastructure, in pursuit of its commitment to becoming a net zero business by the end of March 2031. The group will trial precision liquid cooled network switches using a solution provided by Iceotope and Juniper Networks QFX Series switches, which are widely used in existing network cloud architectures. Ahead of the trial, the partners demonstrated a replica ‘set-up’ using an HP x86 server at BT’s Sustainability Festival, showing how the power used to cool a network switch typically deployed in a data centre could be significantly reduced.

All electronic and electrical systems generate heat during operation that must be dissipated to maintain working capability. Like most large data centre operators, BT currently cools network and IT equipment across its estate using air-based systems. As network capacity and demands increase, next generation IT and network hardware will have to work harder and will become hotter. Consequently, the power needed to cool it will increase, driving up energy consumption and operational cost. BT Group is, therefore, exploring numerous alternative cooling techniques, and in addition to its trial with Iceotope and Juniper, the company will trial the following liquid cooling systems:

Precision liquid cooled networking servers and data centre equipment, with Iceotope and Juniper
Full immersion of networking servers in an immersion tank, with Immersion4
Liquid-cooled cold plates of networking equipment in a cooling enclosure, with Nexalus
Cooling using sprayed-on partial immersion of data centre equipment, with Airsys
Typically, these techniques bring several benefits, including a 40-50% reduction in the power needed to cool systems versus air cooling; higher equipment density, saving on real estate footprint and therefore delivering further power usage reductions; and reduced material usage, reducing carbon footprint. Further, rather than dissipating heat into the air, liquid cooling systems can channel exhausted heat to be reused to heat other parts of a building. Liquid-cooled equipment can also be deployed in more challenging environments, such as areas with more contaminants.

Maria Cuevas, Networks Research Director, BT Group, says, “As the UK’s largest provider of fixed-line broadband and mobile services in the UK, it isn’t a surprise that over 90% of our overall energy consumption – and nearly 95% of our electricity – comes from our networks. In a world of advancing technology and growing data demands, it’s critical that we continue to innovate for energy efficiency solutions. Liquid cooling for network and IT infrastructure is one part of a much bigger jigsaw but is an area we’re very excited to explore with our technology partners.”
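The facility-level impact of these cooling savings can be sketched with simple arithmetic: if cooling accounts for a given share of total energy use (the article cites around 40% for data centres), and liquid cooling cuts cooling power by 40-50%, the overall saving is the product of the two fractions. A minimal sketch, using these figures purely as illustration:

```python
def facility_saving(cooling_share: float, cooling_reduction: float) -> float:
    """Fraction of total facility energy saved by cutting cooling power.

    cooling_share: cooling's share of total facility energy (e.g. 0.40)
    cooling_reduction: fractional cut in cooling power (e.g. 0.40-0.50)
    """
    return cooling_share * cooling_reduction

# 40% cooling share with a 40-50% reduction in cooling power:
low = facility_saving(0.40, 0.40)
high = facility_saving(0.40, 0.50)
print(f"Facility-level energy saving: {low:.0%} to {high:.0%}")
```

On those assumptions the whole-facility saving lands in the 16-20% range, which is why a cooling-only improvement can move headline energy figures so noticeably.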

Iceotope Technologies announced as 'Great Place to Work'
Iceotope Technologies has announced its recent recognition as one of the 2023 Best Workplaces in the UK by Great Place to Work. This prestigious accolade underscores its resolute commitment to fostering a culture of excellence and creating an environment that attracts and retains top-tier talent.

The 'Great Place to Work' certification acknowledges Iceotope's commitment and dedication to providing an extraordinary workplace experience for its team. After a thorough evaluation of its workplace practices, policies, and employee feedback, it has been recognised as a company that values collaboration, growth and employee wellbeing.

"We are absolutely thrilled to receive the ‘Great Place To Work’ certification, a recognition of Iceotope’s commitment to enabling our team to thrive professionally and personally," says David Craig, CEO of Iceotope. "This recognition echoes our core values and our dedication to cultivating a workplace that celebrates innovation and fosters the growth of every individual. We believe that by nurturing a culture of excellence, we can continue to attract and empower exceptional talent."

The journey towards this certification mirrors the company's core principles: a hunger for knowledge, curiosity and a commitment to solving real-world problems with innovative solutions. “Iceotope's emphasis on engineering excellence is at the heart of its achievements. Purpose-driven engineering, aligned with customer needs and global sustainability goals, reflects the company's dedication to creating progress that benefits both its clients and the environment,” says David.

As Iceotope celebrates this remarkable achievement, the company sets its sights on elevating standards of employee engagement and satisfaction even further. The commitment to maintaining and enhancing the 'Great Place to Work' status reflects its determination to cultivate a culture characterised by growth, collaboration and outstanding accomplishments.
The complete list of winning companies can be found here.

Spirotech offers cooling solutions for data centres
The complex nature of data centres means that customer information needs to be secure and protected from computer viruses and hackers; otherwise, millions of accounts could be accessed and breached. There is another potential danger that can have a far-reaching impact on these huge, linked computer systems: overheating. It can cause untold damage and result in a major breakdown in service delivery. Key to preventing this is the design of reliable cooling systems with ‘back-up’ structures in place to ensure continuity of service.

Rob Jacques, Spirotech’s Business Director UK, provides an insight into the complexities surrounding data centres and why only a handful of UK companies are ‘geared-up’ not only to specify cooling systems for this niche sector, but to provide a ‘cradle-to-grave’ service.

In an age when we want everything instantly, stored data holds the key to so much of our everyday life, and at the heart of this needs to be the smooth operation of data centres. Keeping them running around the clock requires meticulous design of the cooling systems from the outset, and within the ‘blueprint’ needs to be a back-up / failsafe plan covering such elements as the chiller, pumps and pressurisation. There are additional complexities to be taken into consideration and incorporated into the design, especially in terms of the communication between the plant itself and critical equipment parts.

Quite simply, the bigger the computer, the more it has to be cooled. To put this in perspective, the rise of cloud computing means that a huge amount of energy is required to manage and maintain all the data, with tens of megawatts of power needed and computers occupying thousands of square metres of space. If the chillers or the cooling programmes were to fail, data could be lost on a large scale.
Of course, all equipment is subject to failure at some point, and that is when the back-up measures need to kick in and ensure continuity of an effective cooling system. Spirotech's control systems feed back data from pumps, valves, pressurisation units and degassers. For example, the vacuum degasser can show how much air has been removed over a given period, and when, as well as providing valuable information revealing trends within the system. The same applies to the pressurisation units: information is gathered over their operational lifespan, revealing what the pressure has been, reporting any leaks and flagging the need to bring in more water. There is a link between the pressurisation units and vacuum degassers, so any faults can be signalled and sent to whoever needs the data.

A poorly designed, installed and maintained pressurisation system can lead to negative pressures around the circuit. Air can be drawn in through automatic air vents, gaskets and micro leaks. High pressure situations can lead to water being emitted through the safety valves and the subsequent frequent addition of further raw refill water. The top control unit has the electronic capabilities to effectively manage pressurisation within the system and can be programmed to work in parallel with the back-up system. Air and dirt separators are another key component in maintaining the ongoing health of any heating and ventilation system, and keeping pipework clean is essential.

Within this sector, there is a much smaller community serving the data centres. It’s an area not every company wants to be in, or is geared up to serve. Spirotech has a depth of knowledge gained through working with data centre installations, enabling it to provide an all-encompassing service. It listens closely to the needs of the client, designs a bespoke system and, as a leading manufacturer, can supply the right equipment for the project. Customers also have peace of mind that it can supply spares to a tight deadline.
It’s not just about getting the design right from the outset, it’s also about providing the ongoing technical and maintenance support for the project going forward.

Why hybrid cooling is the future for data centres
Gordon Johnson, Senior CFD Manager, Subzero Engineering

Rising rack and power densities are driving significant interest in liquid cooling for many reasons. Yet the suggestion that one size fits all ignores one of the most fundamental factors potentially hindering adoption: many data centre applications will continue to utilise air as the most efficient and cost-effective solution for their cooling requirements. The future is undoubtedly hybrid, and by using air cooling, containment, and liquid cooling together, owners and operators can optimise and future-proof their data centre environments.

Today, many data centres are experiencing increasing power density per IT rack, rising to levels that just a few years ago seemed extreme and out of reach but are now considered common and typical, even while still deploying air cooling. In 2020, for example, the Uptime Institute found that due to compute-intensive workloads, racks with densities of 20kW and higher are becoming a reality for many data centres. This increase has left data centre stakeholders wondering if air-cooled IT equipment (ITE), along with containment used to separate the cold supply air from the hot exhaust air, has finally reached its limits, and if liquid cooling is the long-term solution. However, the answer is not a simple yes or no.

Moving forward, it’s expected that data centres will transition from 100% air cooling to a hybrid model encompassing air and liquid-cooled solutions, with all new and existing air-cooled data centres requiring containment to improve efficiency, performance, and sustainability. Additionally, those moving to liquid cooling may still require containment to support their mission-critical applications, depending on the type of server technology deployed. One might ask why the debate of air versus liquid cooling is such a hot topic in the industry right now.
To answer this question, we need to understand what’s driving the need for liquid cooling, what the other options are, and how we can evaluate these options while continuing to utilise air as the primary cooling mechanism.

Can air and liquid cooling coexist?

For those who are newer to the industry, this is a position we’ve been in before, with air and liquid cooling successfully coexisting while removing substantial amounts of heat via intra-board air-to-water heat exchangers. This continued until the industry shifted primarily to CMOS technology in the 1990s, and we’ve been using air cooling in our data centres ever since.

With air being the primary source used to cool data centres, ASHRAE (the American Society of Heating, Refrigerating and Air-Conditioning Engineers) has worked towards making this technology as efficient and sustainable as possible. Since 2004, it has published a common set of criteria for cooling IT servers, developed with the participation of ITE and cooling system manufacturers, entitled ‘TC9.9 Thermal Guidelines for Data Processing Environments’. ASHRAE has focused on the efficiency and reliability of cooling the ITE in the data centre. Several revisions have been published, with the latest released in 2021 (revision 5). This latest generation of TC9.9 highlights a new class of high-density air-cooled ITE (the H1 class), which focuses on cooling high-density servers and racks with a trade-off in energy efficiency, due to the lower cooling supply air temperatures recommended to cool the ITE.

As to whether air and liquid cooling can coexist in the data centre white space, they have done so for decades already, and moving forward, many experts expect to see these two cooling technologies coexisting for years to come.

What do server power trends reveal?

It’s easy to assume that when it comes to cooling, one size fits all in terms of power and cooling consumption, both now and in the future, but that’s not accurate.
It’s more important to focus on the actual workload of the data centre we’re designing or operating. In the past, a common assumption with air cooling was that once you went above 25kW per rack, it was time to transition to liquid cooling. But the industry has made advances in this regard, enabling data centres to cool up to, and even beyond, 35kW per rack with traditional air cooling.

Scientific data centres, which include largely GPU-driven applications like machine learning, AI, and high-performance analytics such as crypto mining, are the areas of the industry typically transitioning towards liquid cooling. But for other workloads, like cloud and most business applications, the growth rate is rising yet air cooling still makes sense in terms of cost. The key is to look at this issue from a business perspective: what are we trying to accomplish with each data centre?

What’s driving server power growth?

Up to around 2010, businesses utilised single-core processors; once available, they transitioned to multi-core processors, yet power consumption remained relatively flat with these dual and quad-core processors. This enabled server manufacturers to concentrate on lower airflow rates for cooling ITE, which resulted in better overall efficiency. Around 2018, with the size of these processors continually shrinking, higher multi-core processors became the norm, and with these reaching their performance limits, the only way to continue to achieve the new levels of performance demanded by compute-intensive applications was by increasing power consumption. Server manufacturers have been packing as much as they can into servers, but because of CPU power consumption, in some cases data centres were having difficulty removing the heat with air cooling, creating a need for alternative cooling solutions such as liquid.
Server manufacturers have also been increasing the temperature delta across servers for several years now, which again has been great for efficiency, since the higher the temperature delta, the less airflow is needed to remove the heat. However, server manufacturers are, in turn, reaching their limits, resulting in data centre operators having to increase the airflow to cool high-density servers and keep up with increasing power consumption.

Additional options for air cooling

Thankfully, there are several approaches the industry is embracing to successfully cool power densities up to and even greater than 35kW per rack, often with traditional air cooling. These options start with deploying either cold or hot aisle containment. If no containment is used, rack densities should typically be no higher than 5kW per rack, with additional supply airflow needed to compensate for recirculation air and hot spots.

What about lowering temperatures?

In 2021, ASHRAE released its fifth-generation TC9.9, which highlighted a new class of high-density air-cooled IT equipment that will need to use more restrictive supply temperatures than the previous class of servers. At some point, high-density servers and racks will also need to transition from air to liquid cooling, especially with CPUs and GPUs expected to exceed 500W per processor in the next few years. But this transition is not automatic and isn’t going to be for everyone. Liquid cooling is not going to be the ideal solution or remedy for all future cooling requirements. Instead, the choice of liquid cooling over air cooling depends on a variety of factors, including specific location, climate (temperature/humidity), power densities, workloads, efficiency, performance, heat reuse, and physical space available. This highlights the need for data centre stakeholders to take a holistic approach to cooling their critical systems.
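The trade-off between temperature delta and airflow described above follows directly from the sensible-heat equation, Q = ρ · V̇ · cp · ΔT: for a fixed heat load, doubling the server's delta-T halves the airflow required. A minimal sketch using standard air properties (the 35kW rack figure comes from the text; the delta-T values are illustrative):

```python
RHO_AIR = 1.2    # kg/m^3, air density at roughly room conditions
CP_AIR = 1005.0  # J/(kg*K), specific heat capacity of air

def required_airflow_m3s(heat_load_w: float, delta_t_k: float) -> float:
    """Volumetric airflow needed to remove heat_load_w at a given delta-T."""
    return heat_load_w / (RHO_AIR * CP_AIR * delta_t_k)

# A 35 kW rack: higher server delta-T means proportionally less airflow.
for dt in (10, 15, 20):
    v = required_airflow_m3s(35_000, dt)
    print(f"delta-T {dt:>2} K -> {v:.2f} m^3/s ({v * 2118.88:.0f} CFM)")
```

This is why rising delta-T helped fan efficiency for years, and why hitting a delta-T ceiling forces operators back to raising airflow as rack power keeps climbing.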
It will not and should not be an approach where only air or only liquid cooling is considered moving forward. Instead, the key is to understand the trade-offs of each cooling technology and deploy only what makes the most sense for the application.

Paying attention to data centre storage cooling
Authored by Neil Edmunds, Director of Innovation, Iceotope

With constant streams of data emerging from the IoT, video, AI and more, it is no surprise we are expected to generate 463EB of data each day by 2025. How we access and interact with data is constantly changing and is going to have a real impact on the processing and storage of that data. In just a few years, it's predicted that global data storage will exceed 200ZB, with half of that stored in the cloud. This presents a unique challenge for hyperscale data centres and their storage infrastructure.

According to Seagate, cloud data centres choose mass capacity hard disk drives (HDDs) to store 90% of their exabytes. HDDs are tried and tested technology, typically found in a 3.5in form factor, and they continue to offer data centre operators cost-effective storage at scale. The current top-of-the-range HDD features 20TB capacity. By the end of the decade that is expected to reach 120TB+, all within the existing 3.5in form factor.

The practical implications of this point to a need for improved thermal cooling solutions. More data storage means more spinning of the disks, higher speed motors and more actuators, all of which translates to more power being used. As disks go up in power, so does the amount of heat they produce. In addition, with the introduction of helium into hard drives in the last decade, performance has improved thanks to less drag on the disks, but the units are now sealed. There is also ESG compliance to consider. With data centres consuming 1% of global electricity demand and cooling power accounting for more than 35% of a data centre's total energy consumption, the pressure is on data centre owners to reduce this consumption.

Comparison of cooling technologies

Traditionally, data centre environments use air cooling technology. The primary way of removing heat with air cooling methods is by pulling increasing volumes of airflow through the chassis of the equipment.
Typically, there is a hot aisle behind the racks and a cold aisle in front of them, a configuration which dissipates the heat by exchanging warm air with cooler air. Air cooling is widely deployed and well understood. It is also well ingrained into nearly every data centre around the world. However, as the volume of data grows, it is becoming increasingly likely that air cooling will no longer be able to ensure an appropriate operating environment for energy-dense IT equipment.

Technologies like liquid cooling are proving to be a much more efficient way to remove heat from IT equipment. Precision liquid cooling, for example, circulates small volumes of dielectric fluid across the surface of the server, removing almost 100% of the heat generated by the electronic components. There are no performance-throttling hotspots, and none of the front-to-back air cooling or bottom-to-top immersion constraints present in tank solutions. While initial applications of precision liquid cooling have been in a sealed chassis for cooling server components, given the increased power demands of HDDs, storage devices are also an ideal application.

High density storage demands

With high density HDDs, traditional air cooling pulls air through the system from front to back. What typically occurs in this environment is that disks at the front stay much cooler than those at the back: as the cold air travels through the JBOD device, it gets hotter. This can result in a 20°C or more temperature differential between the disks at the front and back of the unit, depending on the capacity of the hard drive. For any data centre operator, consistency is key. When disks vary by nearly 20°C from front to back, there is inconsistent wear and tear on the drives, leading to unpredictable failure. The same goes for variance across the height of the rack, as lower devices tend to consume the cooler airflow coming up from the floor tiles.
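The front-to-back gradient described above arises because each disk row heats the shared air stream before it reaches the next row, by ΔT = P / (ṁ · cp) per row. A minimal sketch of how such a gradient builds up (the row count, per-row power and airflow here are hypothetical, chosen only to show how a spread near the 20°C figure can arise):

```python
CP_AIR = 1005.0  # J/(kg*K), specific heat capacity of air

def disk_inlet_temps(t_inlet_c: float, rows: int, watts_per_row: float,
                     mass_flow_kg_s: float) -> list[float]:
    """Air temperature seen by each successive disk row, front to back.

    Each row adds watts_per_row / (mass_flow * cp) to the air stream
    before it reaches the next row, so rear rows see progressively
    hotter cooling air.
    """
    temps, t = [], t_inlet_c
    for _ in range(rows):
        temps.append(t)
        t += watts_per_row / (mass_flow_kg_s * CP_AIR)
    return temps

# Illustrative JBOD: 12 disk rows, ~90 W dissipated per row, 0.05 kg/s airflow.
temps = disk_inlet_temps(25.0, 12, 90.0, 0.05)
print(f"front disk air: {temps[0]:.1f} C, rear disk air: {temps[-1]:.1f} C")
print(f"front-to-back spread: {temps[-1] - temps[0]:.1f} C")
```

Under these assumed figures the rear row sees air nearly 20°C warmer than the front row, matching the order of magnitude the article describes; real spreads depend on the actual drive power, row count and fan flow.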
Liquid cooling for storage

While there will always be variances and different tolerances within any data centre environment, liquid cooling can mitigate these variances and improve consistency. In 2022, Meta published a study showcasing how an air cooled, high density storage system was reengineered to utilise single phase liquid cooling. The study found that precision liquid cooling was a more efficient means of cooling the HDD racks, with the following results:

The variance in temperature of all HDDs was just 3°C, regardless of location inside the JBODs
HDD systems could operate reliably with rack water inlet temperatures up to 40°C
System-level cooling power was less than 5% of the total power consumption
Acoustic vibrational issues were mitigated

While consistency is a key benefit, cooling all disks at a higher water temperature is important too, as it means data centre operators do not need to provide chilled water to the unit. Reduced resource consumption – electrical, water, space, audible noise – all leads to a greater reduction in TCO and improved ESG compliance, both of which are key benefits for today’s data centre operators.

As demand for data storage continues to escalate, so will the solutions needed by hyperscale data centre providers to efficiently cool the equipment. Liquid cooling for high density storage is proving to be a viable alternative, as it cools the drives at a more consistent temperature and removes vibration from fans, with lower overall end-to-end power consumption and improved ESG compliance. At a time when data centre operators are under increasing pressure to reduce energy consumption and improve sustainability metrics, this technology may not only be good for the planet, but also good for business.

Enabling innovation in storage systems

Today’s HDDs are designed with forced air cooling in mind, so it stands to reason that air cooling will continue to play a role in the short term.
For storage manufacturers to embrace new alternatives, demonstrations of liquid cooling technology, like the one Meta conducted, are key to ensuring adoption. Looking at technology trends, constantly increasing fan power in a rack is not a sustainable long-term solution. Data halls are not getting any larger, costs to cool a rack are increasing, and the need for more data storage capacity at greater density is growing exponentially. Storage designed for precision liquid cooling will be smaller, use fewer precious materials and components, perform faster and fail less often. The ability to deliver a more cost-effective HDD storage solution in the same cubic footprint delivers not only a TCO benefit but also contributes to greater ESG value. Making today's technology more efficient and removing the limiting factors for new and game-changing data storage methods can help us meet the global challenges we face, and is a step towards enabling a better future.
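The sub-5% cooling-power figure from the Meta study is striking next to the roughly 40% of total energy that cooling can consume in a conventional air-cooled facility, as cited earlier in this article. A back-of-the-envelope comparison, using an assumed 1 MW IT load (a round number for illustration, not a figure from either study), shows the scale of the difference.

```python
# Back-of-the-envelope comparison of facility power at two cooling overheads:
# ~40% of total energy (typical air cooling, per the figure quoted in this
# article) versus <5% (the liquid-cooled storage result from the Meta study).
# The 1 MW IT load is an assumed round number, not a measured value.

def total_power_kw(it_load_kw, cooling_fraction_of_total):
    """Total facility power if cooling consumes the given fraction of it.

    total = it + cooling and cooling = fraction * total,
    so total = it / (1 - fraction).
    """
    return it_load_kw / (1.0 - cooling_fraction_of_total)

it_kw = 1000.0
air_total = total_power_kw(it_kw, 0.40)     # about 1667 kW
liquid_total = total_power_kw(it_kw, 0.05)  # about 1053 kW
saving_pct = 100.0 * (air_total - liquid_total) / air_total
print(f"air: {air_total:.0f} kW, liquid: {liquid_total:.0f} kW, "
      f"saving: {saving_pct:.0f}%")
```

Under these assumptions, the same IT load needs roughly a third less total facility power, which is the TCO and ESG argument in concrete terms.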

Castrol and Hypertec accelerate immersion cooling technology
Castrol has announced a collaboration with Hypertec to accelerate the widespread adoption of Hypertec's immersion cooling solutions for data centres, supported by Castrol's fluid technology. The two companies will develop and test the immersion cooling technology at Castrol's global headquarters in Pangbourne, UK.

Castrol announced in 2022 that it would invest up to £50m in its Pangbourne headquarters. It is pleased to have the first systems in place and fully functional, allowing research to begin on furthering immersion cooling technologies across systems, servers and fluids to provide world-class, integrated solutions to customers. Hypertec is the first server OEM to join Castrol in its drive to accelerate immersion cooling technology.

The two will leverage Castrol's existing collaboration with Submer, a leader in immersion cooling technology, which has provided its SmartPod and MicroPod tank systems to the Pangbourne facility; these have been modified to test new fluids and new server technologies. Working together, Castrol will be able to continue developing its offerings for data centre customers and look to accelerate the adoption of immersion cooling as a path towards more sustainable and more efficient data centre operations. With immersion cooling, water usage and the power consumption needed to operate and cool server equipment can be significantly reduced.

Vertiv's guidance on data centres during extreme heat
Summer in the northern hemisphere has just started, but devastating heatwaves have already washed over much of the US, Mexico, Canada, Europe and Asia. Widespread wildfires in Canada have triggered air quality alerts across that country and much of the eastern half of the US, while similar extreme heat events across Asia have caused widespread power outages. Europe, the fastest-warming continent, also continues to break heat records. The data centre cooling experts at Vertiv have issued updated guidance for managing the extreme heat.

Climate change has made the past eight years the hottest on record, and with an El Niño weather pattern compounding the issue this year, many forecasts anticipate record-breaking temperatures in 2023. The sizzling outdoor temperatures and their aftermath create significant challenges for data centre operators, who already wage a daily battle with the heat produced within their facilities. There are steps organisations can take to mitigate the risks associated with extreme heat. These include:

- Clean or change air filters: The eerie orange haze that engulfed New York was a powerful visual representation of one of the most immediate and severe impacts of climate change. For data centre operators, it should serve as a reminder to clean or change the air filters in their data centre thermal management and HVAC systems. Those filters help to protect sensitive electronics from particulates in the air, including smoke from faraway wildfires.

- Accelerate planned maintenance and service: Extreme heat and poor air quality tax more than data centre infrastructure systems. Electricity providers often struggle to meet the surge in demand that comes with higher temperatures, and outages are common. Such events are not the time to discover problems with a UPS system or cooling unit. Cleaning condenser coils and maintaining refrigerant charge levels are examples of proactive maintenance that can help prevent unexpected failures.

- Activate available efficiency tools: Many modern UPS systems are equipped with high-efficiency eco-modes that can reduce the amount of power the system draws from the grid. Heatwaves like those seen recently push the grid to its limits, meaning any reduction in demand can be the difference between uninterrupted service and a devastating outage.

- Leverage alternative energy sources: Not all data centres have access to viable alternative energy, but those that do should leverage off-grid power sources. These could include on-site or off-site solar arrays, or other alternatives such as off-site wind farms and lithium-ion batteries, to enable peak shifting or shaving. Use of generators is discouraged during heatwaves unless an outage occurs, as diesel generators produce more greenhouse gas emissions associated with climate change than backup options using alternative energy. In fact, organisations should postpone planned generator testing when temperatures are spiking.

“These heatwaves are becoming more common and more extreme, placing intense pressure on utility providers and data centre operators globally,” says John Niemann, Senior Vice President for the Global Thermal Management Business at Vertiv. “Organisations must match that intensity with their response, proactively preparing for the associated strain not just on their own power and cooling systems, but on the grid as well. Prioritising preventive maintenance service and collaborating with electricity providers to manage demand can help reduce the likelihood of any sort of heat-related equipment failure.”

“Again this year, parts of Europe are experiencing record-setting heat, and in our business we specifically see the impact on data centres. Prioritising thermal redundancy and partnering with a service provider with widespread local presence and first-class restoration capabilities can make the difference in data centre availability,” says Flora Cavinato, Global Service Portfolio Director.
“Swift response times and proactive maintenance programs can help organisations to sustain their business operations while effectively optimising their critical infrastructure.”
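Vertiv's preparedness steps above amount to a simple decision checklist. The sketch below encodes them as one; the condition names and the 35°C threshold are illustrative assumptions of this article's summary, not values from Vertiv's guidance.

```python
# A minimal sketch encoding the extreme-heat preparedness steps above as a
# checklist generator. Thresholds and condition names are assumptions made
# for illustration, not figures from Vertiv.

def heatwave_actions(forecast_high_c, air_quality_poor, grid_strained):
    """Return the recommended preparedness actions for the given conditions."""
    actions = []
    if forecast_high_c >= 35.0:  # assumed heatwave threshold
        actions.append("accelerate planned maintenance and service")
        actions.append("postpone planned generator testing")
    if air_quality_poor:         # e.g. wildfire smoke alerts
        actions.append("clean or change air filters")
    if grid_strained:
        actions.append("activate UPS eco-mode")
        actions.append("shift load to alternative energy sources")
    return actions

for action in heatwave_actions(38.0, air_quality_poor=True, grid_strained=True):
    print("-", action)
```

In practice these decisions sit inside a DCIM or monitoring workflow rather than a standalone script; the point is that each trigger (heat, smoke, grid strain) maps to a distinct, pre-agreed set of actions.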


