
Cooling


Castrol’s work with Submer on immersion cooling gains momentum
Castrol is working with Submer, a frontrunner in immersion cooling systems. Castrol’s cooling fluids are designed to maximise data centre cooling efficiency, offer enhanced equipment protection and help safeguard against facility downtime through comprehensive material testing.

Immersion cooling involves submerging electronic components in a non-conductive liquid coolant. Compared to conventional cooling methods, it can reduce the energy and water consumed in cooling servers and storage devices, and it enables the reuse of the waste heat generated.

Submer has now conducted a comprehensive compatibility study through its Immersion Centre of Excellence on Castrol ON Immersion Cooling Fluid DC 20. The product exceeded Submer’s technical requirements and has now been fully approved and warranty-backed for use across Submer equipment.

“This is a significant milestone in Castrol’s collaboration with Submer,” comments Nicola Buck, CMO, Castrol. “We are now well-positioned to work together in developing a joint offer to data centre customers and magnify each other’s market reach and impact. We aspire to make immersion cooling technology mainstream in the data centre industry and will develop integrated customer offers to further improve the technology and address hurdles to adoption.”

Nicola adds, “We strongly believe in the future of immersion cooling technology as it can help achieve significant operational benefits for the data centre industry and address some of its future challenges. Through this collaboration with Submer, we want to help optimise the efficiency and energy usage across some of the world’s most powerful data centres. This is also aligned to Guiding Principle 4 of our PATH360 sustainability programme and the work we are doing with our customers to help them save energy, waste and water.”

“Submer’s journey to fluid standardisation began in 2020. We’re now proud to be seeing tangible results from our rigorous array of testing, including thermal performance, oxidation, and a sustainability assessment, all of which ensure all fluids meet industry standards. With Castrol on board, the widescale adoption of immersion cooling is one step closer,” says Peter Cooper, VP of Fluids and Chemistry at Submer.

Improved data centre resilience and efficiency is a cool outcome from Schneider Electric upgrade at UCD
The Future Campus project at University College Dublin called for space occupied by facility plant and equipment to be given up for development to support the student population. Total Power Solutions, an Elite Partner to Schneider Electric, worked with UCD’s IT Services organisation to upgrade its primary data centre cooling system, providing greater resilience for its HPC operations while releasing valuable real estate.

Introduction: data centres at Ireland’s largest university

University College Dublin (UCD) is the largest university in Ireland, with a total student population of about 33,000. It is one of Europe’s leading research-intensive universities, with faculties of medicine, engineering and all major sciences, as well as a broad range of humanities and other professional departments.

The university’s IT infrastructure is essential to its successful operation for academic, administration and research purposes. The main campus at Belfield, Dublin is served by two on-premises data centres that support all the IT needs of students, faculty and staff, including high-performance computing (HPC) clusters for computationally intensive research. The main data centre in the Daedalus building hosts all the centralised IT, including storage, virtual servers, identity and access management, business systems, networking and network connectivity, in conjunction with a smaller on-premises data centre.

“Security is a major priority, so we don’t want researchers having servers under their own desks. We like to keep all applications inside the data centre, both to safeguard against unauthorised access - as universities are desirable targets for hackers - and for ease of management and efficiency.”

Challenges: ageing cooling infrastructure presents downtime threat and reputational damage

Resilience is a key priority for UCD’s IT Services. With the campus located close to Dublin’s city centre, real estate is also at a premium: there are continuing demands for more student facilities and, consequently, a need for support services such as IT to make more efficient use of space. Finally, there is a pervasive need to maintain services as cost-effectively as possible and to minimise environmental impact, in keeping with a general commitment to sustainability.

As part of a major strategic development of the university’s facilities called Future Campus, the main Daedalus data centre was required to free up some outdoor space taken up by mechanical plant and make it available for use by another department. The IT Services organisation took this opportunity to revise the data centre cooling architecture to make it more energy and space efficient, as well as more resilient and scalable.

“When the data centre was originally built, we had a large number of HPC clusters and consequently a high rack power density,” says Tom Cannon, Enterprise Architecture Manager at UCD. “At the time we deployed a chilled-water cooling system as it was the best solution for such a load. However, as the technology of the IT equipment has advanced to provide higher processing capacity per server, the cooling requirement has reduced considerably even though the HPC clusters have greatly increased in computational power.”

One challenge with the chilled-water system was that it relied upon a single set of pipes to supply the necessary coolant, which therefore represented a single point of failure. Any issue with the pipework, such as a leak, could threaten the entire data centre with downtime.
This could create problems at any time in the calendar; however, were it to occur at critical moments such as exams or registration, it would have a major impact on the university community. The reputational damage, both internal and external, would also be significant.

Solution: migration to Schneider Electric Uniflair InRow DX cooling resolves reliability, scalability and space constraints

UCD IT Services took the opportunity presented by the Future Campus project to replace the existing chilled-water cooling system with a new solution based on Schneider Electric’s Uniflair InRow Direct Expansion (DX) technology, which uses a refrigerant vapour expansion and compression cycle. The condensing elements have been located on the roof of the data centre, freeing up significant ground space on the site formerly used for the cooling plant.

Following an open tender, UCD selected Total Power Solutions, a Schneider Electric Elite Partner, to deliver the cooling upgrade project. Total Power Solutions had previously carried out several power and cooling infrastructure installations and upgrades on the campus and is considered a trusted supplier to the university. Working together with Schneider Electric, Total Power Solutions was responsible for the detailed design of an optimum solution to meet the data centre’s needs and for its integration into the existing infrastructure.

A major consideration was to minimise disruption to the data centre layout, keeping in place the Schneider Electric EcoStruxure Row Data Centre System (formerly called a Hot Aisle Containment Solution, or HACS). The containment solution is a valued component of the physical infrastructure, ensuring efficient thermal management of the IT equipment and maximising the efficiency of the cooling effort by minimising the mixing of the cooled supply air and the hot return, or exhaust, airstream.

The new cooling system provides a highly efficient, close-coupled approach which is particularly suited to high-density loads. Each InRow DX unit draws air directly from the hot aisle, taking advantage of higher heat-transfer efficiency, and discharges room-temperature air directly in front of the cooling load. Placing the unit in the row yields 100% sensible capacity and significantly reduces the need for humidification.

Cooling efficiency is a critical requirement for operating a low-PUE data centre, but the most obvious benefit of the upgraded cooling system is the built-in resilience afforded by the 10 independent DX cooling units. There is no longer a single point of failure; there is sufficient redundancy in the system that if one of the units fails, the others can take up the slack and continue delivering cooling with no impairment of the computing equipment in the data centre. “We calculated that we might just have managed with eight separate cooling units,” says Tom, “but we wanted the additional resilience and fault tolerance that using 10 units gave us.”

Additional benefits of the new solution include its efficiency: the system is now sized according to the IT load and avoids overcooling the data centre, both reducing energy use and improving its PUE. In addition, the new cooling system is scalable to accommodate further HPC clusters or innovations in IT, such as the introduction of increasingly powerful but power-hungry CPUs and GPUs.
“We designed the system to allow for the addition of four more cooling units if we need them in the future,” says Tom. “All of the power and piping needed is already in place, so it will be a simple matter to scale up when that becomes necessary.”

Implementation: upgrading a live environment at UCD

It was essential that the data centre kept running as normal while the new system was installed, with no downtime. The IT department and Total Power Solutions adopted what Tom Cannon calls a ‘Lego block’ approach: first consolidating some of the existing servers into fewer racks, then moving the new cooling elements into the freed-up space. The existing chilled-water system continued to function while the new DX-based system was installed, commissioned and tested. Finally, the obsolete cooling equipment was decommissioned and removed.

Although the project was implemented at the height of the COVID-19 pandemic, with all the restrictions on movement and the negative implications for global supply chains, it ran to schedule and the new equipment was successfully installed without any disruption to IT services at UCD.

Results: a cooling boost for assured IT services and space freed for increased student facilities

The new cooling equipment has resulted in an inherently more resilient data centre, with ample redundancy to ensure reliable ongoing delivery of all hosted IT services should one of the cooling units fail. It has also freed up much valuable real estate that the university can deploy for other purposes. As an example, the building housing the data centre is also home to an Applied Languages department. “They can be in the same building because the noise levels of the new DX system are so much lower than the chilled-water solution,” says Tom. “That is clearly an important issue for that department, but the DX condensers on the roof are so quiet you can’t tell that they are there. It’s a much more efficient use of space.”

With greater virtualisation of servers, the overall power demand for the data centre has been dropping steadily over the years. “We have gone down from a power rating of 300kW to less than 100kW over the past decade,” says Tom. The Daedalus data centre now comprises 300 physical servers, but there are a total of 350 virtual servers split over both data centres on campus. To maximise efficiency, the university also uses EcoStruxure IT management software from Schneider Electric, backed up with a remote monitoring service that keeps an eye on all aspects of the data centre’s key infrastructure and alerts IT Services if any issues are detected.

The increasing virtualisation has seen the Power Usage Effectiveness (PUE) ratio of the data centre drop steadily over the years. PUE is the ratio of total power consumption to the power used by the IT equipment alone and is a well-understood metric for electrical efficiency: the closer the PUE rating is to 1.0, the better. “Our initial indications are that we have managed to improve PUE from an average of 1.42 to 1.37,” says Tom. “However, we’re probably overcooling the data centre load currently, as the new cooling infrastructure settles.
Once that’s happened, we’re confident that we can raise temperature set points in the space and optimise the environment to make the system more energy efficient, lower the PUE and get the benefit of lower operating costs.”

The overall effects of installing the new cooling system are therefore: greater resilience and peace of mind; more efficient use of space for the benefit of the university’s main function of teaching; greater efficiency of IT infrastructure; and, consequently, a more sustainable operation into the future.
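To illustrate what a PUE improvement of that size means in energy terms, the sketch below applies the definition quoted above to the figures in this article. It is a rough, hypothetical calculation: the 100kW IT load is taken from the “less than 100kW” quote, and continuous year-round operation is an assumption made for the example, not a figure supplied by UCD.

```python
# Illustrative PUE arithmetic using the figures quoted in the article.
# Assumptions: ~100 kW average IT load and continuous (8,760 h/year) operation.

IT_LOAD_KW = 100          # assumed average IT load (kW)
HOURS_PER_YEAR = 8760     # continuous operation assumed
PUE_BEFORE = 1.42
PUE_AFTER = 1.37

def facility_energy_kwh(it_load_kw: float, pue: float, hours: float) -> float:
    """Total facility energy = IT energy x PUE."""
    return it_load_kw * hours * pue

before = facility_energy_kwh(IT_LOAD_KW, PUE_BEFORE, HOURS_PER_YEAR)
after = facility_energy_kwh(IT_LOAD_KW, PUE_AFTER, HOURS_PER_YEAR)

print(f"Annual facility energy at PUE {PUE_BEFORE}: {before:,.0f} kWh")
print(f"Annual facility energy at PUE {PUE_AFTER}: {after:,.0f} kWh")
print(f"Estimated saving: {before - after:,.0f} kWh/year")
# Under these assumptions the 0.05 PUE improvement is worth roughly 43,800 kWh a year.
```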

Swindon data centre goes carbon neutral in sustainability push
A data centre in Swindon, Carbon-Z, has become one of the first in the UK to be fully carbon neutral, following an overhaul of its site and work practices. This includes the submersion of all hardware components in cooling liquid and sourcing electricity from green energy providers. Plans are also in place to install solar panels on the site’s roof.

The site was previously known as SilverEdge and is rebranding to reflect the change of direction in how it operates and the services it provides to clients. It now hopes to inspire a wider shift towards sustainability within the data centre industry, which accounts for more greenhouse gas emissions annually than commercial flights.

Jon Clark, Commercial and Operations Director at Carbon-Z, comments, “As the UK and the world move towards achieving net zero emissions by 2050, our industry is responsible for making data centres greener and more efficient. At Carbon-Z, we continually look for new ways to improve our sustainability, with the goal being to get our data centres to carbon neutral, then carbon zero and then carbon negative. We believe this is possible and hope to see a wider movement among our peers in the same direction over the coming years.”

Playing it cool

The growing intensity of computing power, as well as high performance demands, has resulted in rapidly rising temperatures within data centres and a negative cycle of energy usage: more computing means more power, more power means more heat, more heat demands more cooling, and traditional air-cooling systems consume massive amounts of power, which in turn contributes to the heating up of sites.

To get around this, Carbon-Z operates using liquid immersion cooling, a technology which involves submerging hardware components in a dielectric liquid (one that does not conduct electricity) that conveys heat away from the heat source. This greatly reduces the need for cooling infrastructure and costs less than traditional air cooling. The smaller amount of energy now needed to power the Swindon site can be sourced through Carbon-Z’s Green Energy Sourcing.

While it’s clear that immersion cooling is quickly catching on - it is predicted to grow from $243 million this year to $700 million by 2026 - the great majority of the UK’s more than 600 data centres are not making use of it, and continue to operate in a way that is highly energy intensive and carbon emitting.

Riding the wave

As part of its rebrand, Carbon-Z has also updated the services it offers to customers to make sure that they are financially, as well as environmentally, sustainable. Its new service, Ocean Cloud, has been designed with this in mind, providing customers with dedicated servers and a flat-fee approach to financing. Having a dedicated server within a data centre means that spikes in demand from other tenants have no effect on yours, avoiding the ‘noisy neighbour’ problem associated with the multi-tenant model favoured by many large operators. This makes the performance of the server more reliable and energy efficient.

Ocean Cloud also solves one of the other major problems with cloud services - overspend - through its flat-fee approach. Customers are charged a fixed fee that covers the dedicated server and associated storage, as well as hosting and remote support of the hardware infrastructure to reduce maintenance overheads.
Jon comments, “We are very proud of Ocean Cloud, as it allows us to offer clients a service that is not only better for the ocean, the planet and for our local communities than other hosted services, but also brings clear operational and cost-related benefits. Striking this balance is crucial to ensure customers are on board with the transition to more sustainable data centre operations, especially at times like these when many companies are feeling the financial pinch off the back of rising inflation.”
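For readers who want to unpack the market forecast quoted above, the sketch below works out the compound annual growth rate implied by growth from $243 million to $700 million. It assumes “this year” refers to 2022, the article’s timeframe, which is an inference rather than a stated fact.

```python
# Implied compound annual growth rate (CAGR) for the immersion cooling market,
# using the figures quoted above. Assumes a 2022 base year and a 2026 end year.

start_value = 243e6   # market size "this year" (assumed 2022), USD
end_value = 700e6     # forecast market size in 2026, USD
years = 2026 - 2022

cagr = (end_value / start_value) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")   # roughly 30% per year under these assumptions
```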

Castrol to build data centre immersion cooling test facilities
Castrol plans to build new state-of-the-art immersion cooling development and test facilities for data centres at its global headquarters in Pangbourne, UK. This commitment is part of bp’s recent announcement of plans to invest up to £50 million to set up a new battery test centre and analytical laboratories at the UK site.

The new facility will help Castrol’s thermal management experts to accelerate the development of advanced immersion fluid technologies specifically for data centres and IT/communications infrastructure. It will also support test and validation programmes for customers and partners. Building on its existing collaboration with Submer, Castrol plans to install Submer’s SmartPod and MicroPod tank systems, which have been adapted to test new fluids and new server equipment. Earlier this year, the two companies agreed to work together on accelerating the adoption and development of immersion cooling technology as a path to more sustainable data centre operations. This announcement will help both parties to intensify their collaboration and strengthen joint development programmes. The facilities will also be used to develop and test methods to capture and reuse the heat from data centre operations to further increase operational efficiency.

“Immersion cooled data centres could bring huge gains in performance and big reductions in energy wasted in cooling. Together, Submer and Castrol aim to deliver sustainable solutions as the demand for computer power continues to surge. This investment in proven Submer systems is a key step towards joint development with the goal of enhancing performance and improving data centre sustainability even further through integrated energy solutions,” says Rebecca Yates, bp’s Technology Vice President - Advanced Mobility and Industrial Products.

“Castrol’s investment in Submer’s systems is the next step in our joint mission to accelerate the adoption of immersion cooling technology within the IT industry. The combined expertise of Submer and Castrol aims to provide evidence of how the technology can enhance performance, efficiency and deliver environmental benefits. We look forward to working with Castrol and the wider bp corporation to help the industry become more sustainable,” says Daniel Pope, Co-Founder and CEO, Submer.

The acceleration of Castrol’s ambitions in immersion cooling is aimed at supporting the data centre sector’s drive to reduce its environmental footprint. According to the International Energy Agency, data centre operations, together with the data transmission network, were responsible for over 2% of global electricity consumption in 2020. With sizeable growth expected in the industry, this share is expected to rise. The energy required to cool a data centre makes up close to 40% of the total energy consumed, so efficient cooling is key to operating data centres sustainably. Immersion cooling can also help reduce the water consumption of data centres, which is of increasing importance. Accelerating the adoption of immersion cooling fits with Castrol’s PATH360 sustainability framework, which aims to reduce energy, waste and water and help its commercial customers meet their sustainability goals.
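To put the “close to 40% of total energy” figure in context, here is a simplified back-of-the-envelope relation between the cooling share of facility energy and the PUE a site would report. It deliberately ignores other overheads such as power distribution losses and lighting, so treat it as a rough sketch rather than an industry formula.

```python
# Rough relation between cooling share and PUE, ignoring non-cooling overheads.
# If cooling consumes a fraction f of total facility energy and everything else
# is IT load, then IT share = 1 - f and PUE = total / IT = 1 / (1 - f).

def implied_pue(cooling_fraction: float) -> float:
    """PUE implied by a given cooling share, assuming no other overheads."""
    return 1.0 / (1.0 - cooling_fraction)

for f in (0.20, 0.30, 0.40):
    print(f"Cooling at {f:.0%} of total energy -> PUE ~= {implied_pue(f):.2f}")
# A 40% cooling share corresponds to a PUE of roughly 1.67 under this simplification,
# which is why cutting cooling energy is the most direct lever on PUE.
```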

Schneider Electric delivers cooling infrastructure for University College Dublin
Schneider Electric has worked with Total Power Solutions to design and deliver a new, high-efficiency cooling system to help reduce the PUE of University College Dublin’s (UCD) main production data centre. UCD’s data centre was originally designed to accommodate high performance computing (HPC) clusters and provides a platform for research at its university campus.

University College Dublin is the largest university in Ireland, with a total student population of 33,000. It is also one of Europe’s leading research-intensive universities, with faculties of medicine, engineering and major sciences, as well as a broad range of humanities and other professional departments. As part of a new strategic development plan to free up space at its central Dublin location, the IT services department made the decision to revise and revitalise its data centre cooling architecture to make the facility more energy and space efficient, as well as more resilient and scalable.

In response to a public tender, Total Power Solutions, experts in power and cooling infrastructure design and installation, worked with Schneider Electric to secure the contract with a bid to replace the existing data centre cooling system with a Uniflair InRow Direct Expansion (DX) solution. Schneider Electric’s InRow DX cooling technology offers many benefits, including a modular design, more predictable cooling, and variable-speed fans which help to reduce energy consumption.

A scalable and efficient cooling solution for UCD

The new solution at UCD is based on 10 independent InRow DX cooling units, which are adapted to the server load to optimise efficiency. The system is scalable to enable UCD’s IT Services Group to add further HPC clusters and accommodate future innovations in technology, including the introduction of increasingly powerful central processing units (CPUs) and graphics processing units (GPUs). The InRow DX cooling units work in conjunction with UCD’s existing EcoStruxure Row Data Centre system, formerly a Hot Aisle Containment Solution (HACS), and provide a highly efficient, close-coupled design that is suited to high-density loads. Each InRow DX unit draws air directly from the hot aisle, taking advantage of higher heat-transfer efficiency, and discharges room-temperature air directly in front of the cooling load, which significantly reduces the need for humidification.

“We designed the system to allow for the addition of four more cooling units to meet future requirements for facility expansion and changes in server technology. The overall effects of installing the new system are greater resilience and peace of mind, more efficient use of space for the benefit of the university’s main function of teaching, greater efficiency of IT infrastructure and consequently, a more sustainable operation,” says Tom Cannon, Enterprise Architecture Manager at UCD.

Resilience and future expansion

Each independent cooling unit also provides additional redundancy in the system, so that if one fails the others have sufficient capacity to continue delivering cool air, ensuring uninterrupted operation of UCD’s IT equipment and services. Schneider Electric and Total Power Solutions also worked together to increase the resilience of the system and remove a major single point of failure which previously existed. This is another major benefit to the university and eliminates the risk of outages at critical times such as clearing and examinations.
Further, the condensing elements of the cooling system have been relocated to the roof of the data centre, freeing up significant space formerly used for external cooling plant and equipment. This has released additional land for redevelopment to house new student and university facilities, and the building is now home to an Applied Languages department, illustrating the low noise levels of the DX system compared to the equipment it replaced. The increased efficiency of the new cooling system has also lowered the data centre’s PUE, reducing its energy consumption and its ongoing operational expenses.

“The Daedalus data centre at UCD hosts everything from high performance computing clusters for research to the centralised IT that keeps the University running. Total Power Solutions and Schneider Electric worked together to deliver a new, more efficient, and scalable data centre cooling system. The installation took place in a live environment with no downtime, in the midst of extensive construction activities on UCD’s Belfield Campus,” says Paul Kavanagh, Managing Director, Total Power Solutions.

“For UCD, having an efficient and highly effective cooling infrastructure was critical to both their HPC research infrastructure and their general IT operations,” adds Marc Garner, VP, Secure Power Division, Schneider Electric UK and Ireland. “By working together with Total Power Solutions, we were able to successfully deliver the new cooling architecture, which will provide UCD with greater resilience of their critical IT systems and will meet the demands of this prestigious university for many years ahead.”
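The resilience argument above, 10 units where roughly eight would have carried the load, can be expressed as a simple capacity check. The sketch below is illustrative only: the per-unit capacity and design heat load are placeholder values, as the article does not publish those figures for the UCD installation.

```python
# Simple N+R redundancy check for a bank of in-row cooling units.
# All numbers here are placeholders for illustration; the article does not
# publish per-unit capacities for the UCD installation.

def can_carry_load(units_total: int, units_failed: int,
                   unit_capacity_kw: float, it_load_kw: float) -> bool:
    """True if the surviving units can still absorb the full heat load."""
    surviving_capacity = (units_total - units_failed) * unit_capacity_kw
    return surviving_capacity >= it_load_kw

UNIT_CAPACITY_KW = 15   # assumed capacity per in-row unit (placeholder)
IT_LOAD_KW = 100        # assumed design heat load (placeholder)

for failed in range(0, 4):
    ok = can_carry_load(10, failed, UNIT_CAPACITY_KW, IT_LOAD_KW)
    print(f"{failed} unit(s) failed: load covered = {ok}")
# With these placeholder numbers, ten 15 kW units tolerate the loss of up to three
# units before capacity drops below the 100 kW load, whereas eight units would
# tolerate only a single failure.
```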

Nidec adds new products to its line-up of cooling modules
Nidec has announced that it has added products to its line-up of water-cooling modules for data centres.

In recent years, data centres that support ICT services have seen an increase in thermal load due to advances in CPUs (Central Processing Units), GPUs (Graphics Processing Units) and ASICs (Application Specific Integrated Circuits), among other products, making the requirements for cooling components increasingly strict. With high-end CPUs generating more than 300W of heat, heatsinks, fans and other air-cooling systems are considered insufficient to handle such an amount of thermal energy, which is why the need for water-cooling systems is on the rise. Additionally, compared with their air-cooling counterparts, water-cooling products can reduce an entire server’s electricity consumption by approximately 30% (according to the results of Nidec’s own investigations and research).

Nidec has now added to its line-up CDUs (Coolant Distribution Units), manifold units, water-cooling modules, pumps and other products, all of which offer high heat-exchange efficiency and low electricity consumption among water-cooling systems. Since any defect in a water-cooling system can directly cause an ICT service to shut down, redundancy of the CDU’s main components plays a crucial role. Nidec’s latest CDU is a compact, rack-installable (4U) unit with a cooling capability of 80kW. For the first time in the industry, this product has redundancy in its critical components - pump, electric power source and control board (two of each per CDU) - as well as serviceability and long-term reliability features that enable hot swapping (replacing a faulty item without a shutdown).

As a company with the precise design, simulation and machining technologies and equipment used for HDD spindle motors (in which Nidec holds the largest global market share) and other products, Nidec can offer highly reliable products with excellent cooling capability at a low price. Individual water-cooling modules can be customised to meet customers’ requirements such as product sizes and heating values. Nidec stays committed to upgrading its technologies and producing more products in-house to save cost and cover more categories of business and industries.

www.nidec.com
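As a rough illustration of how an 80kW, 4U CDU figure translates into deployment planning, the sketch below sizes a row of liquid-cooled racks against CDU capacity with one redundant unit (N+1). Only the 80kW per-CDU capacity comes from the article; the rack count and per-rack heat load are assumptions made for the example, not Nidec specifications.

```python
import math

# Hypothetical sizing exercise for coolant distribution units (CDUs).
# Only the 80 kW per-CDU capacity comes from the article; rack count and
# per-rack heat load are illustrative assumptions.

CDU_CAPACITY_KW = 80.0   # cooling capability per CDU (from the article)
RACKS = 12               # assumed number of liquid-cooled racks
HEAT_PER_RACK_KW = 30.0  # assumed heat load per rack

total_heat_kw = RACKS * HEAT_PER_RACK_KW
cdus_needed = math.ceil(total_heat_kw / CDU_CAPACITY_KW)
cdus_with_redundancy = cdus_needed + 1   # N+1: tolerate one CDU failure

print(f"Total heat load: {total_heat_kw:.0f} kW")
print(f"CDUs required (N): {cdus_needed}")
print(f"CDUs deployed (N+1): {cdus_with_redundancy}")
# 12 racks x 30 kW = 360 kW -> 5 CDUs to carry the load, 6 installed for N+1.
```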

Scrolling our way to a climate emergency
By Vijay Madlani, co-CEO, Katrick Technologies

The internet is an essential part of modern everyday life. We go online for everything from work to shopping to communicating - and social media makes up an estimated 35% of this activity. Though posing for selfies isn’t damaging to the environment, the energy consumption and carbon emissions from charging devices, powering the internet and running data centres can be colossal.

As of 2022, 4.62 billion people use social media in one form or another. Moreover, of the estimated seven hours each day that the average person spends online, the largest proportion - an estimated 35%, or two hours and 27 minutes - is spent on social media. Hands up if you’re guilty too!

Even the smallest action on social media produces small amounts of carbon. Instagram emits 1.5g of CO2 per minute of scrolling, and posting a photo emits 0.15g. Even in the 28 minutes a day that the average Instagram user browses the app, this would result in at least 42g of CO2 on this platform alone. Each of Facebook’s 2.9 billion active users is estimated to produce 12g of CO2 annually, and on Twitter, sending a single tweet is thought to emit roughly 0.02g - a relatively low figure, until you consider that the 50 million tweets sent out daily across the globe would produce one metric tonne of CO2.

The biggest offender is TikTok. Just one minute of scrolling through TikTok videos emits 2.63g of CO2. Even five minutes a day on TikTok would add up to roughly 4,800g of CO2 a year per user, equivalent to the emissions released by driving over 21 miles in a car.

Though the emissions produced by small actions on an individual scale may not seem significant, they add up considerably when we account for users’ overall social media use. Combining this with other activities highlights that our internet use may be harmful to the environment. But how does using social media actually produce emissions?

Social media relies on the exchange of large amounts of data - data which needs to be securely and reliably stored. One of the most effective ways to do this is in data centres. Meta is constructing an additional seven million square feet of data centre space in the USA and a new centre in Spain, alongside its existing centres. TikTok is set to open its first European data centre later in 2022, and Google currently has 23 worldwide.

Data centres worldwide consume just under 200TWh of energy and produce around the same share of global carbon emissions as the aviation industry, at just over 2%. One of the most significant factors in data centre energy consumption is cooling. Most data centres need to run at a consistent, stable temperature; above these temperatures there are risks of overheating and failure. As servers produce a large amount of heat, keeping the surrounding environment cool to ensure optimum working temperatures is crucial.

With the scale of many data centres, keeping them cool is no mean feat. Not only can cooling systems be expensive to implement, powering them requires significant amounts of energy: across 90% of the UK data centre market, air conditioning and air handling units consume between 26% and 41% of the total energy. To address this, new innovations are in development, like passive cooling systems that use waste heat produced by data centre servers to power a Thermal Vibration Bell (TVB).
This example is a unique patented system from Katrick Technologies, which uses bi-fluids to convert heat into fluid vibrations, which in turn become mechanical oscillations when they hit protruding fins. These fins passively dissipate unwanted heat to provide the ambient temperatures required for servers to run. Initial trials conducted at UK-based data centre provider iomart indicate that the system can reduce the energy used for cooling by 70%, which would significantly reduce an operational carbon footprint.

Finding alternative ways to cool data centres is crucial to support the world’s relentless appetite for social media, so innovation in this sector is more important than ever. Internet usage will inevitably continue to underpin many important aspects of modern life. Though social media has revolutionised communications and made our planet more connected, it is important we understand the environmental consequences of these habits and how to counteract them.

www.katricktechnologies.com
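The per-platform figures quoted above can be turned into a simple annual estimate per user. The sketch below reproduces that arithmetic; the per-minute emissions and daily usage times are the ones cited in the article, and the results are rough orders of magnitude rather than measured emissions.

```python
# Back-of-the-envelope annual CO2 per user, using the per-minute figures
# and daily usage times quoted in the article.

DAYS_PER_YEAR = 365

platforms = {
    # name: (grams of CO2 per minute, minutes per day as cited in the article)
    "Instagram": (1.5, 28),
    "TikTok": (2.63, 5),
}

for name, (g_per_min, mins_per_day) in platforms.items():
    daily_g = g_per_min * mins_per_day
    annual_kg = daily_g * DAYS_PER_YEAR / 1000
    print(f"{name}: ~{daily_g:.0f} g/day, ~{annual_kg:.1f} kg/year per user")

# Instagram: ~42 g/day (~15.3 kg/year); TikTok at five minutes a day: ~4.8 kg/year,
# matching the roughly 4,800 g figure given in the article.
```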

Rising temperatures highlight need for liquid cooling systems
The rising frequency of extreme weather in Europe necessitates a move towards liquid cooling systems, suggests a sector expert. This warning follows record-breaking temperatures in the UK last month, with some locations exceeding 40°C. As a result, a number of high-profile service providers in the country experienced outages that impacted customer services, the effects of which were felt as far away as the US. One operator attributed the failure to ‘unseasonal temperatures’.

However, with the UK Met Office warning that heatwaves are set to become more frequent, more intense and longer-lasting, Gemma Reeves, Data Centre Specialist at Alfa Laval, believes that data centres will need to transition to liquid cooling systems in order to cope. She says, “The temperatures observed last month are a sign of what is to come. Summers are continuing to get hotter by the year, so it’s important that data centres are able to manage the heat effectively.

“Mechanical cooling methods have long been growing unfit for the needs of the modern data centre, with last month’s weather only serving to highlight this. As both outside temperatures and rack densities continue to rise, more efficient approaches to cooling will clearly be necessary.”

Traditional mechanical cooling systems make use of an electrically powered chiller, which creates cold air to be distributed by a ventilation system. However, most mechanical cooling systems in the UK are designed for a maximum outdoor temperature of 32°C - a figure which continues to be regularly exceeded.

Gemma believes that liquid cooling can solve this challenge. Cooling with dielectric fluid rather than air means that the cooling systems can be run at much higher temperatures. Liquid-cooled approaches such as direct-to-chip, single-phase immersive IT chassis, or single-phase immersive tub allow the servers to remain cool despite much higher outdoor air temperatures, while maintaining lower energy consumption and providing options for onward heat reuse. In studies, this has also been shown to increase the lifetime of servers thanks to the stable operating conditions maintained.

Gemma concludes, “The data centre sector remains in an era of air-based cooling. That said, July’s recent heatwave may be the stark reminder the sector needs that these systems are not sustainable in the long term.

“Liquid cooling is truly the future of data centres; this technique allows us to cool quicker and more efficiently than ever before, which will be a key consideration with temperatures on the rise.”

Airedale appoints Adrian Trevelyan as data centre lead
Airedale has announced the appointment of Adrian Trevelyan as Director of its Cloud Services Business Unit. Responsible for one of Airedale’s most significant commercial operations, Adrian will be accountable for a worldwide territory (excluding the US), leading the data centre cooling arm of the Airedale business as it accelerates its growth strategy.

Adrian, who has achieved an MBA and Chartered Manager status during his career to date, brings a wealth of data centre knowledge and strong leadership skills to his new position. Having worked with Airedale by Modine for 32 years, most recently as After-market Director, he has vast experience of the data centre industry, both at a hands-on and strategic level. Reporting to Adrian will be Airedale’s established data centre focussed engineering, project management and commercial teams.

Replacing Adrian as After-market Director will be John Board, who recently joined the company as Service Manager and has already impressed with his passion, knowledge and commitment. John also has a wealth of business-critical cooling experience, holds an accreditation in ‘Data Centre Management and Operations’ from The Uptime Institute, and will work closely with Adrian during a transition period to ensure service continuity for clients.

Adrian will work alongside Airedale’s existing business unit directors, focussed on Airedale’s strategic customer segments, to continue to accelerate Airedale’s ambitious growth strategy. These are Rob Bedard, who leads the Cloud Services division in the US; Jonathan Jones, who leads the Commercial and Industrial Sector for the UK; and Asim Ansari, who leads the Enterprise, Telecoms, Edge and International Channel Sales division. All four business unit leaders report to Vice President, Jonas Caino.

Jonas says, “I am delighted to announce the appointment of Adrian Trevelyan as Cloud Services Business Unit Leader. Adrian has consistently delivered throughout his career with Airedale and is an esteemed member of our management team. Well-liked and trusted by both customers and employees, Adrian’s industry insight and technical knowledge afford him huge respect in the data centre segment and wider HVAC community.

“Having Adrian at the helm as we accelerate our growth strategy into the data centre industry is a great thing for both our customers and our business.”

Adrian comments, “I am thrilled to be taking up this new position with Airedale by Modine. I have been with the company for a long time and am keen to take it to the next level, supporting my teams along the way.”

University of Hull develops a new cooling system for data centres
As heatwaves become more extreme, cooling solutions are increasingly important - but the challenge is how to build cooling systems without contributing to climate change. The Centre for Sustainable Energy Technologies (CSET) at the University of Hull has developed a new, energy-efficient cooling system for data centres. The technology is 50% more efficient than existing indirect evaporative air conditioning and 90% more efficient than mechanical vapour compression. As a result, it delivers close to zero-carbon air conditioning and reaches a Coefficient of Performance (COP) of 52.5.

The new dew point air conditioning technology works by indirect evaporative cooling - no refrigerant or mechanical compressor is used. Instead, it cools the air by water evaporation, but does not add any moisture to the air-conditioned space, such as a data centre. Importantly, thanks to the patented complex heat/mass exchanger, the new technology can produce cooled air at a temperature below the ambient wet-bulb temperature. This reduces the ducting size that would be required with a conventional evaporative air conditioner. The innovation offers a new method of cooling which could greatly reduce power use and be applied worldwide. The technology can be used for cooling a range of buildings but has found a particular home in data centres. The data centre market has grown rapidly over the past 40 years, especially during the pandemic, when global internet traffic surged by 40% at the start of 2020.

The system was first rolled out on an industrial scale in 2020, when CSET installed two 4kW systems at the Aura Innovation Centre. Compared to the existing mechanical vapour compression AC systems installed at the centre, the new units saved 24,000kWh annually. This saves about 5,600kg of carbon emissions and £3,400 off the energy bill per annum. In 2021, funded by the Industrial Energy Efficiency Accelerator (IEEA) programme, a further 100kW system was installed at Hull City Council’s Maritime Data Centre. This saved 350,000kWh in power compared to the previous systems, reduced emissions by 100,000kg and saved £54,750 per annum in energy.

The IEEA supports partnerships between developers of energy- or resource-efficient process technologies and industrial companies willing to demonstrate the solutions on-site. The IEEA is funded by BEIS and managed by the Carbon Trust in partnership with Jacobs and KTN. Phase four of the IEEA is open for applications until 19 September 2022.
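As a cross-check on the savings quoted above, the sketch below back-calculates the grid emission factor and electricity price implied by the Aura Innovation Centre figures. The division is straightforward, but the resulting factors are inferences from the quoted numbers, not values stated by the university.

```python
# Back-calculating the emission factor and electricity price implied by the
# Aura Innovation Centre figures quoted in the article.

energy_saved_kwh = 24_000      # annual energy saving
co2_saved_kg = 5_600           # annual CO2 saving
cost_saved_gbp = 3_400         # annual bill saving

emission_factor = co2_saved_kg / energy_saved_kwh   # kg CO2 per kWh
price_per_kwh = cost_saved_gbp / energy_saved_kwh   # GBP per kWh

print(f"Implied grid emission factor: {emission_factor:.3f} kg CO2/kWh")
print(f"Implied electricity price: £{price_per_kwh:.3f}/kWh")
# Roughly 0.233 kg CO2/kWh and £0.14/kWh, broadly consistent with UK grid
# factors and commercial tariffs of the period.
```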


