Latest News


Microgrids are key to accelerating DC growth, research finds
A combination of renewables, grid balancing engines, and energy storage makes for the most cost-effective microgrids to power data centres, while also cutting emissions and providing vital grid balancing to enable the energy transition, according to a new research paper from technology group Wärtsilä and energy solutions business AVK.

The paper, Data centre dispatchable capacity: a major opportunity for Europe’s energy transition, provides new analysis on how data centre microgrids can reduce grid infrastructure spending, emissions, and wasted energy, while providing a balanced path for the energy transition. The analysis finds that powering data centres across Europe with optimised microgrids could create a significant bank of dispatchable power, supporting the entire continent’s energy transition.

The rapid growth of AI is driving increased demand for data centres across Europe, which is expected to increase by 250% by 2030, from 10GW to 35GW. With the continent’s grid facing constraints from high energy prices and bloated grid connection queues, data centre operators are increasingly turning to off-grid solutions to power these energy-intensive assets.

Anders Lindberg, President of Wärtsilä Energy and Executive Vice President of Wärtsilä, says, “The growth of AI over recent years has been extraordinary, and as it continues to transform the way we live and work, it drives a need for more energy. This is causing significant challenges for grid operators across Europe, who are struggling with rising costs and up to a 10-year waiting time for a grid connection.

“By investing in microgrids, data centres can sidestep energy constraints, and with the right technology mix of renewables, grid balancing engines, and energy storage, can ensure their emissions profiles and costs do not outweigh the huge benefits that AI brings.”

AVK CEO Ben Pritchard comments, “The answer to the challenges we face in combatting climate change is as much to do with changing behaviours as developing new technologies. And the key to behavioural change is the recognition that there are different ways of doing things. The solutions outlined in this paper are not impractical; they are based on real-world cases and calculations. All that’s needed to make them more widespread is for investors, operators, equipment suppliers, planners, and policy makers to recognise the widespread benefits that sharing dispatchable data centre capacity with the grid can bring and pass that knowledge on.”

In addition to the benefits created by microgrids, engine power plants bring cost efficiencies to data centre power generation. Modelling an 80MW data centre, a combination of engine power plants, renewables, and energy storage provides the lowest levelised cost of electricity – at 108 EUR/MWh – in comparison to three other real-world scenarios. It also offers a low-emissions scenario in comparison to the other modelled scenarios, particularly in comparison to gas turbines. The emissions of engine power plants can also decrease as sustainable fuels become commercially available.

“Through investing in flexibility, microgrids can have the lowest possible cost, while cutting emissions dramatically compared to other pathways, including turbines. This flexibility can have a significant, positive impact on the continent’s digital and energy transition,” Anders Lindberg states.

On current trajectories, 40% of existing AI data centres will be operationally constrained by power availability by 2027. Microgrids can take this new strain off the grid in the short term and, when a grid connection is achieved, excess energy generated can be sold. As well as furthering cost reductions for data centre operators, this can provide vital flexibility for Europe's power challenges.

Read the new research paper by clicking here. For more from AVK, click here.
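The 108 EUR/MWh figure above comes from Wärtsilä and AVK's own modelling, but the underlying metric – levelised cost of electricity – is simple to compute: discounted lifetime costs divided by discounted lifetime energy output. A minimal sketch in Python, with all input figures purely illustrative rather than taken from the paper:

```python
def lcoe_eur_per_mwh(capex_eur, annual_opex_eur, annual_output_mwh,
                     lifetime_years, discount_rate):
    """Levelised cost of electricity: the discounted sum of all lifetime
    costs divided by the discounted sum of all lifetime energy output."""
    years = range(1, lifetime_years + 1)
    discounted_costs = capex_eur + sum(
        annual_opex_eur / (1 + discount_rate) ** t for t in years)
    discounted_output = sum(
        annual_output_mwh / (1 + discount_rate) ** t for t in years)
    return discounted_costs / discounted_output

# Illustrative inputs only: an 80 MW microgrid at 90% utilisation
# (~630,000 MWh/year) over a 20-year life at a 7% discount rate
example = lcoe_eur_per_mwh(
    capex_eur=400e6, annual_opex_eur=35e6,
    annual_output_mwh=80 * 8760 * 0.9,
    lifetime_years=20, discount_rate=0.07)
```

Because costs and output are discounted by the same factor, a technology mix that raises utilisation or trims operating cost lowers the LCOE directly – broadly the lever that flexible engine-plus-storage microgrid designs pull.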

Quantum-AI data centre opens in New York City
Oxford Quantum Circuits (OQC) and Digital Realty today announced the launch of the first Quantum-AI Data Centre in New York City, located at Digital Realty’s JFK10 facility and built with NVIDIA GH200 Grace Hopper Superchips.

• Quantum-AI Data Centre — OQC and Digital Realty are working with NVIDIA to integrate superconducting quantum computers and AI supercomputing under one roof, creating a data centre built for the Quantum-AI era.
• Landmark deployment and integration — OQC’s GENESIS quantum computer will integrate NVIDIA Grace Hopper Superchips to become the first-ever quantum computing system deployed in New York City. OQC plans to integrate its quantum hardware with NVIDIA accelerated computing to support the scalability of future systems.
• Quantum-AI at scale — Embedded within Digital Realty’s global platform, PlatformDIGITAL, OQC is delivering secure, interconnected Quantum-AI infrastructure to power breakthroughs from Wall Street to security – a central pillar of the UK–US tech trade partnership to be announced.

The Quantum-AI Data Centre brings together OQC’s quantum computing, NVIDIA accelerated AI hardware, and Digital Realty’s cutting-edge global infrastructure, eliminating geographical and infrastructure barriers to enable businesses to harness the power of quantum compute and AI. This initiative allows enterprises to access an integrated environment where quantum computing powers the AI revolution: enabling faster model training, more efficient data generation, and transformative applications in finance and security.

The system features OQC GENESIS, a logical-era quantum computer, installed within Digital Realty’s secure JFK10 site – the first-ever quantum computer installed within a New York City data centre. Integrated with NVIDIA Grace Hopper Superchips, the platform provides a launchpad for hybrid workloads and enterprise adoption at scale.
OQC expects that future GENESIS systems will ship with NVIDIA accelerated computing as standard, building on its earlier collaboration integrating the NVIDIA CUDA-Q platform and providing developers seamless tools to build hybrid quantum-AI applications.

“This Quantum-AI Data Centre demonstrates how quantum can drive the AI revolution - securely, practically, and at scale - while strengthening the UK–US technology alliance,” says Gerald Mullally, CEO of OQC. “Leveraging Digital Realty’s infrastructure and NVIDIA supercomputing, we are redefining enterprise computing for finance and security.”

“Digital Realty’s mission has always been to enable the world’s most innovative technologies by providing secure, interconnected infrastructure at global scale,” adds Andy Power, President & CEO of Digital Realty. “By working with OQC, we’re using NVIDIA supercomputing to make Quantum-AI directly accessible in one of the world’s most important data hubs - empowering enterprises and governments to unlock new levels of performance and resilience.”

Science Minister Patrick Vallance comments, “Quantum computing could transform everything - from speeding up drug discovery to supercharging clean energy so we can cut bills. The economic prize is enormous, with £212 billion expected to flow into the UK economy by 2045 and tens of thousands of high-skilled jobs on offer. OQC’s launch of the first quantum computer in New York City showcases British tech excellence and strengthens our transatlantic ties. And the industry’s first quantum-AI data centre will put British innovation at the heart of next-gen computing - delivering speed, scale, and security to tackle problems today’s tech is yet to grasp.”

Applications and impact

By integrating quantum computing with NVIDIA AI supercomputing inside a secure, enterprise-grade data centre, OQC and Digital Realty are creating a platform that will unlock new possibilities across critical sectors:

• Finance — Faster and more accurate risk modelling, portfolio optimisation, fraud detection, and derivatives pricing, delivering competitive advantage in the world’s most data-intensive markets.
• Security — Advanced material simulation, logistics optimisation, and decision-making under uncertainty, strengthening resilience in mission-critical domains.
• Quantum for AI — Quantum computing will unlock new frontiers for AI itself, from accelerating model training and efficient data generation to emerging quantum machine learning applications with transformative impact across industries.

“This milestone shows the strength of a British tech leader scaling globally through international collaboration,” says Jack Boyer, Chair of OQC. “Working with Digital Realty and using NVIDIA supercomputing here in the United States, OQC demonstrates how the UK and US can lead together in the responsible deployment of frontier technologies for finance and security.”

“The UK–US technology alliance is vital to ensuring that powerful new capabilities like quantum computing protect our nations, improve our prosperity, and are developed securely and in line with democratic values,” remarks Sir Jeremy Fleming, OQC Board member and former Director of GCHQ. “This deployment combines British innovation and American infrastructure, and brings NVIDIA’s AI leadership to deliver trusted computing power for the most critical applications.”

Proven technology and roadmap

OQC is reportedly the only quantum computing company with live deployments into colocated data centres: it already has systems operating in London and Tokyo, and now in New York. Its patented dual-rail Dimon qubit technology represents a breakthrough in error suppression, reducing the hardware overheads needed for error-corrected qubits and accelerating the path to fault-tolerant quantum computing.

OQC has set a market-leading roadmap – in collaboration with Digital Realty – to deliver scalable, commercially viable systems, with near-term impact in finance, defence, and AI. As a British champion of quantum computing, OQC is committed to building systems that drive both commercial advantage and national resilience.

For more from Digital Realty, click here.

FTTH Congress CEE 2025 to focus on fibre rollout
The FTTH Congress CEE 2025 will take place on 7–8 October at the DoubleTree by Hilton in Warsaw, Poland, bringing together policymakers, operators, investors, and technology providers to address fibre deployment challenges and opportunities across Central and Eastern Europe (CEE). The two-day event, organised by the FTTH Council Europe, is expected to draw more than 400 delegates from across the region’s broadband ecosystem.

Fibre challenges and opportunities in CEE

According to the latest FTTH/B Market Panorama, the CEE region still has more than 13 million homes without fibre access, with rural areas presenting the largest gaps. While markets such as Poland and Romania have seen rapid deployment, others - including Czechia and parts of the Baltics - continue to face regulatory and investment obstacles.

Vincent Garnier, Director General of the FTTH Council Europe, says, “Central and Eastern Europe represents both one of the continent’s biggest fibre challenges and one of its greatest opportunities.

"With millions of homes still unconnected, this Congress is about ensuring that ambition translates into action by bringing together the actors who can make fibre a reality across the region.”

Programme highlights

The programme includes keynote sessions, technical presentations, and national market debates, with participation from the European Commission, BEREC, national regulators, operators, infrastructure investors, and vendors. Key themes will cover:

• Regulatory frameworks and funding to accelerate deployment
• Investment models and cross-border partnerships
• Innovations in network resilience and open access models
• The role of fibre in smart cities, inclusive growth, and digital sovereignty

Country-focused sessions will provide insight into fibre developments in Poland, Czechia, Romania, Ukraine, and the Baltics.

Francesco Nonno, President of the FTTH Council Europe, adds, “This event is a unique chance to address the strategic dimension of fibre. Beyond infrastructure, it is about enabling digital competitiveness, sustainability, and resilience.

"The Congress in Warsaw will highlight how national and European priorities can come together to deliver for citizens and businesses alike.”

Industry reacts as EU Data Act comes into force
The EU Data Act officially comes into effect today, ushering in a new regulatory framework that aims to give users greater rights over their data while imposing fresh obligations on businesses around access, sharing, and cloud portability. The legislation seeks to improve transparency, promote fair competition, and create a more open data economy across Europe. However, industry reactions remain mixed, with some hailing the Act as a positive step forward and others warning of challenges with its implementation.

A call for resilience and flexibility

Tim Pfaelzer, Senior Vice President and General Manager EMEA at Veeam, says the Act arrives at a critical moment for organisations already navigating complex hybrid environments: “Many organisations have embraced hybrid models for their flexibility, but often at the expense of data portability.

"The Act highlights why flexibility must be embedded into operations from the ground up. Proactive action now will not only support compliance, but also become a competitive advantage as data sovereignty and portability grow increasingly central to digital operations.”

An opportunity for trust and openness

Juliet Bramwell, Vice President EMEA at Glean, emphasises the Act’s potential to rebalance the data economy: “By giving users greater access to their own data and removing barriers to switching providers, the Act shifts power back to businesses and consumers.

"Data sovereignty and interoperability are no longer optional; companies that embrace these principles will be better placed to innovate responsibly and build long-term trust in AI and cloud ecosystems.”

Concerns around ambiguity and burden

Adam Blake, CEO of ThreatSpike, welcomes the Act’s intent, but voices concern over its clarity and impact on smaller firms: “The language on forced data sharing is far too ambiguous and could end up weakening security.

"Larger enterprises may have the resources to adapt, but for SMEs, redesigning products and meeting compliance demands could become a serious bottleneck.

"Five years after GDPR, many businesses are still failing to comply [and] I fear this law could face the same fate.”

Balancing ambition with practicality

With the EU Data Act now in force, businesses across Europe will need to assess their compliance strategies, data management policies, and technical architectures to align with the new requirements. While many see it as an opportunity to improve trust and flexibility, others warn of potential risks and burdens. How effectively the Act is enforced - and how businesses adapt - will determine whether it becomes a cornerstone of Europe’s digital transformation, or another layer of complex regulation.

DataX Connect's salary survey results are in
UK data centre recruitment company DataX Connect has today, on National Data Centre Day, released the results of its 2025 Data Centre Salary Survey, coinciding with the fifth anniversary of the company’s founding. The study, which draws on insights from over 1,500 data centre professionals across Europe and the United States, reveals an industry that continues to offer strong pay and rapid progression, but also faces challenges around retention, satisfaction, and pay fairness.

With demand for digital infrastructure only increasing, competition for skilled talent is fiercer than ever. The report shows that while salaries are rising, money alone is no longer enough to keep professionals engaged.

The key findings

• Pay rises ≠ retention — One in five professionals who received a pay increase last year still plan to leave their role. Overall, around 40% of respondents intend to change jobs within the next 12 months.
• Women earn less — DataX Connect suggests the "gender pay gap persists across all levels of seniority."
• Young professionals are progressing fast — One in five professionals with less than five years’ experience, and 30% of under-35s, already hold senior roles. Ambitious, early-career employees are finding fast routes to progression in the sector, with those aged 18–24 already earning an average salary of £64k.
• Competitive pay, low satisfaction — While more than half of respondents believe data centre pay is more competitive than other industries, only one in five are truly satisfied with their compensation. The frustration often comes down to bonuses that feel out of reach or benefits that "aren’t cutting it."

Looking ahead

The findings highlight that, while the data centre sector is a lucrative industry, the next 12 months could mark a critical turning point. Businesses that invest in fairer pay structures and more transparent rewards could have the edge in attracting and retaining great talent.

"The takeaway from this year’s survey is clear: the industry’s doing well, but salary alone won’t solve the bigger challenges," says Andy Davis, Director at DataX Connect and Data eXec. "If we’re serious about retention and satisfaction, we’ve got to do more than just pay competitively.”

DCNN celebrates National Data Centre Day
Today marks the first National Data Centre Day, an annual initiative recognising the vital role of data centres in powering the UK’s digital economy and AI ambitions. Taking place each year on 12 September, the date commemorates when data centres were officially designated as Critical National Infrastructure (CNI) by the UK Government in 2024. The awareness day aims to spotlight the innovation, sustainability, and people driving this essential sector forward, while also encouraging greater recognition of the industry’s contribution to society.

As part of the celebrations, figures from across the sector have begun sharing their reflections on why National Data Centre Day matters, as well as the challenges and opportunities that lie ahead.

Cooling at the forefront

Ted Pulfer, Data Centre Director at Lennox Data Centre Solutions, highlights how cooling has become central to the industry’s progress: "National Data Centre Day is a significant moment for the UK industry. Marking a year since data centres were formally recognised as Critical National Infrastructure, it offers an opportunity to reflect on the evolution and challenges of the past year.

“Cooling, once ‘part of’ the supporting infrastructure, has now moved to the forefront of the conversation, driven by increasing compute densities, AI workloads, and the rise of liquid cooling.

"What’s particularly exciting is the collaboration this has inspired across manufacturers, engineers, and end users. Cooling is no longer a niche issue; it has become a strategic enabler of digital progress.”

Connectivity as the foundation

David Bruce, CRO of Neos Networks, points to the crucial role of fibre in enabling sustainable growth: “National Data Centre Day is a welcome opportunity to celebrate an industry that has quietly become the backbone of our digital lives.

"From powering AI and cloud to supporting healthcare, finance, and public services, data centres are now rightly recognised as Critical National Infrastructure. Their role in enabling growth, innovation, and resilience cannot be overstated.

“But as we look to the future, we must also recognise that data centres do not stand alone. Compute and power are essential, but it is fibre that connects investment to opportunity.

"Without high-capacity, resilient networks stretching across the country, the benefits of our expanding data centre footprint risk being unevenly distributed and bottlenecked."

Recognition and the road ahead

National Data Centre Day provides the sector with a moment to reflect on its progress, showcase innovation, and address the challenges ahead. From cooling breakthroughs to fibre expansion, the themes highlighted today underline the growing strategic importance of digital infrastructure to the UK economy and society at large.

For more on National Data Centre Day, click here.

Manufacturing in the digital age
In this article, Eric Herzog, CMO at Infinidat, explores how to protect your enterprise with cyber resilient storage:

A significant transformation is underway in manufacturing enterprises, as traditional boundaries between Operational Technology (OT) and Information Technology (IT) systems rapidly dissolve. This convergence, driven by ongoing digital transformation and the adoption of Industry 4.0 technology, is enabling manufacturers to achieve new levels of efficiency, productivity, and visibility across their operations. However, as these systems become increasingly integrated, the risks - particularly in the realm of cyber security - are also escalating.

Understanding the changing landscape

Historically, manufacturers have relied on OT systems to manage their core physical processes and machinery on the factory floor, focusing on real-time control and automation. In contrast, IT systems have taken care of data processing, business operations, and enterprise resource planning requirements. Initially, these systems ran independently, but in recent years, manufacturers have invested in more integrated manufacturing environments, where data flows seamlessly between shop floor equipment and enterprise systems.

This integration is essential for efficiency. It enables real-time monitoring, advanced analytics, and data-driven decision-making, leading to optimised production processes and vastly improved business outcomes.

At the heart of a manufacturing business is the Manufacturing Execution System (MES). The MES connects production equipment with business applications, supporting the planning, monitoring, documentation, and control of manufacturing processes in real time. It also acts as a bridge to higher-level ERP systems and industrial automation platforms, providing comprehensive visibility and enabling enterprises to make informed, data-driven decisions.

But herein lies the risk, because integration is a double-edged sword. There are plenty of upsides, but the cyber security risks can bring an enterprise to a grinding halt.

Integration upsides

Here are three of the immediate benefits realised through OT and IT system integration:

• Real-time data analysis — Integrated OT/IT systems allow for immediate feedback and adjustments, reducing downtime and waste.
• Enhanced communication — Seamless data exchange between the shop floor and enterprise systems leads to better coordination and a faster response to issues.
• Optimised production — Enterprises can fine-tune their processes based on live data, improving quality and throughput.

Integration downsides

These operational advantages also expose manufacturers to additional cyber security threats. Cyber risk is a question for all industry sectors: the UK government’s 2024 Cyber Security Breaches Survey found that half of UK businesses experienced a cyber breach or attack in the past year, with the rate even higher among medium (70%) and large (74%) businesses.

Manufacturing enterprises are an especially attractive target for cyber criminals for multiple reasons. They rely on complex, interconnected supply chains. They tend to run a larger number of legacy systems than other industry sectors, which can create security blind spots. They also provide a high-impact target, because a successful cyberattack can disrupt an entire supply chain.

Dealing with a cyberattack is also very costly. According to Make UK, an organisation representing manufacturers, nearly half of British manufacturers suffered cyberattacks in the previous year. A quarter reported losses between £50,000 and £250,000, and 65% experienced production downtime. But the true costs of a cyberattack run much deeper, because many attacks involve data exfiltration. In these cases, sensitive intellectual property or customer information is stolen and potentially sold or leaked.

Data breaches are one of the biggest security threats, and new research from Deloitte - conducted with the Manufacturing Leadership Council in 2024 - quantifies this. The study reported that 48% of manufacturers experienced at least one data breach in the past 12 months, at an average cost of £2.1 million per breach.

The devastating impact of storage-targeted attacks

A ransomware attack on enterprise storage systems can cripple a manufacturer, potentially halting production processes entirely as data and files become encrypted and inaccessible. Such an attack can also compromise the entire manufacturing operation, from design and engineering data to supply chain management information. If key files are encrypted, the enterprise may lose access to product specifications, production schedules, and customer orders. Operations can be brought to a standstill, and the implications are far-reaching, potentially also damaging long-term projects, customer relationships, and the business's reputation.

Investing in cyber resilience is not just business best practice; it is mandated by law. The EU’s NIS2 directive (2024) sets strict requirements for cyber risk management in critical sectors, including manufacturing. And although no longer bound by EU laws, the UK will be releasing its own regulations with the forthcoming Cyber Security and Resilience Bill, expected to be ratified later in 2025.

It is now widely accepted that it’s no longer a case of 'if my enterprise will be attacked', but 'when will I be attacked, how often will I be attacked, and, most importantly, how quickly can I recover?' Cyberattacks are occurring constantly; they have become an inevitable part of being in business. As the likelihood of an attack has grown, so too have the techniques used, and completely preventing any form of cyber security breach is no longer realistic.
Instead, manufacturers should focus on building cyber storage resilience into their enterprise storage and maximising their ability to detect, respond to, and recover quickly from attacks.

Six foundations for cyber resilient storage

A cyber resilient storage infrastructure to support manufacturing business continuity is built on six key principles:

1. Immutable snapshots — Rather than creating simple backups, manufacturers need secure, unalterable data copies taken at specific intervals. These immutable snapshots ensure that critical production and business data remains unchanged after creation, providing a reliable recovery source regardless of attack sophistication.

2. Logical and remote air-gapping — Effective cyber resilient storage requires logical isolation of immutable snapshots from network access. Air-gapping - implemented locally, remotely, or both - creates an additional protection layer that keeps recovery data segregated from potential infection vectors.

3. Automated detection and response — The speed of modern cyberattacks renders manual monitoring insufficient. Manufacturing companies need Automated Cyber Protection (ACP) that integrates seamlessly with their existing security stack, including Security Operations Centres (SOC), Security Information and Event Management (SIEM), and Security Orchestration, Automation, and Response (SOAR) platforms. These systems should automatically trigger immutable snapshots when security incidents are detected.

4. Fenced forensic environment — Recovery from cyberattacks requires a completely isolated network environment for forensic analysis. This 'fenced' area allows for thorough data testing and integrity verification, ensuring that recovered data isn't compromised before reintroduction to production systems.

5. Near-instantaneous recovery — Critical for manufacturing operations is the ability to retrieve clean data copies within minutes, regardless of dataset size. Manufacturing processes are particularly time-sensitive, making rapid recovery capabilities essential for minimising production disruption and financial losses.

6. Scanning for cyber threats in your storage estate — Leveraging advanced AI and ML technology, you can scan your storage at regular intervals to check for cyber threats. This gives you two advantages. First, by scanning regularly, you may uncover a threat and report it to the cyber security elements in your data centre as an 'early warning system.' Second, if you do suffer an attack, the ability to search your immutable snapshots for a dataset free from any cyberattack gives you much faster and more reliable recovery.

Road to proactive cyber storage resilience

The integration of OT and IT is transforming manufacturing and unlocking new efficiencies, but it is also heightening cyber security risk. As cyberattacks become more frequent and sophisticated, manufacturers must adopt a proactive, resilience-focused approach to their cyber security and enterprise storage. This means investing in advanced, cyber resilient storage with robust defences and rapid data recovery capabilities. By prioritising these investments, manufacturing enterprises can reap all the benefits that integration offers, safeguard their operations, and protect data and intellectual property - even in the face of an increasingly hostile cyber threat landscape.

For more from Infinidat, click here.
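As a toy illustration of principles 1, 3, and 6 above – emphatically not Infinidat's product, with every class and method name invented for the sketch – an event-triggered immutable snapshot store might look like this in Python:

```python
import time
from dataclasses import dataclass


@dataclass(frozen=True)      # frozen: a snapshot cannot be altered after creation
class Snapshot:
    taken_at: float
    data: bytes
    verified_clean: bool     # outcome of a threat scan (principle 6)


class ResilientStore:
    """Toy model of immutable snapshots, incident-triggered capture,
    and scan-then-recover. Real systems do this inside the storage array."""

    def __init__(self):
        self._snapshots = []
        self._live = b""

    def write(self, data: bytes):
        self._live = data

    def snapshot(self, verified_clean: bool = True):
        # Principle 1: append-only, immutable copies of the live data
        self._snapshots.append(Snapshot(time.time(), self._live, verified_clean))

    def on_security_event(self):
        # Principle 3: a SIEM/SOAR alert triggers an immediate snapshot,
        # held as unverified until forensic scanning clears it
        self.snapshot(verified_clean=False)

    def recover(self) -> bytes:
        # Principle 6: recover from the newest snapshot known to be clean
        for snap in reversed(self._snapshots):
            if snap.verified_clean:
                return snap.data
        raise RuntimeError("no clean snapshot available")
```

Writing data and snapshotting it before an incident, then calling `on_security_event()` when an alert fires, leaves `recover()` returning the newest pre-incident copy rather than the encrypted live state.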

Cadence adds NVIDIA DGX SuperPOD to digital twin platform
Cadence, a developer of electronic design automation software, has expanded its Reality Digital Twin Platform library with a digital model of NVIDIA’s DGX SuperPOD with DGX GB200 systems. The addition is aimed at supporting data centre designers and operators in planning and managing facilities for large-scale AI workloads.

The Reality Digital Twin Platform enables users to create detailed digital replicas of data centres, simulating power, cooling, space, and performance requirements before physical deployment. By adding the NVIDIA DGX SuperPOD, Cadence says engineers can model AI factory environments with greater accuracy, supporting faster deployment and improved operational efficiency.

Digital twins for AI data centres

Michael Jackson, Senior Vice President of System Design and Analysis at Cadence, says, “Rapidly scaling AI requires confidence that you can meet your design requirements with the target equipment and utilities.

"With the addition of a digital model of NVIDIA’s DGX SuperPOD with DGX GB200 systems to our Cadence Reality Digital Twin Platform library, designers can model behaviourally accurate simulations of some of the most powerful accelerated systems in the world, reducing design time and improving decision-making accuracy for mission-critical projects.”

Tim Costa, General Manager of Industrial and Computational Engineering at NVIDIA, adds, “Creating the digital twin of our DGX SuperPOD with DGX GB200 systems is an important step in enabling the ecosystem to accelerate AI factory buildouts.

"This step in our ongoing collaboration with Cadence fills a crucial need as the pace of innovation increases and time-to-service shrinks.”

The Cadence Reality Digital Twin Platform allows engineers to drag and drop vendor-provided models into simulations to design and test data centres. It can also be used to evaluate upgrade paths, failure scenarios, and long-term performance. The library currently contains more than 14,000 items from over 750 vendors.

Industry engagement

The addition of the NVIDIA model is part of Cadence’s ongoing collaboration with NVIDIA, following earlier support for the NVIDIA Omniverse blueprint for AI factory design. Cadence will highlight the expanded platform at the AI Infra Summit in Santa Clara from 9–11 September, where company experts will take part in keynotes, panels, and talks on chip efficiency and simulation-driven data centre operations.

For more from Cadence, click here.
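At its simplest, one question such a digital twin answers before deployment – does a planned rack layout fit the hall's power and cooling envelopes? – reduces to a budget check. The sketch below is a deliberately tiny toy model, not Cadence's platform; all names and figures are invented:

```python
from dataclasses import dataclass


@dataclass
class Rack:
    name: str
    power_kw: float   # electrical draw at design load
    heat_kw: float    # heat rejected to the cooling loop


def check_hall(racks, power_budget_kw, cooling_budget_kw):
    """Flag whether a planned layout fits the hall's power and cooling envelopes."""
    total_power = sum(r.power_kw for r in racks)
    total_heat = sum(r.heat_kw for r in racks)
    return {
        "power_ok": total_power <= power_budget_kw,
        "cooling_ok": total_heat <= cooling_budget_kw,
        "power_headroom_kw": power_budget_kw - total_power,
        "cooling_headroom_kw": cooling_budget_kw - total_heat,
    }
```

A real digital twin goes far beyond static budgets – CFD airflow, transient failure scenarios, behaviourally accurate models of each system – but the pass/fail envelope check is where capacity planning starts.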

NorthC to build new data centre at uptownBasel campus
NorthC, a Dutch provider of sustainable data centre and colocation services, has signed an agreement to develop a regional data centre at the uptownBasel Innovation Campus in Arlesheim, Switzerland.

The project expands NorthC’s existing collaboration with uptownBasel and will deliver a facility designed to support advanced technologies such as artificial intelligence, quantum computing, diagnostics, and personalised medicine. Construction is scheduled to begin around 18 months after planning approval, with operations expected to start by mid-2027.

Sustainable design and regional focus

In its first phase, the facility will cover 2,500 m² and provide 6 MVA of power capacity. It will be powered entirely by renewable energy, with backup systems running on green diesel, part of NorthC’s goal to achieve climate neutrality by 2030. The centre will also incorporate waste heat reuse for residential heating.

Hans-Jörg Fankhauser, founder of uptownBasel, says, “With NorthC, we have a partner that shares our vision, one that understands the potential of operating a data centre on a globally recognised innovation campus.”

Fankhauser adds that uptownBasel aims to be “a platform for leading networks in medical technology, quantum computing, artificial intelligence, and the future of work. We aim to attract startups and talent to Arlesheim early on in their journey.”

NorthC already operates two facilities in nearby Münchenstein and says the new site will strengthen its role as an infrastructure provider for MedTech, Industry 4.0, and AI in the Basel region. With this development, the company will operate five data centres in Switzerland.

Alexandra Schless, CEO of NorthC Group, comments, “We are very pleased to deepen our partnership with uptownBasel. The construction of this data centre underscores our continued commitment to Switzerland as a strategic location and reflects our belief in innovation and the future.”

For more from NorthC, click here.

Fluke launches DC kits to reduce fibre failures
Fluke Networks, a manufacturer of network certification and troubleshooting tools, has introduced a set of Versiv Data Center Kits designed to help technicians and engineers prevent copper and fibre connectivity issues, as well as troubleshoot them more efficiently.

The launch comes as global demand for data centre capacity continues to rise, driven by artificial intelligence (AI), cloud computing, and hyperscale facilities. With increasing density in fibre connections, contamination and testing challenges are becoming more significant risks to uptime.

Kits for fibre inspection and troubleshooting

The new kits include:

• Fibre and Copper Commissioning and Troubleshooting Kit – for verifying and optimising networks throughout a data centre’s lifecycle, from commissioning to upgrades and troubleshooting
• Fibre Inspection Kit – aimed at reducing failures by addressing end-face contamination, a leading cause of fibre performance issues
• MPO Maintenance and Troubleshooting Kit – designed to speed up multi-fibre trunk testing by up to 80% with single-button operation

Alongside the kits, Fluke is also releasing accessories that support Very Small Form Factor (VSFF) connectors, which enable higher connection density. These accessories allow users of the CertiFiber Pro Optical Loss Test Set to apply the recommended single-jumper reference method for testing MDC connections, as well as inspect and clean MMC, MDC, and SN connectors.

Nigel Hedges, Application and Technical Specialist at Fluke Networks, says, “With over 9,000 data centres worldwide, and AI, cloud, and hyperscale technologies driving explosive growth, infrastructure teams are under unprecedented pressure.
“The new Versiv Data Center Kits are designed to help technicians and engineers meet that challenge head-on, equipping them with tools to prevent failures, speed up troubleshooting, and ensure high-density fibre connections are clean, tested, and reliable.”

Fluke Networks says the kits are intended to support teams working in hyperscale and enterprise environments, where the margin for error is minimal and preventative maintenance is essential to maintaining resilience.
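The certification testing these kits perform is, at its core, a comparison of measured insertion loss against a link's loss budget. The sketch below shows that budget arithmetic using commonly cited worst-case allowances (in the spirit of TIA-568 maxima); the values and the `link_loss_db` helper are illustrative assumptions, not Fluke figures or APIs.

```python
# Generic fibre link loss-budget arithmetic, of the sort a certification
# tester checks measured results against. Loss allowances below are
# commonly cited worst-case figures, assumed here for illustration.

def link_loss_db(length_km: float, atten_db_per_km: float,
                 connectors: int, connector_loss_db: float = 0.75,
                 splices: int = 0, splice_loss_db: float = 0.3) -> float:
    """Worst-case insertion loss budget for a fibre link."""
    return (length_km * atten_db_per_km
            + connectors * connector_loss_db
            + splices * splice_loss_db)

# Example: a 100 m multimode run at 850 nm (~3.0 dB/km assumed) with two
# connector pairs: 0.1 * 3.0 + 2 * 0.75 = 1.8 dB budget.
budget = link_loss_db(0.1, 3.0, connectors=2)
print(round(budget, 2))
```

Contaminated end faces, the failure mode the Fibre Inspection Kit targets, show up in practice as measured loss exceeding a budget like this one, which is why inspection before testing matters in high-density links.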


