Latest News


'AI growth doesn’t have to break the grid'
A UK high-performance computing (HPC) data centre has reportedly cut its carbon emissions by three quarters while easing pressure on the electricity system, offering a blueprint for how the fast-growing AI sector can expand without overwhelming the grid.

Stellium Datacenters, which operates one of the UK's largest purpose-built data centre campuses near Newcastle, has switched to a new way of sourcing electricity, matching its power use with renewable generation hour by hour rather than relying on annual averages.

The move comes as data centres face mounting scrutiny over their energy use, with concerns growing that AI and cloud computing could strain local grids and push up energy costs. That scrutiny has intensified in recent months, with MPs launching an inquiry through the Environmental Audit Committee into the environmental impact of data centres, including their growing electricity and water use and the pressure they place on local grids.

Working with renewable energy supplier Good Energy, Stellium now runs its site on a 100% renewable, hourly-matched electricity supply, linking consumption directly to power generated by more than 3,300 independent UK renewable generators. This approach allows the company to show exactly when its electricity demand is met by renewable sources, achieving an hourly matching score of 95.4%, more than double the current market average of around 43%. Planned additions, including large-scale battery storage, are expected to lift this to 97–98% while showing exactly which UK renewable assets powered the data centre and when.

'Hourly matching' as an improved metric

Traditionally, many data centres rely on renewable certificates that show clean electricity was generated somewhere on the grid over a year, even if fossil fuels were used at the time power was actually consumed. Some "100% renewable" tariffs relying on this system mask continued reliance on fossil-fuelled power at precisely the moments when the grid is most constrained. By contrast, hourly matching provides a much clearer picture of real-world impact, demonstrating which users are sourcing clean, homegrown power versus relying on fossil-fuelled generation at peak times.

Stellium says the change has transformed conversations with customers, regulators, and auditors, particularly global AI and technology firms with strict net zero and reporting requirements. The company says it can now demonstrate, in detail, which renewable assets powered its operations, when they did so, and where they are located.

Paul Mellon, Operations Director at Stellium, notes, "Data centres often get bad press for their high, inflexible energy use. But this shows that AI and high-performance computing don't have to come at the expense of the grid or the climate.

"By switching to hourly-matched renewable power, we've been able to cut emissions dramatically while giving customers the transparency they increasingly demand."

Nigel Pocklington, CEO of Good Energy, adds, "By matching electricity use with renewable generation hour by hour, Stellium can show when clean power is actually being used.

"That kind of transparency cuts carbon emissions, reduces reliance on fossil fuels at peak times, and proves that digital growth and a resilient energy system can go hand in hand."

Explosive data centre growth in the UK

The case comes as the UK prepares for a major expansion in data centre capacity to support AI, cloud computing, and data-driven industries. As planners, communities, and policymakers look more closely at how new developments will affect local infrastructure, Stellium's experience suggests that data centres can respond by sourcing and reporting their energy responsibly, rather than relying on offsetting or misleading annualised accounting.

With pressure growing on the sector to prove its environmental credentials, the model demonstrates that practical solutions may already exist, and that AI-driven growth can be aligned with a cleaner, more resilient electricity system.

For more from Stellium Datacenters, click here.
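The hourly matching metric is straightforward to compute. The sketch below is illustrative only, not Stellium's or Good Energy's actual methodology: it scores the share of each hour's demand met by renewable generation in that same hour, which is what distinguishes hourly matching from annual certificate accounting.

```python
def hourly_matching_score(demand_mwh, renewable_mwh):
    """Share of total demand met by renewables in the same hour.

    demand_mwh, renewable_mwh: equal-length sequences of hourly values (MWh).
    """
    if len(demand_mwh) != len(renewable_mwh):
        raise ValueError("series must cover the same hours")
    # Only generation coincident with demand counts as matched
    matched = sum(min(d, r) for d, r in zip(demand_mwh, renewable_mwh))
    total = sum(demand_mwh)
    return matched / total if total else 0.0

# Flat demand, intermittent renewable generation over four hours:
demand = [10, 10, 10, 10]
renewable = [20, 0, 20, 0]
print(hourly_matching_score(demand, renewable))  # 0.5
```

In this toy example, annual generation (40 MWh) equals annual consumption, so annualised accounting would report "100% renewable", while the hourly score of 0.5 reveals that half the demand was actually met by other generation.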

How to define the right sovereign cloud strategy
In this exclusive article for DCNN, Joe Baguley, CTO EMEA at Broadcom, gives his insight into how a workload-first approach to sovereign cloud, underpinned by data classification, flexible architecture, and strong partnerships, is reshaping European digital competitiveness:

Reclaiming control and competitiveness

Across Europe, governments and enterprises alike are increasingly recognising that data control holds the keys to innovation. This means a change in attitudes towards cloud sovereignty; it's no longer seen as a simple compliance factor, but as a top priority for competitiveness and trust. The European Union is taking steps to support this shift, placing greater emphasis on sovereign infrastructure as part of its broader digital strategy. A clear example is the €180 million (£156 million) tender launched by the European Commission through its Cloud III Dynamic Purchasing System, aimed at procuring sovereign cloud services for EU institutions.

To ensure cloud sovereignty, the first step is preparation: organisations need a clear understanding of where their data resides, how it moves, and who controls it. Answering these questions requires a clearly defined strategy, one that aligns workloads with the most appropriate cloud environments and establishes effective data governance. Importantly, it has to support the development of flexible cloud architectures capable of meeting regulatory demands while still enabling innovation.

Designing cloud strategies around workload needs

At the heart of a successful sovereign cloud strategy lies a simple principle: placing the right workload in the right environment. There is no single solution that fits all applications. Enterprises must align each workload with the cloud environment that best meets its compliance, operational, and performance requirements to determine whether it belongs in a public, private, or sovereign cloud. Some applications may thrive in a hyperscaler environment, while others require the control and security of a sovereign setup.

This reality has made hybrid cloud strategies the norm. Over the past decade, many organisations initially committed to a single hyperscaler for all workloads, only to realise that different applications have different requirements. Today, IT leaders increasingly need to adopt a 'right workload, right place' mindset, recognising that some applications may remain on premises, others run optimally in public clouds, and some require sovereign environments for regulatory or operational reasons. This hybrid approach enables organisations to balance innovation with control while avoiding vendor lock-in and making more effective use of the strengths of different cloud ecosystems.

Data classification comes first

Of course, organisations cannot secure or govern what they do not fully understand. Comprehensive data classification is a critical first step. Misclassified data is a frequent source of compliance risk, while over-classification, often a product of risk aversion, can create extra operational complexity and cost. Many organisations treat all data as highly classified simply to be safe, but this can lead to over-investment in secure infrastructure where it is not needed.

Mapping data flows across borders and providers is equally important. Compliance blind spots often appear when data is inadvertently stored or processed in jurisdictions with restrictive data laws. Understanding where sensitive data resides, how it moves, and which regulations apply is essential to reducing risk, demonstrating accountability, and maintaining trust with partners and customers. Retrofitting compliance into existing infrastructure is costly and complex; embedding that understanding into cloud architecture from the outset is far more efficient.

Building flexibility into architecture

Flexibility is the cornerstone of effective sovereign cloud implementations. Architectures built for interoperability and portability allow workloads to move seamlessly across private, public, and sovereign clouds. This adaptability is vital for managing the risks posed by geopolitical or regulatory change. Hyperscalers cannot always guarantee sovereignty due to extraterritorial legislation such as the US CLOUD Act, which permits government access to data held by American companies abroad. By contrast, working with local cloud operators enables enterprises to maintain jurisdictional control over their data while still leveraging the latest technology.

Moreover, working with local cloud operators can provide additional technological sovereignty benefits, ranging from investment in the local ecosystem and industrial base to addressing supply chain concerns, promoting interoperability, avoiding vendor lock-in, strengthening operational control, and managing dependency concerns. Sovereignty should be viewed not as a constraint, but as a design principle guiding infrastructure, data placement, and application deployment. Organisations that prioritise adaptability can balance regulatory compliance with innovation and long-term strategic growth.

Partnerships powering sovereign cloud

Partnerships also play a pivotal role. No single vendor or platform can solve sovereignty challenges by itself and, in today's interconnected supply chain, no perfect vertical integration of suppliers exists within one region. Open source is often presented as a route to greater autonomy; in reality, however, open source solutions raise questions about code provenance, reliability when deployed at scale, and dependencies on support. The most successful sovereign cloud environments combine global technology providers, local operators, and trusted EMEA partners (such as evoila and Arvato). This collaborative approach not only strengthens compliance and transparency, but also accelerates innovation by ensuring that governance does not become a barrier to progress. Meanwhile, the presence of a local ecosystem guarantees the ability to operate and support solutions with a high degree of autonomy.

As regulatory and geopolitical landscapes evolve, organisations that foster open dialogue across their supply chain and internal teams will be best placed to adapt. Sovereignty is as much about alignment, strategic choices, and accountability as it is about infrastructure.

From compliance requirement to strategic asset

Sovereign cloud has moved beyond a purely compliance-driven requirement and is increasingly becoming a source of strategic advantage. Organisations that commit to the 'right workload, right place' mindset, maintain clear data classification and flexible architecture, and prioritise interoperability are the ones that will have a competitive advantage. This approach allows organisations to scale globally whilst remaining aligned to regulatory and geopolitical shifts. Sovereignty is an enabler of AI and should be treated as such.
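The 'right workload, right place' triage described in the article can be pictured as a simple decision rule. The sketch below is hypothetical, not Broadcom's methodology; the classification tiers and inputs are illustrative assumptions, and real placement decisions weigh many more factors (latency, cost, data residency law, existing contracts):

```python
def place_workload(classification, needs_jurisdictional_control, latency_sensitive):
    """Illustrative triage: map a workload to public, private, or sovereign cloud.

    classification: assumed three-tier scheme: 'public', 'confidential', 'restricted'.
    """
    # Restricted data, or any legal requirement for jurisdictional control,
    # rules out environments subject to extraterritorial legislation.
    if classification == "restricted" or needs_jurisdictional_control:
        return "sovereign"
    # Confidential or latency-critical workloads stay on controlled infrastructure.
    if classification == "confidential" or latency_sensitive:
        return "private"
    # Everything else can take advantage of hyperscaler scale and services.
    return "public"

print(place_workload("public", False, False))      # public
print(place_workload("restricted", False, False))  # sovereign
```

The point of the sketch is that placement follows from classifying the workload first, which is why the article insists data classification must precede architecture decisions.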

STL launches Neuralis US data centre platform
STL, an optical and digital systems company, has launched its Neuralis data centre connectivity portfolio in the United States, targeting infrastructure designed for artificial intelligence and high-density computing environments. The announcement was made by STL Optical Connectivity NA, the company's US subsidiary, at Data Center World 2026 in Washington, D.C.

Neuralis is designed to support evolving data centre requirements, particularly the shift towards AI workloads, hyperscale computing, and edge deployments. These trends are increasing demand for high-speed, high-density connectivity within and between facilities. The portfolio focuses on managing the transition from traditional north–south traffic flows to more intensive east–west traffic, driven by GPU-based architectures and AI training processes.

Designed for high-density AI infrastructure

The Neuralis portfolio is structured around two main areas. The first focuses on maximising data centre space through the use of high-density, pre-terminated fibre cabling. This approach moves connection work into manufacturing environments, reducing on-site installation time and complexity. The second addresses data centre interconnect (DCI), supporting large-scale data transfer between sites. This includes fibre infrastructure designed for high-capacity environments, with cables capable of supporting large fibre counts for AI deployments.

STL has developed the portfolio through collaboration with customers, with a focus on addressing space, density, and deployment challenges in modern data centres. The company's manufacturing process covers the full fibre lifecycle, including preform production, fibre drawing, cabling, and connector integration. Production for the US market is supported by STL's facility in Lugoff, South Carolina.

Ankit Agarwal, Managing Director of STL, notes, "AI demands a level of precision and density that traditional cabling simply cannot meet.

"With STL Neuralis, we are providing the high-speed, low-latency foundation that allows GPU clusters to perform at their peak, moving complexity out of the field and into a controlled, high-precision factory environment."

The launch reflects increasing demand for infrastructure capable of supporting AI-driven workloads, as operators continue to scale data centre capacity across North America.

For more from STL, click here.

Scaleway selected for EU sovereign cloud framework
French cloud computing provider Scaleway has been selected by the European Commission as one of four cloud providers under the Cloud III Dynamic Purchasing System, a €180 million (£156 million) programme supporting access to sovereign cloud services for EU institutions.

The framework, which runs for up to six years, enables EU bodies and agencies to procure cloud services through a pre-approved group of providers. Selection follows an evaluation process based on the European Commission's Cloud Sovereignty Framework, which assesses legal, operational, and technical criteria. As part of the programme, Scaleway will be eligible to participate in project-specific competitions to deliver cloud services, including for sensitive and critical workloads.

Cloud III is managed by the Directorate-General for Digital Services and was introduced in 2025 as the European Commission's primary framework for cloud procurement. The initiative promotes a multi-cloud model, allowing institutions to select from a limited group of approved providers rather than relying on a single vendor. It is designed to support resilience, continuity, and flexibility across public sector digital infrastructure. The framework also supports deployment of cloud environments for critical systems, alongside fallback capabilities for existing cloud or on-premises infrastructure in the event of disruption.

A framework supporting a sovereign and multi-cloud approach

A key element is the Cloud Sovereignty Framework, which establishes a consistent set of criteria for assessing cloud providers. This is intended to improve transparency and standardisation in how sovereignty is defined and applied across the European cloud sector.

Scaleway operates as a European-owned provider, with infrastructure and operations based within Europe. Its platform is designed to support data localisation and compliance with European regulatory requirements.

Damien Lucas, CEO of Scaleway, comments, "At Scaleway, we are committed to contributing to Europe's digital autonomy, not only through our technology and our alignment with European regulatory frameworks, but also through how we build and invest in our ecosystem.

"Today, for every euro spent with Scaleway, around 68 cents are reinvested in the European economy, compared to around 20 cents when relying on international hyperscalers.

"Directing investment towards truly European cloud providers helps strengthen local capabilities and ensures that value, expertise, and innovation remain anchored in Europe."

The company notes that the selection reflects an increasing focus across Europe on sovereign cloud infrastructure, as demand grows for secure, compliant platforms to support data and artificial intelligence workloads.

DE-CIX, Ooredoo link Doha IX to Marseille
Internet exchange (IX) operator DE-CIX and Qatari telecommunications company Ooredoo have connected Doha IX to DE-CIX Marseille, expanding international interconnection for networks in Qatar.

The link connects Qatar's first commercial internet exchange with a wider European ecosystem, enabling direct access to networks in Marseille and remote connectivity to those linked via DE-CIX Frankfurt. Doha IX is operated by Ooredoo under the DE-CIX-as-a-Service model and is hosted in one of the company's data centres. The interconnection is intended to improve access to cloud platforms and digital services not currently available locally.

The connection allows networks in Qatar to exchange data directly with almost 120 networks in Marseille, as well as access a broader pool of networks connected through Frankfurt, one of Europe's largest internet exchanges. This supports lower-latency connectivity and provides additional resilience for cloud and content delivery. It also enables access to major cloud providers through dedicated and private connections, alongside tools designed to support hybrid and multi-cloud environments.

Expanding low-latency access to global networks

Since its launch in October, Doha IX has developed as a carrier-neutral interconnection hub, supporting local and international data exchange. The platform also offers services including cloud connectivity, IP transit, hosting, and colocation.

Ivo Ivanov, CEO of DE-CIX, says, "The direct interconnection between the IXs in Doha and Marseille brings the world closer together.

"By providing even better performance and user experience for internet-based content and applications, our collaboration with Ooredoo opens up new opportunities for Qatar's digital economy.

"Enhanced connectivity will further strengthen the digital ecosystem in the GCC, supporting economic growth and innovation while paving the way for the amazing digital decades ahead of us."

Hassan Ismail Al Emadi, Chief Business Officer at Ooredoo Qatar, adds, "The direct interconnection between Doha IX and DE-CIX Marseille represents a strategic expansion of Qatar's global digital reach.

"By linking our national interconnection platform with one of Europe's leading internet exchange ecosystems, we are enabling differentiated digital performance through lower latency, enhanced resilience, and secure, seamless access to global cloud and content networks.

"This collaboration reinforces Qatar's position as a regional digital gateway and enables enterprises to operate with greater performance, reach, and competitiveness, accelerating digital transformation across Qatar and the wider GCC."

The companies say the development reflects continued investment in interconnection infrastructure to support growing demand for cloud services and international data exchange.

For more from DE-CIX, click here.

Carrier opens €12m Montluel HVAC testing facility
Carrier, a manufacturer of HVAC, refrigeration, and fire and security equipment, has opened a new testing facility at its European Centre of Excellence in Montluel, France, to support the development of cooling and heating technologies for data centres, industry, and large commercial buildings.

The €12 million (£10.4 million) investment expands the company's research and development capacity, with a focus on high-performance systems aligned with electrification trends and the use of lower-impact refrigerants. Testing at the site follows Eurovent-certified performance methodologies.

The expansion comes as demand for data centre infrastructure continues to grow across Europe. According to JLL's 2026 Global Data Center Outlook, the EMEA region is expected to add 13GW of new capacity by 2030, driven by hyperscale deployments and artificial intelligence workloads, particularly in markets such as London, Frankfurt, and Paris.

Increased capacity for HVAC system testing

The new laboratory is designed to support testing across a wide range of operating conditions. It enables evaluation of air-cooled chillers up to 3,200kW, air-source heat pumps up to 1,500kW, and water-source systems up to 6,000kW. The facility can simulate temperatures ranging from −20°C to +60°C, with humidity control, and supports water flow rates of up to 1,600m³/h. This allows for testing under varied and demanding conditions relevant to real-world applications.

Bertrand Rotagnon, Executive Director, Commercial Business Line and Data Centres Europe at Carrier, comments, "With these new test laboratory facilities, we're raising the bar on how we support customers and partners in Europe.

"The combination of higher test capacity and advanced environmental control lets us validate performance with zero tolerance, earlier, and bring solutions to market faster, giving customers the confidence to move ahead on high-efficiency cooling and heating for data centres, industry, and district heating."

Nicolas Fonte, Director, Systems Engineering at Carrier Climate Solutions Europe, adds, "The new testing facility expands our engineering team's ability to test and validate chillers and heat pumps for very wide and [the] most critical operating conditions.

"This new equipment enables us to validate performance, with high precision, of next-generation chillers and large heat pump platforms supporting [increasing] customers' requests for future infrastructures."

The development forms part of the company's stated ongoing investment in HVAC technologies to meet increasing performance, efficiency, and regulatory requirements across European markets.

For more from Carrier, click here.

OVHcloud expands quantum cloud platform with Quandela
OVHcloud, a French cloud computing provider, has made photonic quantum computing company Quandela's Belenos quantum computer available through its Quantum platform, expanding access to quantum computing across Europe.

The announcement was made at the Quantum Defence Summit, with the addition of Belenos marking a further development of OVHcloud's cloud-based quantum offering. The OVHcloud Quantum platform provides access to quantum systems through a Quantum-as-a-Service model, allowing organisations to use quantum computing resources without requiring dedicated hardware.

Belenos is based on photonic quantum technology and offers a capacity of 12 qubits. It is intended to support experimentation with algorithms across a range of areas, including image processing, artificial intelligence, and quantum machine learning. Potential applications also extend to fields such as simulation, engineering, and environmental modelling.

Expanding access to quantum computing in Europe

OVHcloud says it has been supporting the European quantum ecosystem since 2022, providing access to quantum emulators through its infrastructure. The platform currently includes multiple emulators, enabling users to test and develop applications across different quantum computing approaches. The addition of Belenos introduces a physical quantum processing unit to the platform, complementing existing emulator-based access.

Miroslaw Klaba, R&D Director at OVHcloud, comments, "We are delighted to deliver on the promise of the Quantum platform by adding a second reference quantum computer, Belenos, from the French company Quandela.

"The quantum revolution is accelerating, and OVHcloud is playing its part as the European cloud leader within the ecosystem."

The system is available through a usage-based pricing model, with billing calculated per second and no long-term commitment required.

Niccolò Somaschi, CEO and co-founder of Quandela, notes, "The integration of the 12-qubit Belenos into the OVHcloud portfolio marks a decisive step for quantum in Europe. Accessible through the cloud, this photonic computer becomes a concrete tool for businesses.

"With OVHcloud, we are offering data scientists and innovators alike the means to develop their algorithms on a flexible and sovereign infrastructure."

The expansion reflects ongoing efforts to increase accessibility to quantum computing, supporting research and development across industry and academia.

For more from OVHcloud, click here.

EPRI, OCP aim to advance DCs as flexible grid resources
EPRI (the Electric Power Research Institute), an independent, non-profit energy research and development organisation, and the Open Compute Project (OCP), a non-profit organisation that develops and shares open hardware standards and designs for data centre infrastructure, have announced a collaboration focused on developing data centres as flexible resources for power systems.

The initiative aims to support digital infrastructure growth while improving how data centres interact with electricity networks, particularly as demand increases from artificial intelligence and other compute-intensive workloads. By working together, the organisations intend to support improved integration between data centres and power systems while developing technical frameworks to enable more flexible operation.

Arshad Mansoor, President and CEO of EPRI, comments, "We're in the midst of an energy revolution, and it must be smart, flexible, and innovative to keep rates affordable for customers across the globe.

"Through this collaboration with OCP, EPRI is combining rigorous power system science with open, scalable data centre innovation to advance practical solutions that enable data centres to operate as flexible, grid-supporting resources - strengthening reliability and affordability for all."

Developing flexible data centre energy models

The collaboration brings together stakeholders across the energy and data centre sectors, including a European group involving DCFlex, National Grid, NESO, PPC, RTE, and RWE. This group is working to develop frameworks that reflect operational requirements, with a focus on improving resilience and scalability as data centre capacity expands. Activities include work on shared standards, testing environments, and implementation guidance for flexible data centre operations.

Zane Ball, Chief Technology Officer at OCP, notes, "With a growing member base and top-tier data centre expertise coming together with a single vision, our collaboration creates opportunities for harmonised standards, shared testing environments, and coordinated guidance for implementing flexible, resilient, and affordable data centre solutions."

EPRI says it is also supporting the work through field demonstrations at data centres in Europe and the United States, exploring flexible load approaches that could support grid stability and reduce barriers to connection.

LS Electric wins $115m data centre contract
LS Electric, a South Korean manufacturer of electrical equipment and automation systems, has secured a $115 million (£84.9 million) contract to supply power infrastructure for a series of data centre developments across North America.

The projects will support major technology companies expanding capacity for artificial intelligence and other compute-intensive applications, where consistent and high-quality power is required. Under the agreement, LS Electric will deliver switchgear and distribution transformers designed for continuous operation in high-demand environments.

Expanding North American manufacturing footprint

The deal comes as data centre operators increase their focus on power systems that offer reliability, adaptability, and long-term support as facilities scale to meet rising workloads. Large-scale developments of this kind also require suppliers able to meet strict technical standards while maintaining consistent delivery across manufacturing, logistics, and on-site coordination. LS Electric says it will support the projects from design through to commissioning.

To fulfil the contract, LS Electric will utilise its growing industrial presence in North America, including operations in Utah and Texas, such as MCM Engineering II and its Bastrop campus. These facilities will support production and system integration, as well as ongoing regional expansion in engineered power infrastructure.

LS Electric states it will continue to expand its offering for the sector, focusing on technologies that support reliable and energy-efficient data centre performance.

For more from LS Electric, click here.

Lonestar unveils space-based data storage service, StarVault
Lonestar, a space-based data storage company building orbital and lunar data centres, has announced the launch of StarVault, the "world's first" commercial space-based data storage service, alongside plans to expand its orbital infrastructure through a new agreement with Sidus Space, a US space and defence technology company.

The platform is designed to store data off-planet, combining space-based infrastructure with cryptographic key management. It is intended for use by organisations seeking additional resilience for critical data. Lonestar has also ordered a second orbital payload from Sidus Space to increase storage capacity and redundancy. The first payload is currently in development and is scheduled to launch in October aboard the LizzieSat-4 satellite, with a second launch planned for 2027.

Expansion of orbital data infrastructure

The expansion follows earlier test missions and increasing interest from sectors including government, finance, and critical infrastructure. The StarVault platform is designed to provide an additional layer of data protection, supporting resilience against risks such as cyber incidents, environmental disruption, and geopolitical instability.

Steve Eisele, CEO of Lonestar, says, "Demand for off-planet data security has exceeded expectations. With StarVault, we are not just launching a new category; we are scaling it."

Sidus Space is building the initial payload, with further deployments expected as Lonestar develops its orbital data storage network. The companies state that the initiative represents an early step in the development of space-based data infrastructure, with a focus on secure storage beyond traditional terrestrial data centres.
