Data Centre Build News & Insights


Rethinking fuel control
In this exclusive article for DCNN, Jeff Hamilton, Fuel Oil Team Manager at Preferred Utilities Manufacturing Corporation, explores how distributed control systems can enhance reliability, security, and scalability in critical backup fuel infrastructure.

Distributed architecture for resilient infrastructure

Uninterrupted power is non-negotiable for data centres, which must provide continuity through every possible scenario, from extreme weather events to grid instability in an ageing infrastructure. Generators, of course, are central to this resilience, but we must also consider the fuel storage infrastructure that powers them. The way fuel is monitored, delivered, and secured by a control system ultimately determines whether a backup system succeeds or fails when it is needed most.

The risks of centralised control

A traditional fuel control system typically uses a centralised controller, such as a programmable logic controller (PLC), to manage all components. The PLC coordinates data from sensors, controls pumps, logs events, and communicates with building automation systems. Often, this controller connects through hardwired, point-to-point circuits that span large distances throughout the facility. This setup creates two potential vulnerabilities:

1. If the central controller fails, the entire fuel system can be compromised. A wiring fault or software error may take down the full network of equipment it supports.

2. Cybersecurity is also a concern, especially if the controller is connected to broader network infrastructure. A single breach can expose the entire system.

Whilst these vulnerabilities may be acceptable in some industrial settings, modern data centres demand more robust and secure solutions. Decentralisation in control architecture addresses these concerns.
Distributed logic and redundant communications

Next-generation fuel control systems are adopting architectures with distributed logic, meaning control is no longer centralised in one location. Instead, each field controller, or “node”, has its own processor and local interface. These nodes operate autonomously, running dedicated programs for their assigned devices (such as tank level sensors or transfer pumps), and communicate with one another over redundant networks.

This peer-to-peer model eliminates the need for a master controller. If one node fails, or if communication is interrupted, the others continue operating without disruption. Pump operations, alarms, and safety protocols all remain active because each node has its own logic and control.

This model increases both uptime and safety; it also simplifies installation. Since each node handles its own logic and display, it needs far less wiring than a centralised system. Adding new equipment involves simply installing a new node and connecting it to the network, rather than overhauling the entire system.

Built-in cybersecurity through architecture

A system’s underlying architecture plays a key role in determining its vulnerability to cyber attack. Centralised systems can provide a single entry point to the entire installation; distributed control architectures offer a fundamentally different security profile. Without a single controller, there is no single target. Each node operates independently, and the communication network does not require internet-facing protocols. In some applications, distributed systems have even been configured to work in physical isolation, particularly where EMP protection is required. Attackers seeking to disrupt operations would need to compromise multiple nodes simultaneously, a task substantially more difficult than targeting a central controller.
Even if one segment is compromised or disabled, the rest of the system continues to function as designed. This creates a hardened, resilient infrastructure that aligns with zero-trust security principles.

Safety and redundancy by default

Of course, any fuel control system must not just be secure; it must also be safe. Distributed systems offer advantages here as well. Each node can be programmed with local safety interlocks. For example, if a tank level sensor detects overfill, the node managing that tank can shut off the pump without needing permission from a central controller. Other safety features often include dual-pump rotation to prevent uneven wear, leak detection, and temperature or pressure monitoring with response actions. These processes run locally and independently; even if communication between nodes is lost, the safety routines continue.

Additionally, touchscreens or displays on individual nodes allow on-site personnel to access diagnostics and system data from any node on the network. This visibility simplifies troubleshooting and provides more oversight of real-time conditions.

Scaling with confidence

Data centres require flexibility to grow and adapt. However, traditional control systems make changes such as upgrading infrastructure, increasing power, or installing additional backup systems costly and complex, often requiring complete rewiring or reprogramming. Distributed control systems make scaling more manageable. Adding a new generator or day tank, for example, involves connecting a new controller node and loading its program. Since each node contains its own logic and communicates over a shared network, the rest of the system continues operating during the upgrade. This minimises downtime and reduces installation costs. Some systems even allow live diagnostics during commissioning, which can be particularly valuable when downtime is not an option.
A better approach for critical infrastructure

Data centres face incredible pressure to deliver continuous performance, efficiency, and resilience. Backup fuel systems are a vital part of this reliability strategy, but the way these systems are controlled and monitored is changing. Distributed control architectures offer a smarter, safer path forwards.

Preferred Utilities Manufacturing Corporation is committed to helping data centres better manage their critical operations. This commitment is reflected in products and solutions such as its Preferred Fuel System Controller (FSC), a distributed control architecture offering all the features described in this article, including redundant, masterless, node-based communication for secure, safe, and flexible fuel system control. With Preferred’s expertise, a distributed control architecture can be applied to system sizes ranging from 60 to 120 day tanks.
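The node-local interlock behaviour described in the article, where a tank node shuts off its own pump on overfill with no central controller in the loop, can be sketched in a few lines of Python. The class, threshold, and method names below are illustrative assumptions for the pattern, not Preferred’s actual FSC logic.

```python
# Hypothetical sketch of a node-local overfill interlock: each node evaluates
# its own sensors and acts on its own pump, with no central controller
# involved. Names and thresholds are illustrative, not vendor logic.

class TankNode:
    def __init__(self, tank_id, overfill_level=0.95):
        self.tank_id = tank_id
        self.overfill_level = overfill_level  # fraction of tank capacity
        self.pump_running = True

    def on_level_reading(self, level):
        """Handle one local sensor reading; no network round-trip needed."""
        if level >= self.overfill_level and self.pump_running:
            self.stop_pump()  # local safety interlock fires immediately
        return self.pump_running

    def stop_pump(self):
        self.pump_running = False  # a real node would drive an output relay


node = TankNode("day-tank-1")
node.on_level_reading(0.80)  # normal level: pump keeps running
node.on_level_reading(0.97)  # overfill detected: node shuts its own pump off
print(node.pump_running)     # False
```

Because the interlock depends only on the node’s own state, it keeps working even if every network link to the node is lost, which is the property the article highlights.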

Echelon announces new €3bn Milan data centre site
Echelon, a developer and operator of large-scale data centre infrastructure, has partnered with controlled affiliates of Starwood Capital Group to acquire a 37-acre site with grid power near Milan. Echelon says this investment marks the next step of its expansion into Continental Europe and follows the announcement of a €2 billion (£1.74 billion) joint venture (JV) with Spanish energy company Iberdrola to develop data centres in Spain.

Development will begin immediately to create one of Italy’s largest data centre campuses. The site has electrical capacity of 250MVA gross power - 100MVA of which is available immediately through the existing onsite substation. Up to €3 billion (£2.6 billion) will be invested in the development of the LIN10 data centre campus.

Industry comments

Niall Molloy, CEO of Echelon, says, “Echelon is very pleased to partner with Starwood Capital to enter this new market. LIN10 has in place grid power, scale, and flexibility, which makes it one of the most attractive projects in Europe.

"It is ready to build and offers exceptional opportunities for hyperscale operators. We expect to start construction imminently and have the facility operational in 18 to 24 months. Everyone at Echelon is delighted to have secured our first development site in continental Europe.”

David Smith, Chief Investment Officer at Echelon, comments, “Entering the Italian market is another significant milestone on Echelon’s growth trajectory, and we are delighted to have made this strategic step.

"We have a strong pipeline of exciting opportunities across Europe and expect to add additional markets over the next 24 months to continue to support the growth of our customers.”

Maximilian Gentile, Senior Vice President at Starwood Capital, adds, “We believe in the fundamental growth drivers of the Milan data centre market.
"Demand for data centre capacity continues to grow exponentially globally and this investment demonstrates Echelon’s commitment to delivering power and scale to help customers meet the requirements of an increasingly AI-driven digital economy.” Echelon currently has seven data centre facilities either operational or in development across Ireland, the United Kingdom, and Spain, with a combined capacity of approximately 1.25GW. The acquisition of LIN10 forms part of the company’s growth plans to develop an additional 1.5GW of capacity across new locations over the next five years. For more from Echelon, click here.

Ramboll report outlines roadmap to sustainable data centres
A new report published by Ramboll, an architecture, engineering, and consultancy company, at Climate Week NYC sets out a strategic framework for reducing the environmental impact of data centres and achieving net zero carbon. The report, Developing sustainable data centres: A strategic roadmap to achieve net zero carbon and reduce environmental impact, provides guidance across the full value chain, with recommendations for owners, developers, operators, and consultants. It addresses key sustainability challenges including embodied and operational carbon, biodiversity, circularity, energy, and water use.

Tackling operational and embodied carbon

Data centres accounted for around 1.5% of global electricity consumption last year, a figure projected by the International Energy Agency (IEA) to double by 2030. Given this demand, operational carbon is the largest component of emissions. The report states that net zero operational carbon is achievable through measures such as optimised energy efficiency, renewable energy procurement, energy reuse and export, and demand response. Embodied carbon, associated with construction materials, can be reduced by using low-carbon steel and concrete, sourcing locally, and reusing materials from decommissioned buildings.

Ed Ansett, Ramboll’s Global Director of Technology and Innovation, says, “The construction of data centres powered by the rise of artificial intelligence is booming across the globe, driving unprecedented demand for electricity and significantly contributing to global greenhouse gas emissions, increased water consumption, waste production, habitat destruction, and resource depletion.
"These challenges can be managed and mitigated if data centres are built with climate, biodiversity, and circularity impacts in mind from the very start.”

Biodiversity, circularity, and water use

The report highlights the importance of integrating biodiversity into site planning, recommending ecological surveys to identify protected species and habitats at an early stage. It also calls for the involvement of landscape architects to help reduce ecological impacts. For circularity, Ramboll proposes a benchmark of 100% reuse, reusability, or recyclability of materials, with no output to landfill or incineration. Water consumption, a major concern in regions with limited supply, can be reduced by achieving water neutrality. Strategies include avoiding water-based cooling, maximising cycles of concentration, and making use of alternative sources such as rainwater.

Ed continues, “There are economic benefits for data centre owners if they focus on circular practices. For instance, the sole physical byproduct of data centre energy consumption is heat, which has historically been unused and released to atmosphere. Data centres are in an excellent position to export what would otherwise be wasted energy.”

For more from Ramboll, click here.

EcoDataCenter breaks ground on mega campus in Borlänge
EcoDataCenter has started construction of the new mega campus Kvarnsveden in Borlänge, Sweden. The project represents a long-term establishment of significant industrial scale, with the first data centre at the site scheduled for completion in early 2027.

Peter Michelson, CEO of EcoDataCenter, comments, “This is a historic day for EcoDataCenter, for Borlänge, and for Sweden. AI infrastructure is a new base industry, and Kvarnsveden will play a key role in supporting digitalisation. The facility in Borlänge will become one of the largest projects of its kind in Europe.”

At launch, EcoDataCenter 2 in Borlänge will have access to 250 MW, with the potential to scale up to 600 MW. The development follows EcoDataCenter’s acquisition of the former Kvarnsveden paper mill in 2024, creating a unique opportunity to transform an industrial landmark into a hub for next-generation technology.

Peter continues, “The facility once produced paper – the raw material of the newspaper information age. Now, Borlänge will produce the raw material for AI and the next information age.”

In parallel with the construction start, EcoDataCenter has also signed an exclusive agreement to acquire additional land at the site, ensuring additional capacity to meet the rapidly growing demand for compute power.

Erik Nises (S), Chairman of the Municipal Board in Borlänge, concludes, “We value what EcoDataCenter brings to our municipality and are pleased that construction can begin so soon after the site acquisition. We look forward to seeing the Kvarnsveden paper mill brought to life in a new form.”

For more from EcoDataCenter, click here.

Renewables key to public support for DCs, says report
A new poll has found that public support for UK data centre development depends heavily on the use of renewable energy. The research, carried out by YouGov for net zero communications agency Alpaca Communications and supported by TechUK, shows that while most people are in favour of new data centres, they are cautious about their environmental and social impact.

Renewables drive public approval

According to the findings, 75% of respondents support data centres powered by renewable energy. This drops to 40% for nuclear power and just 20% for fossil fuels. The report, Powering the Fourth Industrial Revolution, identifies renewable energy as the strongest driver of support. Sustainability concerns, including the environmental impact of construction (40%) and ongoing operations (28%), ranked as key public priorities, alongside cyber security (35%) and cost (28%). By contrast, appearance (15%) and distance from homes (24%) were lower priorities.

Despite the role data centres play in everyday life - from NHS records to online banking, streaming, and AI - awareness remains low. Only 8% of people say they “know a lot” about data centres, while 27% have never heard of them. Even among 18–24 year olds, often viewed as the most digitally engaged, just 3% claim to know much about the sector.

National support drops at local level

The research highlights a gap between national and local support. More than half of people (52%) back additional data centres across the UK, but this falls to 44% when projects are located near their communities. The report argues that developers can address this by making projects relatable to communities, highlighting benefits such as jobs, training, digital access, and investment in local infrastructure.

AI, another driver of demand for data centres, also divides opinion. While most people have heard of it, only 18% feel positive about its impact on the UK, compared with 42% who feel negative.
Sector urged to focus on trust and sustainability

Peter Elms, Founder and Director at Alpaca Communications, says, “Data centres are the critical infrastructure powering the UK’s AI revolution, but they’re invisible to the public. The sector has a choice: keep quiet and risk opposition, or go green, engage locally, and earn trust.”

Luisa Cardani, Head of Data Centres Programme at TechUK, adds, “With data centres contributing £4.7 billion annually to the UK economy and supporting 43,000 jobs, the industry must now make sustainability central. The message from the public is clear: renewable power is the only option.”

The report concludes that to secure public support, data centres need to be explained in clear, relatable terms; powered sustainably; and developed in partnership with local communities. With demand for AI and digital services rising, the research points to an opportunity for the technology and energy sectors to align infrastructure with public expectations.

Schneider Electric unveils AI DC reference designs
Schneider Electric, a French multinational specialising in energy management and industrial automation, has announced new data centre reference designs developed with NVIDIA, aimed at supporting AI-ready infrastructure and easing deployment for operators. The designs include integrated power management and liquid cooling controls, with compatibility for NVIDIA Mission Control, the company’s AI factory orchestration software. They also support deployment of NVIDIA GB300 NVL72 racks with densities of up to 142kW per rack.

Integrated power and cooling management

The first reference design provides a framework for combining power management and liquid cooling systems, including Motivair technologies. It is designed to work with NVIDIA Mission Control to help manage cluster and workload operations. This design can also be used alongside Schneider Electric’s other data centre blueprints for NVIDIA Grace Blackwell systems, allowing operators to manage the power and liquid cooling requirements of accelerated computing clusters.

A second reference design sets out a framework for AI factories using NVIDIA GB300 NVL72 racks in a single data hall. It covers four technical areas: facility power, cooling, IT space, and lifecycle software, with versions available under both ANSI and IEC standards.

Deployment and performance focus

According to Schneider Electric, operators are facing significant challenges in deploying GPU-accelerated AI infrastructure at scale. Its designs are intended to speed up rollout and provide consistency across high-density deployments.

Jim Simonelli, Senior Vice President and Chief Technology Officer at Schneider Electric, says, “Schneider Electric is streamlining the process of designing, deploying, and operating advanced AI infrastructure with its new reference designs.
"Our latest reference designs, featuring integrated power management and liquid cooling controls, are future-ready, scalable, and co-engineered with NVIDIA for real-world applications - enabling data centre operators to keep pace with surging demand for AI.”

Scott Wallace, Director of Data Centre Engineering at NVIDIA, adds, “We are entering a new era of accelerated computing, where integrated intelligence across power, cooling, and operations will redefine data centre architectures.

"With its latest controls reference design, Schneider Electric connects critical infrastructure data with NVIDIA Mission Control, delivering a rigorously validated blueprint that enables AI factory digital twins and empowers operators to optimise advanced accelerated computing infrastructure.”

Features of the controls reference design

The controls system links operational technology and IT infrastructure using a plug-and-play approach based on the MQTT protocol. It is designed to provide:

• Standardised publishing of power management and liquid cooling data for use by AI management software and enterprise systems
• Management of redundancy across cooling and power distribution equipment, including coolant distribution units and remote power panels
• Guidance on measuring AI rack power profiles, including peak power and quality monitoring

Reference design for NVIDIA GB300 NVL72

The NVIDIA GB300 NVL72 reference design supports clusters of up to 142kW per rack. A data hall based on this design can accommodate three clusters powered by up to 1,152 GPUs, using liquid-to-liquid coolant distribution units and high-temperature chillers. The design incorporates Schneider Electric’s ETAP and EcoStruxure IT Design CFD models, enabling operators to create digital twins for testing and optimisation. It builds on earlier blueprints for the NVIDIA GB200 NVL72, reflecting Schneider Electric’s ongoing collaboration with NVIDIA.
The company now offers nine AI reference designs covering a range of scenarios, from prefabricated modules and retrofits to purpose-built facilities for NVIDIA GB200 and GB300 NVL72 clusters. For more from Schneider Electric, click here.
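The plug-and-play MQTT approach described above amounts to each piece of equipment publishing structured telemetry on well-known topics that management software can subscribe to. As a rough illustration, the sketch below builds such a topic and JSON payload; the topic layout and field names are hypothetical, not Schneider Electric’s published schema, and a real deployment would hand the payload to an MQTT client library.

```python
import json
import time

def build_telemetry(site, device, metrics):
    """Build an MQTT-style topic and JSON payload for one equipment node.

    The "dc/<site>/<device>/telemetry" layout is an illustrative assumption;
    a real system would follow the vendor's schema and pass the result to an
    MQTT client's publish call.
    """
    topic = f"dc/{site}/{device}/telemetry"
    payload = json.dumps({
        "device": device,
        "ts": int(time.time()),
        "metrics": metrics,  # e.g. rack power draw, coolant supply temperature
    })
    return topic, payload

topic, payload = build_telemetry(
    "hall1", "cdu-03",
    {"rack_power_kw": 138.2, "coolant_supply_c": 32.5},
)
print(topic)  # dc/hall1/cdu-03/telemetry
```

Because every device publishes the same self-describing shape, AI management software and enterprise systems can consume new equipment without point-to-point integration work, which is the appeal of the plug-and-play model.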

Microgrids are key to accelerating DC growth, research finds
A combination of renewables, grid balancing engines, and energy storage makes for the most cost-effective microgrids to power data centres, while also cutting emissions and providing vital grid balancing to enable the energy transition, according to a new research paper from technology group Wärtsilä and energy solutions business AVK.

The paper, Data centre dispatchable capacity: a major opportunity for Europe’s energy transition, provides new analysis on how data centre microgrids can reduce grid infrastructure spending, emissions, and wasted energy, while providing a balanced path for the energy transition. The analysis finds that powering data centres across Europe with optimised microgrids could create a significant bank of dispatchable power, supporting the entire continent’s energy transition.

The rapid growth of AI is driving increased demand for data centres across Europe, which is expected to increase by 250% by 2030, from 10GW to 35GW. With the continent’s grid facing constraints from high energy prices and bloated grid connection queues, data centre operators are increasingly turning to off-grid solutions to power these energy-intensive assets.

Anders Lindberg, President of Wärtsilä Energy and Executive Vice President of Wärtsilä, says, “The growth of AI over recent years has been extraordinary, and as it continues to transform the way we live and work, it drives a need for more energy. This is causing significant challenges for grid operators across Europe, who are struggling with rising costs and up to a 10-year waiting time for a grid connection.

“By investing in microgrids, data centres can sidestep energy constraints, and with the right technology mix of renewables, grid balancing engines, and energy storage, can ensure their emissions profiles and costs do not outweigh the huge benefits that AI brings.”
AVK CEO Ben Pritchard comments, “The answer to the challenges we face in combating climate change is as much to do with changing behaviours as developing new technologies. And the key to behavioural change is the recognition that there are different ways of doing things. The solutions outlined in this paper are not impractical; they are based on real-world cases and calculations. All that’s needed to make them more widespread is for investors, operators, equipment suppliers, planners, and policy makers to recognise the widespread benefits that sharing dispatchable data centre capacity with the grid can bring, and to pass that knowledge on.”

In addition to the benefits created by microgrids, engine power plants bring cost efficiencies to data centre power generation. Modelling an 80MW data centre, a combination of engine power plants, renewables, and energy storage provides the lowest levelised cost of electricity - at 108 EUR/MWh - in comparison to three other real-world scenarios. It also offers a low-emissions scenario in comparison to the other modelled scenarios, particularly in comparison to gas turbines. The emissions of engine power plants can also decrease as sustainable fuels become commercially available.

“Through investing in flexibility, microgrids can have the lowest possible cost, while cutting emissions dramatically compared to other pathways, including turbines. This flexibility can have a significant, positive impact on the continent’s digital and energy transition,” Anders Lindberg states.

On current trajectories, 40% of existing AI data centres will be operationally constrained by power availability by 2027. Microgrids can take this new strain off the grid in the short term, and when grid connection is achieved, excess energy generated can be sold. As well as furthering cost reductions for data centre operators, this can provide vital flexibility for Europe's power challenges.

Read the new research paper by clicking here. For more from AVK, click here.
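The levelised cost figure quoted above (108 EUR/MWh) is the standard LCOE metric: lifetime discounted costs divided by lifetime discounted generation. A minimal sketch of that calculation, using made-up inputs rather than the paper’s actual figures:

```python
def lcoe(capex, annual_opex, annual_mwh, years, discount_rate):
    """Levelised cost of electricity in EUR/MWh: discounted lifetime costs
    divided by discounted lifetime generation. All inputs are illustrative."""
    costs = capex      # upfront spend is not discounted (year 0)
    energy = 0.0
    for t in range(1, years + 1):
        df = (1 + discount_rate) ** t
        costs += annual_opex / df    # fuel, maintenance, etc.
        energy += annual_mwh / df    # generation discounted the same way
    return costs / energy

# Hypothetical 80MW microgrid at ~60% utilisation over 20 years
value = lcoe(
    capex=150_000_000,         # EUR
    annual_opex=25_000_000,    # EUR/year
    annual_mwh=80 * 8760 * 0.6,
    years=20,
    discount_rate=0.08,
)
print(round(value, 1))  # EUR/MWh for these made-up inputs
```

Comparing technology mixes then reduces to running this with each scenario’s cost and generation profile, which is how a figure like 108 EUR/MWh can be ranked against turbine-based alternatives.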

Quantum-AI data centre opens in New York City
Oxford Quantum Circuits (OQC) and Digital Realty today announced the launch of the first Quantum-AI Data Centre in New York City, located at Digital Realty’s JFK10 facility and built with NVIDIA GH200 Grace Hopper Superchips.

- Quantum-AI Data Centre: OQC and Digital Realty are working with NVIDIA to integrate superconducting quantum computers and AI supercomputing under one roof, creating a data centre built for the Quantum-AI era.
- Landmark deployment and integration: OQC’s GENESIS quantum computer will integrate NVIDIA Grace Hopper Superchips to become the first-ever quantum computing system deployed in New York City. OQC plans to integrate its quantum hardware with NVIDIA accelerated computing to support the scalability of future systems.
- Quantum-AI at scale: Embedded within Digital Realty’s global platform, PlatformDIGITAL, OQC is delivering secure, interconnected Quantum-AI infrastructure to power breakthroughs from Wall Street to security – a central pillar of the UK–US tech trade partnership to be announced.

The Quantum-AI Data Centre brings together OQC’s quantum computing, NVIDIA accelerated AI hardware, and Digital Realty’s cutting-edge global infrastructure, eliminating geographical and infrastructure barriers to enable businesses to harness the power of quantum compute and AI. This initiative allows enterprises to access an integrated environment where quantum computing powers the AI revolution: enabling faster model training, more efficient data generation, and transformative applications in finance and security.

The system features OQC GENESIS, a logical-era quantum computer, installed within Digital Realty’s secure JFK10 site – the first-ever quantum computer installed within a New York City data centre. Integrated with NVIDIA Grace Hopper Superchips, the platform provides a launchpad for hybrid workloads and enterprise adoption at scale.
OQC expects that future GENESIS systems will ship with NVIDIA accelerated computing as standard, building on its earlier collaboration integrating the NVIDIA CUDA-Q platform and providing developers seamless tools to build hybrid quantum-AI applications.

“This Quantum-AI Data Centre demonstrates how quantum can drive the AI revolution - securely, practically, and at scale - while strengthening the UK–US technology alliance,” says Gerald Mullally, CEO of OQC. “Leveraging Digital Realty’s infrastructure and NVIDIA supercomputing, we are redefining enterprise computing for finance and security.”

“Digital Realty’s mission has always been to enable the world’s most innovative technologies by providing secure, interconnected infrastructure at global scale,” adds Andy Power, President & CEO of Digital Realty. “By working with OQC, we’re using NVIDIA supercomputing to make Quantum-AI directly accessible in one of the world’s most important data hubs - empowering enterprises and governments to unlock new levels of performance and resilience.”

Science Minister Patrick Vallance comments, “Quantum computing could transform everything - from speeding up drug discovery to supercharging clean energy so we can cut bills. The economic prize is enormous, with £212 billion expected to flow into the UK economy by 2045 and tens of thousands of high-skilled jobs on offer. OQC’s launch of the first quantum computer in New York City showcases British tech excellence and strengthens our transatlantic ties. And the industry’s first quantum-AI data centre will put British innovation at the heart of next-gen computing - delivering speed, scale, and security to tackle problems today’s tech is yet to grasp."
Applications and impact

By integrating quantum computing with NVIDIA AI supercomputing inside a secure, enterprise-grade data centre, OQC and Digital Realty are creating a platform that will unlock new possibilities across critical sectors:

Finance: Faster and more accurate risk modelling, portfolio optimisation, fraud detection, and derivatives pricing, delivering competitive advantage in the world’s most data-intensive markets.

Security: Advanced material simulation, logistics optimisation, and decision-making under uncertainty, strengthening resilience in mission-critical domains.

Quantum for AI: Quantum computing will unlock new frontiers for AI itself, from accelerating model training and efficient data generation to emerging quantum machine learning applications with transformative impact across industries.

“This milestone shows the strength of a British tech leader scaling globally through international collaboration,” says Jack Boyer, Chair of OQC. “Working with Digital Realty and using NVIDIA supercomputing here in the United States, OQC demonstrates how the UK and US can lead together in the responsible deployment of frontier technologies for finance and security.”

“The UK–US technology alliance is vital to ensuring that powerful new capabilities like quantum computing protect our nations, improve our prosperity, and are developed securely and in line with democratic values,” remarks Sir Jeremy Fleming, OQC Board member and former Director of GCHQ. “This deployment combines British innovation and American infrastructure, and brings NVIDIA’s AI leadership to deliver trusted computing power for the most critical applications.”

Proven technology and roadmap

OQC is reportedly the only quantum computing company with live deployments in colocated data centres: OQC already has systems operating in London and Tokyo, and now in New York.
Its patented dual-rail Dimon qubit technology represents a breakthrough in error suppression, reducing the hardware overheads needed for error-corrected qubits and accelerating the path to fault-tolerant quantum computing. OQC has set a market-leading roadmap – in collaboration with Digital Realty – to deliver scalable, commercially viable systems, with near-term impact in finance, defence, and AI. As a British champion of quantum computing, OQC is committed to building systems that drive both commercial advantage and national resilience.

For more from Digital Realty, click here.

NorthC to build new data centre at uptownBasel campus
NorthC, a Dutch provider of sustainable data centre and colocation services, has signed an agreement to develop a regional data centre at the uptownBasel Innovation Campus in Arlesheim, Switzerland. The project expands NorthC’s existing collaboration with uptownBasel and will deliver a facility designed to support advanced technologies such as artificial intelligence, quantum computing, diagnostics, and personalised medicine. Construction is scheduled to begin around 18 months after planning approval, with operations expected to start by mid-2027.

Sustainable design and regional focus

In its first phase, the facility will cover 2,500 m² and provide 6 MVA of power capacity. It will be powered entirely by renewable energy, with backup systems running on green diesel, part of NorthC’s goal to achieve climate neutrality by 2030. The centre will also incorporate waste heat reuse for residential heating.

Hans-Jörg Fankhauser, founder of uptownBasel, says, “With NorthC, we have a partner that shares our vision, one that understands the potential of operating a data centre on a globally recognised innovation campus.” Fankhauser adds that uptownBasel aims to be “a platform for leading networks in medical technology, quantum computing, artificial intelligence, and the future of work. We aim to attract startups and talent to Arlesheim early on in their journey.”

NorthC already operates two facilities in nearby Münchenstein and says the new site will strengthen its role as an infrastructure provider for MedTech, Industry 4.0, and AI in the Basel region. With this development, the company will operate five data centres in Switzerland.

Alexandra Schless, CEO of NorthC Group, comments, “We are very pleased to deepen our partnership with uptownBasel. The construction of this data centre underscores our continued commitment to Switzerland as a strategic location and reflects our belief in innovation and the future.”

For more from NorthC, click here.

Quantum-ready FN-DSA (FIPS 206) nears draft approval
NIST has submitted the draft standard for FN-DSA (FIPS 206), the FALCON-based digital signature scheme, moving it closer to formal adoption as part of the post-quantum cryptography (PQC) standardisation process. FN-DSA was selected alongside ML-DSA and SLH-DSA for PQC standardisation, but its approval has taken longer due to mathematical complexity and refinements to its components. With the draft now submitted, the first release is imminent.

The draft will be published as an Initial Public Draft (IPD) for open review. While the timeline has not been finalised, it may coincide with the NIST PQC Standardisation Conference in September 2025. Based on past schedules, the review period is expected to last around one year, with a final standard likely in late 2026 or early 2027.

Industry preparations

Companies such as DigiCert, a US-based digital security company, are preparing for FN-DSA’s rollout. To avoid confusion around naming and identifiers, DigiCert has stated it will not implement FN-DSA in production until the standard is finalised. In the meantime, the company will make the IPD version available for experimentation through DigiCert Labs, which already hosts FALCON for testing. This will enable the wider community to trial the draft standard before formal approval.

Role in post-quantum cryptography

FN-DSA is seen as a special-purpose scheme rather than a replacement for ML-DSA. Its smaller signature sizes could reduce certificate chain lengths, which is valuable in environments where efficiency is a priority. However, due to the complexity of FALCON’s signing process, FN-DSA is less suited to frequently signed leaf certificates; instead, it is expected to be more useful for root and intermediate certificates. NIST has also signalled potential adjustments to signing and sampling methods, which could broaden FN-DSA’s applications once the draft specification is published. The progress of FN-DSA marks another milestone in the move towards quantum-safe standards.
Organisations are being encouraged to begin preparation now by testing draft algorithms, trialling implementations, and developing crypto-agility strategies to ensure a smooth transition as PQC standards are finalised. For more from DigiCert, click here.
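One practical form of the crypto-agility the article recommends is routing all signing through an algorithm registry, so a post-quantum scheme such as FN-DSA can be slotted in once FIPS 206 is final without rewriting call sites. The sketch below uses HMAC stand-ins from the Python standard library purely to illustrate the registry pattern; HMACs are message authentication codes, not signatures, and no FN-DSA implementation is assumed.

```python
# Minimal crypto-agility sketch: callers pick an algorithm by name, so adding
# a post-quantum entry later (e.g. "fn-dsa") is a registration, not a rewrite.
# HMAC-SHA256 is a standard-library stand-in for illustration only.
import hashlib
import hmac

SIGNERS = {}

def register(name, sign_fn, verify_fn):
    """Register a (sign, verify) pair under an algorithm name."""
    SIGNERS[name] = (sign_fn, verify_fn)

def sign(name, key, message):
    return SIGNERS[name][0](key, message)

def verify(name, key, message, tag):
    return SIGNERS[name][1](key, message, tag)

# Today's entry; a future "fn-dsa" entry would slot in alongside it.
register(
    "hmac-sha256",
    lambda key, msg: hmac.new(key, msg, hashlib.sha256).digest(),
    lambda key, msg, tag: hmac.compare_digest(
        hmac.new(key, msg, hashlib.sha256).digest(), tag),
)

tag = sign("hmac-sha256", b"secret", b"certificate-to-be")
print(verify("hmac-sha256", b"secret", b"certificate-to-be", tag))  # True
```

Keeping algorithm choice behind a name like this is what lets an organisation trial a draft algorithm in a lab environment and later switch production over with a configuration change.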


