
Quantum Computing: Infrastructure Builds, Deployment & Innovation


Sitehop, Red Helix testing quantum-ready encryption
Sitehop, a UK startup focused on quantum encryption, has announced a partnership with Red Helix, a network and security testing company, to bring advanced testing in-house and to "supercharge" the critical speed-testing of its encrypted data transmission, utilising a Teledyne LeCroy Xena system.

With support from a five-figure 'productivity grant' from the South Yorkshire Mayoral Authority, Sitehop has invested in a Teledyne LeCroy Xena Loki 100G traffic-generation and testing platform, which enables bi-directional testing of sub-microsecond latency in 100Gbps networks.

Bringing testing in-house has also reportedly eliminated delays and risks in export and customs, which previously meant a minimum two-week turnaround at more than £18,000 per testing cycle. Sitehop had relied on an outsourced facility in France, but the new UK-based setup enables the company to complete testing in a single day, freeing the time of Sitehop's engineering teams and boosting their productivity.

Sitehop uses the Xena Loki device to test and validate its Sitehop SAFEcore platform, which is capable of 835-nanosecond latency at 100Gbps encryption. The platform can support 4,000 concurrent connections, deploying "crypto-agile" encryption for use in sectors such as telecoms, financial services, government, and critical national infrastructure.

Testing with the Xena Loki device covers peak-load conditions, burst traffic, error injection and fault recovery, and end-to-end encrypted traffic flows. Multi-stream stress tests, mixed-protocol environments, and real-time encrypted traffic benchmarking are also part of the process. According to the company, the "speed and accuracy" of the Xena Loki platform enables Sitehop to validate latency, throughput, packet loss, and error handling across different profiles. This is important to prove that the Sitehop SAFEcore platform has the necessary performance and resilience in high-bandwidth, low-latency environments and is ready for new use cases such as 5G backhaul, wearable security technology, and the evolution of post-quantum cryptography.

"Testing in this way is a strategic enabler for us, accelerating product release cycles and reducing the risk of field failure while providing clients with higher levels of confidence during procurement," says Melissa Chambers, Co-Founder and CEO of Sitehop. "This is a major selling point for enterprise and critical infrastructure environments.

"We are incredibly proud to be at the forefront of the next generation of British tech manufacturing and believe we are part of a resurgence of innovation in the UK. We are proving that deep tech, hardware innovation, and cyber resilience can thrive here.

"As we expand globally and target high-assurance sectors, our ability to validate performance independently and rapidly becomes a cornerstone of our growth model. The grant we received has been hugely important, enabling us to bring a critical capability in-house that has accelerated our growth momentum."

Baseline validation using the Xena Loki device is in line with the RFC 2544 and Y.1564 benchmarks. In practice, however, the company claims the Sitehop SAFEcore system "frequently outperforms the scope of traditional methodologies, requiring custom profiles including simulated threat-scenarios, multi-session encrypted traffic under dynamic key exchange, and adaptive stream-shaping."
Liam Jackson, Director of Technology Solutions at Red Helix, comments, "We are thrilled to work with Sitehop, an exciting start-up company demonstrating that hardware-based security innovation is alive and well in the UK.

"Testing quantum-ready security platforms requires precise accuracy, reliability, and sustained high-speed throughput, which software-only traffic-generation tools can struggle to deliver.

"Sitehop understands this, and by harnessing the hardware-based Teledyne LeCroy Xena Loki platform, it hugely accelerates essential testing, gaining the speed, precision, and confidence to bring its cutting-edge solutions to market faster without impacting quality."
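For context, the headline numbers above are easy to sanity-check. The following minimal Python sketch uses only standard Ethernet framing constants plus the 100Gbps and 835-nanosecond figures quoted in this article to compute the theoretical frame rates that an RFC 2544-style throughput test converges on; it is illustrative arithmetic, not Sitehop's or Teledyne LeCroy's test methodology.

# Theoretical line-rate frame throughput at 100 Gbps, the ceiling an
# RFC 2544-style throughput test converges on. Framing constants are
# standard Ethernet; the 100 Gbps and 835 ns figures come from the article.

LINE_RATE_BPS = 100e9          # 100 Gbps link, as tested on the Xena Loki
PREAMBLE_AND_IFG = 8 + 12      # preamble + inter-frame gap, in bytes

def max_frames_per_second(frame_size_bytes: int) -> float:
    """Theoretical maximum frame rate for a given frame size."""
    bits_on_wire = (frame_size_bytes + PREAMBLE_AND_IFG) * 8
    return LINE_RATE_BPS / bits_on_wire

# Smallest and largest standard Ethernet frames, the worst and best cases
# a traffic generator exercises.
print(f"64B frames:   {max_frames_per_second(64) / 1e6:.1f} Mpps")    # ~148.8 Mpps
print(f"1518B frames: {max_frames_per_second(1518) / 1e6:.2f} Mpps")  # ~8.13 Mpps

# Sitehop quotes 835 ns encryption latency; at 64B line rate, this is
# roughly how many frames are "in flight" inside the device at once:
latency_s = 835e-9
print(f"Frames in flight at 64B line rate: {latency_s * max_frames_per_second(64):.0f}")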

DigiCert opens registration for World Quantum Readiness Day
DigiCert, a US-based digital security company, has announced open registration for its annual World Quantum Readiness Day virtual event, which takes place on Wednesday, 10 September 2025. The company is also accepting submissions for its Quantum Readiness Awards. Both initiatives aim to spotlight the critical need for current security infrastructures to adapt to the imminent reality of quantum computing.

World Quantum Readiness Day is, according to DigiCert, a "catalyst for action, urging enterprises and governments worldwide to evaluate their preparedness for the emerging quantum era." It seeks to highlight the growing urgency to adopt post-quantum cryptography (PQC) standards and provide a "playbook" to help organisations defend against future quantum-enabled threats.

"Quantum computing has the potential to unlock transformative advancements across industries, but it also requires a fundamental rethink of our cybersecurity foundations," argues Deepika Chauhan, Chief Product Officer at DigiCert. "World Quantum Readiness Day isn't just a date on the calendar; it's a starting point for a global conversation about the urgent need for collective action to secure our quantum future."

The Quantum Readiness Awards were created to celebrate organisations that are leading the charge in quantum preparedness. Judges for the Quantum Readiness Awards include:

· Bill Newhouse, Cybersecurity Engineer & Project Lead, National Cybersecurity Center of Excellence, NIST
· Dr Ali El Kaafarani, CEO, PQShield
· Alan Shimel, CEO, TechStrong Group
· Blair Canavan, Director, Alliances PQC Portfolio, Thales
· Tim Hollebeek, Industry Technology Strategist, DigiCert

For more from DigiCert, click here.

IBM, RIKEN unveil first Quantum System Two outside of the US
IBM, an American multinational technology corporation, and RIKEN, a national research laboratory in Japan, have unveiled the first IBM Quantum System Two to be deployed outside of the United States and beyond an IBM quantum data centre. The system is also the first quantum computer to be co-located with RIKEN's supercomputer, Fugaku, one of the most powerful classical systems on Earth.

The effort is supported by the New Energy and Industrial Technology Development Organisation (NEDO), an organisation under the jurisdiction of Japan's Ministry of Economy, Trade, and Industry (METI), through its 'Development of Integrated Utilisation Technology for Quantum and Supercomputers' programme, part of the 'Project for Research and Development of Enhanced Infrastructures for Post-5G Information and Communications Systems.'

IBM Quantum System Two at RIKEN is powered by IBM's 156-qubit IBM Quantum Heron processor. Heron's quality, as measured by the two-qubit error rate across a 100-qubit layered circuit, is 3×10⁻³, which, the company claims, is 10 times better than the previous-generation 127-qubit IBM Quantum Eagle. Heron's speed, as measured by the CLOPS (circuit layer operations per second) metric, is 250,000, reflecting a further tenfold improvement over Eagle in the past year. At a scale of 156 qubits, with these quality and speed metrics, IBM describes Heron as the most performant quantum processor in the world.

This latest Heron is capable of running quantum circuits that are beyond brute-force simulation on classical computers, and its connection to Fugaku will enable RIKEN teams to use quantum-centric supercomputing approaches to push forward research on advanced algorithms, such as fundamental chemistry problems.

The new IBM Quantum System Two is co-located with Fugaku within the RIKEN Center for Computational Science (R-CCS), one of Japan's high-performance computing (HPC) centres. The computers are linked through a high-speed network at the fundamental instruction level to form a proving ground for quantum-centric supercomputing. This low-level integration aims to allow RIKEN and IBM engineers to develop parallelised workloads, low-latency classical-quantum communication protocols, and advanced compilation passes and libraries. Because quantum and classical systems offer different computational strengths, the goal is to allow each paradigm to perform the parts of an algorithm for which it is best suited.

The development expands IBM's global fleet of quantum computers and was officially launched during a ribbon-cutting ceremony on 24 June 2025 in Kobe, Japan. The event featured opening remarks from RIKEN President Makoto Gonokami; Jay Gambetta, IBM Fellow and Vice President of IBM Quantum; Akio Yamaguchi, General Manager of IBM Japan; as well as local parliament members and representatives from the Kobe Prefecture and City, METI, NEDO, and MEXT.

"The future of computing is quantum-centric, and with our partners at RIKEN we are taking a big step forward to make this vision a reality," claims Jay Gambetta. "The new IBM Quantum System Two, powered by our latest Heron processor and connected to Fugaku, will allow scientists and engineers to push the limits of what is possible."
"By combining Fugaku and the IBM Quantum System Two, RIKEN aims to lead Japan into a new era of high-performance computing," says Mitsuhisa Sato, Division Director of the Quantum-HPC Hybrid Platform Division, RIKEN Center for Computational Science. "Our mission is to develop and demonstrate practical quantum-HPC hybrid workflows that can be explored by both the scientific community and industry. The connection of these two systems enables us to take critical steps toward realising this vision." The installation of IBM Quantum System Two at RIKEN is poised to expand previous efforts by RIKEN and IBM researchers as they seek to discover algorithms that offer quantum advantage: the point at which a quantum computer can solve a problem faster, cheaper, or more accurately than any known classical method. This includes work recently featured on the cover of Science Advances, based on sample-based quantum diagonalisation (SQD) techniques to accurately model the electronic structure of iron sulphides, a compound present widely in nature and organic systems. The ability to realistically model such a complex system is essential for many problems in chemistry, and was historically believed to require fault-tolerant quantum computers. SQD workflows are among the first demonstrations of how the near-term quantum computers of today can provide scientific value when integrated with powerful classical infrastructure. For more from IBM, click here.

KETS Quantum Security reacts to Salt Typhoon cyber attacks
On the back of the Salt Typhoon cyber attacks, Chris Erven, CEO and Co-Founder of KETS Quantum Security, comments on the potential threat of China developing a quantum computer and the danger for telecommunications companies.

Chris takes up the story: "This is a fully global threat. Every single telco should be considering their cyber defences in the wake of the Salt Typhoon attacks.

"China is making some of the largest investments in quantum computing, pumping billions of dollars into research and development in the hope of being the first to create a large-scale, cryptographically relevant machine. And although they may be a few years away from being fully operational, we know a quantum computer will be capable of breaking all the traditional cyber defences we currently use. So they, and others, are actively harvesting now, to decrypt later.

"Telcos are particularly vulnerable since they provide the communication services for major enterprises and many governments, so these organisations should be the first to upgrade to quantum-safe methods, including a defence-in-depth approach with quantum key distribution and post-quantum algorithms.

"Adding to the danger, many telcos are moving to software-defined networks, which use software-based controllers to manage the underlying network infrastructure rather than dedicated and more restricted hardware devices. This makes them particularly vulnerable because if an adversary gets into the management plane of a telco's SDN, they will have complete control of that network, whereas in the past, the access would have been much more limited. We really are talking about taking down the UK's national telecommunications network.

"Despite warning bells being raised for the last decade, Q-Day is rapidly approaching, and telcos have to prepare now to avoid a catastrophic data breach. Thankfully, telcos like BT and SK Telecom are actively working to upgrade their systems to make them quantum-safe in the future. However, this transition needs to happen even quicker, and the Salt Typhoon attacks serve as a timely reminder that robust cyber defences are not a 'nice to have'; they are essential to protecting our way of living."
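The "harvest now, decrypt later" risk Erven describes is often formalised as Mosca's inequality: data is already at risk if the years it must remain secret (x), plus the years a quantum-safe migration will take (y), exceed the years until a cryptographically relevant quantum computer (CRQC) arrives (z). A minimal Python sketch, with illustrative values that are assumptions rather than KETS estimates:

# Mosca's inequality: if x + y > z, data encrypted today can be harvested
# now and decrypted once a CRQC exists, even though that machine does not
# exist yet. The example values below are illustrative assumptions.

def mosca_at_risk(x_secrecy_years: float, y_migration_years: float,
                  z_years_to_crqc: float) -> bool:
    """True if data encrypted today will still need secrecy after a CRQC exists."""
    return x_secrecy_years + y_migration_years > z_years_to_crqc

# A telco holding call records for 7 years, facing a 5-year migration,
# against a hypothetical 10-year horizon to a large-scale quantum computer:
print(mosca_at_risk(x_secrecy_years=7, y_migration_years=5, z_years_to_crqc=10))  # True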

Amidata implements Quantum ActiveScale to launch new cloud storage service
Quantum Corporation has announced that Amidata has implemented Quantum ActiveScale Object Storage as the foundation for its new Amidata Secure Cloud Storage service. Amidata has deployed ActiveScale object storage to build a secure, resilient set of cloud storage services accessible from across Australia, where the company is based. In this way, the company achieves operational efficiency, seamless scalability, and the ability to address customer needs across a wide range of use cases, workflows, and price points.

Amidata's adoption of object storage also aligns with current IT trends. "More and more organisations are looking at object storage to create secure and massively scalable hybrid clouds," says Michael Whelan, Managing Director, Amidata. "ActiveScale provides a durable, cost-effective approach for backing up and archiving fast-growing data volumes while also protecting data from ransomware attacks. Plus, by deploying the ActiveScale Cold Storage feature, we are delivering multiple storage classes as part of our service offerings, allowing us to target a wider set of customers and use cases. With our Secure Cloud cold storage option, customers can retain data longer and at a lower cost; that's useful for offsite copies, data compliance, and increasingly, for storing the growing data sets that are fuelling AI-driven business analytics and insights."

ActiveScale supports multiple S3-compatible storage classes using flash, disk, and tape media, providing a seamless environment that can flexibly grow capacity and performance to any scale. Cold Storage, a key feature, integrates Quantum Scalar tape libraries as a lower-cost storage class to efficiently store cold and archived data sets. Quantum's tape libraries act as nearline storage, where customers can easily access and retrieve cold or rarely used data with slightly longer latency (minutes instead of seconds) but at a low cost, leveraging the same infrastructure used by the major hyperscalers. ActiveScale intelligently stores and protects data across all storage resources using Quantum's patented two-dimensional erasure coding to achieve extreme data durability, performance, availability, and storage efficiency.

For more information on Amidata's implementation of ActiveScale, view the video case study.
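Because ActiveScale exposes an S3-compatible interface, services built on it are typically consumed with standard S3 tooling. The following Python sketch uses the widely available boto3 client; the endpoint URL, bucket name, credentials, and cold-tier storage-class string are placeholders, as the article does not specify Amidata's or ActiveScale's actual values.

# Minimal sketch of writing an archive object to an S3-compatible service
# such as the one described above. Endpoint, credentials, bucket, and the
# storage-class name are all hypothetical placeholders.

import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://s3.example-provider.com.au",  # hypothetical endpoint
    aws_access_key_id="YOUR_ACCESS_KEY",
    aws_secret_access_key="YOUR_SECRET_KEY",
)

# A cold storage class trades retrieval latency (minutes rather than
# seconds, per the article) for lower cost.
with open("archive-0001.tar", "rb") as body:
    s3.put_object(
        Bucket="backups",
        Key="2024/site-a/archive-0001.tar",
        Body=body,
        StorageClass="GLACIER",  # placeholder cold-tier class name
    )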

MR Datentechnik launches a new data storage service
MR Datentechnik has implemented Quantum ActiveScale object storage along with Veeam Backup and Replication to launch a new S3-compatible storage service. By using ActiveScale, the company offers a resilient, highly scalable service with the flexibility to support a wide range of S3-enabled apps and workflows.

"Quantum ActiveScale provides the reliable, highly scalable S3-compatible object storage we needed for building our new storage service. The platform is stable even under high loads, and it offers sophisticated software that is extremely useful for multi-tenant management," says Jochen Kraus, Managing Director, MR Datentechnik.

Solution overview:
· Quantum ActiveScale Object Storage
· Veeam Backup and Replication

Key benefits:
· Built an easily scalable online data storage service to accommodate rapidly rising customer data volumes
· Seamlessly integrated with Veeam Backup and Replication v12
· Accelerated customer onboarding to under a day and achieved customer growth targets one year early
· Created a resilient, always-on service that provides reliable data access for customers
· Streamlined storage administration to minimise overhead and efficiently scale the managed service
· Gained the flexibility for future use cases by supporting the S3 storage protocol

Headquartered in the German state of Bavaria, MR Datentechnik offers a full range of IT solutions and managed services. Organisations engage the company for everything from infrastructure deployment and systems integration to digitisation initiatives and fully outsourced IT management. Recently, the leadership team at MR Datentechnik decided to launch a new storage service to support customers' needs to preserve and protect fast-growing data volumes. The service, designed for online storage of object data, could be used for backup and recovery, archiving, and data security. This online service would enable organisations to retrieve data rapidly, anytime, from anywhere.

Creating an S3-compatible service was a top priority. The team wanted to support S3 applications and workflows and facilitate integration with S3 cloud storage environments. For the service's launch, it decided to focus first on the backup use case. As a result, the underlying storage platform for the service had to integrate seamlessly with the latest version of Veeam Backup and Replication.

Quantum announces ActiveScale Cold Storage bundles
Quantum has announced new pre-configured bundles to make it even easier to purchase and deploy Quantum ActiveScale Cold Storage, an S3-enabled object storage solution architected for both active and cold data sets, which the company says reduces cold storage costs by up to 60%.

With the massive amount of data that customers need to retain for business and compliance purposes, they are using both public and private cloud resources to store and manage it, driven by their budget, the frequency with which they need to access the data, and their data protection requirements. With ActiveScale, customers can build their own cloud storage resource to control costs and ensure fast, easy access to their data for compliance, analysis, and gathering insights to drive business forward.

Named a leading 'outperformer' in the latest GigaOm Object Storage: Enterprise Radar Report, ActiveScale combines advanced object store software with hyperscale tape technology to provide massively scalable, highly durable, and extremely low-cost storage for archiving cold data, enabling organisations to maintain control of their most valuable data assets and unlock value in cold data over years and decades without unpredictable and expensive access fees.

Whether customers are developing solutions for life and Earth sciences, media production, government programmes, web services, IoT infrastructure, or video surveillance, ActiveScale is suited to unstructured data management, data analytics and AI workloads, active archiving, and long-term retention and protection of massive datasets.

To simplify purchasing, ActiveScale Cold Storage is now available in pre-configured bundles, complete with all the components that customers need to easily deploy the solution. The bundles are available in four standard capacity sizes (small, medium, large, and extra large), ranging from 10PB up to 100PB.
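As a quick illustration of the cost claim, the Python sketch below applies the quoted 'up to 60%' reduction to the smallest 10PB bundle; the baseline price per terabyte is a made-up placeholder, since the article gives no pricing.

# Only the 60% reduction and the 10 PB bundle size come from the article;
# the baseline $/TB/month figure is a hypothetical placeholder.

BASELINE_COST_PER_TB_MONTH = 20.0   # hypothetical all-disk object storage cost
COLD_REDUCTION = 0.60               # reduction claimed for ActiveScale Cold Storage

capacity_tb = 10_000                # smallest bundle: 10 PB
baseline = capacity_tb * BASELINE_COST_PER_TB_MONTH
cold = baseline * (1 - COLD_REDUCTION)

print(f"Baseline: ${baseline:,.0f}/month, cold tier: ${cold:,.0f}/month, "
      f"saving ${baseline - cold:,.0f}/month")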

Quantum expands hybrid cloud leadership with new features 
Quantum has announced new features in the company's end-to-end data platform, including advances to its policy-driven data movement technologies. These are intended to help customers build their ideal hybrid cloud workflow, seamlessly bridging on-prem deployments with multi-cloud integration.

With the massive amount of data customers need to retain for business and compliance purposes, customers are using both public and private cloud resources to store and manage this data, driven by their budget, the frequency with which they need to access the data, and their data protection requirements. With these new features, customers can place data exactly where it is needed, when it is needed. By using a highly flexible and powerful hybrid cloud environment, customers increase operational agility, reduce business risk, and optimise costs across on-prem and public cloud resources.

New capabilities include:

· Quantum DXi Cloud Share: With DXi Cloud Share, customers get more flexibility, as DXi appliances can now tier compressed, deduplicated backup data sets to both private and public storage clouds, including Quantum ActiveScale, providing up to 70 times more efficient use of cloud storage. This reduces business risks and costs, enabling offsite protection against ransomware and long-term retention of backup data for regulatory and in-house data compliance.

· Quantum FlexSync 3: FlexSync 3 adds fast, simple data replication to and from public and private clouds, including Quantum ActiveScale and, in a future release, Quantum Myriad. It provides a versatile data movement tool across Quantum's entire end-to-end portfolio, enabling customers to unite multiple on-premises and public cloud workflows across geographies with a shared, centralised object repository. This enables better collaboration among dispersed teams and delivers enhanced data protection and disaster recovery.

· Quantum ActiveScale cold replication: As organisations continue to collect, retain, and analyse petabytes to exabytes of data with advanced AI analytics, they must reduce the total cost of ownership required to preserve this data. The ActiveScale object storage platform now provides the industry's first immutable object replication between cold data services, replicating cold data between its own systems as well as to AWS S3 Glacier Flexible Retrieval and Glacier Deep Archive services. For massive data sets whose useful life spans years to decades, it delivers a durable, cost-effective multi-copy solution for long-term retention, so customers can analyse and derive insights from their data to drive business forward.

The new ActiveScale Cold Replication feature and FlexSync 3 are available immediately. DXi Cloud Share is planned for release in Q4 2023. FlexSync 3 for Myriad is planned for release next year.
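The "up to 70 times" figure for DXi Cloud Share is a data-reduction ratio achieved through deduplication and compression. A small Python sketch of what that ratio implies for cloud capacity, with a hypothetical 500TB backup set:

# The 70:1 ratio is Quantum's quoted maximum; the 500 TB logical backup
# size is an illustrative assumption.

def stored_bytes(logical_tb: float, reduction_ratio: float) -> float:
    """Physical capacity consumed after deduplication and compression."""
    return logical_tb / reduction_ratio

logical_backups_tb = 500            # hypothetical total protected data
print(f"{logical_backups_tb} TB of backups at 70:1 -> "
      f"{stored_bytes(logical_backups_tb, 70):.1f} TB in the cloud tier")
# 500 TB at 70:1 -> ~7.1 TB actually stored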

IBM to build its first European quantum data centre
IBM has announced plans to open its first Europe-based quantum data centre, located at its facility in Ehningen, Germany, to facilitate access to cutting-edge quantum computing for companies, research institutions, and government agencies. The data centre is expected to be operational in 2024, with multiple IBM quantum computing systems, each with utility-scale quantum processors (those of more than 100 qubits).

The data centre will serve as IBM Quantum's European cloud region, allowing users in Europe to provision services at the facility for their quantum computing research and exploratory activity. It is being designed to help clients continue to manage their European data regulation requirements, including processing all job data within EU borders. The facility will be IBM's second quantum data centre and quantum cloud region, after its New York facility.

"Europe has some of the world's most advanced users of quantum computers, and interest is only accelerating with the era of utility-scale quantum processors," says Jay Gambetta, IBM Fellow and Vice President of IBM Quantum. "The planned quantum data centre and associated cloud region will give European users a new option as they seek to tap the power of quantum computing in an effort to solve some of the world's most challenging problems."

"Our quantum data centre in Europe is an integral piece of our global endeavour," says Ana Paula Assis, IBM General Manager for EMEA. "It will provide new opportunities for our clients to collaborate side-by-side with our scientists in Europe, as well as their own clients, as they explore how best to apply quantum in their industry."

Why cold storage and the new data era go hand in hand
By Timothy Sherbak, Enterprise Products and Solutions Marketing at Quantum.

The phrase 'digital transformation' has been bandied around a lot over recent years, as many companies adopt technology into their processes to meet changing business and market requirements. The shift in the way companies operate means that the scale of data being generated has dramatically increased. With the amount of unstructured data growing at a rate of up to 60% per year, and expected to make up 80 to 90% of all data by 2025, there are no signs of things slowing down.

To put this into perspective, a semiconductor manufacturer produces over a billion image scans every year, documenting the manufacturing process of 4,000 wafers per week, which results in petabytes of data that must be stored for several years. Another example is autonomous cars: an autonomous car produces up to 2TB of data for every hour of use, which must be retained for several years for potential safety analysis or remodelling. As data growth increases, companies' storage budgets and data storage capacities are struggling to keep up. The issue of storing unstructured and inactive data needs to be addressed, and this is where cold storage comes in.

What is cold storage all about?

Like every great story, the data lifecycle can be divided into three parts: the beginning, the middle, and the end. The importance of data to a project, and how often it needs to be accessed, dictates how data is classified: 'hot', 'warm', or 'cold'. Data that is on demand and being used every day for a project is referred to as 'hot' data. Data that needs to be accessed on a regular basis, but not daily, is called 'warm' data. Then, last but by no means least, there is 'cold' or 'dormant' data, which is no longer or only very rarely needed by organisations for the projects they are working on. This data makes up over 60% of all stored data. (A minimal sketch of such a tiering policy follows below.)

Many companies have had to preserve cold data not out of choice, but out of necessity, to comply with strict regulations or internal policies around storing data. However, attitudes towards cold data are starting to change. More companies are deciding to hold on to cold data because of the long-term strategic value it holds, the future potential for the data to be re-used in projects, and the chance to enrich the data over time to augment its value.

Traditionally, cold data has been written to tape media, moved offline, and transferred to a storage facility where it can be accessed for future use. Cold data storage is a highly durable, secure, and inexpensive way to store data for the long term. The emergence of new data analysis methods, the increase in computing power, and the rapid acceleration of digital transformation mean there is greater demand for data to be stored and readily available online. Despite this, there are still concerns around cold storage's durability, level of security, and storage costs that need to be resolved.

Is a cold storage strategy really necessary?

It is predicted that the data created during the next three years will amount to more than all of the data created during the past 30 years. As organisations realise the value of cold data, combined with the surge in the volume of data being generated, the case for cold storage is clear: if companies want to stay ahead of the data curve, they must act quickly and implement a cold storage strategy.
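The hot/warm/cold classification described above can be expressed as a simple tiering policy. Below is a minimal Python sketch that labels files by last-access age; the 30-day and 180-day thresholds and the directory path are illustrative assumptions, and a real policy would also weigh access frequency, project status, and compliance rules.

# Label files hot, warm, or cold by last-access time. Thresholds and the
# scan path are hypothetical; note that some filesystems mount with
# noatime, in which case modification time (st_mtime) is a better signal.

import os
import time

def classify(path: str, warm_after_days: int = 30, cold_after_days: int = 180) -> str:
    """Return 'hot', 'warm', or 'cold' based on the file's last-access age."""
    age_days = (time.time() - os.stat(path).st_atime) / 86400
    if age_days < warm_after_days:
        return "hot"
    if age_days < cold_after_days:
        return "warm"
    return "cold"

# Example: tag every file under an archive root so a data mover can
# migrate "cold" entries to a cheaper storage class.
for root, _dirs, files in os.walk("/data/projects"):
    for name in files:
        full = os.path.join(root, name)
        print(classify(full), full)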
Companies must also prioritise building meaningful and scalable solutions to harvest data-driven innovations and opportunities, reducing the common problems associated with massive data accumulation and continued data growth.

What are the perks of cold storage?

Recent innovations in cold data storage have improved performance, data durability, and storage efficiency, raising overall expectations. Storing vast amounts of data can be costly for organisations; however, cold data storage is considerably cheaper than the NVMe and solid-state disk technologies deployed for high-performance access to hot data. One of the key reasons for this is that cold data can be stored on lower-performing and cheaper storage infrastructure, either in-house or in the cloud. This means organisations can cost-effectively store more of their growing data sets.

Most modern cold storage archives have been developed by some of the world's largest cloud solution providers, but with emerging architectures and services, cold storage solutions are now deployable within an organisation's own data centre, colocation facility, or hosted IT environment, meeting data sovereignty and data residency requirements. As a result, the data is more easily accessible, with no access fees, and retrieval times are reduced from days to minutes. Additionally, new erasure coding algorithms are now optimised specifically for cold storage, which reduces the storage overhead compared to the traditional method of storing multiple copies (a worked comparison follows at the end of this article).

Unlocking the value of data

Cold storage enables organisations to safely store data, maintain in-house control of these invaluable assets, and easily retrieve them for future projects. It is increasingly clear in this new data era that organisations cannot simply disregard data because it doesn't serve any immediate purpose. Certain data must be stored, both for its historic relevance and for its value to future projects. Advances in cold storage mean companies don't have to sacrifice inactive data for active data. Whether an organisation is looking to reduce storage costs or boost the value of its data, cold storage provides a real solution to the problem of data accumulation.
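The erasure-coding point above is easy to quantify. The Python sketch below compares the raw-capacity overhead of classic triple replication with a k+m erasure-coded layout; the 16+4 geometry is an illustrative choice, not a published parameter of any particular product.

# Raw capacity consumed per byte of user data: triple replication versus
# k+m erasure coding. The 16+4 geometry is a hypothetical example.

def overhead(data_shards: int, parity_shards: int) -> float:
    """Raw bytes stored per byte of user data under k+m erasure coding."""
    return (data_shards + parity_shards) / data_shards

print(f"3x replication:      {3.0:.2f} bytes stored per byte of data")
print(f"16+4 erasure coding: {overhead(16, 4):.2f} bytes per byte, "
      f"tolerating 4 lost shards")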


