
Quantum Computing


KETS Quantum Security reacts to Salt Typhoon cyber attacks
On the back of the Salt Typhoon cyber attacks, Chris Erven, CEO and Co-Founder of KETS Quantum Security, comments on the potential threat of China developing a quantum computer and the danger it poses for telecommunications companies. Chris takes up the story:

“This is a fully global threat. Every single telco should be reviewing its cyber defences in the wake of the Salt Typhoon attacks.

“China is making some of the largest investments in quantum computing, pouring billions of dollars into research and development in the hope of being the first to create a large-scale, cryptographically relevant machine. And although it may be a few years away from being fully operational, we know a quantum computer will be capable of breaking all the traditional cyber defences we currently use. So they, and others, are actively harvesting data now, to decrypt later.

“Telcos are particularly vulnerable since they provide the communication services for major enterprises and many governments, so these organisations should be the first to upgrade to quantum-safe methods, including a defence-in-depth approach combining quantum key distribution and post-quantum algorithms.

“Adding to the danger, many telcos are moving to software-defined networks, which use software-based controllers to manage the underlying network infrastructure rather than dedicated, more restricted hardware devices. This makes them particularly vulnerable: if an adversary gets into the management plane of a telco’s SDN, they will have complete control of that network, whereas in the past the access would have been much more limited. We really are talking about the potential to take down the UK’s national telecommunications network.

“Despite warning bells being sounded for the last decade, Q Day is rapidly approaching, and telcos have to prepare now to avoid a catastrophic data breach. Thankfully, telcos like BT and SK Telecom are actively working to upgrade their systems to make them quantum-safe.
However, this transition needs to happen even more quickly, and the Salt Typhoon attacks serve as a timely reminder that robust cyber defences are not a ‘nice to have’ - they are essential to protecting our way of life.”
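The defence-in-depth idea Erven describes, pairing quantum key distribution with post-quantum algorithms, rests on a simple principle: derive the session key from two independent secrets, so it stays safe if either source remains uncompromised. The sketch below illustrates this with an HKDF-style combiner; the function name and context label are ours, and real deployments use standardised hybrid key-exchange schemes rather than a hand-rolled KDF.

```python
import hashlib
import hmac

def hybrid_session_key(qkd_key: bytes, pqc_secret: bytes,
                       context: bytes = b"hybrid-kex-v1") -> bytes:
    """Derive a 32-byte session key from two independent secrets.

    The result is unpredictable as long as EITHER input secret is
    unknown to the attacker - the essence of defence in depth.
    """
    # HKDF-Extract: mix both secrets under a context-specific salt
    prk = hmac.new(context, qkd_key + pqc_secret, hashlib.sha256).digest()
    # HKDF-Expand, first (and only) 32-byte output block
    return hmac.new(prk, b"\x01", hashlib.sha256).digest()
```

An attacker who later breaks the post-quantum scheme with a quantum computer still learns nothing about the session key unless they also compromised the QKD link, and vice versa.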

Amidata implements Quantum ActiveScale to launch new cloud storage service
Quantum Corporation has announced that Amidata has implemented Quantum ActiveScale Object Storage as the foundation for its new Amidata Secure Cloud Storage service. Amidata has deployed ActiveScale object storage to build a secure, resilient set of cloud storage services accessible from across all of Australia, where the company is based. In this way, the company achieves operational efficiency, seamless scalability, and the ability to address customer needs across a wide range of use cases, workflows, and price points. Amidata’s adoption of object storage also aligns with current IT trends.

“More and more organisations are looking at object storage to create secure and massively scalable hybrid clouds,” says Michael Whelan, Managing Director, Amidata. “ActiveScale provides a durable, cost-effective approach for backing up and archiving fast-growing data volumes while also protecting data from ransomware attacks. Plus, by deploying the ActiveScale Cold Storage feature, we are delivering multiple storage classes as part of our service offerings, allowing us to target a wider set of customers and use cases. With our Secure Cloud cold storage option, customers can retain data longer and at a lower cost; that’s useful for offsite copies, data compliance, and increasingly, for storing the growing data sets that are fuelling AI-driven business analytics and insights.”

ActiveScale also supports multiple S3-compatible storage classes using flash, disk, and tape media, providing a seamless environment that can flexibly grow capacity and performance to any scale. Cold Storage, a key feature, integrates Quantum Scalar tape libraries as a lower-cost storage class to efficiently store cold and archived data sets. Quantum’s tape libraries are nearline storage, from which customers can easily access and retrieve cold or less-used data with slightly longer latency (minutes instead of seconds) but at a low cost, leveraging the same infrastructure used by the major hyperscalers.
ActiveScale intelligently stores and protects data across all storage resources using Quantum’s patented two-dimensional erasure coding to achieve extreme data durability, performance, availability, and storage efficiency. For more information on Amidata’s implementation of ActiveScale, view the video case study.
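Quantum’s patented two-dimensional erasure coding is far more sophisticated than anything shown here, but the underlying idea of arranging blocks in a grid so that each block is protected along two independent axes can be illustrated with a toy XOR-parity sketch. All function names below are ours, for illustration only:

```python
from functools import reduce

def xor_blocks(blocks):
    """XOR a list of equal-length byte blocks together."""
    return reduce(lambda a, b: bytes(x ^ y for x, y in zip(a, b)), blocks)

def add_parity(grid):
    """Compute one parity block per row and per column of a data grid."""
    row_parity = [xor_blocks(row) for row in grid]
    col_parity = [xor_blocks(list(col)) for col in zip(*grid)]
    return row_parity, col_parity

def recover_block(grid, r, c, row_parity):
    """Rebuild the lost block at (r, c) from its row's survivors plus row parity."""
    survivors = [blk for j, blk in enumerate(grid[r]) if j != c]
    return xor_blocks(survivors + [row_parity[r]])
```

Because every block sits in both a row group and a column group, a failure that exhausts one axis’s parity can often still be repaired along the other axis, which is what lets two-dimensional schemes achieve high durability with modest capacity overhead.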

MR Datentechnik launches a new data storage service
MR Datentechnik has implemented Quantum ActiveScale object storage along with Veeam Backup and Replication to launch a new S3-compatible storage service. By using ActiveScale, the company offers a resilient, highly scalable service that has the flexibility to support a wide range of S3-enabled apps and workflows.

“Quantum ActiveScale provides the reliable, highly scalable S3-compatible object storage we needed for building our new storage service. The platform is stable even under high loads, and it offers sophisticated software that is extremely useful for multi-tenant management,” says Jochen Kraus, Managing Director, MR Datentechnik.

Solution overview:
- Quantum ActiveScale Object Storage
- Veeam Backup and Replication

Key benefits:
- Built an easily scalable online data storage service to accommodate rapidly rising customer data volumes
- Seamlessly integrated with Veeam Backup and Replication v12
- Accelerated customer onboarding to under a day and achieved customer growth targets one year early
- Created a resilient, always-on service that provides reliable data access for customers
- Streamlined storage administration to minimise overhead and efficiently scale the managed service
- Gained the flexibility for future use cases by supporting the S3 storage protocol

Headquartered in the German state of Bavaria, MR Datentechnik offers a full range of IT solutions and managed services. Organisations engage the company for everything from infrastructure deployment and systems integration to digitisation initiatives and fully outsourced IT management. Recently, the leadership team at MR Datentechnik decided to launch a new storage service to support customers’ needs to preserve and protect fast-growing data volumes. The service, designed for online storage of object data, could be used for backup and recovery, archiving and data security. This online service would enable organisations to retrieve data rapidly, anytime, from anywhere.
Creating an S3-compatible service was a top priority. The team wanted to support S3 applications and workflows and facilitate integration with S3 cloud storage environments. For the service’s launch, it decided to focus first on the backup use case. As a result, the underlying storage platform for the service had to integrate seamlessly with the latest version of Veeam Backup and Replication.

Quantum announces ActiveScale Cold Storage bundles
Quantum has announced new pre-configured bundles to make it even easier to purchase and deploy Quantum ActiveScale Cold Storage, an S3-enabled object storage solution architected for both active and cold data sets that reduces cold storage costs by up to 60%.

With the massive amount of data that customers need to retain for business and compliance purposes, they are using both public and private cloud resources to store and manage it, driven by their budget, the frequency with which they need to access the data, and their data protection requirements. With ActiveScale, customers can build their own cloud storage resource to control costs and ensure fast, easy access to their data for compliance, analysis, and gathering insights to drive business forward.

Named a leading ‘outperformer’ in the latest GigaOm Object Storage: Enterprise Radar Report, ActiveScale combines advanced object store software with hyperscale tape technology to provide massively scalable, highly durable, and extremely low-cost storage for archiving cold data, enabling organisations to maintain control of their most valuable data assets and unlock value in cold data over years and decades without unpredictable and expensive access fees.

Whether customers are developing solutions for life and Earth sciences, media production, government programs, web services, IoT infrastructure, or video surveillance, ActiveScale is ideal for unstructured data management, data analytics and AI workloads, active archiving, and long-term retention and protection of massive datasets.

To simplify purchasing, ActiveScale Cold Storage is now available in pre-configured bundles, complete with all the components that customers need to easily deploy the solution. The bundles are available in four standard capacity sizes (small, medium, large and extra large), ranging from 10PB up to 100PB.
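A figure like “up to 60%” is straightforward tier arithmetic: moving a data set from an active disk class to a cheaper tape-backed cold class saves the difference in per-terabyte rates. The prices below are hypothetical placeholders chosen only to illustrate the calculation, not Quantum’s pricing:

```python
def monthly_storage_cost(capacity_tb: float, price_per_tb_month: float) -> float:
    """Monthly cost for a given capacity at a flat per-TB rate."""
    return capacity_tb * price_per_tb_month

# Hypothetical rates, for illustration only
ACTIVE_TIER = 20.0  # $/TB/month on an active disk class
COLD_TIER = 8.0     # $/TB/month on a tape-backed cold class

def cold_savings_fraction(capacity_tb: float) -> float:
    """Fraction saved by moving the data set to the cold class."""
    active = monthly_storage_cost(capacity_tb, ACTIVE_TIER)
    cold = monthly_storage_cost(capacity_tb, COLD_TIER)
    return 1 - cold / active
```

With these example rates, a 10PB archive sees a 60% reduction; real savings depend on actual tier pricing, access frequency and retrieval charges.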

Quantum expands hybrid cloud leadership with new features 
Quantum has announced new features in the company’s end-to-end data platform, including advances to its policy-driven data movement technologies, to help customers build their ideal hybrid cloud workflow and seamlessly bridge on-prem deployments with multi-cloud integration.

With the massive amount of data customers need to retain for business and compliance purposes, customers are using both public and private cloud resources to store and manage this data, driven by their budget, the frequency with which they need to access the data, and their data protection requirements. With these new features, customers can place data exactly where it’s needed, when it’s needed. By using a highly flexible and powerful hybrid cloud environment, customers increase operational agility, reduce business risk, and optimise costs across on-prem and public cloud resources.

New capabilities include:

- Quantum DXi Cloud Share: With DXi Cloud Share, customers get more flexibility, as DXi appliances can now tier compressed, deduplicated backup data sets to both private and public storage clouds, including Quantum ActiveScale, providing up to 70 times more efficient use of cloud storage. This reduces business risks and costs, enabling offsite protection against ransomware and long-term retention of backup data for regulatory and in-house data compliance.

- Quantum FlexSync 3: FlexSync 3 adds fast, simple data replication to and from public and private clouds, including Quantum ActiveScale and, in a future release, Quantum Myriad. It provides a versatile data movement tool across Quantum’s entire end-to-end portfolio, enabling customers to unite multiple on-premises and public cloud workflows across geographies with a shared, centralised object repository. This enables better collaboration among dispersed teams and delivers enhanced data protection and disaster recovery.

- Quantum ActiveScale cold replication: As organisations continue to collect, retain and analyse petabytes to exabytes of data with advanced AI analytics, they must reduce the total cost of ownership required to preserve this data. The ActiveScale object storage platform now provides the industry’s first immutable object replication between cold data services, replicating cold data between its own systems as well as to AWS S3 Glacier Flexible Retrieval and Glacier Deep Archive services. For massive data sets whose useful life spans years to decades, it delivers the most durable, cost-effective multi-copy solution for long-term retention, so customers can analyse and derive insights from their data to drive business forward.

The new ActiveScale Cold Replication feature and FlexSync 3 are available immediately. DXi Cloud Share is planned for release in Q4 2023. FlexSync 3 for Myriad is planned for release next year.
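Efficiency multipliers like the “up to 70 times” figure come from deduplication: identical chunks in repetitive backup streams are stored once and referenced by hash. DXi’s real engine uses variable-length chunking plus compression; this fixed-block sketch (class and method names are ours) only demonstrates the principle:

```python
import hashlib

class DedupStore:
    """Toy content-addressed store: identical chunks are kept only once."""

    def __init__(self, chunk_size: int = 4096):
        self.chunk_size = chunk_size
        self.chunks = {}  # sha256 digest -> chunk bytes

    def write(self, data: bytes) -> list:
        """Split data into fixed-size chunks and store each unique chunk once."""
        refs = []
        for i in range(0, len(data), self.chunk_size):
            chunk = data[i:i + self.chunk_size]
            digest = hashlib.sha256(chunk).hexdigest()
            self.chunks.setdefault(digest, chunk)  # store only if new
            refs.append(digest)
        return refs

    def read(self, refs: list) -> bytes:
        """Reassemble the original data from its chunk references."""
        return b"".join(self.chunks[d] for d in refs)

    def dedup_ratio(self, logical_bytes: int) -> float:
        """Logical bytes written divided by physical bytes actually stored."""
        physical = sum(len(c) for c in self.chunks.values())
        return logical_bytes / physical
```

Nightly backups of largely unchanged data sets are exactly the workload where the logical-to-physical ratio climbs into double or triple digits.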

IBM to build its first European quantum data centre
IBM has announced plans to open its first Europe-based quantum data centre, at its facility in Ehningen, Germany, to facilitate access to cutting-edge quantum computing for companies, research institutions and government agencies. The data centre is expected to be operational in 2024, with multiple IBM quantum computing systems, each with utility-scale quantum processors (those of more than 100 qubits).

The facility will serve as IBM Quantum’s European cloud region, allowing users in Europe to provision services at the data centre for their quantum computing research and exploratory activity. The data centre is being designed to help clients continue to manage their European data regulation requirements, including processing all job data within EU borders. It will be IBM’s second quantum data centre and quantum cloud region, after its New York facility.

“Europe has some of the world’s most advanced users of quantum computers, and interest is only accelerating with the era of utility scale quantum processors,” says Jay Gambetta, IBM Fellow and Vice President of IBM Quantum. “The planned quantum data centre and associated cloud region will give European users a new option as they seek to tap the power of quantum computing in an effort to solve some of the world’s most challenging problems.”

“Our quantum data centre in Europe is an integral piece of our global endeavour,” says Ana Paula Assis, IBM General Manager for EMEA. “It will provide new opportunities for our clients to collaborate side-by-side with our scientists in Europe, as well as their own clients, as they explore how best to apply quantum in their industry.”

Why cold storage and the new data era go hand in hand
By Timothy Sherbak, Enterprise Products and Solutions Marketing at Quantum.

The phrase ‘digital transformation’ has been bandied around a lot over recent years, as many companies adopt technology into their processes to meet changing business and market requirements. The shift in the way companies operate means that the scale of data being generated has dramatically increased. With the amount of unstructured data growing at a rate of up to 60% per year and expected to make up 80 to 90% of all data by 2025, there are no signs of things slowing down.

To put this into perspective, every week a semiconductor manufacturer produces over a billion image scans documenting the manufacturing process of 4,000 wafers, resulting in petabytes of data that must be stored for several years. Another example is autonomous vehicles: a single autonomous car produces up to 2TB of data for every hour of use, which must be retained for several years for potential safety analysis or remodelling. As data growth increases, companies’ storage budgets and data storage capacities are struggling to keep up. The issue of storing unstructured and inactive data needs to be addressed - this is where cold storage comes in.

What is cold storage all about?

Like every great story, the data lifecycle can be divided into three parts: the beginning, the middle and the end. How important data is to a project, and how often it needs to be accessed, dictate how it is classified: ‘hot’, ‘warm’ or ‘cold’. Data that is on demand and being used every day for a project is referred to as ‘hot’ data. Data that needs to be accessed on a regular basis, but not daily, is called ‘warm’ data. Last but by no means least, there is ‘cold’ or ‘dormant’ data that is no longer, or only very rarely, needed by organisations for the projects they are working on. This data makes up over 60% of all stored data.
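The hot/warm/cold classification above is typically automated as a tiering policy keyed on access recency. A minimal sketch follows; the 7-day and 90-day thresholds are our illustrative assumptions, as real policies are tuned per workload:

```python
from datetime import datetime, timedelta

# Illustrative thresholds; production tiering policies are set per workload
HOT_DAYS = 7
WARM_DAYS = 90

def classify(last_access: datetime, now: datetime) -> str:
    """Map an object's last access time to a hot/warm/cold storage class."""
    age = now - last_access
    if age <= timedelta(days=HOT_DAYS):
        return "hot"
    if age <= timedelta(days=WARM_DAYS):
        return "warm"
    return "cold"
```

A storage platform runs a rule like this periodically over its namespace and migrates objects between flash, disk and tape classes accordingly.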
Many companies have had to preserve cold data not out of choice but out of necessity, to comply with strict regulations or internal policies around storing data. However, attitudes towards cold data are starting to change. More companies are deciding to hold on to cold data because of the long-term strategic value it holds, the future potential for the data to be re-used in projects, and the chance to enrich the data over time to augment its value.

Traditionally, cold data has been written to tape media, moved offline, and transferred to a storage facility where it can be accessed for future use. Cold data storage is a highly durable, secure, and inexpensive way to store data for the long term. The emergence of new data analysis methods, the increase in computing power, and the rapid acceleration of digital transformation mean there is greater demand for data to be stored and readily available online. Despite this, there are still concerns around cold storage’s durability, level of security, and storage costs that need to be resolved.

Is a cold storage strategy really necessary?

It is predicted that data created during the next three years will amount to more than all of the data created during the past 30 years. As organisations realise the value of cold data, combined with the surge in the volume of data being generated, the case for cold storage is clear: if companies want to stay ahead of the data curve, they must act quickly and implement a cold storage strategy. Companies must also prioritise building meaningful and scalable solutions to harvest data-driven innovations and opportunities, reducing the common problems associated with massive data accumulation and continued data growth.

What are the perks of cold storage?

Recent innovations in cold data storage have improved performance, data durability, and storage efficiency, raising overall expectations.
Storing vast amounts of data can be costly for organisations; however, cold data storage is considerably cheaper than the NVMe and solid-state disk technologies deployed for high-performance access to hot data. One of the key reasons is that cold data can be stored on lower-performing and cheaper storage infrastructure, either in-house or in the cloud. This means organisations can cost-effectively store more of their growing data sets.

Most modern cold storage archives have been developed by some of the world’s largest cloud solution providers, but with emerging architectures and services, cold storage solutions are now deployable within an organisation’s own data centre, colocation facility or hosted IT environment that meets data sovereignty and data residency requirements. As a result, the data is more easily accessible with no access fees, and retrieval times are reduced from days to minutes. Additionally, new erasure coding algorithms are now optimised specifically for cold storage, which reduces the storage overhead compared to the traditional method of storing multiple copies.

Unlocking the value of data

Cold storage enables organisations to safely store data, maintain in-house control of these invaluable assets, and easily retrieve them for future projects. It is increasingly clear in this new data era that organisations cannot simply disregard data if it doesn’t serve an immediate purpose. Certain data must be stored, both for its historic relevance and for its value to future projects. Advances in cold storage mean companies don’t have to sacrifice inactive data for active data. Whether an organisation is looking to reduce storage costs or boost the value of its data, cold storage provides a real solution to the problem of data accumulation.
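The overhead advantage of erasure coding over multiple copies is easy to quantify: storing n full replicas costs n bytes of raw capacity per logical byte, while a k-data/m-parity erasure code costs (k+m)/k. A quick sketch, with illustrative shard counts:

```python
def replication_overhead(copies: int) -> float:
    """Raw bytes stored per logical byte when keeping full copies."""
    return float(copies)

def erasure_overhead(data_shards: int, parity_shards: int) -> float:
    """Raw bytes stored per logical byte with k data + m parity shards."""
    return (data_shards + parity_shards) / data_shards

# Example: triple replication vs a 16+4 erasure-coded layout
# (shard counts are illustrative; real systems choose k and m per durability target)
```

With these numbers, three full copies consume 3x raw capacity while a 16+4 layout consumes only 1.25x, yet the erasure-coded layout still tolerates the loss of any four shards.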

Europe on track for quantum leadership: IQM raises €39m
IQM Quantum Computers has announced that it has raised €39m in Series A funding, bringing the total amount of funding raised to date to €71m. This ranks among the highest fundraising rounds by a European deep-tech start-up within a year. MIG Fonds has led this round, with participation from all existing investors, including Tesi, OpenOcean, Maki.vc, Vito Ventures and Matadero QED. New investors Vsquared, Salvia GmbH, Santo Venture Capital GmbH and Tencent have also joined this round.

“IQM has a strong track record of research and in achieving high growth. They continue to attract the best global talent across functions and have exceeded their hardware and software milestones. We are thrilled to lead this round and continue to support IQM as the company accelerates its next phase of business and hardware growth,” says Axel Thierauf, Partner at MIG Fonds and Chairman of the Board of IQM.

Since 2019, IQM has been among the fastest-growing companies in the quantum computing sector and already has one of the world’s largest quantum hardware engineering teams. This funding will be used to accelerate IQM’s hardware development and to co-design application-specific quantum computers. A significant part of the funding will also be used to attract and retain the best global talent in quantum computing, and to establish sales and business development teams.

“Today’s announcement is part of our ongoing Series A funding round. I am extremely pleased with the confidence our investors have shown in our vision, team, product, and our ability to execute and commercialize quantum computers. This investment also shows their continued belief in building the future of quantum technologies. This is a significant recognition for our fantastic team, which has achieved all our key milestones from the previous round. We’re just getting started,” comments Jan Goetz, CEO of IQM.

“It is impressive to be a part of the IQM journey and see the progress of their technology.
We’re proud to see another startup from Finland making a global impact. IQM will have a lasting impact on the future of computing and, consequently, will help solve some of the global challenges related to healthcare, climate change and the development of sustainable materials, among many others,” adds Juha Lehtola, Head of Direct VC Investments at Tesi (Finnish Industry Investment).

IQM delivers on-premises computers for research laboratories and supercomputing centres. For industrial customers, IQM follows an innovative co-design strategy to deliver quantum advantage based on application-specific processors, using novel chip architectures and ultrafast quantum operations. IQM provides the full hardware stack for a quantum computer, integrating different technologies, and invites collaborations with quantum software companies.

“We want to invest in deep technology startups that shape the future and advance society. IQM is the perfect example of a company that is on top of its game; their work on quantum computing will make an impact for generations to come,” concludes Herbert Mangesius, Founding Partner at Vsquared and Vito Ventures.

While quantum computing is still under development, governments and private organisations across the world are investing today to retain their competitive edge and become ready for the future. The next decade will be the decade of quantum technology, and we will see major breakthroughs with real-world applications using quantum computers in healthcare, logistics, finance, chemistry and beyond.

HPE buys supercomputer manufacturer, Cray, for $1.3 billion
HPE has found itself a partner with which to usher in the era of quantum computing, announcing its intention to acquire Cray, the firm best known for having developed some of the world’s most powerful supercomputers, having held the top spot six times since 1976.

The deal to buy Cray is worth $1.3 billion, with the purchase price set at $35 a share, a $5.19 premium over its closing price on Thursday. The deal is expected to close in HPE’s fiscal Q1 2020, although it will be subject to regulatory oversight. We expect this won’t be too much of an issue, however, given the already intense competition in the computing industry.

What is Cray?

Cray began life in the 1970s and rose to fame in 1976 with the Cray-1, a supercomputer that boasted 160MFLOPS of power. That pales in comparison to the supercomputers being built now, but back then Cray’s computer was so powerful that the company managed to sell more than 80 of them, at $7.9 million apiece. After the success of the Cray-1, the company developed new supercomputers that surpassed the original and took on the title of the world’s most powerful computer, including the Cray X-MP, Cray-2 and Cray Y-MP, each of which held the top spot between 1983 and 1989.

In 1990, Cray lost its edge to competitors such as Fujitsu, NEC, TMC, Intel, Hitachi and IBM, all of which produced more powerful computers. At the same time, Cray was being squeezed at the low end of the high-performance market by the launch of new mini-supercomputers. It took Cray until 2009 to reclaim the top spot, with the firm building the famous Jaguar for the Oak Ridge National Laboratory. This computer boasted 1.75 petaflops of power and performed important tasks for the US Department of Energy, solving problems in areas such as climate modelling, renewable energy, seismology, and much more.
It still wasn’t quite powerful enough, however, and in 2012 the Oak Ridge National Laboratory upgraded its supercomputer to the Cray Titan, a machine boasting 17.59 petaflops of power, a tenfold increase.

That brings us to 2019. Titan is no longer the most powerful supercomputer housed at the Oak Ridge National Laboratory, as it was eclipsed earlier this year by IBM’s Summit. That supercomputer offers the Department of Energy 200 petaflops of power and was key to the US regaining its superiority in the race for more powerful computers, having lost out in the rankings to China for five years. That doesn’t mean Cray isn’t still seen as a leader in the high-performance computing field; the firm is already developing its first exascale computer for the very same laboratory. This computer, dubbed ‘Frontier’, should reach peak performance of greater than 1.5 exaflops when it becomes operational in 2021.

Why has Cray agreed to an acquisition by HPE?

Cray is still building some of the world’s most powerful supercomputers, having developed 49 of the top 500 supercomputers in the world. Despite this, the company is haemorrhaging cash; it has made significant losses in every quarter since 2017, and while it showed signs of improvement, it was still losing millions.

What does HPE want with Cray?

In the list of the top 500 supercomputer manufacturers, Cray may have produced 49, but HPE isn’t all that far behind, having produced 45. Despite having built only four fewer than Cray, HPE’s supercomputers boast notably lower performance than those offered by Cray. In fact, that’s one of the key reasons HPE is buying the company. Cray has amassed a vast array of IP and patents that will help HPE in the transition to quantum computing. That’s seen as the next big frontier in computing, as quantum computers are designed to perform operations much more quickly and use less energy in the process.
Cray has already dipped its feet into the market, although it faces stiff competition from the likes of Google and IBM – the latter of which announced the first commercial quantum computer for use outside of the lab earlier this year. Given the steep competition in quantum computing, HPE needs all the firepower it can get in order to compete, and $1.3 billion is a small price to pay for Cray.

Atos delivers quantum simulators to Hartree Centre
Atos, a European IT services corporation, has announced an agreement with the Science and Technology Facilities Council’s (STFC) Hartree Centre that will see one of the UK’s leading high-performance computing research facilities take the first UK delivery of an Atos Quantum Learning Machine. The company says that its Quantum Learning Machine will be used to develop new quantum-based services designed to help researchers and industry prepare for the coming quantum computing revolution. These include quantum algorithm development and the first UK repository for quantum algorithms, collaborative research projects on quantum computing applications, and specialist training.

Atos says this new collaboration builds on an established partnership between Atos and the Hartree Centre, which began with the UK’s first Bull Sequana X1000 supercomputer being hosted at the facility in 2017. The Hartree Centre, based at Daresbury Laboratory and part of the Sci-Tech Daresbury Campus in Cheshire, UK, also hosts the JADE national deep learning service.

Commenting on the partnership announcement, Andy Grant, Vice President, HPC & Big Data, Atos UK and Ireland, says, “We are delighted to deepen our existing relationship with the Hartree Centre, which we believe will help UK industry future-proof itself for the arrival of quantum computing. Our Quantum Learning Machine as a service will be made available to any organisation wanting to learn about, and experiment with, quantum computing and understand the key opportunities and challenges this technology presents. Quantum is the future of computing and it is crucial that organisations are ready to harness the coming revolution.”

Alison Kennedy, Director of the STFC Hartree Centre, adds, “We’re thrilled to be enabling UK companies to explore and prepare for the future of quantum computing.
This collaboration will build on our growing expertise in this exciting area of computing and result in more resilient technology solutions being developed for industry.” 


