Thursday, June 12, 2025



Ciena publishes report on wave services demand
Ciena, an American networking systems and software company specialising in optical networking equipment, has published a new report on wavelength services that explores the key drivers of demand for high-speed connectivity. The report examines the critical role of wave services in enabling the expansion of interconnected data centres driven by artificial intelligence (AI), the growing importance of low latency and data sovereignty for AI workloads, and the build-out of terrestrial and critical submarine network infrastructure. It also highlights the pivotal role of managed optical fibre network (MOFN) business models in expanding high-speed connectivity into new geographies and markets.

"As cloud providers scale data centre networks to address AI performance requirements, wave services must also evolve in terms of capacity, coverage, latency, and route diversity," says Mark Bieberich, Vice President of Portfolio Marketing, Ciena. "Demand for wave services is growing steadily worldwide as data centre network expansion requires increasingly high-capacity interconnection among various types of network operators and end users."

The total wave services circuits market in the US grew nearly 8% in 2024 and is projected to grow steadily through 2029, according to research from Vertical Systems Group. The firm observed increasing use of wave services for cloud on-ramps, demonstrated by the share of circuits with a metro geographical scope (41%) and the dominance of retail customers (58%). The report states that, from 2024 to 2029, growth in 400G circuits is set to soar, while 100G circuits will see a steady rise and 10G circuits will experience modest growth.

Wave services are the foundation of most high-capacity networks, particularly where connectivity to or between data centres is involved. High bandwidth, protocol transparency, and low latency are among their fundamental characteristics, and they can act either as end services in their own right or as the underlay for higher-layer services. Based on Dense Wavelength Division Multiplexing (DWDM) technology, they enable massive data-transmission bandwidth over a single fibre pair. Wave services are currently dominated by 100G and 400G connections; a high volume of 10G services remains deployed, but these are being upgraded to 100G at a steady pace.

Ciena's report also looks at the growth of submarine cables. It highlights that a record 161,100km of submarine cables are planned to become ready for service (RFS) in 2025, dwarfing the previous high of 121,000km back in 2001.

"With infrastructure expanding rapidly and resource constraints increasingly shaping growth, anticipating demand has never been more important," continues Bieberich. "Network operators providing wave services can seize this moment by proactively routing new submarine cables to emerging data centres and innovating to address these challenges. Differentiation through greater route diversity, low-latency connectivity, and compelling managed services is key to staying ahead."

The report provides an analysis of the current industry landscape, evaluating key trends and identifying factors poised to influence the market in the coming years.
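For a sense of the capacity arithmetic behind those 10G-to-400G figures, the sketch below multiplies wavelength count by per-channel rate for a DWDM fibre pair. The channel plan (a hypothetical C-band system at 75 GHz spacing) is an illustrative assumption, not a figure from Ciena's report.

```python
# Rough DWDM capacity arithmetic; channel plan values are hypothetical.
C_BAND_GHZ = 4_800            # usable C-band spectrum, roughly 4.8 THz
CHANNEL_SPACING_GHZ = 75      # grid spacing typical of 400G-class signals

channels = C_BAND_GHZ // CHANNEL_SPACING_GHZ   # 64 wavelengths

for rate_gbps in (10, 100, 400):
    total_tbps = channels * rate_gbps / 1_000
    print(f"{channels} waves x {rate_gbps}G = {total_tbps:5.1f} Tb/s per fibre pair")
```

Holding the channel plan constant, upgrading a wavelength from 10G to 400G carries 40 times the traffic over the same fibre, which is why the report expects circuit growth to concentrate at the high end.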

UKRI invests £22 million into data spending
UK Research and Innovation (UKRI) has invested £22 million in data spending and staff over the past three years, underscoring the organisation's strategic commitment to data as a cornerstone of national research and innovation. Data is playing an increasingly vital role, particularly as artificial intelligence (AI) is rolled out across government departments; with 70% of government bodies already piloting or planning to use AI, the need for high-quality, structured, and secure data is urgent.

Between 2022 and 2025, UKRI's data-related salary investment rose by 85%, from £5.35 million to £9.89 million, reflecting both rising headcounts and the escalating value of data expertise across the UK's research ecosystem. Over the same period, the number of staff with "data" in their job titles grew from 138 to 203, a 47% increase.

Stuart Harvey, CEO of Datactics, comments, "Both businesses and government departments are keen to implement AI into their business functions but are overlooking the fundamental truth that AI is only as good as the data it learns from. Hiring challenges are becoming an increasing problem, but businesses should follow in the UKRI's footsteps to invest in data spending and staff, and upskill their teams in data management, governance, and quality to improve data readiness.

"AI is only as effective as the data it processes and, without structured, accurate, and well-governed data, businesses risk AI systems that are flawed. The rush to deploy AI without a strong data foundation is a costly mistake and, in a competitive AI landscape, only those who get their data right will be the ones who thrive."

UKRI's investment in its data workforce reflects the growing demand for high-quality, well-managed, and accessible data that enables researchers to collaborate, innovate, and respond to global challenges.

Sachin Agrawal, Managing Director for Zoho UK, says, "As the UK continues to position itself as a global science and technology powerhouse, it is welcome to see the organisation prioritising investment in its data workforce as a long-term commitment to data-driven research.

"In an era where public trust and data ethics are paramount, building in-house expertise is essential to ensuring that data privacy, transparency, and compliance are at the heart of our national research infrastructure. This strategic investment lays the foundation for smarter and safer technology use by UKRI."

House of Commons boosts data workforce by 50%
The UK's House of Commons has splashed out £7.5 million on data spending and staff over the past three years, underscoring its strategic commitment to data as a cornerstone of digital governance. As the public sector embraces AI at pace, with over 70% of government bodies piloting or planning AI implementation, the demand for robust data infrastructure and skilled personnel has never been greater. In response, the House of Commons has quietly ramped up hiring and spending on data roles, reflecting a broader strategic shift towards data-centric governance.

Over the past three years, the number of staff in the House of Commons with "data" in their job titles has jumped from 49 in 2022 to 73 in early 2025, a 49% increase. Alongside this, total salary investment for data roles rose by more than 63%, from £1.83 million to £2.98 million, excluding final April 2025 figures still pending payroll completion. The figures reflect a growing recognition within Parliament that AI innovation is only as effective as the data that underpins it.

Stuart Harvey, CEO of Datactics, comments, "There's a growing appetite across government to harness the power of AI, but what's often overlooked is that AI is only as reliable as the data it's built on. The House of Commons' investment in data roles is a critical step toward ensuring its systems are grounded in quality, governance, and accuracy.

"Hiring the right data professionals and embedding strong data practices is no longer optional; it's essential. Without it, organisations risk deploying AI that makes poor decisions based on flawed information. In this new era, those who prioritise data integrity will be the ones who gain real value from AI."

The increase in data staffing at the heart of Parliament reflects a wider cultural shift toward long-term digital resilience, ensuring that public institutions are equipped to harness AI ethically and effectively.

Richard Bovey, Head of Data at AND Digital, says, "The House of Commons is leading the way for data investment, with 66% of businesses agreeing that data investment is a top priority for their organisation, according to our recent Data Loyalty research. This move signals a long-term commitment to data-driven governance at the heart of the public sector.

"As the UK advances its position as a global leader in science and technology, building in-house data capability is vital, not only to unlock innovation, but also to ensure safeguards are embedded from the ground up, enabling institutions to innovate responsibly.

"But data alone isn't enough. Organisational culture plays a crucial role in turning insight into impact, and a culture that truly values curiosity, empathy, and accountability is what transforms data points into better decisions and more meaningful outcomes. By investing in its data workforce, the House of Commons is laying a robust foundation for smarter, more ethical, and future-ready public services. It's a necessary step toward creating a public sector that is both digitally progressive and aligned with democratic values."

Colovore appoints infrastructure veteran as CEO
Colovore, a leader in ultra-high-density, liquid-cooled colocation, today announced the appointment of Jeffrey Springborn as Chief Executive Officer. A seasoned operator in data centre development and cloud services, Springborn will guide the company through its next phase of growth as it scales a national buildout strategy across the US to meet rising demand for AI and high-performance computing (HPC) infrastructure.

Springborn joins Colovore on the heels of a $925 million financing facility with Blackstone to fuel expansion of its AI data centre platform in key US metro markets. Driven by strong demand for mission-critical infrastructure that supports enterprise planning and multi-market deployments, Colovore is developing new high-density, liquid-cooled facilities in those markets, aligned with customer roadmaps so that partners can secure capacity and shape infrastructure rollouts that match long-term AI and HPC growth.

"AI infrastructure needs are evolving faster than most data centres can adapt," says Springborn. "Colovore is building what tomorrow demands, today. The customers we're partnering with aren't just early adopters; they're forward-thinking leaders who see what's coming and are preparing for it now. They know that to stay ahead in AI, you can't wait for capacity. You have to secure the right infrastructure before the rush. With the backing of King Street, that's exactly what we're enabling them to do."

Brian Higgins, Colovore Board Chairman and Managing Partner of King Street Capital, comments, "I'm pleased to announce Jeff Springborn as Colovore's new CEO. We identified Jeff as the leader with the right experience to accelerate our growth in the AI infrastructure market. His 30 years of technology leadership and infrastructure expertise will be crucial as we expand our liquid-cooled data centres nationwide. Jeff complements the strong foundation built by Sean and Peter, positioning Colovore to meet the surging demand for AI-ready infrastructure. The Board is confident in Jeff's ability to execute our strategic vision in this next phase of growth."

"Smart companies are locking in infrastructure now to avoid being left behind later," states Springborn. "We're not just building data centres; we're shaping the AI backbone of tomorrow."

For more from Colovore, click here.

VAST Data unveils its operating system for the 'thinking machine'
VAST Data, a technology company focused on artificial intelligence and deep learning computing infrastructure, today announced the result of nearly a decade of development with the unveiling of the VAST AI Operating System (OS), a platform purpose-built for the next wave of AI breakthroughs.

As AI redefines the fabric of business and society, the industry again finds itself at the dawn of a new computing paradigm, one in which large numbers of intelligent agents will reason, communicate, and act across a global grid of millions of GPUs woven across edge deployments, AI factories, and cloud data centres. To make this world accessible, programmable, and operational at extreme scale, a new generation of intelligent systems requires a new software foundation.

The platform is built on VAST's Disaggregated Shared-Everything (DASE) architecture, a parallel distributed system design that makes it possible to parallelise AI and analytics workloads, federate clusters into a unified computing and data cloud, and feed new AI workloads with large volumes of data from a single tier of storage. Today, DASE clusters support over one million GPUs in many of the world's most data-intensive computing centres. The scope of the AI OS is broad, and it is intended to consolidate disparate legacy IT technologies into one modern offering.

"This isn't a product release; it's a milestone in the evolution of computing," says Renen Hallak, Founder & CEO of VAST Data. "We've spent the past decade reimagining how data and intelligence converge. Today, we're proud to unveil the AI Operating System for a world that is no longer built around applications, but around agents."

The AI OS spans every layer of a distributed system needed to run AI at global scale: a kernel for running platform services from private to public cloud, a runtime for deploying AI agents, eventing infrastructure for real-time event processing, messaging infrastructure, and a distributed file and database storage system for real-time data capture and analytics.

In 2024, VAST previewed the VAST InsightEngine, a service that extracts context from unstructured data using AI embedding tools. If InsightEngine prepares data for AI using AI, VAST AgentEngine is how AI now comes to life with data: an auto-scaling AI agent deployment runtime that aims to give users a low-code environment to build workflows, select reasoning models, define agent tools, and operationalise reasoning.

AgentEngine features a new AI agent tool server that lets agents invoke data, metadata, functions, web search, or other agents as MCP-compatible tools. It allows agents to assume multiple personas with different purposes and security credentials, and provides secure, real-time access to different tools. The platform's scheduler and fault-tolerant queuing mechanisms are also intended to ensure agent resilience against machine or service failure.

Just as operating systems ship with pre-built utilities, AgentEngine will feature a set of open-source agents that VAST will release at a rate of one per month. Some will be tailored to industry use cases, while others will be designed for general-purpose use.
Examples include:

• A reasoning chatbot, powered by all of an organisation's VAST data
• A data engineering agent, to curate data automatically
• A prompt engineer, to help optimise AI workflow inputs
• An agent agent, to automate the deployment, evaluation, and improvement of agents
• A compliance agent, to enforce data- and activity-level regulatory compliance
• An editor agent, to create rich media content
• A life sciences researcher, to assist with bioinformatic discovery

In the spirit of enabling organisations to build on the VAST AI OS, VAST Data will host VAST Forward, a series of global workshops, both in-person and online, throughout the year. These workshops will include training on components of the OS and sessions on how to develop on the platform.

For more from VAST, click here.
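The article does not publish AgentEngine's programming interface, but the tool-server pattern it describes, tools registered with names and descriptions that agents then invoke on demand, is easy to sketch generically. The following is a minimal illustration of that pattern; every name in it is hypothetical, and none of it reflects VAST's actual API.

```python
# Generic sketch of an agent tool server; all names are hypothetical.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Tool:
    name: str          # identifier the agent uses to invoke the tool
    description: str   # natural-language hint surfaced to the reasoning model
    handler: Callable[[dict], dict]

class ToolServer:
    """Registry that routes an agent's tool calls, MCP-style."""

    def __init__(self) -> None:
        self._tools: dict[str, Tool] = {}

    def register(self, tool: Tool) -> None:
        self._tools[tool.name] = tool

    def invoke(self, name: str, args: dict) -> dict:
        # A production server would also enforce schemas and per-persona
        # credentials here, as the article says AgentEngine does.
        if name not in self._tools:
            raise KeyError(f"unknown tool: {name}")
        return self._tools[name].handler(args)

# Register a document-search tool and invoke it as an agent would.
server = ToolServer()
server.register(Tool(
    name="search_documents",
    description="Full-text search over an indexed document store.",
    handler=lambda args: {"hits": [f"document matching {args['query']!r}"]},
))
print(server.invoke("search_documents", {"query": "quarterly revenue"}))
```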

House of Lords AI summit highlights cyber threats
Technology industry leaders gathered in the House of Lords yesterday for a high-profile debate on the transformative role artificial intelligence (AI) will play in the UK jobs market. The discussion, chaired by Steven George-Hilley of Centropy PR, brought together experts to address key industry challenges, including the digital skills shortage and AI's potential to enhance compliance and accelerate digital transformation across key areas of the UK economy.

The debate highlighted the growing role of AI in reshaping traditional job roles and powering a new wave of relentless cyber threats that could damage British businesses. Key speakers, including Richard Cuda of Kasha, discussed the role AI and digital technology can play in helping entrepreneurs launch their own businesses.

Leigh Allen, Strategic Advisor, Cellebrite, says, "In a world where police forces are under increasing strain to combat crime and national security threats, AI technology represents a key enabler in unlocking digital evidence and significantly reducing investigation times. Cellebrite delivers secure, ethical access to digital evidence, using AI to accelerate investigations while closing the digital skills gap for modern law enforcement. We don't just respond to digital threats; we equip agencies to lead with confidence in a complex, tech-driven world."

Dr Janet Bastiman, Chief Data Scientist, Napier AI, comments, "Financial crime is one of the biggest threats facing the UK economy right now, and in AI we have the answer. AI-driven anti-money laundering solutions have the capacity to save UK financial institutions £2.2 billion each year, helping to bolster compliance processes, improve the accuracy of transaction screening, and monitor transaction behaviour to more effectively identify criminal networks."

Linda Loader, Software Development Director, Resonate, suggests, "AI has the potential to significantly enhance operations in the rail industry by enabling faster and more efficient services. But this must be underpinned by quality data to drive innovative solutions that prioritise security and robust protection for our critical national infrastructure. By exploring smaller AI use cases now, we can build a solid foundation and understanding for more extensive, secure transport applications in future."

Chris Davison, CEO, NavLive, mentions, "By using cutting-edge AI and robotics technology to create automated 2D and 3D models of buildings in real time, we can make retrofits and brownfield developments more efficient and contribute to sustainable building practices. NavLive saves architects, engineers, and construction professionals time and money by providing accurate, real-time spatial data across the lifecycle of a building."

Richard Bovey, Head of Data, AND Digital, states, "The AI winners are the businesses that have invested the most in AI experimentation, underpinned by years of strong data foundations. Meanwhile, SMEs are watching an AI gap quickly widen. But all isn't lost: investing in data and modern tooling can stop the slide, helping businesses to keep pace and preventing a significant competitive disadvantage from taking hold."

Arkadiy Ukolov, Co-Founder and CEO, Ulla Technology, says, "As AI adoption continues to skyrocket, we must ensure that privacy and data security remain a critical component of development. Most of the popular AI tools send data to third-party AI providers, which may use client data to train models. This is unacceptable for sensitive meeting discussions and confidential documents, as it opens them up to data leaks. Placing safety and ethics at the centre of the discussion is the only route that we can take forward as AI evolves."

For more on cyber security, click here.

R&M launches latest inteliPhy software suite
R&M, the Swiss developer and provider of high-end infrastructure solutions for data and communications networks, has launched the sixth generation of its inteliPhy software suite. The scope of application has been significantly expanded and, according to R&M, inteliPhy 6.0 thinks beyond the infrastructure of an individual data centre. This makes it easier to plan the expansion of fibre optic infrastructure between several data centres or across a larger site, the company states.

To this end, R&M has equipped the software with geographic information system (GIS) functions for geolocation by longitude and latitude. Notably, the software also offers a height service, which yields significantly more precise values when calculating underground cable lengths. inteliPhy 6.0 can visualise telecom, provider, and campus networks, as well as underground cable runs, and it supports local and remote management of the infrastructure of external sites such as edge data centres.

With inteliPhy, users can plan the infrastructure of a grey space, and now also the cages in colocation data centres, in three dimensions. Components such as racks, patch panels, flexible enclosures, cable ducts, trunk cables, and active equipment are added from the model library using drag and drop, and components can now also be searched for by part number. A new feature is the ability to design custom-sized data centre spaces instead of using fixed-size tiles, increasing flexibility.

R&M's ActiPower connector strips for power distribution can now also be integrated using drag and drop, meaning that with inteliPhy 6.0, iPDUs can be added at the click of a mouse. They are assigned to object identifiers (OIDs), and their data is scaled and named so that it is easy to understand, allowing operators to manage power distribution in the racks without errors.

R&M offers inteliPhy trial versions here. For more from R&M, click here.
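The article does not detail how the height service computes cable lengths, but the reason elevation data sharpens the estimate is straightforward: a buried route that climbs or falls is longer than its map projection suggests. Below is a minimal sketch of the idea, approximating a route as straight segments whose lengths are the hypotenuse of the horizontal great-circle run and the elevation change. The waypoints are hypothetical, and this is not R&M's actual algorithm.

```python
# Minimal sketch: elevation-aware cable length estimation.
# Waypoints are hypothetical; not R&M's actual method.
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS84 points."""
    r = 6_371_000  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# (latitude, longitude, elevation in metres) along a hypothetical duct route
route = [
    (47.3769, 8.5417, 408.0),
    (47.3801, 8.5500, 436.0),
    (47.3850, 8.5610, 391.0),
]

flat = sloped = 0.0
for (la1, lo1, e1), (la2, lo2, e2) in zip(route, route[1:]):
    horizontal = haversine_m(la1, lo1, la2, lo2)
    flat += horizontal
    sloped += math.hypot(horizontal, e2 - e1)  # include the rise or fall

print(f"flat-map estimate  : {flat:.1f} m")
print(f"with elevation data: {sloped:.1f} m")
```

Over hilly terrain the difference compounds with every segment, which is why a height service matters when ordering cable for underground runs.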

Hitachi Vantara launches Virtual Storage Platform 360
Hitachi Vantara, the data storage, infrastructure, and hybrid cloud management subsidiary of Hitachi, today announced the launch of Virtual Storage Platform 360 (VSP 360), a unified management software solution designed to help customers simplify data infrastructure management, improve decision-making, and streamline the delivery of data services. With support for block, file, object, and software-defined storage, VSP 360 consolidates multiple management tools and aims to enable IT teams, including those with limited storage expertise, to control hybrid cloud deployments more efficiently, gain AIOps predictive insights, and simplify data lifecycle governance.

Organisations today are struggling to manage sprawling data environments spread across disparate storage systems, fragmented data silos, and complex application workloads, all while grappling with overextended IT teams and rising demands for compliance and AI readiness. A recent survey showed AI has led to a dramatic increase in the amount of data storage that businesses require, with the amount of data expected to increase by 122% by 2026. The survey also revealed that many IT leaders are being forced to implement AI before their data infrastructure is ready to handle it, with many embarking on a journey of experimentation, hoping to find additional ways to recover some of the cost of their investments.

VSP 360 seeks to address these obstacles by integrating data management tools across enterprise storage to monitor key performance indicators, including storage capacity utilisation and overall system health, helping to deliver optimal performance and efficient resource management. It is also intended to improve end-to-end visibility, leveraging AIOps observability to break down data silos, and to streamline the deployment of VSP One data services.

"VSP 360 represents a bold step forward in unifying the way enterprises manage their data," says Octavian Tanase, Chief Product Officer, Hitachi Vantara. "It's not just a new management tool; it's a strategic approach to modern data infrastructure that gives IT teams complete command over their data, wherever it resides. With built-in AI and automation, and by making it available via SaaS, private deployment, or your mobile phone, we're empowering our customers to make faster, smarter decisions and eliminate the traditional silos that slow innovation."

"VSP 360 gives our customers the unified visibility and control they've been asking for," claims Dan Pittet, Senior Solutions Architect, Stoneworks Technologies. "The ability to manage block, file, object, and software-defined storage from a single AI-driven platform helps streamline operations and reduce complexity across hybrid environments. It's especially valuable for IT teams with limited resources who need to respond quickly to evolving data demands without compromising performance or governance."

"VSP 360 hits the mark for what modern enterprises need," states Ashish Nadkarni, Group Vice President and General Manager, Worldwide Infrastructure Research, IDC. "It goes beyond monitoring to deliver true intelligence across the storage lifecycle. The solution's robust data resiliency helps businesses maintain continuous operations and protect their critical assets, even in the face of unexpected disruptions. By integrating advanced analytics, automation, and policy enforcement, Hitachi Vantara is giving customers the agility and resilience needed to thrive in a data-driven economy."

For more from Hitachi, click here.
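The article mentions monitoring KPIs such as storage capacity utilisation; as a concrete, deliberately trivial illustration of what such a check involves, the sketch below flags pools crossing warning thresholds. The pool figures and thresholds are invented for the example and have nothing to do with VSP 360's internals.

```python
# Toy capacity-utilisation KPI check; all figures are hypothetical.
pools = {
    "block-pool-01":  {"used_tb": 412.0, "capacity_tb": 500.0},
    "object-pool-01": {"used_tb":  96.5, "capacity_tb": 400.0},
}

WARN, CRITICAL = 0.75, 0.90  # illustrative alerting thresholds

for name, p in pools.items():
    utilisation = p["used_tb"] / p["capacity_tb"]
    status = ("CRITICAL" if utilisation >= CRITICAL
              else "WARN" if utilisation >= WARN
              else "OK")
    print(f"{name}: {utilisation:.0%} used -> {status}")
```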

NetApp builds AI infrastructure on NVIDIA AI data platform
NetApp, the intelligent data infrastructure company, has announced that it is working with NVIDIA to support the NVIDIA AI Data Platform reference design in the NetApp AIPod solution to accelerate enterprise adoption of agentic AI. Powered by NetApp ONTAP, NetApp AIPod deployments built on the NVIDIA AI Data Platform aim to help businesses build secure, governed, and scalable AI data pipelines for retrieval-augmented generation (RAG) and inferencing.

As an NVIDIA-Certified Storage partner leveraging the NVIDIA AI Data Platform, NetApp gives NetApp AIPod users data infrastructure with built-in intelligence. NetApp intends to give customers confidence that they have the enterprise data management capabilities and scalable multi-tenancy needed to eliminate data silos, so they can develop and operate high-performance AI factories and deploy agentic AI to solve real-world business problems.

"A unified and comprehensive understanding of business data is the vehicle that will help companies drive competitive advantage in the era of intelligence, and AI inferencing is the key," says Sandeep Singh, Senior Vice President and General Manager of Enterprise Storage at NetApp. "We have always believed that a unified approach to data storage is essential for businesses to get the most out of their data. The rise of agentic AI has only reinforced that truly unified data storage goes beyond just multi-protocol storage. Businesses need to eliminate silos throughout their entire IT environment, whether on-premises or in the cloud, and across every business function, and we are working with NVIDIA to deliver connected storage for the unique demands of AI."

The NetApp AIPod solution built on the NVIDIA AI Data Platform incorporates NVIDIA accelerated computing to run NVIDIA NeMo Retriever microservices and connects these nodes to scalable storage. This reference design enables customers to scan, index, classify, and retrieve large stores of private and public documents in real time. The intention is to augment AI agents as they reason and plan to solve complex, multistep problems, helping enterprises turn data into knowledge and boost agentic AI accuracy across many use cases.

"Agentic AI enables businesses to solve complex problems with superhuman efficiency and accuracy, but only as long as agents and reasoning models have fast access to high-quality data," says Rob Davis, Vice President of Storage Technology at NVIDIA. "The NVIDIA AI Data Platform reference design and NetApp's high-powered storage and mature data management capabilities bring AI directly to business data and drive unprecedented productivity."

For more from NetApp, click here.
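For readers unfamiliar with the retrieval step at the heart of a RAG pipeline like the one described, here is a self-contained sketch using a toy bag-of-words embedding and cosine similarity. A production deployment of the kind in this article would use learned embedding models and a vector index; nothing below reflects the NeMo Retriever API.

```python
# Toy RAG retrieval: bag-of-words vectors and cosine similarity.
# Real pipelines use learned embeddings and a vector index.
import math
from collections import Counter

docs = [
    "Quarterly revenue grew on strong AI infrastructure demand.",
    "The maintenance window for the storage cluster is Saturday.",
    "Agentic AI workloads need fast access to governed data.",
]

def embed(text: str) -> Counter:
    """Crude stand-in for an embedding model: token counts."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

index = [(embed(d), d) for d in docs]  # the "scan and index" step

query = "what do agentic ai workloads need"
best = max(index, key=lambda pair: cosine(embed(query), pair[0]))
print("retrieved context:", best[1])
# The retrieved passage is then prepended to the model prompt so the
# agent reasons over grounded, up-to-date enterprise data.
```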

AI set to supercharge cyber threats by 2027
The UK's National Cyber Security Centre (NCSC) has released a landmark cyber threat assessment, warning that rapid advances in artificial intelligence (AI) will make cyber attacks more frequent, more effective, and harder to detect by 2027. The digital divide between organisations with the resources to defend against digital threats and those without will inevitably widen.

Published on the opening day of CYBERUK, the UK's flagship cyber security conference, the report outlines how both state and non-state actors are already exploiting AI to increase the speed, scale, and sophistication of cyber operations. Generative AI is enabling more convincing phishing attacks and faster malware development, significantly lowering the barrier to entry for cyber crime and cyber intelligence operations.

Of particular concern is the rising risk to the UK's democratic processes, Critical National Infrastructure (CNI), and commercial sectors. Advanced language models and data analysis capabilities are being used to craft highly persuasive content, resulting in more frequent attacks that are difficult to detect.

Andy Ward, SVP International at Absolute Security, says, "While AI offers significant opportunities to bolster defences, our research shows 54% of CISOs feel unprepared to respond to AI-enabled threats. That gap in readiness is exactly what attackers will take advantage of.

"To counter this, businesses must go beyond adopting new tools; they need a robust cyber resilience strategy built on real-time visibility, proactive threat detection, and the ability to isolate compromised devices at speed."

This latest warning forms part of the UK Government's wider cyber strategy, following the announcement of the new AI Cyber Security Code of Practice earlier this year. The code will form the basis of a new global standard for securing AI, helping to ensure national security keeps pace with technological evolution and safeguarding the country against emerging digital threats.

For more from the NCSC, click here.


