
Data


Nasuni achieves AWS Energy & Utilities Competency status
Nasuni, a unified file data platform company, has announced that it has achieved Amazon Web Services (AWS) Energy & Utilities Competency status. This designation recognises that Nasuni has demonstrated expertise in helping customers leverage AWS cloud technology to transform complex systems and accelerate the transition to a sustainable energy and utilities future.

To receive the designation, AWS Partners undergo a rigorous technical validation process, including a customer reference audit. The AWS Energy & Utilities Competency makes it easier for energy and utilities customers to select skilled partners to help accelerate their digital transformations.

“Our strategic collaboration with AWS is redefining how energy companies harness seismic data,” comments Michael Sotnick, SVP of Business & Corporate Development at Nasuni. “Together, we’re removing traditional infrastructure barriers and unlocking faster, smarter subsurface decisions. By integrating Nasuni’s global unified file data platform with the power of AWS solutions including Amazon Simple Storage Service (Amazon S3), Amazon Bedrock, and Amazon Q, we’re helping upstream operators accelerate time to first oil, boost capital efficiency, and prepare for the next era of data-driven exploration.”

AWS says it enables scalable, flexible, and cost-effective solutions for customers ranging from startups to global enterprises. To support the integration and deployment of these solutions, AWS established the AWS Competency Program to help customers identify AWS Partners with industry experience and expertise.

By bringing together Nasuni’s cloud-native file data platform with Amazon S3 and other AWS services, the company claims energy customers can eliminate data silos, reduce interpretation cycle times, and unlock the value of seismic data for AI-driven exploration.

For more from Nasuni, click here.

Chemists create molecular magnet, boosting data storage by 100x
Scientists at The University of Manchester have designed a molecule that can remember magnetic information at the highest temperature ever recorded for this kind of material. In a boon for the future of data storage technologies, the researchers have made a new single-molecule magnet that retains its magnetic memory up to 100 Kelvin (-173 °C) – around the temperature of the moon at night.

The finding, published in the journal Nature, is a significant advance on the previous record of 80 Kelvin (-193 °C). While still a long way from working in a standard freezer, or at room temperature, data storage at 100 Kelvin could be feasible in huge data centres, such as those used by Google.

If perfected, these single-molecule magnets could pack vast amounts of information into incredibly small spaces – possibly more than three terabytes of data per square centimetre. That’s around half a million TikTok videos squeezed into a hard drive the size of a postage stamp.

The research was led by The University of Manchester, with computational modelling led by the Australian National University (ANU).

David Mills, Professor of Inorganic Chemistry at The University of Manchester, comments, “This research showcases the power of chemists to deliberately design and build molecules with targeted properties. The results are an exciting prospect for the use of single-molecule magnets in data storage media that is 100 times more dense than the absolute limit of current technologies.

“Although the new magnet still needs cooling far below room temperature, it is now well above the temperature of liquid nitrogen (77 Kelvin), which is a readily available coolant. So, while we won’t be seeing this type of data storage in our mobile phones for a while, it does make storing information in huge data centres more feasible.”

Magnetic materials have long played an important role in data storage technologies. Currently, hard drives store data by magnetising tiny regions made up of many atoms all working together to retain memory. Single-molecule magnets can store information individually and don’t need help from any neighbouring atoms to retain their memory, offering the potential for incredibly high data density. Until now, however, the challenge has been the extremely cold temperatures needed for them to function.

The key to the new magnet’s success is its unique structure, with the element dysprosium located between two nitrogen atoms. These three atoms are arranged almost in a straight line – a configuration predicted to boost magnetic performance, but realised here for the first time. Usually, when dysprosium is bonded to only two nitrogen atoms, it tends to form molecules with more bent or irregular shapes. In the new molecule, the researchers added a chemical group called an alkene that acts like a molecular pin, binding to dysprosium to hold the structure in place.

The team at the Australian National University developed a new theoretical model to simulate the molecule’s magnetic behaviour, allowing them to explain why this particular molecular magnet performs so well compared to previous designs. The researchers will now use these results as a blueprint to guide the design of even better molecular magnets.
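As a rough sanity check on those figures, the arithmetic below works through the claim using illustrative assumptions (a postage-stamp area of about 5 cm² and an average short-video size of 30 MB, neither of which comes from the study):

```python
# Back-of-the-envelope check of the claimed density (~3 TB per square centimetre).
# The stamp area and per-video size are illustrative assumptions, not figures
# from the Nature paper.
DENSITY_TB_PER_CM2 = 3        # claimed storage density
STAMP_AREA_CM2 = 5.0          # assumed postage-stamp area (~2 cm x 2.5 cm)
AVG_VIDEO_MB = 30.0           # assumed size of a typical short-form video

capacity_tb = DENSITY_TB_PER_CM2 * STAMP_AREA_CM2
videos = capacity_tb * 1_000_000 / AVG_VIDEO_MB   # TB -> MB (decimal units)

print(f"Stamp-sized drive capacity: {capacity_tb:.0f} TB")
print(f"Approximate video count:    {videos:,.0f}")   # roughly half a million
```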

'More than a third of UK businesses unprepared for AI risks'
Despite recognising artificial intelligence (AI) as a major threat, with nearly a third (30%) of UK organisations surveyed naming it among their top three risks, many remain significantly unprepared to manage AI risk. Recent research from CyXcel, a global cyber security consultancy, highlights a concerning gap: nearly a third (29%) of UK businesses surveyed have only just implemented their first AI risk strategy, and 31% don’t have any AI governance policy in place.

This gap exposes organisations to substantial risks, including data breaches, regulatory fines, reputational harm, and critical operational disruptions, especially as AI threats continue to grow and rapidly evolve. CyXcel’s research shows that nearly a fifth (18%) of UK and US companies surveyed are still not prepared for AI data poisoning, a type of cyberattack that targets the training datasets of AI and machine learning (ML) models, or for a deepfake or cloning security incident (16%).

Responding to these mounting threats and geopolitical challenges, CyXcel has launched its Digital Risk Management (DRM) platform, which aims to provide businesses with insight into evolving AI risks across major sectors, regardless of business size or jurisdiction. The DRM seeks to help organisations identify risks and implement the right policies and governance to mitigate them.

Megha Kumar, Chief Product Officer and Head of Geopolitical Risk at CyXcel, comments, “Organisations want to use AI but are worried about risks – especially as many do not have a policy and governance process in place. The CyXcel DRM provides clients across all sectors, especially those that have limited technological resources in house, with a robust tool to proactively manage digital risk and harness AI confidently and safely.”

Edward Lewis, CEO of CyXcel, adds, “The cybersecurity regulatory landscape is rapidly evolving and becoming more complex, especially for multinational organisations. Governments worldwide are enhancing protections for critical infrastructure and sensitive data through legislation like the EU’s Cyber Resilience Act, which mandates security measures such as automatic updates and incident reporting. Similarly, new laws are likely to arrive in the UK next year which introduce mandatory ransomware reporting and stronger regulatory powers. With new standards and controls continually emerging, staying current is essential.”
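For readers unfamiliar with the term, the sketch below illustrates the basic idea of data poisoning on a deliberately tiny, synthetic example: mislabelled points injected into a training set shift a toy classifier's decision boundary and degrade its accuracy. It is purely illustrative and has no connection to CyXcel's DRM platform or any real incident.

```python
# Toy demonstration of training-data poisoning: injecting mislabelled points
# shifts a simple nearest-centroid classifier's decision boundary.
# Entirely synthetic and deliberately exaggerated for clarity.
import random

random.seed(0)

def make_data(n):
    """Two 1-D clusters: class 0 around -2.0, class 1 around +2.0."""
    data = []
    for _ in range(n):
        label = random.randint(0, 1)
        data.append((random.gauss(-2.0 if label == 0 else 2.0, 1.0), label))
    return data

def centroids(data):
    sums, counts = {0: 0.0, 1: 0.0}, {0: 0, 1: 0}
    for x, y in data:
        sums[y] += x
        counts[y] += 1
    return {c: sums[c] / counts[c] for c in (0, 1)}

def accuracy(cents, data):
    hits = sum(1 for x, y in data
               if min((0, 1), key=lambda c: abs(x - cents[c])) == y)
    return hits / len(data)

train, test = make_data(400), make_data(400)

# The "attack": add points near the class-1 cluster but labelled as class 0,
# dragging the class-0 centroid (and the decision boundary) towards class 1.
poison = [(random.gauss(3.0, 0.5), 0) for _ in range(200)]

print("clean accuracy:   ", round(accuracy(centroids(train), test), 3))
print("poisoned accuracy:", round(accuracy(centroids(train + poison), test), 3))
```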

'AI is the new oil—and data centres are the refineries'
With AI adoption reshaping global industries, Straightline Consulting’s Managing Director, Craig Eadie, shares his insights on how data centres are powering the GenAI revolution:

"The age of AI is here. Generative artificial intelligence (GenAI) is rewriting the rulebook when it comes to everything from software development and call centre productivity to copywriting — boosting efficiency and, depending on who you ask, on track to raise the GDP of industrialised nations by 10-15% over the next decade.

"The impact of AI will reshape the global economy over the coming years, consolidating value among the companies that successfully capitalise on this moment — and disrupting those that don’t. The 'arms race' to develop the next generation of AI technologies — like Google’s new Veo 3 video generation tool, released at the start of June, which is already making headlines for its ability to allow anyone willing to pay $249 per month to create hauntingly lifelike, realistic videos of everything from kittens playing to election fraud — is accelerating as well. AI has become the new oil: the global fuel for economic growth. Unlike oil, however, GenAI alone isn’t valuable. Rather, its power lies in the ability to apply GenAI models to data. That process, akin to refining crude into petroleum, happens in the data centre.

"Productivity is far from the only thing GenAI is turbocharging. This rush to build, train, and operate new GenAI models is also accelerating the race to build the digital infrastructure that houses them. Goldman Sachs predicts that global power demand from data centres will increase 50% by 2027 and by as much as 165% by the end of the decade, largely driven by GenAI adoption.

"As someone working in the data centre commissioning sector, it’s impossible to overstate the impact that GenAI is having, and will continue to have, on our industry. GenAI has blown past our predictions. It’s even bigger than anyone anticipated. The money, the scale, the speed — demand is growing even faster than the most optimistic projections pre-2023. By the end of 2025, almost half of all the power data centres consume globally could be used to power AI systems.

"The data centre commissioning space we’re operating in today has transformed dramatically. On the construction and design side, huge changes, not just in how buildings are constructed, but in the technology inside those buildings, are reshaping how we commission them.

"The battle to capitalise on the GenAI boom is a battle to overcome three challenges: access to power, materials, and talent.

"GenAI requires an order of magnitude more power than traditional colocation or cloud workloads. As a result, there are serious concerns about power availability across Europe, especially in the UK. We can’t build the data centres we need to capitalise on the GenAI boom because there’s just not enough power. There are some encouraging signs that governments are taking this challenge seriously. For example, the UK government has responded by creating 'AI Growth Zones' to unlock investment in AI-enabled data centres by improving access to power and providing planning support in some areas of the country. The European Union’s AI Continent Plan also includes plans to build large-scale AI data and computing infrastructures, including at least 13 operational 'AI factories' by 2026 and up to five 'gigafactories' at some point after that.

"However, power constraints and baroque planning and approvals processes threaten to undermine these efforts. Multiple data centre markets are already facing pushback from local councils and communities against new infrastructure because of its effect on power grids and local water supplies. Dublin and Amsterdam had already stymied new builds even before the GenAI boom. This comes with risk, because AI engines can be built anywhere. GDPR means data must be housed in-country, but if Europe and the UK don’t move faster, large US AI firms will simply build their massive centres stateside and deploy the technology across the Atlantic later. Once an AI engine is trained, it can run on less demanding infrastructure. We risk stifling the AI industry in Europe and the UK if we don’t start building faster and making more power available today.

"The other key constraint is access to raw materials and components. Global supply chain challenges have spiked the cost of construction materials, and the lead times for data-centre-specific components like cooling equipment can be as much as six months, further complicating the process of building new infrastructure.

"Access to talent is another pain point that threatens to slow the industry at a time when it should be speeding up. Commissioning is a vital part of the data centre design, construction, and approvals process, and our sector is facing a generational talent crisis. There isn’t enough young talent coming into the sector. That has to change across the board — not just in commissioning, but for project managers, consultants, everyone, everywhere. The pain point is particularly acute in commissioning, however, because of the sector’s relatively niche pipeline and stringent requirements. You can’t just walk in off the street and become a commissioning engineer. The field demands a solid background in electrical or mechanical engineering, or training through a trade. Right now, the pipeline that produces the next generation of data centre commissioning professionals just isn’t producing the number of new hires the industry needs.

"This obviously affects all data centre commissioning, not just AI. The scale of demand and the speed at which the industry is moving mean this risks becoming a serious pinch point not too far down the line.

"Looking at the next few years, it’s impossible to say exactly where we’re headed, but it’s clear that, unless Europe and the UK can secure access to reliable, affordable energy, as well as clear the way for data centre approvals to move quickly, pain points like the industry talent shortage and rising materials costs (not to mention lead times) threaten to leave the region behind in the race to capture, refine, and capitalise on the new oil: GenAI."

UKRI invests £22 million in data spending
UK Research and Innovation (UKRI) has invested £22 million in data spending and staff over the past three years, underscoring the funder's strategic commitment to data as a cornerstone of national research and innovation. Data is playing an increasingly vital role, particularly as artificial intelligence (AI) is rolled out across government departments, with 70% of government bodies already piloting or planning to use AI, highlighting the urgent need for high-quality, structured, and secure data. Salary investment in data roles has risen by 70% in just two years, reflecting both rising headcounts and the increasing value of data expertise in shaping the UK’s research landscape.

Stuart Harvey, CEO of Datactics, comments, “Both businesses and government departments are keen to implement AI into their business functions but are overlooking the fundamental truth that AI is only as good as the data it learns from. Hiring challenges are becoming an increasing problem, but businesses should follow in UKRI's footsteps to invest in data spending and staff, and upskill their teams in data management, governance, and quality to improve data readiness.

“AI is only as effective as the data it processes and, without structured, accurate, and well-governed data, businesses risk AI systems that are flawed. The rush to deploy AI without a strong data foundation is a costly mistake and, in a competitive AI landscape, only those who get their data right will be the ones who thrive.”

UKRI’s investment in its data workforce reflects the growing demand for high-quality, well-managed, and accessible data that enables researchers to collaborate, innovate, and respond to global challenges. Between 2022 and 2025, UKRI’s data-related salary investment rose by 85%, from £5.35 million to £9.89 million, reflecting both growing headcounts and the escalating value of data expertise across the UK’s research ecosystem. Over the same period, the number of staff with “data” in their job titles rose from 138 in 2022 to 203 in 2025 - a 47% increase.

Sachin Agrawal, Managing Director for Zoho UK, says, “As the UK continues to position itself as a global science and technology powerhouse, it is a welcome sight to see the department prioritising investment in its data workforce as a long-term commitment to data-driven research.

“In an era where public trust and data ethics are paramount, building in-house expertise is essential to ensuring that data privacy, transparency, and compliance are at the heart of our national research infrastructure. This strategic investment lays the foundation for smarter and safer technology use by UKRI.”
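The percentage figures quoted above can be reproduced directly from the underlying numbers; the short calculation below simply re-derives them:

```python
# Re-deriving the growth figures quoted above (2022 -> 2025).
salary_2022, salary_2025 = 5.35, 9.89   # GBP millions
staff_2022, staff_2025 = 138, 203

pct = lambda old, new: (new - old) / old * 100

print(f"Salary investment growth: {pct(salary_2022, salary_2025):.0f}%")   # ~85%
print(f"'Data' job-title growth:  {pct(staff_2022, staff_2025):.0f}%")     # ~47%
```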

House of Commons boosts data workforce by 50%
The UK's House of Commons has invested £7.5 million in data spending and staff over the past three years, underscoring its strategic commitment to data as a cornerstone of modern governance. As the public sector embraces AI at pace, with over 70% of government bodies piloting or planning AI implementation, the demand for robust data infrastructure and skilled personnel has never been greater. In response, the House of Commons has quietly ramped up hiring and spending on data roles, reflecting a broader strategic shift towards data-centric governance.

Over the past three years, the number of staff in the House of Commons with "data" in their job titles has jumped from 49 in 2022 to 73 in early 2025, marking a 49% increase. Alongside this, total salary investment for data roles rose by around 63%, from £1.83 million to £2.98 million, excluding final April 2025 figures still pending payroll completion. The figures reflect a growing recognition within Parliament that AI innovation is only as effective as the data that underpins it.

Stuart Harvey, CEO of Datactics, comments, "There's a growing appetite across government to harness the power of AI, but what's often overlooked is that AI is only as reliable as the data it's built on. The House of Commons' investment in data roles is a critical step toward ensuring its systems are grounded in quality, governance, and accuracy.

"Hiring the right data professionals and embedding strong data practices is no longer optional; it's essential. Without it, organisations risk deploying AI that makes poor decisions based on flawed information. In this new era, those who prioritise data integrity will be the ones who gain real value from AI."

The increase in data staffing at the heart of Parliament reflects a wider cultural shift toward long-term digital resilience, ensuring that public institutions are equipped to harness AI ethically and effectively.

Richard Bovey, Head of Data at AND Digital, says, "The House of Commons is leading the way for data investment, with 66% of businesses agreeing that data investment is a top priority for their organisation, according to our recent Data Loyalty research. This move signals a long-term commitment to data-driven governance at the heart of the public sector.

"As the UK advances its position as a global leader in science and technology, building in-house data capability is vital, not only to unlock innovation, but also to safeguard public trust, with good governance embedded from the ground up, enabling institutions to innovate responsibly.

"But data alone isn't enough. Organisational culture plays a crucial role in turning insight into impact, and a culture that truly values curiosity, empathy, and accountability is what transforms data points into better decisions and more meaningful outcomes. By investing in its data workforce, the House of Commons is laying a robust foundation for smarter, more ethical, and future-ready public services. It's a necessary step toward creating a public sector that is both digitally progressive and aligned with democratic values."

VAST Data unveils its operating system for the 'thinking machine'
VAST Data, a technology company focused on artificial intelligence and deep learning computing infrastructure, today announced the result of nearly a decade of development with the unveiling of the VAST AI Operating System (OS), a platform purpose-built for the next wave of AI breakthroughs.

As AI redefines the fabric of business and society, the industry again finds itself at the dawn of a new computing paradigm – one where large numbers of intelligent agents will reason, communicate, and act across a global grid of millions of GPUs woven across edge deployments, AI factories, and cloud data centres. To make this world accessible, programmable, and operational at extreme scale, a new generation of intelligent systems requires a new software foundation.

The VAST AI OS is the product of nearly ten years of engineering, with the aim of creating an intelligent platform architecture that can harness the new generation of AI supercomputing machinery and unlock the potential of AI at scale. The platform is built on VAST’s Disaggregated Shared-Everything (DASE) architecture, a parallel distributed system architecture that makes it possible to parallelise AI and analytics workloads, federate clusters into a unified computing and data cloud, and feed new AI workloads with large amounts of data from a single tier of storage. Today, DASE clusters support over 1 million GPUs in many of the world’s most data-intensive computing centres. The scope of the AI OS is broad and is intended to consolidate disparate legacy IT technologies into one modern offering.

“This isn’t a product release — it’s a milestone in the evolution of computing,” says Renen Hallak, Founder & CEO of VAST Data. “We’ve spent the past decade reimagining how data and intelligence converge. Today, we’re proud to unveil the AI Operating System for a world that is no longer built around applications — but around agents.”

The AI OS spans every aspect of a distributed system needed to run AI at global scale: a kernel that runs platform services across private and public clouds, a runtime for deploying AI agents, eventing infrastructure for real-time event processing, messaging infrastructure, and a distributed file and database storage system that can be used for real-time data capture and analytics.

In 2024, VAST previewed the VAST InsightEngine – a service that extracts context from unstructured data using AI embedding tools. If the VAST InsightEngine prepares data for AI using AI, VAST AgentEngine is how AI now comes to life with data – an auto-scaling AI agent deployment runtime that aims to equip users with a low-code environment to build workflows, select reasoning models, define agent tools, and operationalise reasoning.

The AgentEngine features a new AI agent tool server that lets agents invoke data, metadata, functions, web search, or other agents as MCP-compatible tools. AgentEngine allows agents to assume multiple personas with different purposes and security credentials, and provides secure, real-time access to different tools. The platform’s scheduler and fault-tolerant queuing mechanisms are also intended to ensure agent resilience against machine or service failure.

Just as operating systems ship with pre-built utilities, the VAST AgentEngine will feature a set of open-source agents that VAST will release at a rate of one per month. Some will be tailored to industry use cases, whereas others will be designed for general-purpose use.
Examples include:

● A reasoning chatbot, powered by all of an organisation’s VAST data
● A data engineering agent, to curate data automatically
● A prompt engineer, to help optimise AI workflow inputs
● An agent agent, to automate the deployment, evaluation, and improvement of agents
● A compliance agent, to enforce data- and activity-level regulatory compliance
● An editor agent, to create rich media content
● A life sciences researcher, to assist with bioinformatic discovery

In the spirit of enabling organisations to build on the VAST AI OS, VAST Data will be hosting VAST Forward, a series of global workshops, both in-person and online, throughout the year. These workshops will include training on components of the OS and sessions on how to develop on the platform.

For more from VAST, click here.
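VAST has not published code for the AgentEngine tool server in this announcement, but the pattern it describes — agents adopting personas with scoped security credentials and invoking data, web search, or other agents as tools — can be sketched generically. The following is a hypothetical, in-process illustration; every class, tool name, and persona here is invented and does not reflect VAST's actual APIs:

```python
# Hypothetical sketch of an agent "persona" with scoped credentials invoking
# registered tools (data lookup, web search). None of these names come from
# VAST's AgentEngine; they are invented purely to illustrate the pattern.
from dataclasses import dataclass, field
from typing import Callable, Dict

@dataclass
class Persona:
    name: str
    allowed_tools: set = field(default_factory=set)   # simple security scoping

class ToolServer:
    """Registry that exposes callables as named tools (MCP-style in spirit)."""
    def __init__(self):
        self._tools: Dict[str, Callable[..., str]] = {}

    def register(self, name: str, fn: Callable[..., str]) -> None:
        self._tools[name] = fn

    def call(self, persona: Persona, name: str, **kwargs) -> str:
        if name not in persona.allowed_tools:
            raise PermissionError(f"{persona.name} may not call {name}")
        return self._tools[name](**kwargs)

# Two toy tools standing in for "data" and "web search" tools.
def lookup_document(doc_id: str) -> str:
    return f"<contents of document {doc_id}>"

def web_search(query: str) -> str:
    return f"<top results for '{query}'>"

server = ToolServer()
server.register("data.lookup", lookup_document)
server.register("web.search", web_search)

analyst = Persona("seismic-analyst", allowed_tools={"data.lookup"})
print(server.call(analyst, "data.lookup", doc_id="survey-042"))
# server.call(analyst, "web.search", query="...")  # would raise PermissionError
```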

R&M launches latest inteliPhy software suite
R&M, the Swiss developer and provider of high-end infrastructure solutions for data and communications networks, has launched the sixth generation of its inteliPhy software suite. The scope of application has been significantly expanded and, according to R&M, inteliPhy 6.0 thinks beyond the infrastructure of an individual data centre. This makes it easier to plan the expansion of fibre optic infrastructures between several data centres or across a larger site, the company states.

To this end, R&M has equipped the software with geoinformation system (GIS) functions for geolocalisation in longitude and latitude. In particular, the software also offers a unique height service, which leads to significantly more precise values when calculating underground cable lengths. inteliPhy 6.0 can visualise telecom, provider, and campus networks, as well as underground cable runs. The software also supports the local and remote management of the infrastructure of external sites, such as edge data centres.

With inteliPhy, users can plan the infrastructure of a grey space and now also the cages in colocation data centres three-dimensionally. Components such as racks, patch panels, flexible enclosures, cable ducts, trunk cables, and active equipment are added from the model library using drag and drop. Components can now also be searched for using part numbers. A new feature is the ability to design customised data centre spaces instead of using fixed-size tiles, increasing flexibility.

R&M’s ActiPower connector strips for power distribution can now also be integrated using drag and drop, meaning that with inteliPhy 6.0, iPDUs can be integrated at the click of a mouse. They are assigned to object identifiers (OIDs), and their data is scaled and named in a way that is easy to understand. This allows the operator to manage power distribution in the racks without errors.

R&M offers the inteliPhy trial versions here. For more from R&M, click here.
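To see why a height service matters when estimating underground cable lengths between geolocated sites, consider the simple geometry below. This is generic trigonometry with made-up coordinates, not R&M's inteliPhy algorithm; real routes follow the terrain profile, so in practice the effect compounds along the run:

```python
# Illustrative calculation of how elevation affects a cable-length estimate
# between two geolocated sites. Generic geometry with invented coordinates;
# not R&M's inteliPhy algorithm.
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle (surface) distance between two lat/lon points, in metres."""
    r = 6_371_000  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Two hypothetical sites roughly 800 m apart with a 200 m elevation difference.
site_a = (47.3769, 8.5417, 408.0)   # lat, lon, elevation (m)
site_b = (47.3841, 8.5417, 608.0)

flat = haversine_m(site_a[0], site_a[1], site_b[0], site_b[1])
with_height = math.hypot(flat, site_b[2] - site_a[2])   # straight-line 3D length

print(f"Surface distance:          {flat:,.0f} m")
print(f"Including elevation:       {with_height:,.0f} m")
print(f"Underestimate if ignored:  {with_height - flat:,.1f} m")
```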

Hitachi Vantara launches Virtual Storage Platform 360
Hitachi Vantara, the data storage, infrastructure, and hybrid cloud management subsidiary of Hitachi, today announced the launch of Virtual Storage Platform 360 (VSP 360), a unified management software solution designed to help customers simplify data infrastructure management operations, improve decision-making, and streamline the delivery of data services. With support for block, file, object, and software-defined storage, VSP 360 consolidates multiple management tools and aims to enable IT teams, including those with limited storage expertise, to more efficiently control hybrid cloud deployments, gain AIOps predictive insights, and simplify data lifecycle governance.

Organisations today are struggling to manage sprawling data environments spread across disparate storage systems, fragmented data silos, and complex application workloads, all while grappling with overextended IT teams and rising demands for compliance and AI readiness. A recent survey showed that AI has led to a dramatic increase in the amount of data storage businesses require, with the amount of data expected to increase by 122% by 2026. The survey also revealed that many IT leaders are being forced to implement AI before their data infrastructure is ready to handle it, with many embarking on a journey of experimentation, hoping to find additional ways to recover some of the cost of their investments.

VSP 360 seeks to address these obstacles by integrating data management tools across enterprise storage to monitor key performance indicators, including storage capacity utilisation and overall system health, helping to deliver optimal performance and efficient resource management. It intends to improve end-to-end visibility, leveraging AIOps observability to break down data silos, as well as streamlining the deployment of VSP One data services.

“VSP 360 represents a bold step forward in unifying the way enterprises manage their data,” says Octavian Tanase, Chief Product Officer, Hitachi Vantara. “It’s not just a new management tool—it’s a strategic approach to modern data infrastructure that gives IT teams complete command over their data, wherever it resides. With built-in AI and automation and by making it available via SaaS, Private, or via your mobile phone, we're empowering our customers to make faster, smarter decisions and eliminate the traditional silos that slow innovation.”

“VSP 360 gives our customers the unified visibility and control they’ve been asking for,” claims Dan Pittet, Senior Solutions Architect, Stoneworks Technologies. “The ability to manage block, file, object, and software-defined storage from a single AI-driven platform helps streamline operations and reduce complexity across hybrid environments. It’s especially valuable for IT teams with limited resources who need to respond quickly to evolving data demands without compromising performance or governance.”

“VSP 360 hits the mark for what modern enterprises need,” states Ashish Nadkarni, Group Vice President and General Manager, Worldwide Infrastructure Research, IDC. “It goes beyond monitoring to deliver true intelligence across the storage lifecycle. The solution's robust data resiliency helps businesses maintain continuous operations and protect their critical assets, even in the face of unexpected disruptions. By integrating advanced analytics, automation, and policy enforcement, Hitachi Vantara is giving customers the agility and resilience needed to thrive in a data-driven economy.”

For more from Hitachi, click here.
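To illustrate how a growth figure like the 122% cited above feeds into the capacity-utilisation monitoring that VSP 360 targets, here is a minimal, generic calculation; the baseline capacities below are assumed for illustration and are not Hitachi Vantara figures:

```python
# Simple capacity-planning arithmetic around the survey figure quoted above
# (stored data expected to grow 122% by 2026). Baseline capacity and usage
# are illustrative assumptions.
current_capacity_tb = 2_000          # assumed installed capacity
current_used_tb = 1_300              # assumed data currently stored
growth = 1.22                        # +122% means multiplying usage by 2.22

projected_used_tb = current_used_tb * (1 + growth)
utilisation_now = current_used_tb / current_capacity_tb
utilisation_2026 = projected_used_tb / current_capacity_tb

print(f"Current utilisation:   {utilisation_now:.0%}")
print(f"Projected data (2026): {projected_used_tb:,.0f} TB")
print(f"Utilisation in 2026:   {utilisation_2026:.0%}  -> capacity shortfall")
```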

NetApp builds AI infrastructure on NVIDIA AI Data Platform
NetApp, the intelligent data infrastructure company, has announced that it is working with NVIDIA to support the NVIDIA AI Data Platform reference design in the NetApp AIPod solution to accelerate enterprise adoption of agentic AI. Powered by NetApp ONTAP, NetApp AIPod deployments built on the NVIDIA AI Data Platform aim to help businesses build secure, governed, and scalable AI data pipelines for retrieval-augmented generation (RAG) and inferencing.

As an NVIDIA-Certified Storage partner leveraging the NVIDIA AI Data Platform, NetApp gives NetApp AIPod users data infrastructure with built-in intelligence. NetApp intends to give customers confidence that they have the enterprise data management capabilities and scalable multi-tenancy needed to eliminate data silos so they can develop and operate high-performance AI factories and deploy agentic AI to solve real-world business problems.

“A unified and comprehensive understanding of business data is the vehicle that will help companies drive competitive advantage in the era of intelligence, and AI inferencing is the key,” says Sandeep Singh, Senior Vice President and General Manager of Enterprise Storage at NetApp. “We have always believed that a unified approach to data storage is essential for businesses to get the most out of their data. The rise of agentic AI has only reinforced that truly unified data storage goes beyond just multi-protocol storage. Businesses need to eliminate silos throughout their entire IT environment, whether on-premises or in the cloud, and across every business function, and we are working with NVIDIA to deliver connected storage for the unique demands of AI.”

The NetApp AIPod solution built on the NVIDIA AI Data Platform incorporates NVIDIA accelerated computing to run NVIDIA NeMo Retriever microservices and connects these nodes to scalable storage. Using this reference design enables customers to scan, index, classify, and retrieve large stores of private and public documents in real time. The intention is to augment AI agents as they reason and plan to solve complex, multistep problems, helping enterprises turn data into knowledge and boost agentic AI accuracy across many use cases.

“Agentic AI enables businesses to solve complex problems with superhuman efficiency and accuracy, but only as long as agents and reasoning models have fast access to high-quality data,” says Rob Davis, Vice President of Storage Technology at NVIDIA. “The NVIDIA AI Data Platform reference design and NetApp’s high-powered storage and mature data management capabilities bring AI directly to business data and drive unprecedented productivity.”

For more from NetApp, click here.
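The RAG workflow described above — indexing large document stores and retrieving relevant passages for agents in real time — can be reduced to a very small sketch. The example below uses bag-of-words vectors as a stand-in for the learned embedding models and vector index a NeMo Retriever-based pipeline would actually use; the document texts and function names are invented for illustration:

```python
# Minimal retrieval step of a RAG pipeline: "embed" documents, embed the query,
# return the closest matches. Bag-of-words vectors stand in for real embedding
# models so the sketch stays self-contained; all content is invented.
import math
from collections import Counter

def embed(text: str) -> Counter:
    """Toy 'embedding': a bag-of-words term-frequency vector."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

documents = [
    "quarterly seismic survey results for the north field",
    "employee onboarding checklist and HR policies",
    "maintenance schedule for subsea pumping equipment",
]
index = [(doc, embed(doc)) for doc in documents]

def retrieve(query: str, k: int = 2):
    q = embed(query)
    ranked = sorted(index, key=lambda item: cosine(q, item[1]), reverse=True)
    return [doc for doc, _ in ranked[:k]]

# The retrieved passages would then be appended to the model's prompt.
print(retrieve("seismic survey data"))
```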


