Data


The hidden cost of overuse and misuse of data storage
Most organisations are storing far more data than they use and, while keeping it “just in case” might feel like the safe option, it’s a habit that can quietly chip away at budgets, performance, and even sustainability goals. At first glance, storing everything might not seem like a huge problem. But when you factor in rising energy prices and ballooning data volumes, the cracks in that strategy start to show. Over time, outdated storage practices, from legacy systems to underused cloud buckets, can become a surprisingly expensive problem.

Mike Hoy, Chief Technology Officer at UK edge infrastructure provider Pulsant, explores this growing challenge for UK businesses:

More data, more problems

Cloud computing originally promised a simple solution: elastic storage, pay-as-you-go, and endless scalability. But in practice, this flexibility has led many organisations to amass sprawling, unmanaged environments. Files are duplicated, forgotten, or simply left idle – all while costs accumulate. Many businesses also remain tied to on-premises legacy systems, either from necessity or inertia. These older infrastructures typically consume more energy, require regular maintenance, and provide limited visibility into data usage. Put unmanaged cloud and outdated on-premises systems together and you have a recipe for inefficiency.

The financial sting of bad habits

Most IT leaders understand that storing and securing data costs money. What often gets overlooked are the hidden costs: backing up low-value data, the power consumption of idle systems, or the surprise charges from cloud services that are not being monitored properly. Then there’s the operational cost. Disorganised or poorly labelled data makes access slower and compliance tougher. It also increases security risks, especially if sensitive information is spread across uncontrolled environments. The longer these issues go unchecked, the greater the danger of a snowball effect.

Smarter storage starts with visibility

The first step towards resolving these issues isn’t deleting data indiscriminately; it’s understanding what’s there. Carrying out an infrastructure or storage audit can shed light on what’s being stored, who’s using it, and whether it still serves a purpose. Once you have that visibility, you can start making smarter decisions about what stays, what goes, and what gets moved somewhere more cost-effective. This is where a hybrid approach, combining cloud, on-premises, and edge infrastructure, comes into play. It lets businesses tailor their storage to the job at hand, reducing waste while improving performance.

Why edge computing is part of the solution

Edge computing isn’t just a tech buzzword; it’s an increasingly practical way to harness data where it’s generated. By processing information at the edge, organisations can act on insights faster, reduce the volume of data stored centrally, and ease the load on core networks and systems. Using regional edge data centres or local processing units, businesses can filter and process data closer to its source, sending only essential information to the cloud or core infrastructure. This reduces storage and transmission costs and helps prevent the build-up of redundant or low-value data that can silently increase expenses over time.
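To make the edge-filtering idea above concrete, here is a minimal, hypothetical sketch of how a local node might decide what to forward centrally; the thresholds, reading format, and sensor names are assumptions for illustration, not Pulsant's implementation.

```python
# Illustrative only: keep routine telemetry at the edge and forward just the
# out-of-range readings to central storage, cutting transmission and storage.
from dataclasses import dataclass

@dataclass
class Reading:
    sensor_id: str
    temperature_c: float

LOW, HIGH = 18.0, 27.0  # assumed acceptable operating band for the example

def filter_at_edge(readings: list[Reading]) -> list[Reading]:
    """Return only the readings worth sending upstream; everything else can be
    aggregated or discarded locally."""
    return [r for r in readings if not (LOW <= r.temperature_c <= HIGH)]

if __name__ == "__main__":
    batch = [
        Reading("rack-07", 21.4),   # in range: stays at the edge
        Reading("rack-12", 31.9),   # anomaly: forwarded centrally
    ]
    to_forward = filter_at_edge(batch)
    print(f"Forwarding {len(to_forward)} of {len(batch)} readings upstream")
```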
This approach is particularly valuable in data-heavy industries such as healthcare, logistics, and manufacturing, where large volumes of real-time information are produced daily. Processing data locally enables businesses to store less, move less, and act faster.

The wider payoff

Cutting storage costs is an obvious benefit, but it’s far from the only one. A smarter, edge-driven strategy helps businesses build a more efficient, resilient, and sustainable digital infrastructure:

• Lower energy usage — By processing and filtering data locally, organisations reduce the energy demands of transmitting and storing large volumes centrally, supporting both carbon reduction targets and lower utility costs. As sustainability reporting becomes more critical, this can also help meet Scope 2 emissions goals.

• Faster access to critical data — When the most important data is processed closer to its source, teams can respond in real time, meaning improved decision-making, customer experience, and operational agility.

• Greater resilience and reliability — Local processing means organisations are less dependent on central networks. If there’s an outage or disruption, edge infrastructure can provide continuity, keeping key services running when they’re needed most.

• Improved compliance and governance — By keeping sensitive data within regional boundaries and only transmitting what’s necessary, businesses can simplify compliance with regulations such as GDPR, while reducing the risk of data sprawl and shadow IT.

Ultimately, it’s about creating a storage and data environment that’s fit for modern demands. It needs to be fast, flexible, efficient, and aligned with wider business priorities.

Don’t let storage be an afterthought

Data is valuable - but only when it's well managed. When storage becomes a case of “out of sight, out of mind,” businesses end up paying more for less. And what do they have to show for it? Ageing infrastructure and bloated cloud bills.

A little housekeeping goes a long way. By adopting modern infrastructure strategies, including edge computing and hybrid storage models, businesses can transform data storage from a hidden cost centre into a source of operational efficiency and competitive advantage.

For more from Pulsant, click here.

365 Data Centers, Megaport grow partnership
365 Data Centers (365), a provider of network-centric colocation, network, cloud, and other managed services, has announced a further expansion of its partnership with Megaport, a global Network-as-a-Service (NaaS) provider.

Megaport has broadened its 365 footprint by adding Points of Presence (PoPs) at several of 365 Data Centers’ colocation facilities - namely Alpharetta, GA; Aurora, CO; Boca Raton, FL; Bridgewater, NJ; Carlstadt, NJ; and Spring Garden, PA - enhancing the public cloud and other connectivity options available to 365’s customers.

These customers will now be able to access DIA, Transport, and direct-to-cloud connectivity to all the major public cloud hyperscalers - such as Amazon Web Services (AWS), Microsoft Azure, Google Cloud Platform (GCP), Oracle Cloud, and IBM Cloud - directly from 365 Data Centers.

“Integrating Megaport’s advanced connectivity solutions into our data centers is a natural progression of our partnership and network-centric strategy," comments Derek Gillespie, CRO at 365 Data Centers. "When we’ve added to Megaport’s presence in other facilities, the deployments [have] fortified our joint Infrastructure-as-a-Service (IaaS) and NaaS offerings and complemented our partnership in major markets.

"Megaport’s growing presence with 365 significantly enhances the public cloud connectivity options available to our customers.”

Michael Reid, CEO at Megaport, adds, “Our expanded partnership with 365 Data Centers is all about pushing boundaries and delivering more for our customers.

"Together, we’re making cutting-edge network solutions easier to access, no matter the size or location of the business, so customers can connect, scale, and innovate on their terms.”

For more from 365 Data Centers, click here.

Summer habits could increase cyber risk to enterprise data
As flexible work arrangements expand over the summer months, cybersecurity experts are warning businesses about the risks associated with remote and ‘workation’ models, particularly when employees access corporate systems from unsecured environments.

According to Andrius Buinovskis, Cybersecurity Expert at NordLayer - a provider of network security services for businesses - working from abroad or outside traditional office settings can increase the likelihood of data breaches if not properly managed. The main risks include use of unsecured public Wi-Fi, reduced vigilance against phishing scams, use of personal or unsecured devices, and exposure to foreign jurisdictions with weaker data protection regulations. Devices used outside the workplace are also more susceptible to loss or theft, further raising the threat of data exposure.

Andrius recommends the following key measures to mitigate risk:

• Strong network encryption — Encryption secures data in transit, transforming it into an unreadable format and safeguarding it from potential attackers.

• Multi-factor authentication — Access controls such as multi-factor authentication make it more difficult for cybercriminals to access accounts with stolen credentials, adding a layer of protection.

• Robust password policies — Hackers can easily target and compromise accounts protected by weak, reused, or easy-to-guess passwords. Enforcing strict password management policies requiring unique, long, and complex passwords, and educating employees on how to store them securely, minimises the possibility of falling victim to cybercriminals (a minimal illustrative check appears after this piece).

• Zero trust architecture — Constant verification of all devices and users trying to access the network significantly reduces the possibility of a hacker successfully infiltrating the business.

• Network segmentation — If a bad actor does manage to infiltrate the network, ensuring it's segmented helps to minimise the potential damage. Not granting all employees access to the whole network, and limiting it to the parts essential for their work, helps reduce the scope of the data an infiltrator can access.

He also highlights the importance of centralised security and regular staff training on cyber hygiene, especially when using personal devices or accessing systems while travelling. “High observability into employee activity and centralised security are crucial for defending against remote work-related cyber threats,” he argues.
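Following the password-policy point above, here is a minimal, hypothetical sketch of the kind of length-and-complexity check such a policy implies; the thresholds and character-class rules are assumptions for illustration, not NordLayer requirements.

```python
# Illustrative password-policy check: minimum length plus basic character-class
# complexity. Real policies would also screen for reuse and breached passwords.
import re

def meets_policy(password: str, min_length: int = 14) -> bool:
    """Return True if the password satisfies this example policy."""
    checks = [
        len(password) >= min_length,                       # long enough
        re.search(r"[a-z]", password) is not None,         # lowercase letter
        re.search(r"[A-Z]", password) is not None,         # uppercase letter
        re.search(r"\d", password) is not None,            # digit
        re.search(r"[^A-Za-z0-9]", password) is not None,  # symbol
    ]
    return all(checks)

print(meets_policy("Summer2025"))                # False: too short, no symbol
print(meets_policy("correct-Horse-Battery-7!"))  # True under this example policy
```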

Siemens enters collaboration with Microsoft
Siemens Smart Infrastructure, a digital infrastructure division of German conglomerate Siemens, today announced a collaboration agreement with Microsoft to transform access to Internet of Things (IoT) data for buildings.

The collaboration will enable interoperability between Siemens' digital building platform Building X and Microsoft Azure IoT Operations, a component of Microsoft's adaptive cloud approach, providing tools and infrastructure to connect edge devices while integrating data.

The interoperability of Building X and Azure IoT Operations seeks to make IoT-based data more accessible for large enterprise customers across commercial buildings, data centres, and higher education facilities, and provide them with the information to enhance sustainability and operations. It enables automatic onboarding and monitoring by bringing datapoints such as temperature, pressure, or indoor air quality to the cloud for assets like heating, ventilation, and air conditioning (HVAC) systems, valves, and actuators. The system should also allow customers to develop their own in-house use cases such as energy monitoring and space optimisation.

The collaboration leverages known and established open industry standards, including the World Wide Web Consortium (W3C) Web of Things (WoT), which describes the metadata and interfaces of hardware and software, as well as Open Platform Communications Unified Architecture (OPC UA) for communication of data to the cloud. Both Siemens and Microsoft are members of the W3C and the OPC Foundation, which develops standards and guidelines that help build an industry based on accessibility, interoperability, privacy, and security.

“This collaboration with Microsoft reflects our shared vision of enabling customers to harness the full potential of IoT through open standards and interoperability,” claims Susanne Seitz, CEO, Siemens Smart Infrastructure Buildings. “The improved data access will provide portfolio managers with granular visibility into critical metrics such as energy efficiency and consumption. With IoT data often being siloed, this level of transparency is a game-changer for an industry seeking to optimise building operations and meet sustainability targets.”

“Siemens shares Microsoft’s focus on interoperability and open IoT standards. This collaboration is a significant step forward in making IoT data more actionable,” argues Erich Barnstedt, Senior Director & Architect, Corporate Standards Group, Microsoft. “Microsoft’s strategy underscores our commitment to partnering with industry leaders to empower customers with greater choice and control over their IoT solutions.”

The interoperability between Siemens’ Building X and Azure IoT Operations will be available on the market from the second half of 2025.

For more from Siemens, click here.
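As a rough illustration of the W3C WoT standard mentioned above, here is a minimal, hypothetical Thing Description for a building sensor, expressed as a Python dictionary; the device, property, and URL are invented for the example and do not represent Building X or Azure IoT Operations specifics.

```python
# Minimal illustrative W3C Web of Things (WoT) Thing Description describing the
# metadata and read interface of a hypothetical supply-air temperature sensor.
import json

thing_description = {
    "@context": "https://www.w3.org/2019/wot/td/v1",
    "title": "ahu-01-supply-air-temperature",   # hypothetical asset name
    "securityDefinitions": {"basic_sc": {"scheme": "basic", "in": "header"}},
    "security": ["basic_sc"],
    "properties": {
        "temperature": {
            "type": "number",
            "unit": "degree Celsius",
            "readOnly": True,
            "forms": [{
                "href": "https://edge-gateway.example.com/ahu-01/temperature",
                "op": ["readproperty"]
            }]
        }
    }
}

print(json.dumps(thing_description, indent=2))
```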

Nasuni achieves AWS Energy & Utilities Competency status
Nasuni, a unified file data platform company, has announced that it has achieved Amazon Web Services (AWS) Energy & Utilities Competency status. This designation recognises that Nasuni has demonstrated expertise in helping customers leverage AWS cloud technology to transform complex systems and accelerate the transition to a sustainable energy and utilities future.

To receive the designation, AWS Partners undergo a rigorous technical validation process, including a customer reference audit. The AWS Energy & Utilities Competency makes it easier for energy and utilities customers to select skilled partners to help accelerate their digital transformations.

"Our strategic collaboration with AWS is redefining how energy companies harness seismic data,” comments Michael Sotnick, SVP of Business & Corporate Development at Nasuni. “Together, we’re removing traditional infrastructure barriers and unlocking faster, smarter subsurface decisions. By integrating Nasuni’s global unified file data platform with the power of AWS solutions including Amazon Simple Storage Service (S3), Amazon Bedrock, and Amazon Q, we’re helping upstream operators accelerate time to first oil, boost capital efficiency, and prepare for the next era of data-driven exploration."

AWS says it enables scalable, flexible, and cost-effective solutions for organisations ranging from startups to global enterprises. To support the integration and deployment of these solutions, AWS established the AWS Competency Program to help customers identify AWS Partners with industry experience and expertise.

By bringing together Nasuni’s cloud-native file data platform with Amazon S3 and other AWS services, the company claims energy customers could eliminate data silos, reduce interpretation cycle times, and unlock the value of seismic data for AI-driven exploration.

For more from Nasuni, click here.

Chemists create molecular magnet, boosting data storage by 100x
Scientists at The University of Manchester have designed a molecule that can remember magnetic information at the highest temperature ever recorded for this kind of material.

In a boon for the future of data storage technologies, the researchers have made a new single-molecule magnet that retains its magnetic memory up to 100 Kelvin (-173 °C) – around the temperature of the moon at night. The finding, published in the journal Nature, is a significant advancement on the previous record of 80 Kelvin (-193 °C).

While still a long way from working in a standard freezer, or at room temperature, data storage at 100 Kelvin could be feasible in huge data centres, such as those used by Google. If perfected, these single-molecule magnets could pack vast amounts of information into incredibly small spaces – possibly more than three terabytes of data per square centimetre. That’s around half a million TikTok videos squeezed into a hard drive the size of a postage stamp.

The research was led by The University of Manchester, with computational modelling led by the Australian National University (ANU).

David Mills, Professor of Inorganic Chemistry at The University of Manchester, comments, “This research showcases the power of chemists to deliberately design and build molecules with targeted properties. The results are an exciting prospect for the use of single-molecule magnets in data storage media that is 100 times more dense than the absolute limit of current technologies.

“Although the new magnet still needs cooling far below room temperature, it is now well above the temperature of liquid nitrogen (77 Kelvin), which is a readily available coolant. So, while we won’t be seeing this type of data storage in our mobile phones for a while, it does make storing information in huge data centres more feasible.”

Magnetic materials have long played an important role in data storage technologies. Currently, hard drives store data by magnetising tiny regions made up of many atoms all working together to retain memory. Single-molecule magnets can store information individually and don’t need help from any neighbouring atoms to retain their memory, offering the potential for incredibly high data density. But, until now, the challenge has always been the incredibly cold temperatures needed for them to function.

The key to the new magnet’s success is its unique structure, with the element dysprosium located between two nitrogen atoms. These three atoms are arranged almost in a straight line – a configuration predicted to boost magnetic performance, but now realised for the first time. Usually, when dysprosium is bonded to only two nitrogen atoms, it tends to form molecules with more bent or irregular shapes. In the new molecule, the researchers added a chemical group called an alkene that acts like a molecular pin, binding to dysprosium to hold the structure in place.

The team at the Australian National University developed a new theoretical model to simulate the molecule’s magnetic behaviour, allowing them to explain why this particular molecular magnet performs so well compared to previous designs. Now, the researchers will use these results as a blueprint to guide the design of even better molecular magnets.
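As a rough sanity check of the density comparison above, assume a postage stamp of about 5 cm² and an average short-form video of about 30 MB; both figures are assumptions for illustration, not from the study.

```latex
\[
3~\mathrm{TB/cm^2} \times 5~\mathrm{cm^2} = 15~\mathrm{TB},
\qquad
\frac{15~\mathrm{TB}}{30~\mathrm{MB\ per\ video}} \approx 5 \times 10^{5}~\text{videos}
\]
```

Under those assumptions, the result is consistent with the article's "around half a million TikTok videos" comparison.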

'More than a third of UK businesses unprepared for AI risks'
Despite recognising artificial intelligence (AI) as a major threat, with nearly a third (30%) of UK organisations surveyed naming it among their top three risks, many remain significantly unprepared to manage AI risk.

Recent research from CyXcel, a global cyber security consultancy, highlights a concerning gap: nearly a third (29%) of UK businesses surveyed have only just implemented their first AI risk strategy - and 31% don’t have any AI governance policy in place. This critical gap exposes organisations to substantial risks including data breaches, regulatory fines, reputational harm, and critical operational disruptions, especially as AI threats continue to grow and rapidly evolve.

CyXcel’s research shows that nearly a fifth (18%) of UK and US companies surveyed are still not prepared for AI data poisoning, a type of cyberattack that targets the training datasets of AI and machine learning (ML) models, or for a deepfake or cloning security incident (16%).

Responding to these mounting threats and geopolitical challenges, CyXcel has launched its Digital Risk Management (DRM) platform, which aims to provide businesses with insight into evolving AI risks across major sectors, regardless of business size or jurisdiction. The DRM seeks to help organisations identify risks and implement the right policies and governance to mitigate them.

Megha Kumar, Chief Product Officer and Head of Geopolitical Risk at CyXcel, comments, “Organisations want to use AI but are worried about risks – especially as many do not have a policy and governance process in place. The CyXcel DRM provides clients across all sectors, especially those that have limited technological resources in house, with a robust tool to proactively manage digital risk and harness AI confidently and safely.”

Edward Lewis, CEO of CyXcel, adds, “The cybersecurity regulatory landscape is rapidly evolving and becoming more complex, especially for multinational organisations. Governments worldwide are enhancing protections for critical infrastructure and sensitive data through legislation like the EU’s Cyber Resilience Act, which mandates security measures such as automatic updates and incident reporting. Similarly, new laws are likely to arrive in the UK next year which introduce mandatory ransomware reporting and stronger regulatory powers. With new standards and controls continually emerging, staying current is essential.”

'AI is the new oil—and data centres are the refineries'
With AI adoption reshaping global industries, Straightline Consulting’s Managing Director, Craig Eadie, shares his insights on how data centres are powering the GenAI revolution:

"The age of AI is here. Generative artificial intelligence (GenAI) is rewriting the rulebook when it comes to everything from software development and call centre productivity to copywriting — boosting efficiency and, depending on who you ask, on track to raise the GDP of industrialised nations by 10-15% over the next decade.

"The impact of AI will reshape the global economy over the coming years, consolidating value among the companies that successfully capitalise on this moment — and disrupting those that don’t. The 'arms race' to develop the next generation of AI technologies — like Google’s new Veo 3 video generation tool, released at the start of June, which is already making headlines for its ability to allow anyone willing to pay $249 per month to create hauntingly lifelike, realistic videos of everything from kittens playing to election fraud — is accelerating as well. AI has become the new oil: the global fuel for economic growth. Unlike oil, however, GenAI alone isn’t valuable. Rather, its power lies in the ability to apply GenAI models to data. That process, akin to refining crude into petroleum, happens in the data centre.

"Productivity is far from the only thing GenAI is turbocharging. This rush to build, train, and operate new GenAI models is also accelerating the race to build the digital infrastructure that houses them. Goldman Sachs predicts that global power demand from data centres will increase 50% by 2027 and by as much as 165% by the end of the decade, largely driven by GenAI adoption.

"As someone working in the data centre commissioning sector, it’s impossible to overstate the impact that GenAI is having, and will continue to have, on our industry. GenAI has exploded our predictions. It’s even bigger than anyone anticipated. The money, the scale, the speed — demand is growing even faster than the most optimistic projections pre-2023. By the end of 2025, almost half of all the power data centres consume globally could be used to power AI systems.

"The data centre commissioning space we’re operating in today has transformed dramatically. On the construction and design side, huge changes, not just in how buildings are constructed but in the technology inside those buildings, are reshaping how we commission them.

"The battle to capitalise on the GenAI boom is a battle to overcome three challenges: access to power, materials, and talent.

"GenAI requires an order of magnitude more power than traditional colocation or cloud workloads. As a result, there are serious concerns about power availability across Europe, especially in the UK. We can’t build the data centres we need to capitalise on the GenAI boom because there’s just not enough power. There are some encouraging signs that governments are taking this challenge seriously. For example, the UK government has responded by creating 'AI Growth Zones' to unlock investment in AI-enabled data centres by improving access to power and providing planning support in some areas of the country. The European Union’s AI Continent Plan also includes plans to build large-scale AI data and computing infrastructures, including at least 13 operational 'AI factories' by 2026 and up to five 'gigafactories' at some point after that.

"However, power constraints and baroque planning and approvals processes threaten to undermine these efforts.
Multiple data centre markets are already facing pushback from local councils and communities against new infrastructure because of its effect on power grids and local water supplies. Dublin and Amsterdam had already stymied new builds even before the GenAI boom. This comes with risk, because AI engines can be built anywhere. GDPR means data must be housed in-country, but if Europe and the UK don’t move faster, large US AI firms will resort to building their massive centres stateside and deploying the tech across the Atlantic later. Once an AI engine is trained, it can run on less demanding infrastructure. We risk stifling the AI industry in Europe and the UK if we don’t start building faster and making more power available today.

"The other key constraints are access to raw materials and components. Global supply chain challenges have spiked the cost of construction materials, and the lead times for data-centre-specific components like cooling equipment can be as much as six months, further complicating the process of building new infrastructure.

"Access to talent is another pain point that threatens to slow the industry at a time when it should be speeding up. Commissioning is a vital part of the data centre design, construction, and approvals process, and our sector is facing a generational talent crisis. There isn’t enough young talent coming into the sector. That has to change across the board — not just in commissioning, but for project managers, consultants, everyone, everywhere. The pain point is particularly acute in commissioning, however, because of the sector’s relatively niche pipeline and stringent requirements. You can’t just walk in off the street and become a commissioning engineer. The field demands a solid background in electrical or mechanical engineering, or a route in through a trade. Right now, the pipelines meant to produce the next generation of data centre commissioning professionals just aren’t producing the number of new hires the industry needs.

"This obviously affects all data centre commissioning, not just AI. The scale of demand and the speed at which the industry is moving mean this risks becoming a serious pinch point not too far down the line.

"Looking at the next few years, it’s impossible to say exactly where we’re headed, but it’s clear that, unless Europe and the UK can secure access to reliable, affordable energy, as well as clear the way for data centre approvals to move quickly, pain points like the industry talent shortage and rising materials costs (not to mention lead times) threaten to leave the region behind in the race to capture, refine, and capitalise on the new oil: GenAI."

UKRI invests £22 million into data spending
UK Research and Innovation (UKRI) has invested £22 million into data spending and staff over the past three years, underscoring the organisation's strategic commitment to data as a cornerstone of national research and innovation.

Data is playing an increasingly vital role, particularly as artificial intelligence (AI) is rolled out across government departments, with 70% of government bodies already piloting or planning to use AI, highlighting the urgent need for high-quality, structured, and secure data. This development marks a 70% increase in salary investment in just two years, reflecting both rising headcounts and the increasing value of data expertise in shaping the UK’s research landscape.

Stuart Harvey, CEO of Datactics, comments, “Both businesses and government departments are keen to implement AI into their business functions but are overlooking the fundamental truth that AI is only as good as the data it learns from. Hiring challenges are becoming an increasing problem, but businesses should follow in UKRI's footsteps to invest in data spending and staff, and upskill their teams in data management, governance, and quality to improve data readiness.

“AI is only as effective as the data it processes and, without structured, accurate, and well-governed data, businesses risk AI systems that are flawed. The rush to deploy AI without a strong data foundation is a costly mistake and, in a competitive AI landscape, only those who get their data right will be the ones who thrive.”

UKRI’s investment in its data workforce reflects the growing demand for high-quality, well-managed, and accessible data that enables researchers to collaborate, innovate, and respond to global challenges. Between 2022 and 2025, UKRI’s data-related salary investment rose by 85%, from £5.35 million to £9.89 million, reflecting both growing headcounts and the escalating value of data expertise across the UK’s research ecosystem. Over the same period, the number of staff with “data” in their job titles rose from 138 in 2022 to 203 in 2025 - a 47% increase.

Sachin Agrawal, Managing Director for Zoho UK, says, “As the UK continues to position itself as a global science and technology powerhouse, it is a welcome sight to see the organisation prioritising investment in its data workforce as a long-term commitment to data-driven research.

“In an era where public trust and data ethics are paramount, building in-house expertise is essential to ensuring that data privacy, transparency, and compliance are at the heart of our national research infrastructure. This strategic investment lays the foundation for smarter and safer technology use by UKRI."

House of Commons boosts data workforce by 50%
The UK's House of Commons has invested £7.5 million into data spending and staff over the past three years, underscoring its strategic commitment to data-driven governance.

As the public sector embraces AI at pace, with over 70% of government bodies piloting or planning AI implementation, the demand for robust data infrastructure and skilled personnel has never been greater. In response, the House of Commons has quietly ramped up hiring and spending on data roles, reflecting a broader strategic shift towards data-centric governance.

Over the past three years, the number of staff in the House of Commons with "data" in their job titles has jumped from 49 in 2022 to 73 in early 2025, marking a 49% increase. Alongside this, total salary investment for data roles rose by more than 63%, from £1.83 million to £2.98 million, excluding final April 2025 figures still pending payroll completion. The figures reflect a growing recognition within Parliament that AI innovation is only as effective as the data that underpins it.

Stuart Harvey, CEO of Datactics, comments, "There's a growing appetite across government to harness the power of AI, but what's often overlooked is that AI is only as reliable as the data it's built on. The House of Commons' investment in data roles is a critical step toward ensuring its systems are grounded in quality, governance, and accuracy.

"Hiring the right data professionals and embedding strong data practices is no longer optional; it's essential. Without it, organisations risk deploying AI that makes poor decisions based on flawed information. In this new era, those who prioritise data integrity will be the ones who gain real value from AI."

The increase in data staffing at the heart of Parliament reflects a wider cultural shift toward long-term digital resilience, ensuring that public institutions are equipped to harness AI ethically and effectively.

Richard Bovey, Head of Data at AND Digital, says, "The House of Commons is leading the way for data investment, with 66% of businesses agreeing that data investment is a top priority for their organisation, according to our recent Data Loyalty research. This move signals a long-term commitment to data-driven governance at the heart of the public sector.

"As the UK advances its position as a global leader in science and technology, building in-house data capability is vital, not only to unlock innovation, but also to ensure safeguards are embedded from the ground up, enabling institutions to innovate responsibly.

"But data alone isn't enough. Organisational culture plays a crucial role in turning insight into impact, and a culture that truly values curiosity, empathy, and accountability is what transforms data points into better decisions and more meaningful outcomes. By investing in its data workforce, the House of Commons is laying a robust foundation for smarter, more ethical, and future-ready public services. It's a necessary step toward creating a public sector that is both digitally progressive and aligned with democratic values."


