Monday, March 10, 2025

Data


Broadband Forum launches trio of new open broadband projects
An improved user experience, including reduced latency and a wider choice of in-home applications, will be delivered to broadband consumers as the Broadband Forum launches three new projects. The three new open broadband projects will provide open source software blueprints for application providers and Broadband Service Providers (BSPs) to follow. These will deliver a foundation for Artificial Intelligence (AI) and Machine Learning (ML) for network automation, additional tools for network latency and performance measurements, and on-demand connectivity for different applications.

“These new projects will play a key role in improving network performance measurement and monitoring and the end-user experience,” says Broadband Forum Technical Chair, Lincoln Lavoie. “Open source software is a crucial component in providing the blueprint for BSPs to follow and we invite interested companies to get involved.”

The new Open Broadband-CloudCO-Application Software Development Kit (OB-CAS), Open Broadband – Simple Two-Way Active Measurement Protocol (OB-STAMP), and Open Broadband – Subscriber Session Steering (OB-STEER) projects will bring together software developers and standards experts from the forum. The projects will deliver open source reference implementations, which are examples of how Broadband Forum specifications can be implemented. They act as a starting point for application developers to base their designs on. In turn, those applications are available on platforms for BSPs to select and offer to their customers, shortening the path from the development of a specification to the first deployment of the technologies into the network.

“The development of open source software and open broadband standards are invaluable to the industry, laying the foundations for faster innovation through global collaboration,” says Broadband Forum CEO, Craig Thomas.
“The Broadband Forum places the end-user experience at the forefront of all of our projects and is playing a crucial role in overcoming network issues.”

OB-CAS aims to simplify network monitoring and maintenance for BSPs, while also offering a wider selection of applications from various software vendors. Alongside this, network operations will be simplified and automated through existing Broadband Forum cloud standards that use AI and ML to improve the end-user experience.

OB-STAMP will build an easy-to-deploy component that simplifies network performance measurement between the Customer Premises Equipment and the IP edge. The project will allow BSPs to proactively monitor their subscribers’ home networks to measure latency and, ultimately, avoid network failures. Four vendors have already signed up to join the efforts to reduce the cost and time associated with deploying infrastructure for measuring network latency.

Building on the Broadband Forum’s upcoming technical report WT-474, OB-STEER will create a reference implementation of the Subscriber Session Steering architecture to deliver flexible, on-demand connectivity and simplify network management. Interoperability of Subscriber Session Steering is particularly important, as the architecture will be implemented in access network and edge equipment from various vendors.
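The two-way measurement approach behind OB-STAMP can be illustrated with the timestamp arithmetic it relies on. The sketch below is a simplified illustration of the delay calculation used in STAMP (RFC 8762), with invented example timestamps; it is not the project's actual code or packet format:

```python
# Illustrative sketch: computing round-trip network latency from the four
# timestamps exchanged in a two-way active measurement (as in STAMP, RFC 8762).
# t1: sender transmit, t2: reflector receive, t3: reflector transmit,
# t4: sender receive. Subtracting the reflector's processing time (t3 - t2)
# isolates the time actually spent on the network path.

def round_trip_delay(t1: float, t2: float, t3: float, t4: float) -> float:
    """Network round-trip delay in seconds, excluding reflector processing time."""
    return (t4 - t1) - (t3 - t2)

# Example: probe sent at t=100.000s, reflected after 1 ms of processing,
# received back 10 ms after sending: 9 ms of that is network delay.
rtt = round_trip_delay(100.000, 100.0045, 100.0055, 100.010)
print(f"{rtt * 1000:.1f} ms")
```

Because the reflector reports both its receive and transmit timestamps, the sender needs no clock synchronisation with the far end to measure round-trip latency.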

Five considerations when budgeting for enterprise storage
By Eric Herzog, Chief Marketing Officer at Infinidat. Enterprise storage is fundamental to maintaining a strong enterprise data infrastructure. While storage has evolved over the years, the basic characteristics remain the same – performance, reliability, cost-effectiveness, flexibility, capacity, cyber resilience, and usability. The rule of thumb in enterprise storage is to look for faster, cheaper, easier and bigger capacity, but in a smaller footprint. So, when you’re reviewing which storage solutions to entrust your enterprise to, what factors should you be considering? What are the five key considerations that have risen to the top of enterprise storage buying decisions?

• Safeguard against cyber attacks, such as ransomware and malware, by increasing your enterprise’s cyber resilience and cyber recovery with automated cyber protection.
• Look to improve the performance of your enterprise storage infrastructure by up to 2.5x (or more), while simultaneously consolidating storage to save costs.
• Evaluate the optimal balance between your enterprise’s use of on-premises storage and the public cloud (e.g. Microsoft Azure or Amazon AWS).
• Extend cyber detection across your storage estate.
• Initiate a conversation about infrastructure consumption services that are platform-centric, automated, and optimised for hybrid, multi-cloud environments.

The leading edge of enterprise storage has already moved into the next generation of storage arrays for all-flash and hybrid configurations. With cybercrime expected to cost enterprises worldwide in excess of £7.3 trillion in 2024, according to Cybersecurity Ventures, the industry has also seen a rise in cybersecurity capabilities being built into primary and secondary storage. Seamless hybrid multi-cloud support is now readily available. And enterprises are taking advantage of Storage-as-a-Service (STaaS) offerings with confidence and peace of mind.
When you’re buying enterprise storage for a refresh or for consolidation, it’s best to seek out solutions that are built from the ground up with cyber resilience and cyber recovery technology intrinsic to your storage estate, optimised by a platform-native architecture for data services. In today’s world of continuous cyber threats, enterprises are substantially extending cyber storage resilience and recovery, as well as real-world application performance, beyond traditional boundaries. We have also seen our customers value scale-up architectures, such as 60%, 80% and 100% populated models of software-defined storage arrays. This can be particularly pertinent with all-flash arrays that are aimed at specific latency-sensitive applications and workloads. Having the option to utilise a lifecycle management controller upgrade programme is also appealing when buying a next-generation storage solution. Thinking ahead, this option can extend the life of your data infrastructure. In addition, adopting next-gen storage solutions that facilitate a GreenIT approach puts your enterprise in a position to both save money (better economics) and reduce your carbon emissions (better for the environment) by using less power, less rack space, and less cooling. I call this the “E2” approach to enterprise storage: better economics and a better environment together in one solution. It helps to have faster storage devices with massive bandwidth and blistering I/O speeds.

Storage is not just about storage arrays anymore

Traditionally, if you needed more enterprise data storage capacity, you’d buy more storage arrays and throw them into your data centre. No more thought needed for storage, right? Well, not exactly.
Not only has this piecemeal approach caused small array storage 'sprawl' and complexity that can be exasperating for any IT team, but it doesn’t address the significant need to secure storage infrastructures or simplify IT operations. Cyber storage resilience and recovery need to be a critical component of an enterprise’s overall cybersecurity strategy. You need to be sure that you can safeguard your data infrastructure with cyber capabilities, such as cyber detection, automated cyber protection, and near-instantaneous cyber recovery. These capabilities are key to neutralising the effects of cyber attacks. They could mean the difference between paying a ransom for data that has been taken 'hostage' and not paying any ransom at all. When you can execute rapid cyber recovery of a known good copy of your data, you can effectively combat the cybercriminals and beat them at their own sinister game. One of the latest advancements in cyber resilience that you cannot afford to ignore is automated cyber protection, which helps you reduce the threat window for cyber attacks. With a strong automated cyber protection solution, you can seamlessly integrate your enterprise storage into your Security Operations Centre (SOC), Security Information and Event Management (SIEM), and Security Orchestration, Automation, and Response (SOAR) cyber security applications, as well as simple syslog functions for less complex environments. A security-related incident or event triggers immediate automated immutable snapshots of data, providing the ability to protect both block and file datasets. This is an extremely reliable way to ensure cyber recovery. Another dimension of modern enterprise storage is seamless configurations of hybrid multi-cloud storage. The debate about whether an enterprise should put everything into the public cloud is over.
There are very good use cases for the public cloud, but there continue to be very good use cases for on-prem storage, creating a hybrid multi-cloud environment that brings the greatest business and technical value to the organisation. You can now harness a powerful on-prem storage solution in a cloud-like experience across the entire storage infrastructure, as if the storage array you love on-premises were sitting in the public cloud. Whether you choose Microsoft Azure or Amazon AWS or both, you can extend the data services usually associated with on-prem storage to the cloud, including ease of use, automation, and cyber storage resilience. Purchasing new enterprise storage solutions is a journey. Isn’t it the best choice to get on the journey to the future of enterprise storage, cyber security, and hybrid multi-cloud? If you use these top five considerations as a guidepost, you end up in an infinitely better place for storage that transforms and transcends conventional thinking about the data infrastructure.

Infinidat at DTX 2025

Eric Herzog is a guest speaker at DTX 2025 and will be discussing “The New Frontier of Enterprise Storage: Cyber Resilience & AI” on the Advanced Cyber Strategies Stage. Join him for unique insights on 3 April 2025, from 11.15-11.40am. DTX 2025 takes place on 2-3 April at Manchester Central. Infinidat will be located at booth #C81. For more from Infinidat, click here.
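The automated cyber protection flow described earlier (a security event from a SIEM/SOAR feed triggering an immediate immutable snapshot) can be sketched roughly as follows. The `SnapshotStore` class, event format, and retention values are hypothetical, invented for illustration rather than taken from any vendor's API:

```python
# A minimal sketch of event-triggered automated cyber protection: a security
# event triggers an immediate snapshot with a retention lock, so the copy
# cannot be altered or deleted before the lock expires. All names here are
# illustrative, not a real product interface.
import time

class SnapshotStore:
    def __init__(self):
        self.snapshots = []

    def take_immutable_snapshot(self, dataset: str, retention_days: int) -> dict:
        now = time.time()
        snap = {
            "dataset": dataset,
            "taken_at": now,
            # Retention lock: the snapshot is immutable until this time.
            "locked_until": now + retention_days * 86400,
        }
        self.snapshots.append(snap)
        return snap

def on_security_event(event: dict, store: SnapshotStore) -> list:
    """On a SIEM alert, snapshot every dataset the event touches."""
    if event.get("severity", "low") not in ("high", "critical"):
        return []  # only high-severity incidents trigger protection here
    return [store.take_immutable_snapshot(ds, retention_days=30)
            for ds in event.get("affected_datasets", [])]

store = SnapshotStore()
taken = on_security_event(
    {"severity": "critical",
     "affected_datasets": ["block-vol-01", "file-share-02"]},
    store,
)
print(len(taken))  # one snapshot per affected dataset
```

The point of wiring the trigger to the SIEM rather than to a schedule is that the snapshot is taken inside the threat window, before an attacker can encrypt or delete the live data.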

Ataccama to deliver end-to-end visibility into data flows
Ataccama, the data trust company, has launched Ataccama Lineage, a new module within its Ataccama ONE unified data trust platform (V16). Ataccama Lineage provides enterprise-wide visibility into data flows, offering organisations a clear view of their data’s journey from source to consumption. It helps teams trace data origins, resolve issues quickly, and ensure compliance - enhancing transparency and building confidence in data accuracy for business decision-making. Fully integrated with Ataccama’s data quality, observability, governance, and master data management capabilities, Ataccama Lineage enables organisations to make faster, more informed decisions, such as ensuring audit readiness and meeting regulatory compliance requirements. Data challenges are increasingly complex and, according to the Ataccama Data Trust Report 2025, 41% of Chief Data Officers are struggling with fragmented and inconsistent systems. Despite significant investments in integrations, AI, and cloud applications, enterprise data often remains siloed or poor in quality. This fractured landscape obscures visibility into data transformations and flows, creating inefficiencies and operational silos. Ataccama believes that the lack of clarity hampers collaboration and increases the risk of non-compliance with regulations like GDPR, erodes customer trust, drains resources, and slows decision-making - ultimately stifling organisational growth. Ataccama Lineage is designed to simplify how organisations manage and trust their data. Its AI-powered capabilities automatically map data flows and transformations, saving time and reducing manual effort. For example, tracking customer financial data across fragmented systems is a common struggle in financial services. Ataccama Lineage provides clear, visual maps that trace issues like missing or duplicate records to their source. It also tracks sensitive data, such as PII, with audit-ready documentation to ensure compliance.
By delivering reliable, trustworthy data, Ataccama Lineage establishes a strong foundation for AI and analytics, enabling organisations to make informed decisions and achieve long-term success. Isaac Gabay, Senior Director, Data Management & Operations at Lennar, says, “As one of the nation’s leading homebuilders, Lennar is continually evolving our data foundation with best-in-class, cost-effective solutions to drive efficiency and innovation. Ataccama ONE Lineage’s detailed, visual map of data flows enables us to monitor data quality, trace issues through our ecosystem, and take a proactive approach to prevent and remediate quality concerns while maintaining centralised control. Ataccama ONE Lineage will provide unparalleled visibility, enhancing transparency, data literacy, and trust in our data. This partnership strengthens our ability to scale with confidence, deliver accurate insights, and adapt to the evolving needs of the homebuilding industry.”

Jessie Smith, VP of Data Quality at Ataccama, comments, "Managing today’s data pipelines means dealing with increasing sources, diverse data types, and transformations that impact systems upstream and downstream. The rise of AI and generative AI has amplified complexity while expanding data estates, and stricter audits demand greater transparency. Understanding how information flows across systems is no longer optional, it’s essential. Ataccama Lineage is part of the Ataccama ONE data trust platform which brings together data quality, lineage, observability and master data management into a unified solution for enterprise companies."

Key benefits of AI-powered Ataccama Lineage include:

- Faster resolution of data quality issues: Advanced anomaly detection identifies issues like missing records, unexpected values, or duplicates caused by transformation errors. For example, in retail operations with multiple sales channels, mismatched pricing or inventory discrepancies can disrupt business. Ataccama Lineage enables teams to quickly pinpoint root causes, assess downstream impacts, and resolve issues before they affect operations - ensuring continuity and reliability.
- Simplified compliance: Data classification and anomaly detection enhance visibility into sensitive data, such as PII, and track its transformations. Financial organisations benefit from audit-ready documentation that ensures PII is properly traced to authorised locations, reducing regulatory risks, meeting data privacy requirements, and fostering customer trust with transparent processes.
- Comprehensive visibility into data flows: Lineage maps provide a detailed, enterprise-wide view of data flows, from origin to dashboards and reports. Teams in sectors like manufacturing can analyse the lineage of key metrics, such as production efficiency or supply chain performance, identifying dependencies across ETL jobs, on-premises systems, and cloud platforms. Enhanced filtering focuses efforts on critical datasets, allowing faster issue resolution and better decision-making.
- Streamlined data modernisation efforts: During cloud migrations, Ataccama Lineage reduces risks by mapping redundant pipelines, dependencies, and critical datasets. Insurance companies transitioning to modern platforms can retire outdated systems and migrate only essential data, minimising disruption while maintaining compliance with regulations like Solvency II.

For more from Ataccama, click here.
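To make the lineage idea concrete, a data flow map can be modelled as a directed graph and walked upstream from a problem dataset to its root sources. This is a minimal sketch with invented dataset names, not Ataccama's implementation:

```python
# A simplified sketch of data lineage tracing: edges point from a derived
# dataset back to the datasets it was built from. Walking upstream from a
# dataset with a quality issue reveals the origin systems to investigate.
# The dataset names below are invented for illustration.
from collections import deque

# derived dataset -> datasets it was built from
upstream = {
    "exec_dashboard": ["sales_mart"],
    "sales_mart": ["crm_extract", "erp_extract"],
    "crm_extract": ["crm_db"],
    "erp_extract": ["erp_db"],
}

def root_sources(dataset: str) -> set:
    """Return the origin systems feeding a dataset, however indirectly."""
    roots, queue, seen = set(), deque([dataset]), set()
    while queue:
        node = queue.popleft()
        if node in seen:
            continue
        seen.add(node)
        parents = upstream.get(node, [])
        if not parents:          # no parents: this is an origin system
            roots.add(node)
        queue.extend(parents)
    return roots

print(sorted(root_sources("exec_dashboard")))  # ['crm_db', 'erp_db']
```

The same graph, traversed in the opposite direction, gives the downstream impact analysis mentioned above: every report and dashboard that inherits a defect from a given source.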

AirTrunk expands with second AI-ready data centre in Johor
Asia Pacific & Japan (APJ) hyperscale data centre specialist, AirTrunk, has announced plans to develop its second cloud and AI-ready data centre in Johor, Malaysia. AirTrunk JHB2 will be located in Iskandar Puteri, in the Johor region. Scalable to over 270MW, JHB2 will support demand from global public cloud and technology companies in the region. The JHB2 announcement follows the opening of AirTrunk’s first data centre in Johor, the 150+MW AirTrunk JHB1, in July 2024. Combined, AirTrunk is investing over RM 9.7 billion / A$3.5 billion in Malaysia, providing more than 420MW of IT load. JHB2, strategically located in a major availability zone, provides an end-to-end cross-border connectivity strategy for customers and the ability to scale their operations to match demand. The additional capacity will support Malaysia’s fast-growing digital economy and follows the establishment of the landmark Johor-Singapore special economic zone (JS-SEZ). Like JHB1, the new data centre will feature AirTrunk’s state-of-the-art liquid cooling technology for managing the high-density demands of AI and will ensure significant energy savings. JHB2 is designed to meet the highest standards of efficiency and security, with a low design PUE (Power Usage Effectiveness) of 1.25 and multiple renewable energy options available to customers. To support the Johor State Government’s aim to diversify water sources, AirTrunk is scoping treated greywater as a recycled sustainable water supply for its campuses’ operations. Aligned with the Malaysian Government’s focus on National Technical and Vocational Education and Training (TVET) and increasing opportunities for highly skilled workers, AirTrunk is creating jobs for Malaysians, with above-market-rate remuneration for AirTrunk employees, a 90% local workforce, and career development opportunities.
AirTrunk is also contributing to digital literacy programs and funding STEM education scholarships at the Universiti Teknologi Malaysia (UTM) to further support the local community over the long term. Advancing towards its net zero 2030 target, AirTrunk recently announced one of the largest onsite solar deployments for a data centre in Southeast Asia at JHB1, as well as the first renewable energy Virtual Power Purchase Agreement for a data centre for 30MW of renewable energy, under Malaysia’s Corporate Green Power Programme. AirTrunk is working with the leading Malaysian utility company, Tenaga Nasional Berhad (TNB) to connect JHB2 through TNB’s Green Lane Pathway for Data Centres initiative, streamlining high-voltage electricity supply to an accelerated timeframe of 12 months. AirTrunk is also providing land for TNB to build a new substation, adding resilience to the electricity distribution system in the area. This continuing collaboration, which started from an MoU signed in 2023, opens the door for AirTrunk to explore green solutions with TNB in efforts to advance the energy transition in the region. AirTrunk Founder & Chief Executive Officer, Robin Khuda, says, “As Malaysia establishes itself as a digital powerhouse, it is a privilege for AirTrunk to contribute to this growth over the long term and deliver shared benefit for the people of Malaysia. AirTrunk’s data centres serve as essential infrastructure that will help boost productivity and enable new products and services that can drive economic growth. “We are committed to helping realise the potential of cloud and AI in Malaysia and prioritising circularity for the benefit of society and the environment. AirTrunk is supporting local digital literacy and STEM initiatives, driving the energy transition and working to embed a sustainable water supply to make a positive impact.”
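For context on the design PUE figure quoted above: Power Usage Effectiveness is total facility power divided by the power delivered to IT equipment, so 1.0 would mean every watt reaches the IT load. A quick sketch, using hypothetical figures rather than AirTrunk's actual numbers:

```python
# Power Usage Effectiveness (PUE) = total facility power / IT equipment power.
# The closer to 1.0, the less energy is spent on cooling, power distribution
# and other overhead. Figures below are invented for illustration.

def pue(total_facility_kw: float, it_load_kw: float) -> float:
    return total_facility_kw / it_load_kw

# Hypothetical example: 100 MW of IT load at a design PUE of 1.25 implies
# 125 MW total draw, i.e. 25 MW of non-IT overhead.
total_kw = 125_000.0
it_kw = 100_000.0
print(pue(total_kw, it_kw))   # 1.25
print(total_kw - it_kw)       # 25000.0 kW of overhead
```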

Tellus delivers key component for collaborative data economy
The Gaia-X development project, Tellus, has successfully completed its implementation phase. Led by the Internet Exchange operator, DE-CIX, the consortium has developed a prototype interconnection infrastructure that provides fully automatic and virtual access to networks for sensitive, real-time applications across distributed cloud environments. Tellus covers the entire supply chain of interconnection services and integrates offerings from various providers based on the decentralised and distributed data infrastructure of Gaia-X. This makes Tellus a key component for the comprehensive connectivity required by intelligent business models in a collaborative data economy.

Delivering networks and services according to application demands

In the past, implementing business-critical applications in distributed IT systems required purchasing all necessary components, services, and functions separately from different providers and manually combining them in a time-consuming and costly process - without end-to-end guarantees. Tellus’ open-source software not only automates these processes but also ensures specific connectivity requirements are met. During the final phase, the project team implemented a controller and service registry, which function as central elements of a super-node architecture. The controller coordinates and provisions service offers and orders via application programming interfaces (APIs). The service registry stores and lists all services that the controller can search through, address, and combine. The search process runs via the controller into the registry and the associated graph database, which then delivers suitable solutions. Finally, the controller commissions the interconnection infrastructure to provision network and cloud services that meet the requirements of the respective application, including guaranteed performance and Gaia-X compliance.
Deployable prototype: Reliable and dynamic connectivity for data exchange

In the implemented proof of concept (PoC) demo, virtual networks and services can be provided via a user-friendly interface to meet the requirements of industrial applications; for example, transmitting hand movements to a robot in real time via a smart glove. The same applies to delivering connectivity for a digital twin from IONOS in the manner required by production plants, to simulate, monitor in real time, and optimise manufacturing steps. It equally applies to TRUMPF’s fully automatic laser cutting tools, where reliable and dynamic networks keep systems available and pay-per-part business models productive.

Milestone for a secure, sovereign, and collaborative data economy

“Since Tellus registers the products of all participants in a standardised way and stores the network nodes in a structured manner in a graph database, interconnection services can be composed end-to-end via a weighted path search,” says Christoph Dietzel, Head of Product & Research at DE-CIX. “With the successful completion of the implementation phase and the proof-of-concept demo, we have not only demonstrated the technical feasibility of our Gaia-X compliant interconnection infrastructure, but have also set an important milestone for the future of secure, sovereign, and collaborative data processing.” For more from DE-CIX, click here.
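The "weighted path search" Dietzel describes can be illustrated with a classic shortest-path algorithm over a small service graph. The sketch below uses Dijkstra's algorithm and an invented topology; it shows the composition idea, not Tellus's actual implementation:

```python
# A sketch of composing an end-to-end interconnection via weighted path
# search: network nodes and service offers form a weighted graph (weights
# could be latency, cost, or a blend), and the controller picks the
# lowest-weight path between two endpoints. The graph is invented here;
# Tellus stores its equivalent in a graph database in the service registry.
import heapq

# node -> list of (neighbour, weight)
graph = {
    "factory_edge": [("ix_frankfurt", 2), ("metro_pop", 1)],
    "metro_pop": [("ix_frankfurt", 2)],
    "ix_frankfurt": [("cloud_region", 3)],
    "cloud_region": [],
}

def cheapest_path(start: str, goal: str):
    """Dijkstra's algorithm: returns (total_weight, path) or None."""
    queue = [(0, start, [start])]
    settled = {}
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if settled.get(node, float("inf")) <= cost:
            continue  # already reached this node more cheaply
        settled[node] = cost
        for neighbour, weight in graph.get(node, []):
            heapq.heappush(queue, (cost + weight, neighbour, path + [neighbour]))
    return None  # no interconnection can be composed

print(cheapest_path("factory_edge", "cloud_region"))
# (5, ['factory_edge', 'ix_frankfurt', 'cloud_region'])
```

In a real deployment the weights would encode the guarantees to be met (latency budgets, bandwidth, compliance attributes), so the search returns only paths whose composed services satisfy the application's requirements.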

Industry experts comment on Data Privacy Day
With today (28 January) marking Data Privacy Day - an annual event seeking to raise awareness and promote privacy and data protection best practices - industry experts have marked the occasion by presenting a range of views on the latest trends and challenges that have arisen over the past year.

- Dr Ellison Anne Williams, Founder and CEO of Enveil, comments, “Data Privacy Day serves as a crucial reminder to safeguard sensitive information in an era where data dominates. As we navigate an increasingly interconnected world and transformative technologies such as AI grow their foothold in the digital economy, finding ways to protect data privacy and mitigate risk will be essential.

“Privacy Enhancing Technologies (PETs) enable, enhance, and preserve data privacy throughout its lifecycle, securing data usage and allowing users to capitalise on the power of AI without sacrificing privacy or security. Organisations that truly prioritise data will incorporate PETs as a foundational, business-enabling tool that will fortify data-driven capabilities and enable data to be leveraged securely across silos and boundaries.

“This year’s Data Privacy Day theme is ‘Take control of your data’, but that sentiment should not be limited to our personal data footprint. Businesses need to be proactive in their approach to data protection and commit to a future where PETs are woven into the very fabric of digital strategy. This will empower users to responsibly and securely harness innovative tools, such as AI and Machine Learning, in line with global regulations and compliance requirements.”

- Edwin Weijdema, Field CTO EMEA & Cybersecurity Lead at Veeam, adds, “This year, Data Privacy Day seems a little different. With significant cyber security regulations coming into force around the world, most notably NIS2 and DORA, it feels like a lot has changed since we marked this day just 12 months ago. “And it has.
We’ve seen corporate accountability given increasing weight when it comes to data resilience thanks to NIS2. It’s no longer a case of passing the buck – responsibility ultimately sits with the C-suite. Simultaneously, data resilience is shifting from a ‘cyber security requirement’ to a tangible business differentiator. At the moment, breaches and ransomware are still a ‘when’, not an ‘if’ - and I don’t see this changing. As C-suites become ever more aware, they’ll be demanding to see evidence of their organisation's data resilience, from their internal teams and any third-party partners.

“Data Privacy Day is a good chance to reflect on how much can change in a year. After all, organisations can’t rely on markers like this to nudge them on the importance of data resilience - it needs to be a priority 365 days a year.”

- James Blake, VP Global Cyber Resiliency Strategy at Cohesity, comments, "On Data Privacy Day, it's crucial to recognise that focusing solely on compliance will only lead to companies tying themselves in knots reacting to the swarm of active or planned regulatory requirements, as well as data legislation coming into force across multiple national and state jurisdictions. If we look at Germany alone as an example, there are 17 state laws on top of national and EU requirements. The most effective way to ensure data privacy compliance is by building robust and repeatable operational capabilities. This involves programmatically conducting comprehensive data audits to identify, categorise, and secure sensitive information. Implementing robust encryption protocols, including migrating to encryption methods resilient to emerging quantum computing attacks, is essential.
Additionally, consider working with technology companies that can offer immutable data storage, providing an extra layer of security by ensuring data cannot be altered or deleted, thus protecting against ransomware attacks, data breaches and the unnecessary financial loss accrued because of downtime. Appointing security champions in each business unit to educate their peers on tailored data privacy processes based on data classification levels is an important step. By embedding these practices, compliance with varying regulatory requirements will naturally follow."

- Adrianus Warmenhoven, a cyber security expert at NordVPN, comments: “As debates continue over whether data, oil, or land holds the greatest value, in cyber security, the answer is unequivocal: data. Personal data, unlike physical assets, can be copied, stolen, or sold without leaving visible traces, creating significant financial and reputational risks.

“Apps are a major culprit, often exposing sensitive information through excessive permissions, missed updates, or unauthorised data sharing. Keeping software current is not just a personal safeguard; it also helps protect your network of contacts from phishing attacks through outdated systems. The good news is that while it may seem like an uphill battle to get on top of your data privacy, it’s never been easier to manage how much you share.”

To protect people’s privacy on apps, Adrianus offers these preventive measures:

- Always download apps from official stores - Unofficial apps may not be checked for safety before being made available to download, increasing the risk of modifications by criminals.
- Familiarise yourself with the data permissions required by apps - Head to your settings and review and adjust these permissions as necessary, particularly sensitive ones like access to your camera, microphone, storage, location, and contact list.
- Before downloading any app, read its privacy policy - Understand what information it will track and share with third parties. If the privacy level is unsatisfactory, consider an alternative. You can usually find this in the description on your mobile device’s app store.
- Limit location access to only when using the app - It is difficult to justify why some apps need to know your location at all times, so do not give it to them.
- Avoid using social media accounts to log in - Doing so can allow unnecessary data exchange.
- Delete any apps you no longer use - This helps to prevent them from collecting data in the background.

For more on data privacy, click here.

Progress Data Cloud platform launched
Progress, a provider of AI-powered digital experiences and infrastructure software, has announced the launch of Progress Data Cloud, a managed Data Platform as a Service designed to simplify enterprise data and artificial intelligence (AI) operations in the cloud. With Progress Data Cloud, customers can accelerate their digital transformation and AI initiatives while reducing operational complexity and IT overhead. As global businesses scale their data operations and embrace AI, a robust cloud data strategy has become the cornerstone of success, enabling organisations to harness the full potential of their data for innovation and growth. Progress Data Cloud meets this critical need by providing a unified, secure and scalable platform to build, manage and deploy data architectures and AI projects without the burden of managing IT infrastructure. “Organisations increasingly recognise that cloud and AI are pivotal to unlocking business value at scale,” says John Ainsworth, GM and EVP, Application and Data Platform, Progress. “Progress Data Cloud empowers companies to achieve this by offering a seamless, end-to-end experience for data and AI operations, removing the barriers of infrastructure complexity while delivering exceptional performance, security and predictability.”

Key features and benefits

Progress Data Cloud is a Data Platform as a Service that enables managed hosting of feature-complete instances of Progress Semaphore and Progress MarkLogic, with plans to support additional Progress products in the future.
Core benefits include:

• Simplified operations: Eliminates infrastructure complexity with always-on infrastructure management, monitoring service, continuous security scanning and automated product upgrades.
• Cost efficiency: Reduces IT costs and bottlenecks with predictable pricing, resource usage transparency and no egress fees.
• Enhanced security: Helps harden security posture with an enterprise-grade security model that is SOC 2 Type 1 compliant.
• Scalability and performance: Offers superior availability and reliability, supporting mission-critical business operations, GenAI demands and large-scale analytics.
• Streamlined user management: Self-service access controls and tenancy management provide better visibility and customisation.

Progress Data Cloud accelerates time to production by offering managed hosting for the Progress MarkLogic Server database and the Progress MarkLogic Data Hub solution with full-feature parity. Customers can benefit from enhanced scalability, security and seamless deployment options. Replacing Semaphore Cloud, Progress Data Cloud provides a next-generation cloud platform with all existing Semaphore functionality plus new features for improved performance, security, reliability, user management and SharePoint Online integration. “As enterprises continue to invest in digital transformation and AI strategies, the need for robust, scalable and secure data platforms becomes increasingly evident,” says Stewart Bond, Vice President, Data Intelligence and Integration Software, IDC. “Progress Data Cloud addresses a critical market need by simplifying data operations and accelerating the development of AI-powered solutions.
Its capabilities, from seamless infrastructure management to enterprise-grade security, position it as a compelling choice for organisations looking to unlock the full potential of their data to drive innovation and business value.” Progress Data Cloud provides cloud-based hosting of the foundational products that make up the Progress Data Platform portfolio, and is now available for existing and new customers of the MarkLogic and Semaphore platforms.

Healthcare organisation reduces storage costs with DS3 platform
Cubbit, a geo-distributed cloud storage expert, has announced that ASL CN1 Cuneo, a North Italian healthcare public service organisation, has reduced its storage costs by 50% thanks to Cubbit’s fully-managed S3 cloud object storage, DS3. ASL CN1 Cuneo now stores all of its 110 TB of backup data on Cubbit as part of its 3-2-1-1-0 backup strategy orchestrated via Veeam. DS3 delivers exceptional resilience against client- and server-side ransomware attacks and disasters, ensuring top-tier security (NIS2 standard), GDPR compliance, and adherence to regional public sector regulations, while allowing the organisation to choose the exact geographical perimeter of data storage. By adopting Cubbit, ASL CN1 Cuneo has avoided the hidden costs typically associated with S3 - such as egress, API call, deletion, and bucket replication fees. ASL CN1 Cuneo manages healthcare services across 173 municipalities and employs over 3,500 staff members. As 80% of its data is health-related, it is classified as “critical” by the Italian National Cybersecurity Agency (ACN). Compliance with stringent GDPR and NIS2 data sovereignty and security guidelines, as well as ACN certification (an Italian public sector requirement), was therefore non-negotiable. Prior to selecting Cubbit, ASL CN1 Cuneo had considered various other storage platforms. It previously relied on hyperscaler services, but found that egress costs and API call fees were inflating expenses. On-premises solutions offered control and compliance but carried high upfront costs, demanded heavy IT resources, and proved challenging to maintain - difficulties especially significant for a public healthcare entity with limited IT staff and budget. Since adopting Cubbit’s technology, ASL CN1 Cuneo has reaped the benefits of an S3 cloud object storage service that meets national and European sovereignty requirements, keeps data within Italian borders, and ensures full regulatory compliance. 
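The 3-2-1-1-0 rule mentioned above - at least 3 copies of the data, on 2 different media, 1 copy offsite, 1 copy offline or immutable, and 0 errors on backup verification - can be sketched as a simple check. This is an illustrative model only; the `BackupCopy` type and its fields are invented here and are not part of Veeam's or Cubbit's software.

```python
from dataclasses import dataclass


@dataclass
class BackupCopy:
    """One copy of a backup set (hypothetical model for illustration)."""
    medium: str         # e.g. "disk", "tape", "object-storage"
    offsite: bool       # stored outside the primary site?
    immutable: bool     # offline or immutable (ransomware-resistant)?
    verify_errors: int  # errors found by the last restore verification


def satisfies_3_2_1_1_0(copies: list) -> bool:
    """3 copies, 2 media types, 1 offsite, 1 immutable/offline, 0 errors."""
    return (
        len(copies) >= 3
        and len({c.medium for c in copies}) >= 2
        and any(c.offsite for c in copies)
        and any(c.immutable for c in copies)
        and all(c.verify_errors == 0 for c in copies)
    )


# Example: primary copy on disk, a tape copy, and immutable offsite object storage.
copies = [
    BackupCopy("disk", offsite=False, immutable=False, verify_errors=0),
    BackupCopy("tape", offsite=False, immutable=True, verify_errors=0),
    BackupCopy("object-storage", offsite=True, immutable=True, verify_errors=0),
]
assert satisfies_3_2_1_1_0(copies)
```

In a setup like ASL CN1 Cuneo's, the immutable offsite copy would be the S3-compatible object store, with the backup software handling scheduling and verification.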
With Cubbit’s fully-managed object storage, fixed storage costs include all the main S3 APIs together with geo-distribution capacity, enabling ASL CN1 Cuneo to save 50% on its previous storage costs for an equivalent configuration while enhancing data resilience and security. Additionally, the comprehensive security and compliance standards met through Cubbit’s DS3 solution help mitigate the risk of fines for non-compliance with GDPR and NIS2, which can reach €10m (£8.5m) or 2% of global annual revenue, whichever is higher. The cost efficiencies enabled by Cubbit allow ASL CN1 Cuneo to reinvest savings into its core mission of delivering quality healthcare services. “Finding a storage solution that met our strict compliance needs, elevated our security to NIS2 standards, and cut costs was no easy task,” says Andrea Saglietti, Head of Innovation and Information Security at ASL CN1 Cuneo. “We’ve used US-based cloud storage providers for a long time, but they didn’t offer the sovereignty, resilience, or economic advantages that can be achieved with Cubbit. This has enabled us to generate 50% savings on previous costs for the equivalent configuration. The speed of deployment and ease of use make Cubbit’s DS3 far more manageable than complex on-prem systems, while maintaining sovereignty and giving us full control over our data. Today, we have greater peace of mind knowing our data is stored securely, compliantly, and cost-effectively.” Alessandro Cillario, Co-CEO and Co-founder of Cubbit, adds, “Healthcare organisations in Europe must navigate a dense framework of regulatory requirements while grappling with surging data volumes and sophisticated cyber-threats. With Cubbit, ASL CN1 Cuneo can ensure that its critical healthcare data is safeguarded, compliant, and cost-efficient - without the unpredictability of hidden fees or the burdens of on-prem infrastructure. 
We’re proud to support ASL CN1 Cuneo and European healthcare and public sector organisations in evolving their storage strategy.” For more from Cubbit, click here.
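Savings of the kind described above come largely from the fee structure: per-GB egress and per-request charges versus a flat, all-inclusive rate. A toy cost model makes the comparison concrete. Every number below is invented for illustration; none are Cubbit's, AWS's, or any provider's actual prices.

```python
def monthly_cost(tb_stored: float, price_per_tb: float,
                 egress_tb: float = 0.0, egress_per_tb: float = 0.0,
                 api_calls: int = 0, price_per_million_calls: float = 0.0) -> float:
    """Monthly object-storage bill in euros (all rates hypothetical)."""
    return (tb_stored * price_per_tb
            + egress_tb * egress_per_tb
            + api_calls / 1_000_000 * price_per_million_calls)


# 110 TB of backups, restore tests pulling 20 TB/month, heavy API traffic.
metered = monthly_cost(110, price_per_tb=23.0,
                       egress_tb=20, egress_per_tb=90.0,
                       api_calls=50_000_000, price_per_million_calls=5.0)

# Flat tariff: egress and API calls included in the storage price.
flat = monthly_cost(110, price_per_tb=20.0)

savings = 1 - flat / metered
```

With these made-up rates the flat tariff comes out roughly half the metered one - the point is not the exact figures but that egress and request fees, which scale with backup verification and restore activity, dominate the difference.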

Poor data quality the top obstacle to AI success, report says
The Ataccama Data Trust Report 2025 has identified poor data quality as a critical obstacle to AI adoption. The report states that despite AI's transformative potential, its success depends on trusted, reliable data. Some 68% of Chief Data Officers (CDOs) cite data quality as their top challenge, with only 33% of organisations making meaningful progress in AI adoption. Conducted by Hanover Research with insights from 300 senior data leaders, the report underscores the urgency of addressing systemic issues like fragmented systems and governance gaps. Without resolution, businesses risk stalled innovation, wasted resources, and diminished returns on AI investments.

Other key findings
• 41% of organisations struggle to maintain consistent data quality, directly hindering AI outcomes.
• Knowledge gaps around data trust and governance slow progress; education is critical to closing these gaps.
• Trusted data drives AI success: High-quality data accelerates decision-making, enhances customer experiences, and delivers competitive advantages.

Policy implications
As the UK accelerates its AI strategy with the newly unveiled AI Opportunities Action Plan, the report highlights a foundational gap organisations must address: data trust. When data is accurate, reliable, and trustworthy, users can be confident in making informed decisions that drive improved outcomes and reduce risk.
• National standards for data quality: The report emphasises the need for unified benchmarks to guide businesses in building AI-ready ecosystems. Creating a National Data Library is a core goal within the UK plan for homegrown AI, and regulatory principles - safety, transparency, and fairness - could be operationalised through national data governance benchmarks. 
These standards would ensure clear compliance guidelines while supporting the UK’s pro-innovation regulatory goals.
• Infrastructure modernisation: Legacy systems remain a bottleneck to AI scalability, unable to handle real-time, high-volume data demands. With the commitment to sufficient, secure, and sustainable infrastructure, the UK’s investment in supercomputing and AI growth zones enables continuous data quality monitoring and governance. These advancements create scalable, efficient systems tailored to advanced AI technologies.
• Data trust in AI regulation: Embedding governance and automated validation practices into data workflows is crucial for compliance, reliability, and long-term growth. Aligning the UK’s ethical AI initiatives with data trust requirements would ensure AI systems both operate reliably and adhere to safety and transparency principles.

“The report makes one thing clear: enterprise AI initiatives rely on a foundation of trusted data,” says Jay Limburn, Chief Product Officer at Ataccama. “Without addressing systemic data quality challenges, organisations risk stalling progress. The UK’s approach to AI regulation shows how aligning data trust principles with national standards and infrastructure modernisation can deliver tangible results.”

Data trust as the foundation of global AI leadership
The UK’s regulatory progress presents an opportunity to lead in AI innovation. However, even the most ambitious policies risk falling short without prioritising data trust. The Ataccama Data Trust Report 2025 offers a roadmap to embed data trust into the UK’s AI agenda, ensuring ethical, effective initiatives that drive measurable outcomes, including increased AI adoption, enhanced compliance, and competitive advantages. To download the report in full, click here.
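The automated data-quality validation the report recommends typically starts with batch-level scoring of completeness and validity. The sketch below is a hand-rolled illustration of the idea, not Ataccama's API; `quality_report` and its parameters are hypothetical.

```python
def quality_report(records: list, required: list, validators: dict) -> dict:
    """Score a batch of records for completeness and validity.

    records:    list of dicts representing rows.
    required:   field names that must be present and non-empty.
    validators: {field: predicate} checks applied where the field exists.
    (All names here are hypothetical - this sketches the technique only.)
    """
    total = len(records)
    complete = sum(
        1 for r in records
        if all(r.get(f) not in (None, "") for f in required)
    )
    valid = sum(
        1 for r in records
        if all(p(r[f]) for f, p in validators.items() if f in r)
    )
    return {"completeness": complete / total, "validity": valid / total}


# Example: three customer records, one with a blank email, one missing it.
records = [
    {"id": 1, "email": "a@example.com"},
    {"id": 2, "email": ""},
    {"id": 3},
]
report = quality_report(records, required=["id", "email"],
                        validators={"email": lambda e: "@" in e})
```

Scores like these, tracked over time and gated in data pipelines, are one concrete way to operationalise the "continuous data quality monitoring" the report calls for.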

Tech leaders gather to discuss AI Opportunities Action Plan
Technology industry leaders gathered in London this week to discuss the government’s AI Opportunities Action Plan, launched by Prime Minister Keir Starmer earlier in the week. The meeting, which took place on Wednesday at The Savoy Hotel in central London, saw digital experts discuss the implementation and practicalities of adopting the much-hyped initiative, which is backed by £14bn of investment and set to create over 13,000 jobs. Key attendees included Feryal Clark MP, Minister for AI and Digital Government, who summarised the government’s AI roadmap, and Steven George-Hilley, Founder of Centropy PR. Speaking at the event, John Lucey, VP EMEA North for Cellebrite, commented, “We’ve seen the importance of AI and digital policy this week with the launch of the AI Opportunities Action Plan, poised to position the UK as a global AI leader. Data will play a central role in Britain’s AI future, requiring comprehensive data management systems and data privacy protocols to ensure that AI is trained on trustworthy data and that data inputs don’t breach privacy laws. “In key sectors such as policing and defence, for example, organisations need to be able to trust AI systems to deliver accurate results in a safe manner, maintaining client confidentiality while automating manual processes to drive efficiencies. For AI to be truly successful, it will require investment in data practices and training.” Meanwhile, cyber expert, Andy Ward, SVP International for Absolute Security, stated, “As the UK positions itself as a global AI leader, it’s important that a security-first approach is taken to AI innovation and development to mitigate cyber risks. AI-powered threats are growing increasingly sophisticated, targeting sensitive data from public sector bodies and high-profile individuals, right the way down to small businesses. 
“Recognising these threats and building cyber resilience frameworks to protect critical IT systems can help organisations to remain operational in the face of threats, allowing them to push forward with innovative AI solutions while limiting potential risks.” Ben Green, Chief Revenue Officer at adCAPTCHA, observed, “The evolution and widespread adoption of AI is showing no signs of slowing down, requiring collaboration between the public sector and industry to shape the UK’s AI future. There’s no question of the benefits that AI can bring, but we must also be mindful of the risks, with trends such as AI-enabled bot attacks continuing to threaten businesses and drain marketing revenues through manipulating ad auctions. “Understanding the threats that AI could pose, as well as where it can be a vital solution, is crucial to the UK’s ambitious AI leadership.”


