


ZTE enhances all-optical networks with smart ODN solutions
Hans Neff, Senior Director of ZTE’s CTO Group, recently delivered a keynote speech entitled ‘Embracing the AI Era to Accelerate the Development of All-Optical Networks’ at the FTTH Conference 2025 in Amsterdam, sharing the company’s solutions and experience in applying AI to achieve intelligent ODN, improve the deployment efficiency of all-optical networks, and reduce deployment and O&M costs.

The deployment of all-optical networks continues to accelerate, and rapidly developing AI is increasingly integrated into broadband IT systems, significantly raising the level of intelligence in FTTx deployment, operation and maintenance. In Europe, National Broadband Plans (NBPs) and EU policies are actively advocating for an all-optical, gigabit and digitally intelligent society, and demand for sustainable, efficient and future-oriented network infrastructure is growing.

Hans pointed out that operators face challenges such as high costs, complex network management and diverse deployment scenarios, and that AI and digital intelligence can provide end-to-end support for ODN construction. He described how ZTE leverages AI technology to build intelligent ODNs across the entire process, enhancing efficiency and reducing operational complexity, and elaborated on the specific applications and advantages of intelligent ODN across three phases: before, during and after deployment.

Before deployment, one-click zero-touch site surveys, AI-based automatic planning and design, and paperless processes replace traditional manual methods, significantly reducing the time and human resources required. During deployment, real-time visualised and cost-effective solutions, along with tailored product combinations - such as pre-connectorised splice-free ODN, plug-and-play deployment and air-blowing solutions - ensure smooth construction while reducing both time and labour costs. After deployment and delivery, intelligent real-time ODN network detection, analysis and risk warnings minimise fault location time, improving monitoring accuracy and troubleshooting efficiency. Additionally, new solutions like passive ID branch detection enable 24/7 real-time optical link monitoring, dynamic perception of link changes and automatic updates, ensuring 100% accuracy of end-to-end ODN resources. This proactive approach optimises network performance while streamlining maintenance and resource management.

Hans emphasised that collaboration among multiple parties is key to accelerating the construction of intelligent ODN networks. Suppliers play a vital role in driving digital and intelligent platforms, developing innovative product portfolios and delivering localised solutions through strategic partnerships. By leveraging AI and digital intelligence, operators and stakeholders can effectively address challenges such as high costs, complex construction and maintenance, and regional adaptability.

ZTE says it is committed to providing comprehensive intelligent ODN solutions and building end-to-end all-optical networks. Moving forward, the company will continue collaborating with global partners to harness its advanced network and intelligent capabilities, aiming to help operators build and operate cutting-edge, efficient and intelligent broadband networks, unlocking new application scenarios and fostering a ubiquitous, green and efficient era of all-optical networks.

Industry experts react to World Backup Day
Today, 31 March, marks this year's World Backup Day, and industry experts say that it once again offers a timely reminder of how vulnerable enterprise data can be. Fred Lherault, Field CTO at Pure Storage, says that businesses cannot afford to think about backup just one day a year, and predicts that 2025 could be a record-setting year for ransomware attacks.

Commenting on the day, Fred says, “31 March marks World Backup Day, serving as an important reminder for businesses to reassess their data protection strategies in the wake of an ever-evolving, and ever-growing, threat landscape. However, cyber attackers aren’t in need of a reminder, and are probing for vulnerabilities 24/7 in order to invade systems. Given the valuable and sensitive nature of data, whether it resides in the public sector, healthcare, financial services or any other industry, businesses can’t afford to think about backup just one day per year.

“Malware is a leading cause of data loss, and ransomware, which locks down data with encryption rendering it useless, is among the most common forms of malware. In 2024, there were 5,414 reported global ransomware attacks, an 11% increase from 2023. Due to the sensitive nature of these kinds of breaches, it’s safe to assume that the real number is much higher. It’s therefore fair to suggest that 2025 could be a record-setting year for ransomware attacks. In light of these alarming figures, there is no place for an ‘it won’t happen to me’ mindset. Businesses need to be proactive, not reactive, in their plans - not only for their own peace of mind, but also in the wake of new cyber resiliency regulations laid down by international governments.

“Unfortunately, while backup systems have provided an insurance policy against an attack in the past, hackers are now trying to breach these too. Once an attacker is inside an organisation’s systems, they will attempt to find credentials to immobilise backups. This will make restoration more difficult, time-consuming and potentially expensive.”

Meanwhile, Dr. Thomas King, CTO of global internet exchange operator DE-CIX, offers his own remarks on the occasion. Thomas explains, “World Backup Day has traditionally carried a very simple yet powerful message for businesses: back up your data. A large part of this is ‘data redundancy’ - the idea that storing multiple copies of data in separate locations will offer greater resilience in the event of an outage or network security breach. Yet, as workloads have moved into the cloud, and AI and SaaS applications have become dominant vehicles for productivity, the concept of ‘redundancy’ has started to expand. Businesses not only need contingency plans for their data, but contingency plans for their connectivity. Relying on a single-lane, vendor-locked connectivity pathway is a bit like only backing your data up in one place - once that solution fails, it’s game over.

“In 2025, roughly 85% of software used by the average business is SaaS-based, with a typical organisation using 112 apps in its day-to-day operations. These cloud-based applications are wholly dependent on connectivity to function, and even minor slow-downs caused by congestion or packet loss on the network can kill productivity. This is even more true of AI-driven workloads, where businesses depend on low-latency, high-performance connectivity to generate real-time or near real-time calculations.

“Over the years, we have been programmed to believe that faster connectivity equals better connectivity, but the reality is far more nuanced. IT decision-makers frequently chase faster connections to improve their SaaS or AI performance, but 82% severely underestimate the impact of packet loss and the general performance of their connectivity. This is what some refer to as the ‘Application Performance Trap’ - expecting a single, lightning-fast connection to solve all performance issues. But what happens if that connectivity pathway becomes congested, or worse, fails entirely?

“This is why ‘redundant’ connectivity is essential. The main principle of redundancy in this context is that there should always be at least two paths leading to a destination - if one fails, the other can be used. This can be achieved by using a carrier-neutral Internet Exchange (IX), which facilitates direct peer-to-peer connectivity between businesses and their cloud-based workloads, essentially bypassing the public Internet. While IXs in the US were traditionally vendor-locked to a single carrier or data centre, neutral IXs allow businesses to establish multiple connections with different providers - sometimes to serve a particular use case, but often in the interests of redundancy. Our research has shown that more than 80% of IXs in the US are now data centre and carrier neutral, presenting a perfect opportunity for businesses to not only back up their data, but also back up their connectivity this World Backup Day.”

To read more about World Backup Day, visit its official website by clicking here. For more from Pure Storage, click here. For more from DE-CIX, click here.
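Thomas’s ‘at least two paths’ principle maps naturally onto code. The following Python sketch is purely illustrative - the gateway hostnames are hypothetical, not DE-CIX endpoints - but it captures the failover idea: try the primary connectivity path first, and fall back to the second if the first fails.

```python
import socket

# Hypothetical endpoints for illustration: a primary path via a neutral IX
# peering session and a secondary path via a second provider.
PATHS = [
    ("primary-ix-gw.example.net", 443),      # direct peering via a neutral IX
    ("backup-transit-gw.example.net", 443),  # second provider, used on failure
]

def connect_with_failover(paths, timeout=2.0):
    """Try each path in order; return the first socket that connects."""
    last_error = None
    for host, port in paths:
        try:
            return socket.create_connection((host, port), timeout=timeout)
        except OSError as exc:  # refused, timed out, unreachable...
            last_error = exc
    raise ConnectionError(f"all paths failed: {last_error}")

if __name__ == "__main__":
    conn = connect_with_failover(PATHS)
    print("connected via", conn.getpeername())
    conn.close()
```

In production, this logic typically lives in routing protocols such as BGP rather than in application code; the sketch only illustrates the redundancy principle.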

Keysight and Coherent to enhance data transfer rates
Keysight Technologies and Coherent Corp have collaborated on a 200G multimode technology demonstration that will be showcased for the first time at the OFC Conference in San Francisco, California. The 200G-per-lane vertical-cavity surface-emitting laser (VCSEL) technology provides higher data transfer rates and addresses industry demand for higher bandwidth in data centres. It will enable the industry to deliver AI/ML services while reducing the power consumption and capital expense of short-reach data interconnects.

AI/ML deployment is driving extreme growth in the amount of data transferred in data centres. The cost of optical interconnects is an ever-growing portion of data centre Capex, while their power consumption is an ever-growing portion of Opex. 200G multimode VCSELs revolutionise data transfer and network efficiency, offering the following benefits compared to single-mode transmission:

· Increased bandwidth: 200 Gbps per lane doubles the data throughput of the current highest-speed multimode interconnects.
· Power efficiency: Significantly lower power-per-bit relative to single-mode alternatives, driving down electrical power operational expense and helping large-scale data centres minimise their environmental impact.
· Cost efficiency: Multimode VCSELs are less costly to manufacture than single-mode technologies, providing lower capital outlay for short-reach data links.
· Compatible network architecture: AI pods and clusters require many high-speed, short-reach interconnects to share data amongst GPUs, aligning well with the strengths of 200G multimode VCSELs.

The 200G multimode demonstration consists of Keysight’s new wideband multimode sampling oscilloscope technology, Keysight’s M8199B 256 GSa/s Arbitrary Waveform Generator (AWG) and Coherent’s 200G multimode VCSEL. The M8199B AWG drives a 106.25 GBaud PAM4 electrical signal into the Coherent VCSEL, and the PAM4 optical signal output from the VCSEL is measured on the wideband multimode scope, displaying the eye diagram. The demo showcases the feasibility and capability of Coherent’s new VCSEL technology, as well as Keysight’s ability to characterise and validate this technology.

Lee Xu, Executive Vice President and General Manager, Datacom Business Unit at Coherent, says, “Keysight has been a trusted partner and a leader in test instrumentation technology, providing advanced test solutions to us. We rely on Keysight products, such as the M8199B Arbitrary Waveform Generator, to validate our latest transceiver designs. We look forward to continuing our collaboration as we push the boundaries of optical communications with products based on 200G VCSELs, silicon photonics, and Electro-Absorption Modulated Lasers (EML).”

Dr. Joachim Peerlings, Vice President and General Manager of Keysight’s Network and Data Centre Solutions Group, adds, “We are pleased with the progress the industry is making in bringing 200G multimode technology to market. Our close collaboration with Coherent enabled another milestone in high-speed connectivity. The industry will benefit from a more efficient and cost-effective technology to address their business-critical AI/ML infrastructure deployment in the data centre.”

Join Keysight experts at OFC, taking place 1-3 April in San Francisco, California, at Keysight's booth (stand 1301) for live demos on coherent optical innovations. For more from Keysight Technologies, click here.
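As a back-of-the-envelope check on the numbers in the demo - not part of the announcement itself - PAM4 encodes two bits per symbol, so a 106.25 GBaud signal corresponds to a 212.5 Gb/s raw line rate, with the margin above 200 Gb/s absorbed by forward-error-correction (FEC) and framing overhead:

```python
# Why 106.25 GBaud PAM4 yields a "200G" lane: PAM4 uses four amplitude
# levels, i.e. log2(4) = 2 bits per symbol, so the raw line rate is twice
# the symbol rate; the headroom above 200 Gb/s carries FEC and framing.

SYMBOL_RATE_GBAUD = 106.25  # symbol rate from the demo described above
BITS_PER_SYMBOL = 2         # PAM4: 4 levels = 2 bits per symbol

line_rate_gbps = SYMBOL_RATE_GBAUD * BITS_PER_SYMBOL
print(f"Raw line rate: {line_rate_gbps} Gb/s")  # 212.5 Gb/s
print("Payload after ~6% FEC/framing overhead: ~200 Gb/s")
```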

Broadband Forum launches trio of new open broadband projects
An improved user experience, including reduced latency and a wider choice of in-home applications, will be delivered to broadband consumers as the Broadband Forum launches three new projects. The three new open broadband projects will provide open source software blueprints for application providers and Broadband Service Providers (BSPs) to follow. These will deliver a foundation for Artificial Intelligence (AI) and Machine Learning (ML) for network automation, additional tools for network latency and performance measurements, and on-demand connectivity for different applications.

“These new projects will play a key role in improving network performance measurement and monitoring and the end-user experience,” says Broadband Forum Technical Chair, Lincoln Lavoie. “Open source software is a crucial component in providing the blueprint for BSPs to follow and we invite interested companies to get involved.”

The new Open Broadband-CloudCO-Application Software Development Kit (OB-CAS), Open Broadband - Simple Two-Way Active Measurement Protocol (OB-STAMP), and Open Broadband - Subscriber Session Steering (OB-STEER) projects will bring together software developers and standards experts from the Forum. The projects will deliver open source reference implementations, which are examples of how Broadband Forum specifications can be implemented. They act as a starting point for application developers to base their designs on. In turn, those applications are available on platforms for BSPs to select and offer to their customers, shortening the path from the development of a specification to the first deployment of the technology in the network.

“The development of open source software and open broadband standards is invaluable to the industry, laying the foundations for faster innovation through global collaboration,” says Broadband Forum CEO, Craig Thomas. “The Broadband Forum places the end-user experience at the forefront of all of our projects and is playing a crucial role in overcoming network issues.”

OB-CAS aims to simplify network monitoring and maintenance for BSPs, while also offering a wider selection of applications from various software vendors. Alongside this, network operations will be simplified and automated through existing Broadband Forum cloud standards that use AI and ML to improve the end-user experience.

OB-STAMP will build an easy-to-deploy component that simplifies network performance measurement between Customer Premises Equipment and the IP Edge. The project will allow BSPs to proactively monitor their subscribers’ home networks to measure latency and, ultimately, avoid network failures. Four vendors have already signed up to join the efforts to reduce the cost and time associated with deploying infrastructure for measuring network latency.

Building on the Broadband Forum’s upcoming technical report WT-474, OB-STEER will create a reference implementation of the Subscriber Session Steering architecture to deliver flexible, on-demand connectivity and simplify network management. Interoperability of Subscriber Session Steering is of high importance, as it will be implemented in access network equipment and edge equipment from various vendors.
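For readers unfamiliar with the protocol behind OB-STAMP: STAMP (RFC 8762) measures latency by having a Session-Sender timestamp a UDP test packet, a Session-Reflector at the far end return it, and the sender derive delay from the timestamps. The Python sketch below is a heavily simplified illustration of that round-trip idea, not a reference implementation - real STAMP defines exact packet formats and NTP-format timestamps, and the reflector address here is hypothetical.

```python
import socket
import struct
import time

# Minimal STAMP-like round-trip measurement (illustrative only; real STAMP,
# RFC 8762, defines precise packet layouts and uses UDP port 862 by default).

REFLECTOR = ("192.0.2.10", 862)  # hypothetical CPE-side reflector address

def measure_rtt(sock, seq):
    t1 = time.monotonic()              # sender transmit time
    sock.send(struct.pack("!I", seq))  # test packet carrying a sequence number
    sock.recv(2048)                    # reflected packet from the far end
    t4 = time.monotonic()              # sender receive time
    return (t4 - t1) * 1000            # round-trip time in milliseconds

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.connect(REFLECTOR)
sock.settimeout(1.0)
samples = [measure_rtt(sock, seq) for seq in range(10)]
print(f"min/avg/max RTT: {min(samples):.2f}/"
      f"{sum(samples) / len(samples):.2f}/{max(samples):.2f} ms")
```

A production sender would additionally use the reflector’s receive and transmit timestamps to separate forward-path from return-path delay.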

Five considerations when budgeting for enterprise storage
By Eric Herzog, Chief Marketing Officer at Infinidat.

Enterprise storage is fundamental to maintaining a strong enterprise data infrastructure. While storage has evolved over the years, the basic characteristics remain the same - performance, reliability, cost-effectiveness, flexibility, capacity, cyber resilience, and usability. The rule of thumb in enterprise storage is to look for faster, cheaper, easier and bigger capacity, but in a smaller footprint. So, when you’re reviewing what storage solutions to entrust your enterprise with, what are the factors to consider? What are the five key considerations that have risen to the top of enterprise storage buying decisions?

• Safeguard against cyber attacks, such as ransomware and malware, by increasing your enterprise’s cyber resilience and cyber recovery with automated cyber protection.
• Look to improve the performance of your enterprise storage infrastructure by up to 2.5x (or more), while simultaneously consolidating storage to save costs.
• Evaluate the optimal balance between your enterprise’s use of on-premises storage and the public cloud (e.g. Microsoft Azure or Amazon AWS).
• Extend cyber detection across your storage estate.
• Initiate a conversation about infrastructure consumption services that are platform-centric, automated, and optimised for hybrid, multi-cloud environments.

The leading edge of enterprise storage has already moved into the next generation of storage arrays for all-flash and hybrid configurations. With cybercrime expected to cost in excess of £7.3 trillion globally in 2024, according to Cybersecurity Ventures, the industry has also seen a rise in cybersecurity capabilities being built into primary and secondary storage. Seamless hybrid multi-cloud support is now readily available. And enterprises are taking advantage of Storage-as-a-Service (STaaS) offerings with confidence and peace of mind.

When you’re buying enterprise storage for a refresh or for consolidation, it’s best to seek out solutions that are built from the ground up with cyber resilience and cyber recovery technology intrinsic to your storage estate, optimised by a platform-native architecture for data services. In today’s world of continuous cyber threats, enterprises are substantially extending cyber storage resilience and recovery, as well as real-world application performance, beyond traditional boundaries.

We have also seen our customers value scale-up architectures, such as 60%, 80% and 100% populated models of software-defined storage arrays. This can be particularly pertinent with all-flash arrays that are aimed at specific latency-sensitive applications and workloads. Having the option to utilise a lifecycle management controller upgrade programme is also appealing when buying a next-generation storage solution; thinking ahead, this option can extend the life of your data infrastructure.

In addition, adopting next-gen storage solutions that facilitate a GreenIT approach puts your enterprise in a position to both save money (better economics) and reduce your carbon emissions (better for the environment) by using less power, less rack space, and less cooling. I call this the “E2” approach to enterprise storage: better economics and a better environment together in one solution. It helps to have faster storage devices with massive bandwidth and blistering I/O speeds.
Storage is not just about storage arrays anymore

Traditionally, it was commonly understood that if you needed more enterprise data storage capacity, you’d buy more storage arrays and throw them into your data centre. No more thought needed for storage, right? Well, not exactly. Not only has this piecemeal approach caused small-array storage ‘sprawl’ and complexity that can be exasperating for any IT team, but it doesn’t address the significant need to secure storage infrastructures or simplify IT operations.

Cyber storage resilience and recovery need to be a critical component of an enterprise’s overall cybersecurity strategy. You need to be sure that you can safeguard your data infrastructure with cyber capabilities, such as cyber detection, automated cyber protection, and near-instantaneous cyber recovery. These capabilities are key to neutralising the effects of cyber attacks. They could mean the difference between paying a ransom for data that has been taken ‘hostage’ and not paying any ransom at all. When you can execute rapid cyber recovery of a known good copy of your data, you can effectively combat the cybercriminals and beat them at their own sinister game.

One of the latest advancements in cyber resilience that you cannot afford to ignore is automated cyber protection, which helps you reduce the threat window for cyber attacks. With a strong automated cyber protection solution, you can seamlessly integrate your enterprise storage with your Security Operations Centre (SOC), Security Information and Event Management (SIEM), and Security Orchestration, Automation, and Response (SOAR) cyber security applications, as well as simple syslog functions for less complex environments. A security-related incident or event triggers immediate automated immutable snapshots of data, providing the ability to protect both block and file datasets. This is an extremely reliable way to ensure cyber recovery.

Another dimension of modern enterprise storage is seamless configuration of hybrid multi-cloud storage. The debate about whether an enterprise should put everything into the public cloud is over. There are very good use cases for the public cloud, but there continue to be very good use cases for on-prem storage, creating a hybrid multi-cloud environment that brings the greatest business and technical value to the organisation. You can now harness a powerful on-prem storage solution in a cloud-like experience across the entire storage infrastructure, as if the storage array you love on-premises were sitting in the public cloud. Whether you choose Microsoft Azure, Amazon AWS, or both, you can extend the data services usually associated with on-prem storage to the cloud, including ease of use, automation, and cyber storage resilience.

Purchasing new enterprise storage solutions is a journey. Isn’t it the best choice to get on the journey to the future of enterprise storage, cyber security, and hybrid multi-cloud? If you use these top five considerations as a guidepost, you end up in an infinitely better place for storage that transforms and transcends conventional thinking about data infrastructure.

Infinidat at DTX 2025

Eric Herzog is a guest speaker at DTX 2025 and will be discussing “The New Frontier of Enterprise Storage: Cyber Resilience & AI” on the Advanced Cyber Strategies Stage. Join him for unique insights on 3 April 2025, from 11.15-11.40am. DTX 2025 takes place on 2-3 April at Manchester Central. Infinidat will be located at booth #C81.
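As an illustration of the automated cyber protection pattern described above - a security event triggering an immediate immutable snapshot - here is a minimal Python sketch. It is a conceptual outline only, not Infinidat’s implementation: the keyword filter is simplistic and the snapshot call is a hypothetical placeholder, since real arrays expose vendor-specific snapshot APIs.

```python
import socketserver

# Conceptual sketch: listen for syslog events from a SIEM/SOC pipeline and
# trigger an immutable snapshot when a security incident is reported.

SECURITY_KEYWORDS = ("ransomware", "intrusion", "malware")

def take_immutable_snapshot(dataset: str) -> None:
    # Placeholder: a real integration would call the storage array's
    # snapshot API here with immutability/retention parameters.
    print(f"Triggered immutable snapshot of {dataset}")

class SyslogHandler(socketserver.BaseRequestHandler):
    def handle(self):
        # For UDP servers, self.request is a (data, socket) pair.
        message = self.request[0].decode("utf-8", errors="replace").lower()
        if any(keyword in message for keyword in SECURITY_KEYWORDS):
            take_immutable_snapshot("all-protected-datasets")

if __name__ == "__main__":
    # Syslog traditionally uses UDP port 514; binding it may need privileges.
    with socketserver.UDPServer(("0.0.0.0", 514), SyslogHandler) as server:
        server.serve_forever()
```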
For more from Infinidat, click here.

Ataccama to deliver end-to-end visibility into data flows
Ataccama, the data trust company, has launched Ataccama Lineage, a new module within its Ataccama ONE unified data trust platform (V16). Ataccama Lineage provides enterprise-wide visibility into data flows, offering organisations a clear view of their data’s journey from source to consumption. It helps teams trace data origins, resolve issues quickly, and ensure compliance - enhancing transparency and building confidence in data accuracy for business decision-making. Fully integrated with Ataccama’s data quality, observability, governance, and master data management capabilities, Ataccama Lineage enables organisations to make faster, more informed decisions, such as ensuring audit readiness and meeting regulatory compliance requirements.

Data challenges are increasingly complex and, according to the Ataccama Data Trust Report 2025, 41% of Chief Data Officers are struggling with fragmented and inconsistent systems. Despite significant investments in integrations, AI, and cloud applications, enterprise data often remains siloed or poor in quality. This fractured landscape obscures visibility into data transformations and flows, creating inefficiencies and operational silos. Ataccama believes that this lack of clarity hampers collaboration and increases the risk of non-compliance with regulations like GDPR, erodes customer trust, drains resources, and slows decision-making - ultimately stifling organisational growth.

Ataccama Lineage is designed to simplify how organisations manage and trust their data. Its AI-powered capabilities automatically map data flows and transformations, saving time and reducing manual effort. For example, tracking customer financial data across fragmented systems is a common struggle in financial services. Ataccama Lineage provides clear, visual maps that trace issues like missing or duplicate records to their source. It also tracks sensitive data, such as PII, with audit-ready documentation to ensure compliance. By delivering reliable, trustworthy data, Ataccama Lineage establishes a strong foundation for AI and analytics, enabling organisations to make informed decisions and achieve long-term success.

Isaac Gabay, Senior Director, Data Management & Operations at Lennar, says, “As one of the nation’s leading homebuilders, Lennar is continually evolving our data foundation with best-in-class, cost-effective solutions to drive efficiency and innovation. Ataccama ONE Lineage’s detailed, visual map of data flows enables us to monitor data quality, trace issues through our ecosystem, and take a proactive approach to prevent and remediate quality concerns while maintaining centralised control. Ataccama ONE Lineage will provide unparalleled visibility, enhancing transparency, data literacy, and trust in our data. This partnership strengthens our ability to scale with confidence, deliver accurate insights, and adapt to the evolving needs of the homebuilding industry.”

Jessie Smith, VP of Data Quality at Ataccama, comments, “Managing today’s data pipelines means dealing with increasing sources, diverse data types, and transformations that impact systems upstream and downstream. The rise of AI and generative AI has amplified complexity while expanding data estates, and stricter audits demand greater transparency. Understanding how information flows across systems is no longer optional; it’s essential. Ataccama Lineage is part of the Ataccama ONE data trust platform, which brings together data quality, lineage, observability and master data management into a unified solution for enterprise companies.”

Key benefits of AI-powered Ataccama Lineage include:

- Faster resolution of data quality issues: Advanced anomaly detection identifies issues like missing records, unexpected values, or duplicates caused by transformation errors. For example, in retail operations with multiple sales channels, mismatched pricing or inventory discrepancies can disrupt business. Ataccama Lineage enables teams to quickly pinpoint root causes, assess downstream impacts, and resolve issues before they affect operations - ensuring continuity and reliability.

- Simplified compliance: Data classification and anomaly detection enhance visibility into sensitive data, such as PII, and track its transformations. Financial organisations benefit from audit-ready documentation that ensures PII is properly traced to authorised locations, reducing regulatory risks, meeting data privacy requirements, and fostering customer trust with transparent processes.

- Comprehensive visibility into data flows: Lineage maps provide a detailed, enterprise-wide view of data flows, from origin to dashboards and reports. Teams in sectors like manufacturing can analyse the lineage of key metrics, such as production efficiency or supply chain performance, identifying dependencies across ETL jobs, on-premises systems, and cloud platforms. Enhanced filtering focuses efforts on critical datasets, allowing faster issue resolution and better decision-making.

- Streamlined data modernisation efforts: During cloud migrations, Ataccama Lineage reduces risks by mapping redundant pipelines, dependencies, and critical datasets. Insurance companies transitioning to modern platforms can retire outdated systems and migrate only essential data, minimising disruption while maintaining compliance with regulations like Solvency II.

For more from Ataccama, click here.
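The root-cause tracing that lineage enables can be pictured with a toy example. The sketch below is not Ataccama’s implementation - the table names are invented - but it shows the underlying idea: model data flows as a directed graph and walk upstream from a failing report to enumerate candidate sources.

```python
from collections import defaultdict

# Toy lineage graph: (source, target) pairs meaning data flows source -> target.
edges = [
    ("crm_db", "customer_staging"),
    ("erp_db", "customer_staging"),
    ("customer_staging", "customer_mart"),
    ("customer_mart", "revenue_dashboard"),
]

upstream = defaultdict(list)
for src, dst in edges:
    upstream[dst].append(src)

def trace_upstream(node, seen=None):
    """Return every asset that feeds into `node`, directly or indirectly."""
    if seen is None:
        seen = set()
    for parent in upstream[node]:
        if parent not in seen:
            seen.add(parent)
            trace_upstream(parent, seen)
    return seen

# If the dashboard shows duplicate records, these are the assets to inspect:
print(trace_upstream("revenue_dashboard"))
# e.g. {'customer_mart', 'customer_staging', 'crm_db', 'erp_db'}
```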

Tellus delivers key component for collaborative data economy
It has been revealed that the Gaia-X development project, Tellus, has successfully completed its implementation phase. Led by the Internet Exchange operator DE-CIX, the consortium has developed a prototype interconnection infrastructure that provides fully automatic and virtual access to networks for sensitive, real-time applications across distributed cloud environments. Tellus covers the entire supply chain of interconnection services and integrates offerings from various providers based on the decentralised and distributed data infrastructure of Gaia-X. This makes Tellus a key component for the comprehensive connectivity required by intelligent business models in a collaborative data economy.

Delivering networks and services according to application demands

In the past, implementing business-critical applications in distributed IT systems required purchasing all necessary components, services, and functions separately from different providers and manually combining them in a time-consuming and costly process - without end-to-end guarantees. Tellus’ open-source software not only automates these processes but also ensures specific connectivity requirements are met.

During the final phase, the project team implemented a controller and a service registry, which function as the central elements of a super-node architecture. The controller coordinates and provisions service offers and orders via application programming interfaces (APIs). The service registry stores and lists all services that the controller can search through, address, and combine. The search process runs via the controller into the registry and its associated graph database, which then delivers suitable solutions. Finally, the controller commissions the interconnection infrastructure to provision network and cloud services that meet the requirements of the respective application, including guaranteed performance and Gaia-X compliance.

Deployable prototype: Reliable and dynamic connectivity for data exchange

In the implemented proof-of-concept (PoC) demo, virtual networks and services can be provisioned via a user-friendly interface to meet the requirements of industrial applications; for example, transmitting hand movements to a robot in real time via a smart glove. The same applies to delivering connectivity for a digital twin from IONOS in the manner required by production plants, to simulate, monitor in real time, and optimise manufacturing steps. It applies equally to TRUMPF’s fully automatic laser cutting tools, where reliable and dynamic networks keep systems available and pay-per-part business models productive.

Milestone for a secure, sovereign, and collaborative data economy

“Since Tellus registers the products of all participants in a standardised way and stores the network nodes in a structured manner in a graph database, interconnection services can be composed end-to-end via a weighted path search,” says Christoph Dietzel, Head of Product & Research at DE-CIX. “With the successful completion of the implementation phase and the proof-of-concept demo, we have not only demonstrated the technical feasibility of our Gaia-X compliant interconnection infrastructure, but have also set an important milestone for the future of secure, sovereign, and collaborative data processing.”

For more from DE-CIX, click here.
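The ‘weighted path search’ Christoph mentions can be illustrated in a few lines of Python. The sketch below is a simplified stand-in for the real system - Tellus queries a graph database of registered services, and the nodes and costs here are invented - but it shows how a controller could compose the cheapest end-to-end service chain using Dijkstra’s algorithm.

```python
import heapq

# Toy service graph: node -> [(neighbour, cost)]; costs are hypothetical
# weights (price, latency, etc.) attached to registered service offers.
graph = {
    "enterprise": [("ix_frankfurt", 2), ("carrier_a", 5)],
    "ix_frankfurt": [("cloud_region", 3)],
    "carrier_a": [("cloud_region", 4)],
    "cloud_region": [],
}

def cheapest_path(start, goal):
    """Return (total_cost, path) for the lowest-cost service chain."""
    queue = [(0, start, [start])]
    visited = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in visited:
            continue
        visited.add(node)
        for neighbour, weight in graph[node]:
            heapq.heappush(queue, (cost + weight, neighbour, path + [neighbour]))
    return None  # no end-to-end composition exists

print(cheapest_path("enterprise", "cloud_region"))
# (5, ['enterprise', 'ix_frankfurt', 'cloud_region'])
```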

Industry experts comment on Data Privacy Day
With today (28 January) marking Data Privacy Day - an annual event seeking to raise awareness and promote privacy and data protection best practices - industry experts have marked the occasion by presenting a range of views on the latest trends and challenges that have arisen since last year's occasion.

- Dr Ellison Anne Williams, Founder and CEO of Enveil, comments, “Data Privacy Day serves as a crucial reminder to safeguard sensitive information in an era where data dominates. As we navigate an increasingly interconnected world and transformative technologies such as AI grow their foothold in the digital economy, finding ways to protect data privacy and mitigate risk will be essential.

“Privacy Enhancing Technologies (PETs) enable, enhance, and preserve data privacy throughout its lifecycle, securing data usage and allowing users to capitalise on the power of AI without sacrificing privacy or security. Organisations that truly prioritise data will incorporate PETs as a foundational, business-enabling tool that will fortify data-driven capabilities and enable data to be leveraged securely across silos and boundaries.

“This year’s Data Privacy Day theme is ‘Take control of your data’, but that sentiment should not be limited to our personal data footprint. Businesses need to be proactive in their approach to data protection and commit to a future where PETs are woven into the very fabric of digital strategy. This will empower users to responsibly and securely harness innovative tools, such as AI and Machine Learning, in line with global regulations and compliance requirements.”

- Edwin Weijdema, Field CTO EMEA & Cybersecurity Lead at Veeam, adds, “This year, Data Privacy Day seems a little different. With significant cyber security regulations coming into force around the world, most notably NIS2 and DORA, it feels like a lot has changed since we marked this day just 12 months ago.

“And it has. We’ve seen corporate accountability given increasing weight when it comes to data resilience thanks to NIS2. It’s no longer a case of passing the buck - responsibility ultimately sits with the C-suite. Simultaneously, data resilience is shifting from a ‘cyber security requirement’ to a tangible business differentiator. At the moment, breaches and ransomware are still a ‘when’, not an ‘if’ - and I don’t see this changing. As C-suites become ever more aware, they’ll be demanding to see evidence of their organisation's data resilience, from their internal teams and any third-party partners.

“Data Privacy Day is a good chance to reflect on how much can change in a year. After all, organisations can’t rely on markers like this to nudge them on the importance of data resilience - it needs to be a priority 365 days a year.”

- James Blake, VP Global Cyber Resiliency Strategy at Cohesity, comments, “On Data Privacy Day, it's crucial to recognise that focusing solely on compliance will only lead to companies tying themselves in knots reacting to the swarm of active or planned regulatory requirements, as well as data legislation coming into force across multiple national and state jurisdictions. If we look at Germany alone as an example, there are 17 state laws on top of national and EU requirements. The most effective way to ensure data privacy compliance is by building robust and repeatable operational capabilities. This involves programmatically conducting comprehensive data audits to identify, categorise, and secure sensitive information.

“Implementing robust encryption protocols, including migrating to encryption methods resilient to emerging quantum computing attacks, is essential. Additionally, consider working with technology companies that offer immutable data storage, which can provide an extra layer of security, ensuring data cannot be altered or deleted and thus protecting against ransomware attacks, data breaches, and the unnecessary financial loss accrued because of downtime. Appointing security champions in each business unit to educate their peers on tailored data privacy processes based on data classification levels is an important step. By embedding these practices, compliance with varying regulatory requirements will naturally follow.”

- Adrianus Warmenhoven, a cyber security expert at NordVPN, comments, “As debates continue over whether data, oil, or land holds the greatest value, in cyber security, the answer is unequivocal: data. Personal data, unlike physical assets, can be copied, stolen, or sold without leaving visible traces, creating significant financial and reputational risks.

“Apps are a major culprit, often exposing sensitive information through excessive permissions, missed updates, or unauthorised data sharing. Keeping software current is not just a personal safeguard; it also helps protect your network of contacts from phishing attacks through outdated systems. The good news is that while it may seem like an uphill battle to get on top of your data privacy, it’s never been easier to manage how much you share.”

To protect people’s privacy on apps, Adrianus offers these preventive measures:

- Always download apps from official stores - unofficial stores may not vet an app's safety before making it available to download, increasing the risk of modification by criminals.
- Familiarise yourself with the data permissions required by apps - head to your settings and review and adjust these permissions as necessary, particularly sensitive ones like access to your camera, microphone, storage, location, and contact list.
- Before downloading any app, read its privacy policy - understand what information it will track and share with third parties. If the privacy level is unsatisfactory, consider an alternative. You can usually find the policy in the app's description on your mobile device's app store.
- Limit location access to only while the app is in use - it is difficult to justify why some apps need to know your location at all times, so do not give it to them.
- Avoid using social media accounts to log in - doing so can allow unnecessary data exchange.
- Delete any apps you no longer use - this helps to prevent them from collecting data in the background.

For more on data privacy, click here.

Progress Data Cloud platform launched
Progress, a provider of AI-powered digital experiences and infrastructure software, has announced the launch of Progress Data Cloud, a managed Data Platform as a Service designed to simplify enterprise data and artificial intelligence (AI) operations in the cloud. With Progress Data Cloud, customers can accelerate their digital transformation and AI initiatives while reducing operational complexity and IT overhead.

As global businesses scale their data operations and embrace AI, a robust cloud data strategy has become the cornerstone of success, enabling organisations to harness the full potential of their data for innovation and growth. Progress Data Cloud meets this need by providing a unified, secure and scalable platform to build, manage and deploy data architectures and AI projects without the burden of managing IT infrastructure.

“Organisations increasingly recognise that cloud and AI are pivotal to unlocking business value at scale,” says John Ainsworth, GM and EVP, Application and Data Platform, Progress. “Progress Data Cloud empowers companies to achieve this by offering a seamless, end-to-end experience for data and AI operations, removing the barriers of infrastructure complexity while delivering exceptional performance, security and predictability.”

Key features and benefits

Progress Data Cloud is a Data Platform as a Service that enables managed hosting of feature-complete instances of Progress Semaphore and Progress MarkLogic, with plans to support additional Progress products in the future. Core benefits include:

• Simplified operations: Eliminates infrastructure complexity with always-on infrastructure management, monitoring services, continuous security scanning and automated product upgrades.
• Cost efficiency: Reduces IT costs and bottlenecks with predictable pricing, resource usage transparency and no egress fees.
• Enhanced security: Helps harden security posture with an enterprise-grade security model that is SOC 2 Type 1 compliant.
• Scalability and performance: Offers superior availability and reliability, supporting mission-critical business operations, GenAI demands and large-scale analytics.
• Streamlined user management: Self-service access controls and tenancy management provide better visibility and customisation.

Progress Data Cloud accelerates time to production by offering managed hosting for the Progress MarkLogic Server database and the Progress MarkLogic Data Hub solution with full feature parity. Customers can benefit from enhanced scalability, security and seamless deployment options. Replacing Semaphore Cloud, Progress Data Cloud provides a next-generation cloud platform with all existing Semaphore functionality plus new features for improved performance, security, reliability, user management and SharePoint Online integration.

“As enterprises continue to invest in digital transformation and AI strategies, the need for robust, scalable and secure data platforms becomes increasingly evident,” says Stewart Bond, Vice President, Data Intelligence and Integration Software, IDC. “Progress Data Cloud addresses a critical market need by simplifying data operations and accelerating the development of AI-powered solutions. Its capabilities, from seamless infrastructure management to enterprise-grade security, position it as a compelling choice for organisations looking to unlock the full potential of their data to drive innovation and business value.”

Progress Data Cloud provides cloud-based hosting of the foundational products that make up the Progress Data Platform portfolio. It is now available for existing and new customers of the MarkLogic and Semaphore platforms.

Healthcare organisation reduces storage costs with DS3 platform
Cubbit, a geo-distributed cloud storage expert, has announced that ASL CN1 Cuneo, a North Italian healthcare public service organisation, has reduced its storage costs by 50% thanks to Cubbit’s fully-managed S3 cloud object storage, DS3. ASL CN1 Cuneo now stores all of its 110 TB of backup data on Cubbit as part of its 3-2-1-1-0 backup strategy orchestrated via Veeam. DS3 delivers exceptional resilience against client- and server-side ransomware attacks and disasters, ensuring top-tier security (NIS2 standard), GDPR compliance, and adherence to regional public sector regulations, while allowing the organisation to choose the exact geographical perimeter of data storage. By adopting Cubbit, ASL CN1 Cuneo has avoided the hidden costs typically associated with S3 - such as egress fees, API call charges, deletion fees, and bucket replication fees.

ASL CN1 Cuneo manages healthcare services across 173 municipalities and employs over 3,500 staff members. As most of its data (80%) is health-related, it is classified as ‘critical’ by the Italian National Cybersecurity Agency (ACN). Thus, compliance with stringent GDPR and NIS2 data sovereignty and security guidelines and ACN certification (an Italian public sector requirement) was non-negotiable.

Prior to selecting Cubbit, ASL CN1 Cuneo had considered various other storage platforms. The healthcare organisation previously relied on hyperscaler services, but found that egress costs and API call fees were inflating expenses. On-premises solutions offered control and compliance, but carried high upfront costs, demanded heavy IT resources, and proved challenging to maintain - difficulties especially significant for a public healthcare entity with limited IT staff and budget.

Since adopting Cubbit’s technology, ASL CN1 Cuneo has reaped the benefits of an S3 cloud object storage that meets national and European sovereignty requirements, keeps data within Italian borders, and ensures full regulatory compliance. With Cubbit’s fully-managed object storage, fixed storage costs include all the main S3 APIs, together with the geo-distribution capacity, enabling ASL CN1 Cuneo to save 50% on its previous storage costs for equivalent configurations, while enhancing data resilience and security. Additionally, achieving the comprehensive security and compliance standards enabled by Cubbit’s DS3 solution helps mitigate the risk of non-compliance fines under GDPR and NIS2, which can reach up to €10m (£8.5m) or 2% of global annual revenue, whichever is higher. The cost efficiencies enabled by Cubbit allow ASL CN1 Cuneo to reinvest savings into its core mission of delivering quality healthcare services.

“Finding a storage solution that met our strict compliance needs, elevated our security to NIS2 standards, and cut costs was no easy task,” says Andrea Saglietti, Head of Innovation and Information Security at ASL CN1 Cuneo. “We’ve used US-based cloud storage providers for a long time, but they didn’t offer the sovereignty, resilience, or economic advantages that can be achieved with Cubbit. This has enabled us to generate 50% savings on previous costs for the equivalent configuration. The speed of deployment and ease of use make Cubbit’s DS3 far more manageable than complex on-prem systems, while maintaining sovereignty and giving us full control over our data. Today, we have greater peace of mind knowing our data is stored securely, compliantly, and cost-effectively.”

Alessandro Cillario, Co-CEO and Co-founder of Cubbit, adds, “Healthcare organisations in Europe must navigate a dense framework of regulatory requirements while grappling with surging data volumes and sophisticated cyber threats. With Cubbit, ASL CN1 Cuneo can ensure that its critical healthcare data is safeguarded, compliant, and cost-efficient - without the unpredictability of hidden fees or the burdens of on-prem infrastructure. We’re proud to support ASL CN1 Cuneo and European healthcare and public sector organisations in evolving their storage strategy.”

For more from Cubbit, click here.
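For readers unfamiliar with the 3-2-1-1-0 rule referenced above: it calls for three copies of the data, on two different media, with one copy offsite, one copy offline or immutable, and zero backup verification errors. The Python sketch below - with invented values, purely for illustration - shows how such a policy can be checked programmatically.

```python
# Illustrative checker for the 3-2-1-1-0 backup rule. The copies listed
# here are hypothetical; a real check would query the backup software.

backup_copies = [
    {"medium": "disk",  "offsite": False, "immutable": False, "errors": 0},
    {"medium": "disk",  "offsite": False, "immutable": False, "errors": 0},
    {"medium": "cloud", "offsite": True,  "immutable": True,  "errors": 0},
]

def satisfies_3_2_1_1_0(copies):
    return (
        len(copies) >= 3                                  # 3 copies of the data
        and len({c["medium"] for c in copies}) >= 2       # on 2 different media
        and any(c["offsite"] for c in copies)             # 1 copy offsite
        and any(c["immutable"] for c in copies)           # 1 offline/immutable
        and all(c["errors"] == 0 for c in copies)         # 0 verification errors
    )

print(satisfies_3_2_1_1_0(backup_copies))  # True
```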


