Monday, March 10, 2025

Data


AI security and data availability to underpin 2025 tech trends
AI has continued to be transformative throughout 2024, with accelerating adoption by enterprises and a growing number of use cases. According to experts from data platform Nasuni, the AI boom will continue in 2025, but will be defined by three key pillars:

1. 2025 will bring a new era of security maturity - the ability to protect and quickly recover data assets underpins every other business process in an AI-first world.
2. Data readiness will be central to AI success - as we look toward 2025, data will no longer just support AI; it will shape and limit the scope of what AI can achieve.
3. Enterprises will strive to find the real ROI in AI - 2025 will usher in a more measured approach to AI investment, as organisations become increasingly focused on quantifiable ROI.

Discussing these predictions, Russ Kennedy, Chief Evangelist at Nasuni, says, “In 2025, data will be more valuable than ever as enterprises leverage AI to power their operations. However, as data’s value grows, so does its appeal to increasingly sophisticated threat actors. This new reality will continue driving organisations to rethink their security frameworks, making data protection and rapid recovery the backbone of any AI strategy. Attackers are evolving, using AI to create more insidious methods, like embedding corrupted models and targeting AI frameworks directly, which makes rapid data recovery as vital as data protection itself.

“Businesses will need to deploy rigorous measures not only to prevent attacks, but to ensure that if the worst happens, they can quickly restore their AI-driven processes. 2025 will bring a new era of security maturity, one where the ability to protect and quickly recover data assets underpins every other business process in an AI-first world.”

Jim Liddle, Chief Innovation Officer Data Intelligence and AI at Nasuni, comments, “As we look toward 2025, data will no longer just support AI – it will shape and limit the scope of what AI can achieve. A robust data management strategy will be essential, especially as AI continues advancing into unstructured data. For years, companies have successfully leveraged structured data for insights, but unstructured data – such as documents, images, and embedded files – has remained largely untapped. The continued advancements in AI’s ability to process the different types of unstructured data that reside within an enterprise are exciting, but they also require organisations to know what data they have and how and where it’s being used.

“2025 will mark the era of ‘data readiness’ for AI. Companies that strategically curate and manage their data assets will see the most AI-driven value, while those lacking a clear data strategy may struggle to move beyond the basics. A data-ready strategy is the first step for any enterprise looking to maximise AI’s full potential in the coming years.”

Nick Burling, Senior Vice President, Product at Nasuni, adds, “2025 will usher in a more measured approach to AI investment, as organisations will be increasingly focused on quantifiable ROI. While AI can deliver immense value, its high operational costs and resource demands mean that companies need to be more selective with their AI projects. Many enterprises will find that running data-heavy applications, especially at scale, requires not just investment but careful cost management. Edge data management will be a critical component, helping businesses to optimise data flow and control expenses associated with AI.
“For organisations keen on balancing innovation with budgetary constraints, cost efficiency will drive AI adoption. Enterprises will focus on using AI strategically, ensuring that every AI initiative is justified by clear, measurable returns. In 2025, we’ll see businesses embrace AI not only for its transformative potential, but for how effectively it can deliver sustained, tangible value in an environment where budgets continue to be tightly scrutinised.” For more from Nasuni, click here.

Infinidat introduces RAG workflow deployment architecture
Infinidat, a provider of enterprise storage solutions, has introduced new Retrieval-Augmented Generation (RAG) workflow deployment architecture to enable enterprises to fully leverage generative AI (GenAI). The company states that this dramatically improves the accuracy and relevancy of AI models with up-to-date, private data from multiple company data sources, including unstructured data and structured data, such as databases, from existing Infinidat platforms.

With Infinidat’s RAG architecture, enterprises utilise Infinidat’s existing InfiniBox and InfiniBox SSA enterprise storage systems as the basis to optimise the output of AI models, without the need to purchase any specialised equipment. Infinidat also provides the flexibility of using RAG in a hybrid multi-cloud environment, with InfuzeOS Cloud Edition, making the storage infrastructure a strategic asset for unlocking the business value of GenAI applications for enterprises.

“Infinidat will play a critical role in RAG deployments, leveraging data on InfiniBox enterprise storage solutions, which are perfectly suited for retrieval-based AI workloads,” says Eric Herzog, CMO at Infinidat. “Vector databases that are central to obtaining the information to increase the accuracy of GenAI models run extremely well in Infinidat’s storage environment. Our customers can deploy RAG on their existing storage infrastructure, taking advantage of the InfiniBox system’s high performance, low latency, and unique Neural Cache technology, enabling delivery of rapid and highly accurate responses for GenAI workloads.”

RAG augments AI models using relevant and private data retrieved from an enterprise’s vector databases. Vector databases are offered by a number of vendors, such as Oracle, PostgreSQL, MongoDB and DataStax Enterprise, and are used during the AI inference process that follows AI training. As part of a GenAI framework, RAG enables enterprises to auto-generate more accurate, more informed and more reliable responses to user queries. It enables AI learning models, such as a Large Language Model (LLM) or a Small Language Model (SLM), to reference information and knowledge beyond the data on which they were trained. It not only customises general models with a business’ most up-to-date information, but also eliminates the need to continually re-train AI models, which is resource intensive.

“Infinidat is positioning itself the right way as an enabler of RAG inferencing in the GenAI space,” adds Marc Staimer, President of Dragon Slayer Consulting. “Retrieval-augmented generation is a high-value proposition area for an enterprise storage solution provider that delivers high levels of performance, 100% guaranteed availability, scalability, and cyber resilience that readily apply to LLM RAG inferencing. With RAG inferencing being part of almost every enterprise AI project, the opportunity for Infinidat to expand its impact in the enterprise market with its highly targeted RAG reference architecture is significant.”

Stan Wysocki, President at Mark III Systems, remarks, “Infinidat is bringing enterprise storage and GenAI together in a very important way by providing a RAG architecture that will enhance the accuracy of AI. It makes perfect sense to apply this retrieval-augmented generation for AI to where data is actually stored in an organisation’s data infrastructure.
This is a great example of how Infinidat is propelling enterprise storage into an exciting AI-enhanced future.”

Inaccurate or misleading results from a GenAI model, referred to as 'AI hallucinations', are a common problem that has held back the adoption and broad deployment of AI within enterprises. An AI hallucination may present inaccurate information as 'fact', cite non-existent data, or provide false attribution – all of which tarnish AI and expose a gap that calls for the continual refinement of data queries. A focus on AI models without a RAG strategy tends to rely on a large amount of publicly available data, while under-utilising an enterprise’s own proprietary data assets.

To address this major challenge in GenAI, Infinidat is making its architecture available for enterprises to continuously refine a RAG pipeline with new data, thereby reducing the risk of AI hallucinations. By enhancing the accuracy of AI model-driven insights, Infinidat is helping to advance the fulfilment of the promise of GenAI for enterprises. Infinidat’s solution can encompass any number of InfiniBox platforms and enables extensibility to third-party storage solutions via file-based protocols such as NFS.

In addition, to simplify and accelerate the rollout of RAG for enterprises, Infinidat integrates with the cloud providers, using its InfuzeOS Cloud Edition for AWS and Azure to make RAG work in a hybrid cloud configuration. This complements the work that hyperscalers are doing to build out LLMs on a larger scale to do the initial training of the AI models. The combination of AI models and RAG is a key component for defining the future of generative AI.

For more from Infinidat, click here.
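To make the retrieval step concrete, below is a minimal, vendor-neutral sketch of the RAG query flow described above. The toy embedding function, in-memory index, and prompt assembly are illustrative stand-ins for a real embedding model, vector database, and LLM call; none of this reflects Infinidat's actual implementation.

```python
# Minimal RAG sketch: retrieve the most relevant private documents for a
# query, then hand them to a language model as grounding context.
# The embedding function and the final "answer" step are toy placeholders;
# a real deployment would use a proper embedding model and an LLM API.
import hashlib
import numpy as np

def embed(text: str, dim: int = 256) -> np.ndarray:
    """Toy bag-of-words hashing embedding (placeholder for a real model)."""
    vec = np.zeros(dim)
    for token in text.lower().split():
        idx = int(hashlib.md5(token.encode()).hexdigest(), 16) % dim
        vec[idx] += 1.0
    norm = np.linalg.norm(vec)
    return vec / norm if norm else vec

# Enterprise documents that live on existing storage (e.g. file shares).
documents = [
    "Q3 revenue grew 12% driven by the new backup product line.",
    "The incident runbook requires restoring snapshots within 15 minutes.",
    "Employee travel policy: book flights at least 14 days in advance.",
]
index = np.stack([embed(d) for d in documents])  # stand-in for a vector DB

def retrieve(query: str, k: int = 2) -> list[str]:
    """Return the k documents most similar to the query (cosine similarity)."""
    scores = index @ embed(query)
    return [documents[i] for i in np.argsort(scores)[::-1][:k]]

def answer(query: str) -> str:
    context = "\n".join(retrieve(query))
    # A real system would send this prompt to an LLM; here we just return it.
    return f"Answer using ONLY this context:\n{context}\n\nQuestion: {query}"

print(answer("How quickly must snapshots be restored?"))
```

Because retrieval happens at inference time, refreshing the indexed documents updates the model's answers immediately, which is why RAG avoids the continual re-training the article mentions.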

Keepit scoops accolades at Infosec Innovator Awards
Keepit, a global provider of a comprehensive cloud backup and recovery platform, triumphed in four categories at the Top Infosec Innovator 2024 Awards, which took place at the recently held CyberDefenseCon in Orlando, Florida. Keepit was named the winner in the following categories: Cutting Edge Cloud Backup; Most Innovative Cyber Resilience; Hot Company Data Security Platform; and Hot Company Ransomware Protection of SaaS Data.

Headquartered in Copenhagen, Denmark, with offices and data centres globally, Keepit future-proofs cloud data for organisations, ensuring business continuity and access to information.

Michele Hayes, CMO at Keepit, comments, “Ransomware is only one of the many threats companies face in today’s cyber security landscape. Keepit provides the tools for companies to be secure and confident in their disaster recovery plans: a platform that enables rapid recovery and data monitoring for early anomaly detection. We’re thrilled that Cyber Defense Magazine has recognised our platform for these coveted awards.”

For more from Keepit, click here.

Survey highlights concerns over SaaS data protection
Keepit, a vendor-independent cloud-native data protection platform, has released results from a recent Software as a Service (SaaS) survey. As SaaS applications become critical components of modern business operations, the survey, conducted by Gatepoint Research for Keepit, reveals a troubling gap in confidence among executives regarding the protection of their SaaS data.

The SaaS data protection confidence survey, which gathered responses from 100 senior decision-makers across industries such as finance, healthcare, technology, and manufacturing, shows that while businesses increasingly rely on SaaS tools, many leaders are not fully confident in their ability to safeguard their data. According to the survey, while 28% of respondents expressed high confidence in their data protection measures, a significant 31% reported moderate to severe lapses in their data protection. This lack of confidence is alarming as the use of SaaS applications continues to grow, with critical data stored in applications like Microsoft 365, Salesforce, and Power BI.

“Moderate confidence in SaaS data protection is not enough in today’s threat landscape,” says Paul Robichaux, Senior Product Director of Keepit and Microsoft MVP. “Organisations must ensure their data recovery processes are robust and regularly tested. Otherwise, they risk discovering weaknesses too late, when a disaster has already struck and they’re trying to recover.”

The survey reveals that 50% of respondents cite increased compliance requirements as their top challenge, with growing data volumes and the complexities of managing SaaS data also ranking high. As global regulations like NIS2 and DORA become more stringent, organisations are under pressure to ensure their SaaS data is adequately protected and compliant with these evolving mandates.

Paul adds, “In the financial industry, for example, DORA requires that backup environments be segregated from production environments to reduce risk. And we know that many organisations aren’t well-prepared to meet these requirements. The rising volume of data, combined with increasingly complex regulations, presents a significant challenge for many organisations.”

The survey also highlights the financial and reputational risks associated with data loss. 57% of respondents identified brand and reputation damage as the most significant business impact of data loss, followed closely by financial consequences and regulatory compliance violations.

“Customer data is among the most valuable assets an organisation holds,” Paul notes. “Losing access to that data, whether through ransomware or accidental deletion, can have devastating financial and reputational consequences. Organisations need to take a proactive approach to ensure their SaaS data is protected.”

While 58% of respondents reported using Microsoft to back up their SaaS data, there is a disconnect between perception and reality. Many executives mistakenly believe their data is fully protected by native SaaS backup features. However, shared responsibility models mean that SaaS providers are not accountable for customers’ data backup, leaving a critical gap in protection.

“Only 15% of respondents consider backing up directory and identity services like Entra ID to be crucial, even though losing access to these services could cripple business operations,” Paul explains. “This shows a need for better education around SaaS data protection.”
When asked about the roadblocks to improving their data protection strategies, 56% of respondents cited budget constraints, while 33% noted a lack of expertise and resources. Many organisations also face the challenge of managing multiple data backup vendors, further complicating their efforts.

The survey will be a key focus of an upcoming webinar titled Protecting Your SaaS Data – Pitfalls and Challenges to Overcome, scheduled for 17 October 2024. This event will provide industry professionals with actionable insights on how to bolster their SaaS data protection strategies and ensure compliance with evolving global regulations. Attendees can also participate in a live Q&A session with industry experts and take a benchmark test to see how their organisation stacks up. To register for the webinar, click here. For more from Keepit, click here.

Evocative launches with Megaport in Boston and Seattle data centres
Evocative, a global provider of Internet infrastructure, has announced that Megaport services are now available in its Boston and Seattle data centres. Megaport, a cloud interconnectivity service provider, adds an additional 300+ on-ramps and 360+ service providers to enable seamless cloud access in Evocative’s facilities. This launch follows the recent partnership announcement between the companies, aimed at expanding access to a diverse array of cloud services.

The integration of Megaport at Evocative’s data centres allows Evocative customers to seamlessly connect to both public and private clouds, providing them with the flexibility and scalability needed in today’s evolving digital landscape. The ability to dynamically scale bandwidth on demand means that companies can adapt to changing market conditions, ensuring they remain competitive and innovative. Megaport services will be available at additional Evocative locations in the coming weeks, in Atlanta, Dallas, Los Angeles, New York/New Jersey, Phoenix, and Silicon Valley.

“The launch of Megaport in our data centres marks a significant milestone in our commitment to delivering exceptional connectivity solutions,” says Derek Garnier, CEO at Evocative. “This partnership not only enhances our service offerings but empowers our customers to optimise their cloud strategies effectively. We believe that by providing direct access to more cloud providers, we are enhancing our clients’ ability to reach new levels of performance. Our goal is to support their growth by offering the tools they need.”

“By integrating Megaport’s capabilities with Evocative’s robust infrastructure across its extensive data centre network, we are creating a powerful ecosystem enabling enterprises to harness the full potential of cloud technologies,” says Michael Reid, CEO at Megaport. “The Evocative/Megaport partnership is a testament to our shared vision of enabling businesses to accelerate their digital transformation journeys.”

With Megaport's services now live in Evocative’s facilities, businesses can take advantage of additional connectivity options that facilitate growth and drive operational efficiency. Customers can pair leading cloud access with Evocative Metal, an enterprise bare metal service, to create a comprehensive solution that addresses all their digital infrastructure requirements.

Infinidat launches cyber security awareness campaign
Infinidat, a provider of enterprise storage solutions, marked the beginning of Cybersecurity Awareness Month by kicking off a campaign to raise awareness about the critical need for enterprises to increase their cyber resilience with next-generation data protection and recovery capabilities in the battle against cyberattacks. Throughout the month of October, Infinidat will be contributing to awareness-building efforts across its social media channels about the emergence of cyber resilient storage as the last line of defence against ransomware and malware.

“As we embark into Cybersecurity Awareness Month, we’re excited to help enterprises better understand how to incorporate a cyber-centric, recovery-focused strategy with our InfiniSafe capabilities into their overall cybersecurity approach,” says Eric Herzog, CMO at Infinidat. “Cyber attacks have evolved to increasingly target enterprise storage infrastructure. However, the combination of cyber resilience and cyber security closes the gap and vastly improves the ability to mitigate the impact of cyber attacks, especially ransomware. Broader awareness of best practices in cyber resilience and cyber recovery will be one of the crowning achievements of this month dedicated to cyber security.”

Protecting data is one of the most critical tasks an IT team must perform in their data centre today, and expectations for restoring and backing up data at multi-petabyte scale have changed. IT teams need to increase next-generation data protection capabilities, with data integrity and high reliability at 100% availability, which Infinidat provides. Best practices require an enterprise to ensure data validity and near-instantaneous recovery of primary storage and backup repositories, regardless of size. This accelerates digital disaster recovery when a cyberattack happens.

Krista Macomber, Research Director, Cybersecurity at The Futurum Group, comments, “Cyber security is established as a board-level priority. Given that, it is the data that attackers are after. CIOs and CISOs have begun to critically evaluate the cyber resilience of their organisation's enterprise storage implementations. With this in mind, the need for cyber resilience has established new table-stakes criteria within the storage infrastructure. Strategic planning for capabilities, like Infinidat's InfiniSafe Automated Cyber Protection that helps to mitigate data loss and downtime resulting from a cyber incident, has become critical.”

Bob Elliott, VP Strategic Alliances at Mainline Information Systems, adds, “We’re seeing a growing focus on cyber resilience and rapid recovery in enterprise data infrastructure, especially against threats like ransomware. Adopting a recovery-first strategy helps protect businesses from massive cyber attacks. As IT leaders recognise the importance of next-gen data protection, we expect increased adoption of these solutions. In today’s security-driven landscape, boosting cyber resilience is essential for safeguarding storage systems.”

Core pillars of next-generation data protection in a cyber-first architecture include immutable snapshots, logical air-gapping, a fenced forensic environment, and near-instantaneous cyber recovery. These dimensions of cyber resilience are available within Infinidat’s core storage operating system. Moreover, the cyber resilient capabilities that complement, utilise, extend and enable these pillars include cyber detection and automated cyber protection.
Infinidat’s InfiniSafe suite provides extensive cyber resilience capabilities, including InfiniSafe Cyber Detection and InfiniSafe Automated Cyber Protection (ACP), alongside all the core pillars of next-generation data protection. InfiniSafe provides secure, end-to-end capabilities that orchestrate with existing security solutions to detect, contain, mitigate and recover from a cyber attack. For more from Infinidat, click here.

Tackling bad data at source is key, STX Next claims
Despite the importance of quality assurance in ensuring data projects are accurate from conception to deployment, this is a process that many tech companies struggle to perfect. The 2024 State of Testing Report revealed that test cases are not well written and maintained in 60% of organisations, highlighting the challenge tech leaders face in delivering a seamless testing phase.

According to Maksymilian Jaworski, Data Engineer at IT consulting company STX Next, detecting and addressing inaccuracies as soon as they arise minimises the risk of propagating errors, simultaneously reducing the cost and effort required to fix them. Maksymilian takes up the story:

“In data engineering, the principle of ‘validate early, validate often’ emphasises the importance of integrating validation checks throughout the entire data pipeline, as opposed to deferring them to the last possible moment.

“Handling data quality issues at source is by far the most cost-effective method of operating. Dealing with unforeseen roadblocks during the remediation phase is significantly more expensive, while problems at the deployment stage can cripple a data engineering project. This underscores the value of implementing a rigorous quality assurance regime that spots and eradicates any outliers early in the project cycle.

“Programming data transformations is a minefield of avoidable errors. Common mistakes include forgetting to add a required argument to a function, trying to access a column missing from a table produced upstream, or attempting to select from a table that doesn’t exist.

“Typically, a trivial solution is required to fix these issues – what’s crucial is the stage at which problems are discovered. Manually testing code can uncover inaccuracies at an early point in the data engineering process. Mistakes will also show up when code is deployed to production, but this is far more costly to fix. Although unit testing is recommended by many data engineering experts, this is often a laborious and unnecessary process that hampers further development.

“External testing of the application is another effective method of quality assurance. This is where the application is run in a simulated environment, with engineers checking that the results match the expectations of the given test case.

“Finally, tests should be put in place to ensure that the data supports business operations and decision-making. Organisations must guarantee the consistency, completeness, timeliness, accuracy and referential integrity of outputs, all while making certain the data adheres to specific business rules.

“Data engineers must take a long-term view when it comes to quality assurance. Investing time and resources into running tests at the nascent stage of development can prevent costly errors further down the line, potentially preventing a project from being delayed or even scrapped.”

For more from STX Next, click here.
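To illustrate the 'validate early, validate often' principle in practice, here is a minimal sketch of the kind of contract check a pipeline might run against an upstream table before any transformation. The column names, dtypes, and business rule are hypothetical examples, not code from STX Next.

```python
# Minimal 'validate early, validate often' sketch: check an upstream table
# before transforming it, so a missing column or a bad value fails fast at
# the source instead of surfacing after deployment.
# The schema and the business rule below are illustrative placeholders.
import pandas as pd

EXPECTED = {
    "order_id": "int64",
    "amount": "float64",
    "created_at": "datetime64[ns]",
}

def validate(df: pd.DataFrame) -> None:
    """Raise immediately if the frame violates the expected contract."""
    missing = set(EXPECTED) - set(df.columns)
    if missing:
        raise ValueError(f"Upstream table is missing columns: {sorted(missing)}")
    for col, dtype in EXPECTED.items():
        if str(df[col].dtype) != dtype:
            raise TypeError(f"{col}: expected {dtype}, got {df[col].dtype}")
    # Example business rule: order amounts must never be negative.
    if (df["amount"] < 0).any():
        raise ValueError("Business rule violated: negative order amounts")

df = pd.DataFrame({
    "order_id": [1, 2],
    "amount": [19.99, 5.00],
    "created_at": pd.to_datetime(["2024-10-01", "2024-10-02"]),
})
validate(df)  # passes; a malformed frame would raise before any transformation
print("validation passed")
```

Running the same check after every transformation stage, not just the first, is what turns this from a one-off guard into the continuous validation the principle describes.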

Veeam announces integration with Palo Alto Networks
Veeam Software, a data resilience expert, has announced a new integration with Palo Alto Networks, a global cybersecurity specialist, to simplify security operations and strengthen data resilience. This integration addresses the pressing need for organisations to take an integrated approach to protecting their data backups and proactively respond to cyber threats through the capabilities offered by Veeam’s new apps and Palo Alto Networks Cortex XSIAM and Cortex XSOAR. With this new integration, Veeam is the first Palo Alto Networks partner to independently design and develop a data collector, dashboards, and reports for Cortex XSIAM.

Dave Russell, SVP of Strategy at Veeam, explains, “Cyber threats are a reality for every single organisation. It takes teamwork to fight this escalating battle against ransomware. We are excited to integrate with Palo Alto Networks to provide customers with capabilities to further strengthen their data resilience. This powerful integration enables our 550,000 customers to better protect their backups and respond to cyberattacks faster, tightening their security posture and helping to ensure reliable, rapid and trusted recovery.”

In today's digital landscape, ransomware attacks are on the rise, with 96% specifically targeting an organisation's backups, according to the Veeam 2024 Ransomware Trends Report. This alarming reality poses a significant challenge for IT and security leaders worldwide. Traditional tools struggle to scale for large enterprises, resulting in a high volume of alerts and overwhelming manual processes for security teams.

To combat these challenges and fulfil customer demand, Veeam and Palo Alto Networks have integrated technology to centralise, scale, and automate data monitoring and incident response. By integrating Palo Alto Networks’ AI-driven security operations centre (SOC) platform with Veeam's recovery capabilities, organisations can identify and respond to cyberattacks faster, helping to ensure the resilience of their business-critical backup data.

“We are thrilled to collaborate with Veeam, empowering organisations to respond and react more quickly to threats facing their critical data,” says Pamela Cyr, VP of Technical Partnerships at Palo Alto Networks. “By combining the power of Palo Alto Networks' AI-driven SOC platform with data resilience capabilities from Veeam, we can help customers identify and respond to threats, ensuring the resilience of business-critical data. The new integration demonstrates our shared commitment to providing organisations with tools and technologies that help them proactively combat evolving cyber threats and strengthen their security posture.”

The integration introduces two new applications: Veeam apps integrated with Cortex XSIAM and Cortex XSOAR that leverage a bi-directional API connection to monitor, detect, and respond to security incidents impacting critical business data and data backups. The Veeam app integrated with Cortex XSIAM brings data from Veeam Backup & Replication and Veeam ONE environments into Cortex XSIAM, providing a centralised view of data and backup security-related activity. The Veeam app integrated with Cortex XSOAR enables regular API queries against Veeam Backup & Replication and Veeam ONE, monitoring for significant security events or alerts. Both applications are included at no charge for Veeam Data Platform Advanced and Premium customers.

For more from Veeam, click here.
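To show the shape of the integration pattern described above, here is a generic sketch of an XSOAR-style polling loop: authenticate against a backup platform's REST API, query for recent security events, and forward high-severity ones to a SOC platform. Every URL, endpoint path, and field name here is a hypothetical placeholder; this is not Veeam's or Palo Alto Networks' actual API.

```python
# Generic sketch of the polling pattern described above. All URLs,
# endpoints, and field names are hypothetical placeholders, not the
# real Veeam or Cortex interfaces.
import time
import requests

BACKUP_API = "https://backup.example.com/api/v1"   # hypothetical
SOC_WEBHOOK = "https://soc.example.com/ingest"     # hypothetical

def get_token() -> str:
    """Obtain a bearer token from the (hypothetical) auth endpoint."""
    resp = requests.post(f"{BACKUP_API}/token",
                         data={"grant_type": "password",
                               "username": "svc-soc", "password": "secret"})
    resp.raise_for_status()
    return resp.json()["access_token"]

def poll_events(token: str, since: str) -> list[dict]:
    """Fetch security events newer than the given timestamp."""
    resp = requests.get(f"{BACKUP_API}/securityEvents",
                        headers={"Authorization": f"Bearer {token}"},
                        params={"createdAfter": since})
    resp.raise_for_status()
    return resp.json().get("items", [])

def main() -> None:
    token = get_token()
    last_seen = "2024-01-01T00:00:00Z"
    while True:
        for event in poll_events(token, last_seen):
            if event.get("severity") in ("high", "critical"):
                # Forward the event so the SOC platform can open an incident.
                requests.post(SOC_WEBHOOK, json=event).raise_for_status()
            last_seen = max(last_seen, event.get("createdAt", last_seen))
        time.sleep(60)  # fixed poll interval; production code would use backoff

if __name__ == "__main__":
    main()
```

The "bi-directional" part of the real integration would add the reverse path, with the SOC platform calling back into the backup API to trigger actions such as an immutable snapshot or a restore.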

Datadog Monitoring for OCI now widely available
Datadog, a monitoring and security platform for cloud applications and a member of Oracle PartnerNetwork, has announced the general availability of Datadog Monitoring for Oracle Cloud Infrastructure (OCI), which enables Oracle customers to monitor enterprise cloud-native and traditional workloads on OCI with telemetry in context across their infrastructure, applications and services. With this launch, Datadog is helping customers migrate with confidence from on-premises to cloud environments, execute multi-cloud strategies and monitor AI/ML inference workloads.

Datadog Monitoring for Oracle Cloud Infrastructure helps customers:

- Gain visibility into OCI and hybrid environments: Teams can collect and analyse metrics from their OCI stack by using Datadog's integrations for over 20 major OCI services and more than 750 other technologies. In addition, customers can visualise the performance of OCI cloud services, on-premises servers, VMs, databases, containers and apps in near-real time with customisable, drag-and-drop, and out-of-the-box dashboards and monitors.

- Monitor AI/ML inference workloads: Teams can monitor and receive alerts on the usage and performance of GPUs, investigate root causes, monitor operational performance and evaluate the quality, privacy and safety of LLM applications.

- Get code-level visibility into applications: Real-time service maps, AI-powered synthetic monitors and alerts on latency, exceptions, code-level errors, log issues and more give teams deeper insight into the health and performance of their applications, including those using Java.

“With this announcement, Datadog enables Oracle customers to unify monitoring of OCI, on-premises environments and other clouds in a single pane of glass for all teams,” says Yrieix Garnier, VP of Product at Datadog. “This helps teams migrate to the cloud and execute multi-cloud strategies with confidence, knowing that they can monitor services side-by-side, visualise performance data during all stages of a migration and immediately identify service dependencies.”

For more from Datadog, click here.
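As a taste of what feeding custom telemetry into Datadog looks like, here is a minimal sketch using Datadog's official Python client; the metric name, tag values, and API keys are illustrative placeholders, and a real OCI deployment would lean on the built-in OCI integrations described above rather than hand-rolled metrics.

```python
# Minimal sketch: submit a custom gauge metric to Datadog with the official
# Python client ('pip install datadog'). The metric name, tags, and keys
# below are illustrative placeholders.
import time
from datadog import initialize, api

initialize(api_key="YOUR_API_KEY", app_key="YOUR_APP_KEY")  # placeholders

# Report a gauge, e.g. inference latency measured on a workload running in OCI.
api.Metric.send(
    metric="custom.inference.latency_ms",       # hypothetical metric name
    points=[(int(time.time()), 42.0)],          # (timestamp, value) pairs
    tags=["cloud:oci", "service:recommender"],  # hypothetical tags
    type="gauge",
)
```

Once submitted, such metrics appear alongside the integration-collected OCI telemetry, which is what makes the "single pane of glass" view across clouds possible.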

UK data centres designated Critical National Infrastructure
The UK government has made the country’s data centres Critical National Infrastructure to protect the country’s data against IT outages, cyber attacks and environmental emergencies. It is the first new Critical National Infrastructure designation since 2015, putting data centres alongside water, energy and emergency services systems and giving them greater government support when recovering from critical incidents.

As part of the designation, a dedicated CNI data infrastructure team of senior government officials will be formed to monitor for potential threats, working closely with agencies such as the National Cyber Security Centre and the emergency services to ensure data, from photos to NHS records, is protected.

Jennifer Holmes, CCO at LINX, comments, “Data and network traffic is growing exponentially as people and businesses rely more and more on digital services. Here at LINX we have been classed as critical national infrastructure in the UK for many years and wholly support this recognition for our data centres, many of whom are valuable partners of ours.

“As data continues to scale, resilient infrastructure becomes increasingly important to ensure uninterrupted data flow and protect against downtime, which can prove costly across many sectors.

“This move should form part of a wider internet redundancy strategy, creating protocols and fail-safes to reroute network traffic in the event of an outage. Threats such as cyber attacks or extreme weather conditions are a case of when, not if, so it’s vital to have redundancies in place to not only protect data centres, but ensure networks stay online.”

With the CNI designation, the government will work to build contingency plans to mitigate risks and damage caused in the event of an attack against a data centre. This will work in tandem with the proposed Cyber Security and Resilience Bill to strengthen the UK’s cyber defences.

Technology Secretary Peter Kyle says, “Data centres are the engines of modern life. They power the digital economy and keep our most personal information safe. Bringing data centres into the Critical National Infrastructure regime will allow better coordination and cooperation with the government against cyber criminals and unexpected events.”

It follows the Chancellor’s announcement of an £8 billion investment in the UK data centre market, aiming to create 14,000 jobs and spark economic growth. The UK is currently home to the highest number of data centres in Western Europe, making the sector an increasingly valuable driver of the UK economy.


