Thursday, April 24, 2025

Artificial Intelligence


Iron Mountain launches new digital experience platform
Global information management specialist Iron Mountain today announced the availability of the Iron Mountain InSight Digital Experience Platform (DXP), a secure software-as-a-service (SaaS) platform. Customers can use the platform to access, manage, govern, and monetise physical and digital information, while AI-powered self-service tools automate workflows, enable audit-ready compliance, and make data accessible and useful. Through InSight DXP’s unique intelligent document processing, which extracts, classifies, and enriches information with speed and accuracy, customers can quickly turn physical and digital unstructured information into structured, actionable data they can use in their integrated business applications and as the basis for their AI initiatives.

Research conducted across six countries with 700 IT and data decision-makers indicates that most (93%) of the respondents’ organisations already use generative AI in some way, and an overwhelming majority of those surveyed (96%) agree that a unified asset strategy for managing both digital and physical assets is critical to the success of generative AI initiatives.

The modular InSight DXP platform includes secure generative AI-powered chat, enabling fast access to data trapped within documents. It can be used to quickly query data and documents in a secure, isolated environment, separate from publicly available generative AI applications. Its low-code solution designer allows anyone to use intuitive drag-and-drop features to process content and unlock the intelligence within it.

Nathan Booth, Product Manager, Amazon MGM Studios, says, “Iron Mountain InSight DXP’s generative AI integration with content management has the potential to provide us with enhanced visibility into our assets, allowing us to make quicker and easier business decisions.”

InSight DXP is focused on improving the business-to-employee (B2E) experience, which can greatly enhance how Iron Mountain customers create meaningful experiences for their employees while increasing productivity.

IDC’s Amy Machado, Senior Research Manager, Enterprise Content and Knowledge Management Strategies, says, “Iron Mountain’s ability to help its customers build models and secure data within Large Language Models (LLMs) with low-code tools can change the way organisations access and monetise documents, both digital and physical.”

Narasimha Goli, Senior Vice President, Chief Technology and Product Officer, Iron Mountain Digital Solutions, adds, “We are thrilled to introduce InSight DXP, our next-generation SaaS platform, which is the foundation for AI-readiness. InSight DXP transforms your information - whether physical or digital, structured or unstructured - from unrealised potential to actionable power. This innovative platform empowers businesses to harness the full capabilities of their data, driving intelligent decisions and unlocking new growth opportunities.”

Mithu Bhargava, Executive Vice President and General Manager, Iron Mountain Digital Solutions, notes, “We see InSight DXP as a critical platform to help our customers get their information ready for use in generative AI and other AI-powered applications that drive operational efficiency and enhance customer experience. With unified asset management, information governance, workflow automation, and intelligent document processing tools, our customers can efficiently manage information across physical and digital assets.”

InSight DXP is a flexible platform designed to quickly design, build, and publish solutions, ranging from industry-specific solutions - such as Digital Auto Lending for banking and Health Information Exchange for healthcare - to cross-industry solutions such as Digital Human Resources and Invoice Processing. These pre-built, customisable solutions are intended to help customers and system integrators get a head start on their unique configuration needs through pre-built connectors, workflows, document types, metadata, retention rules, and AI prompts. For more from Iron Mountain, click here.
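To make the intelligent document processing flow described above more concrete - extract, classify, enrich, then emit structured records - here is a minimal, illustrative Python sketch. It is not Iron Mountain’s implementation; the classify and extract helpers are invented stand-ins for the platform’s ML/LLM-driven components.

```python
# Illustrative sketch of an extract -> classify -> enrich pipeline.
# Not Iron Mountain's implementation; classify() and extract() are hypothetical stand-ins.
from dataclasses import dataclass, field


@dataclass
class StructuredRecord:
    doc_id: str
    doc_type: str                                  # e.g. "invoice", "contract"
    fields: dict = field(default_factory=dict)     # extracted key/value pairs
    tags: list = field(default_factory=list)       # enrichment metadata


def classify(text: str) -> str:
    """Hypothetical classifier: assign a document type from raw text."""
    return "invoice" if "invoice" in text.lower() else "unknown"


def extract(text: str, doc_type: str) -> dict:
    """Hypothetical extractor: pull structured fields for the given type."""
    # A real system would combine OCR with ML/LLM extraction here.
    return {"total_due": "1,200.00"} if doc_type == "invoice" else {}


def process_document(doc_id: str, text: str) -> StructuredRecord:
    """Turn one unstructured document into a structured, enriched record."""
    doc_type = classify(text)                       # classify
    fields = extract(text, doc_type)                # extract
    tags = [f"type:{doc_type}", "source:scan"]      # enrich with metadata
    return StructuredRecord(doc_id, doc_type, fields, tags)


if __name__ == "__main__":
    print(process_document("doc-001", "Invoice #42 ... total due 1,200.00"))
```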

PagerDuty builds upon generative AI capabilities
PagerDuty, a global leader in digital operations management, is building upon previous generative AI (genAI) capabilities with PagerDuty Advance, which is embedded across the PagerDuty Operations Cloud platform for Incident Management, AIOps, Automation, and Customer Service Operations customers. With PagerDuty Advance, organisations can accelerate digital transformation initiatives - from operations centre modernisation to automation standardisation and incident management transformation - elevating their operational excellence. The evolution of PagerDuty Advance empowers responder teams to work faster and smarter by using genAI to surface relevant context or automate at every step of the incident lifecycle.

“Global IT disruption and outages are becoming the new normal due to organisations’ technical debt and a rush to harness the power of generative AI,” says Jeffrey Hausman, Chief Product Development Officer at PagerDuty. “These are contributing factors to a greater number of outages which last longer and are more costly.

“Building upon our genAI offerings, PagerDuty Advance provides customers with generative AI solutions that help them scale teams by surfacing contextual insights and automating time-consuming tasks at every step of the incident lifecycle. Organisations can take the next step in unlocking the full potential of AI and automation across the digital enterprise with the help of PagerDuty.”

PagerDuty Advance includes AI-powered capabilities built to streamline manual work across the incident lifecycle, including:

• PagerDuty Advance Assistant for Slack - A genAI chatbot that provides helpful insight at every step of the incident lifecycle, from event to resolution, directly from Slack. Using simple prompts, responders can quickly get a summary of the key information about an incident. It can also anticipate common diagnostic questions and suggest troubleshooting steps, resulting in faster resolution.
• PagerDuty Advance for Status Updates - Leverages AI to auto-generate an audience-specific status update draft in seconds, offering key insights on events, progress, and challenges. It helps to streamline communication while saving cycles on what to say to whom, allowing teams to focus on the real work of resolution.
• PagerDuty Advance for Automation Digest - Part of the Actions Log, this feature summarises the most important results from running automation jobs in one place. Responders can make informed decisions based on diagnostic results and even load the output as key values into variables in Event Orchestration for dynamic automation.
• PagerDuty Advance for Postmortems - Once an incident is resolved, the user can elect to generate a postmortem review, accelerating the otherwise time-consuming task of collecting all available data around the incident at hand (including logs, metrics, and relevant Slack conversations). In addition to highlighting key findings, this AI-generated postmortem includes recommended next steps to prevent future issues and indicates areas for improvement.
• AI-Generated Runbooks - Accelerate automation development and deployment, even among non-technical teams. Operators and developers can quickly translate plain-English prompts into runbook automations or leverage pre-engineered prompts as a starting point.

Interviews with early access customers revealed that PagerDuty Advance for Status Updates can save up to 15 minutes per responder per incident. Given that the average number of responders responsible for status updates in enterprise organisations is five and the monthly average number of incidents is 60, PagerDuty Advance can save at least 75 hours a month - more than nine business days.

Snow Tempest, Research Manager for IT Service Management at IDC, notes, “According to IDC research, customers are looking for AI-enhanced, data-driven IT service and operations practices that reduce resolution times, prevent issues, and improve end user experiences. As digital products and services increasingly drive revenue and operations, organisations can expand their advantage by successfully implementing tools that enable rapid, context-aware response to urgent issues.” For more from PagerDuty, click here.
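The savings estimate quoted above follows from straightforward arithmetic; the snippet below simply reproduces it from the article’s own figures (15 minutes saved per responder per incident, five responders, 60 incidents a month), with an assumed eight-hour business day for the final conversion.

```python
# Reproduces the PagerDuty Advance time-savings estimate from the figures quoted above.
minutes_saved_per_responder_per_incident = 15
responders_per_incident = 5        # average cited for enterprise organisations
incidents_per_month = 60           # average cited per month
hours_per_business_day = 8         # assumption used for the business-day conversion

total_minutes = (minutes_saved_per_responder_per_incident
                 * responders_per_incident
                 * incidents_per_month)
total_hours = total_minutes / 60
business_days = total_hours / hours_per_business_day

print(f"{total_hours:.0f} hours per month (~{business_days:.1f} business days)")
# -> 75 hours per month (~9.4 business days)
```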

Vultr and Run:ai deliver advanced NVIDIA GPU orchestration
Vultr, the privately held cloud computing platform, today announced that Run:ai, a leader in AI optimisation and orchestration, is the latest ecosystem partner to join its Cloud Alliance. Run:ai’s advanced AI workload orchestration platform, coupled with Vultr’s robust, scalable cloud infrastructure - including Vultr Cloud GPUs, accelerated by NVIDIA computing technologies, and Vultr Kubernetes Engine - provides the enhanced computational power needed to accelerate AI initiatives across Vultr’s global network of 32 cloud data centre locations.

As businesses across industries look to deploy their AI initiatives, they often grapple with scaling AI training jobs, fragmented AI development tools, and long queue times for AI experimentation. The partnership between Vultr and Run:ai addresses these challenges, offering a cutting-edge solution that enhances resource utilisation, supports rapid AI deployment, and provides customisable scalability through their integrated infrastructure and advanced AI workload orchestration.

“Enterprises around the world are vying to deploy transformative, AI-driven solutions,” says Sandeep Brahmarouthu, Head of Global Business Development at Run:ai. “Our partnership with Vultr will give these organisations a comprehensive solution suite designed to address the technical challenges of AI project development. Now, businesses are empowered with unparalleled adaptability, performance, and control, setting a new standard for AI in today’s rapidly evolving digital landscape.”

Vultr’s scalable infrastructure guarantees unified AI stack management, thanks to seamless integrations with existing Cloud Alliance partners Qdrant and Console Connect. Qdrant, a high-performance vector database with retrieval-augmented generation (RAG) capabilities, manages and queries large volumes of vector data, enhancing tasks like similarity search and recommendation systems. Console Connect facilitates private, high-speed networking to ensure secure, low-latency data transfer between these components, optimising the overall AI/ML pipeline. Now, with Run:ai’s advanced AI workload orchestration platform joining as the newest member of the Cloud Alliance, this integrated stack, centred around Vultr, provides a robust, scalable, and efficient solution for handling the most demanding AI/ML workloads. As a result, customers can benefit from:

• Enhanced GPU utilisation – Maximise GPU efficiency with Run:ai’s dynamic scheduling and fractional GPU capabilities, reducing idle times and optimising resource use on Vultr’s scalable infrastructure.
• Accelerated AI development – Speed up AI development cycles with Vultr’s high-performance cloud infrastructure and Run:ai’s comprehensive orchestration platform, reducing time to market for AI models.
• Simplified lifecycle management – Streamline the entire AI lifecycle from development to deployment with integrated tools for dynamic resource allocation, efficient training, and comprehensive monitoring.
• Cost-effective operations – Minimise operational costs with Vultr’s affordable cloud solutions and Run:ai’s efficient resource management, ensuring economical AI project execution.
• Robust security and compliance – Bolster the security and compliance of AI workloads with advanced features like role-based access control and detailed audit logs, backed by Vultr’s secure infrastructure.

Kevin Cochrane, CMO of Vultr, comments, “We are committed to giving our customers the best-of-breed technologies needed to help achieve their business goals. By partnering with Run:ai, we’ve provided a unique solution tailored specifically for AI workloads. Our integrated platform will not only ensure high performance and cost efficiency for customers worldwide, but also give them the agility needed to navigate the evolving demands of modern AI environments.”

Vultr is a member of the NVIDIA Partner Network. For more from Vultr, click here.
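To illustrate the fractional-GPU idea mentioned in the benefits above - packing several smaller workloads onto one GPU so it does not sit idle - here is a deliberately simplified, hypothetical Python sketch. It is not Run:ai’s scheduler; the GPU pool, job names, and fractions are invented for demonstration.

```python
# Toy illustration of fractional GPU packing - not Run:ai's actual scheduler.
from dataclasses import dataclass, field


@dataclass
class GPU:
    name: str
    free_fraction: float = 1.0                 # 1.0 means the whole GPU is available
    jobs: list = field(default_factory=list)


def schedule(job_name: str, gpu_fraction: float, pool: list[GPU]) -> GPU | None:
    """Place a job needing `gpu_fraction` of a GPU on the first GPU with room."""
    for gpu in pool:
        if gpu.free_fraction >= gpu_fraction:
            gpu.free_fraction -= gpu_fraction
            gpu.jobs.append((job_name, gpu_fraction))
            return gpu
    return None                                # nothing fits: the job would be queued


if __name__ == "__main__":
    pool = [GPU("gpu-0"), GPU("gpu-1")]
    for name, fraction in [("notebook-a", 0.25), ("inference-b", 0.5), ("train-c", 1.0)]:
        placed = schedule(name, fraction, pool)
        print(name, "->", placed.name if placed else "queued")
```

In this toy run, notebook-a and inference-b share gpu-0 while train-c takes gpu-1 whole, which is the utilisation gain the fractional-GPU approach aims for.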

Gcore raises $60m in Series A funding to drive AI innovation
Gcore, the global edge AI, cloud, network, and security solutions provider, has secured $60 million (£46.4m) in Series A funding from institutional and strategic investors. Led by Wargaming, with participation from Constructor Capital and Han River Partners (HRZ), this marks the company’s first external capital raise since its inception more than 10 years ago. The funds will be strategically invested in Gcore’s technology and platform, including cutting-edge AI servers powered by NVIDIA GPUs, to drive AI-led innovations.

Gcore says that the investment underscores its commitment to delivering advanced edge AI solutions that enhance cloud resource efficiency and ensure data sovereignty. Public organisations, telcos, and global corporations entrust Gcore with their edge workloads due to its expansive network, strong presence in emerging markets, and proven cloud capabilities in AI training and inference. Gcore serves customers across diverse industries, including media and entertainment, gaming, technology, financial services, and retail. Built for the edge and addressing a $200 billion-plus market opportunity, Gcore’s cloud infrastructure powers both the training of large language models (LLMs) and the inference of AI applications at the edge. This is enabled by Gcore’s global network of over 180 edge nodes across six continents, including more than 25 cloud locations, with a total network capacity exceeding 200 Tbps.

Sean Lee, Chief Corporate Development Officer of Wargaming, comments, “Gcore has been our partner for over 10 years, helping us to deliver games to hundreds of millions of players worldwide. We are excited to support the company on this journey and look forward to helping them become uniquely positioned to lead high-speed AI model training and inference anywhere in the world.”

Matthias Winter, Managing Partner of Constructor Capital, adds, “Constructor Capital is excited to invest in Gcore, a leading player in the AI IaaS space, in a booming market with CAGRs of over 40%. We believe in Gcore’s unique value proposition as a comprehensive provider offering a wide range of edge solutions, high automation, attractive TCO, extremely low latency, and an experienced management team. We look forward to a successful journey together in the years to come.”

Christopher Koh, Managing Partner of HRZ, notes, “We are thrilled to invest in Gcore for its forward-thinking approach to global low-latency AI infrastructure and innovative edge AI solutions. We are especially impressed by its leadership in APAC, collaboration with world-class partners, and strategic alignment with emerging AI opportunities in the region.”

Lastly, Andre Reitenbach, CEO of Gcore, says, “We are on the cusp of an AI revolution that will transform how companies operate. Gcore is perfectly positioned to connect the world to AI, anywhere and anytime, by delivering innovative AI, cloud, and edge solutions. The growing demand for AI infrastructure from enterprises and SMBs alike highlights the importance of this significant investment. We are thrilled by the support of investors like Wargaming, Constructor Capital, and Han River Partners as we enhance our extensive network of AI servers and reinforce the powerful edge services we offer.”

For more from Gcore, click here.

Research reveals need for hybrid cloud storage strategies
Nasuni, an enterprise data platform for hybrid cloud environments, has unveiled the findings of its new 2024 industry research report, The Era of Hybrid Cloud Storage. The research includes insights from over 1,000 IT purchasing decision-makers in the US, UK, and DACH (Germany, Austria, Switzerland) on hybrid cloud, digital transformation, security, and artificial intelligence (AI).

David Grant, President of Nasuni, comments, “As hybrid cloud storage takes centre stage, organisations need strategies to capitalise on their most valuable asset: data. In tandem, they need strategies for addressing critical IT issues, including ransomware attacks and the introduction of AI integrations to the market. Legacy storage solutions cannot keep up with these demands. Nasuni’s The Era of Hybrid Cloud Storage report gives organisations the necessary industry and peer insights to understand and take action in a rapidly evolving cloud landscape.”

Key takeaways:

• Cloud strategies are at the forefront of enterprise success: Enterprises are rapidly moving forward with rolling out or planning cloud-first initiatives (according to 97% of respondents) to help grow their businesses, which includes significant investments in data management, analytics, AI, and cybersecurity.
• Hybrid cloud is business critical for proper data management: While only 19% of companies have a hybrid cloud storage model, a staggering 65% plan to implement one within the next year. Of those currently using a hybrid cloud solution, 70% plan to upgrade within the next 18 months.
• Data recovery and security are primary drivers for cloud solutions: Data recovery is the number one priority for firms when faced with a ransomware attack, with 59% of organisations seeing cloud initiatives delivering better data security and disaster recovery time.
• The growing role of data intelligence and AI: Organisations are targeting advanced data management and visibility through AI (60%). However, the biggest roadblocks preventing organisations from either developing or implementing AI solutions are data privacy and security (42%) and skills shortages (35%).

Nasuni enables global organisations to transform file data into an asset that can deliver critical business insights by consolidating that data in a secure and versatile enterprise hybrid cloud platform. Through its strategic partnerships and long-standing alliances with the major cloud providers - Microsoft Azure, AWS, and Google Cloud - the Nasuni File Data Platform is unlocking even greater efficiencies, reducing cost, and establishing a foundation for facilitating core enterprise AI use cases. Nasuni currently supports over 850 enterprise customers, including numerous Fortune 500 companies, in more than 70 countries to scale, protect, and manage their data.

To download the full Era of Hybrid Cloud Storage report, click here. For more from Nasuni, click here.

Report explores the gap between AI ambition and maturity
Vultr, the world’s largest privately held cloud computing platform, has released a new industry report, The New Battleground: Unlocking the Power of AI Maturity with Multi-Model AI. The new study reveals a clear correlation between an organisation’s AI maturity and its ability to achieve superior business outcomes, outpacing industry peers in revenue growth, market share, customer satisfaction, and operational efficiency.

Commissioned by Vultr and conducted by S&P Global Market Intelligence, the research surveyed over 1,000 IT and digital transformation decision-makers responsible for their organisation’s AI strategy across industries, including healthcare and life sciences, government/public sector, retail, manufacturing, financial services, and more. Of the respondents surveyed, almost three-quarters (72%) are at higher levels of maturity of AI use. The report also includes a qualitative perspective on AI use by enterprises of varying sizes through in-depth interviews with AI decision-makers and practitioners.

“As organisations worldwide capitalise on strategic investments in AI, we wanted to look at the state of AI maturity,” says Kevin Cochrane, CMO of Vultr’s parent company, Constant. “What we’ve found is that transformational organisations are winning the hearts, minds, and share of wallets while also improving their operating margins. AI maturity is the new competitive weapon, and businesses must invest now to accelerate AI models, training, and scaling in production.”

The number of models actively used within an organisation is a reliable measure of its deployed AI capabilities and overall AI maturity. The data reveals that advanced AI adopters leverage a multitude of models simultaneously as part of a multi-model approach. On average, the number of distinct AI models currently operational stands at 158, with projections suggesting this number will rise to 176 AI models within the next year. This growth highlights remarkable acceleration in AI adoption across industries, underscored by the 89% of organisations anticipating advanced AI utilisation within two years. AI is poised to permeate throughout the enterprise, with 80% adoption anticipated across all business functions within 24 months. This will include AI being embedded across all applications and business units.

As AI builds on its new foothold across businesses, there will be an immense impact on enterprise-wide performance. According to the report, those with transformational AI practices reported that they outperformed their peers at higher levels. Specifically, 50% of transformational companies are performing “significantly better” against industry peers than those at operational levels, while a large majority of AI-driven organisations say they improved their 2022/2023 year-over-year performance in customer satisfaction (90%), revenue (91%), cost reduction/margin expansion (88%), risk (87%), marketing (89%), and market share (89%). Meanwhile, nearly half (40-45%) of organisations say AI is having a “major” impact on market share, revenue, customer satisfaction, marketing improvements, and cost and risk reduction.

“AI’s transformative impact is undeniable - it’s devouring industries and is becoming ubiquitous in every facet of business operations. This necessitates a new era of technology, underpinned by a composable stack and platform engineering to effectively scale these innovations,” Kevin notes.

To fully harness AI’s potential, 88% of the enterprises surveyed intend to increase their AI spend in 2025, with 49% expecting moderate to significant increases. Findings related to key infrastructure, partner, and implementation strategies include:

• For cloud-native applications, two-thirds of organisations are either custom-building their models or using open-source models to deliver functionality.
• In 2025, the AI infrastructure stack will be hybrid cloud, with 35% of inference taking place on-prem and 38% in the cloud/multi-cloud.
• Due to the skills shortage, 47% of enterprises are leveraging a partner to help them with strategy, implementation, and deployment of AI at scale. Only 15% are leveraging hyperscalers such as AWS, GCP, or Azure.
• Open, secure, and compliant are the top attributes of cloud platforms for scaling AI across the organisation, across geographies, and to the edge.

“For years, the hyperscalers have dominated the infrastructure market, relying on scale, resources, and technological expertise, but that is all about to change,” Kevin adds. “Over the next decade, everything will be rebuilt with AI at the core, with organisations integrating the principles of cloud engineering into their operations. As a result, we will see the rise of AI specialists and independents as they empower organisations to do transformative work and gain a competitive edge.”

As the race to AI heats up, it will not be without its share of obstacles. Budget limitations, building or obtaining AI algorithms, lack of skilled personnel, and data quality are among the top hurdles organisations say they must resolve to graduate to the next stage of AI maturity. For those at a transformational level of maturity, governance (30%) becomes much more of an issue, while company culture is the larger issue for those still in the accelerating stage. For more information, or to download a copy of the full report, click here. For more from Vultr, click here.

Google’s emissions soar 48% over five years due to AI
Google’s greenhouse gas emissions have soared 48% over the past five years, with its artificial intelligence (AI) products relying on energy-intensive data centres. Google labelled “increases in data centre energy consumption and supply chain emissions” as the primary driver behind the rise, with total emissions reaching 14.3 million metric tons, according to its annual environmental report. It is estimated that data centres contribute 2.3% to 3.7% of the world’s CO2 emissions, surpassing the global aviation industry, which accounts for 2.1%.

In the report, Google said that “reaching net zero emissions by 2030 is an extremely ambitious goal and we know it won’t be easy”, citing that the future of AI and its environmental impact is “complex and difficult to predict”. Last week, Microsoft’s Co-Founder, Bill Gates, downplayed AI’s climate impact, saying that it would be more of a help than a hindrance. He also said that big tech is “seriously willing” to pay the extra premium to bootstrap clean energy capacity.

At the end of 2023, Google released Gemini, which is positioned as a competitor to OpenAI’s GPT-4 and is Google’s biggest leap into the AI trend. The tech giant is also placing AI at the heart of its new Pixel phones in order to make them ‘even more helpful’.

John Kirk, CSO at ITG, comments, “The insatiable demand for AI adoption is already fuelling a wave of increased emissions, leaving big brands open to scrutiny around their sustainability credentials. Forward-thinking organisations will need to look again at the impact their operations are having on the environment and work with partners in the supply chain, such as creative agencies, to provide a more open and honest account of their activities. Customers now expect both accountability and a clear action plan to offset or reduce emissions, and without it, trust will be lost.”

Juniper to partner with the 2026 Olympics and Paralympics
The Milano Cortina 2026 Foundation and Juniper Networks, a leader in secure and AI-Native Networking, have signed a partnership agreement for the Milano Cortina 2026 Olympic and Paralympic Winter Games. The collaboration aims to optimise network systems and protect data and virtual information for the major sporting event. Making its Olympic and Paralympic debut as a partner of the Games, Juniper significantly strengthens the project and the organising committee’s team. Juniper will help manage the event’s digital complexities, such as real-time data management, cybersecurity, and high-volume network demands, enabling a smoother operation.

Sujai Hajela, Executive Vice President, AI-Driven Enterprise at Juniper Networks, comments, “Juniper leverages the right data, the right real-time response and the right infrastructure to provide predictable, reliable, measurable and secure connections for every device, user, application and resource. With our unique AI-Native Networking Platform featuring industry-leading wired, wireless, routing and security solutions, users will have simple and reliable access to digital assets and online information throughout the Milano Cortina 2026 Olympic and Paralympic Winter Games.”

The partnership is built on technological excellence and shared values of inclusion and employee well-being, Juniper states. Furthermore, the company says that it prioritises people, equality, and diversity - principles which align with the organising committee’s vision, making the Games a model of people-centric values.

Chris Barnard, Vice President, Telecoms and Infrastructure (Europe), IDC, adds, “The partnership between Milano Cortina 2026 Foundation and Juniper Networks highlights the experience-first value that intelligent technology can bring to large-scale sports events. Juniper’s AI-Native Networking Platform is designed to provide the data-driven reliability and security needed to manage the relentless demands of the Olympic and Paralympic Winter Games, delivering exceptional connectivity and robust protection of digital assets.”

Mario Manfredoni, Senior Sales Director, South Europe, Juniper Networks, concludes, “Juniper’s partnership with Fondazione Milano Cortina 2026 will showcase Juniper’s services and high-performance networking. As the official secure IP network provider of the Milano Cortina 2026 Olympic and Paralympic Winter Games, Juniper will support in addressing the complexity of the major event across multiple locations.

“Additionally, Juniper will employ a circular economy approach by collecting and recycling all equipment once the event concludes. Juniper’s goal is that none of the equipment goes to waste or ends up in landfill, instead fostering a more environmentally friendly business model. The products will be returned to the production process as pre-owned items through Juniper’s partner.”

For more from Juniper, click here.

NetApp receives AAA rating for its AI ransomware detection
NetApp, the intelligent data infrastructure company, today announced that NetApp ONTAP Autonomous Ransomware Protection with Artificial Intelligence (ARP/AI) has received a AAA rating from SE Labs, an independently owned and run testing company that assesses security products and services. SE Labs validated the protection effectiveness of NetApp ARP/AI with 99% recall - a metric that measures malware detection rates - for ransomware attacks, while noting the absence of false positives.

When responding to ransomware attacks, seconds can make the difference between ensuring continuity and a massive business disruption. Organisations need fast, automated, and accurate detection and response capabilities built into their primary storage systems to minimise the damage done by lost production data and downtime. NetApp ARP/AI, with its AI-powered ransomware detection capability, addresses this gap by providing real-time detection and response to minimise the impact of cybersecurity threats.

SE Labs rigorously tested NetApp ARP/AI against hundreds of known ransomware variants, with impressive results: NetApp ARP/AI demonstrated 99% detection of advanced ransomware attacks, and it correctly identified 100% of legitimate files without flagging any false positives, indicating a strong ability to operate in a business context without contributing to alert fatigue.

Ensuring data is secure against internal and external threats is a critical part of making data infrastructure intelligent, which then empowers customers to turn disruption into opportunity. This validation of NetApp’s AI-powered ransomware detection capabilities underscores how NetApp is staying at the forefront of AI innovation by both enabling AI adoption and applying AI to data services. NetApp’s newly released, more powerful all-flash storage systems help enterprises leverage their data to drive AI workloads, built on NetApp’s secure storage infrastructure.

“NetApp has passed a significant milestone in the fight against ransomware as the first and only storage vendor to offer AI-driven on-box ransomware detection with externally validated top-notch protection effectiveness,” says Dr. Arun Gururajan, Vice President, Research & Data Science at NetApp. “Ransomware detection methodologies that rely only on backup data are too slow to effectively mitigate the risks businesses face from cybersecurity threats. NetApp ARP/AI hardens enterprise storage by providing robust, built-in detection capabilities that can respond to ransomware threats in real time. The AAA rating we achieved from SE Labs is the result of our commitment to innovation in intelligent data infrastructure and our drive to find new ways to make NetApp the most secure storage on the planet.”

By embedding ransomware detection in storage, NetApp ARP/AI helps customers improve their cyber resilience while reducing the operational burden and skills required to maintain their intelligent data infrastructure. NetApp ARP/AI’s detection technology continuously adapts and evolves as new ransomware variants are discovered. This continuous retraining on the latest ransomware strains ensures that NetApp ARP/AI remains at the forefront of protection effectiveness, offering organisations a future-proof defence against the dynamic ransomware landscape.

To see the full results of the tests, read the SE Labs report by clicking here. NetApp ARP/AI is currently in tech preview. Customers can request to participate in the tech preview by reaching out to their NetApp sales representative.
For more from NetApp, click here.
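For readers unfamiliar with the 99% recall figure cited above: recall is the share of actual ransomware samples the detector catches, while false positives concern legitimate files that are wrongly flagged. The counts in the sketch below are made up purely to show how the two metrics are computed; they are not SE Labs’ raw data.

```python
# Worked example of the detection metrics mentioned above (counts are invented).
true_positives = 99       # ransomware samples correctly flagged
false_negatives = 1       # ransomware samples missed
false_positives = 0       # legitimate files wrongly flagged
true_negatives = 500      # legitimate files correctly left alone

recall = true_positives / (true_positives + false_negatives)
false_positive_rate = false_positives / (false_positives + true_negatives)

print(f"recall: {recall:.0%}")                             # 99%
print(f"false positive rate: {false_positive_rate:.0%}")   # 0%
```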

Pure Storage introduces unified data storage platform
Pure Storage, an IT pioneer that delivers advanced data storage technologies and services, has announced new capabilities in the Pure Storage platform that improve the ways IT and business leaders can deploy AI, strengthen cyber resilience, and modernise their applications. The Pure Storage platform delivers agility and risk reduction to organisations with a simple, consistent storage platform and an 'as-a-service' experience for the broadest set of use cases across on-premises, public cloud, and hosted environments. At the heart of the platform, the Evergreen architecture brings continuous and non-disruptive upgrades, helping enterprises adapt to dynamic business environments. With the industry’s record number of concurrent SLAs, customers get the reliability, performance, and sustainability their business requires.

Charles Giancarlo, Chairman and CEO, Pure Storage, says, “Pure is redefining enterprise storage with a single, unified data storage platform that can address virtually all enterprise storage needs, including the most pressing challenges and opportunities IT leaders face today, like AI and cyber resilience. The Pure Storage platform delivers unparalleled consistency, resilience, and SLA-guaranteed data storage services, reducing costs and uncertainty in an increasingly complex business landscape.”

Pure Storage announced new innovations in the platform, including:

• Storage automation: Pure Fusion unifies arrays and optimises storage pools on the fly across structured and unstructured data, on-premises and in the cloud. Now fully embedded into the Purity operating environment, which is designed to continually get better over time via non-disruptive upgrades, the next-generation Pure Fusion will be available across the entire Pure Storage platform to all global customers.
• Generative AI co-pilot for storage: Extending Pure Storage’s leadership position as the innovator in simplicity, the first AI co-pilot for storage represents a radically new way to manage and protect data using natural language. It leverages data insights from tens of thousands of Pure Storage customers to guide storage teams through every step of investigating complex performance and management issues and staying ahead of security incidents.

In the new Innovation Race survey of 1,500 global CIOs and decision-makers commissioned by Pure Storage, nearly all respondents (98%) state that their organisation’s data infrastructure must improve to support initiatives like AI - which is evolving so rapidly that IT is struggling to keep up, much less predict what’s next. Companies large and small are realising that they are locked into inflexible storage architectures lacking enterprise-grade reliability, unable to resize or upgrade performance without complex and risky infrastructure planning. Pure Storage is introducing new innovations in the platform that help businesses accelerate successful AI deployments today and future-proof for tomorrow. The Pure Storage platform empowers organisations to unlock the value of their data with AI, while delivering the agility to instantly scale capacity and performance up and down independently, without disruption. Further additions include:

• New Evergreen//One for AI - First Purpose-Built AI Storage as-a-Service: Provides guaranteed storage performance for GPUs to support training, inference, and HPC workloads, extending Pure Storage’s leadership position for capacity subscriptions and introducing the ability to purchase based on dynamic performance and throughput needs. The new SLA uniquely delivers the performance needed and eliminates the need for planning or overbuying by charging for throughput performance.
• Secure Application Workspaces with Fine-Grained Access Controls: Combines Kubernetes container management, secure multi-tenancy, and policy governance tools to enable advanced data integrations between enterprise mission-critical data and AI clusters. This makes storage infrastructure transparent to application owners, who gain fully automated access to AI innovation without sacrificing security, independence, or control.

For more from Pure Storage, click here.
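As a loose illustration of what policy-governed, fine-grained access to a storage workspace can look like in practice, here is a minimal Python sketch. The workspaces, roles, and policy table are invented for demonstration and do not represent Pure Storage’s implementation.

```python
# Minimal role-based access check for storage workspaces - illustrative only.
POLICIES = {
    # (workspace,         role):          allowed actions
    ("ai-training-data",  "ml-engineer"): {"read"},
    ("ai-training-data",  "data-admin"):  {"read", "write", "delete"},
    ("finance-archive",   "data-admin"):  {"read"},
}


def is_allowed(workspace: str, role: str, action: str) -> bool:
    """Return True if the role may perform the action on the workspace."""
    return action in POLICIES.get((workspace, role), set())


if __name__ == "__main__":
    print(is_allowed("ai-training-data", "ml-engineer", "read"))    # True
    print(is_allowed("ai-training-data", "ml-engineer", "delete"))  # False
```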


