Monday, March 10, 2025

Data


Lucid Software launches global data residency in EMEA
Lucid Software, a leader in visual collaboration software, has announced the launch of its global data residency program in the EMEA region. Hosted on Amazon Web Services (AWS), this strategic expansion marks a significant milestone in Lucid's mission to enhance team collaboration while further strengthening data security across the EMEA region.

“The launch of Lucid’s improved global data residency program in EMEA reaffirms our dedication to supporting the unique needs of our local customers as they navigate local compliance standards,” says Roderick de Greef, VP of Sales, EMEA. “By providing a robust data residency solution, we enable organisations across the region to collaborate with peace of mind, knowing their document data remains secure and within their region. Team collaboration should be both seamless and secure, and Lucid is committed to making that possible.”

As with Lucid’s recent global data residency launch in Australia, Lucid’s enterprise customers can now control where document data in the Lucid Suite is securely stored by selecting the region that helps meet their data residency obligations, as well as regional data and compliance requirements. Lucid’s full catalogue of integrations works in all data regions, along with its developer platform and external API, improving product performance and reducing latency.

Since Lucid's physical expansion to Europe in 2018, the company has seen consistent growth and increasing interest from customers in establishing a global data residency program. With data locality becoming a top priority for organisations in EMEA, this launch offers a pathway to faster innovation and meaningful collaboration while maintaining data security.

Lucid upholds the highest standards of data protection and security, demonstrated through initiatives such as its Enterprise Shield offering, its global data residency program, alignment with GDPR and PCI requirements, and FedRAMP authorisation. Additionally, Lucid complies with the EU-US and Swiss-US Data Privacy Frameworks, holds ISO 27001 and ISO 27701 certifications, and has successfully completed rigorous SOC 2 Type II audits.

Easily upgrade fibre networks with new transceiver products
ProLabs, a leader in compatible optical networking and connectivity solutions, has announced the launch of two new transceiver products which will extend the life of existing broadband networks.

Since the launch of the first fibre optic networks in 2008, demand for internet services has grown massively, a trend exacerbated by recent increases in bandwidth demand from Artificial Intelligence (AI) and Machine Learning (ML) tools and greater automation in data centre applications. In response, on the opening day of Connected Britain, ProLabs launched two new transceivers which will enable operators to upgrade their existing fibre networks without having to uproot and replace existing cables.

“So much has changed since the first full fibre networks were built that it is understandable some are beginning to struggle with the demands placed upon them by bandwidth-hungry households, businesses and data centres,” says Sam Walker, ProLabs' Vice President of Sales, EMEAI. “The highly anticipated new transceivers being launched today will extend the life of networks across the country without the need for major infrastructure replacement projects. Upgrading existing network elements and infrastructure has never been so easy.”

The new QSFP28 100G ZR4L-BiDi transceiver enables existing 10G fibre networks to operate at 100G at distances of 70km and beyond without the need to replace any existing infrastructure. The product is 100% compatible with all leading Network Equipment Manufacturer (NEM) switch and router platforms and comes with a single LC connector interface, making it perfect for any network operator looking to upgrade their network with minimum fuss.

The QSFP28 100G DCO transceiver responds to customer demand for a tuneable 100G product. It is capable of 80km point-to-point reach without amplification and massively increases the capacity of existing infrastructure, making it suitable for high-traffic scenarios and backhaul. It is also backwards compatible with existing switches.

“These transceivers have been developed with service providers in mind,” Walker continues. “In instances where a business has acquired another and is looking to integrate an older legacy network into the fold, our products will enable them to save valuable resources without an overhaul of legacy fibre. This will prove invaluable for alternative networks (altnets) with ambitions to consolidate and expand despite a limited budget.”
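As a rough illustration of the upgrade arithmetic involved, the sketch below assumes the 4 x 25 Gb/s lane aggregation commonly used by QSFP28 100G optics (the article does not specify ProLabs' lane configuration) and compares the result with the legacy 10G link being replaced:

```python
# Back-of-the-envelope capacity maths for a 10G-to-100G transceiver upgrade.
# Assumption (not stated in the article): the 100G module aggregates
# 4 x 25 Gb/s lanes, as is common for QSFP28 optics.

LEGACY_RATE_GBPS = 10   # existing 10G link
LANES = 4               # assumed lane count
LANE_RATE_GBPS = 25     # assumed per-lane rate

upgraded_rate = LANES * LANE_RATE_GBPS   # 100 Gb/s
gain = upgraded_rate / LEGACY_RATE_GBPS  # 10x

print(f"Upgraded link: {upgraded_rate} Gb/s, "
      f"a {gain:.0f}x capacity gain over the same fibre")
```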

Kiteworks boosts data collection capabilities with new acquisition
Kiteworks, which delivers data privacy and compliance for sensitive content communications through its Private Content Network (PCN), has acquired 123FormBuilder, a provider of advanced data collection through secure web forms and form-driven private content workflows. Kiteworks says that this strategic move further strengthens its position as a trusted provider for organisations seeking to protect sensitive content across their entire content communications ecosystem.

“We are very excited to welcome Florin and the talented team at 123FormBuilder to the Kiteworks family,” says Amit Toren, SVP of Corporate and Business Development at Kiteworks. “123FormBuilder’s emphasis on security and compliance aligns with our PCN vision. Our customers will benefit from no-code, dynamic form creation, as well as bidirectional integration of web forms with various solutions such as Salesforce, Stripe, Shopify, HubSpot, and others.

“In addition, this acquisition further solidifies Kiteworks’ aggressive growth strategy and demonstrates our continued momentum in expanding our market presence and technological capabilities through strategic M&A activities.”

Integrating 123FormBuilder’s advanced data collection through secure web forms and form-driven private content workflows into the Kiteworks Private Content Network will enable 123FormBuilder’s customers to benefit from a unified platform that centralises tracking, control, and security of sensitive content communications. Consolidation of audit logs into one platform will also streamline compliance tracking and reporting for 123FormBuilder customers.

123FormBuilder offers a comprehensive, modern, secure web forms platform, enabling customers to build secure registration forms, order forms, surveys, and other form types quickly and easily. The company offers advanced no-code, drag-and-drop online form creation that includes conditional logic, e-signature functionality, multipage forms, file uploads, and integrations with over 45 popular tools for streamlined workflow automation.

“123FormBuilder is thrilled to join the Kiteworks family and contribute to its PCN vision, empowering organisations to manage security and compliance risk across communication channels,” notes Florin Cornianu, CEO of 123FormBuilder. “Our team at 123FormBuilder has worked tirelessly to develop a secure and user-friendly platform for data collection, a technology that will thrive under Kiteworks’ guidance. The acquisition extends our long-term security and compliance commitment to innovation bolstered by a profitable, well-funded organisation committed to the highest security and compliance standards.”

Kiteworks’ acquisition of 123FormBuilder follows on the heels of its recent $456 million growth equity investment. For more from Kiteworks, click here.

Custocy partners with Enea for AI-based NDR integration
Custocy, a pioneer in artificial intelligence (AI) technologies for cybersecurity, is to embed Enea Qosmos deep packet inspection (DPI) and intrusion detection system (IDS) software libraries in its AI-powered network detection and response (NDR) platform. This integration will enable Custocy to improve accuracy and performance and support product differentiation through detailed traffic visibility and streamlined data inspection.

Custocy uses layered, multi-temporal AI functions to detect immediate threats as well as persistent attacks. This approach streamlines the work of security analysts through attack path visualisation, improved prioritisation, workflow support and a radical reduction in the number of false-alarm alerts (‘false positives’). By integrating Enea software into its solution, Custocy will have the exceptional traffic data it needs to extend and accelerate this innovation while meeting extreme performance demands.

Enea’s deep packet inspection engine, the Enea Qosmos ixEngine, is the most widely embedded DPI engine in the cybersecurity industry. While it has long played a vital role in a wide range of security functions, it is increasingly valued by security leaders today for the value it brings to AI innovation. With market-leading recognition of more than 4,500 protocols and delivery of 5,900 metadata attributes, including unique indicators of anomaly, Qosmos ixEngine provides invaluable fuel for AI innovators like Custocy.

In addition, the Enea Qosmos Threat Detection SDK delivers a two-fold improvement in product performance by eliminating double packet processing for DPI and IDS, optimising resources and streamlining overheads. And thanks to Enea Qosmos ixEngine’s packet acquisition and parsing library, parsing speed is accelerated while traffic insights are vastly expanded to fuel next-generation threat detection and custom rule development.

These enhancements are important, as demand for high-performing NDR solutions has never been higher. NDR plays a pivotal role in detecting unknown and advanced persistent threats (APTs), a challenge certain to become even more daunting as threat actors adopt AI tools and techniques. Custocy is well-positioned to help private and public organisations meet this challenge with a unique technological core built on AI that has earned the company a string of awards, the latest being Product of the Year at Cyber Show Paris.

Jean-Pierre Coury, SVP of Enea's Embedded Security Business Group, comments, “Custocy has developed its solution from the ground up to exploit the unique potential of AI to enhance advanced threat detection and security operations. AI is truly woven into the company's DNA, and I look forward to the additional value it will deliver to its customers as they leverage the enhanced data foundation delivered by Enea software to support their continuous AI innovation.”

Custocy CEO, Sebastien Sivignon, adds, “We are thrilled to join forces with Enea to offer our customers the highest level of network intrusion detection. The Enea Qosmos ixEngine is the industry gold standard for network traffic data. It offers a level of accuracy and depth conventional DPI and packet sniffing tools cannot match. Having such a rich source of clean, well-structured, ready-to-use data will enable Custocy to dramatically improve its performance, work more efficiently and devote maximum time to AI model innovation.”
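To make the idea of DPI metadata as fuel for AI concrete, here is a minimal, hypothetical sketch of how per-flow traffic metadata of the kind a DPI engine produces might feed an unsupervised anomaly detector. The feature set and values are invented for illustration and do not represent actual Qosmos ixEngine output or Custocy's models:

```python
# Minimal sketch: scoring network flows for anomalies from DPI-style metadata.
# The four features below are hypothetical; real DPI engines expose thousands
# of protocol and metadata attributes per flow.
from sklearn.ensemble import IsolationForest
import numpy as np

# Each row is one flow: [bytes sent, bytes received, duration (s), packet count]
flows = np.array([
    [1_200,   8_500, 0.4,  14],   # typical web request
    [900,     7_900, 0.3,  12],
    [1_100,   8_200, 0.5,  13],
    [95_000,    600, 42.0, 900],  # unusual: large, long-lived upload
])

detector = IsolationForest(contamination=0.25, random_state=0).fit(flows)
scores = detector.decision_function(flows)  # lower = more anomalous

for flow, score in zip(flows, scores):
    flag = "ANOMALY" if score < 0 else "ok"
    print(f"{flag:7s} score={score:+.3f} flow={flow.tolist()}")
```

In a production NDR pipeline, richer and cleaner metadata lets a model like this separate benign from suspicious traffic with fewer false positives, which is the benefit the article describes.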

Veeam expands data resilience for Microsoft 365
Veeam Software, a data resilience specialist, has announced the release of Veeam Backup for Microsoft 365 v8, delivering comprehensive and flexible immutability for Microsoft 365 data. Organisations can now ensure their Microsoft 365 data is resilient by employing a zero-trust, multi-layered immutable strategy, ensuring that backup data is safe from potential changes or deletions and that its original integrity stays intact. Currently protecting more than 21 million Microsoft 365 users, Veeam safeguards customers’ critical Microsoft 365 data to ensure that their business keeps running no matter what happens.

“Losing the critical data, files and communications housed in Microsoft 365 is a catastrophic scenario for any organisation,” says John Jester, CRO at Veeam. “That’s why we’re protecting over 21 million users today, more than any vendor in the market, making Veeam the number one data resilience solution for Microsoft 365.

“Veeam Backup for Microsoft 365 v8 ensures that despite expected cyber-attacks and data disruptions, organisations have ready access to critical business information to ensure business continuity. Now with the most comprehensive backup immutability for Microsoft 365, this release includes new architecture designed for efficiency and scale, as well as added support based directly on customer requests.”

Veeam Backup for Microsoft 365 v8 combines immutable backups with existing immutable copies, delivering total defence for organisations’ backups. It provides the flexibility to store backup data on any object storage, including Azure Blob Storage, Amazon S3, IBM Cloud Object Storage, or S3-compatible storage.

In addition to enhanced immutability, Veeam enables increased enterprise scale and efficiency with Veeam Proxy Pools. This architectural update boosts backup processing speed by distributing traffic across multiple proxies. By intelligently sharing the load and staying under throttling limits, enterprises can achieve better backup performance and efficiently scale up large environments with tens of thousands of users.

Responding to customers, Veeam has expanded its support with several new features. Organisations can now use Linux-based backup proxies, providing more choice and a lower total cost of ownership. Additionally, Veeam Backup for Microsoft 365 v8 now supports private and shared Microsoft Teams channels, offering comprehensive protection for this popular communication and collaboration platform.

Key features of Veeam Backup for Microsoft 365 v8 include:

• Comprehensive immutability: The most comprehensive backup immutability for Microsoft 365 on the market.
• Enterprise scale: Purpose-built architecture designed to handle the largest enterprise datasets.
• Added support: Private and shared Teams channels, Linux-based backup proxies, and MFA access to the UI.

Learn more about the new Veeam Backup for Microsoft 365 v8 and discover how organisations of all sizes can keep their data secure, protected, and accessible during the VeeamON Data Resilience Summit, taking place virtually October 1 (AMER and EMEA) and October 2 (APJ). Register now for free by clicking here. For more from Veeam, click here.
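For readers unfamiliar with backup immutability on object storage, the snippet below illustrates the underlying storage primitive in generic terms, using S3 Object Lock. It is a minimal sketch, not Veeam's implementation: the bucket and key names are invented, and the bucket must have been created with Object Lock enabled:

```python
# Illustrative sketch: writing an immutable backup object with S3 Object Lock.
# This shows the generic storage mechanism, not Veeam's internals; the bucket
# and key names below are hypothetical.
import boto3
from datetime import datetime, timedelta, timezone

s3 = boto3.client("s3")

retain_until = datetime.now(timezone.utc) + timedelta(days=30)

s3.put_object(
    Bucket="backup-repo-example",          # hypothetical bucket (Object Lock enabled)
    Key="m365/backup-2025-03-10.blob",     # hypothetical object key
    Body=b"...backup payload...",
    ObjectLockMode="COMPLIANCE",           # retention cannot be shortened or removed
    ObjectLockRetainUntilDate=retain_until,
)
```

Until the retention date passes, the object cannot be overwritten or deleted, which is the property that keeps backup data safe from ransomware or accidental changes.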

NetApp optimises VMware environments with new capabilities
NetApp, an intelligent data infrastructure company, has announced new capabilities that support VMware Cloud Foundation deployments. Mutual customers will be able to leverage NetApp solutions to right-size their IT environments to run VMware workloads at scale efficiently.

For more than a decade, NetApp and VMware have collaborated to ensure the success of their joint customers and to help them unlock the full value of their VMware investments. During that time, NetApp has been a key engineering design partner with VMware and is continuing to drive innovation in highly available, scalable and performant storage as a design partner for its Next-Generation vSphere Virtual Volumes (vVols). Now, NetApp is announcing new capabilities that will enable joint customers to run their VMware deployments more efficiently.

“NetApp and Broadcom are working together to take the uncertainty out of hybrid cloud environments,” explains Jonsi Stefansson, Senior Vice President and Chief Technology Officer at NetApp. “More than 20,000 customers rely on NetApp to support their VMware workloads. NetApp's continued close collaboration with Broadcom following the acquisition of VMware ensures our solutions seamlessly interoperate so our mutual customers can leverage a single intelligent data infrastructure to operate their VMware workloads more efficiently.”

NetApp is helping to optimise costs, simplify operations, and increase flexibility for customers running VMware environments by offering:

• Expanded support for VMware Cloud Foundation (VCF): NetApp and Broadcom customers will now be able to simplify their VCF hybrid cloud environments by using NetApp ONTAP software for all storage requirements, including standard and consolidated architectures. The latest release of ONTAP Tools for VMware (OTV) will support SnapMirror active sync to provide symmetric active-active data replication capabilities for NetApp storage systems running VMware workloads. SnapMirror active sync allows customers to operate more efficiently by offloading data protection from their virtualised compute and improving data availability.

• New capabilities for Azure VMware Solution (AVS): To support customers that are extending or migrating their vSphere workloads to the cloud, customers can now leverage Spot Eco by NetApp with AVS reserved instances to get the most value out of their deployments. Using Spot Eco to manage AVS reserved instances while also using Azure NetApp Files to offload data storage can reduce compute costs significantly.

• Enhanced VM Optimisation features for NetApp Cloud Insights: NetApp is introducing Cloud Insights VM Optimisation, expanding its comprehensive solution for optimising virtual environments, including VMware. Cloud Insights VM Optimisation will give customers tools to reduce costs by increasing VM density, run storage at the best price-to-performance ratio for their environment, and monitor their entire environment to ensure availability, performance, and adherence to configuration best practices across the entire stack.

To help customers optimise the compute, memory and storage resources of their VMware environments, NetApp is also offering a free 30-day trial of Cloud Insights, helping them migrate to the new VMware software subscriptions as cost-effectively as possible.

These offerings follow last month’s release of enhancements to the NetApp BlueXP disaster recovery service, which provides guided workflows to design and execute automated disaster recovery plans for VMware workloads across hybrid cloud environments, with newly added support for VMFS datastores.

“As organisations modernise infrastructure with VMware Cloud Foundation, they want to know that the services upon which they rely from industry leaders such as NetApp will continue to work seamlessly and deliver the value they have come to expect,” says Paul Turner, Vice President of Products, VCF Division at Broadcom. “Having NetApp as a close collaborator helps our mutual customers deploy innovative data and storage services on top of their private cloud platform and ensure they are getting the most value out of their VMware environments.”

“We have made Microsoft Azure the cloud of choice for VMware environments, and offer fast and cost-effective solutions enabling many customers to move their VMware workloads to the cloud,” says Brett Tanzer, Vice President of Product Management at Microsoft. “As VMware customers navigate changes to operating virtualised environments, we have given our customers a way to lock in secure and predictable pricing over multiple years. NetApp's data management and cloud observability capabilities help our customers ensure those deployments are delivering the return on investment they need.”

“In an ever more complicated world of cloud, data, and infrastructure operations, IT teams are increasingly looking for holistic platforms over point solutions,” notes Scott Sinclair, Practice Director, Enterprise Strategy Group. “These joint updates from NetApp and Broadcom enable customers to use NetApp’s intelligent data infrastructure to consolidate multiple data operations onto a single platform with industry-leading data management and CloudOps capabilities. That will help customers drive greater operational and infrastructure efficiencies that reduce the total cost of ownership for their VMware investments.”

For more from NetApp, click here.

Singtel and Nscale partner to unlock GPU capacity
Singtel and Nscale, a fully vertically integrated AI cloud platform, have announced a strategic partnership that will unlock both companies’ GPU capacity across Europe and Southeast Asia. The collaboration aims to meet the growing global demand from enterprises for generative AI, high-performance computing and data-intensive workloads.

Singtel will leverage Nscale’s AMD and NVIDIA GPU capacity in Europe for Singtel’s customer workloads across key markets in the region. This capability ensures that Singtel can deliver on high-volume requirements on demand and maintain service excellence, especially when additional capacity is needed. Correspondingly, Nscale will be able to tap into Singtel’s NVIDIA H100 Tensor Core GPU capacity in the Southeast Asian region for its customers’ workloads through an integration with Singtel’s patented orchestration platform, Paragon. Furthermore, as Singtel’s regional data centre arm Nxera expands in the region, its sustainable AI-ready data centres will provide the necessary data centre capacity to support large-scale deployment of Nscale GPU capacity.

This partnership will allow Singtel and Nscale to build out a more comprehensive GPU-as-a-Service (GPUaaS) offering globally, ensuring that their customers benefit from the flexibility of a wider geographic footprint and robust infrastructure support. This will also drive greater utilisation of their respective GPU clusters.

Bill Chang, CEO of Singtel’s Digital InfraCo and Nxera, says, “As we continue to augment our GPUaaS offerings, we are forging a series of strategic partnerships to grow our ecosystem and broaden our service availability for our customers. Our partnership with Nscale will allow our customers to tap into their high-performance GPU resources on demand, unlocking new possibilities for innovation and efficiency. Our commitment to delivering cost-effective solutions, backed by our state-of-the-art data centres, ensures businesses can access high-performance GPU resources quickly and seamlessly.”

Josh Payne, Nscale Founder and CEO, adds, “Nscale is the vertically integrated GPU cloud building the global infrastructure backbone for generative AI. Our sustainable AI-ready data centre, together with our GW pipeline of data centre capacity, uniquely positions us to deliver sustainable AI infrastructure at any scale for customers worldwide. Through this strategic partnership, Nscale will provide Singtel customers with unmatched access to sustainable, high-performance, and cost-effective AI compute to accelerate enterprise generative AI in the region and beyond.”

Singtel previously announced in February that it would be launching its GPUaaS later this year, providing enterprises with access to NVIDIA’s AI computing power. This will enable them to deploy AI at scale quickly and cost-effectively to accelerate growth and innovation. Singtel also recently announced a partnership with Vultr in the US and a strategic partnership with Bridge Alliance that will bring its GPUaaS offerings to enterprises across Southeast Asia. Singtel’s GPUaaS will be expanded to run in new sustainable, hyper-connected, AI-ready data centres by Nxera across Singapore, Thailand, Indonesia and Malaysia when they begin operations from mid-2025 onwards.

Nscale's strategic partnership with Singtel follows a number of recent announcements, including a partnership with Open Innovation AI to deliver 30,000 GPUs' worth of consumption to the Middle Eastern market, integrating Nscale’s powerful GPU infrastructure with Open Innovation AI’s orchestration, data science tools and frameworks. Additionally, Nscale recently acquired Kontena, a leader in high-density modular data centres and AI data centre solutions, further enhancing its ability to provide high-performance, cost-effective AI infrastructure to meet the growing demands of the generative AI market. For more from Singtel, click here.

Is poor data quality the biggest barrier to efficiency?
Employing data specialists, selecting the right tech and understanding the value of a patient and meticulous approach to validation are all fundamental elements of an effective data strategy, according to STX Next, a global leader in IT consulting.

Recent research shows that data is an asset that many organisations undervalue, with businesses generating over $5.6 billion in annual global revenue losing a yearly average of $406 million as a direct result of low-quality data. Bad data primarily impacts company bottom lines by acting as the bedrock of underperforming business intelligence reports and AI models – set up or trained on inaccurate and incomplete data – that produce unreliable responses, which businesses then use as the basis for important decisions.

According to Tomasz Jędrośka, Head of Data Engineering at STX Next, significant work behind the scenes is required for organisations to be confident in the data at their disposal.

Tomasz says, “Approaches to data quality vary from company to company. Some organisations put a lot of effort into curating their data sets, ensuring there are validation rules and proper descriptions next to each attribute. Others concentrate on rapid development of the data layer with very little focus on eventual quality, lineage and data governance.

“Both approaches have their positives and negatives, but it’s worth remembering that data tends to outlive all other layers of the application stack. Therefore, if data architecture isn’t designed correctly there could be issues downstream. This often stems from aggressive timelines set by management teams, as projects are rushed to facilitate unrealistic objectives, leading to a less than desirable outcome.

“It’s important to remember that the data world is no longer recognisable from where we were 20 years ago. Whereas before we had a handful of database providers, now development teams may pick one of a whole host of data solutions that are available.

“Businesses should carefully consider the requirements of the project and potential future areas that it might cover, and use this information to select a database product suitable for the job. Specialist data teams can also be extremely valuable, with organisations that invest heavily in highly skilled and knowledgeable personnel more likely to succeed.

“An integral aspect of why high-quality data is important in today’s business landscape is because companies across industries are rushing to train and deploy classical ML as well as GenAI models. These models tend to multiply whatever issues they encounter, with some AI chatbots even hallucinating when trained on a perfect set of source information. If data points are incomplete, mismatched, or even contradictory, the GenAI model won’t be able to draw satisfactory conclusions from them.

“To prevent this from happening, data teams should analyse the business case and the roots of ongoing data issues. Too often, organisations aim to tactically fix problems and then allow the original issue to grow bigger and bigger.

“At some point, a holistic analysis of the architectural landscape needs to be done, depending on the scale of the organisation and its impact, in the shape of a lightweight review or a more formalised audit where recommendations are then implemented. Fortunately, modern data governance solutions can mitigate a lot of the pain connected with such a process and in many cases make it smoother, depending on the size of the technical debt.”

Tomasz concludes, “Employees who trust and rely on data insights work far more effectively, feel more supported and drive improvements in efficiency. Business acceleration powered by a data-driven decision-making process is a true signal of a data-mature organisation, with such traits differentiating companies from rivals.”

For more from STX Next, click here.
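A minimal sketch of the attribute-level validation rules Tomasz describes might look like the following; the fields, rules, and descriptions are invented for illustration:

```python
# Minimal sketch: validation rules with a description next to each attribute.
# Field names and rules are hypothetical examples.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    field: str
    check: Callable        # predicate the value must satisfy
    description: str       # the "proper description next to each attribute"

RULES = [
    Rule("email",   lambda v: isinstance(v, str) and "@" in v,      "Contact email address"),
    Rule("age",     lambda v: isinstance(v, int) and 0 <= v <= 130, "Customer age in years"),
    Rule("revenue", lambda v: v is None or v >= 0,                  "Annual revenue, USD; optional"),
]

def validate(record: dict) -> list[str]:
    """Return a list of human-readable violations for one record."""
    errors = []
    for rule in RULES:
        if rule.field not in record:
            errors.append(f"missing field: {rule.field} ({rule.description})")
        elif not rule.check(record[rule.field]):
            errors.append(f"invalid {rule.field}: {record[rule.field]!r}")
    return errors

print(validate({"email": "a.smith@example.com", "age": 42, "revenue": 1_000_000}))  # []
print(validate({"email": "not-an-email", "age": -5}))  # three violations
```

Records that fail validation can then be quarantined or corrected before they reach BI reports or model training, which is exactly the point at which bad data starts multiplying downstream.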

Data storage insights among the highlights of Technology Live!
Federica Monsone, CEO of A3 Communications, reflects on this year's recently held Technology Live! event in Munich.

Backup and data recovery trends, flash storage innovations, the use of cutting-edge AI in virtual machine environments and more were at the heart of the Munich edition of A3 Communications’ Technology Live!. On show were Tintri, Keepit, and Pure Storage. With the three vendors showcasing their wares to an international audience of journalists and analysts, the event offered exciting deep dives into new technologies, drilled down into upcoming product roadmaps, and offered a glimpse of future growth strategies.

Tintri

Tintri, a division of DDN, provides products designed for cloud computing, virtual machines (VMs), and containers. The core product line is the VMstore, a storage system and software designed to simplify management in data centre and cloud environments.

At the Munich edition of the event, Tintri explored how to secure business value from AI. VP of Revenue Phil Trickovic and VP of Sales EMEA Mark Walsh presented the company’s rational and practical outlook on the future of AI in enterprises, and shared their vision of data management platforms’ role in the success of AI. The session dived into how Tintri is currently helping organisations derive business value from AI-driven data management. Phil and Mark also explained the blueprint for the company's future roadmap to support customers along their AI journeys. They highlighted how the vendor is seeking specialist partners who recognise the importance of its key market verticals, such as the public sector, health, education, and services, markets with technology needs around virtual servers, SQL, VDI and DevOps. As Mark explained, this is part of Tintri’s drive to further boost its EMEA channel partner programme.

Keepit

Danish-born Keepit provides independent backup of SaaS data, safeguarding businesses from data loss due to unforeseen events, including human error, cyberattacks and malicious deletion. By using a cloud-native, vendor-independent architecture, Keepit ensures data availability even if a main provider’s cloud is inaccessible.

Founded in 2007 and headquartered in Copenhagen, Keepit showcased how its unique data backup and recovery services operate, and how its SaaS backup achieves its impressive security and reliability targets. Keepit’s CTO Jakob Østergaard, CISO Kim Larsen, and VP DACH Michael Heuer demonstrated the Keepit Platform, providing insights into the technology under the hood. Built exclusively for SaaS applications, Keepit is the only vendor-independent cloud dedicated to SaaS data protection, ensuring data is stored in a separate geographical location from the production environment. A deep dive into the platform included its five-minute set-up as well as fast and granular restoration, guaranteeing business continuity through reliable and instant data availability. During this demonstration, Keepit also detailed installations at a number of its German customers: Edeka, Deutsches Rotes Kreuz Hessen, and ThyssenKrupp.

Pure Storage

Founded in 2009, Pure Storage develops all-flash data storage hardware and software products. At the Munich edition of the event, Pure’s Principal Technologist, Markus Grau, offered an overview of the company’s evolution. Grau stressed Pure’s unified data storage platform, which can satisfy the full range of customers' data storage needs (block, file, and object) across the entire price and performance spectrum. Grau also covered flexible consumption models, explained why Pure chose to offer a highly differentiated as-a-service portfolio, and described how its users benefit from its industry-leading SLAs and guarantees. The audience was also briefed on the way Pure is empowering its customers across a wide variety of AI use cases (financial services, healthcare, generative AI RAG applications), and how it helps them achieve environmental sustainability goals, reduce energy consumption, and minimise e-waste. The influencers in the room also heard about Pure’s ongoing revenue growth, up 20% year on year in the quarter ending in May, and the company's confidence in winning a future hyperscaler customer.

For more from Pure Storage, click here.

Cloudian and Lenovo announce AI data lake platform
Cloudian and Lenovo have announced the general availability of a new Cloudian HyperStore AI data lake platform that reportedly delivers new levels of performance and power efficiency. Built on Lenovo ThinkSystem SR635 V3 all-flash servers with AMD EPYC 9454P processors, the new solution demonstrated performance of 28.7 GB/s reads and 18.4 GB/s writes from a cluster of six power-efficient, single-processor servers, delivering a 74% power efficiency improvement versus an HDD-based system in Cloudian testing.

AI workloads demand scalable, secure solutions to meet the performance and capacity requirements of next-generation workloads. Cloudian’s limitlessly scalable, parallel-processing architecture – proven with popular AI and data analytics tools including PyTorch, TensorFlow, Kafka, and Druid – accelerates AI in capacity-intensive use cases such as media, finance, and life sciences. The system’s single-processor architecture not only delivers superior performance with just one socket, but also amplifies power efficiency, a metric that is emerging as a key concern as power consumption for generative AI is forecast to increase at an annual average of 70% through 2027, according to Morgan Stanley.

Lenovo combines Cloudian’s high-performance, AI-ready data platform software with its all-flash Lenovo ThinkSystem SR635 V3 servers and 4th Gen AMD EPYC processors to deliver an exceptionally high-performance, efficient and scalable data management solution for AI and data analytics.

“There’s a big focus on the AI boom in Australia, New Zealand and across APAC, and it’s easy to see why when bodies like the CSIRO say the Australian market alone could be worth close to A$500 billion (£258.6bn) in the next few years,” says James Wright, Managing Director, Asia Pacific and Japan, Cloudian. “But there’s a storage and infrastructure layer that companies and government agencies need to power the data-hungry workloads central to AI’s performance and functionality. What’s out there now simply won’t cut it. Imagine trying to power the mobile applications we use today with the simple mobile phones we had 20 years ago – it wouldn’t work, and it’s no different at the infrastructure level, particularly with AI in play.

“Cloudian’s data lake software on Lenovo’s all-flash servers simply breaks through the limitations we’ve had in terms of performance and power efficiency to power and secure modern applications and data workflows. These are the breakthroughs we need to drive AI, and particularly sovereign AI, which the CSIRO and many industry and government stakeholders are calling for in Australia.”

Michael Tso, CEO and Co-Founder at Cloudian, adds, “Lenovo’s industry-leading servers with AMD EPYC processors perfectly complement Cloudian’s high-performance data platform software. Together, they deliver the limitlessly scalable, performant, and efficient foundation that AI and data analytics workloads require. For organisations looking to innovate or drive research and discovery with AI, ML, and HPC, this solution promises to be transformative.”

Built for mission-critical, capacity-intensive workloads, the platform features exabyte scalability, industry-leading S3 API compatibility, military-grade security, and Object Lock for ransomware protection.

“Combining Lenovo’s high-performance, all-flash, AMD EPYC CPU-based servers with Cloudian's AI data lake software creates a solution that can handle the most demanding AI and analytics workloads,” notes Stuart McRae, General Manager, Lenovo. “This partnership enables us to offer our customers a cutting-edge, scalable, and secure platform that will help them accelerate their AI initiatives and drive innovation.”

Kumaran Siva, Corporate Vice President, Strategic Business Development, AMD, comments, “AI workloads demand a lot from storage. Our 4th Gen AMD EPYC processors, together with Lenovo's ThinkSystem servers and Cloudian's AI data lake software, deliver the performance and scalability that AI users need. The single-socket, AMD EPYC CPU-based Lenovo ThinkSystem SR635 V3 platform provides outstanding throughput combined with excellent power and rack efficiency to accelerate AI innovation.”

Proven in over 800 enterprise-scale deployments worldwide, Cloudian on-premises AI data lakes help organisations securely turn information into insight and develop proprietary AI models while fully addressing data sovereignty requirements. The combined Lenovo/AMD/Cloudian solution is available now from Lenovo and from authorised resellers.

For more from Cloudian, click here.
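As a quick sanity check on the headline figures quoted above, the per-node throughput of the six-server cluster in Cloudian's testing works out as follows:

```python
# Per-node throughput derived from the aggregate figures quoted in the article.
cluster_read_gb_s = 28.7   # GB/s, aggregate reads across the cluster
cluster_write_gb_s = 18.4  # GB/s, aggregate writes across the cluster
nodes = 6

print(f"Per node: {cluster_read_gb_s / nodes:.2f} GB/s read, "
      f"{cluster_write_gb_s / nodes:.2f} GB/s write")
# Per node: 4.78 GB/s read, 3.07 GB/s write
```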


