Artificial Intelligence in Data Centre Operations


Vertiv trends see intense focus on AI enablement and energy management
Intense, urgent demand for artificial intelligence (AI) capabilities and the duelling pressure to reduce energy consumption, costs and greenhouse gas emissions loom large over the data centre industry heading into 2024. The proliferation of AI, as Vertiv predicted two years ago, along with the infrastructure and sustainability challenges inherent in AI-capable computing, can be felt across the industry and throughout Vertiv’s 2024 data centre trends forecast.

“AI and its downstream impact on data centre densities and power demands have become the dominant storylines in our industry,” says Vertiv CEO Giordano Albertazzi. “Finding ways to help customers both support the demand for AI and reduce energy consumption and greenhouse gas emissions is a significant challenge requiring new collaborations between data centres, chip and server manufacturers, and infrastructure providers.”

These are the trends Vertiv’s experts expect to dominate the data centre ecosystem in 2024:

AI sets the terms for new builds and retrofits: Surging demand for artificial intelligence across applications is pressuring organisations to make significant changes to their operations. Legacy facilities are ill-equipped to support widespread implementation of the high-density computing required for AI, with many lacking the infrastructure needed for liquid cooling. In the coming year, more and more organisations will realise that half-measures are insufficient and opt instead for new construction – increasingly featuring prefabricated modular solutions that shorten deployment timelines – or large-scale retrofits that fundamentally alter their power and cooling infrastructure. Such significant changes present opportunities to implement more eco-friendly technologies and practices, including liquid cooling for AI servers, applied in concert with air-cooled thermal management to support the entire data centre space.
Expanding the search for energy storage alternatives: New energy storage technologies and approaches have shown the ability to integrate intelligently with the grid and deliver on a pressing objective: reducing generator starts. Battery energy storage systems (BESS) support extended runtime demands by shifting the load as necessary and for longer durations, and can integrate seamlessly with alternative energy sources, such as solar or fuel cells. This minimises generator use and reduces its environmental impact. BESS installations will become more common in 2024, eventually evolving to fit 'bring your own power' (BYOP) models and delivering the capacity, reliability and cost-effectiveness needed to support AI-driven demand.

Enterprises prioritise flexibility: While cloud and colocation providers aggressively pursue new deployments to meet demand, organisations with enterprise data centres are likely to diversify investments and deployment strategies. AI is a factor here as organisations wrestle with how best to enable and apply the technology while still meeting sustainability objectives. Businesses may start to look to on-premises capacity to support proprietary AI, and edge application deployments may be boosted by AI tailwinds. Many organisations can be expected to prioritise incremental investment, leaning heavily on prefabricated modular solutions and on service and maintenance to extend the life of legacy equipment. Such services can provide ancillary benefits, optimising operation to free up capacity in maxed-out computing environments and increasing energy efficiency in the process. Likewise, organisations can reduce Scope 3 carbon emissions by extending the life of existing servers rather than replacing and scrapping them.

The race to the cloud faces security hurdles: Gartner projects global spending on public cloud services to increase by 20.4% in 2024, and the mass migration to the cloud shows no signs of abating.
This puts pressure on cloud providers to increase capacity quickly to support demand for AI and high-performance compute, and they will continue to turn to colocation partners around the world to enable that expansion. For cloud customers moving more and more data offsite, security is paramount, and according to Gartner, 80% of CIOs plan to increase spending on cyber/information security in 2024. Disparate national and regional data security regulations may create complex security challenges as efforts to standardise continue.

New course from The Data Lab to boost understanding of AI
AI technology is evolving at an incredible pace, with a raft of new solutions continually being brought to market. But while the benefits AI could provide are clear, many business leaders are at risk of being overwhelmed by the practical application of AI in their organisation. To address these concerns, The Data Lab has launched a brand-new, online, self-paced course designed to give business leaders the practical skills to use AI responsibly within their organisations.

The course, Driving Value from AI, has been designed to address the primary questions leaders across all sectors have about AI. The four-week course, delivered over 14 hours, is entirely non-technical, meaning anyone can benefit from participating, regardless of their existing knowledge of data and technology. Bringing together expert insights from practitioners across Scotland, and guided by Strategy Advisor and Coach Craig Paterson, the course will empower learners to better understand how AI could benefit their organisation in a practical training format.

Anna Ashton Scott, Programme Manager for Professional Development at The Data Lab, who led the course development team, says, “Business leaders are trying their best to understand and keep up with the ever-changing AI landscape. They may feel embarrassed or vulnerable asking questions about AI, and worried about its impact on their employees, organisations and livelihoods. When building the 'Driving Value from AI' programme, we wanted to make it as practical as possible. Anyone who signs up will immediately see how they can translate their knowledge into practical benefits for their organisation, and remove any hesitation they may have had around AI.”

AI tools are already proving beneficial to organisations: a study by Stanford University and MIT found that the technology can increase productivity by 14%.
Separately, data from Anthem showed that companies do better by taking an incremental rather than a transformative approach to developing and implementing AI, and by focusing on augmenting human capabilities rather than replacing them entirely.

Anna Ashton Scott adds, “What we can achieve with AI will keep evolving and changing, so business leaders must also be nimble and responsive to emerging ethical responsibilities. Leaders must take action to ensure that ethical use of AI is built into operational plans. By doing so, they not only protect the organisation but also provide reassurance to their customers and stakeholders. This new course will ensure that all who participate can take relevant learnings into their organisations and act immediately.”

Until 22 December 2023, those interested in the course can gain a 50% discount using the code EARLYBIRD50. To register or find out more about the course, visit the website.

Research reveals that 95% of security leaders are calling for AI cyber regulations
Research from RiverSafe has revealed that 95% of businesses are urgently advocating for AI cyber regulations, ahead of November’s AI Safety Summit. The report, titled 'AI Unleashed: Navigating Cyber Risks Report' and conducted by Censuswide, revealed the attitudes of 250 cyber security leaders towards the impact of AI on cyber security.

Three in four surveyed businesses (76%) revealed that the implementation of AI within their operations has been halted due to the substantial cyber risks associated with the technology. Security concerns have also prompted 22% of organisations to prohibit their staff from using AI chatbots, highlighting deep-rooted apprehension about AI's potential vulnerabilities. To manage the risks, almost two-thirds (64%) of respondents have increased their cyber budgets this year, demonstrating a commitment to bolstering their cyber security defences.

Suid Adeyanju, CEO at RiverSafe, says, "While AI has many benefits for businesses, it is clear that cyber security leaders are facing the brunt of the risks. AI-enabled attacks can increase the complexity of security breaches, exposing organisations to data incidents, and we still have not explored the full extent of the risks that AI can pose. Rushing into AI adoption without first prioritising security is a perilous path, so striking a delicate balance between technological advancement and robust cyber security is paramount."

Almost two-thirds of businesses (63%) expect a rise in data loss incidents, while nearly one in five respondents (18%) admitted their business had suffered a serious cyber breach this year, emphasising the urgency of robust cyber security measures. A link to the full report can be found here.

VAST Data and Lambda partner for Gen AI training
VAST Data and Lambda have announced a strategic partnership that will enable the world's first hybrid cloud experience dedicated to AI and deep learning workloads. Together, Lambda and VAST are building an NVIDIA GPU-powered accelerated computing platform for Generative AI across public and private clouds. Lambda has selected the VAST Data Platform, a data platform designed from the ground up for the AI era, to power Lambda’s on-demand GPU cloud, providing customers with the fastest and most optimised GPU deployments for Large Language Model (LLM) training and inference workloads in the market. Lambda customers will also have access to the VAST DataSpace, a global namespace to store, retrieve, and process data with high performance across hybrid cloud deployments.

“Lambda is committed to partnering with the most innovative AI infrastructure companies in the market to engineer the fastest, most efficient, and most highly optimised GPU-based deep learning solutions available,” says Mitesh Agrawal, Head of Cloud and COO at Lambda. “The VAST Data Platform enables Lambda customers with private cloud deployments to burst swiftly into Lambda’s public cloud as workloads demand.
Going forward, we plan to integrate all of the features of VAST’s Data Platform to help our customers get the most value from their GPU cloud investments and from their data.”

Lambda chose the VAST Data Platform for its balance of delivering:

- Simplified AI infrastructure: The NVIDIA DGX SuperPOD certification of the VAST Data Platform allows Lambda to simplify data management and improve data access across its private cloud clusters
- HPC performance with enterprise simplicity: Its highly performant architecture is built for AI workloads, allowing for faster training of LLMs and preventing bottlenecks in order to extract the maximum performance from GPUs
- Data insights and management: The VAST DataBase offers structured and unstructured data analytics and insights that can be rolled out easily
- Data security: It provides multiple security layers across the VAST Data Platform, including encryption, immutable snapshots and auditability, providing customers with a zero-trust configuration for data in flight and at rest in a multi-tenant cloud environment
- Flexible scalability: It also simplifies multi-site and hybrid cloud operations to allow customers to easily scale to hundreds of petabytes and beyond as they grow

“Lambda and VAST are bound by a shared vision to deliver the most innovative, performant, and scalable infrastructure purpose-built from day one with AI and Deep Learning in mind,” says Renen Hallak, Founder and CEO of VAST Data. “We could not be happier to partner with a company like Lambda, who are at the forefront of AI public and private cloud architecture. Together, we look forward to providing organisations with cloud solutions and services that are engineered for AI workloads, offering faster LLM training, more efficient data management, and enabling global collaboration to help customers solve some of the world’s most pressing problems.”

Macquarie Data Centres calls on enterprises to build stronger AI foundations
Macquarie Technology Group's Head of Private Cloud, Jonathan Staff, has called on enterprises and technology providers to do more to lay down the right foundations on which to build AI, warning that cutting corners will come with major risks down the line. Speaking at the Future of Tech, Innovation and Work Conference in Sydney, the technology expert told business leaders they need to approach their AI strategy with a holistic, long-term lens.

“Businesses everywhere are scrambling to figure out how they can leverage AI and make sure they stay ahead of the competition. But in this mad rush to the finish line, we’re seeing lots of companies fail to invest in the right foundations needed to scale in the future,” says Jonathan. “AI is a huge investment, and there is a lot at stake. It is expensive to ‘lift and shift’ these operations once they’re set up, so getting it right from the get-go is so important.”

Jonathan highlights the challenge of securing the right infrastructure to maximise efficiency, a key priority for Macquarie Data Centres and a factor it says many overlook when embarking on their AI journey. In response to AI’s greater demands for power, cooling and specialised technology, the data centre company is focusing on providing the high-density environments required by these power-hungry AI engines.

The company has recently revealed plans to supercharge its next and largest facility, IC3 Super West, which is being purpose-built for the AI era. The Sydney-based data centre will offer AI-ready environments and be flexible enough to accommodate technologies such as advanced GPUs and liquid cooling. Macquarie recently secured a 41% increase in power to IC3 Super West, bringing the total load of its campus to 63MW.

The industry veteran also stressed the importance of making sure new AI tools are properly integrated into a company's existing systems. “Organisations need to think carefully about how the AI is going to talk to your existing systems.
If you build a new AI tool as a siloed project, and it takes off, then you’re going to have huge problems, and probably huge costs, trying to retrofit it and incorporate it into pre-existing systems. Organisations need to lay the right foundations to make sure everything is connected now, and that all the systems will provide enough runway to scale and grow quickly in the future.”

Jonathan calls on the industry to prepare for an AI-driven future and to think about how it can adapt to capitalise on the opportunities. However, he also stresses that this needs to be done in a way that is compliant with current and future data regulations. “AI is currently the wild west, but you can expect regulation around sovereignty and data compliance to get tighter in many countries. Businesses need to choose partners that have the right certifications and policies in place to ensure compliance now and into the future."

Data centres must adapt to meet evolving needs in the era of AI disruption
Schneider Electric has introduced an industry-first guide to addressing new physical infrastructure design challenges for data centres to support the shift in artificial intelligence (AI)-driven workloads, setting the gold standard for AI-optimised data centre design. Titled "The AI Disruption: Challenges and Guidance for Data Centre Design", this white paper provides invaluable insights and acts as a comprehensive blueprint for organisations seeking to leverage AI to its fullest potential within their data centres, including a forward-looking view of emerging technologies to support high-density AI clusters in the future.

AI disruption has brought about significant changes and challenges in data centre design and operation. As AI applications have become more prevalent and impactful on industry sectors ranging from healthcare and finance to manufacturing, transportation and entertainment, so too has the demand for processing power. Data centres must adapt to meet the evolving power needs of AI-driven applications effectively.

Pioneering the future of data centre design

AI workloads are projected to grow at a compound annual growth rate (CAGR) of 26-36% by 2028, leading to increased power demand within existing and new data centres. Servicing this projected energy demand involves several key considerations outlined in the white paper, which addresses the four physical infrastructure categories – power, cooling, racks and software tools. The white paper paves the way for businesses to design data centres that are not just capable of supporting AI, but fully optimised for it. It introduces innovative concepts and best practices, positioning Schneider Electric as a frontrunner in the evolution of data centre infrastructure.

“As AI continues to advance, it places unique demands on data centre design and management.
To address these challenges, it’s important to consider several key attributes and trends of AI workloads that impact both new and existing data centres,” says Pankaj Sharma, Executive Vice President, Secure Power Division and Data Centre Business at Schneider Electric. “AI applications, especially training clusters, are highly compute-intensive and require large amounts of processing power provided by GPUs or specialised AI accelerators. This puts a significant strain on the power and cooling infrastructure of data centres. And as energy costs rise and environmental concerns grow, data centres must focus on energy-efficient hardware, such as high-efficiency power and cooling systems, and renewable power sources to help reduce operational costs and carbon footprint.”

This new blueprint for organisations seeking to leverage AI to its fullest potential within their data centres has received welcome support from customers. “The AI market is fast-growing and we believe it will become a fundamental technology for enterprises to unlock outcomes faster and significantly improve productivity,” says Evan Sparks, Chief Product Officer for AI at Hewlett Packard Enterprise. “As AI becomes a dominant workload in the data centre, organisations need to start thinking intentionally about designing a full stack to solve their AI problems. We are already seeing massive demand for AI compute accelerators, but balancing this with the right level of fabric and storage and enabling this scale requires well-designed software platforms. To address this, enterprises should look to solutions such as specialised machine learning development and data management software that provide visibility into data usage and ensure data is safe and reliable before deploying.
Together with implementing end-to-end data centre solutions that are designed to deliver sustainable computing, we can enable our customers to successfully design and deploy AI, and do so responsibly.”

Unlocking the full potential of AI

Schneider Electric's guide explores the critical intersections of AI and data centre infrastructure, addressing key considerations such as:

- Guidance on the four key AI attributes and trends that underpin physical infrastructure challenges in power, cooling, racks and software management
- Recommendations for assessing and supporting the extreme rack power densities of AI training servers
- Guidance for achieving a successful transition from air cooling to liquid cooling to support the growing thermal design power (TDP) of AI workloads
- Proposed rack specifications to better accommodate AI servers that require high power, cooling manifolds and piping, and a large number of network cables
- Guidance on using data centre infrastructure management (DCIM), electrical power management system (EPMS) and building management system (BMS) software for creating digital twins of the data centre, operations and asset management
- A future outlook of emerging technologies and design approaches to help address AI evolution
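As a back-of-the-envelope check on what a 26-36% CAGR means in practice, the sketch below compounds a hypothetical 2023 baseline of 4GW of AI power demand forward to 2028. The baseline figure and the five-year horizon are assumptions chosen for illustration; they are not numbers taken from the white paper.

```python
# Illustrative compounding of AI power demand under the 26-36% CAGR
# range cited above. The 4 GW 2023 baseline is a hypothetical figure,
# not a source number.
def project(baseline_gw: float, cagr: float, years: int) -> float:
    """Compound baseline demand forward by `years` at annual rate `cagr`."""
    return baseline_gw * (1 + cagr) ** years

baseline_gw = 4.0  # hypothetical 2023 AI power demand
years = 5          # 2023 -> 2028

low = project(baseline_gw, 0.26, years)
high = project(baseline_gw, 0.36, years)
print(f"Projected 2028 AI power demand: {low:.1f}-{high:.1f} GW")
```

Even from a modest starting point, the low and high ends of the range diverge sharply after five years of compounding (roughly 12.7GW versus 18.6GW here), which is why power provisioning leads the white paper's four infrastructure categories.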

Colt reveals growth opportunities for partners with AI and on-demand
Colt Technology Services has published research revealing new opportunities for partners, as IT leaders look for support and knowledge on AI and intelligent infrastructure. The research, revealed in the latest edition of Colt’s annual Digital Infrastructure report, highlights growth opportunities for partners selling technologies critical to AI adoption and digital infrastructure. The technologies rated ‘absolutely essential’ to AI rollout are 5G (cited by 22%), agile connectivity (20%) and edge (20%). The study also outlines opportunities for partners selling consumption-based networks, as global uncertainty drives businesses to build flexibility into their organisations.

Colt’s survey of 755 IT leaders across countries in Europe and Asia revealed that enterprises are seeking to expand their knowledge of AI and intelligent infrastructure through a diverse range of partners. Some 34% are turning to SaaS providers; one in three (33%) to hardware vendors; and 32% to connectivity partners or systems integrators. Also, 31% look to consultants for advice and 29% to CSPs. The study reveals continued take-up of on-demand connectivity, with one in five (20%) saying it is absolutely essential to their business and 76% saying it is important to some extent. Almost nine in 10 (89%) survey respondents who have aspects of intelligent digital infrastructure are already using or plan to use on-demand connectivity.

The research also uncovered opportunities for partners to provide support and guidance to clients and end users on their intelligent infrastructure journeys, as many IT leaders surveyed admitted they’re not maximising their entire digital infrastructure estate. The highest proportion (17%) felt they were only at 70% capacity in terms of the functionality and features they’re already using.
The research also highlights pain points between IT leaders and partners, with almost one in five respondents (19%) saying relationships with external partners are their biggest obstacle and more than one in four (28%) naming poor integration as a barrier to the easy management of digital infrastructure. And 34% said a lack of partner APIs held them back. Download the report here.

Rise of AI to drive growth for Nordic data centre market
The data centre market in the Nordics is primed for exponential growth as a result of the acceleration of AI, according to CBRE. AI and machine learning (ML) technologies have experienced unprecedented adoption in 2023, and these wide-scale digital business transformations are fuelling demand for data centre infrastructure as a result. New research from CBRE suggests that much of this demand can be satisfied in the Nordics, primarily due to the region's low-cost power availability and leading sustainability credentials. There is an abundance of low-cost hydropower available, and with the inherently cold climate, there is minimal need to use additional power to cool equipment.

CBRE predicts that the Nordics will account for 8% of all colocation data centre supply in Europe by the end of 2023, a sharp year-on-year increase from 5% in 2022, with many locations in the region set to benefit from increased hyperscaler demand. According to the research, Norway’s data centre capacity is expected to more than double by the end of 2026 to 500MW, compared to the projected 210MW at the end of 2023. Furthermore, CBRE predicts that Stockholm’s data centre capacity will almost double by the end of 2026 to 136MW. Stockholm, alongside Oslo and Copenhagen, already forms part of Europe’s 20 largest data centre markets, with further growth expected as customers with large-scale requirements look to the Nordics to fulfil demand. Much of the new capacity will be absorbed by the hyperscale operators, but significant opportunities exist for colocation vendors to develop new purpose-built facilities, according to the research.
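The Norway projection above (210MW at the end of 2023 growing to 500MW by the end of 2026) implies a compound annual growth rate that can be recovered from the two quoted figures; a minimal sketch, using only the numbers cited from CBRE:

```python
# Implied compound annual growth rate from the CBRE figures above:
# Norway's capacity growing from 210 MW (end of 2023) to 500 MW
# (end of 2026), i.e. three years of compounding.
def implied_cagr(start_mw: float, end_mw: float, years: int) -> float:
    """Annual growth rate that turns start_mw into end_mw over `years`."""
    return (end_mw / start_mw) ** (1 / years) - 1

rate = implied_cagr(210, 500, 3)
print(f"Implied annual growth: {rate:.1%}")
```

That works out to roughly a third in added capacity every year, consistent with CBRE's expectation that hyperscale operators will absorb much of the new supply.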

CoreWeave, NVIDIA, VAST Data join forces to build AI data centres
VAST Data and CoreWeave have announced a strategic partnership that will further CoreWeave’s mission to deliver highly scalable and performant cloud infrastructure for AI and accelerated compute workloads. CoreWeave has selected the VAST Data Platform to build a global, NVIDIA-powered accelerated computing cloud for deploying, managing and securing hundreds of petabytes of data for generative AI, high performance computing (HPC) and visual effects (VFX) workloads. CoreWeave did extensive research and testing before selecting VAST Data to power all of its data centres. The VAST Data Platform has the scale, performance, and multi-tenant enterprise AI cloud capabilities required to power the massive AI and LLM training and inference applications that are now transforming everything from business to science, and society itself.

Through the partnership, CoreWeave and VAST Data are leveraging NVIDIA technology to engineer a new data platform architecture for large-scale, end-to-end data pipelines and deliver next-generation data services for AI workloads. To support this, the VAST Data Platform offers an enterprise network-attached data store that is certified for use with NVIDIA DGX SuperPOD, and eliminates tiers and infrastructure silos to make large-scale AI simpler, faster and easier to manage at virtually limitless levels of scale and performance.

Launch of Europe’s $1bn NVIDIA GPU AI Supercloud
NexGen Cloud has announced plans and funding for one of Europe’s first AI Supercloud deployments to support the development and growth of AI enterprises. An elite member of the NVIDIA Partner Network, it plans to invest $1bn to build its AI Supercloud in Europe, with $576 million already committed in hardware orders with suppliers. The AI Supercloud will provide a dedicated compute-intensive platform for Europe’s technology companies, organisations and governments, enabling them to execute sensitive AI applications and research within European jurisdiction and privacy laws.

Deployment is set to begin in October 2023, and the platform will also help meet increasing demand for accelerated computing, spurred by the technology industry’s growing interest in using generative AI and other applications to drive innovation and improve efficiency. It will also ensure regional and cost-effective access to GPU cloud services for European enterprises and scale-ups. NexGen Cloud’s AI Supercloud services will be delivered from European data centres powered exclusively by renewable energy, supporting industries including healthcare, finance, and media and entertainment. The platform will eventually consist of more than 20,000 NVIDIA H100 Tensor Core GPUs by June 2024, providing enterprises with access to one of the world’s most powerful GPU-accelerated platforms.

To help with the financing, NexGen Cloud has partnered with Moore and Moore Investments Group (MMI) and created a dedicated fund, which has attracted investment from their private investors. Access to the AI Supercloud over the next 12 months will be provided through NexGen Cloud’s Hyperstack platform. The company is already taking pre-orders for its first deployment in October.
