AMD has announced its vision for the future of the data centre and pervasive AI, showcasing the products, strategy and ecosystem partners that will shape the future of computing.
The ‘Data Center and AI Technology Premiere’ highlighted the next phase of data centre innovation. The company was joined on stage by executives from Amazon Web Services, Citadel, Hugging Face, Meta, Microsoft Azure and PyTorch to showcase the technology partnerships bringing the next generation of high-performance CPU and AI accelerator solutions to market.
“Today, we took another significant step forward in our data centre strategy as we expanded our 4th Gen EPYC processor family with new leadership solutions for cloud and technical computing workloads, and announced new public instances and internal deployments with the largest cloud providers,” says AMD Chair and CEO, Dr Lisa Su. “AI is the defining technology shaping the next generation of computing and the largest strategic growth opportunity for AMD. We are laser focused on accelerating the deployment of AMD AI platforms at scale in the data centre, led by the launch of our Instinct MI300 accelerators planned for later this year and the growing ecosystem of enterprise-ready AI software optimised for our hardware.”
AWS joined the company to preview the next-generation Amazon Elastic Compute Cloud (Amazon EC2) M7a instances, powered by 4th Gen AMD EPYC ‘Genoa’ processors.
AMD has introduced the 4th Gen AMD EPYC 97X4 processors, formerly codenamed ‘Bergamo’. With 128 ‘Zen 4c’ cores per socket, these processors deliver high vCPU density, performance and efficiency for applications that run in the cloud. Meta joined to discuss how the processors are well suited to its mainstay applications, such as Instagram and WhatsApp.
AMD also unveiled its 3D V-Cache technology for x86 server CPUs, and shared its AI platform strategy, giving customers a cloud-to-edge-to-endpoint portfolio of hardware products, backed by deep industry software collaboration, to develop scalable and pervasive AI solutions.
The AMD Instinct MI300X accelerator is one of the most advanced accelerators for generative AI, with growing software ecosystem momentum from partners including PyTorch and Hugging Face. Based on the next-generation AMD CDNA 3 accelerator architecture, it supports up to 192GB of HBM3 memory, providing the compute and memory efficiency needed for large language model training and inference in generative AI workloads.
Finally, AMD showcased the ROCm software ecosystem for data centre accelerators, highlighting its readiness and its collaborations with industry partners to build an open AI software ecosystem. PyTorch discussed the work to fully upstream the ROCm software stack, an integration that gives developers an array of AI models that are compatible and ready to use on AMD accelerators. Hugging Face also announced that it will optimise its models for AMD platforms.
© 2025 All Things Media Ltd.