They’re mortal enemies. Sworn rivals. The two top names when it comes to the best graphics cards, duking it out over even petty performance differences. And yet, AMD and Nvidia are partnering for the launch of Team Red’s 5th-gen Epyc server CPUs, code-named Turin.
Through what AMD called a “technical partnership,” Nvidia is providing guidance on how to pair its HGX and MGX data center GPU platforms with AMD’s new Epyc CPUs. Even though AMD makes its own Instinct AI accelerators, the partnership looks like a nod to reality: customers will want to use Nvidia GPUs with AMD CPUs, and the two companies came together to provide that guidance.
And that’s a good thing because AMD’s new data center chips look mighty impressive. The bedrock they’re built on is the Zen 5 architecture we saw in CPUs like the Ryzen 9 9950X and Ryzen 9 9900X, but massively expanded. You can see the dizzying product stack below, which ranges from an eight-core CPU all the way up to a massive chip packing 192 of AMD’s Zen 5c cores. And as a cherry on top, the sweet-spot 64-core 9575F can hit 5GHz — a first for AMD’s data center chips.
AMD is showing some pretty massive performance gains, too — 2.7x the performance on the SPEC CPU benchmark compared to an Intel Emerald Rapids Xeon CPU, and up to four times the performance in enterprise workloads such as video transcoding. Intel just recently launched its Granite Rapids Xeon CPUs, but AMD didn’t share any performance numbers against those chips.
Turin CPUs are massive, and it’s no wonder that Nvidia is working with AMD to provide guidance to customers. Still, AMD isn’t down for the count when it comes to data center GPUs. The company formally launched its new Instinct MI325X AI accelerator alongside the Turin CPUs, having previewed it at Computex earlier this year. A single GPU packs 256GB of HBM3E memory, and when organized into a cluster, AMD says the platform can support up to 2TB of HBM3E memory and 48TB/s of memory bandwidth.
AMD is still gunning for Nvidia’s AI crown with the GPU, too. The company showed the MI325X outperforming Nvidia’s H200 in AI inference and reaching performance parity in AI training. Nvidia has moved on to its Blackwell data center GPUs, but the H200 comparison is still fair given how rapidly AI hardware is rolling out these days.
Although most of the readers here at Digital Trends will never interact directly with this hardware, it’s still fun to gawk at just how much power companies like AMD can squeeze onto a CPU. Behind the scenes, AMD’s Epyc CPUs help power dozens of services we all interact with every day, from Netflix to Snapchat and beyond.