AMD launches MI325X AI chip to compete with Nvidia’s Blackwell
AMD launched a new artificial intelligence chip on Thursday aimed directly at Nvidia’s data center graphics processing units, known as GPUs.
The Instinct MI325X, as the chip is called, will begin production before the end of 2024, AMD said Thursday during a product announcement event. If developers and cloud giants come to see AMD's AI chips as a close substitute for Nvidia's products, it could put pricing pressure on Nvidia, which has enjoyed gross margins of around 75% while its GPUs have been in high demand over the past year.
Advanced generative AI like OpenAI’s ChatGPT requires massive data centers filled with GPUs to do the necessary processing, which has created demand for more companies to provide AI chips.
Over the past few years, Nvidia has dominated the data center GPU market, with AMD historically in second place. Now, AMD is aiming to take market share from its Silicon Valley rival, or at least to capture a large chunk of a market that it says will be worth $500 billion by 2028.
“AI demand has really continued to soar and really exceed expectations. It’s clear that the pace of investment is continuing to increase everywhere,” AMD CEO Lisa Su said at the event.
AMD didn’t reveal major new internet or cloud customers for its Instinct GPUs at the event, but the company has previously revealed that both Meta and Microsoft buy its AI GPUs and that OpenAI uses them for some applications. The company also did not disclose the price of the Instinct MI325X, which is typically sold as part of a complete server.
With the launch of the MI325X, AMD is moving to an annual release schedule for new chips to better compete with Nvidia and take advantage of the AI chip boom. The new AI chip is the successor to the MI300X, which began shipping late last year. AMD said its 2025 chip will be called MI350 and its 2026 chip will be called MI400.
The MI325X's rollout will pit it against Nvidia's upcoming Blackwell chips, which Nvidia says will begin shipping in significant quantities early next year.
A successful launch of AMD's newest data center GPU could draw interest from investors looking for more companies poised to benefit from the AI boom. AMD's stock is up only about 20% so far in 2024, while Nvidia's is up more than 175%. Most industry estimates put Nvidia's share of the data center AI chip market above 90%.
AMD shares fell 3% in Thursday trading.
AMD's biggest obstacle to gaining market share is that its rival's chips use Nvidia's own CUDA programming language, which has become the standard for AI developers. That effectively locks developers into Nvidia's ecosystem.
In response, AMD said this week that it has improved its competing software, called ROCm, so that AI developers can more easily port their AI models to AMD's chips, which the company calls accelerators.
AMD has framed its AI accelerators as more competitive in use cases where AI models are generating content or making predictions, rather than when a model is training on terabytes of data. That is partly thanks to the advanced memory AMD uses on its chips, which lets them run Meta's Llama AI model faster than some Nvidia chips.
"What you see is that the MI325 platform delivers up to 40% higher inference performance compared to H200 on Llama 3.1," Su said, referring to Meta's large language AI model.
Taking on Intel, too
While AI accelerators and GPUs have become the most closely watched part of the semiconductor industry, AMD's core business is the central processing unit, or CPU, that sits at the heart of nearly every server in the world.
In July, the company said its data center sales in the June quarter more than doubled from a year earlier to $2.8 billion, with AI chips accounting for only about $1 billion of that total.
AMD says it accounts for about 34% of all dollars spent on data center CPUs. That is still less than Intel, which remains the market leader with its Xeon chip line. AMD is aiming to change that with a new line of CPUs, called EPYC 5th Gen, which it also announced on Thursday.
The chips come in a range of configurations, from an inexpensive, low-power 8-core chip priced at $527 to 192-core, 500-watt processors intended for supercomputers, priced at $14,813 apiece.
AMD says the new CPUs are especially good at feeding data to AI workloads. Nearly all GPUs require a CPU on the same system in order to boot up the computer.
"Today's AI is really about the CPU capability, and you see that in data analytics and a lot of those types of applications," Su said.
WATCH: AMD CEO Lisa Su says technology trends play out over many years, and we are still early in learning what AI can do