Advanced Micro Devices (AMD) has unveiled its next-generation AI chips, the Instinct MI400 series, set to ship next year. The new chips will be part of a ‘rack-scale’ system called Helios, which lets thousands of chips work together as a unified system. That capability is crucial for customers building ‘hyperscale’ AI compute clusters that can span entire data centers.
Key Features of AMD’s Instinct MI400 Series
AMD CEO Lisa Su introduced the MI400 series at a launch event in San Jose, California, alongside OpenAI CEO Sam Altman. Altman expressed enthusiasm for the new chips, stating, “It’s gonna be an amazing thing.” The rack-scale setup makes the chips appear as a single system to users, a feature vital for AI customers like cloud providers and companies developing large language models.
Competition with Nvidia
AMD’s move is a direct challenge to Nvidia, the current market leader in data center GPUs for AI applications. Nvidia’s Blackwell chips already come in configurations with 72 graphics processing units stitched together. AMD’s MI400 chips, combined with the Helios system, aim to compete on both performance and price. Andrew Dieckmann, AMD’s general manager for data center GPUs, said the company’s chips would offer “significant double-digit percentage savings” compared with Nvidia’s offerings.
Market Impact and Adoption
The total market for AI chips is expected to exceed $500 billion by 2028. AMD is positioning itself to capture a significant share of this market by releasing new AI chips annually. The company has already seen adoption from major AI customers, including OpenAI, Tesla, xAI, and Cohere. Oracle plans to offer clusters with over 131,000 MI355X chips, AMD’s current most advanced AI chip, to its customers.
Open Software Frameworks
Su said the MI355X can outperform Nvidia’s Blackwell chips even with Nvidia’s proprietary CUDA software as an advantage, attributing this to both strong hardware and advancements in open software frameworks. AMD is also integrating its rack systems with UALink, an open networking standard, rather than Nvidia’s proprietary NVLink.
Current Offerings and Future Plans
The Instinct MI355X started shipping last month and will be available to rent from cloud providers in the third quarter. It delivers seven times the computing power of its predecessor and carries more high-speed memory, making it better suited for inference tasks. AMD has also invested heavily in AI, buying or investing in 25 AI companies in the past year, including the acquisition of server maker ZT Systems.

As the AI market continues to grow, AMD is poised to challenge Nvidia’s dominance with its competitive pricing and innovative technology. The company’s focus on open software frameworks and rack-scale systems positions it as a significant player in the future of AI hardware.