SAN FRANCISCO, Oct 10 (Reuters) – Advanced Micro Devices (AMD) announced a series of new products, including an updated artificial-intelligence chip and new server and PC processors, at an event in San Francisco on Thursday. The company aims to strengthen its position in the rapidly expanding AI market, which is currently dominated by Nvidia.

During the event, AMD CEO Lisa Su revealed that mass production of the MI325X, a new version of the AI chip, is slated to begin in the fourth quarter of 2024. The company also plans to release its next-generation MI350 series chips in the second half of 2025; those chips will feature increased memory and a new architecture designed for improved performance over the MI300X and MI325X chips.
The announcements were largely anticipated based on earlier communications from AMD this year, but investors appeared unimpressed, leading to a nearly 5% drop in AMD’s share price in afternoon trading. Some analysts suggested that the absence of new major cloud-computing customers for the chips contributed to the decline.
“There are no new customers announced so far,” said Kinngai Chan, research analyst at Summit Insights, who noted that the stock had already run up on expectations of something new.
In contrast, shares of rival Nvidia rose 1.5%, while Intel’s fell 1.6%.
Increased demand for AI processors from major technology companies like Microsoft and Meta Platforms has outpaced supply from Nvidia and AMD, allowing the semiconductor companies to sell their entire production output. This has fueled a significant surge in chip stocks over the past two years, with AMD’s shares up approximately 30% since a recent low in early August.
AMD, based in Santa Clara, California, said vendors such as Super Micro Computer will begin shipping the MI325X AI chip to customers in the first quarter of 2025. The design is intended to compete with Nvidia’s Blackwell architecture. The MI325X uses the same architecture as the MI300X, which AMD launched last year, but includes new memory to accelerate AI computations.
AMD’s next-generation AI chips are likely to put more pressure on Intel, which has encountered hurdles in deploying a coherent AI chip strategy. Intel anticipates AI chip sales exceeding $500 million in 2024.
New Server and PC Chips
Also at the event, Su said the company has no current plans to use contract chip manufacturers other than Taiwan’s TSMC for the advanced manufacturing processes that are crucial for producing fast AI chips.
“We would love to use more capacity outside of Taiwan. We are very aggressive in the use of TSMC’s Arizona facility,” Su remarked.
AMD also showcased several networking chips designed to speed up data transfer within data centers, and announced a new version of its server CPU design. The chip family, codenamed Turin, includes a version built to feed data to graphics processing units (GPUs), helping accelerate AI processing.
The flagship chip boasts nearly 200 processing cores and is priced at $14,813. The entire line of processors uses the Zen 5 architecture, which offers performance gains of up to 37% for processing advanced AI data.
Beyond the data center chips, AMD unveiled three new PC chips for laptops, based on the Zen 5 architecture. The new chips are designed to run AI applications and support Microsoft’s Copilot+ software.
In July, AMD raised its AI chip revenue forecast for the year to $4.5 billion from a previous target of $4 billion, citing rising demand for MI300X chips amid the surge in developing and deploying generative AI products. Analysts expect AMD to report data center revenue of $12.83 billion this year, according to LSEG estimates, while Wall Street expects Nvidia to report data center revenue of $110.36 billion. Data center revenue serves as an indicator of demand for the AI chips needed to build and run AI applications.
Despite the recent surge in share prices, analysts’ increasing earnings expectations have kept AMD and Nvidia’s valuations in check. Both companies trade at over 33 times their 12-month forward earnings estimates, compared to the benchmark S&P 500’s 22.3.
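As a rough illustration of the valuation math (the figures below are hypothetical, not drawn from the article): the forward price-to-earnings multiple divides a company’s share price by the earnings per share analysts expect over the next 12 months, so when estimates climb roughly in step with the price, the multiple barely moves.

\[
\text{Forward P/E} = \frac{\text{share price}}{\text{expected 12-month EPS}},
\qquad
\frac{\$100}{\$3.00} \approx 33 \quad\longrightarrow\quad \frac{\$120}{\$3.60} \approx 33
\]

In this hypothetical case, a 20% rise in both the share price and the earnings estimate leaves the multiple unchanged, which is how rising estimates can keep a rallying stock’s valuation in check.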
Reporting by Max Cherney in San Francisco; additional reporting by Aditya Soni and Arsheeya Bajwa in Bengaluru; Editing by Sonali Paul, Peter Henderson and Matthew Lewis