AMD’s Chief Technology Officer, Mark Papermaster, believes inference workloads will soon shift from data centers to consumer devices such as phones and laptops. In an interview with Business Insider, Papermaster said AMD expects the majority of inference to be done at the edge by 2030, although the exact timeline depends on the arrival of ‘killer apps’ that can run on these devices.
The Shift to Edge AI
Papermaster explained that the rising cost of AI compute in data centers will force tech giants like Microsoft, Meta, and Google to reconsider their approach, making edge AI more prevalent. This conviction is a significant reason AMD is taking the ‘AI PC’ hype seriously, arguably more so than competitors such as Intel and Qualcomm.
AMD’s Commitment to Edge AI
AMD’s latest APU lineups, including Strix Point and Strix Halo, reflect the company’s commitment to bringing AI compute to small form factors at a lower cost. The CTO also stressed the need to improve the accuracy and efficiency of AI models, noting that developments such as DeepSeek are pushing the industry toward more optimized alternatives.
Future of AI Inference
In the long term, Papermaster believes devices will become capable enough to run sophisticated AI models locally, giving users access to full AI capabilities without depending on the cloud. This outlook echoes earlier remarks by Intel’s former CEO Pat Gelsinger, who likewise stressed the importance of inference.

AMD is positioning itself to challenge NVIDIA in the AI inference market, as NVIDIA’s rivals broadly acknowledge that competing in AI training is difficult given its dominance there. By focusing on edge inference, AMD aims to carve out an advantage in the evolving AI landscape.