Meta Enters the AI Chip Arena
Meta, the parent company of Facebook and Instagram, is venturing into the competitive world of custom AI chip development. This move signals a significant strategic shift, as the company aims to reduce its dependence on external suppliers like Nvidia and join tech giants such as Google, Microsoft, and Amazon in the race to create specialized silicon for artificial intelligence.

Reports from news agency Reuters, citing anonymous sources, indicate that Meta has commenced a small-scale deployment of its first internally developed chip designed for AI training. If the initial testing phases prove successful, the company plans to expand production significantly. This initiative is part of a larger strategy to lower infrastructure costs while continuing to make massive investments in AI technology.
Reduced Reliance on External Suppliers
One of the sources told Reuters that Meta’s new training chip is a dedicated accelerator, specifically engineered to handle AI tasks. This custom design may offer superior power efficiency compared to the graphics processing units (GPUs) that are typically utilized for AI workloads.
Meta is collaborating with TSMC, the Taiwan-based manufacturing giant, to produce the chip. The testing phase commenced following Meta’s completion of the “tape-out” phase—a key stage in chip development where the initial design is sent to a chip factory.
MTIA Series and Future Applications
The new chip is part of Meta’s Training and Inference Accelerator (MTIA) family. Last year, Meta began using an MTIA chip for inference—the process of running AI systems during user interactions—specifically for content recommendation systems on Facebook and Instagram.
Company executives have outlined plans to begin using in-house chips for training by 2026, starting with recommendation systems before expanding to generative AI products such as the company’s chatbot, Meta AI. Meta’s chief product officer, Chris Cox, described the chip development process as gradual but noted that executives consider their first-generation inference chip a “big success.”
Cox further stated, “We’re working on how would we do training for recommender systems and then eventually how do we think about training and inference for gen AI.”
Meta previously abandoned an in-house custom inference chip after it performed poorly in testing. In 2022, the company instead ordered billions of dollars’ worth of Nvidia GPUs.