Meta is testing its first in-house AI training chip, a pivotal step toward cutting infrastructure costs and reducing its dependency on NVIDIA and other suppliers, Reuters reports, citing sources familiar with the matter. The move reflects a broader push to design custom silicon in-house, giving Meta tighter control over its AI hardware as it expands its AI ambitions.
The news follows a similar move by OpenAI, which revealed plans two months ago to finalize its first custom AI chip design, also with the aim of reducing reliance on NVIDIA. OpenAI, the creator of ChatGPT, is expected to complete the design in the coming months and partner with Taiwan Semiconductor Manufacturing Co (TSMC) for production.
Meta’s strategic shift towards in-house AI chips
Sources told Reuters that the initial rollout is small, but if the test succeeds, Meta plans to ramp up production for wider use.
The chip development is part of Meta’s broader plan to control costs as it invests heavily in AI tools to drive future growth. The company, which also owns Instagram and WhatsApp, forecasts total expenses of between $114 billion and $119 billion in 2025. Of that, as much as $65 billion is expected to go toward capital expenditures, much of it on AI infrastructure.
One source said the new chip is a dedicated accelerator built specifically for AI tasks. Unlike general-purpose GPUs, such chips can be more power-efficient because they handle only AI workloads.
Meta is working with Taiwan-based TSMC to manufacture the chip, the source added.
The test rollout began after Meta finished its first “tape-out” of the chip, a significant milestone in silicon development that involves sending an initial design to a chip factory, Reuters reported, citing another source.
Meta’s AI chip development challenges and future plans
The chip test phase began after Meta completed its first tape-out. This stage is critical and expensive, typically costing tens of millions of dollars and taking three to six months. If the test fails, Meta must diagnose the problem and repeat the tape-out process.
Both Meta and TSMC declined to comment on the chip’s progress.
The new chip is part of Meta’s Meta Training and Inference Accelerator (MTIA) series, a program that has had a rocky history, with earlier chips scrapped during development. However, Meta successfully deployed an MTIA chip for inference last year. Inference is the process of running an AI model as users interact with it; that chip powers the recommendation systems that determine what content appears in Facebook and Instagram feeds.
Meta executives aim to use in-house chips for training by 2026. Training involves “teaching” an AI model to process large datasets and perform tasks. The initial focus will be on recommendation systems, later expanding to generative AI products such as Meta’s chatbot.
“We’re working on how to train the recommender systems and ultimately how we think about training and inference for gen AI,” said Chris Cox, Meta’s chief product officer, at the Morgan Stanley Technology, Media and Telecom Conference.
Cox described the chip development effort as a “crawl, walk, run” process, but called the first-generation inference chip a great success.
Meta’s path with custom chips has not always been smooth. An earlier in-house inference chip failed during testing, prompting the company to fall back on Nvidia GPUs and place orders worth billions of dollars in 2022. Since then, Meta has remained one of Nvidia’s largest customers, relying on its GPUs to power recommendations, ads, and the Llama foundation model series. Those GPUs also handle inference for the billions of people who use Meta’s apps every day.
That reliance on GPUs has come under scrutiny, however. Some AI researchers have questioned whether scaling large language models with ever more data and computing power will continue to yield meaningful advances. Those doubts gained traction after Chinese startup DeepSeek released low-cost models that emphasize computational efficiency, triggering a sharp drop in Nvidia’s stock price, which later recovered.
Investors still expect Nvidia chips to remain dominant for AI training and inference, but broader trade concerns are keeping the market cautious.