Meta has launched its next-generation AI chip, the Meta Training and Inference Accelerator (MTIA). The move is expected to have a significant impact on the tech giant by potentially reducing its reliance on chip industry leader NVIDIA. The next-gen MTIA chip is the successor to the MTIA v1, introduced in 2023. It runs a variety of models, including those tailored for ranking and recommending display advertisements across Meta's platforms, such as Facebook.
According to a report by the Times of India, Meta has debuted its new generation of AI chip, signaling a significant entry into the competitive world of AI hardware. The latest version of the MTIA chip could represent a breakthrough in performance and efficiency and is expected to exceed current industry standards.
Meta's Advancements in AI Chip Technology
Meta has acknowledged that the chip is not intended to replace traditional GPUs entirely, indicating a more measured approach to AI hardware. The MTIA chip architecture is designed to deliver an optimal balance of computing power, memory bandwidth, and memory capacity for serving ranking and recommendation models.
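To see why that balance matters, consider the shape of a typical recommendation workload. The sketch below is purely illustrative and is not Meta's code: it assumes a simplified DLRM-style model in PyTorch, where the embedding lookups for sparse features are bound by memory capacity and bandwidth, while the dense MLP layers are bound by compute throughput. All class names, table counts, and dimensions here are hypothetical.

```python
# Minimal, illustrative sketch (not Meta's code): a DLRM-style recommendation
# model whose two halves stress different parts of an accelerator.
import torch
import torch.nn as nn


class TinyRecModel(nn.Module):
    def __init__(self, num_categories=100_000, emb_dim=64, dense_features=13):
        super().__init__()
        # Sparse side: large embedding tables, dominated by memory capacity
        # and memory bandwidth (many random lookups, little arithmetic).
        self.embeddings = nn.ModuleList(
            [nn.EmbeddingBag(num_categories, emb_dim, mode="sum") for _ in range(8)]
        )
        # Dense side: MLP layers, dominated by compute (matrix multiplies).
        self.mlp = nn.Sequential(
            nn.Linear(dense_features + 8 * emb_dim, 256),
            nn.ReLU(),
            nn.Linear(256, 1),
        )

    def forward(self, dense_x, sparse_ids):
        # sparse_ids: one (batch,) LongTensor of category IDs per embedding table.
        pooled = [emb(ids.unsqueeze(1)) for emb, ids in zip(self.embeddings, sparse_ids)]
        x = torch.cat([dense_x] + pooled, dim=1)
        return torch.sigmoid(self.mlp(x))  # predicted click-through probability


model = TinyRecModel()
dense_x = torch.randn(32, 13)
sparse_ids = [torch.randint(0, 100_000, (32,)) for _ in range(8)]
scores = model(dense_x, sparse_ids)
print(scores.shape)  # torch.Size([32, 1])
```

Hardware serving models of this shape has to keep both halves fed at once, which is the trade-off the MTIA architecture is described as targeting.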
The MTIA chip is already in use in Meta's data centers for tasks such as ad ranking and recommendation, but it has not yet been used for generative AI training, the company's current primary focus. The high cost of training cutting-edge generative AI models makes in-house hardware like the MTIA a potentially cost-effective solution.
Meta is projected to spend roughly USD 18 billion on GPUs this year, which further underscores the appeal of developing its own chips as a cost-saving measure. By creating its own hardware, Meta could reduce its reliance on external suppliers and gain greater control over the development and deployment of its AI technology.