A major tech company is ramping up its artificial intelligence development. In this case, the protagonist of this familiar story is Meta, which, according to Reuters, is testing its first in-house AI training chip. The goal is to cut the company's enormous infrastructure costs and reduce its dependence on NVIDIA (a dependency that apparently brings out Mark Zuckerberg's "adult language"). If all goes well, Meta hopes to use the chip for training by 2026.
Meta has reportedly begun a small-scale deployment of the chip, a dedicated accelerator designed specifically for AI workloads, which can make it more power-efficient than NVIDIA's general-purpose GPUs. The deployment began after the company completed its first "tape-out," the stage of silicon development in which a finished design is sent to a fabrication plant for manufacturing.
The chip is part of the Meta Training and Inference Accelerator (MTIA) series, a family of the company’s own silicon solutions focused on generative AI, recommender systems, and advanced research.
Last year, the company started using an MTIA chip for inference, the behind-the-scenes process in which a trained AI model makes predictions. Meta first deployed it in the recommendation systems that power the Facebook and Instagram news feeds. Reuters reports that the company now plans to use its training silicon for the same purpose. The long-term plan for both chips is to start with recommendations and eventually extend them to generative products such as the Meta AI chatbot.
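To make the training-versus-inference split concrete, here is a minimal, purely illustrative sketch in Python. It has nothing to do with Meta's actual systems: a toy linear model scores an item for a user (inference), while a gradient step adjusts its weights against a known label (training). Training chips accelerate the repeated weight-update loop; inference chips accelerate the forward scoring pass.

```python
# Toy recommender: illustrates training vs. inference.
# All names and numbers are hypothetical, for illustration only.

def predict(weights, features):
    """Inference: a forward pass that scores an item for a user."""
    return sum(w * f for w, f in zip(weights, features))

def train_step(weights, features, target, lr=0.1):
    """Training: nudge the weights to reduce squared error on one example."""
    error = predict(weights, features) - target
    return [w - lr * error * f for w, f in zip(weights, features)]

weights = [0.0, 0.0]
features, label = [1.0, 2.0], 1.0  # one (feature vector, relevance label) pair

# Training loop: many weight updates, the expensive part accelerators target.
for _ in range(100):
    weights = train_step(weights, features, label)

# Inference: a single cheap forward pass with the learned weights.
score = predict(weights, features)
```

In production the same asymmetry holds at scale: training runs once (or periodically) over enormous datasets, while inference runs billions of times a day inside the feed, which is why Meta split the two workloads across different chips.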
The company is one of NVIDIA's largest customers, having placed orders for billions of dollars' worth of GPUs in 2022. That buying spree marked a turning point for Meta, which had just abandoned an earlier in-house inference chip after it failed during a small test deployment – similar to the one now under way for the training chip.