Meta announced it is joining the ranks of companies developing custom silicon, with plans to build chips to power its AI models.
Companies are increasingly following Apple’s lead, developing custom silicon specifically suited to their needs and applications. Microsoft is rumored to be working on custom silicon to better compete with Apple’s offerings, Amazon already uses its own chips to power Alexa, and Google’s in-house Tensor powers its Pixel smartphone lineup.
Meta says its custom chips will enable it to build bigger and more sophisticated AI models, according to Santosh Janardhan, VP & Head of Infrastructure:
We are executing on an ambitious plan to build the next generation of Meta’s AI infrastructure and today, we’re sharing some details on our progress.
This includes our first custom silicon chip for running AI models, a new AI-optimized data center design and the second phase of our 16,000 GPU supercomputer for AI research. These efforts — and additional projects still underway — will enable us to develop larger, more sophisticated AI models and then deploy them efficiently at scale. AI is already at the core of our products, enabling better personalization, safer and fairer products, and richer experiences while also helping businesses reach the audiences they care about most.
Janardhan says the new chip is called MTIA:
MTIA (Meta Training and Inference Accelerator): This is our in-house, custom accelerator chip family targeting inference workloads. MTIA provides greater compute power and efficiency than CPUs, and it is customized for our internal workloads. By deploying both MTIA chips and GPUs, we’ll deliver better performance, decreased latency, and greater efficiency for each workload.
Janardhan did not say which foundry Meta plans to use to manufacture the chips.