Microsoft has unveiled its own line of custom chips as the company works to keep up with demand for its Bing Chat AI.
AI chips have become some of the most widely sought components in the tech industry, with companies of all sizes relying on them to power their AI applications and platforms. Microsoft recently inked a deal with Oracle to use Oracle Cloud Infrastructure (OCI) to help power Bing Chat.
In a blog post, Microsoft revealed its new Cobalt line of custom Arm-based chips:
We’re introducing our first custom in-house central processing unit series, Azure Cobalt, built on Arm architecture for optimal performance per watt efficiency, powering common cloud workloads for the Microsoft Cloud. From in-house silicon to systems, Microsoft now optimizes and innovates at every layer in the infrastructure stack. Cobalt 100, the first generation in the series, is a 64-bit 128-core chip that delivers up to 40 percent performance improvement over current generations of Azure Arm chips and is powering services such as Microsoft Teams and Azure SQL.
The company also emphasized its relationships with other industry leaders:
We continue to build our AI infrastructure in close collaboration with silicon providers and industry leaders, incorporating the latest innovations in software, power, models, and silicon. Azure works closely with NVIDIA to provide NVIDIA H100 Tensor Core graphics processing unit (GPU)-based virtual machines (VMs) for mid to large-scale AI workloads, including Azure Confidential VMs. On top of that, we are adding the latest NVIDIA H200 Tensor Core GPU to our fleet next year to support larger model inferencing with no reduction in latency.
As we expand our partnership with AMD, customers can access AI-optimized VMs powered by AMD’s new MI300 accelerator early next year. This demonstrates our commitment to adding optionality for customers in price, performance, and power for all of their unique business needs.
Microsoft is clearly all-in on its AI investments and is pulling out all the stops to keep scaling as cost-effectively as possible.