Business Standard


Microsoft introduces two custom AI chips to power Azure services: Details

Microsoft does not intend to sell the chips but to use them internally to power its in-house AI services such as Bing, Microsoft 365 and the Azure AI service


Photo: Microsoft

Harsh Shivam New Delhi


At its Ignite developers conference on November 15, Microsoft introduced two custom chips, the Azure Maia 100 AI accelerator and the Azure Cobalt 100 CPU, to speed up its in-house AI computing tasks.

The Maia 100 chip is designed to run large language models and can be used to train AI models, while Cobalt, a custom Arm-based CPU, is designed to handle general computing workloads.

“Microsoft is building the infrastructure to support AI innovation, and we are reimagining every aspect of our data centres to meet the needs of our customers,” said Scott Guthrie, executive vice president of Microsoft’s Cloud + AI Group.


Microsoft said the Maia 100 is built on a 5 nm process and engineered explicitly for the Azure hardware stack. The company said the AI accelerator chip will power some of the largest internal AI workloads running on the Microsoft Azure cloud computing service, including Bing, Microsoft 365, and the Azure OpenAI service.

Microsoft said the Cobalt 100 chipset has 128 cores and is based on an Arm Neoverse CSS architecture for “delivering greater efficiency and performance in cloud native offerings”. 

Both custom chipsets will roll out to Azure data centres early next year. The American tech giant does not intend to sell the chips but to use them internally to power its in-house AI services.

Alongside its own custom AI chips, Microsoft also plans to add the NVIDIA H200 Tensor Core GPU to its Azure fleet next year to support larger language models. The company has partnered with NVIDIA to train mid-sized generative AI models on NVIDIA H100 Tensor Core GPUs.



First Published: Nov 16 2023 | 11:01 AM IST
