Microsoft reveals second generation of its AI chip in effort to bolster cloud business

Scott Guthrie, executive vice president of cloud and enterprise at Microsoft, speaks at the Microsoft Build developer conference in Seattle on May 7, 2018. The Build conference, marking its second consecutive year in Seattle, is expected to put emphasis on the company’s cloud technologies and the artificial intelligence features within those services.

Grant Hindsley | Bloomberg | Getty Images

Microsoft announced the next generation of its artificial intelligence chip, a potential alternative to leading processors from Nvidia and to offerings from cloud rivals Amazon and Google.

The Maia 200 comes two years after Microsoft said it had developed its first AI chip, the Maia 100, which was never made available for cloud clients to rent. Scott Guthrie, Microsoft’s executive vice president for cloud and AI, said in a blog post Monday that, for the new chip, there will be “wider customer availability in the future.”

Guthrie called the Maia 200 “the most efficient inference system Microsoft has ever deployed.” Developers, academics, AI labs and people contributing to open-source AI models can apply for a preview of a software development kit.

Microsoft said its superintelligence team, led by Mustafa Suleyman, will use the new chip. The Microsoft 365 Copilot add-on for commercial productivity software bundles and the Microsoft Foundry service, for building on top of AI models, will use it as well.

Cloud providers face surging demand from generative AI model developers such as Anthropic and OpenAI and from companies building AI agents and other products on top of the popular models. Data center operators and infrastructure providers are trying to increase their computing prowess while keeping power consumption in check.

Microsoft is outfitting its U.S. Central region of data centers with Maia 200 chips, and they’ll arrive at the U.S. West 3 region after that, with additional locations to follow.

The chips are built on Taiwan Semiconductor Manufacturing Co.'s 3-nanometer process. Four chips are connected inside each server, linked over Ethernet rather than the InfiniBand standard. Nvidia sells InfiniBand switches following its 2020 acquisition of Mellanox.

The chip delivers 30% higher performance than alternatives at the same price, Guthrie wrote. Microsoft said each Maia 200 packs more high-bandwidth memory than Amazon Web Services' third-generation Trainium AI chip or Google's seventh-generation tensor processing unit.

Microsoft can achieve high performance by wiring up to 6,144 Maia 200 chips together, reducing energy usage and total cost of ownership, Guthrie wrote.

In 2023, Microsoft demonstrated that its GitHub Copilot coding assistant could run on Maia 100 processors.

WATCH: Chinese AI models adapt without Nvidia