Microsoft rolls out next generation of its AI chips, takes aim at Nvidia's software

Microsoft's Maia 200 chip offers 30% better performance per dollar, delivers more than 10 petaflops at 4-bit precision, and powers services such as Azure and GPT-5.2.
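To make the performance-per-dollar claim concrete: a 30% gain means the same spend buys about 1.3 times the inference throughput, or the same throughput costs roughly 23% less. The short Python sketch below is just that arithmetic; the 0.30 ratio is the only figure taken from the article, and the function name is ours.

```python
# Minimal sketch of what "30% better performance per dollar" implies.
# Only the 0.30 gain comes from the article; everything else is illustrative.

def equivalent_gains(perf_per_dollar_gain: float = 0.30):
    """Translate a perf-per-dollar gain into two equivalent readings."""
    ratio = 1.0 + perf_per_dollar_gain       # 1.30x performance per dollar
    same_budget_speedup = ratio              # same spend -> 1.30x throughput
    same_work_cost_factor = 1.0 / ratio      # same throughput -> ~0.77x cost
    return same_budget_speedup, same_work_cost_factor

speedup, cost_factor = equivalent_gains()
print(f"Same budget: {speedup:.2f}x the inference throughput")
print(f"Same throughput: {cost_factor:.0%} of the previous cost "
      f"(about {1 - cost_factor:.0%} savings)")
```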

  • On January 26, Microsoft unveiled Maia 200 in San Francisco and is deploying it this week in its Iowa data center as "the most efficient inference system Microsoft has ever deployed."
  • Tech giants are designing their own chips to cut reliance on Nvidia, and Microsoft built Maia 200 to compete with Amazon Web Services and Google while addressing surging demand from generative AI developers.
  • Built on Taiwan Semiconductor Manufacturing Co.'s 3-nanometer process, Maia 200 contains over 100 billion transistors, delivers more than 10 PFLOPS at 4-bit precision and around 5 PFLOPS at higher precision, and links four chips per server, with up to 6,144 chips wired together (see the back-of-the-envelope sketch after this list).
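Taking the cited figures at face value, the per-server and full-cluster numbers multiply out as in the sketch below. It assumes the "over 10 PFLOPS" figure is per chip at 4-bit precision and ignores utilization, interconnect overhead, and any real-world scaling losses, none of which the article quantifies.

```python
# Back-of-the-envelope scale arithmetic from the cited Maia 200 figures.
# Assumptions (not stated in the article): the ~10 PFLOPS figure is per chip
# at 4-bit precision, scaling is perfect, and utilization is 100%.

PFLOPS_PER_CHIP_FP4 = 10        # "over 10 PFLOPS" at 4-bit precision
CHIPS_PER_SERVER = 4            # four chips linked per server
MAX_CHIPS_WIRED_TOGETHER = 6_144

servers = MAX_CHIPS_WIRED_TOGETHER // CHIPS_PER_SERVER
server_pflops = PFLOPS_PER_CHIP_FP4 * CHIPS_PER_SERVER
cluster_pflops = PFLOPS_PER_CHIP_FP4 * MAX_CHIPS_WIRED_TOGETHER

print(f"Servers in a maxed-out cluster: {servers:,}")          # 1,536
print(f"Per-server 4-bit compute: {server_pflops} PFLOPS")     # 40 PFLOPS
print(f"Cluster 4-bit compute: {cluster_pflops:,} PFLOPS "
      f"(~{cluster_pflops / 1000:.0f} exaFLOPS)")              # ~61 exaFLOPS
```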

74 Articles

Bias Distribution

  • 62% of the sources are Center

Forbes broke the news in the United States on Monday, January 26, 2026.