OpenAI says it has no plan to use Google's in-house chip
- As of July 1, 2025, OpenAI confirmed that although it has begun initial testing with Google's in-house AI chips, it does not intend to deploy them widely in its products at this time.
- This statement followed reports that OpenAI was considering Google's tensor processing units (TPUs) to meet growing compute demand.
- OpenAI currently relies on Nvidia GPUs, AMD AI chips, and GPU servers from CoreWeave while developing its own chip, whose design is expected to be finalized this year.
- Earlier this month, OpenAI began using Google Cloud to expand its computing capacity, an unexpected partnership given the two companies' rivalry in the AI industry.
- The company's decision not to scale up Google's TPUs for now suggests challenges in adapting to a different chip architecture, as well as a continued focus on diversified hardware and in-house development.
18 Articles
OpenAI says no plans to use Google's AI chips at scale
OpenAI said it has no active plans to use Google's in-house chip to power its products, two days after several news outlets reported the AI lab was turning to its competitor's artificial intelligence (AI) chips to meet growing demand.

OpenAI says it has no plan to use Google's in-house chip
OpenAI said it has no active plans to use Google's in-house chip to power its products, two days after Reuters and other news outlets reported on the AI lab's move to turn to its competitor's artificial intelligence chips to meet growing demand. A spokesperson for OpenAI said on Sunday that while the AI lab …
OpenAI Confirms It Won’t Scale Use of Google’s In-House Chips
OpenAI has clarified that, despite initial trials, it does not currently plan to deploy Google's Tensor Processing Units (TPUs) at scale for its AI services. The announcement follows reports suggesting the company was exploring TPUs to support growing compute needs. A company spokesperson said that OpenAI is in early testing with some of Google's chips but has no active plans to roll them out broadly. For now, the company continues to rely …


With OpenAI, there are no allegiances - just compute at all costs
Google's TPUs might not be on Altman's menu just yet, but he's never been all that picky about hardware. Analysis: No longer bound to Microsoft's infrastructure, OpenAI is looking to expand its network of compute providers to the likes of Oracle, CoreWeave, and apparently even rival model builder Google. …
Coverage Details
Bias Distribution
- 57% of the sources are Center