
[Global Times Comprehensive Report] "Google may help OpenAI reduce its reliance on Nvidia," The Indian Times reported on the 28th: OpenAI has for the first time adopted Google's AI chips to supply computing power for ChatGPT and its other products. Until now, OpenAI has been one of the largest purchasers of Nvidia's graphics processing units (GPUs). The report suggests the AI company is diversifying its chip suppliers to reduce its dependence on Nvidia.
According to a Reuters report on the 27th, the rental of Google's TPUs marks OpenAI's first substantial use of non-Nvidia chips and reflects the company's shift away from relying on Microsoft's data centers. Previously, OpenAI obtained Nvidia chips for model training and deployment mainly through partnerships with Microsoft and Oracle.
The US technology industry outlet The Information believes the move could position Google's TPUs as a more cost-effective alternative to Nvidia's GPUs. However, sources say that although OpenAI aims to lower inference costs by renting TPUs, Google has not leased its most powerful version to the company, suggesting that Google's most advanced TPUs remain reserved for internal use.
For Google, Reuters analysis suggests the collaboration coincides with its push to supply its own TPUs to outside customers. The chips were previously used mainly in-house before being opened to external clients, attracting tech companies including Apple and the US AI startup Anthropic.
Currently, leading tech companies are strengthening their control over computational infrastructure through self-developed chips.

According to an earlier report by The New York Times, Amazon, AMD, and several startups have begun offering credible alternatives to Nvidia chips, especially for the "inference" stage of AI development. Among them, Amazon announced the launch of computing services based on its new Trainium 2 chips, which have drawn positive feedback from prospective customers including Apple.
On the 12th of this month, AMD unveiled its new MI350 series chips and an MI400 series slated for launch in 2026, positioning them against Nvidia's Blackwell series. According to Reuters, AMD CEO Lisa Su said these chips will compete with Nvidia's Blackwell line. AMD has struggled to win share of the rapidly growing AI chip market from Nvidia. "The future of AI will not be built by any single company or closed ecosystem; it will be shaped by open collaboration across the industry," Su said.
However, according to a report on the 27th by Canada's tech media website Wccftech, although technology giants are investing heavily in developing their own chips to reduce reliance on Nvidia, Nvidia's AI chips still hold a leading position in the industry. Microsoft's ambition to challenge Nvidia's dominance in AI has been shaken: its first in-house AI chip, "Braga", has been delayed because its performance fell short of expectations, and preliminary performance evaluations indicate it cannot surpass Nvidia's Blackwell chips released in 2024.
