Google pushes Gemini 3 ahead of GPT‑5 using its own TPU chips, forcing OpenAI into internal code‑red mode


Google just triggered a hard reset in the AI hardware war after its TPU chips pushed Gemini 3 past GPT‑5 in independent tests, landing a blow on OpenAI and Nvidia at the same time.

Gemini 3 ran mostly on Google’s tensor processing units (TPUs), not Nvidia GPUs. After the results landed, Sam Altman told staff to refocus on fixing ChatGPT and its core models.

The move followed what OpenAI called a “code red” moment last week. At the same time, analysts said Google is planning to more than double TPU production by 2028, as demand for in‑house AI chips keeps rising.

Google scales chips and pushes into outside sales

Google now plans to move beyond using TPUs only inside its own cloud. One recent deal alone gives Anthropic access to up to 1 million TPUs, an agreement valued in the tens of billions of dollars. That single contract shook Nvidia investors.

The concern is simple: if Google sells more TPUs to outside firms, Nvidia loses data‑center demand it would otherwise capture.

Chip analysts at SemiAnalysis now rank TPUs as “neck and neck with Nvidia” for both training and running advanced AI systems. Morgan Stanley says every 500,000 TPUs sold to outside buyers could generate up to $13 billion in revenue for Google.

The bank also expects TSMC to produce 3.2 million TPUs next year, rising to 5 million in 2027 and 7 million in 2028. Analysts said growth in 2027 now looks stronger than earlier forecasts.
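To put those numbers together, here is a rough back‑of‑the‑envelope sketch. The implied per‑unit revenue follows from the Morgan Stanley figure above; the share of production sold to outside buyers is a hypothetical assumption, not a reported number.

```python
# Back-of-the-envelope math from the figures cited above.
# The external-sale share is a hypothetical assumption, not from the article.

REVENUE_PER_500K_UNITS = 13e9                        # Morgan Stanley: ~$13B per 500,000 TPUs sold externally
revenue_per_tpu = REVENUE_PER_500K_UNITS / 500_000   # implied ~$26,000 per unit

tsmc_production = {2026: 3_200_000, 2027: 5_000_000, 2028: 7_000_000}  # forecast units cited above
external_share = 0.25                                # hypothetical fraction sold outside Google

for year, units in tsmc_production.items():
    external_units = units * external_share
    revenue = external_units * revenue_per_tpu
    print(f"{year}: {external_units:,.0f} TPUs sold externally -> ~${revenue / 1e9:.1f}B")
```

Summed over the three years, that illustrative scenario lands close to the $100 billion cumulative figure analysts cite below.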

Google builds its processors mainly with Broadcom, with added support from MediaTek. The company says its edge comes from full vertical control over hardware, software, and AI models within one system. Koray Kavukcuoglu, Google’s AI architect and DeepMind CTO, said, “The most important thing is that full stack approach. I think we have a unique approach there.”

He also said Google’s data from billions of users gives it deep insight into how Gemini works across products like Search and AI Overviews.

Nvidia shares fell last month after The Information reported that Meta had held talks with Google about buying TPUs. Meta declined to comment. Analysts now say Google could strike similar supply deals with OpenAI, Elon Musk’s xAI, or Safe Superintelligence, with potential added revenue topping $100 billion over several years.

Nvidia defends while the TPU story cuts deeper

Nvidia pushed back after the selloff. The company said it remains “a generation ahead of the industry” and “the only platform that runs every AI model.” It also said, “We continue to supply to Google.” Nvidia added that its systems offer “greater performance, versatility, and fungibility” than TPUs, which it says target specific frameworks.

At the same time, developers are gaining tools that ease the switch away from Nvidia’s CUDA software. AI coding tools now help rewrite workloads for TPU systems faster than before, which weakens one of Nvidia’s strongest lock‑in defenses.
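The article does not name specific tools, but as one illustration of the kind of hardware portability that erodes CUDA lock‑in, a framework such as Google’s JAX compiles the same Python code for whichever accelerator is available, with no hand‑written GPU or TPU kernels. A minimal sketch:

```python
# Minimal sketch (illustrative, not from the article): JAX runs the same
# code on CPU, Nvidia GPU, or TPU, because XLA compiles it for whatever
# backend the machine exposes.
import jax
import jax.numpy as jnp

@jax.jit                              # compiled by XLA for the local backend
def matmul_relu(a, b):
    return jnp.maximum(a @ b, 0.0)    # dense matmul followed by a ReLU

a = jnp.ones((1024, 1024), dtype=jnp.float32)
b = jnp.ones((1024, 1024), dtype=jnp.float32)

print(jax.devices())                  # e.g. a CUDA device or a TPU, depending on the machine
print(matmul_relu(a, b).shape)        # (1024, 1024), same result on any backend
```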

The TPU story began long before today’s AI boom. In 2013, Jeff Dean, now Google’s chief scientist, gave an internal talk after a breakthrough in deep neural networks for speech systems. Jonathan Ross, then a Google hardware engineer, recalled the moment: “The first slide was good news: machine learning finally works. Slide two said bad news: we can’t afford it.” Dean calculated that if hundreds of millions of users spoke to Google for three minutes a day, data‑center capacity would need to double, at a cost of tens of billions of dollars.

Ross began building the first TPU as a side project in 2013 while seated near the speech team. “We built that first chip with about 15 people,” he said in December 2023. Ross now runs AI chip firm Groq.

In 2016, AlphaGo, running on TPUs, defeated world Go champion Lee Sedol in a match that became a major AI milestone. Since then, TPUs have powered Google’s Search, ads, and YouTube systems.

Google used to update its TPUs every two years, but it shifted to an annual release cycle in 2023.

A Google spokesperson said demand is rising on both fronts. “Google Cloud is seeing growing demand for both our custom TPUs and Nvidia GPUs. We will continue supporting both,” the company said.
