
Google Takes Aim at Nvidia With New Tensor Chips to Power AI Boom

By Decrypt Agent · Published April 23, 2026 · 2 min read · Source: Decrypt

Google’s new processors target massive model training and the emerging AI agent economy, offering distinct builds for both needs.

Edited by Andrew Hayward
Google. Source: Decrypt/Shutterstock

In brief

Google unveiled two AI processors at its Cloud Next 2026 conference in Las Vegas on Wednesday, marking the company's eighth generation of custom silicon designed to challenge Nvidia's AI chip dominance.

The training-focused TPU 8t delivers nearly 3x the compute performance per pod compared to its predecessor, with a single superpod scaling to 9,600 chips and delivering 121 ExaFlops of compute capacity. The architecture also offers 2.8x better price-to-performance, according to Google.
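Taking the stated superpod figures at face value, a quick back-of-the-envelope sketch shows what they imply per chip (these are derived numbers, not figures Google reported):

```python
# Per-chip compute implied by the reported TPU 8t superpod figures
# (assumed accurate as stated; Google did not publish a per-chip number).
SUPERPOD_CHIPS = 9_600       # chips in a single superpod
SUPERPOD_EXAFLOPS = 121      # stated aggregate compute capacity

exaflops_per_chip = SUPERPOD_EXAFLOPS / SUPERPOD_CHIPS
petaflops_per_chip = exaflops_per_chip * 1_000  # 1 ExaFLOP = 1,000 PetaFLOPs

print(f"{petaflops_per_chip:.1f} PFLOPS per chip")  # ≈ 12.6 PFLOPS
```

Roughly 12.6 PetaFLOPs per chip, assuming compute scales evenly across the pod.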

The TPU 8i takes a different approach, optimizing for inference workloads with 3x more on-chip SRAM than previous generations—384 MB of on-chip SRAM paired with 288 GB of high-bandwidth memory. The chip delivers up to 80% better performance per dollar and 2x the performance per watt, the company claimed.
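Reading "3x more on-chip SRAM" as three times the previous capacity (an interpretation, not a stated figure), the article's memory numbers imply the following:

```python
# Memory figures implied by the TPU 8i description in the article.
SRAM_MB = 384        # stated on-chip SRAM
SRAM_MULTIPLIER = 3  # reading "3x more" as 3x the previous capacity
HBM_GB = 288         # stated high-bandwidth memory

prev_sram_mb = SRAM_MB / SRAM_MULTIPLIER        # implied previous-gen SRAM
hbm_to_sram_ratio = (HBM_GB * 1_024) / SRAM_MB  # HBM capacity vs. SRAM

print(f"previous-gen SRAM ≈ {prev_sram_mb:.0f} MB")  # 128 MB
print(f"HBM is {hbm_to_sram_ratio:.0f}x the SRAM")   # 768x
```

That 768:1 HBM-to-SRAM ratio is why the larger SRAM matters for inference: it keeps more of a model's hot working set on-chip instead of round-tripping to HBM.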

Both chips leverage Google's new Boardfly architecture, which achieves up to a 50% improvement in latency for communication-intensive workloads by reducing network diameter, the technical documentation shows.

The hardware announcement follows Google's expanded partnership with Anthropic earlier this month, which will provide the AI startup with multiple gigawatts of next-generation TPU capacity. The deal highlights how Google is leveraging its custom silicon to attract major AI companies seeking alternatives to Nvidia's GPUs in the increasingly competitive infrastructure market.

Google CEO Sundar Pichai positioned the chips as purpose-built for AI agents, saying they deliver the massive throughput and low latency needed to run millions of agents concurrently and cost-effectively. The company has already secured adoption from Citadel Securities, with the financial services firm choosing TPUs to power its AI workloads.

The dual-chip strategy reflects the diverging computational needs of modern AI systems: massive parallel processing for training frontier models versus rapid, memory-intensive operations for deploying those models as interactive agents.

Pichai said Wednesday that Google is on track to spend up to $185 billion this year alone to power AI infrastructure for the “agentic era,” with the firm already generating nearly 75% of its new code with AI under the watchful eye of engineers.

