
Every computing revolution started the same way — an application outgrew the hardware. Here’s the pattern, explained simply.
There are four letters that tell the entire story of modern computing.
CPU. GPU. TPU. QPU.
Each one was born the same way: an application got so big, so demanding, that the existing chip couldn’t keep up. So someone built a new one. And a giant company emerged.
This has happened three times already. It’s happening a fourth time right now. And almost nobody is connecting the dots.
The CPU: Microsoft made it necessary
In 1981, a 25-year-old named Bill Gates made the most consequential business deal in computing history.
IBM wanted an operating system for their new personal computer. Gates licensed them one, a system called MS-DOS. But he added one clause: Microsoft kept the right to license the same software to anyone else.
That clause changed everything.
Within a decade, dozens of companies were building PCs that ran Microsoft’s software. And every one of those PCs needed a processor — a CPU. Intel made those processors.
By the mid-1990s, Windows ran on more than 90% of the world's computers. Cumulative PC sales passed 1 billion by 2002. Intel held roughly 90% of the CPU market and, at its 2000 peak, was worth $509 billion.
The lesson: Intel didn’t create the demand for PCs. Microsoft did. Office software, email, spreadsheets: those were the reasons people bought computers. The CPU was just the chip that made it possible.
Application → created demand → new chip became essential → a giant emerged.
The GPU: First crypto, then AI made it necessary
Fast forward to 2009. Bitcoin launches. And people stumble onto a discovery.
Graphics cards, originally built to render video game graphics, turned out to be incredible at the repetitive math that cryptocurrency mining requires. A regular CPU works through calculations a few at a time. A GPU handles thousands simultaneously. Perfect for mining.
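To make the distinction concrete, here is a minimal Python sketch. It uses NumPy's batched array operations to stand in for GPU-style data parallelism; the timings are illustrative, and a real GPU would push the gap far wider.

```python
import time
import numpy as np

n = 10_000_000

# CPU-style serial processing: one multiply-add per loop iteration.
start = time.perf_counter()
total = 0.0
for i in range(n):
    total += i * 3.0 + 1.0
serial_seconds = time.perf_counter() - start

# GPU-style data parallelism: the same n independent operations
# expressed as one batched array computation.
start = time.perf_counter()
x = np.arange(n, dtype=np.float64)
total_batched = float((x * 3.0 + 1.0).sum())
batched_seconds = time.perf_counter() - start

print(f"serial:  {serial_seconds:.2f}s")
print(f"batched: {batched_seconds:.2f}s")  # typically 50-100x faster
```

Crypto mining is exactly this shape of workload: millions of independent hash attempts, none of which depends on any other.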
The numbers were staggering. By 2021, crypto miners were buying 25% of all graphics cards manufactured worldwide. They spent $15 billion on GPUs. Prices tripled. If you wanted a graphics card for gaming, you were out of luck.
But here’s what most people get wrong: crypto didn’t make NVIDIA the most valuable company in the world. AI did.
When Ethereum switched to proof-of-stake in September 2022 and ended GPU mining, the crypto GPU market collapsed overnight. NVIDIA barely noticed. Because a much larger wave had arrived.
Deep learning, the technology behind ChatGPT, Claude, and every modern AI, needs the same kind of parallel processing, but at a scale crypto never approached. After ChatGPT launched, NVIDIA’s data center revenue grew from $3.6 billion to $51.2 billion per quarter in three years. Their market cap climbed from $1 trillion to $4.5 trillion in a little over two years. They now control roughly 92% of the discrete GPU market.
Crypto was the spark. AI was the wildfire.
Application → created demand → new chip became essential → a giant emerged.
The TPU: The wave everyone forgets
This is where the story gets really interesting. And it’s the chapter most people skip entirely.
In 2016, Google’s AI team was training massive neural networks on GPUs. And they hit a realization: GPUs aren’t designed for this. They were built for rendering 3D graphics. Yes, they happen to be good at the parallel math AI needs. But they’re not optimized for it.
So Google did something radical. They designed their own chip from scratch, built specifically for the math that AI training and inference require. They called it the Tensor Processing Unit. The TPU.
This is not a minor footnote. This is a paradigm shift.
Think about what Google was saying: the GPU, the most important chip of the past decade, isn’t good enough for AI anymore. We need something purpose-built.
And they were right.
Today, Claude, built by Anthropic and arguably the most capable AI system in the world, runs in large part on TPUs rather than GPUs. When you ask Claude to analyze a document, write code, or reason through a problem, much of that computation happens on Google’s custom-designed chips.
In October 2025, Anthropic signed the largest TPU deal in history: access to up to 1 million TPU chips from Google Cloud, backed by over 1 gigawatt of electrical power, enough to run a small city. For one AI company.
Google is now on its 7th generation TPU, called Ironwood. Microsoft is building its own AI chip (Maia). Amazon has Trainium. The entire industry looked at GPUs and said: not enough.
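To see what “purpose-built for tensor math” means, look at the workload itself. The heart of AI training and inference is dense matrix multiplication. Here is a minimal sketch in JAX, the Python framework Google develops alongside TPUs; the shapes are arbitrary, and the same code compiles unchanged for CPU, GPU, or TPU through the XLA compiler.

```python
import jax
import jax.numpy as jnp

# One neural-network layer: multiply activations by a weight matrix,
# then apply a nonlinearity. Modern AI models stack thousands of
# operations of exactly this shape.
def layer(weights, activations):
    return jax.nn.relu(activations @ weights)

key = jax.random.PRNGKey(0)
weights = jax.random.normal(key, (4096, 4096))
activations = jax.random.normal(key, (8, 4096))

# jax.jit compiles the function through XLA, the same compiler stack
# that targets TPUs; the hardware becomes an execution detail.
fast_layer = jax.jit(layer)
print(fast_layer(weights, activations).shape)  # (8, 4096)
```

A TPU is, roughly, this one operation cast into silicon: a systolic array that does almost nothing except multiply and accumulate matrices, very fast.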
Application (AI training) → outgrew the existing chip (GPU) → new chip born (TPU) → a new infrastructure giant emerged (Google’s cloud AI business).
The QPU: The wave that’s starting right now
And now we arrive at the present moment. And the question that’s worth trillions of dollars.
AI is outgrowing everything.
Not just CPUs. Not just GPUs. Even the TPUs that Google designed specifically for AI are being pushed to their limits. The demand is simply unprecedented.
Here are the numbers:
- NVIDIA made $57 billion in a single quarter, and their chips are still sold out.
- Anthropic needs up to 1 million TPU chips, over 1 gigawatt, and they’re just one of many companies deploying AI at scale.
- McKinsey projects $6.7 trillion in new data center infrastructure will be needed by 2030.
- AI data centers could consume up to 9% of U.S. electricity within five years.
There is a ceiling. CPUs, GPUs, and TPUs are all built on the same foundation: silicon transistors governed by classical physics. You can make them smaller. You can make them more specialized. But you cannot break the laws of thermodynamics. At some point, the heat is unmanageable, the electricity isn’t there, and the physics simply says: no further.
This is exactly the kind of pressure that has, three times before, forced a new computing paradigm into existence.
Enter the QPU, the quantum processing unit.
Quantum computers use qubits instead of bits. A regular bit is either 0 or 1. A qubit can exist in a superposition of both at once. This isn’t a metaphor; it’s quantum mechanics. And it means that for certain types of calculations, a quantum computer can find solutions that would take a classical machine longer than the age of the universe.
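For readers who want the idea without the mysticism, here is a minimal NumPy sketch of a single qubit. It is a simulation on classical hardware, of course, which is exactly what stops working at scale: simulating n qubits takes 2^n amplitudes.

```python
import numpy as np

# A classical bit is 0 or 1. A qubit is a unit vector in C^2:
# |psi> = a|0> + b|1>, where |a|^2 + |b|^2 = 1.
ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

# An equal superposition: "both at once", with definite amplitudes.
psi = (ket0 + ket1) / np.sqrt(2)

# Measurement probabilities follow the Born rule:
# the squared magnitude of each amplitude.
probs = np.abs(psi) ** 2
print(probs)  # [0.5 0.5] - a 50/50 chance of reading 0 or 1

# The catch: simulating n qubits classically takes 2**n amplitudes.
# At 16 bytes per complex number, 50 qubits is ~18 petabytes of state.
print(2**50 * 16 / 1e15, "PB")  # ~18.0 PB
```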
Until recently, this was laboratory science. Interesting, but impractical. That’s changing fast.
Google’s Willow chip, 105 qubits, solved a problem 13,000 times faster than the world’s most powerful classical supercomputer. Published in Nature. Not a marketing claim.
IBM has publicly stated that 2026 will be the first year a quantum computer outperforms classical computers on a real-world problem. Their latest processor, Nighthawk, arrived a year ahead of schedule.
PsiQuantum raised $1 billion in September 2025 to build a million-qubit machine. NVIDIA, the company that dominates GPUs, invested in the round.
And this is the detail that tells you where things are heading: NVIDIA launched NVQLink in October 2025, a hybrid architecture that connects GPUs directly to quantum processors. The two types of chips, working together in the same system.
NVIDIA isn’t fighting quantum. NVIDIA is integrating with quantum. Because they see the same pattern we’re describing: when the application outgrows the hardware, you don’t fight the next paradigm. You ride it.
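What does hybrid GPU-plus-QPU code even look like? Here is a minimal sketch in CUDA-Q, NVIDIA’s open-source programming model for hybrid quantum-classical systems, the software side of this integration push. It assumes the cudaq Python package is installed, and it illustrates the programming model, not NVQLink’s actual interface.

```python
import cudaq

# The "hello world" of quantum programs: a two-qubit Bell state.
@cudaq.kernel
def bell():
    qubits = cudaq.qvector(2)
    h(qubits[0])                  # superposition on qubit 0
    x.ctrl(qubits[0], qubits[1])  # entangle qubit 1 with qubit 0
    mz(qubits)                    # measure both qubits

# cudaq.sample executes the kernel repeatedly. On a workstation the
# backend is a GPU-accelerated simulator; the same code can be
# pointed at a physical QPU.
counts = cudaq.sample(bell, shots_count=1000)
print(counts)  # roughly half '00' and half '11', never '01' or '10'
```

The point of the hybrid design: the classical chips orchestrate, the quantum chip handles the piece only it can do.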

The full pattern
| Wave | Year | App that outgrew the chip | New chip | Giant created |
| --- | --- | --- | --- | --- |
| 1 | 1981 | Office software (Microsoft) | CPU | Intel ($509B) |
| 2 | 2009 | Crypto, then AI | GPU | NVIDIA ($4.5T) |
| 3 | 2016 | AI training (too big for GPUs) | TPU | Google Cloud AI |
| 4 | 2024→ | AI agents (too big for everything) | QPU | ??? |
Notice the acceleration:
- CPU to GPU: ~28 years
- GPU to TPU: ~7 years
- TPU to QPU: ~8 years (and counting)
The gap between waves has collapsed from nearly three decades to under a decade. Each wave mobilizes more capital. Each wave creates a bigger giant.
Intel peaked at $509 billion. NVIDIA reached $4.5 trillion, almost nine times bigger. If the pattern holds, the quantum giant could be the largest company in history.
The honest caveat
I believe this pattern is real. But I also believe the best ideas survive honesty.
Quantum computing is early. Jensen Huang, NVIDIA’s CEO, said at CES 2025 that truly useful quantum computing is 15 to 30 years away. The entire quantum industry generated less than $1 billion in revenue in 2024. McKinsey projects a $97 billion quantum market by 2035: significant, but still small compared to today’s GPU economy.
Nobody has identified quantum’s “killer application” with certainty. The leading candidates are molecular simulation (drug discovery), optimization (finance, logistics), and cryptography. AI agent inference is a possibility but not a proven use case for quantum yet.
But here’s what I keep coming back to.
In 1981, nobody thought a software licensing clause would create a trillion-dollar industry.
In 2012, nobody thought a gaming chip would power a $4.5 trillion company.
In 2016, nobody outside Google thought we’d need an entirely new chip just for AI.
The next paradigm shift doesn’t announce itself. It builds quietly, in quantum labs, in hybrid architectures, in billion-dollar funding rounds, until one day the whole world wakes up and says: oh, that’s obvious.
We’re in the quiet part right now.
Claude, GPT, and Gemini are pushing computation to its classical limits. Google, IBM, and NVIDIA are building the quantum bridge. The capital is flowing. The physics is proven. The pattern is repeating.
The question is simple:
Who becomes the next NVIDIA?
That answer is being decided right now. In labs, in cloud partnerships, in architectural choices being made this year.
Most people won’t see it until it’s too late to matter.
Younes Quorsane CEO & CTO, YMH INNOVATION — Casablanca Multi-agent AI systems architect | GCRAI Ambassador MENA
Sources: DOJ v. Microsoft (1999), NVIDIA SEC filings (Q3 FY2026), Jon Peddie Research, Bloomberg, McKinsey Quantum Technology Monitor (2025), Google Cloud, Anthropic, IBM Research, Nature, Gartner, Bain & Company.