What Is a TPU?

So you’ve known the CPU, and sort of the GPU, since the days of your TI-99/4A, TRS-80, and Commodore 64. But what is this TPU that’s supposed to save the world?

1. CPU (Central Processing Unit)
The Generalist
The CPU is the “conductor” of the orchestra. It is designed to be versatile and handle a wide variety of tasks one after another (serial processing).
• How it works: It has a small number of very powerful cores that can handle complex logic and branching (if/then/else instructions).
• Real-world use: Running Windows/macOS, opening a spreadsheet, browsing the web, and managing the computer’s input/output.
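To make the "complex logic and branching" point concrete, here is a small illustrative sketch (not tied to any particular CPU) of the kind of serial, branch-heavy work a CPU core excels at. Each step depends on the result of the previous one and takes a different path, so it cannot be split into thousands of identical parallel pieces:

```python
# Branch-heavy, inherently serial work: each iteration depends on the
# previous value and chooses a different path (if/then/else) every step.
def collatz_steps(n):
    """Count steps for n to reach 1 under the Collatz rule."""
    steps = 0
    while n != 1:
        if n % 2 == 0:      # branch taken depends on data computed so far
            n //= 2
        else:
            n = 3 * n + 1
        steps += 1
    return steps

print(collatz_steps(27))  # 111 steps
```

This is exactly the shape of workload where a few powerful cores with good branch prediction beat thousands of simple ones.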
2. GPU (Graphics Processing Unit)
The Multi-tasker
Originally designed for rendering video games, GPUs turned out to be amazing at doing simple math problems in parallel.
• How it works: Instead of a few powerful cores, it has thousands of tiny, simple cores. It breaks a complex task into thousands of smaller pieces and solves them all at the exact same time.
• Real-world use: Rendering 3D games, video editing, and training general AI models (like ChatGPT in its early stages).
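The serial-versus-parallel contrast above can be sketched in plain Python. This is illustrative, not real GPU code: NumPy's vectorized operation stands in for the thousands of simple cores applying the same math to every element at once.

```python
import numpy as np

data = np.arange(1_000_000, dtype=np.float64)

# CPU-style: one element after another, a single loop
serial_result = np.empty_like(data)
for i in range(len(data)):
    serial_result[i] = data[i] * 2.0 + 1.0

# GPU-style (conceptually): the same simple math applied to the
# whole array in one vectorized step
parallel_result = data * 2.0 + 1.0

assert np.array_equal(serial_result, parallel_result)
```

The two results are identical; the difference is that the second form exposes the work as a million independent, identical pieces, which is precisely what GPU hardware is built to consume.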
3. TPU (Tensor Processing Unit)
The Specialist
The TPU is a custom chip (ASIC) designed by Google specifically for machine learning and deep learning. It is not sold to consumers as a component; it is accessed primarily through Google Cloud.
• How it works: It is stripped of features needed for graphics or general computing. Instead, it uses dedicated matrix-processing hardware (systolic arrays) to perform massive amounts of multiplication and addition simultaneously, which is exactly what neural networks need.
• Real-world use: Powering Google Search, Google Photos, Voice Search, and training massive AI models where speed and power efficiency are critical.
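To see why "multiplication and addition" is the whole game, here is a toy sketch of the multiply-accumulate (MAC) pattern a systolic array performs. Each hardware cell in the array repeatedly does one multiply-and-add; a real TPU wires thousands of these cells into a grid so data flows through in lockstep. This is plain Python for illustration only, not how a TPU is actually programmed:

```python
def matmul_mac(A, B):
    """Matrix multiply built purely from multiply-accumulate steps,
    the single operation a systolic-array cell performs in hardware."""
    n, k = len(A), len(A[0])
    m = len(B[0])
    C = [[0.0] * m for _ in range(n)]
    for i in range(n):
        for j in range(m):
            acc = 0.0
            for p in range(k):          # the MAC loop: multiply, then add
                acc += A[i][p] * B[p][j]
            C[i][j] = acc
    return C

# A neural-network layer is essentially y = W @ x (plus a bias),
# i.e. exactly this pattern, repeated billions of times.
W = [[1.0, 2.0], [3.0, 4.0]]
x = [[5.0], [6.0]]
print(matmul_mac(W, x))  # [[17.0], [39.0]]
```

Because the TPU hardwires only this one pattern, it can skip caches, branch predictors, and graphics pipelines, spending nearly all its silicon and power budget on the math itself.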

__________

Linas Beliūnas:

Incredible: Google may have just pulled off the quietest power move in AI, and almost nobody seems to grasp its significance 😳

Google’s latest AI model Gemini 3 Pro is now sitting at the top of the major leaderboards.

Not by a little, but clearly ahead on LMArena, WebDev Arena, and Vision Arena – in a year where the frontier is brutally competitive.

But here’s the part that actually matters:

Google trained it entirely on TPUs.

No NVIDIA.
No GPUs.
No external silicon.

Sure, that’s not new, as Google has used TPUs since Gemini 1.0…

But this marks the first time a TPU-trained model has matched and, in some areas, surpassed the strongest GPU-trained systems from OpenAI, Anthropic, and xAI in the same generation 🤯

Most importantly, it validates something Google has been quietly betting billions on:

Custom silicon + mature software + vertically integrated infrastructure can compete at the absolute frontier.

In other words, you don’t need NVIDIA to build a world-class frontier AI model anymore.

Turns out, the next Google is Google.

See post on LinkedIn