TPU Projects

TPU

TPU (Tensor Processing Unit) is Google's custom-designed ASIC, engineered for maximum efficiency in accelerating large-scale machine learning training and inference workloads.

The Tensor Processing Unit (TPU) is Google's application-specific integrated circuit (ASIC), built to accelerate demanding AI workloads, particularly those dominated by large matrix calculations. These accelerators are purpose-built for deep learning models, including large language models (LLMs), and power core Google products such as Gemini, Search, and Maps, which together serve over one billion users. Current generations, such as Cloud TPU v5p and v5e, offer cost-effective, scalable options for both training and inference. Unlike general-purpose GPUs, TPUs include specialized components such as the Matrix Multiply Unit (MXU) and SparseCores, delivering high throughput on high-volume, low-precision computation; an earlier-generation 256-chip pod reached up to 11.5 petaFLOPS.
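To put the quoted pod figure in perspective, a quick back-of-envelope calculation (a sketch based only on the numbers above, not an official per-chip spec) gives the implied throughput of each chip in that 256-chip pod:

```python
# Back-of-envelope: per-chip throughput implied by the quoted pod figure.
pod_flops = 11.5e15          # 11.5 petaFLOPS for the 256-chip pod
chips_per_pod = 256

per_chip_tflops = pod_flops / chips_per_pod / 1e12  # convert FLOPS -> teraFLOPS
print(f"~{per_chip_tflops:.1f} TFLOPS per chip")    # roughly 45 TFLOPS
```

This is simply the pod figure divided evenly across chips; real sustained performance per chip depends on precision, workload, and interconnect utilization.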

https://cloud.google.com/tpu
1 project · 1 city

Related technologies

Recent Talks & Demos

