Google Teams With Marvell to Build Two New AI Chips – MPU & Next-Gen TPU

Release date: 2026-04-22

Google is reportedly in advanced talks with Marvell to co‑develop two new AI chips, directly challenging NVIDIA's dominance.

The first is a Memory Processing Unit (MPU) – designed to break the memory bottleneck in AI computing, working alongside Google’s existing TPU. The second is a next‑generation TPU optimized specifically for AI inference workloads.


Target: complete MPU design and begin trial production by 2027. Google plans to produce nearly 2 million MPUs that year, complementing roughly 6 million TPUs – building a full‑scale AI compute backbone.

Google has long pushed TPU as an alternative to NVIDIA GPUs. It now plans to license TPU IP to Anthropic, Meta and others. By 2027, TPU licensing could bring over $10 billion in high‑margin annual revenue.

Broadcom had been Google's sole TPU design partner, but rising licensing costs pushed Google to diversify. Marvell won the deal thanks to its custom silicon expertise and its Structera CXL solution for rack‑level memory pooling. Marvell's custom XPU business already serves Amazon and Microsoft, generating $1.5 billion in annualized revenue.

With inference now accounting for over 60% of total AI compute cost, this partnership targets the fastest‑growing battlefield in AI silicon.

ICgoodFind: Google and Marvell take direct aim at inference and memory bottlenecks – two critical levers for AI's next phase. We track custom AI chip supply chains.
