Anthropic Inks New Deal With Google for a Million Cloud TPUs to Supercharge Claude AI
The AI startup strengthens multi-cloud strategy amid growing competition
Anthropic has announced plans to expand its use of Google Cloud TPUs and services, adding up to one million of the AI accelerators in a deal worth tens of billions of dollars. The company says the move will bring a gigawatt of compute capacity online in 2026.
Anthropic has long relied on Google’s Tensor Processing Units (TPUs) to train its Claude AI models. Thomas Kurian, Google Cloud CEO, said this latest expansion reflects “the strong price-performance and efficiency Anthropic teams have seen with TPUs for several years.”
Anthropic, which now serves over 300,000 business customers, has seen a sevenfold increase in large enterprise accounts over the past year. With demand surging, the new TPU capacity will support more thorough testing, alignment research, and large-scale deployment of Claude.
Speaking about the deal, Anthropic CFO Krishna Rao said:
Anthropic and Google have a longstanding partnership, and this latest expansion will help us continue to grow the compute we need to define the frontier of AI. Our customers—from Fortune 500 companies to AI-native startups—depend on Claude for their most important work, and this expanded capacity ensures we can meet our exponentially growing demand while keeping our models at the cutting edge of the industry.
In addition to Google’s TPUs, Anthropic also uses Amazon’s Trainium chips and NVIDIA’s GPUs. This multi-platform approach gives the company the flexibility to match each workload to the hardware that offers the best efficiency and performance.
The company continues to collaborate closely with Amazon, its primary training partner, on Project Rainier. Anthropic’s expansion comes as rival OpenAI entered a $100 billion GPU investment deal with NVIDIA to build next-generation AI infrastructure. However, that deal comes with a chip leasing twist.