Google Just Bet $40 Billion on Anthropic: How Claude Code Sparked the Largest AI Funding Frenzy in History
In the same week, Amazon committed $25 billion and Google followed with $40 billion. What makes Anthropic so irresistible to two tech giants at once? Two words: Claude Code.
The Scale of This Deal
On April 24, Google announced an investment of up to $40 billion in Anthropic — with a firm $10 billion cash commitment upfront, and the remaining $30 billion to be unlocked as Anthropic hits specific performance milestones. The round values Anthropic at $350 billion.
That same week, Amazon announced an additional $5 billion investment in Anthropic, with an option to invest up to $20 billion more.
Combined, Anthropic secured up to $65 billion in committed investment within a single week.
This isn’t a normal fundraise. This is the world’s two largest cloud providers making a head-on bet on the same AI startup.
Why Now? The Claude Code Breakout
To understand this round, you need to understand one thing: Claude Code is reshaping how Silicon Valley engineers actually write software.
Bloomberg’s reporting included a key line worth highlighting:
> Anthropic has ramped up its fundraising amid the breakout success of Claude Code, an AI agent that speeds the process of writing computer software.
Claude Code has become the go-to AI coding tool for engineers across the industry — including engineers inside Google itself. That reality has created real urgency at the top of Google’s leadership: they’re investing in Anthropic while simultaneously scrambling to catch up with their own AI coding products.
Meanwhile, Anthropic’s Cowork agent (a general-purpose AI agent for non-technical users) is also growing fast, broadening Anthropic’s user base well beyond developers.
Google Isn’t Just Writing a Check
The deal structure tells a bigger story. Beyond cash, Google is committing to:
- 5 gigawatts of compute capacity for Anthropic over the next five years (a major expansion of their previous agreement)
- Potentially additional gigawatts of compute on top of that
- For reference, 5 GW is roughly enough to power 3.75 million American homes simultaneously
Why does compute matter more than cash? Because the core bottleneck for training and running frontier models is compute. Google’s TPUs are one of the most credible alternatives to Nvidia GPUs — and nearly as scarce. Securing this capacity means Anthropic can keep training bigger, better models without getting dragged into the chip procurement war.
The Competitive Landscape in Numbers
| Round / Investor | This Round | Total Commitment (Up To) | Valuation |
|---|---|---|---|
| Google | $10B | $40B | $350B |
| Amazon (same week) | $5B | $25B | $350B |
| Feb 2026 round | $30B | — | $350B |
Here’s the kicker: when Anthropic closed its $30 billion round in February, it was already priced at $350 billion. But since then, secondary market buyers have been willing to pay valuations north of $800 billion. Google and Amazon locked in at $350 billion — a strategic bet in itself.
One more piece of context: Anthropic is reportedly considering an IPO as early as October this year. Back-to-back mega-rounds like this serve a dual purpose — stockpiling resources and setting the valuation narrative ahead of a public listing.
Can You Be Both Investor and Competitor?
There’s an inherent tension in this deal: Google is simultaneously one of Anthropic’s biggest backers and one of its most direct competitors.
Google has Gemini. It has Codey. It has its own AI coding toolchain. But its own engineers are using Claude Code. Its cloud customers are buying Claude API access. Its TPUs are powering Anthropic’s model training.
This kind of "cooperate and compete" dynamic has become the norm in AI. Bloomberg calls these "circular deals": big tech invests in AI startups, sells them chips and cloud services, and the startups use their funding to buy those very resources right back.
How long this loop can sustain itself is something market analysts are watching closely.
What This Means for Developers
From a developer’s perspective, this funding round has two practical implications:
1. Expect faster model iterations. More compute = shorter training cycles = more frequent model updates. Over the past quarter, Anthropic shipped Opus 4.6, Opus 4.7, and continued iterating on Sonnet and Haiku. With this level of compute backing, that pace will only accelerate.
2. Claude API reliability and availability should improve. As Anthropic’s infrastructure scales, API capacity and uptime get more headroom. For developers relying on Claude in production — whether through the official API or services like ClaudeAPI.com — a more robust upstream means a better experience downstream.
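To make the integration point concrete, here is a minimal sketch of calling Anthropic's Messages API with nothing but the Python standard library. The endpoint, header names, and request shape follow the public API docs; the default model name and token budget below are placeholder assumptions you should replace with whatever the current docs recommend.

```python
import json
import urllib.request

API_URL = "https://api.anthropic.com/v1/messages"  # Messages API endpoint

def build_request(prompt: str,
                  model: str = "claude-sonnet-4-5",  # placeholder model name
                  max_tokens: int = 1024) -> dict:
    # Minimal Messages API body: model, token budget, and one user turn.
    return {
        "model": model,
        "max_tokens": max_tokens,
        "messages": [{"role": "user", "content": prompt}],
    }

def send(body: dict, api_key: str) -> dict:
    # POST the body with the required auth and API-version headers.
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(body).encode(),
        headers={
            "x-api-key": api_key,
            "anthropic-version": "2023-06-01",
            "content-type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

body = build_request("Summarize this diff in one sentence.")
# response = send(body, api_key="YOUR_KEY")  # uncomment once you have a key
```

The official `anthropic` SDK wraps all of this for you; the raw-HTTP version is shown only to make clear how little surface area the integration actually has.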
If you haven’t integrated the Claude API yet, now is a good time to set up your toolchain while model capabilities are still rapidly improving. Visit claudeapi.com to get an API key with direct access from anywhere — no VPN required — and support for international payment methods including Alipay and WeChat Pay.
The Bottom Line
The signal from this funding round is clear: Claude Code has transformed Anthropic from “a promising AI company” into core infrastructure for engineering productivity. When engineers start depending on a product for their daily workflow, investment competition around that product gets fierce — fast.
Google and Amazon placing a double bet in the same week is essentially saying: we can’t let Anthropic fall into a competitor’s hands, and we can’t afford to sit out the AI coding race.
For developers, this is good news: Anthropic won’t be short on cash or compute anytime soon, and will keep investing aggressively in both models and products. Claude’s competitiveness isn’t going anywhere.