Amazon Invests $5 Billion in Anthropic, Secures $100 Billion AWS Cloud Commitment
Image: TradingView


21 April 2026 · Technology and Science · 15 sources

Key Takeaways

  • Amazon commits $5B upfront to Anthropic, up to $25B total.
  • Up to 5 GW of Trainium-based compute for Claude on AWS.
  • Anthropic will spend over $100B on AWS over the next decade.

Deal locks compute

Amazon and Anthropic announced an expanded partnership that ties a new investment to a long-term cloud spending commitment, with the companies framing it as an infrastructure exchange rather than a simple equity bet.

Anthropic to secure up to 5 gigawatts (GW) of current and future generations of Amazon’s Trainium chips to train and power their advanced AI models

About Amazon

TechCrunch reported that Amazon agreed to invest a fresh $5 billion, bringing Amazon’s total investment in Anthropic to $13 billion, while Anthropic agreed to spend over $100 billion on AWS over the next 10 years.

Image from About Amazon

TechCrunch added that the deal provides “up to 5 GW of new computing capacity to train and run Claude,” and that it covers Trainium2 through Trainium4 chips, even though “Trainium4 chips are not currently available.”

The Financial Express similarly described an immediate $5 billion investment and said Amazon could invest up to an additional $20 billion depending on commercial milestones, while Anthropic pledged to purchase massive computing capacity from AWS, including up to 5 gigawatts dedicated to training and running Claude AI models.

GeekWire said the arrangement mirrors Amazon’s earlier OpenAI cloud deal, describing “up to $25 billion in new investment and a $100 billion-plus commitment to AWS over 10 years,” and noted that the full Claude Platform would be available directly within AWS.

In parallel, PYMNTS reported that AWS customers will be able to access the full Anthropic-native Claude console “from within AWS,” using existing AWS contract terms “with no additional credentials, contracts or billing relationships.”

Chips and capacity

The expanded agreement centers on Amazon’s custom AI silicon and the compute capacity Anthropic will receive to train and deploy Claude.

TechCrunch wrote that “At the heart of this deal is Amazon’s custom chips: Graviton (a low-power CPU) and Trainium (an Nvidia competitor and AI accelerator chip),” and it specified that “The Anthropic deal specifically covers Trainium2 through Trainium4 chips.”

Image from Anthropic

Techzine Global described the same chip line in more operational terms, saying the commitment involves “Trainium2, Trainium3, and Trainium4 chips,” along with “tens of millions of Graviton cores,” and it added that Anthropic aims to secure “up to 5 gigawatts (GW) of computing capacity for training and running Claude models.”

The Financial Express reported that the deal features “the Trainium2, and the upcoming Trainium3 and Trainium4 accelerators,” and it stated that Anthropic said the agreement will bring “nearly 1 GW of Trainium2 and Trainium3 capacity online by the end of 2026.”

Chosunbiz described the same structure as a circular commitment, saying Amazon could increase the investment to “as much as $20 billion” depending on commercial performance, while Anthropic pledged “to make more than $100 billion in expenditure on Amazon's cloud technology over the next 10 years,” including use of Trainium.

GeekWire added that Anthropic’s cloud commitment spans “Trainium2 through Trainium4,” and said the companies stated “nearly 1 gigawatt of Trainium2 and Trainium3 capacity will come online by the end of this year.”

Data Center Knowledge reinforced the infrastructure framing by quoting Stephen Sopko, saying “What you’re seeing is major AI players aligning with cloud providers that have the scale and reach,” and it also quoted Dario Amodei’s line about building infrastructure to keep pace with demand.

Executives justify the push

TechCrunch described the structure as a cloud-infrastructure exchange, noting that the deal “echoes an agreement Amazon struck with OpenAI just two months ago,” when Amazon joined a $110 billion funding round and contributed $50 billion.

In the Financial Express, Anthropic’s CEO Dario Amodei stressed the need for infrastructure, saying, “Our users tell us Claude is increasingly essential to how they work, and we need to build the infrastructure to keep pace with rapidly growing demand,” while Amazon CEO Andy Jassy said, “Our custom AI silicon offers high performance at significantly lower cost for customers, which is why it’s in such hot demand.”

Chosunbiz likewise attributed a rationale to Amodei, saying “Claude is becoming an essential tool for work, and we need to expand infrastructure to meet surging demand,” and it quoted Jassy that “Anthropic's decision to run large-scale AI models on Amazon Trainium demonstrates the results of Amazon's custom-designed semiconductor technology.”

PYMNTS framed the same message through direct quotes in its account of the press release, quoting Jassy: “Anthropic’s commitment to run its large language models on AWS Trainium for the next decade reflects the progress we’ve made together on custom silicon, as we continue delivering the technology and infrastructure our customers need to build with generative AI,” and quoting Amodei: “Our collaboration with Amazon will allow us to continue advancing AI research while delivering Claude to our customers, including the more than 100,000 building on AWS.”

GeekWire added that Anthropic’s blog post acknowledged “surging consumer demand has strained its infrastructure, impacting reliability during peak hours,” and it described the expanded AWS deal as designed to relieve that pressure.

Across outlets, the executives’ statements were consistently tied to scaling Claude and to using Trainium and Graviton for training and inference.

Platform integration and access

Beyond the investment and compute, the partnership changes how customers access Anthropic’s Claude tools inside AWS.

TechCrunch said the deal is structured partly as cloud infrastructure services rather than straight cash, and it described how Anthropic obtained capacity to train and run Claude, with an option to buy capacity on future Amazon chips as they become available.

Image from bloomingbit

Techzine Global reported that “AWS customers can access Anthropic’s native Claude experience from their existing AWS account, without additional login credentials or separate billing,” and it connected this to the partnership’s earlier start when “Amazon first invested $4 billion” in 2023 and “AWS became Anthropic’s primary cloud provider.”

GeekWire similarly said “the full Claude Platform will be available directly within AWS,” and it described this as deeper integration than having Claude available through Amazon’s Bedrock marketplace.

PYMNTS echoed the access change, stating that “AWS customers will be able to access the full Anthropic-native Claude console from within AWS,” and that it would be done “using their existing AWS contract with no additional credentials, contracts or billing relationships.”

The Next Web added that AWS customers will be able to access “the full Claude Platform, Anthropic’s native product interface, directly through their existing AWS accounts, without separate credentials or billing,” and it described this as a step beyond Claude through Bedrock.

Across these accounts, the integration is presented as both a distribution mechanism for Claude and a way to bind Anthropic’s model delivery to AWS accounts and billing controls.

Market framing and stakes

The deal is also being positioned as a strategic response to competition in AI infrastructure and as a signal about future public-market expectations.

Chosunbiz: “Amazon boosts Anthropic with $5b investment to drive AWS chip demand — Deal ties Anthropic spending to AWS as Trainium adoption and compute scale accelerate,” by Ahn Shang-hee, 2026.

GeekWire described Amazon’s move as “mirroring its OpenAI cloud deal,” and it said the expanded AWS arrangement is “a direct rebuttal to OpenAI’s claim last week that Anthropic made a ‘strategic misstep to not acquire enough compute’ and was ‘operating on a meaningfully smaller curve.’”

Image from Chosunbiz

TechCrunch similarly framed the announcement as part of a broader pattern of cloud-infrastructure deals, noting that Amazon’s OpenAI agreement two months earlier joined a $110 billion funding round and valued the ChatGPT maker at a $730 billion pre-money valuation.

TradingView’s market-focused account said Amazon shares rose about 3% premarket after the company announced plans to invest up to $25 billion in Anthropic, and it described the structure as turning Amazon into a long-term “infrastructure anchor.”

The Financial Express added a potential IPO angle by asking “Is an IPO next?” and reported that the initial investment is priced at Anthropic’s latest valuation of $380 billion, while GeekWire said both Anthropic and OpenAI are preparing for potential IPOs, with each company seeking to demonstrate long-term capacity commitments.

Data Center Knowledge included an analyst quote from Stephen Sopko, saying “AWS’s positioning here makes sense,” and it tied the alignment to enterprise customers moving “beyond pilots into real ROI from AI.”

Meanwhile, The Next Web described the geopolitical and policy backdrop, saying Anthropic is “currently barred from Department of Defense contracts following a supply-chain risk designation that it is contesting in court,” and it placed that dispute alongside a broader AI policy environment.
