AWS and Anthropic Deepen Ties: Claude Now Trained on AWS Silicon, 'Cowork' Lands in Bedrock

From Corea24, the free encyclopedia of technology

AWS and Anthropic have announced a major expansion of their partnership, revealing that Anthropic is now training its most advanced foundation models on AWS Trainium and Graviton chips, while also launching 'Claude Cowork'—a collaborative AI capability—within Amazon Bedrock.

The move signals a deepening of the strategic alliance between the cloud giant and the AI startup, positioning AWS as the preferred infrastructure provider for cutting-edge AI development. Anthropic’s decision to co-engineer at the silicon level with AWS’s Annapurna Labs aims to maximize computational efficiency across the full stack, from silicon to software.

“This is not just a cloud contract; it’s a co-engineering partnership,” said Dr. Sarah Chen, cloud infrastructure analyst at Gartner. “By training on AWS silicon, Anthropic is committing to a long-term architectural alignment that could redefine AI hardware optimization.”

Claude Cowork Now Available in Amazon Bedrock

Starting today, enterprise builders can deploy Claude Cowork within their existing Amazon Bedrock environments. The feature allows teams to work alongside Claude as a true collaborator—handling multi-step tasks, code generation, and real-time reasoning while keeping data secure within AWS.


“Claude Cowork transforms the AI from a tool into a teammate,” said an AWS spokesperson. “Enterprises can now run collaborative AI workflows without data leaving their secure AWS perimeter.”

Meta Signs Agreement for Graviton Chips

Separately, Meta has signed an agreement to deploy AWS Graviton processors at scale, beginning with tens of millions of cores to power agentic AI workloads. These include real-time reasoning, code generation, search, and multi-step task orchestration—all CPU-intensive tasks that benefit from Graviton’s efficiency.

The partnership underscores AWS’s push to position its custom chips as the foundation for next-generation AI infrastructure. Meta’s adoption of Graviton for agentic AI marks one of the largest such deployments to date.

AWS Lambda Now Supports S3 Files as File Systems

AWS also announced that Lambda functions can now mount Amazon S3 buckets as file systems using S3 Files. Built on Amazon EFS, this feature lets functions perform standard file operations without downloading data, and multiple functions can share a common workspace simultaneously.


This is particularly valuable for AI and machine learning workloads where agents need to persist memory and share state. “S3 Files eliminates the friction of data staging for serverless AI pipelines,” the spokesperson added.
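A minimal sketch of what a Lambda handler could look like with such a mount in place follows. The mount path `/mnt/shared` and the `mount_path` override parameter are assumptions for illustration; the point is that agent state is persisted and shared with ordinary file I/O rather than explicit S3 GetObject/PutObject calls.

```python
import json
import os

# Assumed mount point configured on the Lambda function.
MOUNT_PATH = os.environ.get("S3_FILES_MOUNT", "/mnt/shared")

def handler(event, context, mount_path=None):
    """Append one agent step to a shared memory file, then return the
    full step log. Multiple function invocations sharing the mount
    would see each other's writes."""
    base = mount_path or MOUNT_PATH
    memory_file = os.path.join(base, "agent-memory.jsonl")

    # Standard file operations: no S3 API calls, no data staging,
    # because the bucket is exposed through the file system mount.
    with open(memory_file, "a") as f:
        f.write(json.dumps({"agent": event["agent"],
                            "step": event["step"]}) + "\n")

    with open(memory_file) as f:
        return [json.loads(line) for line in f]
```

The append-then-read pattern is what "persist memory and share state" amounts to in practice: each invocation adds to a durable log that other functions on the same mount can read immediately.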

Background

The AWS-Anthropic partnership dates back to 2023, when AWS invested $4 billion in the AI startup. Since then, the two companies have collaborated on model training, safety research, and enterprise deployment tools. The latest announcements deepen that collaboration at the hardware and application layers.

Meta’s decision to use AWS Graviton chips comes as the social media giant builds out its internal AI capabilities, including large language models and agent systems. The deal signals a shift toward custom silicon for AI inference and reasoning tasks, moving beyond traditional GPUs.

What This Means

For enterprise builders, the combined announcements mean tighter integration between AI models and the underlying cloud infrastructure. Training on AWS silicon could reduce costs and latency for Claude-based applications, while Claude Cowork offers a new paradigm for team-based AI interaction.

The Meta-Graviton agreement may accelerate adoption of AWS custom chips for agentic AI, potentially challenging Nvidia’s dominance in the AI hardware market. Meanwhile, Lambda’s S3 Files feature removes a key friction point for serverless AI pipelines, making it easier to build memory-augmented agents on AWS.

In the coming weeks, expect more details on the Claude Platform for AWS, a unified developer experience that will let developers build, deploy, and scale Claude-powered applications entirely within the AWS ecosystem.

— Reporting contributed by industry analysts.