Microsoft & Anthropic: AI in Office 365 & Lock-in Risks

by Archynetys Technology & Science Desk

Microsoft Relies on AWS for AI Model Capacity Amid Chip Delays

By Amelia Monroe | SAN FRANCISCO – 2025/09/11 07:17:39

Delays in Microsoft’s Maia AI-ASICs may be pushing the tech giant to lease capacity from Amazon Web Services (AWS) to run Anthropic’s AI models, revealing a complex interplay of competition and collaboration in the AI landscape.


According to Harrowell, “The downside, of course, is the margin stacking that results. AWS is not the cheapest LLM API provider, and their margin is layered on top of Anthropic’s. Microsoft will want to bring it on-platform as soon as they understand it and have the capacity. They seem to be buying capacity in every direction at the moment, with the deal with Nebius possibly reflecting delays in the Maia AI-ASICs.”

This situation suggests that the setbacks in Microsoft’s Maia AI-ASICs, which are crucial for Azure’s AI capabilities, might be compelling the company to depend on AWS to operate Anthropic’s models.

Cooperation Among Competitors

Such collaboration between rivals isn’t unusual in the tech industry, notes Sharath Srinivasamurthy, research vice president at IDC. He points to Apple sourcing display panels from Samsung as a similar example of competitors working together.

About the Author:

Amelia Monroe is a technology reporter covering the latest developments in AI, cloud computing, and the semiconductor industry.

