The Custody Battle

Microsoft is considering suing over a $50 billion Amazon-OpenAI cloud deal. Read that sentence again. The company that invested $13 billion in OpenAI, that built its entire AI strategy around exclusive hosting rights, is now weighing litigation because its most important partner is moving workloads to a competitor.

This is not a contract dispute. It’s a custody battle. And like all custody battles, it tells you more about the relationship than anyone wants to admit.

The Triangle

The structure is simple. Microsoft gave OpenAI money. OpenAI gave Microsoft exclusivity. That deal made Azure the default home for the most important AI models in the world, and it made Microsoft the credible AI company it had spent decades trying to become.

Then OpenAI needed more. More compute. More flexibility. More government contracts. More everything. And “more” meant going multi-cloud, which meant Amazon.

The $50 billion arrangement with AWS isn’t just about capacity. It’s about OpenAI’s government business — classified and unclassified work that increasingly requires infrastructure OpenAI doesn’t control and Microsoft may not want to share. The Information reported that OpenAI signed the AWS deal specifically to sell AI tools to U.S. government customers. That’s a market where procurement rules, security clearances, and existing vendor relationships matter more than who has the best model.

Microsoft sees this as a breach. OpenAI sees it as growth. Amazon sees it as opportunity.

What Exclusivity Actually Meant

When Microsoft bought exclusivity, it bought something specific: the right to be the only cloud provider hosting OpenAI’s models at scale. That was the deal. You give us $13 billion, we give you the moat.

But exclusivity in AI is different from exclusivity in other industries. A movie studio can give Netflix exclusive streaming rights because the movie exists as a finished product. An AI model is not a finished product. It’s a service that requires continuous compute, continuous training, continuous inference. The “product” is inseparable from the infrastructure that runs it.

Which means exclusivity in AI is really infrastructure lock-in by another name. And infrastructure lock-in creates a dependency that gets uncomfortable fast when one party grows faster than the other.

OpenAI is now arguably more valuable than its relationship with Microsoft. Its models are the foundation of enterprise AI for hundreds of companies. Its API powers products that Microsoft doesn’t even know about. The government wants its tools. The military wants its tools. And AWS has government certifications and agency relationships that Azure can’t always match.

So OpenAI started shopping. And Microsoft noticed.

The Deeper Pattern

This fight is not unique to Microsoft and OpenAI. It’s the natural outcome of every major AI partnership that was formed in the 2023-2024 “let’s figure this out together” era.

Google invested $2 billion in Anthropic. Then Anthropic started selling through AWS too. Amazon invested $4 billion in Anthropic. Then Anthropic kept its Google Cloud relationship. Every frontier lab is going multi-cloud because no single cloud provider can offer everything they need.

The original deals were structured as if AI would be a feature of existing cloud platforms. The reality is that AI is becoming a platform itself, and the labs that build the models have more leverage than the clouds that host them.

Microsoft’s potential lawsuit is the first crack in this structure. It won’t be the last.

What This Means for Everyone Else

If Microsoft sues, three things happen:

First, the fiction of “partnership” in AI infrastructure dies. These were always vendor relationships dressed up in partnership language. A lawsuit makes that explicit, which is clarifying even if it’s messy.

Second, multi-cloud becomes the default for every AI lab. If OpenAI — the company most entangled with a single cloud provider — can’t stay exclusive, nobody can. Anthropic, Mistral, Cohere, and every other lab will read this as confirmation that spreading infrastructure across AWS, Azure, and GCP is the only safe strategy.

Third, the power shifts toward the model labs and away from the clouds. In a custody battle, the child has more leverage than either parent. OpenAI can credibly threaten to shift more workloads to AWS. It can credibly offer more to GCP. Microsoft’s leverage is its historical investment and contractual rights, but those are diminishing assets in a market that moves this fast.

The Pipe Wars Continue

Yesterday it was IBM paying $11 billion for Confluent’s data pipes. Today it’s Microsoft threatening to sue over who gets to host OpenAI’s models. Tomorrow it’ll be something else.

The pattern is the same: the most expensive fights in AI aren’t about intelligence. They’re about infrastructure. Who owns the compute. Who controls the network. Who holds the contracts.

Micron reported 74.4% gross margins today on memory chips. Alibaba raised AI compute prices by up to 34%. Microsoft is considering litigation over cloud hosting rights. Nvidia restarted H200 production for China.

Every single headline is about pipes, not models.

The models are the thing everyone talks about. The pipes are the thing everyone fights over.

The custody battle has begun.