Microsoft is ending its revenue-sharing arrangement with OpenAI, according to Bloomberg reporting on April 27, 2026. The agreement, which gave OpenAI a cut of revenue generated through Azure's hosting of OpenAI models, had been a cornerstone of the financial relationship between the two companies since Microsoft's initial $1 billion investment in 2019 and its subsequent $10 billion commitment in early 2023.
The revenue-sharing mechanism was what made the Microsoft-OpenAI deal unprecedented: OpenAI got paid not just for licensing models, but for every API call enterprises made through Azure. This created a flywheel where OpenAI's incentives were directly aligned with Azure's commercial success. Ending it fundamentally changes the economic logic of the partnership.
The exact terms of the new arrangement haven't been disclosed, but the direction is clear: Microsoft is moving from a variable-cost relationship (pay per usage) to something that gives Redmond more predictable economics and greater independence.
This isn't a breakup — it's a renegotiation from a position of strength. When Microsoft first invested in OpenAI, it was buying a lottery ticket on a research lab that might produce something useful. By 2026, Microsoft has shipped Copilot across its entire product suite, built significant in-house model capabilities, and watched Azure AI revenue grow to what analysts estimate is a $15-20 billion annual run rate.
The revenue share made sense when Microsoft needed OpenAI more than OpenAI needed Microsoft. That power dynamic has inverted. Microsoft now has multiple model providers on Azure (Mistral, Meta's Llama, Cohere, its own Phi series), a massive distribution advantage through Office and GitHub, and enough internal ML talent to reduce its single-vendor risk.
For OpenAI, this is a significant financial hit. Azure OpenAI Service represents one of the primary commercial channels for OpenAI's technology reaching enterprise customers. Without revenue sharing, OpenAI becomes more dependent on its own ChatGPT consumer revenue and direct API sales — businesses where it competes with, rather than complements, Microsoft.
The timing is notable. OpenAI has been restructuring toward a for-profit entity, raising capital at increasingly aggressive valuations, and expanding into enterprise sales directly. From Microsoft's perspective, continuing to fund a company that's simultaneously becoming a competitor in enterprise AI makes diminishing strategic sense.
If you're building on Azure OpenAI Service today, nothing changes immediately. Your API endpoints still work, your models are still served, your SLAs are still in effect. But the long-term signal is that Microsoft's AI platform strategy is diversifying away from OpenAI exclusivity, which means your vendor lock-in risk on any single model family just decreased.
Practically, this likely accelerates Microsoft's push toward model-agnostic tooling. Expect Azure AI Studio to increasingly treat OpenAI models as one option among many, with better abstractions for swapping between providers. If you've been building OpenAI-specific integrations on Azure (using features only available on GPT models), consider whether an abstraction layer makes sense now rather than later.
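If you want to hedge that bet now, the abstraction layer doesn't need to be elaborate. Here is a minimal sketch of a provider-agnostic interface; all names (`ChatProvider`, `build_provider`, the stub provider classes) are hypothetical, and a real implementation would delegate to the respective vendor SDKs instead of returning canned strings:

```python
from abc import ABC, abstractmethod


class ChatProvider(ABC):
    """Hypothetical provider-agnostic interface for chat completions."""

    @abstractmethod
    def complete(self, prompt: str) -> str:
        """Return the model's completion for a single prompt."""


class OpenAIProvider(ChatProvider):
    # Stub: in production this would call the Azure OpenAI SDK.
    def complete(self, prompt: str) -> str:
        return f"[gpt] {prompt}"


class LlamaProvider(ChatProvider):
    # Stub: in production this would call a Llama endpoint on Azure.
    def complete(self, prompt: str) -> str:
        return f"[llama] {prompt}"


def build_provider(name: str) -> ChatProvider:
    """Resolve a provider by config key, so swapping models is a
    one-line configuration change rather than a code rewrite."""
    providers = {"openai": OpenAIProvider, "llama": LlamaProvider}
    return providers[name]()


if __name__ == "__main__":
    for name in ("openai", "llama"):
        print(build_provider(name).complete("hello"))
```

The point of the sketch is the seam, not the stubs: application code depends only on `ChatProvider`, so provider-specific features (function calling formats, system-prompt conventions) get contained in one adapter per vendor instead of leaking through the codebase.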
For teams evaluating AI infrastructure decisions: the era of assuming Microsoft and OpenAI are functionally the same bet is over. They're becoming separate bets with overlapping but diverging roadmaps. Price your platform risk accordingly.
The broader ecosystem effect is also worth noting. With Microsoft less financially tied to OpenAI's success, expect Azure to more aggressively court and promote alternative model providers. Anthropic, Google, and open-source models on Azure all benefit from Microsoft having less incentive to privilege OpenAI in its marketplace.
The Microsoft-OpenAI partnership will likely continue in some form — Microsoft still holds a significant equity stake and the models remain genuinely good. But the financial umbilical cord being cut means both companies will optimize for their own interests more openly. For developers, the practical takeaway is to build for model portability. The days of any single AI provider being a safe 10-year bet are clearly numbered, and the two companies that seemed most likely to remain joined at the hip just proved the point.
This agreement feels so friendly towards OpenAI that it's not obvious to me why Microsoft accepted this. I guess Microsoft just realized that the previous agreement was kneecapping OpenAI so much that the investment was at risk, especially with serious competition now coming from Anthropic?
This gives OpenAI the ability to go to AWS instead of being exclusively on Azure. I guess Azure really is hanging on by a thread.
https://news.ycombinator.com/item?id=47616242
As a former corporate restructuring lawyer… this kind of stuff indicates the cash-strapped scramble of the end days.
A wise man from Google said in an internal memo, to the tune of: "We do not have any moat, and neither does anyone else." DeepSeek v4 is good enough, really really good given the price it is offered at.

PS: Just to be clear, even the most expensive AI models are unreliable and would make stupid mistakes.
Opinions are my own. I think the biggest winner of this might be Google. Virtually all the frontier AI labs use TPU; the only one that doesn't is OpenAI, due to its exclusive deal with Microsoft. Given the newly launched Gen 8 TPU this month, it's likely OpenAI will contemplate using TPUs as well.