Argues that by tearing up the revenue-sharing and exclusivity provisions simultaneously, both companies are acknowledging the original deal was structurally broken: when the terms of a deal start actively harming the asset you invested in, you renegotiate or watch it depreciate.
Submitted the Bloomberg report detailing how Microsoft will stop sharing revenue with OpenAI in exchange for releasing OpenAI from its Azure exclusivity commitment, framing it as a fundamental restructuring of the $13 billion partnership.
Drawing on experience with corporate restructurings, argues that this kind of deal renegotiation 'indicates the cash strapped scramble of the end days.' Views the willingness to give up guaranteed Azure revenue as a sign of underlying financial distress rather than strategic optimization.
Highlights that OpenAI's release from Azure exclusivity opens the door to running workloads on Google's TPUs and AWS custom silicon. Views multi-cloud access as the real strategic prize, especially given Azure's capacity constraints and performance criticism relative to competitors.
Identifies OpenAI's potential access to Google's TPUs as 'the most consequential downstream effect,' noting that Azure exclusivity prevented OpenAI from optimizing training runs across the best available hardware while competitors like Anthropic and DeepSeek operated with more infrastructure flexibility.
Notes the timing is 'not accidental,' citing increasing competitive pressure from Anthropic (backed by Amazon and Google), DeepSeek (which proved frontier models can be trained far more cheaply), and Google's Gemini line. Azure's own AI infrastructure capacity constraints and performance shortcomings relative to TPUs and AWS silicon made the exclusivity clause a competitive liability.
Microsoft and OpenAI have agreed to fundamentally restructure their partnership. The headline change: Microsoft will stop sharing revenue with OpenAI, and in exchange, OpenAI is released from its exclusive commitment to run on Azure infrastructure. The deal, reported by Bloomberg on April 27, effectively ends the most unusual corporate marriage in the AI era — one where Microsoft poured $13 billion into a company that was contractually locked to its cloud platform.
The previous arrangement gave Microsoft a cut of OpenAI's commercial revenue and required all OpenAI workloads to run on Azure. By tearing up both provisions simultaneously, both companies are acknowledging that the original deal was structurally broken. Microsoft gets to stop writing checks from its cloud division back to its AI investment; OpenAI gets to shop for compute on the open market.
The timing is not accidental. OpenAI has been under increasing competitive pressure from Anthropic (backed by Amazon and Google), DeepSeek (which proved you can train frontier models for a fraction of the assumed cost), and Google's own Gemini line. Meanwhile, Azure's AI infrastructure has faced capacity constraints and performance criticism relative to Google's TPUs and AWS's custom silicon.
The Azure exclusivity clause was always a double-edged sword. It guaranteed Microsoft a massive anchor tenant for its AI cloud business, but it also meant OpenAI couldn't optimize its training runs across the best available hardware. As one Hacker News commenter with a corporate restructuring background put it, "this kind of stuff indicates the cash strapped scramble of the end days." That's probably too dramatic, but the underlying logic holds: when the terms of a deal start actively harming the asset you invested in, you renegotiate or watch it depreciate.
The most consequential downstream effect may be OpenAI's access to Google's TPUs. As HN commenter thanhhaimai noted, virtually all frontier AI labs except OpenAI use TPU infrastructure. Google launched its Gen 8 TPU this month, and the performance-per-dollar improvements are significant enough that any rational compute buyer would want access. OpenAI, until now, was the only major lab contractually prevented from even evaluating TPU workloads.
The competitive dynamics here are layered. Microsoft accepted a deal that looks "so friendly towards OpenAI that it's not obvious why Microsoft accepted this," as one commenter observed. The answer is probably straightforward: Microsoft's alternative was watching OpenAI slowly lose its lead while locked into Azure, then having a $13 billion stake in the second-best AI company. Anthropic's Claude models have been gaining enterprise share. DeepSeek v4 is, in the words of the HN community, "really really good given the price it is offered at." The no-moat memo from Google's internal leak two years ago keeps looking more prescient.
There's also a reading of this that's less about generosity and more about accounting. Revenue sharing meant Microsoft was effectively subsidizing OpenAI's operations through Azure credits while simultaneously counting that usage as Azure revenue. Eliminating the revenue share simplifies Microsoft's AI financials at a time when investors are increasingly skeptical about when AI infrastructure spending will produce real returns.
If you're an engineering team currently running OpenAI workloads, the practical implications unfold over the next 6-18 months. OpenAI on Azure isn't going away — Microsoft remains both an investor and a distribution partner. But OpenAI endpoints served from AWS or GCP become a plausible near-future scenario, which changes the calculus for teams that avoided OpenAI specifically because of Azure lock-in.
For infrastructure teams evaluating cloud commitments, this is a signal to build more cloud-agnostic AI pipelines. If you've been assuming OpenAI means Azure, stop. The abstraction layers you invest in now — whether that's LiteLLM, a custom gateway, or just a clean provider interface in your codebase — will pay off as OpenAI's infrastructure diversifies. Teams that hardcoded Azure-specific assumptions into their AI stack are about to relearn an old lesson: coupling application logic to a single provider's deployment model is an expensive bet that the provider's strategy never changes.
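One way to keep that coupling out of application code is a thin provider interface. The sketch below is illustrative only — `LLMProvider`, `EchoProvider`, and every name in it are hypothetical, not any real vendor SDK; a production adapter would wrap the actual OpenAI, Anthropic, or Bedrock client behind the same method signature:

```python
from dataclasses import dataclass
from typing import Protocol


@dataclass
class Completion:
    text: str
    input_tokens: int
    output_tokens: int


class LLMProvider(Protocol):
    """Application code depends on this Protocol, never on a vendor SDK."""

    def complete(self, prompt: str, *, max_tokens: int = 256) -> Completion: ...


class EchoProvider:
    """Stand-in backend for tests; it just truncates the prompt.
    A real adapter would call a vendor API behind the same signature."""

    def complete(self, prompt: str, *, max_tokens: int = 256) -> Completion:
        text = prompt[:max_tokens]
        return Completion(
            text=text,
            input_tokens=len(prompt.split()),
            output_tokens=len(text.split()),
        )


def summarize(provider: LLMProvider, document: str) -> str:
    # Call sites see only the interface; swapping clouds or vendors
    # means swapping the adapter, not rewriting application logic.
    return provider.complete(f"Summarize: {document}", max_tokens=128).text
```

Because `summarize` takes any `LLMProvider`, moving an OpenAI workload from Azure to another cloud (or to a different model entirely) becomes a configuration change rather than a refactor.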
The broader market implication is that AI model providers and cloud providers are decoupling. Anthropic already runs on AWS and GCP. Google serves Gemini from its own infrastructure. Now OpenAI joins the multi-cloud party. For practitioners, this means the winning strategy is optimizing for model quality and cost per token, not cloud provider loyalty — the vendor lock-in play in AI infrastructure is rapidly losing its grip.
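"Cost per token" is workload-dependent because input and output tokens are priced differently. A minimal sketch of how a team might compare providers on blended cost — all prices and provider names here are made-up placeholders, not real quotes:

```python
# Hypothetical ($/1K input tokens, $/1K output tokens) price pairs.
# Real prices change frequently; treat these as illustrative only.
PRICES = {
    "provider_a": (1.00, 3.00),
    "provider_b": (0.50, 1.50),
    "provider_c": (0.25, 1.25),
}


def blended_cost_per_1k(input_price: float, output_price: float,
                        input_share: float = 0.75) -> float:
    """Blended $/1K tokens for a workload where `input_share` of all
    tokens are prompt tokens (0.75 is typical for RAG-heavy traffic)."""
    return input_price * input_share + output_price * (1 - input_share)


def cheapest(prices: dict, input_share: float = 0.75) -> str:
    """Return the provider with the lowest blended cost for this mix."""
    return min(
        prices,
        key=lambda name: blended_cost_per_1k(*prices[name],
                                             input_share=input_share),
    )
```

The point of the exercise: rankings can flip as the input/output mix changes, which is exactly why optimizing for a cloud vendor rather than for your own workload's token economics is the wrong axis.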
If you're a startup building on OpenAI's API, this is net positive. More infrastructure competition for OpenAI's training and serving workloads should eventually translate into lower API pricing or faster model iteration. Whether that happens in Q3 2026 or Q1 2027 depends on how quickly OpenAI can actually migrate workloads, which is a non-trivial engineering effort even with unlimited budget.
The Microsoft-OpenAI restructuring marks the end of the AI industry's "exclusive partnership" phase and the beginning of something more closely resembling how the rest of enterprise software works: providers compete on merit, customers pick the best tool for the job, and no single cloud vendor gets to claim an entire model ecosystem. The question that remains is whether OpenAI's models are still differentiated enough to justify premium pricing once the Azure captive-audience effect disappears. If DeepSeek and open-source alternatives keep closing the gap, Microsoft may look back on this deal and realize they didn't just release OpenAI from exclusivity — they released themselves from a $13 billion bet that the moat was real.
This agreement feels so friendly towards OpenAI that it's not obvious to me why Microsoft accepted this. I guess Microsoft just realized that the previous agreement was kneecapping OpenAI so much that the investment was at risk, especially with serious competition now coming from Anthropic?
This gives OpenAI the ability to go to AWS instead of running exclusively on Azure. I guess Azure really is hanging on by a thread.
https://news.ycombinator.com/item?id=47616242
Nadella had OpenAI by the short and curlies early on. But all I've seen from him in the last couple of years is continuously acquiescing to OpenAI's demands. I wonder why he's so weak and doesn't exert more control over the situation? At one point Microsoft owned 49% of OpenAI…
A wise man from Google said in an internal memo to the tune of: "We do not have any moat, neither does anyone else."

Deepseek v4 is good enough, really really good given the price it is offered at.

PS: Just to be clear - even the most expensive AI models are unreliable, would make stupid mist…
Opinions are my own. I think the biggest winner of this might be Google. Virtually all the frontier AI labs use TPU. The only one that doesn't use TPU is OpenAI, due to the exclusive deal with Microsoft. Given the newly launched Gen 8 TPU this month, it's likely OpenAI will contemplate using…