The seven-year Microsoft–OpenAI exclusivity arrangement formally ended this week, and the day-after coverage reset what the rest of the cloud-AI market thinks it can ask for. The amended agreement OpenAI and Microsoft published replaces the original AGI revocation clause — the contractual provision that let OpenAI unilaterally withdraw IP from Microsoft once an internal AGI determination was made — with a defined-stage handover process tied to capability thresholds and external attestation. In practical terms, Microsoft loses its hard veto on where OpenAI's frontier weights are served, and OpenAI loses its all-or-nothing exit. Both sides walked away with what markets read as a more durable settlement: Microsoft retains a long-term economic interest plus committed compute purchases, while OpenAI gets multi-cloud distribution and the ability to sign deals like the AWS one announced the same morning.
The AWS leg is the headline operational consequence. OpenAI's GPT-class models, Codex, and the Managed Agents tier are now available natively on Amazon Bedrock, with new SKUs appearing on AWS Marketplace within hours of the announcement. Stratechery's interview with Altman and Garman frames the deal as Bedrock graduating from a model-aggregator to a managed-agent platform — Garman emphasized that Bedrock's value proposition shifts from "any frontier model in your VPC" to "any frontier agent that can reach your AWS data without leaving your VPC." Altman's framing was narrower and more financial: the AWS partnership unlocks federal and enterprise demand that was constrained by AWS-residency requirements OpenAI could not meet under the prior Microsoft-exclusive arrangement.
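For readers gauging what "Bedrock-resident" access means in practice, here is a minimal sketch of calling a Bedrock-hosted model from an AWS-side workload via the Bedrock Runtime Converse API (a real boto3 API). The model ID is a placeholder assumption — the coverage does not name the new SKUs' identifiers — and the call runs entirely against the caller's own AWS account, which is the residency property Garman is describing.

```python
# Sketch only: the model ID below is hypothetical; actual identifiers for the
# new OpenAI-on-Bedrock SKUs are not given in the coverage.
MODEL_ID = "openai.gpt-5-codex-v1:0"  # hypothetical Bedrock model identifier


def build_converse_request(prompt: str, max_tokens: int = 512) -> dict:
    """Assemble a Bedrock Converse API payload for a single-turn prompt."""
    return {
        "modelId": MODEL_ID,
        "messages": [{"role": "user", "content": [{"text": prompt}]}],
        "inferenceConfig": {"maxTokens": max_tokens, "temperature": 0.2},
    }


def invoke(prompt: str) -> str:
    """Call Bedrock Runtime from inside the caller's own AWS account/VPC.

    Requires AWS credentials with Bedrock invoke permissions; the request
    and response stay within AWS, which is the data-residency point.
    """
    import boto3  # deferred import: only needed for the live call

    client = boto3.client("bedrock-runtime", region_name="us-east-1")
    response = client.converse(**build_converse_request(prompt))
    return response["output"]["message"]["content"][0]["text"]
```

The design point is that the enterprise's prompt and data traverse only AWS infrastructure and IAM, rather than leaving for a third-party API endpoint — the contrast with calling OpenAI's direct API that the pricing discussion below turns on.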
The structural read is that the AGI clause was load-bearing for both companies' long-term investor pitch — without it, OpenAI's defense against being permanently dependent on Microsoft was nominal, and with it, Microsoft's claim to having locked in the strongest model lab was perpetually under a poison pill. Replacing it with capability-threshold language plus a graduated transfer schedule lowers the legal risk on both sides. Hacker News ran two parallel front-page threads — one on the exclusivity ending (966 points) and one on the AWS/Bedrock interview (227 points). Top comments converged on two open questions: how the new "AGI" language operationalizes the threshold without recreating the same all-or-nothing trigger, and whether Microsoft's Azure compute commitments now act as a soft retention mechanism in lieu of contractual exclusivity.
For the cloud-AI market, this resets the negotiating posture for every other lab-cloud relationship. Anthropic–AWS, Google–DeepMind, Mistral–Azure, and the smaller Chinese-lab cloud deals are all now operating in a context where OpenAI–Microsoft exclusivity is no longer the implicit reference point. TechCrunch's reporting noted that Bedrock's new managed-agent SKUs include OpenAI Codex pricing tiers that undercut OpenAI's direct API for AWS-resident workloads — meaning AWS-resident enterprise spend on OpenAI now flows partially through Amazon's economics, not OpenAI's, and that compromise is presumably what made the deal possible at all.
- OpenAI Research blog frames the AWS launch as enterprises gaining secure access to GPT models, Codex, and Managed Agents inside their own AWS environments — explicit Bedrock-resident managed-agent positioning.
- Stratechery's Altman/Garman interview emphasizes the partnership as legitimization of OpenAI as a multi-cloud vendor and a structural break from Microsoft exclusivity; Thompson reads the AGI clause replacement as the gating change that unlocked it.
- Hacker News ran two parallel front-page threads — one on the Microsoft exclusivity ending (966 points) and one on the AWS/Bedrock interview (227 points) — community comments mostly focus on whether Microsoft retains a backstop equity claim and what "AGI" actually means now in the new contract.
- TechCrunch reports the AWS Marketplace listings are already live with new SKUs the same day the partnership was announced, suggesting months of pre-staging.