OpenAI Pushes Into AWS as Microsoft Ties Loosen

OpenAI’s move onto Amazon Bedrock gives AWS customers access to its models, Codex and managed agents, as Microsoft’s partnership becomes less exclusive.


OpenAI is bringing Codex, its AI coding agent used by more than four million people each week, to Amazon Bedrock alongside its frontier models and managed agents. For AWS customers, the limited-preview launch puts OpenAI tools inside existing cloud systems, security controls, procurement processes and compliance structures.

The move comes days after Microsoft amended its long-running partnership with OpenAI. Microsoft remains OpenAI’s primary cloud partner, but OpenAI can now serve its products to customers across any cloud provider. Microsoft’s licence to OpenAI intellectual property will continue through 2032, but is now non-exclusive.

OpenAI enters the AWS stack

The expanded AWS partnership covers three areas: OpenAI models on AWS, Codex on AWS, and Amazon Bedrock Managed Agents powered by OpenAI.

OpenAI models, including GPT-5.5, are coming to Amazon Bedrock, enabling AWS customers to build with them alongside the services, security controls, identity systems and procurement processes already used inside AWS environments.

“For many companies, using AI at scale requires bringing the best models to the systems their teams already use,” OpenAI said. “Customers can now build with OpenAI models in AWS, alongside the services, security controls, identity systems, and procurement processes they already rely on.”

AWS has positioned the launch around customer choice. OpenAI models will be available through the same Bedrock APIs and controls customers already use, alongside models from Anthropic, Meta, Mistral, Cohere, Amazon and other providers.
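Those existing Bedrock APIs include the Converse API, Bedrock's model-agnostic chat interface. The sketch below shows how an OpenAI model could plausibly be called through it using boto3; the `converse()` call and request shape are Bedrock's existing API, but the OpenAI model identifier is a placeholder, since real IDs will come from the Bedrock model catalogue once the models are live.

```python
import json

# Hypothetical model ID; actual identifiers appear in the Bedrock catalogue.
MODEL_ID = "openai.gpt-5.5-v1:0"

def build_converse_request(prompt: str, max_tokens: int = 512) -> dict:
    """Assemble the request body used by bedrock-runtime's Converse API."""
    return {
        "modelId": MODEL_ID,
        "messages": [{"role": "user", "content": [{"text": prompt}]}],
        "inferenceConfig": {"maxTokens": max_tokens, "temperature": 0.2},
    }

def call_model(prompt: str) -> str:
    """Send the request through boto3 (requires AWS credentials and model access)."""
    import boto3  # third-party AWS SDK
    client = boto3.client("bedrock-runtime")
    response = client.converse(**build_converse_request(prompt))
    return response["output"]["message"]["content"][0]["text"]

if __name__ == "__main__":
    print(json.dumps(build_converse_request("Summarise our Q3 incident reports"), indent=2))
```

Because the call goes through the standard Bedrock runtime, the same IAM policies, logging and cost controls that govern other Bedrock models would apply unchanged.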

For enterprises, the commercial relevance lies less in model access than in procurement and operational control. Customers will be able to count OpenAI model usage towards existing AWS cloud commitments and manage it through established governance and cost-control structures.

Microsoft remains primary, but less exclusive

The AWS launch follows Microsoft and OpenAI’s amended partnership terms.

OpenAI products will continue to ship first on Azure unless Microsoft cannot, or chooses not to, support the required capabilities. Microsoft also remains OpenAI’s primary cloud partner and continues to hold a licence to OpenAI intellectual property for models and products through 2032.

The amended agreement changes the nature of that licence. Microsoft’s licence is now non-exclusive, while OpenAI can serve all its products to customers across any cloud provider.

Microsoft said the agreement was designed to give both companies “flexibility, certainty, and a focus on delivering the benefits of AI broadly”. The company will also continue to participate directly in OpenAI’s growth as a major shareholder.

The arrangement preserves Microsoft’s central role while giving OpenAI more room to distribute products through other cloud platforms. For AWS, it creates an opening to bring OpenAI tools into Bedrock. For OpenAI, it widens enterprise access beyond customers already standardised on Azure.

Codex moves into enterprise development

Codex shows why cloud placement is commercially important for enterprise AI tools. OpenAI says more than four million people use Codex each week across software development and adjacent professional workflows, including code generation, refactoring, testing, research, analysis and document-based work.

Through the AWS launch, organisations will be able to power Codex with OpenAI models served from Amazon Bedrock. Customers can configure Codex to use Bedrock as the provider, beginning with Codex CLI, the Codex desktop app and the Visual Studio Code extension.
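Codex CLI already reads provider settings from a `config.toml` file, so the configuration could plausibly take the shape sketched below. The `bedrock` provider key, model name and endpoint are assumptions for illustration, not the shipped integration; the actual fields will be defined when the Bedrock option lands.

```toml
# ~/.codex/config.toml -- illustrative sketch only.
model = "gpt-5.5"            # placeholder model name
model_provider = "bedrock"   # assumed provider key

[model_providers.bedrock]
name = "Amazon Bedrock"
# Assumed regional Bedrock runtime endpoint:
base_url = "https://bedrock-runtime.us-east-1.amazonaws.com"
```

The significance for enterprises is that model routing becomes a config change rather than a tooling change: the same Codex CLI, desktop app or VS Code extension, with inference redirected through infrastructure the organisation already governs.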

The integration is aimed at enterprise software teams already operating inside AWS environments. Customer data will be processed by Amazon Bedrock, while eligible customers can apply Codex usage towards AWS cloud commitments.

AWS described Codex as one of the stronger examples of AI agents carrying out work inside enterprise environments. Codex on Bedrock will allow teams to authenticate using AWS credentials, process inference through Bedrock infrastructure and bring AI-powered software development into the systems where enterprise teams already build and operate.
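The credential side of that flow can be sketched with the AWS SDK's standard environment variables, shown below. The variable names are established AWS conventions; exactly how Codex consumes the default credential chain when Bedrock is its provider is an assumption, and the invocation is illustrative.

```shell
# Standard AWS SDK credential configuration (real conventions):
export AWS_PROFILE="dev-team"   # named profile from ~/.aws/config
export AWS_REGION="us-east-1"   # region where Bedrock model access is enabled

# Codex would then pick up the default AWS credential chain when Bedrock
# is configured as its provider (invocation shown for illustration only):
# codex "write unit tests for the billing module"
echo "Using AWS_PROFILE=$AWS_PROFILE in $AWS_REGION"
```

Authenticating with AWS credentials rather than a separate API key is what lets existing IAM policies, session controls and audit trails extend to Codex usage.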

Agents become the cloud battleground

The third part of the launch is Amazon Bedrock Managed Agents, powered by OpenAI, a service for building production-ready agents using OpenAI models within AWS infrastructure.

Amazon Bedrock Managed Agents combines OpenAI frontier models and agentic capabilities with AWS security, governance and operational controls. The service is intended to support agents that maintain context, execute multi-step workflows, use tools and take action across business processes.

Ben Kus, CTO at Box, said the combination of OpenAI models and AWS infrastructure would support agents that “operate with the governance and auditability enterprises require, all running on the cloud we already trust.”

OpenAI has positioned the product as a way for enterprises to move from experimentation to production while keeping agent development aligned with AWS infrastructure, security and operational standards.

The agent product also brings the cloud providers' infrastructure role into sharper focus. Model access remains important, but enterprise adoption depends on the surrounding infrastructure: identity, permissions, logging, memory, tool use, orchestration, auditability and integration with internal systems.

AWS says Bedrock Managed Agents gives each agent its own identity, logs actions for auditability and runs inside the customer’s environment, with model inference on Amazon Bedrock.

Enterprise AI becomes multi-cloud

The AWS announcement does not end the Microsoft-OpenAI relationship. Microsoft remains OpenAI’s primary cloud partner, and the two companies continue to work together across products, infrastructure and commercial distribution.

The deal does, however, give OpenAI another route into enterprise customers. AWS says the launch allows customers to use OpenAI models, Codex and managed agents through existing Bedrock APIs, security controls and procurement structures. Microsoft, meanwhile, says OpenAI can now serve its products across any cloud provider, while Microsoft retains a licence to OpenAI intellectual property through 2032.

The commercial logic is tied to how large companies buy and deploy AI. AWS has framed the launch around customers using OpenAI tools inside the cloud environments they already operate. Microsoft has positioned Azure as OpenAI’s primary cloud route, with OpenAI products continuing to ship first on Azure unless Microsoft cannot, or chooses not to, support the required capabilities.

That leaves OpenAI with distribution across both cloud platforms, while AWS and Microsoft each retain different claims on the enterprise AI market: AWS through Bedrock’s multi-model platform and Microsoft through its primary cloud partnership with OpenAI.