OpenAI Jumps Out of Microsoft’s Bed, Into Amazon’s Bedrock
By Linda Park, Technology Editor
San Francisco, April 28, 2026
In a move that reshapes the competitive landscape of artificial intelligence, OpenAI has expanded beyond Microsoft’s Azure cloud, bringing its most advanced models—including the much-anticipated GPT-5.5—to Amazon Web Services’ Bedrock platform. The shift, announced today as a limited preview, marks the first time OpenAI’s frontier models will be accessible outside Microsoft’s ecosystem, giving enterprises a new way to integrate cutting-edge AI into their workflows without leaving AWS.
For years, OpenAI’s partnership with Microsoft was seen as an exclusive alliance, with the software giant investing billions to secure early access to OpenAI’s models. But the latest announcement suggests that exclusivity is no longer the priority. Instead, OpenAI is positioning itself as a more open provider, offering its models through AWS Bedrock—a managed inference and agent platform that promises enterprise-grade security, governance, and operational controls. The move could signal a broader strategy to diversify distribution channels while maintaining control over how its models are deployed.
“This is a significant pivot,” said a senior cloud analyst at Gartner, who requested anonymity to discuss the strategic implications. “OpenAI is no longer tethered to a single cloud provider. That flexibility is a win for customers who desire choice, but it also raises questions about how Microsoft will respond.”
What’s New: OpenAI Models on Amazon Bedrock
Starting today, AWS customers can access OpenAI’s latest models—including GPT-5.5 and GPT-5.4—through Amazon Bedrock in a limited preview. The models are available via the same Bedrock APIs that enterprises already use, meaning no additional infrastructure or security configurations are required. This seamless integration is a key selling point for businesses already invested in AWS’s ecosystem.
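For developers, "the same Bedrock APIs" means the standard `bedrock-runtime` Converse interface that already fronts other Bedrock-hosted models. The sketch below illustrates what an invocation could look like, assuming the existing boto3 Converse API carries over unchanged; the model identifier is a hypothetical placeholder, since AWS has not published IDs for the previewed OpenAI models.

```python
# Sketch: calling a Bedrock-hosted model through the standard Converse API.
# The model ID below is a hypothetical placeholder for the preview; real
# identifiers would come from the Bedrock model catalog once published.

HYPOTHETICAL_MODEL_ID = "openai.gpt-5-5-v1:0"  # placeholder, not a real ID

def build_converse_request(model_id: str, prompt: str, max_tokens: int = 512) -> dict:
    """Build the keyword arguments for bedrock-runtime's converse() call."""
    return {
        "modelId": model_id,
        "messages": [
            {"role": "user", "content": [{"text": prompt}]},
        ],
        "inferenceConfig": {"maxTokens": max_tokens, "temperature": 0.2},
    }

def invoke(prompt: str) -> str:
    """Send the request via boto3. Requires AWS credentials and preview
    access to the OpenAI models, so it is shown but not exercised here."""
    import boto3  # local import keeps the payload helper dependency-free
    client = boto3.client("bedrock-runtime", region_name="us-east-1")
    response = client.converse(**build_converse_request(HYPOTHETICAL_MODEL_ID, prompt))
    return response["output"]["message"]["content"][0]["text"]
```

Because the request shape is identical across Bedrock model providers, an application already calling `converse()` would, in principle, only need to swap the model ID—which is the substance of the "no additional infrastructure" claim.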

According to an official announcement from AWS, the OpenAI models on Bedrock are designed for “reasoning, agents, coding, and complex analysis.” This aligns with OpenAI’s recent focus on agentic workflows—AI systems capable of autonomously performing multi-step tasks, such as debugging code, analyzing financial data, or conducting scientific research. The models are hosted on AWS infrastructure, which AWS touts as offering “end-to-end protection from data to deployment.”
Key offerings in the preview include:
- OpenAI Models on Amazon Bedrock: Access to GPT-5.5 and GPT-5.4 for tasks like reasoning, coding, and agentic workflows.
- Codex on Amazon Bedrock: OpenAI’s coding agent, Codex, is now available for enterprise software development at scale. Codex can handle large codebases, making it a potential game-changer for development teams.
- Amazon Bedrock Managed Agents, Powered by OpenAI: A streamlined experience for building production-ready AI agents with OpenAI’s frontier models, persistent memory, and built-in security.
The limited preview suggests that AWS and OpenAI are still ironing out the details before a broader rollout. Interested customers can sign up for updates on the AWS website.
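The announcement does not detail how the managed agents will be configured, but Bedrock’s existing Converse API already supports tool definitions, the usual building block for agentic workflows. As a minimal sketch, the snippet below defines a hypothetical code-search tool in the standard Converse `toolConfig` shape; the managed-agents preview may well expose a different, higher-level interface.

```python
# Sketch: declaring a tool in Bedrock's standard Converse toolConfig shape,
# the kind of definition an agentic workflow builds on. The tool itself
# (search_codebase) is hypothetical, for illustration only.

def make_tool_spec(name: str, description: str, properties: dict, required: list) -> dict:
    """Wrap a JSON-schema input definition in the Converse toolSpec format."""
    return {
        "toolSpec": {
            "name": name,
            "description": description,
            "inputSchema": {
                "json": {
                    "type": "object",
                    "properties": properties,
                    "required": required,
                }
            },
        }
    }

# A hypothetical tool a coding agent might call while debugging a codebase.
code_search = make_tool_spec(
    name="search_codebase",
    description="Search the repository for a symbol or string.",
    properties={"query": {"type": "string", "description": "Text to search for."}},
    required=["query"],
)

# Passed to converse() alongside the messages; the model can then request
# tool invocations, which the calling application executes and feeds back.
tool_config = {"tools": [code_search]}
```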
Why This Matters: Choice and Flexibility for Enterprises
The partnership between AWS and OpenAI is a direct response to enterprise demand for flexibility. Many businesses have already standardized their operations on AWS, and the ability to access OpenAI’s models without switching clouds is a major advantage. “Enterprises want the best models for their use cases, but they also need the security, governance, and operational maturity that production workloads demand,” AWS stated in its press release.

For OpenAI, the move is a strategic expansion beyond its long-standing partnership with Microsoft. While Microsoft remains a key investor and partner, OpenAI’s decision to distribute its models through AWS suggests a desire to reach a broader audience. It also reflects the growing trend of AI providers diversifying their cloud partnerships to avoid over-reliance on a single vendor.
“This is about meeting customers where they are,” said Swami Sivasubramanian, Vice President of Data and AI at AWS, in a blog post published by OpenAI. “By bringing OpenAI’s models to Bedrock, we’re giving customers the choice to use the best AI tools on the infrastructure they already trust.”
What’s Next for OpenAI and Microsoft?
The announcement raises questions about the future of OpenAI’s relationship with Microsoft. While the two companies have collaborated closely since 2019, with Microsoft investing over $10 billion in OpenAI, the latest move suggests that OpenAI is no longer content to operate exclusively within Microsoft’s ecosystem. However, neither company has indicated that the partnership is ending. In fact, Microsoft’s Azure cloud will likely continue to play a significant role in OpenAI’s operations, particularly for research and development.
For now, the focus is on the AWS integration. The limited preview is expected to run for several months, with AWS and OpenAI gathering feedback from early adopters. If successful, the partnership could pave the way for a full-scale launch later this year, potentially including additional OpenAI models and features.
“This is just the beginning,” said an OpenAI spokesperson in a statement to World Today Journal. “We’re committed to making our models accessible to as many developers and businesses as possible, and AWS Bedrock is a critical part of that strategy.”
Who Stands to Benefit?
The AWS-OpenAI partnership is poised to benefit several key groups:
- Enterprise Developers: Teams already using AWS for cloud services can now integrate OpenAI’s models without migrating to a different platform. This reduces friction and accelerates AI adoption.
- Startups and SMBs: Smaller businesses that rely on AWS for scalability and cost-efficiency can now access OpenAI’s models without the need for extensive infrastructure investments.
- Researchers: Scientists and academics using AWS for data analysis can leverage OpenAI’s models for complex reasoning tasks, such as hypothesis generation or data interpretation.
- AI Enthusiasts: While the preview is limited to enterprise customers, the broader availability of OpenAI’s models could trickle down to consumer applications in the future.
However, the partnership is not without challenges. Enterprises will need to evaluate the cost of using OpenAI’s models on AWS, as pricing details have not yet been disclosed. Concerns about data privacy and security—particularly for industries like healthcare and finance—will need to be addressed as the preview expands.
Key Takeaways
- OpenAI’s models are now available on AWS Bedrock: GPT-5.5 and GPT-5.4 are accessible in a limited preview, marking the first time OpenAI’s models are available outside Microsoft’s ecosystem.
- No new infrastructure required: Customers can use OpenAI’s models through the same Bedrock APIs they already rely on, with unified security and governance controls.
- Focus on agentic workflows: The models are optimized for tasks like coding, reasoning, and complex analysis, with support for production-ready AI agents.
- Enterprise flexibility: The partnership gives businesses the choice to use OpenAI’s models on AWS, reducing vendor lock-in and expanding access to cutting-edge AI.
- Limited preview for now: The rollout is initially restricted to select customers, with a broader launch expected later in 2026.
What Happens Next?
AWS and OpenAI have not announced a timeline for the full launch of OpenAI’s models on Bedrock. However, the companies are expected to share updates as the limited preview progresses. Interested customers can sign up for updates on the AWS website or follow OpenAI’s official blog for the latest news.

For now, the tech world is watching closely to see how this partnership evolves—and whether Microsoft will respond with new offerings of its own.
What do you think about OpenAI’s move to AWS? Will it change how enterprises adopt AI? Share your thoughts in the comments below, and don’t forget to share this article with your network.