The multiyear partnership between OpenAI and AWS, unveiled Friday, elevated the cloud giant’s profile as a strategic platform amid enterprise efforts to deploy agentic AI.
Amazon said it will invest $50 billion in the large language model provider to accelerate enterprise AI innovation. OpenAI also unveiled an additional $60 billion in new investments from SoftBank and Nvidia, bringing the total to $110 billion.
As part of the Amazon deal, AWS will be the exclusive cloud distributor of OpenAI Frontier, an enterprise platform for building and deploying AI agents launched earlier this month.
AWS and OpenAI also plan to create a Stateful Runtime Environment powered by OpenAI models and available through Amazon Bedrock — an AWS service providing API access to foundation models. The platform will let developers “build generative AI applications and agents at production scale,” according to the press release.
The new platform will retain the context of developers’ previous work and is designed to manage ongoing projects and workflows. It’s expected to launch in the next few months and will be integrated with Amazon Bedrock AgentCore, allowing customer AI applications and agents to operate cohesively with the rest of their AWS environment, the release said.
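For readers unfamiliar with the Bedrock pattern described above, the sketch below shows the general shape of invoking a foundation model through Bedrock’s runtime API with the AWS boto3 SDK. The model ID and request-body schema are illustrative assumptions (schemas vary by model provider), and nothing here reflects the unreleased Stateful Runtime Environment; only the generic invoke pattern is shown.

```python
# Minimal sketch of calling a foundation model via Amazon Bedrock's
# runtime API. The model ID is a placeholder for illustration; a real
# call requires AWS credentials and Bedrock model access.
import json


def build_invoke_body(prompt: str, max_tokens: int = 256) -> str:
    """Assemble a JSON request body for a model invocation.

    The exact schema differs per model provider; this generic shape
    is assumed for illustration only.
    """
    return json.dumps({
        "prompt": prompt,
        "max_tokens": max_tokens,
    })


def invoke_model(prompt: str, model_id: str = "example.model-v1") -> dict:
    # boto3 is imported inside the function so the payload helper
    # above remains usable without the AWS SDK installed.
    import boto3

    client = boto3.client("bedrock-runtime")
    response = client.invoke_model(
        modelId=model_id,
        contentType="application/json",
        body=build_invoke_body(prompt),
    )
    # Bedrock returns the model output as a streaming body of JSON.
    return json.loads(response["body"].read())
```

The point of the abstraction is that applications talk to one runtime endpoint regardless of which underlying model they select, which is what would let OpenAI-powered capabilities slot in alongside other Bedrock models.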
“This move clearly puts AWS as a cloud platform back in the conversation for AI with OpenAI’s Frontier capabilities and also its own Bedrock AgentCore tools and all the additional services they’re building on top of it,” Gartner Distinguished VP Analyst Jason Wong told CIO Dive.
What this means for CIOs
In the push to embed AI across operations, CIOs have looked closely at offerings from their own cloud providers. Hyperscalers gained an advantage with existing enterprise relationships and moved to rapidly expand their menu of AI offerings.
AWS’ exclusive operation of OpenAI Frontier is a “gamechanger” from a cloud competition perspective, Wong said.
Microsoft and OpenAI’s relationship underwent changes last year, leaving the hyperscaler with a 27% stake in the model provider and room for both companies to establish partnerships and develop AI products with outside parties.
While Microsoft maintains exclusivity in terms of OpenAI model APIs, the Frontier technology stack includes governance, identity management, observability and context for agents, all of which enterprises want, Wong said.
“The ability for AWS to claim exclusivity on this layer, on top of Bedrock, with their own technology — AgentCore — supporting it as well, is a significant advantage in the enterprise,” Wong said. “CIOs really need to consider how AWS could become a strategic platform for agentic AI in this combination.”
The partnership positions AWS to better compete in AI with Microsoft Azure and Google Cloud offerings, said Forrester Principal Analyst Lee Sustar. Amazon maintains the largest market share among cloud providers with 28%, followed by Microsoft at 21% and Google at 14%, according to Synergy Research Group.
It also sets Amazon up to compete with Nvidia.
AWS is expanding its existing $38 billion multiyear agreement with OpenAI by $100 billion over eight years. In exchange, the model developer commits to consuming 2 gigawatts of AWS Trainium3 and Trainium4 chip capacity through AWS infrastructure, supporting demand for the Stateful Runtime Environment, Frontier and other advanced workloads, according to the release.
OpenAI gains long-term compute capacity “while working with AWS to deploy purpose-built silicon alongside its broader compute ecosystem,” the release said.
“The wider $100 billion deal that involves running OpenAI on AWS’ Trainium chips helps fund Amazon’s long-term effort to build a rival GPU ecosystem alternative to Nvidia,” Sustar said in an email to CIO Dive.