OpenAI partners with AWS to bring open-weight models and Codex to Amazon Bedrock

OpenAI expanded its reach into Amazon’s cloud infrastructure through a partnership that integrated its latest models and coding capabilities into Amazon Bedrock. The integration paired OpenAI’s artificial intelligence models with AWS’s enterprise-grade infrastructure, offering developers a unified platform for building generative AI applications.

The integration centred on OpenAI’s open-weight models, specifically the gpt-oss-20b and gpt-oss-120b variants. The smaller model served use cases requiring lower latency and specialised applications, while the larger model targeted production environments and complex reasoning tasks. Both models featured a context window of 128,000 tokens, enabling developers to work with substantial amounts of information in a single interaction. The models supported text input and output modalities and could be invoked through multiple operations, including standard model invocation and batch inference capabilities.
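As a rough illustration of how such a model invocation could look, the sketch below builds a chat-style request body and shows (commented out) how it might be passed to Bedrock’s runtime client via boto3. The model identifier, message format and parameter names are assumptions, not confirmed details from the announcement.

```python
import json

# Assumed model identifier for the larger open-weight variant; the real
# Bedrock model ID may differ.
MODEL_ID = "openai.gpt-oss-120b"

def build_request(prompt: str, max_tokens: int = 512) -> dict:
    """Build a hypothetical chat-style request body for the model."""
    return {
        "messages": [{"role": "user", "content": prompt}],
        # Field name is an assumption; Bedrock providers vary here.
        "max_tokens": max_tokens,
    }

body = json.dumps(build_request("Summarise the attached meeting notes."))

# With AWS credentials configured, the invocation might look like:
# import boto3
# client = boto3.client("bedrock-runtime", region_name="us-east-1")
# response = client.invoke_model(modelId=MODEL_ID, body=body)
```

The same request body could be submitted through batch inference rather than the synchronous invocation shown here, which is where the 128,000-token context window matters for large inputs.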

The Codex coding agent formed a key component of the integration. Recent updates to Codex introduced first-class Amazon Bedrock support, allowing developers to connect existing code that uses the OpenAI or Anthropic SDKs by modifying configuration settings. The latest version of Codex CLI incorporated AWS SigV4 signing and credential-based authentication, streamlining the connection process for developers working within AWS environments.

A new Stateful Runtime Environment for Agents in Amazon Bedrock, powered by OpenAI models, emerged from the partnership. The environment was optimised for AWS infrastructure and designed for agentic workflows, incorporating state management, reliability features and governance capabilities required for production deployments. The runtime environment enabled AWS customers to access OpenAI’s intelligence while maintaining the security, scalability and cost optimisation that Amazon Bedrock provided.

The integration provided developers access to model diversity through a single unified API, removing the need to change application code when switching between different AI providers. A pay-as-you-go pricing model reduced financial barriers, allowing organisations to scale AI workloads without fixed monthly commitments. Amazon Bedrock’s managed infrastructure abstracted deployment challenges and provided built-in memory management and identity management capabilities.
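The “single unified API” claim boils down to a provider-agnostic request shape: switching models means changing an identifier, not application code. The sketch below illustrates that idea; both model IDs are placeholders, not confirmed Bedrock identifiers.

```python
# Sketch of a unified, provider-agnostic request shape. Only the model
# identifier varies between providers; the surrounding code is unchanged.

def make_request(model_id: str, prompt: str) -> dict:
    """Build one request shape reused across model providers."""
    return {
        "modelId": model_id,
        "messages": [{"role": "user", "content": prompt}],
    }

# Hypothetical identifiers for two different providers.
req_a = make_request("openai.gpt-oss-20b", "Classify this support ticket.")
req_b = make_request("anthropic.example-model", "Classify this support ticket.")
```

Because the message payload is identical in both requests, an application can route the same workload to a cheaper or faster model under pay-as-you-go pricing by swapping a single configuration value.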

The partnership announcement in February included investment commitments that underscored both companies’ confidence in the collaboration, aligning OpenAI’s artificial intelligence expertise with Amazon’s cloud infrastructure capabilities for developers building AI applications in an integrated environment.
