AWS and OpenAI agree multi-year strategic partnership worth $38bn

Seven-year deal opens up massive compute resources
Matt Garman, AWS

4 November 2025

Amazon Web Services (AWS) and OpenAI have agreed a $38 billion strategic partnership that will see AWS’s infrastructure run and scale OpenAI’s core artificial intelligence (AI) workloads.

Under the seven-year agreement, OpenAI will access AWS compute comprising hundreds of thousands of Nvidia GPUs, with the ability to expand to tens of millions of CPUs to scale agentic workloads. AWS has extensive experience running large-scale AI infrastructure securely and reliably, with clusters topping 500,000 chips.

OpenAI will immediately start using AWS compute as part of this partnership, with all capacity targeted to be deployed before the end of 2026, and the ability to expand further into 2027 and beyond.

 

The infrastructure AWS is building for OpenAI is designed for AI processing efficiency and performance. Clustering the Nvidia GPUs via Amazon EC2 UltraServers on the same network provides low-latency communication between interconnected systems, allowing OpenAI to run its workloads efficiently. The clusters are designed to support a range of workloads, from serving inference for ChatGPT to training large language models.
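The announcement does not detail how this capacity is provisioned. As a rough illustration only of co-locating GPU instances on the same low-latency network segment, the sketch below uses a standard EC2 cluster placement group via boto3; the AMI ID is a placeholder and the p5.48xlarge instance type is an assumption, not something named in the announcement, and UltraServer capacity is obtained through separate mechanisms.

    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")

    # A cluster placement group packs instances onto the same
    # high-bandwidth network segment for low-latency node-to-node traffic.
    ec2.create_placement_group(GroupName="gpu-cluster-demo", Strategy="cluster")

    # Launch GPU instances into the placement group. The AMI ID is a
    # placeholder and p5.48xlarge is an assumed Nvidia GPU instance type.
    ec2.run_instances(
        ImageId="ami-0123456789abcdef0",
        InstanceType="p5.48xlarge",
        MinCount=2,
        MaxCount=2,
        Placement={"GroupName": "gpu-cluster-demo"},
    )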

“As OpenAI continues to push the boundaries of what’s possible, AWS’s best-in-class infrastructure will serve as a backbone for their AI ambitions,” said Matt Garman, CEO of AWS (pictured). “The breadth and immediate availability of optimised compute demonstrates why AWS is uniquely positioned to support OpenAI’s vast AI workloads.”

“Scaling frontier AI requires massive, reliable compute,” said OpenAI co-founder and CEO Sam Altman. “Our partnership with AWS strengthens the broad compute ecosystem that will power this next era and bring advanced AI to everyone.”

The deal builds on the companies' existing work to bring AI technology to organisations worldwide. Earlier this year, OpenAI open-weight foundation models became available on Amazon Bedrock, bringing additional model options to millions of AWS customers. OpenAI has quickly become one of the most popular publicly available model providers on Amazon Bedrock, with thousands of customers – including Bystreet, Comscore, Peloton, Thomson Reuters, Triomics, and Verana Health – using its models for agentic workflows, coding, scientific analysis, mathematical problem-solving, and more.
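By way of illustration only, a minimal sketch of calling an OpenAI open-weight model through the Amazon Bedrock runtime Converse API with boto3; the model ID and region shown are assumptions, and the exact identifiers available will depend on account access.

    import boto3

    # Bedrock runtime client; the region is an assumption for this example.
    bedrock = boto3.client("bedrock-runtime", region_name="us-west-2")

    # The model ID below is an assumed example of an OpenAI open-weight
    # model on Bedrock; check the Bedrock console for the IDs you can use.
    response = bedrock.converse(
        modelId="openai.gpt-oss-120b-1:0",
        messages=[
            {"role": "user",
             "content": [{"text": "Draft a plan for an agentic coding workflow."}]}
        ],
        inferenceConfig={"maxTokens": 512, "temperature": 0.2},
    )

    print(response["output"]["message"]["content"][0]["text"])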

Fastnet rollout

In a separate development, Amazon Web Services (AWS) announced Fastnet, a transatlantic subsea fibre-optic cable system connecting Maryland in the US with Co Cork in Ireland. When operational in 2028, Fastnet will add diversity for customers by creating a new data pathway with its own landing points, keeping services running even if other undersea cables encounter issues. The added network resilience is intended to improve global connectivity and meet rising demand for cloud computing and AI.

The Fastnet system incorporates an optical switching branching unit positioned along the cable route, which allows data to be redirected to additional landing points as network topology needs evolve. The architecture is designed to scale with growing AI traffic, letting customers expand their data demands as the system grows.

TechCentral Reporters
