Amazon Web Services (AWS) and OpenAI have announced a multi-year strategic partnership worth $38 billion.

The deal gives OpenAI access to AWS infrastructure to run and scale its core artificial intelligence workloads, starting immediately.

Under the agreement, which is set to grow over the next seven years, OpenAI will access AWS compute comprising hundreds of thousands of state-of-the-art NVIDIA GPUs, with the ability to expand to tens of millions of CPUs to rapidly scale agentic workloads.

AWS has experience running large-scale AI infrastructure securely and reliably, with clusters topping 500,000 chips. The companies say that AWS’s leadership in cloud infrastructure, combined with OpenAI’s pioneering advancements in generative AI, will help millions of users continue to get value from ChatGPT.

“Scaling frontier AI requires massive, reliable compute,” said OpenAI co-founder and CEO Sam Altman (pictured). “Our partnership with AWS strengthens the broad compute ecosystem that will power this next era and bring advanced AI to everyone.”

The infrastructure deployment that AWS is building for OpenAI features ‘a sophisticated architectural design optimised for maximum AI processing efficiency and performance’.

Clustering the NVIDIA GPUs – both GB200s and GB300s – via Amazon EC2 UltraServers on the same network enables low-latency communication across interconnected systems, allowing OpenAI to run its workloads efficiently.
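For readers unfamiliar with how this kind of co-located GPU clustering is requested on AWS, the sketch below shows the general pattern using a cluster placement group, which packs instances onto the same high-bandwidth network segment. It is illustrative only: the AMI, instance type, counts and region are placeholders, not details of the OpenAI deployment or of EC2 UltraServers specifically.

```python
# Illustrative sketch: launching GPU instances in a cluster placement group
# so they share a low-latency network segment. All identifiers below are
# placeholders, not details from the AWS-OpenAI deployment.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# A 'cluster' placement group co-locates instances for low-latency,
# high-throughput networking between them.
ec2.create_placement_group(GroupName="gpu-cluster-demo", Strategy="cluster")

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder AMI with GPU drivers
    InstanceType="p5.48xlarge",       # illustrative GPU instance type
    MinCount=2,
    MaxCount=2,
    Placement={"GroupName": "gpu-cluster-demo"},
)

for instance in response["Instances"]:
    print(instance["InstanceId"], instance["Placement"]["GroupName"])
```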

The clusters are designed to support various workloads, from serving inference for ChatGPT to training next generation models, with the flexibility to adapt to OpenAI’s evolving needs.


“As OpenAI continues to push the boundaries of what’s possible, AWS’s best-in-class infrastructure will serve as a backbone for their AI ambitions,” said Matt Garman, CEO of AWS. “The breadth and immediate availability of optimized compute demonstrates why AWS is uniquely positioned to support OpenAI’s vast AI workloads.”

Earlier this year, OpenAI’s open-weight foundation models became available on Amazon Bedrock – a fully managed service from AWS that provides access to a choice of foundation models through a single API.
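As a rough illustration of what "a single API" means in practice, the minimal sketch below calls a model through Bedrock's Converse API. The model ID shown is an assumption for illustration; the Bedrock model catalogue lists the exact identifiers for OpenAI's open-weight models.

```python
# Minimal sketch of invoking a foundation model via Amazon Bedrock's
# Converse API. The model ID is illustrative, not confirmed by the article.
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-west-2")

response = bedrock.converse(
    modelId="openai.gpt-oss-120b-1:0",  # assumed/illustrative model ID
    messages=[
        {"role": "user", "content": [{"text": "Summarise the key points of this report."}]}
    ],
    inferenceConfig={"maxTokens": 512, "temperature": 0.2},
)

print(response["output"]["message"]["content"][0]["text"])
```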

According to the announcement, OpenAI has quickly become one of the most popular publicly available model providers on Amazon Bedrock, with thousands of customers – including Bystreet, Comscore, Peloton, Thomson Reuters, Triomics and Verana Health – using its models for agentic workflows, coding, scientific analysis and mathematical problem-solving.
