OpenAI Partners with Cerebras Systems in $10 Billion AI Computing Infrastructure Deal
OpenAI has signed a multi-year agreement with Cerebras Systems to deploy 750 megawatts of computing capacity, marking a major step in the company’s strategy to expand the infrastructure needed to power advanced artificial intelligence technologies.
A joint statement released on Wednesday said OpenAI will integrate Cerebras’ hardware into its computing network to shorten response times. The deployment is planned in phases extending through 2028, and sources familiar with the deal value it at over $10 billion.
Greg Brockman, OpenAI’s co-founder and president, said, “This partnership will make ChatGPT not only the most capable platform but also the fastest AI system in the world.” He added that this speed will enable a new generation of AI applications and help attract one billion new users.
Cerebras, a semiconductor startup, has developed an innovative approach built on massive wafer-scale chips that process data at unprecedented speeds, aiming to compete with market leader Nvidia. The agreement is part of a broader push by the technology sector to meet surging demand for electricity-intensive AI tools.
Earlier, Nvidia announced a $100 billion investment in OpenAI for AI infrastructure and data centers with a total power capacity of at least 10 gigawatts, while Advanced Micro Devices (AMD) revealed plans to deploy GPUs with a combined 6-gigawatt capacity for OpenAI over several years. In parallel, OpenAI is developing its own custom AI chip in collaboration with Broadcom.
OpenAI and Cerebras have been exploring collaboration opportunities since 2017. Recent deployments of Cerebras hardware running OpenAI’s GPT-OSS-120B model reportedly delivered performance 15 times faster than traditional systems. Andrew Feldman, CEO of Cerebras, emphasized that inference, the stage in which AI models respond to queries, is critical to AI development and a key differentiator of the company’s products.
