A new version of OpenAI’s Codex is powered by a new custom chip

✨ Read this trending post from TechCrunch 📖

📂 **Category**: AI, OpenAI, Cerebras, codex, Codex-Spark

📌 **What You’ll Learn**:

On Thursday, OpenAI announced the launch of a lightweight version of GPT-5.3-Codex, the coding model it released earlier this month. The company describes GPT-5.3-Codex-Spark as a “smaller version” of that model, designed for faster inference. To support that speed, OpenAI is using a custom chip from its hardware partner Cerebras, marking a new level of integration of Cerebras hardware into OpenAI’s physical infrastructure.

The partnership between Cerebras and OpenAI was announced last month, when OpenAI said it had reached a multi-year agreement with the company worth more than $10 billion. “Integrating Cerebras into our computing solutions mix is about making our AI respond much faster,” the company said at the time. Now, OpenAI calls Spark the “first milestone” in that relationship.

Spark, which OpenAI says is built for real-time collaboration and “rapid iteration,” will be powered by Cerebras’ Wafer Scale Engine 3. The WSE-3 is Cerebras’ third-generation megachip, packed with 4 trillion transistors. OpenAI describes the lightweight model as an “everyday productivity engine” that helps users prototype quickly, rather than one for the longer, heavier tasks the original GPT-5.3-Codex was designed to handle. Spark is currently available as a research preview to ChatGPT Pro users in the Codex app.

In a tweet before the announcement, CEO Sam Altman seemed to hint at the new model. “We have something special to launch for Codex users on the Pro plan later today,” Altman tweeted. “It sparks joy for me.”

In its official statement, OpenAI emphasized that Spark is designed for the lowest possible latency in Codex. “Codex-Spark is the first step toward a Codex that operates in two complementary modes: real-time collaboration when you want fast iteration, and long-running tasks when you need deeper thinking and execution,” OpenAI shared. The company added that Cerebras chips are well suited to “workflows that require ultra-low latency.”

Cerebras has been around for more than a decade, but in the age of artificial intelligence, it has enjoyed an increasingly prominent role in the technology industry. Just last week, the company announced it had raised $1 billion in new capital at a $23 billion valuation. The company had previously announced its intention to pursue an IPO.

“What excites us most about GPT-5.3-Codex-Spark is partnering with OpenAI and the developer community to discover what fast inference makes possible: new interaction patterns, new use cases, and a completely different prototyping experience,” Sean Lie, CTO and co-founder of Cerebras, said in a statement. “This preview is just the beginning.”


🔥 **What’s your take?**
Share your thoughts in the comments below!

#️⃣ **#version #OpenAIs #Codex #powered #custom #chip**

🕒 **Posted on**: February 12, 2026

