OpenAI may release GPT-4.5 instead of GPT-5 this year: here's why

While many expect OpenAI to release GPT-5 in 2024, some analysts argue that it won't happen. They point to computational power as the reason.

While GPT-4 currently sits at the top of the generative AI landscape, competitors such as Anthropic’s Claude and Meta’s open-source Llama are improving by the day, so OpenAI needs new models to hold its position. Many expect the company to release GPT-5 in 2024, but some analysts argue that it won’t happen.

Moving from GPT-4 to GPT-5 would require roughly 100 times more compute

Dan Hendrycks, Director of the Center for AI Safety, argues that OpenAI won’t move to GPT-5 this year because each half-step version of its large language model (LLM) has required roughly 10 times more compute than the last. By that scaling, skipping GPT-4.5 and jumping straight to GPT-5 would mean an approximately 100-fold increase in compute over GPT-4, which would require around 1 million H100 chips running uninterrupted for three months. OpenAI may not currently have that level of computational power.

Dan Hendrycks’s tweet:

GPT-5 doesn’t seem likely to be released this year. Ever since GPT-1, the difference between GPT-n and GPT-n+0.5 is ~10x in compute. That would mean GPT-5 would have around ~100x the compute of GPT-4, or 3 months of ~1 million H100s. I doubt OpenAI has a 1 million GPU server ready.
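Hendrycks’s arithmetic can be sanity-checked with a rough back-of-envelope calculation. The sketch below is only an illustration: the GPT-4 training-compute figure (~2×10^25 FLOPs, a commonly cited outside estimate), the H100 throughput, and the utilization rate are assumptions, not figures confirmed by OpenAI.

```python
# Back-of-envelope check of the "~1 million H100s for three months" claim.
# All inputs are outside estimates or assumptions, not official OpenAI figures.

GPT4_TRAIN_FLOPS = 2.1e25   # commonly cited external estimate of GPT-4 training compute
SCALE_FACTOR = 100          # Hendrycks: two ~10x half-steps from GPT-4 to GPT-5

H100_PEAK_FLOPS = 9.9e14    # ~990 TFLOPS dense BF16 per H100 (data-sheet figure)
UTILIZATION = 0.4           # assumed model-FLOPs utilization during training
TRAINING_SECONDS = 90 * 86_400  # "three months" of uninterrupted training

gpt5_flops = GPT4_TRAIN_FLOPS * SCALE_FACTOR
flops_per_gpu = H100_PEAK_FLOPS * UTILIZATION * TRAINING_SECONDS
gpus_needed = gpt5_flops / flops_per_gpu

print(f"Assumed GPT-5 training compute: {gpt5_flops:.1e} FLOPs")
print(f"Useful FLOPs per H100 over 90 days: {flops_per_gpu:.1e}")
print(f"Implied H100 count: {gpus_needed:,.0f}")  # several hundred thousand GPUs
```

Under these assumptions the script lands in the high hundreds of thousands of GPUs, the same order of magnitude as Hendrycks’s "~1 million H100s" figure.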

This comment aligns with remarks from Dario Amodei, CEO of Anthropic, who has said that training a state-of-the-art language model currently costs approximately $1 billion, and that he expects this cost to rise to between $5 billion and $10 billion by 2025-2026.

Recently, NVIDIA announced that the H100 units deployed this year are expected to consume around 13,000 GWh of electricity annually, roughly the yearly electricity consumption of a country like Lithuania or Guatemala. By 2027, global power consumption by data centers is anticipated to reach 85 to 134 TWh per year.
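For scale, the 13,000 GWh figure can be cross-checked with the same kind of rough arithmetic. In the sketch below, the 700 W per-GPU draw (the H100 SXM rated maximum) and round-the-clock operation are simplifying assumptions.

```python
# Rough cross-check of the 13,000 GWh/year estimate for this year's H100 fleet.
# The 700 W draw (H100 SXM rated maximum) and 24/7 operation are assumptions.

ANNUAL_ENERGY_GWH = 13_000  # reported annual consumption estimate
WATTS_PER_H100 = 700        # rated maximum board power, SXM form factor
HOURS_PER_YEAR = 8_760

gwh_per_gpu = WATTS_PER_H100 * HOURS_PER_YEAR / 1e9  # Wh -> GWh
implied_units = ANNUAL_ENERGY_GWH / gwh_per_gpu

print(f"Energy per H100 per year: {gwh_per_gpu * 1_000:.2f} MWh")  # ~6 MWh
print(f"Implied H100 fleet size: {implied_units:,.0f}")            # ~2 million units
print(f"{ANNUAL_ENERGY_GWH} GWh = {ANNUAL_ENERGY_GWH / 1_000:.0f} TWh, "
      f"vs. 85-134 TWh projected for data centers by 2027")
```

At those assumed figures, 13,000 GWh corresponds to roughly 2 million H100s running continuously, and amounts to 13 TWh, already a sizable share of the 85-134 TWh projected for data centers by 2027.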

Some experts also believe that GPT-5 may require changes to the original training recipe, which relied on poorly curated human conversation data and a fairly naive training process. Rather than overhauling everything to ship GPT-5 this year, OpenAI is therefore seen as more likely to roll out GPT-4.5 gradually.
