
ChatGPT o3 is coming in January, but there’s still no word on GPT-5

Published Dec 23rd, 2024 3:10PM EST
OpenAI made some of ChatGPT's best features free.
Image: OpenAI


OpenAI ended its “12 Days of ChatGPT” announcements on Friday with a bang. The company unveiled o3, the next-gen reasoning model that will power ChatGPT. A smaller ChatGPT o3-mini model will also be available to users.

According to OpenAI’s presentation, the o3 models will deliver big performance boosts over their predecessors. The company is conducting safety training for the new reasoning models and taking registrations from third-party safety testers ahead of the models’ release. OpenAI plans to release o3-mini in late January, with o3 to follow.

You wouldn’t be alone if you thought Friday’s ChatGPT surprise might be OpenAI soft-launching GPT-5. However, it turns out that the big upgrade we’re waiting for is reportedly behind schedule and incurring massive costs. Therefore, o3 isn’t the GPT-5 model in disguise, but rather a precursor of that next big ChatGPT upgrade.

Sam Altman & Co. detailed the capabilities of the o3 models during a short live stream on Friday. That’s where Altman said OpenAI will launch o3-mini around the end of January, with the full o3 model to follow shortly after.

Then, The Wall Street Journal penned a detailed report about OpenAI’s struggles with GPT-5 development, indicating the o3 models are entirely different projects. It’s unclear when GPT-5 training will be complete, and there’s no release estimate for the next breakthrough ChatGPT model.

The hype around GPT-5 is real, however. The expectation is for the next genAI model to outperform GPT-4o while making fewer errors than its predecessors.

Called Orion internally, GPT-5 has been in development for 18 months. It was initially expected to drop in 2024, but OpenAI encountered unexpected delays while burning through cash. Training GPT-5 might cost up to $500 million per run, and the results so far haven’t been exciting. Training GPT-4 cost the company over $100 million, according to Altman.

One issue with the training process is the lack of data. The internet, which OpenAI and others mined for data during the training phases of previous AI models, is finite. OpenAI needs more data of better quality to train GPT-5.

That data needs to be generated by humans tasked with solving specific problems, whether in coding or math. The alternative is producing synthetic data with a reasoning model like o1.

The GPT-5 training process isn’t just generating high costs for processing all that data. It’s also time-consuming. A training run can take months and doesn’t guarantee success. If it fails, the teams have to rethink the process and restart it.

The report also details the various staffing problems OpenAI has been dealing with since Sam Altman was ousted and rehired in November 2023. Many high-ranking executives and researchers have left the company.

OpenAI has also diverted resources to other products, which might have impacted GPT-5’s development. That shift happened only after OpenAI researchers realized the Orion training runs were failing to produce the expected results.

The Journal’s report isn’t the first to say GPT-5 will be delayed. Other recent reports have said that several next-gen AI models face the same setbacks, not just GPT-5. With that in mind, it’s unclear when OpenAI will have GPT-5 ready. But, if you had any doubts, o3 isn’t GPT-5 by another name. It’s just a more advanced reasoning AI from OpenAI.

Reasoning could be the key to developing better genAI in the future. The report cites a quote from a recent TED Talk featuring OpenAI senior research scientist Noam Brown. He said that “having the bot think for just 20 seconds in a hand of poker got the same boost in performance as scaling up the model by 100,000x and training for 100,000 times longer.”

On that note, I’ll speculate that the o3 models may be what OpenAI needs to generate that additional data to train GPT-5. There’s no indication that’s what’s happening behind the scenes, however. As for OpenAI, the company is not ready to make any GPT-5 announcements.

Chris Smith, Senior Writer

Chris Smith has been covering consumer electronics ever since the iPhone revolutionized the industry in 2007. When he’s not writing about the most recent tech news for BGR, he closely follows the events in Marvel’s Cinematic Universe and other blockbuster franchises.

Outside of work, you’ll catch him streaming new movies and TV shows, or training to run his next marathon.