On August 7, 2025, OpenAI began rolling out its new GPT-5 model. Now, a few months later in February 2026, the release of GPT-5.3 “Garlic” has put GPT-4o, GPT-4.1, and o4-mini on the chopping block. I decided to quickly compile the most important things you should know ahead of the impending deprecation of these older models. Here is what the situation looks like.
You might also be interested in: ChatGPT Ads Are Rolling Out – Here Is How They Work (So Far)
Major updates:
– February 10, 2026 – Updated with new initial information on the release of GPT-5.3 “Garlic”.
Update: GPT-4o, 4.1 & o4-mini Are Set To Go on February 13

There used to be a way to get GPT-4o back in the ChatGPT web interface by going to the settings menu and selecting the “Show additional models” option, as shown in the image above. After February 13, 2026, however, these models are said to be gone from the list in the web interface, presumably for good.
Time will tell whether OpenAI will actually retire all of the legacy models entirely, or keep some of them accessible, as it did after the backlash that followed the initial removal of GPT-4o from the web interface. You will most likely still be able to reach those legacy models via the API for some time after their final deprecation.
What if the GPT-5.3 Model Isn’t Available To You Right Away?
GPT-5.3 is likely going to be rolled out to different groups of users in stages. Regardless of whether you’re on ChatGPT Free or a Plus/Pro subscription, model availability during the first days after launch will most likely depend on your region and user group.
While you’re waiting, you can go ahead and read about GPT-5.3 Codex, which has already been released alongside the promise of a standalone Codex App, similar to Google’s Antigravity platform. There is a lot to unpack there.
What About The API Access?

The list of OpenAI models available via the API is quite extensive. While o3, o4-mini, and a number of other legacy models are still on the list of models accessible through the OpenAI API, it’s reasonable to suspect that their deprecation will come sometime in the future.
It’s still reasonably safe to assume that these will not be leaving right away, mainly to give developers of LLM-driven apps time to update their software to newer versions of the OpenAI models.
Going forward, it’s reasonable to expect all of the previous-gen models to approach deprecation, although with a little bit of luck it may not happen in the next few months. To be among the first to know when API access to the legacy models is cut off, check the official OpenAI models and features deprecations history page here.
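If you’d rather check programmatically than refresh the deprecations page, here is a minimal sketch of how you might watch for legacy model IDs disappearing from the API. The `LEGACY_IDS` set and the `sample_ids` snapshot below are purely illustrative, not an official deprecation list; in real use you would fetch the current IDs with the official `openai` Python SDK (`[m.id for m in OpenAI().models.list()]`, with `OPENAI_API_KEY` set in your environment).

```python
# Sketch: check which legacy model IDs still appear in the API's model list.
# LEGACY_IDS is an illustrative watchlist, not an official OpenAI list.
LEGACY_IDS = {"gpt-4o", "gpt-4.1", "o3", "o4-mini"}

def still_available(model_ids, legacy=LEGACY_IDS):
    """Return the watched legacy model IDs still present in `model_ids`."""
    return sorted(legacy & set(model_ids))

# Hypothetical snapshot of what the models endpoint might return:
sample_ids = ["gpt-5.3", "gpt-5.3-codex", "o4-mini", "gpt-4o"]
print(still_available(sample_ids))  # ['gpt-4o', 'o4-mini']
```

Running something like this on a schedule and diffing the output against the previous run would flag the exact moment a legacy model drops off the list.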
For now, that is all that we know!
Check out also: Best GPUs For Local LLMs This Year (My Top Picks – Updated)

