Obviously it’s higher. If it was any lower, they would’ve made a huge announcement out of it to prove they’re better than the competition.
Unless it wasn’t as low as they wanted it. It’s at least cheap enough to run that they can afford to drop the pricing on the API compared to their older models.
I get the distinct impression that most of the focus for GPT5 was making it easier to divert their overflowing volume of queries to less expensive routes.
I’m thinking otherwise. I think GPT5 is a much smaller model - with some fallback to previous models if required.
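For the fun of it, here's a toy sketch of what that "cheap route with fallback" idea could look like. This is pure speculation on my part, not anything OpenAI has confirmed, and every name and number in it is made up for illustration:

```python
# Hypothetical sketch only: not OpenAI's router, just the idea of sending most
# queries to a cheaper model and escalating to a bigger one when it seems unsure.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Model:
    name: str
    cost_per_1k_tokens: float                   # illustrative pricing, not real numbers
    answer: Callable[[str], tuple[str, float]]  # returns (reply, self-reported confidence)

def route(query: str, cheap: Model, big: Model, confidence_floor: float = 0.7) -> str:
    """Try the cheap model first; fall back to the big model if it seems unsure."""
    reply, confidence = cheap.answer(query)
    if confidence >= confidence_floor:
        return f"[{cheap.name}] {reply}"
    # Fallback path: pay more, (hopefully) get a better answer.
    reply, _ = big.answer(query)
    return f"[{big.name}] {reply}"

# Toy stand-ins so the sketch runs on its own.
cheap = Model("small-model", 0.1, lambda q: ("short answer", 0.55 if "hard" in q else 0.9))
big = Model("large-model", 1.0, lambda q: ("long, careful answer", 0.95))

print(route("easy question", cheap, big))   # handled by the small model
print(route("hard question", cheap, big))   # escalated to the large model
```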
Since it’s running on the exact same hardware with a mostly similar algorithm, using less energy would directly mean it’s a “less intense” model, which translates to inferior quality in American Investor Language (AIL).
And 2025’s investors don’t give a flying fuck about energy efficiency.
And they don’t want to disclose the energy efficiency becaaaause … ?
Because the AI industry is a bubble that exists to sell more GPUs and drive fossil fuel demand
Because, uhhh, whoa what’s that? ducks behind the podium
They probably wouldn’t really care how efficient it is, but they certainly would care that the costs are lower.
I’m almost sure they’re keeping that for the Earnings call.
Do they do earnings calls? They’re not public.
Probably VC money; the investors are going to want some answers.
It’s cheaper though, so very likely it’s more efficient somehow.
I believe in verifiable statements, and so far, with few exceptions, I’ve seen nothing. We’re speculating about magic numbers we can’t see, but we know that AI is demanding and that even small models aren’t free. The only accessible data comes from Mistral; most other AI devs aren’t exactly happy to share the inner workings of their tools. Even then, Mistral didn’t release all their data, and even if they had, it would only apply to Mistral 7B and above, not to ChatGPT.
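To make the "even small models are not free" point concrete, here's a rough back-of-envelope calculation. The only solid piece is the ~2·N FLOPs-per-token rule of thumb for a dense transformer forward pass; the parameter count, accelerator efficiency, and utilization are guesses of mine, so treat the result as an order-of-magnitude illustration, not data:

```python
# Back-of-envelope only: every number below is an assumption, not published data.
# Rule of thumb: a dense transformer forward pass costs ~2 * N FLOPs per token,
# where N is the parameter count.

def inference_energy_wh(params: float, tokens: int, flops_per_joule: float, utilization: float) -> float:
    """Rough energy (watt-hours) to generate `tokens` tokens with an N-parameter dense model."""
    flops = 2 * params * tokens
    joules = flops / (flops_per_joule * utilization)
    return joules / 3600  # 1 Wh = 3600 J

# Illustrative guesses (NOT measurements): a 7e9-parameter model, a 500-token reply,
# ~1e12 FLOPs per joule of accelerator efficiency, 30% utilization.
print(inference_energy_wh(7e9, 500, 1e12, 0.3))
```

The point isn't the exact output; it's that the answer swings by orders of magnitude depending on inputs none of us can verify, which is exactly why this is all speculation.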