“It’s a bit technical,” begins Birdwell, “but the simple version is that graphics cards at the time always stored RGB textures and even displayed everything as non-linear intensities, meaning that an 8-bit RGB value of 128 encodes a pixel that’s about 22% as bright as a value of 255, but the graphics hardware was doing lighting calculations as though everything was linear.”
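To make the numbers concrete, here's my own sketch (not from the interview), assuming a simple power-law gamma of 2.2, which is close to the sRGB curve. It shows why 128 decodes to roughly 22% of the light of 255, and why doing "half brightness" directly on the encoded byte gives the wrong answer:

```python
GAMMA = 2.2  # assumed display gamma, roughly matching sRGB

def decode(v8: int) -> float:
    """8-bit gamma-encoded value -> linear light in [0, 1]."""
    return (v8 / 255.0) ** GAMMA

def encode(linear: float) -> int:
    """Linear light in [0, 1] -> 8-bit gamma-encoded value."""
    return round(255.0 * linear ** (1.0 / GAMMA))

# An encoded value of 128 is only ~22% as much light as 255:
print(decode(128))                     # ~0.22

# "Half brightness" computed naively on the encoded byte:
naive = 128 // 2                       # 64, which decodes to only ~5% light
# "Half brightness" computed in linear light, then re-encoded:
correct = encode(decode(128) * 0.5)    # ~93
print(decode(naive), decode(correct))  # ~0.048 vs ~0.109
```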
Jesus Christ, I knew this was a problem with image editing software back then, but I never knew that GPU manufacturers fucked it up as well. How did this happen?
I have a good guess at how this actually happened:
PM: We need this
Specialist: makes this (doesn’t check results)
QC: Looks good (but doesn’t actually check)
Later updates may break the functionality further. And as long as the numbers aren’t blatantly wrong (think 0s everywhere) and nobody checks thoroughly enough, the issue will remain.
I unfortunately have experience of being part of such a story, haha. There are ways to counter it. Mainly, their project documentation either wasn’t up to par or wasn’t used as a reference during development and testing. Either way, it’s negligence.
I imagine that in the case of GPU design there should be a bunch of tests for image correctness at some point, which would require pixel-perfect reproduction to pass.
But it’s plausible that tests were running incorrect math too.
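Rough illustration of that failure mode (entirely hypothetical code, not from any actual GPU test suite): a pixel-perfect golden test only proves the output matches the stored reference, so if the reference was generated with the same wrong math, the test passes anyway.

```python
def render_half_brightness(pixels: list[int]) -> list[int]:
    # Buggy "renderer": halves gamma-encoded bytes as if they were linear.
    return [p // 2 for p in pixels]

def golden_test(output: list[int], reference: list[int]) -> bool:
    # Pixel-perfect comparison against a stored reference image.
    return output == reference

frame = [255, 200, 128, 64]
reference = [127, 100, 64, 32]  # reference produced by the same buggy math

# Passes, even though the brightness math is wrong.
assert golden_test(render_half_brightness(frame), reference)
```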
Because back in the day people used RGB without a spec (even though sRGB has existed since 1996). RGB on its own doesn’t mean anything, so you get random shit on screen. Any variant of RGB (sRGB, Adobe RGB, unspecified RGB) should only be used for presentation, not for composition. Composition should be done in a colour space that is luma-aware.
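A quick sketch of what goes wrong otherwise (my own example, using the standard sRGB transfer functions; here the working space is linear light, which is one common choice): averaging raw sRGB bytes for a 50/50 blend of black and white gives 128, which is far too dark, while averaging in linear light and re-encoding gives ~188.

```python
def srgb_to_linear(v8: int) -> float:
    """sRGB-encoded 8-bit value -> linear light in [0, 1]."""
    c = v8 / 255.0
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(lin: float) -> int:
    """Linear light in [0, 1] -> sRGB-encoded 8-bit value."""
    c = lin * 12.92 if lin <= 0.0031308 else 1.055 * lin ** (1 / 2.4) - 0.055
    return round(255.0 * c)

black, white = 0, 255

# Naive 50/50 blend on encoded bytes -> 128 (only ~22% of the light).
naive = (black + white) // 2

# Blend in linear light, then re-encode -> ~188 (actually 50% of the light).
linear_mix = (srgb_to_linear(black) + srgb_to_linear(white)) / 2
correct = linear_to_srgb(linear_mix)

print(naive, correct)  # 128 vs 188
```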
“They are just human after aaaall 🎶”
Laziness
Linear is probably a lot faster?