
Gamma-correction

Started by Bonefest November 06, 2020 02:20 PM
6 comments, last by Bonefest 3 years, 10 months ago

Hi there, I'm struggling to understand why we need gamma correction. Here is what I understand so far:

  • We load a texture from a file (e.g. in JPEG format); it is almost always pre-corrected (??) - color (0.5, 0.5, 0.5) is stored as (0.72, 0.72, 0.72) (power 1/2.2)
  • Inside a fragment shader we need to convert our texture from non-linear space back to linear - (0.72, 0.72, 0.72) is converted back to (0.5, 0.5, 0.5) (power 2.2)
  • After all calculations are done, the color should be transformed back to non-linear space - (0.5, 0.5, 0.5) (or whatever the final color is) is converted back to (0.72, 0.72, 0.72) (power 1/2.2)
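In code terms, those three steps look roughly like this (a minimal C++ sketch for illustration; it assumes the common pure-power 2.2 approximation, whereas the real sRGB transfer function is a slightly different piecewise curve):

// gamma_approx.cpp - illustrative only, 2.2 power approximation of sRGB
#include <cmath>

// Step 2: decode a stored (non-linear) texel back to linear light.
float srgb_to_linear(float c) { return std::pow(c, 2.2f); }        // 0.72 -> ~0.5

// Step 3: after the lighting math, encode linear back for display.
float linear_to_srgb(float c) { return std::pow(c, 1.0f / 2.2f); } // 0.5 -> ~0.72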

But there are two more steps:

  • Monitor transforms color (almost) back - color (0.72, 0.72, 0.72) is converted again to (0.5, 0.5, 0.5)
  • Our eyes - color (0.5, 0.5, 0.5) is converted AGAIN? to (0.72, 0.72, 0.72)

The question is, why do formats like JPEG convert color by raising it to the (1/2.2) power? Isn't our task to see (0.5, 0.5, 0.5) (gray) and not (0.72, 0.72, 0.72) (light gray)?

Another question: if I have a texture and I don't know whether the first step was applied (i.e. raising to the (1/2.2) power) or not, should I apply the conversion to linear space? I'm using DirectX and WICTextureLoader, but it doesn't load my image in a *_SRGB format. I have an option to force it to do so, but I'm not sure whether that is correct.

Bonefest said:
But there are two more steps: Monitor transforms color (almost) back - color (0.72, 0.72, 0.72) is converted again to (0.5, 0.5, 0.5). Our eyes - color (0.5, 0.5, 0.5) is converted AGAIN? to (0.72, 0.72, 0.72)

No, monitors natively display sRGB images correctly and our eyes do not do sRGB color conversion.

Also, I'm not sure if it's the monitor that natively outputs sRGB or the GPU. My notes are from 2013 when I was learning Maya.

🙂🙂🙂🙂🙂 ← The tone posse, ready for action.


@fleabay Thank you for your reply. This article https://www.cambridgeincolour.com/tutorials/gamma-correction.htm starts with "Our eyes do not perceive light the way cameras do. With a digital camera, when twice the number of photons hit the sensor, it receives twice the signal (a "linear" relationship). Pretty logical, right? That's not how our eyes work. Instead, we perceive twice the light as being only a fraction brighter — and increasingly so for higher light intensities (a "nonlinear" relationship)." Could you please rephrase it?


And what do you mean by “monitors natively display sRGB images correctly”? Does that mean that they convert colors by raising them to the power 2.2, or is there something else?

why do formats like JPEG convert color by raising it to the (1/2.2) power? Isn't our task to see (0.5, 0.5, 0.5) (gray) and not (0.72, 0.72, 0.72) (light gray)?

that's because historically monitors (CRT monitors) used to display images with gamma. Simply put, a CRT's light output is not proportional to its input voltage: doubling the voltage does not double the brightness. Instead, brightness increases non-linearly, roughly as the input raised to the power 2.2. Conveniently, that 2.2 curve is (nearly) the inverse of the way humans perceive brightness, so encoding images with the opposite (1/2.2) curve spends more of the limited code values on the darker shades human eyes distinguish best.

So the JPEG format encodes - or if you prefer, stores - the picture data gamma-corrected: JPEG applies the inverse gamma (pow(1.0/2.2)) to each colour value, making the picture brighter when it saves, because when this picture is displayed on a CRT monitor, the monitor will automatically apply its own gamma and undo the correction. This way we get to see the picture at the colour values the “artist/author” intended.

The other way to put all this is:

Let's say JPEG didn't do anything to our picture.

If the author creates a picture with colour value (0.5, …), that means this is what the artist wants us to see: (0.5, …).

And when the monitor applies gamma (pow(2.2)) to this colour value, 0.5 becomes 0.22, so you see this would be a darker value than 0.5 (not what the author wants us to see).

Therefore JPEG gamma-corrects 0.5 and stores it as 0.72 (pow(1.0/2.2)); this is a brighter value than 0.5;

so finally when the monitor displays this, it applies gamma to 0.72 and therefore drops it to 0.5, which is what the artist wants us to see.
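Here is that round trip as a tiny C++ sketch (illustrative only; again assuming the simple 2.2 power model):

// gamma_roundtrip.cpp
#include <cmath>
#include <cstdio>

int main()
{
    float authored    = 0.5f;                             // what the artist wants us to see
    float stored      = std::pow(authored, 1.0f / 2.2f);  // JPEG encode: ~0.7297
    float displayed   = std::pow(stored, 2.2f);           // monitor gamma: back to ~0.5
    float uncorrected = std::pow(authored, 2.2f);         // no encoding: ~0.2176 (too dark)
    std::printf("stored=%.4f displayed=%.4f uncorrected=%.4f\n",
                stored, displayed, uncorrected);
    return 0;
}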

Another question: if I have a texture and I don't know whether the first step was applied (i.e. raising to the (1/2.2) power) or not, should I apply the conversion to linear space? I'm using DirectX and WICTextureLoader, but it doesn't load my image in a *_SRGB format. I have an option to force it to do so, but I'm not sure whether that is correct.

If you are coding a game, you should offer the user/player an option in your game's graphics settings menu:

  • Gamma: Yes or No

I think you have seen this in some games, right?

This way, you can load your textures and apply gamma correction to them (or not) depending on the user's choice.
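If you go that route, one way to wire the menu choice into texture loading is to pick the load flag accordingly (a sketch, assuming your DirectXTK version exposes the WIC_LOADER_FORCE_SRGB / WIC_LOADER_IGNORE_SRGB flags; userWantsGamma is a hypothetical setting read from your menu):

// hypothetical: userWantsGamma comes from the game's graphics settings
DirectX::WIC_LOADER_FLAGS loadFlags = userWantsGamma
    ? DirectX::WIC_LOADER_FORCE_SRGB    // GPU decodes sRGB -> linear when sampling
    : DirectX::WIC_LOADER_IGNORE_SRGB;  // texels are treated as already linear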

Another way to do this is:

  • you load your textures without applying gamma correction
  • render your scene, but in the final pixel shader apply the gamma correction:
// pseudo-HLSL
float4 pixel_shader( /* inputs */ ) : SV_Target
{
    // ... lighting / texturing, all in linear space ...
    float4 final_col = /* bla bla bla: result of your calculations */;

    // encode back to non-linear (display) space before writing out
    final_col.rgb = pow(final_col.rgb, 1.0 / gamma); // <-- do gamma correction here (gamma ~= 2.2)
    return final_col;
}

You can also use this method to work out whether gamma correction was already applied by WICTextureLoader (or any other loader), but I leave that to you as an exercise; dinner is calling me.
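For the WICTextureLoader part specifically, here is roughly how forcing an sRGB view looks with DirectXTK (a sketch; check the exact signature in your DirectXTK version, since older releases take a bool forceSRGB instead of load flags, and "albedo.jpg" is just a placeholder name):

#include <WICTextureLoader.h>

HRESULT LoadSrgbTexture(ID3D11Device* device, ID3D11ShaderResourceView** srv)
{
    // WIC_LOADER_FORCE_SRGB asks the loader for a *_SRGB format, so the
    // GPU converts sRGB -> linear automatically whenever the texture is
    // sampled, and the shader no longer needs a manual pow(col, 2.2) decode.
    return DirectX::CreateWICTextureFromFileEx(
        device, L"albedo.jpg", 0,
        D3D11_USAGE_DEFAULT, D3D11_BIND_SHADER_RESOURCE, 0, 0,
        DirectX::WIC_LOADER_FORCE_SRGB,
        nullptr, srv);
}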

That's it … all the best.

@ddlox Thank you for your detailed answer. But I still cannot understand one thing: can all these talks about how our eye sees a color (I mean that it “applies” raising to the power 1.0/2.2) be safely omitted? Because otherwise, if we want to see the color (0.5, 0.5, 0.5) on our screen, we would not need to raise it to the power (1.0/2.2): we simply output (0.5, 0.5, 0.5), our screen makes it less bright so it becomes (0.22, 0.22, 0.22), but our eye perceives (0.5, 0.5, 0.5)? Or is our task not to perceive (0.5, 0.5, 0.5)? (Well, it seems like a probable answer!)

But I still cannot understand one thing: can all these talks about how our eye sees a color (I mean that it “applies” raising to the power 1.0/2.2) be safely omitted?

that is correct;

our eyes do not apply gamma inversion because our eyes “recognize” or “have the ability to recognize” what they see onscreen. Eyes can differentiate more shades of light and dark values than a camera can. (That is why cameras need flashes and more lighting equipment; our eyes don't need all that equipment to tell the difference.)

do not confuse this with colour range recognition: the ability of the eyes to tell subtle differences within one HUE of red, for example (but that's a different topic altogether)

also do not confuse all this with the light spectrum (infra-red, lasers etc…); that is also a matter for another topic.

… our screen makes it less bright so it becomes (0.22, 0.22, 0.22), but our eye perceives (0.5, 0.5, 0.5)?

that is incorrect: our eyes would also perceive 0.22 (and that is why gamma correction is needed on your texture or your game output if you want to show 0.5 to your eyes)

Or is our task not to perceive (0.5, 0.5, 0.5)? (Well, it seems like a probable answer!)

And there you go, that's the point.


@ddlox Thanks a lot! Your explanations finally helped me solve this tangled puzzle. Now everything looks clear to me :)

This topic is closed to new replies.
