Posted May 20, 2015
There is a reasonably good solution, but I need to rant about this first:
Colour management is all about making an image appear identical across all monitors. In Windows this is done via ICC profiles. The profile can be a built-in one, or you can create a new one with a colorimeter, which measures the exact colour capabilities of your monitor.
The reason this is needed is that most monitors have differing backlights and colour filters, so the colours displayed at default settings vary considerably from monitor to monitor. In a game like The Witcher, the picture will look different depending on your display; it might have a slight green tint, or a blue one.
Creating an ICC colour profile lets you map out the characteristics of your display and so build a colour space describing its entire colour range (as long as you have a colorimeter). This matters because colours are made of three channels, RGB (Red, Green, Blue), each with 256 shades on an 8-bit display.
Now, there are several industry-standard colour spaces out there (sRGB and AdobeRGB are the best known). Once your display's colour range has been mapped, a colour-managed program opening an image encoded in sRGB will consult your custom profile and remap each colour to the correct place within your monitor's colour space.
For example, let's say you have a monitor whose colour space is slightly larger than sRGB. That just means its RGB channels produce stronger colours than sRGB requires. So when we select the brightest red, shade 255, we find the colour is 'redder' than required; when we open an image encoded in sRGB, the system looks at the custom monitor profile and goes, "Oh, I see, I need to choose a red level of 245 to get the colour I actually need."
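To make that concrete, here's a minimal Python sketch of the per-channel remap, using the 245/255 figure from the example above and assuming (purely for illustration) that the correction is a simple linear scale. A real ICC transform goes through a profile connection space and accounts for the tone curve, so treat this as a cartoon of the idea, not how colour management is actually implemented:

```python
# Illustrative only: assumes a hypothetical wide-gamut monitor where
# sRGB's brightest red (level 255) lands at monitor level 245, and
# pretends the correction is a plain linear scale (real ICC transforms
# work on linearised values via a profile connection space).

SRGB_MAX = 255                    # brightest 8-bit sRGB shade
MONITOR_LEVEL_FOR_SRGB_MAX = 245  # assumed landing point on this monitor

def remap_red(srgb_red: int) -> int:
    """Map an 8-bit sRGB red level into this monitor's red range."""
    return round(srgb_red * MONITOR_LEVEL_FOR_SRGB_MAX / SRGB_MAX)

print(remap_red(255))  # 245 -- the monitor never uses its strongest red
print(remap_red(128))  # 123 -- mid-tones shift down proportionally
```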
The same is done for the other channels, so when you look at that image it appears identical on your display, even though your display can produce stronger colours. This is where 6-bit, 8-bit and 10-bit displays come in: when you remap the colour range onto a smaller set of values, you effectively cut out the shades above the selected maximum.
With 6-bit and 8-bit monitors (64 and 256 shades per colour), you end up with larger steps between colours, because sRGB's brightest shade 255 now lands on monitor shade 245. Squeezing 256 sRGB shades into 246 monitor shades means some monitor shades must do double duty, and that is what causes banding (for example, shade 21 gets mapped to shade 19, and shade 20 might also get mapped to shade 19, because 19 is still closer to the correct colour than shade 18).
With a 10-bit display we have much finer granularity (1024 shades per channel), so the banding is greatly reduced when sRGB colours are mapped into the monitor's colour space.
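Under the same assumed linear 245/255 compression as the sketch above, a quick count shows why the 8-bit case bands and the 10-bit case doesn't: 256 sRGB inputs squeezed into 246 output levels force some inputs to share a shade, while roughly 984 output levels give every input its own shade. The helper below is hypothetical, just to make the counting visible:

```python
# Count how many distinct output shades survive the remap, under the
# same illustrative 245/255 linear compression as the sketch above.

SCALE = 245 / 255  # assumed gamut-compression factor

def distinct_outputs(input_levels: int, output_max: int) -> int:
    """Map inputs 0..input_levels-1 onto 0..output_max and count survivors."""
    return len({round(i / (input_levels - 1) * output_max)
                for i in range(input_levels)})

# 8-bit panel: 256 sRGB shades squeezed into levels 0..245
print(distinct_outputs(256, round(255 * SCALE)))   # 246 -- ten levels carry two input shades each: banding

# 10-bit panel: the same 256 shades spread over levels 0..983
print(distinct_outputs(256, round(1023 * SCALE)))  # 256 -- every input shade stays distinct
```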
Done correctly, this process allows the textures in a game like The Witcher 3 to appear EXACTLY as the artists created them. That would be the ideal, but it doesn't often happen.
Now, most people will see no benefit from a game being colour-aware, because most people don't calibrate their displays. But why does The Witcher 3 ACTIVELY IGNORE YOUR COLOUR PROFILE? It makes me angry >:(
Don't let a game change the gamma curve. The game has no idea what monitor I use, so overriding my calibration will ALWAYS be incorrect.
In true fullscreen mode my game has a pinkish tinge that looks terrible (I have a wide-gamut display, which brings its own problems to the table, but it is calibrated correctly).
Now, the workaround is "Borderless Window" mode, which lets the game use my carefully calibrated colour profile, but I lose vsync. That is a compromise I prefer not to make, as I do occasionally notice stuttering and tearing (horizontal lines) without vsync.
DEVELOPERS: NEVER EVER CHANGE THE GAMMA SETTINGS IN YOUR GAME. Create brightness and contrast controls, but please don't touch the gamma curves. If your artists calibrated their monitors, wouldn't it be better if players with properly calibrated monitors could see the game exactly as its creators do?
Post edited May 20, 2015 by Jamie.monro