Posted July 08, 2024
Because I was wondering about the choices in the CyberPunk 2077 menu:
HDR10 scRGB (16-bit)
is higher quality; 16 is larger than 10 :slight_smile:
is floating-point; it suffers fewer visible banding artifacts
is based on Rec709 / sRGB color primaries, so it is well suited for consumer displays
HDR10 PQ (10-bit)
uses half as much memory as scRGB (see the format sketch after this list)
… can’t think of anything else good about it :stuck_out_tongue:
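If you are curious what those two choices actually look like at the API level, here is a minimal sketch in D3D terms. The function names and the `swapchain` / `width` / `height` parameters are placeholders I made up, but the DXGI formats and color-space enums are the real ones; error handling and releasing outstanding back-buffer references are omitted.

```cpp
#include <dxgi1_4.h>

// "HDR10 scRGB (16-bit)": FP16 back buffer + linear (G10) scRGB color space
void UseScRGB (IDXGISwapChain3 *swapchain, UINT width, UINT height)
{
  swapchain->ResizeBuffers  (0, width, height,
                             DXGI_FORMAT_R16G16B16A16_FLOAT, 0);        // 64 bpp
  swapchain->SetColorSpace1 (DXGI_COLOR_SPACE_RGB_FULL_G10_NONE_P709);  // no gamma
}

// "HDR10 PQ (10-bit)": 10-bit back buffer + PQ (G2084) / Rec2020 color space
void UseHDR10 (IDXGISwapChain3 *swapchain, UINT width, UINT height)
{
  swapchain->ResizeBuffers  (0, width, height,
                             DXGI_FORMAT_R10G10B10A2_UNORM, 0);           // 32 bpp
  swapchain->SetColorSpace1 (DXGI_COLOR_SPACE_RGB_FULL_G2084_NONE_P2020); // PQ gamma
}
```

The FP16 back buffer is what doubles the memory footprint, and the color-space enum is what tells Windows whether PQ has already been applied.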
The PQ in this mode refers to Perceptual Quantization; it is a form of gamma (transfer function) used in HDR.
Both of these formats eventually have PQ applied to them.
“HDR10 PQ” pre-encodes the image using PQ gamma
“HDR10 scRGB” renders the image without gamma (linear) and the driver applies PQ for you (the curve itself is sketched below)
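For the curious, the PQ “gamma” is just a fixed curve defined by SMPTE ST 2084. Here is a minimal sketch of the encode side; the constants come straight from the spec, and `PQEncode` is simply a name I picked.

```cpp
#include <cmath>

// SMPTE ST 2084 (PQ) encode: linear light, normalized so 1.0 = 10,000 nits,
// -> non-linear 0..1 signal ready to be quantized to 10-bit.
float PQEncode (float Y)
{
  const float m1 = 2610.0f / 16384.0f;           // 0.1593017578125
  const float m2 = 2523.0f /  4096.0f * 128.0f;  // 78.84375
  const float c1 = 3424.0f /  4096.0f;           // 0.8359375
  const float c2 = 2413.0f /  4096.0f *  32.0f;  // 18.8515625
  const float c3 = 2392.0f /  4096.0f *  32.0f;  // 18.6875

  float Ym1 = std::pow (Y, m1);
  return std::pow ((c1 + c2 * Ym1) / (1.0f + c3 * Ym1), m2);
}
```

In the “HDR10 PQ” mode the game runs something like this itself before presenting; in the scRGB mode the driver / OS compositor does it at the very end.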
In the end, the 16-bit scRGB HDR mode should be your go-to for higher quality. It might incur a couple % more GPU load than the inferior 10-bit mode.
Technically, scRGB gets converted to HDR10 before your display ever sees a signal, but results are better when that final conversion goes from 16-bit color down to 10-bit rather than rendering at 10-bit the whole way through.
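To make that last step concrete, here is a rough sketch of the scRGB -> HDR10 conversion the driver performs, reusing `PQEncode` from the sketch above. The helper names are made up, the Rec709 -> Rec2020 matrix values are approximate, and I am assuming the nominal mapping of 1.0 in scRGB to 80 nits; the point is only that the precision loss happens once, at the final 10-bit quantization.

```cpp
#include <algorithm>
#include <cmath>
#include <cstdint>

float PQEncode (float Y);  // ST 2084 encode, see the sketch above

struct RGB { float r, g, b; };

// Approximate Rec709 -> Rec2020 primaries conversion (linear light)
static RGB Rec709ToRec2020 (RGB c)
{
  return { 0.6274f * c.r + 0.3293f * c.g + 0.0433f * c.b,
           0.0691f * c.r + 0.9195f * c.g + 0.0114f * c.b,
           0.0164f * c.r + 0.0880f * c.g + 0.8956f * c.b };
}

// Linear scRGB component (1.0 = 80 nits) -> 10-bit PQ code value (0..1023)
static uint16_t ToPQ10 (float linear)
{
  float Y = std::clamp (linear * 80.0f / 10000.0f, 0.0f, 1.0f);
  return  static_cast <uint16_t> (std::round (PQEncode (Y) * 1023.0f));
}

// One FP16 scRGB pixel -> one HDR10 (10-bit PQ / Rec2020) pixel
void EncodeHDR10 (RGB scrgb, uint16_t out [3])
{
  RGB c  = Rec709ToRec2020 (scrgb);
  out[0] = ToPQ10 (c.r);
  out[1] = ToPQ10 (c.g);
  out[2] = ToPQ10 (c.b);
}
```

Doing that quantization once, from a 16-bit floating-point image, is exactly why the scRGB path tends to band less.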