Zimerius: Because I was wondering about the choices in the Cyberpunk 2077 menu

HDR10 scRGB (16-bit)
is higher quality; 16 is larger than 10 :slight_smile:
is floating-point; it shows fewer visible banding artifacts
is based on Rec. 709 / sRGB color primaries, so it is well suited to consumer displays (see the sketch after this list)
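
To make those bullet points concrete, here is a minimal C++ sketch of what scRGB values actually mean. The 80-nit level for scRGB 1.0 is part of the format's definition; the helper function is my own invention for illustration:

```cpp
#include <cstdio>

// scRGB stores linear-light values with Rec. 709 / sRGB primaries.
// 1.0f is defined as the 80-nit sRGB reference white; HDR highlights
// land above 1.0f, which is why the format needs floating-point.
constexpr float kScRGBWhiteNits = 80.0f; // fixed by the scRGB definition

// Hypothetical helper: luminance in nits -> scRGB channel value.
float NitsToScRGB (float nits)
{
    return nits / kScRGBWhiteNits;
}

int main ()
{
    std::printf ("200 nit paper white -> scRGB %.2f\n", NitsToScRGB ( 200.0f)); //  2.50
    std::printf ("1000 nit highlight  -> scRGB %.2f\n", NitsToScRGB (1000.0f)); // 12.50
}
```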

HDR10 PQ (10-bit)
uses half as much memory as scRGB
… can’t think of anything else good about it :stuck_out_tongue:
The PQ in this mode refers to Perceptual Quantization, a form of gamma (a transfer function) used in HDR.
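
For the curious, PQ is standardized as SMPTE ST 2084. A minimal sketch of the encode side (linear nits in, a 0-to-1 signal out), using the constants from the standard:

```cpp
#include <cmath>
#include <cstdio>

// SMPTE ST 2084 (PQ) constants; luminance is normalized to 10,000 nits.
constexpr double m1 = 2610.0 / 16384.0;         // 0.1593017578125
constexpr double m2 = 2523.0 /  4096.0 * 128.0; // 78.84375
constexpr double c1 = 3424.0 /  4096.0;         // 0.8359375
constexpr double c2 = 2413.0 /  4096.0 *  32.0; // 18.8515625
constexpr double c3 = 2392.0 /  4096.0 *  32.0; // 18.6875

// Encode linear luminance (nits) into a PQ signal value in [0, 1].
double PQEncode (double nits)
{
    double y  = nits / 10000.0;
    double ym = std::pow (y, m1);
    return std::pow ((c1 + c2 * ym) / (1.0 + c3 * ym), m2);
}

int main ()
{
    std::printf ("PQ(100 nits)  = %.3f\n", PQEncode ( 100.0)); // ~0.508
    std::printf ("PQ(1000 nits) = %.3f\n", PQEncode (1000.0)); // ~0.752
}
```

An HDR10 signal stores round (PQEncode (nits) * 1023) in each 10-bit channel; 16-bit floating-point scRGB keeps far more precision until that final quantization, which is where the banding advantage comes from.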

Both of these formats eventually have PQ applied to them.

“HDR10 PQ” pre-encodes the image using PQ gamma
“HDR10 scRGB” renders the image without gamma and the driver applies PQ
In the end, the 16-bit scRGB HDR mode should be your go-to for higher quality. It might incur a couple percent more GPU load than the inferior 10-bit mode.

Technically, scRGB gets converted to HDR10 before your display receives a signal, but the results are better when that conversion happens at the very end, from 16-bit color down to 10-bit, rather than rendering in 10-bit throughout.
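
On Windows, these two menu options usually correspond to different DXGI backbuffer formats and color spaces. A minimal sketch of that difference; the helper function is hypothetical, while SetColorSpace1 and the enum values are real DXGI API:

```cpp
#include <dxgi1_4.h>

// Hypothetical helper: tell DXGI how to interpret the backbuffer.
// Assumes the swap chain was already created with the matching format:
//   scRGB -> DXGI_FORMAT_R16G16B16A16_FLOAT
//   HDR10 -> DXGI_FORMAT_R10G10B10A2_UNORM
HRESULT SetHDRColorSpace (IDXGISwapChain3 *pSwapChain, bool use_scRGB)
{
  DXGI_COLOR_SPACE_TYPE colorSpace = use_scRGB
    // "HDR10 scRGB": linear (no gamma), Rec. 709 primaries;
    //   the OS / driver applies PQ on the way to the display.
    ? DXGI_COLOR_SPACE_RGB_FULL_G10_NONE_P709
    // "HDR10 PQ": the game pre-encodes with the ST 2084 curve,
    //   Rec. 2020 primaries.
    : DXGI_COLOR_SPACE_RGB_FULL_G2084_NONE_P2020;

  return pSwapChain->SetColorSpace1 (colorSpace);
}
```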

Neither of these is as good as Dolby Vision. There's HDR10, HDR10+, and Dolby Vision. Even Hisense and TCL have given in and paid the licensing fee for Dolby Vision.

u2jedi: Neither of these is as good as Dolby Vision. There's HDR10, HDR10+, and Dolby Vision.
I think Mass Effect: Andromeda is the only title in my library that at least offers Dolby Vision as a choice.