I don't know a whole lot about monitors and wasn't able to find an answer on Google; maybe people here have an idea.

I have two Omen 25i monitors; I got them both at big discounts and they're solid IPS monitors. However, I noticed that when HDR is enabled they have different peak brightness. It doesn't appear to be a huge deal, since HDR on these cheap monitors is mostly a gimmick anyway.

Monitor 1 - 500 nits
Monitor 2 - 486 nits

But I am wondering how this could be the case. Is it normal for two of the same monitor to have this kind of difference?
botan9386: I don't know a whole lot about monitors and wasn't able to find an answer on Google; maybe people here have an idea.

I have two Omen 25i monitors; I got them both at big discounts and they're solid IPS monitors. However, I noticed that when HDR is enabled they have different peak brightness. It doesn't appear to be a huge deal, since HDR on these cheap monitors is mostly a gimmick anyway.

Monitor 1 - 500 nits
Monitor 2 - 486 nits

But I am wondering how this could be the case. Is it normal for two of the same monitor to have this kind of difference?
I think it's just "manufacturing variance", which is to say that nothing can be physically manufactured to be 100% identical. There's only at most a 3% difference between those numbers. I think it's also why specs for monitors usually specify max brightness as "typical" values.
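For what it's worth, the gap between those two numbers is easy to check (a quick sketch, using the nits figures from the original post):

```python
# Quick check of the measured peak-brightness gap between the two units.
peak_1 = 500  # nits, monitor 1
peak_2 = 486  # nits, monitor 2

diff_pct = (peak_1 - peak_2) / peak_1 * 100
print(f"{diff_pct:.1f}% difference")  # prints: 2.8% difference
```

So the two panels are within about 3% of each other, which is well inside what "typical" spec values usually allow for.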
twistedpony: I think it's just "manufacturing variance", which is to say that nothing can be physically manufactured to be 100% identical. There's only at most a 3% difference between those numbers. I think it's also why specs for monitors usually specify max brightness as "typical" values.
This is what I suspect; I just found it odd that there'd be a manufacturing difference in something like brightness.
twistedpony: I think it's just "manufacturing variance", which is to say that nothing can be physically manufactured to be 100% identical. There's only at most a 3% difference between those numbers. I think it's also why specs for monitors usually specify max brightness as "typical" values.
botan9386: This is what I suspect; I just found it odd that there'd be a manufacturing difference in something like brightness.
I'm going to hazard a guess here that the max brightness of an LED monitor is proportional to the current each individual LED can sustain. That in turn depends on each LED's resistance, and no two LEDs have exactly the same chemical composition, hence not exactly the same resistance.
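That guess can be turned into a toy model. This is purely illustrative (the nominal value and the 2% tolerance are assumptions, not Omen 25i specs): treat panel brightness as scaling with backlight current, and give each unit a slightly different current factor due to manufacturing tolerances.

```python
import random

random.seed(0)  # fixed seed so the sketch is repeatable

# Toy model (assumed numbers, not real specs): brightness scales with
# backlight LED current, and each unit's effective current varies a
# little around nominal because of manufacturing tolerances.
NOMINAL_NITS = 500
TOLERANCE = 0.02  # assume a 2% (1-sigma) unit-to-unit spread


def unit_peak_brightness():
    # Draw one unit's current factor and scale the nominal brightness by it.
    current_factor = random.gauss(1.0, TOLERANCE)
    return NOMINAL_NITS * current_factor


samples = [unit_peak_brightness() for _ in range(10)]
print([round(s) for s in samples])
```

Run that and you get ten "units" whose peaks cluster around 500 nits but rarely match exactly, which is essentially the 500 vs 486 situation.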
Nefarious and nebulous things like monitor calibration. This is why print shops and artists often have to invest in expensive devices, special test cards, and exact lighting setups.
Yeah, I suspect the issue could be mostly negated by calibration. Realistically, no piece of hardware is going to be 100% identical. Especially for something like monitors. Whether the variance is 0.3% or 3% is all luck of the draw, also known as hardware lottery. Mostly talked about when it comes to GPUs, but applies to anything really.
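The simplest version of that calibration, for a brightness gap at least, is just matching both monitors to the dimmer unit's peak. A hypothetical sketch using the numbers from this thread:

```python
# Hypothetical: match a multi-monitor setup to the dimmest panel's peak,
# then drive each monitor at a proportional fraction of its own maximum.
peaks = {"monitor_1": 500, "monitor_2": 486}  # measured peak nits

target = min(peaks.values())  # the dimmer panel sets the ceiling
scale = {name: target / peak for name, peak in peaks.items()}
print(scale)  # monitor_1 capped at ~0.972 of max, monitor_2 left at 1.0
```

Real calibration tools do far more than this (gamma, white point, per-channel curves), but the brightness part really is just a scale factor.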
dnovraD: Nefarious and nebulous things like monitor calibration. This is why print shops and artists often have to invest in expensive devices, special test cards, and exact lighting setups.
idbeholdME: Yeah, I suspect the issue could be mostly negated by calibration. Realistically, no piece of hardware is going to be 100% identical. Especially for something like monitors. Whether the variance is 0.3% or 3% is all luck of the draw, also known as hardware lottery. Mostly talked about when it comes to GPUs, but applies to anything really.
I also just caught that my newer monitor is warmer in tone. I have adjusted the RGB levels slightly to make it cooler, but if I were nit-picking I can still see the difference.

I can't imagine how annoying it would be for artists or photographers to get consistency across different set-ups.
botan9386: I also just caught that my newer monitor is warmer in tone. I have adjusted the RGB levels slightly to make it cooler, but if I were nit-picking I can still see the difference.

I can't imagine how annoying it would be for artists or photographers to get consistency across different set-ups.
Oh, some of them go to great lengths, requiring entirely bespoke devices to be plugged in to do it: a spectrophotometer or a colorimeter. But at the end of the day, they're typically working with a very specific system setup.