I've noticed many high refresh rate monitors are 144Hz, but how do these handle lower frame rates? Ignoring FreeSync and G-Sync, that is.

With vsync on, the framerate would be capped to 144, and if the PC can't produce 144 fps it would drop to 72, right? Then 48 (144/3), then 36 (144/4)?

But what of games that have a cap, like 60 fps, or 30 fps? Or even older games where you have to cap at 20 or 30 fps else the physics goes all weird and the game becomes unplayable? These aren't divisors of 144, so would the monitor then display certain frames for longer than others, potentially causing an almost stuttering feeling every now and then? If so, why is 144Hz used instead of, say, 120Hz, which these lower framerates would divide well? And would the stutter even be noticeable?

Next question is: can 144Hz monitors be set to use any refresh rate up to 144Hz? Is this done via the display driver control panel, a setting within the monitor's own menus, or both? And do they only allow set values (e.g. 60, 120, 144)?

If I set a framerate limit for a given game in the display drivers using Nvidia Inspector or MSI Afterburner, or in the in-game settings if the game allows it, but the display is still set to 144Hz in Windows, would the monitor switch refresh rate if the game is a fullscreen application, or would it operate as I described above, with the potential stutter of some longer-duration frames?
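A quick way to put numbers on the stutter being asked about: with vsync, a finished frame is held until the next scanout boundary, so every frame is shown for a whole number of scanout periods. The following is a minimal C sketch of that arithmetic only - the rates are the ones mentioned in the post, and no real hardware is queried.

#include <stdio.h>

/* With vsync, frame i (ready at time i/fps) first appears on scanout number
 * ceil(i * scanout_hz / fps); the gap between successive first appearances
 * is how many scanout periods each frame stays on screen. */
static void cadence(int scanout_hz, int fps, int frames)
{
    long prev = 0;
    printf("%3d fps on a %d Hz scanout:", fps, scanout_hz);
    for (long i = 1; i <= frames; ++i) {
        long shown = (i * scanout_hz + fps - 1) / fps;  /* integer ceiling */
        printf(" %ld", shown - prev);                   /* scanouts this frame lasts */
        prev = shown;
    }
    printf(" scanouts per frame\n");
}

int main(void)
{
    cadence(144, 72, 8);  /* even divisor of 144: steady 2-2-2-2            */
    cadence(144, 60, 8);  /* not a divisor: 3-2-3-2-2... -> uneven, judders */
    cadence(144, 30, 8);  /* not a divisor either: 5-5-5-5-4 repeating      */
    cadence(120, 60, 8);  /* divides 120 evenly: steady 2-2-2-2             */
    cadence(120, 30, 8);  /* divides 120 evenly: steady 4-4-4-4             */
    return 0;
}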
This question / problem has been solved by IFW
What refresh rates a monitor can handle at a certain resolution, and how it handles non-integer divisors of its native refresh rate, differ from model to model. What you can set in various applications is up to each application and how it handles the information it gets when querying the monitor.

In short - I've no idea.
Post edited December 25, 2018 by user deleted
Most monitors support VESA standard resolutions and refresh rates, as a bare minimum.
Additionally, it's common to support standard HD resolutions and refresh (frame/field) rates, or at least part of those. It's fairly uncommon to support SD resolutions as well, but more professional monitors still support those too.

The maximum refresh rate quoted for a monitor is just that - it's not mandatory for any application to use that.

On more recent graphics card drivers (even including Intel ones) you can select the refresh rate alongside the resolution.
This data is communicated by the monitor, e.g. a decent monitor may claim to support say 24, 25, 30, 50, 60Hz for a 1080p resolution. The driver enumerates all of these as selectable refresh rates for that specific resolution, and consequently Windows lists all of these as available choices when the user queries the available resolutions.
Once the user selects a resolution and refresh rate, the monitor should be able to display the scanout correctly - as long as the signal matches the parameters the monitor reported to the PC.
When the display driver selects scanout parameters other than what the monitor claims to support (e.g. user-defined custom timings, refresh rate, porches etc.), it's up to the display electronics to decide what to do with the out-of-spec signal - usually the monitor displays a message instead of the picture.

https://en.wikipedia.org/wiki/Extended_Display_Identification_Data
https://en.wikipedia.org/wiki/High-definition_television
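For what it's worth, on Windows you can list the exact resolution/refresh-rate combinations the driver has enumerated from the monitor's EDID without opening any control panel. A minimal Win32 sketch using the standard EnumDisplaySettings call (primary monitor only, nothing monitor-specific assumed):

/* Print every display mode the driver exposes for the primary monitor,
 * i.e. the resolution + refresh rate combinations enumerated from EDID.
 * Windows only; compile with a Win32 toolchain and link user32.        */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    DEVMODE dm;
    ZeroMemory(&dm, sizeof(dm));
    dm.dmSize = sizeof(dm);

    for (DWORD i = 0; EnumDisplaySettings(NULL, i, &dm); ++i) {
        printf("%4lu x %-4lu @ %3lu Hz, %lu bpp\n",
               (unsigned long)dm.dmPelsWidth, (unsigned long)dm.dmPelsHeight,
               (unsigned long)dm.dmDisplayFrequency, (unsigned long)dm.dmBitsPerPel);
    }
    return 0;
}

Switching to one of the listed modes can likewise be requested through ChangeDisplaySettingsEx (the CDS_TEST flag tests a mode without applying it), which is roughly what the refresh-rate drop-down in Windows or the driver control panel does behind the scenes.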

You should check the supported resolutions/refresh rates in the data sheet of the monitor. Cheaper monitors cut back on electronics, and thus offer more restrictive support.
If you can't find these available either in the user manual or the data sheet of the monitor, and customer service can't/won't help you with this information, just stay away from the monitor, unless you can get the EDID data, e.g. by trying the monitor yourself in the shop.

In other words, a decent 144Hz monitor will be just as fine displaying a 60Hz scanout as any other refresh rate (supported by the electronics) - the selected refresh rate does not have to be an integer divisor of the highest refresh rate supported.
Post edited October 10, 2016 by IFW
However... if you select a scanout refresh rate, that will be the refresh rate displayed on the monitor and the rate your PC has to keep up with - and that's where stuttering occurs, if the content update rate is not an integer divisor of this rate.
e.g. you select a resolution with a 144Hz refresh rate. Ideally your PC should be able to update the scanout at 144Hz, but should that fail, the only way to avoid noticeable jitter in the screen update is for your game to fall back to an integer divisor such as 72, 48 or 36Hz. At any other frequency, your mind will notice the inconsistency in the screen update. Note that the scanout itself remains 144Hz all the time; it's just the content that changes every frame, every second frame and so on.
Similarly, if you select a resolution with a 100Hz refresh rate, your program should try to achieve screen updates at 100, 50 or 25Hz to avoid noticeable update artefacts. Again, the scanout itself remains 100Hz all the time; it's just the content that gets updated at different intervals.
The monitor only cares about the scanout rate, and not the content update rate.
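To put figures on the 144-vs-120 question from the opening post: the judder-free content rates are just the integer divisors of the scanout rate, and the two maximums give quite different sets. A small sketch of only that arithmetic (divisors below 20 fps skipped for brevity):

#include <stdio.h>

/* Content update rates that fit a whole number of scanouts per frame,
 * i.e. the integer divisors of the scanout rate. */
static void even_divisors(int scanout_hz)
{
    printf("%d Hz scanout divides evenly by:", scanout_hz);
    for (int fps = scanout_hz; fps >= 20; --fps)
        if (scanout_hz % fps == 0)
            printf(" %d", fps);
    printf(" fps\n");
}

int main(void)
{
    even_divisors(144);  /* 144 72 48 36 24 - no 60 or 30                  */
    even_divisors(120);  /* 120 60 40 30 24 20 - fits common 60/30/20 caps */
    return 0;
}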

This is what changes with a FreeSync/G-Sync monitor: the scanout rate actually matches the content update rate, as long as that rate stays within the minimum and maximum refresh rates the monitor electronics support.
Post edited October 10, 2016 by IFW
IFW: However... if you select a scanout refresh rate, that will be the refresh rate displayed on the monitor and the rate your PC has to keep up with [...]
This is pretty much what I was expecting, I just find it odd that the standard seems to be 144Hz rather than 120Hz, considering that there are many games capped at 60 and 30, and even older ones designed for 20.

Which causes screen tearing? The framerate going above the monitor refresh rate or below? Or do both cause tearing in different ways? I've seen some people say to drop vsync and cap at a frame or two below the refresh rate (143 on 144Hz or 59 on 60Hz), and yet I've also seen instances where people say to cap just above the refresh rate (Overwatch does this by default, capping at 70 FPS on my current 60Hz monitor). Is it a case of capping below to prevent tearing and above to reduce input lag or am I missing something?

My main concern is that, as the PC gets older, a high refresh rate monitor will more quickly run into times when the framerate can't reach the monitor's refresh rate, so I'm trying to understand what to expect and how to 'fix' this by reducing the refresh rate on a game-by-game basis, as well as how to handle those odd games with silly caps and the old games that need them to work properly.
(strictly speaking of content update here, not scanout refresh rate)
Screen tearing is caused by updating the display buffer while the scanout is in progress - one part of the display buffer contains the old frame, the other part the new frame. This is a bit simplistic - in reality it can get more complex due to how the display actually gets updated - but it should give you a good basic understanding.
Double or triple buffering works by using one display buffer for scanout, while the other buffer is the one that gets updated. Effectively, you eliminate tearing caused by buffer updates mixing with screen updates.
However, you can still get tearing, because the switch between the buffers must happen at the moment the display starts a new refresh.
That's why you need v-sync. Without v-sync you "delegate" the tearing caused by buffer update/scanout to the display itself - you switch the display content at the wrong time, while the previous frame is still being displayed.
That's why you need both at least double buffering (or more) and v-sync in order to completely avoid tearing - again, this is only true on traditional displays without FreeSync/G-Sync.
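To make the "switch at the wrong time" point concrete: if the buffers are exchanged partway through a refresh, every line already sent out comes from the old frame and the remainder from the new one, and that boundary is the visible tear line. A back-of-the-envelope sketch - 144Hz and 1080 lines are just example numbers, not anything a real display reported:

#include <stdio.h>

int main(void)
{
    const double scanout_hz = 144.0;
    const int    lines      = 1080;
    const double frame_ms   = 1000.0 / scanout_hz;  /* ~6.94 ms per refresh */

    double swap_ms   = 3.0;  /* suppose the game swaps buffers 3 ms into the refresh */
    int    tear_line = (int)(swap_ms / frame_ms * lines);

    printf("Swap %.1f ms into a %.2f ms scanout -> tear around line %d of %d\n",
           swap_ms, frame_ms, tear_line, lines);
    printf("Swap during the vertical blank (v-sync) -> no tear line at all\n");
    return 0;
}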

A content update rate higher than the scanout rate will cause skipped frames or, if v-sync is missing, partially updated frames that change during the scanout, causing various visual artefacts.
A lower update rate will just cause frames to be repeated across scanouts, visible as inconsistency/stutter to the user.

For old games you should simply select a resolution with a scanout refresh rate that is an integer multiple of the expected frame rate.
e.g. if you know your game works correctly at, say, 30Hz, select a resolution and scanout refresh rate that is a multiple of 30, i.e. 60, 90 or 120Hz on your 144Hz monitor, and it will be perfect (assuming correct double buffering and v-sync).
Your 144Hz monitor is likely to work with such resolution/frame rate combinations, but - again! - if this compatibility matters to you, check the information provided by the monitor.
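Checking that advice programmatically is just a divisibility test against whatever modes the monitor actually reports. A sketch with a made-up list of supported rates - substitute the real list from the monitor's EDID or data sheet:

#include <stdio.h>

int main(void)
{
    const int supported[] = { 60, 100, 120, 144 };  /* example modes only */
    const int n = (int)(sizeof supported / sizeof supported[0]);
    const int game_fps = 30;                        /* e.g. an old game's fixed cap */

    /* Any supported scanout rate that is an integer multiple of the game's
     * frame rate gives an even cadence (no judder). */
    for (int i = 0; i < n; ++i)
        if (supported[i] % game_fps == 0)
            printf("%d Hz suits a %d fps game (%d scanouts per frame)\n",
                   supported[i], game_fps, supported[i] / game_fps);
    return 0;
}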
Post edited October 10, 2016 by IFW
I have to admit I am quite puzzled by refresh rates when it comes to non-CRT monitors (and TVs). I mean... weren't refresh rates specifically something relevant to CRTs, since the picture had to be constantly refreshed? Weren't these newer LED screens and such supposed to be completely "flicker-free", i.e. they are not constantly redrawn, and only when something changes on the screen do the relevant pixels get updated, etc.?

Yeah I probably misunderstand something here, maybe the "refresh rate" means something different with LED monitors and such, ie. merely how often at best it can change the pixels on the screen, not that it is really constantly refreshed that often?

EDIT: Maybe the long replies just above mine explain all this, I guess I need to read them through.
Post edited October 10, 2016 by timppu
You are absolutely correct: the way our modern displays still receive and display data is an artefact of the CRT era.
I.e. vertical blank, then scanlines from top to bottom, each drawn left to right with a horizontal blank in between... then vertical blank again.
That's how CRT displays a picture, and sadly the same method was chosen for our LCDs, LEDs, and OLEDs as well for compatibility/convenience reasons :)
V-sync for a modern display simply means that you change the scanout content at the moment the display electronics starts changing the pixels according to the new buffer content - while still using the same update method that buffer-less CRTs used.

In reality, a modern display could switch an individual (sub)pixel at any given time (within its response time), and the pixels are driven from a frame buffer inside the monitor, which is needed anyway for the scaling/filtering of the various supported resolutions. This is not really possible over a serial interface though, so you just get the CRT-style update instead...
So practically, you wouldn't need all the synchronization magic to get perfect screen updates, but due to the way the data gets updated we actually simulate a CRT refresh method. It is also cheaper this way :)

FreeSync/G-Sync work around these limitations, but ideally we'd need displays where you could simply update each pixel individually, instead of updating all of them at set intervals. It would be too expensive to change all the devices, so it will probably never happen.
timppu: I have to admit I am quite puzzled by refresh rates when it comes to non-CRT monitors (and TVs). [...]
You are correct, but how they implemented it... is a different matter. See my post above.
Post edited October 10, 2016 by IFW
timppu: I have to admit I am quite puzzled by refresh rates when it comes to non-CRT monitors (and TVs). [...]
Actually, what some people refer to as flickering is more like what happened when using interlaced mode on a CRT - having two different pictures on the screen.

And a lot of the problems nowadays come from the fact that the interlacing problem is, in a way, back.

And a non-CRT refreshes (roughly speaking) the same way as a CRT. Don't forget that, whatever happens, all the data is still transmitted sequentially, so the monitor updates in a similar way. The reaction time of the cells has to be taken into account as well - and here the old CRTs were actually better and quicker. Hence you should always look for a fast response time (less than 4ms, which translates to 250Hz).

And one problem with the above is that even if the screen can run at 144Hz, if the reaction time is too high the pixels can't keep up: a reaction time of 8ms corresponds to only 125Hz (1 / 0.008s).