I hate screen tearing, so I usually leave vsync on or AA - whichever gives the best result. I don't really notice input lag, so that's not an issue for me, but once that invisible samurai begins to slice and dice up the screen, into the Settings I go and turn on either vsync or AA.

I do have a 144 Hz monitor, but sadly it's only FreeSync and not G-Sync, so there is that.
avatar
jepsen1977: I hate screen tearing, so I usually leave vsync on or AA - whichever gives the best result. I don't really notice input lag, so that's not an issue for me, but once that invisible samurai begins to slice and dice up the screen, into the Settings I go and turn on either vsync or AA.

I do have a 144 Hz monitor, but sadly it's only FreeSync and not G-Sync, so there is that.
Isn't AA just for making zigzags into lines?
avatar
timppu: - How exactly does vsync cause input lag? To me those two things sound unrelated, so I am trying to understand how they are connected.
avatar
Sarafan: This is what I've found: "V-Sync adds input lag because it delays frames from being shown on screen, making the time between when you do something and when it appears on screen longer."
That is basically how it works, yes. It is most noticeable in fast-paced games, of course, but it's also visible in games with continuous input - Grim Dawn, for example, where you hold down Mouse 1 to move. With V-Sync, there is a distinct delay in the character following your cursor (especially when turning the other way). Without it, the character is always on point with the cursor and reactions are much snappier. It can also cause mouselook issues, where it becomes much less smooth, and in FPS games like UT it can feel like you are playing with a 40 ping even in single player.
avatar
timppu: - How exactly does vsync cause input lag? To me those two things sound unrelated, so I am trying to understand how they are connected.
avatar
Sarafan: This is what I've found: "V-Sync adds input lag because it delays frames from being shown on screen, making the time between when you do something and when it appears on screen longer."
So does that mean the lag is, in the very worst case, 1/60 of a second (or 1/30, or 1/15, or in whichever framerate the game happens to be running at the time)?

If that is the "vsync lag", then I couldn't care less about it. Then again in competitive action gaming where you are trying to win the $1000000 grand prize and every millisecond counts to your chance of success, I can understand it.
Post edited May 20, 2021 by timppu
avatar
Sarafan: This is what I've found: "V-Sync adds input lag because it delays frames from being shown on screen, making the time between when you do something and when it appears on screen longer."
avatar
timppu: So does that mean the lag is, in the very worst case, 1/60 of a second (or 1/30, or 1/15, or in whichever framerate the game happens to be running at the time)?

If that is the "vsync lag", then I couldn't care less about it. Then again in competitive action gaming where you are trying to win the $1000000 grand prize and every millisecond counts to your chance of success, I can understand it.
It basically depends on how much V-Sync throttles your GPU. What V-Sync does is introduce a buffer (double buffering) that stores a frame.

Say you have a 60 Hz display and a GPU that would output 200 FPS if unrestrained. Input then goes through about every 5 ms, even though you are seeing only 60 FPS, but you might see tearing due to the big mismatch. Now you enable V-Sync, which does two things:

First, it allows a frame to be shown on screen only once per refresh cycle by pulling a frame from the buffer, meaning every 16.67 ms in the 60 Hz case.

Second, it stores the follow-up frame in the buffer (because the GPU is faster than your monitor) and holds on to it until the previous cycle is complete before showing it. This lets the GPU idle (the buffer is full) until a slot frees up, which is what reduces the load on it. However, the frame with your input that sits in the buffer will already be old by the time it is shown on screen, because you forced a delay on it and have to wait until the new refresh cycle starts. Most people say input lag, but in reality it is more of an output delay. You are already looking at at least 33 ms because of those 2 frames, which is roughly 6x more than without it. You also have to account for the time it takes to access a frame in the buffer and send it through, as opposed to the frame going straight to the display.

Add other delays like processing times, input delay of peripherals, the speed of the game engine, etc., and it can stack up to noticeable levels pretty fast. How quickly one notices depends on the person. It might not seem like much, but as I said, it becomes very noticeable in fast-paced games.
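To put rough numbers on the example above (just a back-of-the-envelope sketch, not a measurement):

#include <stdio.h>

int main(void) {
    /* Example from above: 60 Hz display, GPU that could do 200 FPS uncapped. */
    const double refresh_ms  = 1000.0 / 60.0;   /* one refresh cycle: ~16.67 ms     */
    const double uncapped_ms = 1000.0 / 200.0;  /* one frame with V-Sync off: ~5 ms */

    /* Double-buffered V-Sync: the frame carrying your input waits in the back
       buffer for one cycle and is only scanned out on the next one.             */
    const double vsync_delay_ms = 2.0 * refresh_ms;

    printf("uncapped: ~%.1f ms, double-buffered V-Sync: ~%.1f ms (~%.1fx)\n",
           uncapped_ms, vsync_delay_ms, vsync_delay_ms / uncapped_ms);
    return 0;
}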
Post edited May 20, 2021 by idbeholdME
avatar
timppu: I found it odd as it is an old game that the laptop should handle easily, and it turned out that because vsync was off, the game was calculating something like 2000 frames per second (checked with the game's internal FPS counter), even though of course it could display only 60 frames per second on the laptop screen.
A lot of game designs, especially older ones, use vsync for their timing. Everyone with experience coding against a double buffer knows the "draw loop design," in which you have a single thread that draws and then enters a wait state until vsync clears again.
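In code, that design is roughly the following (the helper names are hypothetical stand-ins, not any particular API):

/* Old-school "draw loop": vsync doubles as the game's clock. Remove the
   vblank wait (vsync off) and the loop spins unthrottled - hence a game
   "calculating" 2000 frames per second on a 60 Hz screen.                */
for (;;) {
    read_input();
    step_game_logic();     /* one fixed step per displayed frame          */
    draw_frame();
    wait_for_vblank();     /* hypothetical: blocks until the next refresh */
}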
I keep hearing about the "vsync lag", and I can't really say I've noticed it, but then maybe I haven't tested it closely enough. I don't quite understand it either, like:
Since I've written my own VBE driver (using the BIOS for mode switching), I'll try to answer these for you.
- How exactly does vsync cause input lag? To me those two things sound unrelated, so I am trying to understand how they are connected.
- Does this input lag occur an all games where vsync can be enabled, and if not, why not?
No. If a game has input threaded separately, it will not. That doesn't mean someone with a high-speed camera couldn't measure it, though, because the result of the input may only appear in the next frame if it arrives while the drawing thread is busy.
- How big is the input lag anyway, or does it depend on a game (severe on some games, unnoticeable on others...)?
Depends on the framerate. The input delay is obviously way, way lower than the human reaction time, but in theory for some games that makes a difference. In practice, not so much unless everyone's using aim bots.
Like when playing that Quake game in vsynced 60 fps instead of "calculated" 2000 fps with vsync off, I don't recall thinking "Wow, the input is so laggy now, near unplayable" when playing with vsync on. As far as I could tell, there was no difference, the game ran smooth and controls were fine regardless of vsync off or on. The only real difference was that the laptop ran much cooler and quieter with vsync on.
It's like people and their physical keyboards. They understand that a wire is faster than over-the-air, but the frequencies at which such things run mean that, unless there's a large amount of interference (which would also cause lost input), the amount of latency added is most likely less than a single frame. But, hey, it's like the cereal industry hating on eggs and pushing propaganda about it. Of course, most people wouldn't likely notice if not told to look for it. It's definitely a placebo effect. You see a similar effect with people complaining that a game runs at 60fps vs 120fps. Most people won't notice the difference between 30 and 60 without being told, and while there is a difference between 60 and 120, odds are pretty much everyone would fail a genuine blind test.

avatar
timppu: And yeah I hope my next gaming laptop has either gsync or freesync or whatever is relevant so that I don't have to care about vsync anymore and get the benefits of both vsync on and off (like that the GPU/CPU is not needlessly running at 100% all the time, getting constant 60fps with no input lag etc.).

However, the gsync/freesync options seem to be limited, maybe because those two are competing "standards". Then again I might usually use an external monitor (or TV) for gaming even with a laptop, so maybe it is enough that the external display has the relevant support.

I don't recall if my 65" LG OLED TV has gsync or freesync support, I seem to recall it should? At least the PC monitor I bought over a year ago on sale does not, it is just a common 60Hz computer monitor.
The whole "freesync" idea is new to me, but I'm gonna go out on a limb here from my quick googling and suggest that it's not actually new tech. Some years ago I got a Chinese knockoff of a Maple board that came with an LCD screen, pretty cheap and low resolution. It had a grounded refresh pin, and a mode to use it. If that refresh pin hadn't been grounded, I could've used it to set the refresh rate to whatever the hell I wanted, entirely eliminating the double buffer.

The irony is, with this setup, you only reduce the "input lag." In reality, you're still not writing during the screen update phase, so your input lag becomes the time it takes for the sync to finish. IIRC, the reason they also limit screens to certain rates like this is to prevent damage. I foresee some issues in the future with damaged screens or screens not lasting long.
avatar
kohlrak: You see a similar effect with people complaining that a game runs at 60fps vs 120fps. Most people won't notice the difference between 30 and 60 without being told, and while there is a difference between 60 and 120, odds are pretty much everyone would fail a genuine blind test.
I know this is genuinely childish, but "blind test" made me laugh in the context of trying to see something.

I'll go now.


On topic, I do agree that most people would struggle to see the difference, but I think that a lot comes down to what you're used to or acclimatised to. I genuinely find 30fps to be a bit painful now, but that's likely because I'm used to 55-144fps. Also, the g-sync stops working when you drop that low, so you get those microstutters back (which I find trigger headaches).

However, back in the mid 1990s, I could quite happily play things at 20-25fps and not really see it as a problem. I still struggle to see a difference once you go above, say, 70fps (although interestingly, Quake II feels a lot more responsive with the OpenGL renderer (900+fps) than the RTX renderer (55fps)).
avatar
kohlrak: You see a similar effect with people complaining that a game runs at 60fps vs 120fps. Most people won't notice the difference between 30 and 60 without being told, and while there is a difference between 60 and 120, odds are pretty much everyone would fail a genuine blind test.
avatar
pds41: I know this is genuinely childish, but "blind test" made me laugh in the context of trying to see something.

I'll go now.
I knew someone was going to giggle at it when I realized it, but I figured a good joke is what we'll likely need, since some people are going to take this topic way, way more personally than necessary. I mean, effectively, the complaints about input lag and such are equivalent to people playing horseshoes (made of iron) complaining about a gentle breeze.
On topic, I do agree that most people would struggle to see the difference, but I think that a lot comes down to what you're used to or acclimatised to. I genuinely find 30fps to be a bit painful now, but that's likely because I'm used to 55-144fps. Also, the g-sync stops working when you drop that low, so you get those microstutters back (which I find trigger headaches).
I think it would still work at those low framerates, but the issue in particular is that the stutter becomes very noticeable below 30.
However, back in the mid 1990s, I could quite happily play things at 20-25fps and not really see it as a problem. I still struggle to see a difference once you go above, say, 70fps (although interestingly, Quake II feels a lot more responsive with the OpenGL renderer (900+fps) than the RTX renderer (55fps)).
I mean, it's possible to feel a difference above 55, but we're talking about special people at this point. The whole idea was that a full-screen refresh at about 30 was around what a lot of people believed was the refresh rate of the human eyeball (or rather, the imprint rate: remember that motion blur is a real thing for real eyeballs, hence "air writing"). IIRC, science has since found that it can be much higher depending on focal points (suggesting this could be a brain thing more than an eye thing). That said, you can also use intentional blurring to lower framerates to a degree that almost no one perceives it (DVDs use this trick). However, not everyone agrees, and I did see your number pop up.
Never had, never will, because capping the frame rate to the monitor's refresh rate minus 1 or 2 frames is a much better solution to eliminate screen tearing, and RTSS (RivaTuner Statistics Server) is the best tool for doing that.

I'm actually a bit surprised by how many keep Vsync on. I'm not even using FreeSync, though the monitor supports it, but I never experimented much with it. There was something that I didn't like about it when I first tried it. Some sort of discomfort to my eyes maybe, so I just disabled it and never bothered with it since.

I'm not into competitive gaming and frame rates in the 70-120 range will do just fine for me. Generally, I set Radeon Chill to 75-118, and RTSS to 120, for the rare situations in which Chill doesn't cap the frame rate properly, like cutscenes, level loading or main menus.

Here are a few videos about frame rate capping and input lag:
How To Fix Stutter In Games - Frame Rate, Frame Time & RTSS
NVIDIA's NEW FPS Limiter vs. RTSS & In-Engine Limiters / Input Lag Results
AMD's Chill vs. RTSS & In-Engine Limiters / Input Lag Test
Does Capping Your Frame Rate Really Reduce Input Lag?
avatar
ariaspi: Never had, never will, because capping the frame rate to the monitor's refresh rate minus 1 or 2 frames is a much better solution to eliminate screen tearing, and RTSS (RivaTuner Statistics Server) is the best tool for doing that.
Have you tried taking screenshots? I suspect a lot of tearing at certain points. It'll slowly drift out of alignment and back into alignment with a complete cycle of 1 minute for 60fps.
I'm actually a bit surprised by how many keep Vsync on. I'm not even using FreeSync, though the monitor supports it, but I never experimented much with it. There was something that I didn't like about it when I first tried it. Some sort of discomfort to my eyes maybe, so I just disabled it and never bothered with it since.
What were the rates you were getting with that? Your first link explains why that might be. A variable frame rate in a game that doesn't have separate threading is going to have noticeable pacing issues. Moreover, it more or less requires games to cap themselves. It also sounds like the frame rates for these freesync monitors are very low while in freesync mode. I'm also seeing how some people could get input delay from some of these framerate limiters, which of course comes down to the fact that a game without a limited framerate is not going to have the same per-frame delays in single-threaded designs. The issue comes down to how the game handles wait states as well as input: if you're using a single thread, you'll get that delay with a driver-level limiter or the like, because the game is likely getting frozen by the driver, whereas something like triple buffering (more resources, but it runs a lot smoother) more or less doesn't cap the game's framerate at all (and thus some games might go too fast with triple buffering, since some games rely on the refresh rate for game logic).

An easier way to explain this is to think of a 2D game when you hit the jump key. The game logic can do one of several things:

A) Instantly change an "upward momentum value" and also take the time to immediately add that to your character position.

B) Instantly change an "upward momentum value" and simply rely on the game logic phase to add it to the character position as needed.

C) Instantly mark the input as key-down, and rely on the game logic phase to keep adding until you either let go or the "height limit for jump" has been reached.

D) Check for keyup-keydown.

E) Separate game logic (which will likely implement C or D) and drawing threads.

Situations A and B will likely leave inconsistent jump heights. However, A will have 1 frame of input lag in the worst case, but will most likely average 0. Situation B will likely have 0-2 frames of input lag. Keep in mind, situations A and B have input on separate threads.

C and D are more or less the same, but C allows for threading input like A and B; however, it will still manifest exactly as D, unless you're playing a fighting game locally and there's a timestamp thrown into the calculation that accounts for the threading. In this situation, the frame delay is likely to be the same as situation B, except that there's the added (although statistically improbable) chance that an input can be lost entirely because the key-down and key-up events happened within a single frame, meaning no instance of game logic ever saw the key held.

Situation E is actually probably the ideal, with an input lag of about 2 frames in the worst case, though that worst case is very, very unlikely. To most people it will feel like A.

Now, in reality, coders have trouble with threading, and it's not an easy thing to really master, so more than likely you'll see C and D in the majority of games. In that majority of games, if the game logic and refresh rate are out of sync, the input lag can easily be doubled, with an output frame delay of 1 (which also happens with triple buffering).

The easiest way to identify C and D scenarios is to intentionally limit a game's frame rate and see if the game logic slows down and everything moves in slow motion. However, a lot of games implementing C and D will do some tests first to see what the actual frame rate is and compensate mathematically, preventing a visible slowdown, by applying a multiplier to all calculations. The Mafia game here on GOG notoriously fails to do this properly, and the "racing section" is said to get worse the better your computer is.
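That compensation is the usual "delta time" multiplier; a rough sketch (illustrative names only, no particular engine):

/* Scale all per-tick movement by how long the last frame actually took,
   so the game plays at the same speed at 30 FPS or at 300 FPS.          */
double last = now_seconds();                /* hypothetical clock helper   */
while (running) {
    double t  = now_seconds();
    double dt = t - last;                   /* seconds since the last tick */
    last = t;

    player.x += player.vel_x * dt;          /* the multiplier on the logic */
    player.y += player.vel_y * dt;

    draw_frame();                           /* hypothetical draw call      */
}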

Now, while A, B, and E sound ideal, some issues can arise when the game logic runs at a rate similar to the frame rate and the two desync. Moreover, you can get compounding logic-lag issues from multiple events happening at the same time (which can also potentially cause freezing). As such, you'll almost never see these, even if at first glance they sound like the ideal situation. E has the potential to have the least performance impact, however, and will likely work best on a computer that's well over the minimum requirements, because it's likely to have a lot of wait states cutting down on power consumption.

The fundamental issue is asynchronous events (user input) in connection with synchronous game logic. When things need sync and they go way out of sync, your frame issues can very quickly end up multiplied. IMO, to be realistic, one should expect C and D as the norm and accept the input lag (since it's all still smaller than the human reaction time, which is easily over 10 frames), as they provide the most overall stability. E can be exceptional in the long run, especially if implemented correctly. A and B seem tempting at first, but will likely result in frame inconsistency in the long run.
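For the common case (C), here's roughly what the structure looks like - a sketch with illustrative names and values, not any particular engine:

/* Scenario C: the input handler (possibly on its own thread) only sets a
   flag; the game logic tick reads it. Input therefore takes effect on the
   next tick at the earliest, and a key pressed and released between two
   ticks is lost entirely, as described above.                             */
struct Player { double height, vel_y; };

#define KEY_JUMP          32       /* whatever keycode the engine uses     */
#define JUMP_HEIGHT_LIMIT 3.0
#define JUMP_ACCEL        0.5

static volatile int jump_held = 0;

void on_key_event(int key, int is_down) {      /* called from the input side */
    if (key == KEY_JUMP)
        jump_held = is_down;
}

void game_logic_tick(struct Player *p) {       /* called once per frame/tick */
    if (jump_held && p->height < JUMP_HEIGHT_LIMIT)
        p->vel_y += JUMP_ACCEL;                /* keeps adding while held    */
}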

Notes:

-All this assumes some kind of frame limiting is on (vsync is one such limiter, and my base assumption; the numbers will likely multiply if you use a framerate cap other than vsync while vsync is also enforced at the driver level).

-Situations C and D will likely spiral out of control if you try to force a lack of frame-rate control, while A, B, and E will more or less appear to freeze while the game logic continues (a single black frame stuck on the screen), or show some massive tearing.

-All this is the worst-case scenario, although bugs can make some scenarios worse. I'm more than happy to try to analyze any video someone gives me.

-Certain games have some hidden logic, like intentional frame delays, which can sometimes disappear as a form of a bug. This is not often understood by many players, and I recommend devs abandon this practice.

-Situation A would likely result in a moon-jump cheat with any turbo-enabled controller.
I used to use v-sync all the time until I noticed a few games that had horrendous frame pacing and were a microstuttery mess, even though telemetry reported a smooth 60 fps and stable frametimes.
So I started to look around for different options, as with uncapped framerates many games will melt my poor GPU.
I was SOL because my display wasn't g-sync/freesync compatible, which should always be the first option to choose over using v-sync.

Then I came across this guide.
I knew beforehand RTSS had a scanline sync option but I never knew how to use it until now.
There are some shortcomings though, as the scanline sync you set (e.g. -30) mostly works for only one viewpoint/camera, so for games that switch the camera a lot it still isn't ideal and you'll suffer some tearing, but so far it has worked pretty well for me.

As for input lag, I'm not sure I notice much difference between using v-sync and uncapped framerates, although maybe the games I've played weren't too sensitive about it. I think you'll notice it mostly in fast-paced games like Quake and Unreal Tournament, and I remember v-sync made Re-Volt almost unplayable.
Damn... After turning V-sync off in Quake Champions, now I see input lag in literally every game when it's enabled! Turn it off at your own risk! :D
avatar
kohlrak: Have you tried taking screenshots? I suspect a lot of tearing at certain points. It'll slowly drift out of alignment and back into alignment with a complete cycle of 1 minute for 60fps.
At uncapped frame rate? Not really, I had no reason to. The truth is, I don't remember ever seeing tearing in my games. I always cap the frame rate to have a cooler and quieter GPU, not because of tearing. And I have no use for hundreds of FPS.

I experience some tearing with youtube videos rarely on my laptop's screen, but not on the monitor connected to it. I assume the laptop's panel has a much slower response time, 15ms - 20ms probably.

avatar
kohlrak: What were the rates you were getting with that?
I tested it only with Dying Light and it was above 120 FPS, usually. The camera moved way too quickly for my visual comfort and I didn't like it. But like I said, I didn't bother much with it, probably like half an hour. Maybe it wasn't even related to the Freesync and was just the high frame rate. My understanding is that adaptive sync is really useful when the frame rates drop below 60 FPS.

I might try again sometime with some fast-paced shooters (the Unreal Tournament games would be good, I guess), but nowadays I'm playing mostly open world games and strategies. For me, 90-120 FPS seems to be the sweet spot, so I have no reason to consume more power to run games at 144 FPS or more. And I prefer silence over high frame rates, as I don't game too often with headphones on.
avatar
kohlrak: Have you tried taking screenshots? I suspect a lot of tearing at certain points. It'll slowly drift out of alignment and back into alignment with a complete cycle of 1 minute for 60fps.
avatar
ariaspi: At uncapped frame rate? Not really, I had no reason to. The truth is, I don't remember ever seeing tearing in my games. I always cap the frame rate to have a cooler and quieter GPU, not because of tearing. And I have no use for hundreds of FPS.

I experience some tearing with youtube videos rarely on my laptop's screen, but not on the monitor connected to it. I assume the laptop's panel has a much slower response time, 15ms - 20ms probably.
No, capped. Putting on a cap that is not tied to vsync is likely to result in tearing. Something you can do to reduce this is to use some sort of forced vsync in addition to the manual capping (through drivers), but, depending on the game, this could lower the framerates more than specified depending on the implementation. The reason for the potential tearing is that the timer used can (and likely will) go out of sync with vsync, and thus you'll end up with the majority of frames torn.
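Rough arithmetic for why the tear line wanders when the cap isn't tied to vsync (example numbers only, picking a 59 FPS cap against a 60 Hz refresh):

#include <stdio.h>

int main(void) {
    const double refresh_ms = 1000.0 / 60.0;   /* one scanout: ~16.667 ms      */
    const double capped_ms  = 1000.0 / 59.0;   /* one capped frame: ~16.949 ms */
    const double drift_ms   = capped_ms - refresh_ms;

    /* Each frame finishes a bit later relative to the scanout, so the tear
       line creeps across the screen and wraps around once the accumulated
       drift equals a full refresh period.                                    */
    printf("drift per frame: %.3f ms; tear line wraps after ~%.0f frames\n",
           drift_ms, refresh_ms / drift_ms);
    return 0;
}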
avatar
kohlrak: What were the rates you were getting with that?
I tested it only with Dying Light and it was above 120 FPS, usually. The camera moved way too quickly for my visual comfort and I didn't like it. But like I said, I didn't bother much with it, probably like half an hour. Maybe it wasn't even related to the Freesync and was just the high frame rate. My understanding is that adaptive sync is really useful when the frame rates drop below 60 FPS.

I might try again sometime with some fast-paced shooters (the Unreal Tournament games would be good, I guess), but nowadays I'm playing mostly open world games and strategies. For me, 90-120 FPS seems to be the sweet spot, so I have no reason to consume more power to run games at 144 FPS or more. And I prefer silence over high frame rates, as I don't game too often with headphones on.
I would still suggest vsync (with a normal screen), because that'll cap it at the monitor's refresh rate. For FreeSync, I suggest going to 70 or 75. If you're worried about power and noise, your stated FPS is probably a little high.
avatar
Strijkbout: I used to use v-sync all the time until I noticed a few games that had horrendous frame pacing and were a microstuttery mess, even though telemetry reported a smooth 60 fps and stable frametimes.
So I started to look around for different options, as with uncapped framerates many games will melt my poor GPU.
I was SOL because my display wasn't g-sync/freesync compatible, which should always be the first option to choose over using v-sync.

Then I came across this guide.
I knew beforehand RTSS had a scanline sync option but I never knew how to use it until now.
There are some shortcomings though, as the scanline sync you set (e.g. -30) mostly works for only one viewpoint/camera, so for games that switch the camera a lot it still isn't ideal and you'll suffer some tearing, but so far it has worked pretty well for me.

As for input lag, I'm not sure I notice much difference between using v-sync and uncapped framerates, although maybe the games I've played weren't too sensitive about it. I think you'll notice it mostly in fast-paced games like Quake and Unreal Tournament, and I remember v-sync made Re-Volt almost unplayable.
It sounds to me like your games in particular were either trying to run too fast or you had a bugged vsync. Like I said earlier in the post, there has been a lot of broken vsync in drivers in recent history.
Post edited May 20, 2021 by kohlrak
avatar
Sarafan: Damn... After turning V-sync off in Quake Champions, now I see input lag in literally every game when it's enabled! Turn it off at your own risk! :D
Wish I could do that with most of my games, but they get screen tearing and it is very frustrating.
Like others, I prefer delay over tearing.