kohlrak: And this is why people like me don't believe people like you. People make all sorts of claims giving advice on technology, but they can't even do the bare minimum to verify their claims.
Sarafan: The majority of people posting in this thread claim that the lag is present and that's why they don't use V-sync. If you don't believe it, that's your problem. I'm not planning to build a device that measures lag only to satisfy your doubts. Sorry.
The offer's up for anyone, and I'm sure some of the people making the claims have the parts lying around, or could rig up something similar.
kohlrak: They just defer to someone else who's complaining, and it becomes indistinguishable from mass hysteria. Until someone can verify to me that a video card and driver combination that is working properly actually introduces significant input latency, I'm just going to go with people turning around and saying "I couldn't throw a strike 'cause the wind was blowing." Without some sort of physical evidence that something is there, I'm just going to assume it's all in your heads (we've only been working with that standard for the past 200 years).
Then explain to me why I'm getting better results in online shooters when I turn V-sync off. Is it all in my head again?
At this point, I do honestly believe so. I've been trying to be charitable in this topic, but I'm running out of patience for "facts" with no evidence.
kohlrak: I don't. And I do play them from time to time (I especially like Fallout and I play ranged characters in Elder Scrolls, but I'll even play the twitchy fraggers here and there). I get more input lag from games using a hard disk drive than I do from vsync.
Skyrim is another great example. I have massive mouse cursor lag in this game, but it disappears almost completely when I look around with my character. So only the mouse cursor is affected in a meaningful way.
Let that sink in for a second. vsync has no connection to mouse or keyboard, so why would these manifest differently for you?
Yes, almost all the time.
kohlrak: Let that sink in for a second. vsync has no connection to mouse or keyboard, so why would these manifest differently for you?
They don't. I was talking about looking around with the mouse. In the menu screen there's mouse lag. When I'm looking around with my character with the mouse, the lag is almost non-existent. I'm not saying this isn't strange, but it's not the first time I've encountered such a thing. And it all comes down to V-sync eventually. There are of course people who don't have such massive mouse cursor input lag in the Creation Engine games. It's all connected to hardware and possibly drivers. I haven't checked how the game performs on my current hardware though. The lag was present on my old Gigabyte GeForce GTX 1060.
Yes. I have a G-Sync monitor, so I turn it off in the game and force it on via the NVidia Control Panel.

vsync limits tearing, and I notice tearing more than I would notice any lag introduced from using vsync. So my experience is improved visuals, and no downside. Plus, since a monitor cannot show more frames than its refresh rate, vsync reduces heat and power consumption by not rendering useless frames.
Post edited May 24, 2021 by qwixter
I always use V-sync because I hate screen tearing so much.
kohlrak: Let that sink in for a second. vsync has no connection to mouse or keyboard, so why would these manifest differently for you?
Sarafan: They don't. I was talking about looking around with the mouse. In the menu screen there's mouse lag. When I'm looking around with my character with the mouse, the lag is almost non-existent. I'm not saying this isn't strange, but it's not the first time I've encountered such a thing. And it all comes down to V-sync eventually. There are of course people who don't have such massive mouse cursor input lag in the Creation Engine games. It's all connected to hardware and possibly drivers. I haven't checked how the game performs on my current hardware though. The lag was present on my old Gigabyte GeForce GTX 1060.
If it's drivers and hardware, it's not vsync itself. Though, I think you should think a little harder on this one. If vsync were the issue, you should have the exact opposite problem: cursor should be the only thing that doesn't lag, or they should lag equally. The thing with Skyrim in particular is that I happen to know some things about its engine that create lag in these scenarios and that aren't tied to vsync. I had a really interesting issue once where texture size was creating massive amounts of lag. Checking my I/O, the game was apparently hitting my hard drive for every frame. Skyrim is one of those games that tries to load and unload assets on the fly... from disk... There are lots of other similar issues in Skyrim's engine as well, such as the lack of occlusion culling; a mod that fixed that gave me the FPS I needed to actually play Skyrim on my computer.
Dark_art_: Yes. See the picture attached to my previous post. You can limit the frame rate to any number, including decimals, or use Scanline Sync.
Note that it limits the framerate in any application (DirectX 9+ and OpenGL, don't know about others), so you may want to have profiles for individual games.
OK, thanks, I guess I need to read up on that then. I guess it would be an improvement if it really frame-limited absolutely everything, as I'm currently in a situation where I can't comprehend why e.g. Dungeon Siege doesn't appear to be frame-limited, even though I have enabled vsync both in the GPU drivers and in the "DirectX configuration" utility that comes with the game. Yet I get up to 200fps and my GPU fan runs at full speed to cool the card down, for such an old game.
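For what it's worth, here is a minimal, standalone C++ sketch of the sleep-based frame cap that external limiters apply when a game's own vsync or limiter setting is broken or ignored. The 60 fps target and the loop structure are illustrative assumptions only; tools like RTSS hook the game's present call and use more precise waits, but the principle is the same.

// A standalone sketch of a sleep-based frame cap; the 60 fps target is an
// assumed value, not anything RTSS or the driver actually uses.
#include <chrono>
#include <thread>

int main()
{
    using clock = std::chrono::steady_clock;
    const double cap_fps = 60.0;  // assumed target frame rate
    const auto frame_budget = std::chrono::duration_cast<clock::duration>(
        std::chrono::duration<double>(1.0 / cap_fps));

    for (int frame = 0; frame < 600; ++frame) {
        const auto start = clock::now();

        // ... update and render one frame here ...

        // Sleep away whatever is left of the budget so the GPU isn't asked
        // to draw frames the monitor can never display.
        const auto elapsed = clock::now() - start;
        if (elapsed < frame_budget)
            std::this_thread::sleep_for(frame_budget - elapsed);
    }
    return 0;
}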
Dark_art_: Yes. See the picture attached to my previous post. You can limit the frame rate to any number, including decimals, or use Scanline Sync.
Note that it limits the framerate in any application (DirectX 9+ and OpenGL, don't know about others), so you may want to have profiles for individual games.
timppu: OK, thanks, I guess I need to read up on that then. I guess it would be an improvement if it really frame-limited absolutely everything, as I'm currently in a situation where I can't comprehend why e.g. Dungeon Siege doesn't appear to be frame-limited, even though I have enabled vsync both in the GPU drivers and in the "DirectX configuration" utility that comes with the game. Yet I get up to 200fps and my GPU fan runs at full speed to cool the card down, for such an old game.
It could simply be broken for the in-game stuff, and the GPU drivers even for me limit it to OpenGL. And interestingly, while trying to find out what could be causing it, I think I found the culprit behind everyone's claims of input lag in this Stack Overflow post. One of the replies suggests that DirectX in particular likes to buffer several frames in order to prevent framedrop spikes. How many frames? Good question, but this would present as major input lag specifically in DirectX games. Moreover, there seems to be an industry standard of creating one frame of input lag by handling input after the frame is drawn. The solution to said input lag, then, would be to find out how to limit or disable this multi-frame buffer. This might also explain why drivers might not offer a setting for this buffering. On the flip side, this could mean that triple buffering plus disabling vsync will eliminate the lag for the people complaining (since it might be the case that it'll still vsync the third buffer). I'm going to look more into this and see what can be done. Obviously this buffering is there to stabilize FPS, but it's very bad for first-person shooters as well as for energy and heat efficiency.

Those of you who are not afraid of reverse engineering should find this an interesting place to start: MSDN. It seems the input lag for vsync is intentional. If this is where it's implemented, then it's entirely in the control of the developer.
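As a concrete illustration of where a developer can intervene, here is a minimal Direct3D 11 sketch. It is my own example, not taken from the MSDN page above; LimitRenderAhead is a made-up helper name, and it assumes an ID3D11Device has already been created elsewhere. It shrinks the driver's render-ahead queue, which is exactly the multi-frame buffer being blamed here.

// Sketch: cap how many frames the driver may queue ahead of the GPU.
#include <d3d11.h>
#include <dxgi.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

// Hypothetical helper: call once after creating your ID3D11Device.
HRESULT LimitRenderAhead(ID3D11Device* device)
{
    ComPtr<IDXGIDevice1> dxgiDevice;
    HRESULT hr = device->QueryInterface(IID_PPV_ARGS(&dxgiDevice));
    if (FAILED(hr))
        return hr;

    // The documented default lets the driver queue up to 3 frames ahead of
    // the GPU; dropping it to 1 trades a little smoothness for less delay
    // between input and the frame that finally reaches the screen.
    return dxgiDevice->SetMaximumFrameLatency(1);
}

Whether a given game does anything like this is, as the post says, entirely up to the developer.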
Post edited May 24, 2021 by kohlrak
kohlrak: If vsync were the issue, you should have the exact opposite problem: cursor should be the only thing that doesn't lag, or they should lag equally.
Not necessarily; lag on a cursor can be much easier to see compared to moving the camera around. If the game uses a software cursor, then it must update at the same rate as the game, whereas a hardware cursor is independent and will always update at the screen refresh rate. You can see the difference in the attached gifs, where the software cursor (block) keeps up with the hardware cursor (arrow) without vsync, but visibly lags with vsync.

I personally absolutely can't tell one frame of lag when playing an FPS game, so I always use vsync. If garbage drivers or vsync implementation bugs are somehow adding multiple frames, then yeah, that would be a problem. But I can usually tell when the devs are using a software cursor.
Attachments:
novsync.gif (109 Kb)
vsync.gif (153 Kb)
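To make the software-vs-hardware cursor distinction concrete, here is a minimal SDL2 sketch in C++. This is my own illustration, not the setup used to produce the gifs above: the block drawn in the render loop can only move once SDL_RenderPresent returns, i.e. once per vsynced frame, while the OS arrow cursor keeps updating on its own.

// An SDL2 sketch: the white block is a "software cursor" that can only move
// once per rendered frame, while the OS arrow cursor updates independently.
#include <SDL.h>

int main(int, char**)
{
    SDL_Init(SDL_INIT_VIDEO);
    SDL_Window* win = SDL_CreateWindow("cursor test",
        SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED, 800, 600, 0);
    // PRESENTVSYNC makes SDL_RenderPresent wait for the vertical blank, so
    // everything drawn here trails the real mouse position by at least a frame.
    SDL_Renderer* ren = SDL_CreateRenderer(win, -1,
        SDL_RENDERER_ACCELERATED | SDL_RENDERER_PRESENTVSYNC);

    bool running = true;
    while (running) {
        SDL_Event e;
        while (SDL_PollEvent(&e))
            if (e.type == SDL_QUIT) running = false;

        int x, y;
        SDL_GetMouseState(&x, &y);          // sampled once per rendered frame

        SDL_SetRenderDrawColor(ren, 0, 0, 0, 255);
        SDL_RenderClear(ren);
        SDL_Rect block = { x, y, 16, 16 };  // the "software cursor"
        SDL_SetRenderDrawColor(ren, 255, 255, 255, 255);
        SDL_RenderFillRect(ren, &block);
        SDL_RenderPresent(ren);             // blocks on vsync; the hardware
                                            // arrow cursor keeps moving anyway
    }
    SDL_DestroyRenderer(ren);
    SDL_DestroyWindow(win);
    SDL_Quit();
    return 0;
}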
kohlrak: If vsync were the issue, you should have the exact opposite problem: cursor should be the only thing that doesn't lag, or they should lag equally.
eric5h5: Not necessarily; lag on a cursor can be much easier to see compared to moving the camera around. If the game uses a software cursor, then it must update at the same rate as the game, whereas a hardware cursor is independent and will always update at the screen refresh rate. You can see the difference in the attached gifs, where the software cursor (block) keeps up with the hardware cursor (arrow) without vsync, but visibly lags with vsync.

I personally absolutely can't tell one frame of lag when playing an FPS game, so I always use vsync. If garbage drivers or vsync implementation bugs are somehow adding multiple frames, then yeah, that would be a problem. But I can usually tell when the devs are using a software cursor.
That's not input lag, though, but output lag. This is demonstrated by the "hardware cursor" keeping up. And, in light of my recent discovery, I have to ask what API was used to produce that? SDL? DX? OpenGL? It's starting to look like an issue with Microsoft in particular attempting to keep a long back-buffer queue to provide frame stability, which, in turn, results in output delays of several frames. Nice to see someone trying with the "hardware cursor," though, as this means that if I used FFmpeg I could easily find the exact delay.

EDIT: Vsync has 2 frames of output delay, and the visuals are clearly exacerbated by the input sampling rate. Without vsync, there's still a 1-frame delay.

I do find it curious that the dramatic back-and-forth movements are not replicated in the non-vsync version.

EDIT again: the novsync version has 1 or 2 frames of delay, the vsync version has 4 frames of delay. I just caught that the recording is at 30fps, unlike the output of the program.

Final edit: I'm going to guess that the lack of tearing during all this demonstrates that there is some vsync going on at another abstraction layer, too. At over 500fps, I should not be seeing that delay, yet I am.
Post edited May 25, 2021 by kohlrak
kohlrak: That's not input lag, though, but output lag. This is demonstrated by the "hardware cursor" keeping up. And, in light of my recent discovery, I have to ask what API was used to produce that? SDL? DX? OpenGL?
Metal, by way of Unity, so nothing to do with Microsoft. There's no difference if switching to OpenGL. I would avoid trying to do too much analysis, though, since that was put through a GIF converter at 30fps and hence does not match the actual video. Tearing isn't possible regardless of vsync since they were made with the built-in OS screen recorder; I'd have to take a video using an external source for that to occur.
kohlrak: That's not input lag, though, but output lag. This is demonstrated by the "hardware cursor" keeping up. And, in light of my recent discovery, I have to ask what API was used to produce that? SDL? DX? OpenGL?
eric5h5: Metal, by way of Unity, so nothing to do with Microsoft. There's no difference if switching to OpenGL. I would avoid trying to do too much analysis, though, since that was put through a GIF converter at 30fps and hence does not match the actual video. Tearing isn't possible regardless of vsync since they were made with the built-in OS screen recorder; I'd have to take a video using an external source for that to occur.
I've had screen tearing before with built-in recorders.

As for Unity, that could be adding quite a bit of lag, too. Unity is trash when it comes to efficiency, and probably adds its own buffer to the mix, hence the novsync version showing noticeable lag as well, despite the framerate. Here's a post on the Unity forums with some useful info about a separate but related issue. Apparently different versions of Unity handle vsync differently, which should not be the case. I'm suspecting more and more that these things keep stacking up frame buffers.

The issue he's having in particular is outside of the scope of this topic, but it's rather interesting to see that OBS is triggering artificial vsync signals.
Careful about assumptions there. I've been playing Dusk recently (uses Unity), which is very responsive and feels identical with vsync either on or off. It does have some visible tearing with vsync off, so I leave it on. Compare that to, say, Shadow Warrior Classic Redux, which is pure C++ as far as I know...if vsync is on, the title screen has some clearly noticeable "software cursor vsync lag" as demonstrated with the vsync gif, and worse still, that lag persists in the game when it comes to camera movement. Native code won't save you from poor input programming, and using a game engine doesn't force it on you.
eric5h5: Careful about assumptions there. I've been playing Dusk recently (uses Unity), which is very responsive and feels identical with vsync either on or off. It does have some visible tearing with vsync off, so I leave it on. Compare that to, say, Shadow Warrior Classic Redux, which is pure C++ as far as I know...if vsync is on, the title screen has some clearly noticeable "software cursor vsync lag" as demonstrated with the vsync gif, and worse still, that lag persists in the game when it comes to camera movement. Native code won't save you from poor input programming, and using a game engine doesn't force it on you.
The thing here is that there's a very clear indication that output lag is a low priority these days, especially given the sudden tendency to put input and game logic after the drawing code (which guarantees at least one frame of delay). Then there's the notion that, by default, DirectX saves an unspecified number of finalized buffers before putting them on the screen. Of course vsync is going to have delay issues under these conditions, because vsync is only going to display frames that are several frames behind the input. I don't know exactly where this is happening in your case (thanks to the many layers of abstraction these days), but it's clear that rendering above the refresh rate has become necessary to push through all the back buffers, and in turn this reads as "input lag." I don't recall this issue in any of my own programs, especially the toy OS I've written, which runs straight on the hardware. There's a clear indication that something is algorithmically wrong with the current systems. And to make matters worse, things like OBS simulating vsync imply that "higher" programs are able to add to the lag.
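To make the ordering complaint concrete, here is a schematic, self-contained C++ sketch; the stage functions are stubs of my own invention, not any engine's real API, with present() sleeping roughly 16 ms to stand in for a 60 Hz vsync wait. The second loop shape bakes in a full frame of latency before the swap chain or driver queues are even involved.

// Schematic only: the stage functions are stand-ins for a real engine's stages.
#include <chrono>
#include <thread>

void poll_input() { /* read mouse/keyboard state */ }
void simulate()   { /* advance game logic using that input */ }
void render()     { /* record draw calls */ }
void present()    { std::this_thread::sleep_for(std::chrono::milliseconds(16)); }

int main()
{
    // Lower-latency ordering: the frame that gets presented was built from
    // input sampled at the start of the same iteration.
    for (int frame = 0; frame < 3; ++frame) {
        poll_input();
        simulate();
        render();
        present();
    }

    // The ordering being criticized: input is sampled after present(), so it
    // can only influence the next frame, a guaranteed extra frame of lag on
    // top of whatever the swap chain and driver queues add.
    for (int frame = 0; frame < 3; ++frame) {
        render();
        present();
        poll_input();
        simulate();
    }
    return 0;
}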

And here is a post on Steam where someone is trying to address it, and it gives really, really useful info confirming that nVidia will keep multiple back buffers, too.

Of course, the whole thing explains the separation we're seeing between users, too.
Orkhepaj: btw I don't get this: why would vsync drop to 30fps if your fps is below 60 on a 60Hz monitor?
imho this is false info, could anyone link something showing whether this is true or not?
fr33kSh0w2012: No, but you CAN limit it to HALF your refresh rate, though why you would want to do that is beyond me!
Had to do that (1/2 refresh rate with V-Sync On) a long time ago for Homefront: The Revolution.

Explanation incoming - in one of the patches/updates, something went wrong w/ the shadowing process. Don't recall if an NVidia driver broke it or if a patch from DamBuster broke it, or what; been a while and all.

This was NOT the game's final patch, IIRC.

So, back then - at 60fps, with V-Sync On or even Fast-Sync on, you were getting shadows processed wrong; blank frames showing up on-screen (w/ FastSync on here, given how it isn't supposed to show blank frames - yet it was); etc etc.

This could really mess up stuff, as it was impossible to play the game b/c you'd get blank screens shown often, shadows processed wrong and some stuff wasn't showing up on-screen, etc etc.

The solution - use the NVidia Control Panel and set it to half the refresh rate w/ V-Sync On. Go figure, it worked - as everything was displaying properly w/ shadows, frames, and whatnot. [shrug]

Wasn't ideal and all, running at 30fps and w/ V-Sync On - but at least it worked back then. [shrug]