
ariaspi: At uncapped frame rate? Not really, I had no reason to. The truth is, I don't remember ever seeing tearing in my games. I always cap the frame rate to have a cooler and quieter GPU, not because of tearing. And I have no use for hundreds of FPS.

I rarely experience some tearing with YouTube videos on my laptop's screen, but not on the monitor connected to it. I assume the laptop's panel has a much slower response time, probably 15-20 ms.
kohlrak: No, capped. Putting on a cap that is not tied to vsync is likely to result in tearing. Something you can do to reduce this is to use some sort of forced vsync in addition to the manual capping (through drivers), but, depending on the game and the implementation, this could lower the framerate more than specified. The reason for the potential tearing is that the timer used can (and likely will) go out of sync with vsync, and thus you'll end up with the majority of frames torn.
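(Aside: the drift kohlrak describes can be shown with a toy calculation in Python. The numbers are hypothetical, a 60 Hz screen and a 62 FPS cap driven by its own timer, just to illustrate the idea.)

```python
# Toy calculation (made-up numbers): a 60 Hz scanout vs. a 62 fps frame
# cap driven by its own timer, not tied to vblank. The phase of each
# presented frame within the scanout cycle keeps drifting, so frames
# land mid-scan and tear.
refresh = 1 / 60   # vblank interval, seconds
cap = 1 / 62       # cap timer interval, seconds

phases = [(n * cap) % refresh for n in range(10)]
for n, p in enumerate(phases):
    print(f"frame {n}: presented {p * 1000:.2f} ms into the scanout cycle")
```

Every frame lands at a different point in the scanout, which is exactly the "majority of frames torn" situation.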

I tested it only with Dying Light and it was usually above 120 FPS. The camera moved way too quickly for my visual comfort and I didn't like it. But like I said, I didn't bother much with it, probably only half an hour. Maybe it wasn't even related to Freesync and was just the high frame rate. My understanding is that adaptive sync is really useful when the frame rate drops below 60 FPS.

I might try again sometime with some fast-paced shooters (the Unreal Tournament games would be good, I guess), but nowadays I'm playing mostly open-world games and strategies. For me, 90-120 FPS seems to be the sweet spot, so I have no reason to consume more power to run games at 144 FPS or more. And I prefer silence over high frame rates, as I don't game too often with headphones on.
kohlrak: I would still suggest vsync (with a normal screen), because that'll cap it at the monitor's refresh rate. For FreeSync, I suggest going to 70 or 75. If you're worried about power and noise, your stated FPS is probably a little high.
My FRAPS folder now has around 2000 screenshots (many were deleted over the years) and none had any tearing. Besides, the tearing will appear only on the screen/monitor, not in the frames which the GPU outputs, or the ones that a screenshot program will save.

I'm not sure why you keep suggesting to use Vsync when it is a known fact that it will introduce additional input lag. Its only usefulness is to prevent screen tearing, which I don't have.

As for the power consumption, my comment was about the uselessness of running games above the monitor's refresh rate, for people like me who don't need more than 100-120 FPS, so that's why I keep the monitor set to 120 Hz. I understand that there are people who can see the difference between 100 Hz and 144 Hz, but I don't and I'm glad for that. :) Striving for very high refresh rates can be a huge pain, trying to compromise between visual quality and high performance settings in games, or having to fork out a lot of money for the most powerful graphics cards.

Further reading on blurbusters, an excellent site.
kohlrak: No, capped. Putting on a cap that is not tied to vsync is likely to result in tearing. Something you can do to reduce this is to use some sort of forced vsync in addition to the manual capping (through drivers), but, depending on the game and the implementation, this could lower the framerate more than specified. The reason for the potential tearing is that the timer used can (and likely will) go out of sync with vsync, and thus you'll end up with the majority of frames torn.

I would still suggest vsync (with a normal screen), because that'll cap it at the monitor's refresh rate. For FreeSync, I suggest going to 70 or 75. If you're worried about power and noise, your stated FPS is probably a little high.
ariaspi: My FRAPS folder now has around 2000 screenshots (many were deleted over the years) and none had any tearing. Besides, the tearing will appear only on the screen/monitor, not in the frames which the GPU outputs, or the ones that a screenshot program will save.
The GPU is going to sync its output either way; it more or less has to. What you get in the screenshot is going to be what the screen gets.
I'm not sure why you keep suggesting to use Vsync when it is a known fact that it will introduce additional input lag. Its only usefulness is to prevent screen tearing, which I don't have.
Because the input delay is negligible, and even then it's not actually input delay in 90% of scenarios, but output delay. Input goes in asynchronously, while output does not. When you use the mouse or keyboard, the device raises an IRQ, which immediately triggers an ISR that is passed on to the driver itself; this will wake a CPU that's stuck in a wait loop. The output delay is determined by how long the game takes to draw the next frame after that process occurs. That this is what's actually going on, yet it's still considered "input delay" and "game changing" by many, is indicative that none of this makes any practical difference.

Keep in mind, we're talking two frames, whereas the average human reaction time (based on college athletes, not the general population, which is slower) is 4 frames at 30 FPS (rounding down), 9 at 60 FPS, or 19 at 120 FPS. This "input lag" is the same number of frames regardless of framerate, so a higher framerate reduces the "lag". I'm gonna go out on a limb and say that if you vsynced a handful of gamers in a tournament, they not only wouldn't notice, they also wouldn't have any statistical differences in outcomes.
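(The frame arithmetic here can be redone as a quick Python sketch. The ~150 ms reaction time is an assumed round figure, not a measurement, and the post's own frame counts imply slightly different values.)

```python
# Rough arithmetic behind the post: two frames of vsync delay vs. an
# assumed ~150 ms human reaction time, at various framerates. The
# delay is a fixed number of frames, so it shrinks in ms as FPS rises.
reaction_ms = 150  # assumed reaction time, not a measured figure
for fps in (30, 60, 120):
    frame_ms = 1000 / fps
    vsync_delay_ms = 2 * frame_ms           # the "two frames" of delay
    reaction_frames = reaction_ms / frame_ms
    print(f"{fps} fps: 2-frame delay = {vsync_delay_ms:.1f} ms, "
          f"reaction time = {reaction_frames:.1f} frames")
```

At 120 FPS the two-frame delay is a quarter of what it is at 30 FPS, which is the point about higher framerates reducing the "lag".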
As for the power consumption, my comment was about the uselessness of running games above the monitor's refresh rate, for people like me who don't need more than 100-120 FPS, so that's why I keep the monitor set to 120 Hz. I understand that there are people who can see the difference between 100 Hz and 144 Hz, but I don't and I'm glad for that. :) Striving for very high refresh rates can be a huge pain, trying to compromise between visual quality and high performance settings in games, or having to fork out a lot of money for the most powerful graphics cards.

Further reading on blurbusters, an excellent site.
Without scientific evidence, I'm gonna just say that anyone claiming to see over 100 Hz is complete BS. Science is apparently looking at no higher than 80.
I almost never experience screen tearing, but I often find V-Sync has a detrimental effect on framerate, so I almost always turn it off. Then again, I have a 144 Hz monitor with FreeSync. And as for the GPU running unnecessarily hard, I limit the framerate in the AMD Adrenalin driver for all games; there's no need to rely on the game itself to limit the framerate, as you can set it globally in your graphics driver.
Sometimes I really can't figure out how a game decides whether it uses vsync or not.

Yesterday I decided to try the GOG versions of Dungeon Siege 1 and 2 on my Windows 10 laptop (I have added the missing expansion packs to the GOG versions too, according to the instructions obtained here).

The laptop has an NVidia GPU, and in the NVidia control panel I have set:
- All games should prefer the NVidia discrete GPU over the Intel HD GPU.
- Vsync should always be enabled.

I also checked the Intel Command Center options, just in case a game tries to use the Intel HD GPU anyway, but I didn't find anything related to vsync there.

Also, in the Dungeon Siege 1 options, I recall there was a DirectX Options section or something like that, and there I also enabled vsync. In the game itself, I don't see any vsync or framerate options.

Either way, when I run Dungeon Siege 1, it appears to be running with vsync off. The internal FPS counter shows it running at like 70-200 frames per second, depending on what is on the screen. Also, the laptop fan seems to be running at full speed, which indicates that the GPU (and/or CPU) is running hot too, obviously because they are trying to run the game as fast as they can, with ludicrous framerates I never asked for.

I am not even sure whether Dungeon Siege is selecting the NVidia or the Intel GPU; there is no apparent way to tell.

But when I run Dungeon Siege 2, it seems to be using vsync, because its framerate seems to be locked at a pretty constant 60 FPS. And yes, the laptop fan is much quieter too when running DS2.

So, yeah, I just wish there was some damn way to make 10000000000% sure vsync is on, always. None of this bullshit "it works on this game, but not that one". The only logical explanation I can think of is that for some reason Dungeon Siege still selects the Intel GPU, which doesn't have vsync enabled (as I didn't find such an option in its drivers), while DS2 uses the NVidia GPU, where vsync is always forced. But then, the DS "DirectX options" still have vsync enabled, so why doesn't it work?

I am unsure if using that "RivaTuner" that someone mentioned would achieve that: framerates always locked to a max of 60 FPS, no questions asked. Then again, the RivaTuner configuration page seemed to suggest you should use it only if your GPU is always able to run higher than your monitor's refresh rate. Will it cause more problems than vsync in such cases?

Yeah, I understand that vsync will drop the framerate to 30 FPS if your GPU can't keep the framerate at 60 FPS or over, which sucks of course. The worst scenario is that it constantly jumps between 30 and 60 FPS.

Overall this isn't a big problem as I have already finished the first Dungeon Siege + expansion (on a different computer; I don't recall if I was able to use vsync on it) and I'm not going to replay it, but still I'd like to figure this out.
Post edited May 21, 2021 by timppu
I have to, because I'm sporting an MSI RTX 3090 and without it games are hyper TURBO!

I can SMELL the ENVY a MILE AWAY!
Post edited May 21, 2021 by fr33kSh0w2012
For AMD there is this neat Radeon software.
You can set all sorts of stuff for games.
And vsync and frame limit are among them, and the driver forces games to use them.
Tbh I haven't tested the vsync part, as I usually just set it in the games, but the frame limit works well.

Dunno if this is available on Linux or only for us Windows users.
Ah, you have NVidia; it probably has similar software, Windows-only.
Post edited May 21, 2021 by Orkhepaj
So is vsync like: "If you don't use it, you lose it."?
Orkhepaj: For AMD there is this neat Radeon software.
You can set all sorts of stuff for games.
And vsync and frame limit are among them, and the driver forces games to use them.
Tbh I haven't tested the vsync part, as I usually just set it in the games, but the frame limit works well.

Dunno if this is available on Linux or only for us Windows users.
Ah, you have NVidia; it probably has similar software, Windows-only.
It does!
timppu: So is vsync like: "If you don't use it, you lose it."?
If you don't use it your eyesight suffers!

I set the max framerate to 60, which is my Android TV's refresh rate!
Post edited May 21, 2021 by fr33kSh0w2012
Orkhepaj: For AMD there is this neat Radeon software.
You can set all sorts of stuff for games.
And vsync and frame limit are among them, and the driver forces games to use them.
Tbh I haven't tested the vsync part, as I usually just set it in the games, but the frame limit works well.
Not sure if you are referring to RivaTuner, but GeForce drivers also have an option to force vsync. And yes, this is in Windows 10.

But then, most laptops have both NVidia and Intel HD GPUs side by side, so I am unsure how that is handled (whether the Intel GPU will always use vsync as well). At least I didn't find a vsync option in the Intel Command Center.
Btw, I don't get this: why would vsync drop to 30 FPS if your FPS is below 60 on a 60 Hz monitor?
Imho this is false info; could anyone link something showing whether this is true or not?
Orkhepaj: Btw, I don't get this: why would vsync drop to 30 FPS if your FPS is below 60 on a 60 Hz monitor?
Imho this is false info; could anyone link something showing whether this is true or not?
No, but you CAN limit it to HALF your refresh rate, though why you would want to do that is beyond me!
Sarafan: What about you? Do you use V-sync? What is your experience with using this option?
There are games where I do have to use V-sync. I only just recently ran into a problem with Wizardry 8 where all of a sudden my party couldn't move. After an extensive search I found out that it was V-sync causing the issue: I had set it to "fast" for another game and hadn't turned it back to "on".

In older games such as UT2004, which I still play from time to time, there is no way around activating it: running at thousands of FPS with it turned off, the game would be unplayable. In some emulators like C-64/Amiga Forever (Cloanto) and also DOSBox (SVN-Daum) it is set to on.

In most modern games I tend to turn it off, running G-Sync without issues like tearing; that was an issue I would personally blame on the monitor not being able to keep up. I activate or deactivate it on a case-by-case basis and usually create profiles for my games in the control panel to achieve a balance between visual quality and performance.
Post edited May 21, 2021 by Mori_Yuki
Orkhepaj: Btw, I don't get this: why would vsync drop to 30 FPS if your FPS is below 60 on a 60 Hz monitor?
Imho this is false info; could anyone link something showing whether this is true or not?
Well, if the game is calculating only e.g. 58 frames per second while your monitor's refresh rate is 60 Hz, the GPU barely misses every second refresh cycle and has to wait for the next one... then common sense says it can display only 30 frames per second, as it has to skip every second refresh on a monitor running at 60 Hz (Hz = "times per second").
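(That halving can be sketched as a toy model in Python. It's illustrative only: it assumes simple double-buffered vsync, and real drivers can also triple-buffer, which avoids the hard drop.)

```python
import math

# Toy model of double-buffered vsync on a 60 Hz monitor: a finished
# frame is shown at the next vblank, so a frame that takes even
# slightly longer than one refresh interval (16.67 ms) has to wait a
# full extra interval, halving the effective rate to 30 fps.
def effective_fps(render_ms, refresh_hz=60):
    vblank_ms = 1000 / refresh_hz
    # How many vblank intervals each frame occupies, rounded up.
    intervals = math.ceil(render_ms / vblank_ms)
    return refresh_hz / intervals

print(effective_fps(15.0))   # fits in one interval: full 60 fps
print(effective_fps(17.2))   # a "58 fps" game misses every other vblank: 30 fps
```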

When vsync is off, the GPU doesn't care what the monitor can display, it just pushes new frames to the monitor as fast as it can without a care in the world. "La la laa, here are some more frames for you, dum dee dum." And the monitor displays what it can, even torn frames which consist of two separate (calculated) frames.

At least that is my understanding of it, I am not a games or graphics driver programmer or a GPU/monitor engineer.

GSync and FreeSync (monitors and GPUs) are supposed to be free of that problem: they can run the game (or more precisely, the monitor) at various framerates/refresh rates, also between 30 and 60 FPS (or whatever the maximum refresh rate of the screen is), without screen tearing, while still being able to lock the game to the maximum framerate instead of needlessly calculating extra frames that can never be displayed on the monitor anyway.

Or that is how I understand it...
Post edited May 21, 2021 by timppu
timppu: So is vsync like: "If you don't use it, you lose it."?
fr33kSh0w2012: If you don't use it your eyesight suffers!
My eyesight suffers from the lack of vsync?!?

Why is that? What are you, an eye doctor (an "optologist" or "cyclops" or whatever they like to call themselves)?
Post edited May 21, 2021 by timppu
fr33kSh0w2012: If you don't use it your eyesight suffers!
timppu: My eyesight suffers from the lack of vsync?!?

Why is that? What are you, an eye doctor (an "optologist" or "cyclops" or whatever they like to call themselves)?
Probably not yours, but mine does. I can't stand that screen tearing; it makes me nearly vomit!