Hello sweet people,

I'd like to buy a PC gaming monitor that suits my specifications, but there are so many models that I need some advice, so if you are using or know of good models, please reply!

My PC specifications :
Windows 10 Home 64 bits
AMD Ryzen 5 2600X Six-Core Processor 3.60 GHz
RAM: 16 GB
Nvidia GeForce GTX 1070 Ti

My criteria are :
32 or 34 inches
Budget : 600 EUR maximum
Resolution : above 1920x1080, ideally 2560x1440 (or more if my card doesn't struggle)
Refresh rate : more than 60 Hz, I guess
Brand : I don't care as long as it's serious

Besides, I have some questions:

1. I don't understand which monitor sync technology is best for my graphics card. They say some sync technologies are card-specific, but then they say you can use the competitor's sync with certification too (AMD/Nvidia), or even a neutral sync?

2. I never play the latest graphics monsters; the most demanding game I own must be The Witcher 3. But if I get a monitor with a native resolution of e.g. 3440x1440 or 2560x1440, and the card struggles with a demanding game, can I temporarily lower the resolution and refresh rate (let's say to good old 1920x1080 @ 60 Hz) so the PC won't suffer, without the game being too pixelated or showing side effects?

3. For web sites and work applications, do you see any noticeable graphic difference between e.g. 3440x1440 and 2560x1440 (if you are not a graphics expert)?

4. I'm used to 16:9 display ratio, but there are a lot of 21:9 monitors now; would they stretch the picture horizontally or can they be configured to display the traditional 16:9 ratio?

Thank you in advance for your insights and your monitor recommendations!
This question / problem has been solved by AB2012
avatar
r8V9b1X3u9VcA12p: My criteria are : 32 or 34 inches
Resolution : above 1920x1080, ideally 2560x1440 (or more if my card doesn't struggle)
Refresh rate : more than 60 Hz, I guess
32" is usually 16:9 (2560x1440). 34" is usually Ultrawide (2560x1080 or 3440x1440). 75Hz is increasingly prevalent for budget monitors (almost becoming the new 60Hz).
avatar
r8V9b1X3u9VcA12p: 1. I don't understand which monitor sync technology is best for my graphics card. They say some sync technologies are card-specific, but then they say you can use the competitor's sync with certification too (AMD/Nvidia), or even a neutral sync?
FreeSync = compatible with both Nvidia and AMD (on Nvidia cards it works over DisplayPort as "G-Sync Compatible"). G-Sync = Nvidia only.
avatar
r8V9b1X3u9VcA12p: 2. I never play the latest graphics monsters; the most demanding game I own must be The Witcher 3. But if I get a monitor with a native resolution of e.g. 3440x1440 or 2560x1440, and the card struggles with a demanding game, can I temporarily lower the resolution and refresh rate (let's say to good old 1920x1080 @ 60 Hz) so the PC won't suffer, without the game being too pixelated or showing side effects?
Yes, though it won't look as good as the native resolution. For a 2560x1440 16:9 monitor you could drop to 1920x1080. For a 3440x1440 ultrawide, you'd drop to 2560x1080 (to maintain the same shape). The refresh rate would remain whatever maximum your monitor supports (60 Hz, 75 Hz, 144 Hz, etc.).
avatar
r8V9b1X3u9VcA12p: 3. For web sites and work applications, do you see any noticeable graphic difference between e.g. 3440x1440 and 2560x1440 (if you are not a graphics expert)?
Web sites are mostly vertical, so ultrawides don't benefit that much horizontally. Applications that do benefit are audio / video editing (horizontal timelines), multiple pages in Word / PDF readers, more columns in Excel, and games that support it. Most movies in 2.35:1 can also be viewed fullscreen without black bars, though most TV shows will show black bars (the opposite of 16:9). Multi-tasking also benefits, e.g. viewing two documents / web pages side by side will be less squashed than on a 16:9.
avatar
r8V9b1X3u9VcA12p: 4. I'm used to 16:9 display ratio, but there are a lot of 21:9 monitors now; would they stretch the picture horizontally or can they be configured to display the traditional 16:9 ratio?
For games that support Ultrawide plus the Windows desktop, etc, you'd use the screen's native Ultrawide resolution. For games that don't support Ultrawide, you'd use the nearest 16:9 size of identical height (1920x1080 on 2560x1080 screen or 2560x1440 on 3440x1440 screen) and have it centered with vertical black bars to prevent any stretching. 16:9 content viewed on a 34" 21:9 basically looks exactly the same shape & size as 16:9 content viewed on a 27" 16:9 monitor. For old 4:3 games, it's the same principle but 1440x1080 or 1920x1440 centered as it is for 16:9 displays.
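The "same height, centered with side bars" rule described above is simple arithmetic; here's a minimal sketch (the helper name is made up for illustration, not from any real monitor tool):

```python
def centered_resolution(native_w, native_h, aspect_w, aspect_h):
    """Resolution of aspect_w:aspect_h content displayed at full height,
    centered with black bars, on a native_w x native_h screen."""
    content_w = native_h * aspect_w // aspect_h   # keep the height, shrink the width
    bar_each_side = (native_w - content_w) // 2   # black bar width per side, in pixels
    return content_w, native_h, bar_each_side

# 16:9 content on a 3440x1440 ultrawide -> 2560x1440 with 440 px bars each side
print(centered_resolution(3440, 1440, 16, 9))
# 4:3 content on the same screen -> 1920x1440 with 760 px bars each side
print(centered_resolution(3440, 1440, 4, 3))
```

The same function reproduces the 2560x1080 case too: 16:9 content on it comes out as 1920x1080 with 320 px bars per side.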
Post edited June 18, 2021 by AB2012
avatar
r8V9b1X3u9VcA12p: Hello sweet people,

I'd like to buy a PC gaming monitor that suits my specifications, but there are so many models that I need some advice, so if you are using or know of good models, please reply!

My PC specifications :
Windows 10 Home 64 bits
AMD Ryzen 5 2600X Six-Core Processor 3.60 GHz
RAM: 16 GB
Nvidia GeForce GTX 1070 Ti

My criteria are :
32 or 34 inches
Budget : 600 EUR maximum
Resolution : above 1920x1080, ideally 2560x1440 (or more if my card doesn't struggle)
Refresh rate : more than 60 Hz, I guess
Brand : I don't care as long as it's serious

Besides, I have some questions:

1. I don't understand which monitor sync technology is best for my graphics card. They say some sync technologies are card-specific, but then they say you can use the competitor's sync with certification too (AMD/Nvidia), or even a neutral sync?

2. I never play the latest graphics monsters; the most demanding game I own must be The Witcher 3. But if I get a monitor with a native resolution of e.g. 3440x1440 or 2560x1440, and the card struggles with a demanding game, can I temporarily lower the resolution and refresh rate (let's say to good old 1920x1080 @ 60 Hz) so the PC won't suffer, without the game being too pixelated or showing side effects?

3. For web sites and work applications, do you see any noticeable graphic difference between e.g. 3440x1440 and 2560x1440 (if you are not a graphics expert)?

4. I'm used to 16:9 display ratio, but there are a lot of 21:9 monitors now; would they stretch the picture horizontally or can they be configured to display the traditional 16:9 ratio?

Thank you in advance for your insights and your monitor recommendations!
Well, I am not an expert on the matter; for reference, I have an Asus 34” ultrawide curved monitor and it is great.

https://www.amazon.com/Acer-X34-Pbmiphzx-UltraWide-Technology/dp/B079FV8S5M/ref=sr_1_2?crid=1ZJPSIJRI05KB&dchild=1&keywords=acer+predator+curved+monitor&qid=1624037394&sprefix=acer+predator+curved%2Caps%2C284&sr=8-2

Had it for a few years and I'm really happy with it.
You have an Nvidia card, so I would highly recommend getting G-Sync. A 1070 Ti should be able to push out most games at 3440x1440; my previous card was a 3080 FTW and that ran fine for almost everything.
I would recommend a 1 ms response time. And higher Hz is good; mine goes up to 120, but I limit it to 80.
In terms of size, I find over 35” is too big; you would need to be across the room to not strain your eyes at that width.
So your questions:
1) G-Sync; it's the best option with an Nvidia card.
2) You should be fine at 3440x1440 with your card.
3) It's more about real estate for web pages and apps. The difference is roughly 3x HD (a bit like three HD monitors). If you want lots of windows or tabs open across the screen or in their own windows, then more resolution fits more on the screen. It won't really change what is displayed.
4) They can be set up to do multiple things. If I use the PS4 with it, I get black bars at the sides, as that is only HD output. But there are stretch and similar options; I don't know how good they are though, and it might also depend on the software renderer.

Now a question you haven’t asked:
What type of panel? There are IPS, TN, and VA. https://www.howtogeek.com/658701/tn-vs.-ips-vs.-va-whats-the-best-display-panel-technology/

I would say grab an older-model curved ultrawide; you should be able to get one near your budget (mine was £800 back when). >60 Hz, 1 ms response time, G-Sync, IPS.
I'm very happy with the LG I bought last Black Friday: the 27GL850. It's G-Sync compatible, which works great, and 1440p, which the 2060 Super handles fine most of the time, though for some modern games I need to dial things down quite a bit. For web browsing and movies (Netflix, YouTube and such) it is a pleasure for the eyes. So yes, I would recommend this LG and its closest comparisons from other brands. Oh, and the stand is height-adjustable, and you can rotate the screen around the vertical axis, though it is a bit wobbly.

And to answer your other question more clearly: yes, with this monitor and DisplayPort I can run at quite low fps without any negative results, though under 48 fps it becomes noticeable in games tuned for at least 60 fps.
Post edited June 18, 2021 by Zimerius
You can use PCPartPicker (PCPP) to find a monitor with your specs. If you're not going to upgrade your GPU anytime soon, it can comfortably play AAA games at 1080p at 70-130 FPS depending on settings and game.

1. The GTX 1070 Ti uses HDMI 2.0b, so you can't get FreeSync over HDMI like you would with newer cards that have the HDMI 2.1 standard; you have to use DisplayPort*. G-Sync is better and more expensive, but some people might not notice the difference between the two. Just make sure to check each monitor's specific FreeSync range so it covers the Hz you'll be working at.

* There can be handshaking issues between the monitor, GPU, and Windows: if you turn your monitor off and on again, you might get general unresponsiveness that requires unplugging it from the GPU and replugging, so I would just leave everything on all the time.

2. Yes, just change the in-game resolution settings. If the game has the option, you can also change the refresh rate, OR use RTSS to cap your frame rate. The former is better for input lag, but RTSS is better for stutter-free play. Only you can decide how bad the blurring looks by visiting a PC shop and demoing it.

3. Can't comment.

4. You can change aspect ratio settings inside Nvidia Control Panel under [Display] > [Adjust desktop size and position] and through in-game resolution/aspect ratio by tinkering around in there. If you keep the settings at default (with refresh rate set to monitor's spec and G-sync / Freesync enabled), then you'll only be seeing black bars similar to launching a 4:3 game on a 16:9 monitor.

This is a filtered list of 16:9 (2560x1440) and 21:9 (3440x1440) IPS monitors with adaptive sync. You can also include VA panels or lower to 27" for a bigger selection if it looks OK to you too.

https://fr.pcpartpicker.com/products/monitor/#r=344001440,256001440&P=2&A=2,3,6,5&sort=price&F=800100000,863600000&X=0,60385

EDIT: fixed reversed resolutions, thanks teceem.
Post edited June 18, 2021 by Canuck_Cat
ngl, based on the username I thought this was spam :P
avatar
Canuck_Cat: You can use PCPartPicker (PCPP) to find a monitor with your specs. If you're not going to upgrade your GPU anytime soon, it can comfortably play AAA games at 1080p at 70-130 FPS depending on settings and game.

1. The GTX 1070 Ti uses HDMI 2.0b, so you can't get FreeSync over HDMI like you would with newer cards that have the HDMI 2.1 standard; you have to use DisplayPort*. G-Sync is better and more expensive, but some people might not notice the difference between the two. Just make sure to check each monitor's specific FreeSync range so it covers the Hz you'll be working at.

* There can be handshaking issues between the monitor, GPU, and Windows: if you turn your monitor off and on again, you might get general unresponsiveness that requires unplugging it from the GPU and replugging, so I would just leave everything on all the time.

2. Yes, just change the in-game resolution settings. If the game has the option, you can also change the refresh rate, OR use RTSS to cap your frame rate. The former is better for input lag, but RTSS is better for stutter-free play. Only you can decide how bad the blurring looks by visiting a PC shop and demoing it.

3. Can't comment.

4. You can change aspect ratio settings inside Nvidia Control Panel under [Display] > [Adjust desktop size and position] and through in-game resolution/aspect ratio by tinkering around in there. If you keep the settings at default (with refresh rate set to monitor's spec and G-sync / Freesync enabled), then you'll only be seeing black bars similar to launching a 4:3 game on a 16:9 monitor.

This is a filtered list of 16:9 (3440x1440) and 21:9 (2560x1440) IPS monitors with adaptive sync. You can also include VA panels or lower to 27" for a bigger selection if it looks OK to you too.

https://fr.pcpartpicker.com/products/monitor/#r=344001440,256001440&P=2&A=2,3,6,5&sort=price&F=800100000,863600000&X=0,60385
hah, might as well say nothing

;) everybody does it once in a while so no worries ;p
Post edited June 18, 2021 by Zimerius
avatar
Zimerius: snip
I got ninja'd preparing my response to ensure accuracy. :(
avatar
Canuck_Cat: 16:9 (3440x1440) and 21:9 (2560x1440)
= 16:9 (2560x1440) and 21:9 (3440x1440)
avatar
r8V9b1X3u9VcA12p: 4. I'm used to 16:9 display ratio, but there are a lot of 21:9 monitors now; would they stretch the picture horizontally or can they be configured to display the traditional 16:9 ratio?
Stretch / scale vs. keep aspect ratio / original size is a setting in your Nvidia control panel. (I believe the GTX 1070 can also do integer scaling)

I've only seen non-preventable stretching in some older laptops...

Anyway: if higher-than-60Hz refresh rate isn't important, you can buy some high quality 1440P screens for a nice price. The big question here is: do you care about "fast/twitchy" action?
The Witcher 3 should play fine on your system at 1440p with high graphics settings. It does on mine (GTX 970), at 40-50 fps.
Are you planning to buy a monitor for the next 10 years or more, or to upgrade a lot sooner? What do you care about most: (ultra) fast action or graphical fidelity/quality? Trying to combine those aspects will put you in a high price range.

In 2015/2016, I decided to get a 1440p (27") monitor and not a 4K one. Main reason: I still have quite a few 2000s 3D games in my backlog. At 1440p I can play them at native resolution without the (non-scalable) UI getting too small. Impossible at 4K in native res.
4K on a 32" is better/larger (the non-scalable UI), but still too small.
Maybe this all sounds too "technical" for you... simple conclusion: if you play "older" 3D games, they look their best at 1440p maximum (and 27" minimum)! (or a REALLY big 4K screen/TV)
avatar
r8V9b1X3u9VcA12p: 3. For web sites and work applications, do you see any noticable graphic differences between e. g. 3440x1440 and 2560x1440 (if you are not a graphic expert)?
Resolution, by itself, doesn't say anything about "graphics" (sharpness). PPI (pixels per inch) is what you're looking for...
1080p might look very sharp on a 5" smartphone screen... but very different on (let's say) a 5m x 10m screen (exaggerating to make a point). PPI is an absolute measure of sharpness that takes the size of the screen into account. In the end, all that matters is "how big is a pixel?". Higher PPI -> smaller pixel -> sharper image.
About your example: 2560x1440 (27") and 3440x1440 (34") usually have the same PPI. The extra pixels of 3440 are used for the extra horizontal space.

Here's a calculator:
https://www.calculatorsoup.com/calculators/technology/ppi-calculator.php
Higher ppi / smaller dot pitch = sharper image
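The calculator's formula is just Pythagoras: pixel count along the diagonal divided by the diagonal size in inches. A quick illustrative sketch confirms that the two screens mentioned above land at nearly the same PPI:

```python
import math

def ppi(width_px, height_px, diagonal_inches):
    """Pixels per inch: pixel count along the diagonal / diagonal length."""
    return math.hypot(width_px, height_px) / diagonal_inches

print(round(ppi(2560, 1440, 27), 1))  # 27" 16:9      -> 108.8
print(round(ppi(3440, 1440, 34), 1))  # 34" ultrawide -> 109.7
```

Both come out around 109 PPI, which is why text looks equally sharp on either screen; the ultrawide just adds width.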

Final note: viewing distance to the screen is not unimportant, but it's usually not a variable when you're sitting at a desk.
I've found that 27" (or 34" ultrawide) is big enough to comfortably view at a desk, but you might feel differently about it. You could put a bigger screen further away on your desk (depending on how wide it is) or wall-mount it. Me, I like my screen close by.
Post edited June 18, 2021 by teceem
avatar
Canuck_Cat: 16:9 (3440x1440) and 21:9 (2560x1440)
avatar
teceem: = 16:9 (2560x1440) and 21:9 (3440x1440)
avatar
r8V9b1X3u9VcA12p: 4. I'm used to 16:9 display ratio, but there are a lot of 21:9 monitors now; would they stretch the picture horizontally or can they be configured to display the traditional 16:9 ratio?
avatar
teceem: Stretch / scale vs. keep aspect ratio / original size is a setting in your Nvidia control panel. (I believe the GTX 1070 can also do integer scaling)

I've only seen non-preventable stretching in some older laptops...

Anyway: if higher-than-60Hz refresh rate isn't important, you can buy some high quality 1440P screens for a nice price. The big question here is: do you care about "fast/twitchy" action?
The Witcher 3 should play fine on your system at 1440p with high graphics settings. It does on mine (GTX 970), at 40-50 fps.
Are you planning to buy a monitor for the next 10 years or more, or to upgrade a lot sooner? What do you care about most: (ultra) fast action or graphical fidelity/quality? Trying to combine those aspects will put you in a high price range.

In 2015/2016, I decided to get a 1440p (27") monitor and not a 4K one. Main reason: I still have quite a few 2000s 3D games in my backlog. At 1440p I can play them at native resolution without the (non-scalable) UI getting too small. Impossible at 4K in native res.
4K on a 32" is better/larger (the non-scalable UI), but still too small.
Maybe this all sounds too "technical" for you... simple conclusion: if you play "older" 3D games, they look their best at 1440p maximum (and 27" minimum)! (or a REALLY big 4K screen/TV)
avatar
r8V9b1X3u9VcA12p: 3. For web sites and work applications, do you see any noticable graphic differences between e. g. 3440x1440 and 2560x1440 (if you are not a graphic expert)?
avatar
teceem: Resolution, by itself, doesn't say anything about "graphics" (sharpness). PPI (pixels per inch) is what you're looking for...
1080p might look very sharp on a 5" smartphone screen... but very different on (let's say) a 5m x 10m screen (exaggerating to make a point). PPI is an absolute measure of sharpness that takes the size of the screen into account. In the end, all that matters is "how big is a pixel?". Higher PPI -> smaller pixel -> sharper image.
About your example: 2560x1440 (27") and 3440x1440 (34") usually have the same PPI. The extra pixels of 3440 are used for the extra horizontal space.

Here's a calculator:
https://www.calculatorsoup.com/calculators/technology/ppi-calculator.php
Higher ppi / smaller dot pitch = sharper image

Final note: viewing distance to the screen is not unimportant, but it's usually not a variable when you're sitting at a desk.
I've found that 27" (or 34" ultrawide) is big enough to comfortably view at a desk, but you might feel differently about it. You could put a bigger screen further away on your desk (depending on how wide it is) or wall-mount it. Me, I like my screen close by.
That last point is a good one. I spend most of my gaming time slightly further away than the standard sitting desk position. I have a sit-stand desk, so I'm either a foot or more further away (standing, with a controller) or half a foot (keyboard and mouse). Go slightly smaller for closer, or bigger for couch play. It also depends on your field of view, of course; mine is average, and women tend to have a far wider one.
avatar
nightcraw1er.488: That last point is a good one. I spend most of my gaming time slightly further away than the standard sitting desk position. I have a sit-stand desk, so I'm either a foot or more further away (standing, with a controller) or half a foot (keyboard and mouse). Go slightly smaller for closer, or bigger for couch play. It also depends on your field of view, of course; mine is average, and women tend to have a far wider one.
I've noticed that nowadays, computer desks (marketed as such) are rarely wider than 60 cm (23 inches). Even though my screen is relatively close to me, I like a wider desk, for things like *big* speakers and having power sockets on the desk behind my screen (not on the floor).

(a bit) off-topic: my 65" TV is standing on a cabinet and is not mounted to the wall. Why? Because it's not saving space when an AV receiver, subwoofer, UHD Blu-ray player, consoles (not for me), etc. are connected to it.

Desktop computers, screens, big TV, ... many of them are all in our living room. We live in our living room - it's not the show-our-decorative-"superpowers"-to-visitors room.
I'm sure that man-in-the-mancave and woman in charge of decorating the rest is still a popular stereotype. Excrements on stereotypes, I say! :-P
Post edited June 18, 2021 by teceem
avatar
nightcraw1er.488: That last point is a good one. I spend most of my gaming time slightly further away than the standard sitting desk position. I have a sit-stand desk, so I'm either a foot or more further away (standing, with a controller) or half a foot (keyboard and mouse). Go slightly smaller for closer, or bigger for couch play. It also depends on your field of view, of course; mine is average, and women tend to have a far wider one.
avatar
teceem: I've noticed that nowadays, computer desks (marketed as such) are rarely wider than 60 cm (23 inches). Even though my screen is relatively close to me, I like a wider desk, for things like *big* speakers and having power sockets on the desk behind my screen (not on the floor).

(a bit) off-topic: my 65" TV is standing on a cabinet and is not mounted to the wall. Why? Because it's not saving space when an AV receiver, subwoofer, UHD Blu-ray player, consoles (not for me), etc. are connected to it.

Desktop computers, screens, big TV, ... many of them are all in our living room. We live in our living room - it's not the show-our-decorative-"superpowers"-to-visitors room.
I'm sure that man-in-the-mancave and woman in charge of decorating the rest is still a popular stereotype. Excrements on stereotypes, I say! :-P
Yep, my sit-stand (which is almost all stand now) is probably half a meter to a meter deep and quite wide. The monitor is right at the back, so even seated I'm further back, plus I mostly stand back a ways. Otherwise, with a wide or large monitor you end up with strain at the periphery of your vision, i.e. flickering and shadows at the edge of view.
avatar
nightcraw1er.488: ... probably half a meter to a meter deep and quite wide....
This is getting confusing! When I said "width" (/wide), I meant the shortest side (of a desk - the opposite of length)... the same as depth, I guess. ;-D

Edit: I just checked it with my resident language expert: width/length is about dimensions, depth is about perspective. ;-)
Anyway... not that important... details, semantics, etc.
Post edited June 19, 2021 by teceem
You might want to consider ultra wide or even 4k rather than high framerate. I have a 144hz monitor and love high framerate, but it's very rare I get to really use it because so many games either break above 60fps or can't be run much above that because of processor limitations. It's annoying to go back and forth from 60 to 144, to the point I mostly just game at 60 now no matter what for consistency.

So in retrospect I wish I got an ultrawide 1440 monitor, or even 4k. Though you could also get normal 1440p and play at max settings with a mid-range card like yours without much worry.
Some of my thoughts on the topic:
Anything larger than 27 inches will have shitty pixel density with only 1440p. Unless you are sitting like a meter from your display, there will be noticeable pixelation.

Yes, you can downgrade the resolution but generally when it comes to monitors, anything other than highest native resolution will look pretty bad.

Most content is still made for 16:9. Meaning you will have to deal with black bars and potentially FOV issues if you go ultra-wide. For example, my phone uses a 1080 x 2280, 19:9 ratio, but the extra width is completely wasted when watching a Youtube video (black bars).
avatar
r8V9b1X3u9VcA12p: 1. I don't understand which monitor sync technology is best for my graphics card. They say some sync technologies are card-specific, but then they say you can use the competitor's sync with certification too (AMD/Nvidia), or even a neutral sync?
avatar
AB2012: Freesync = Both nVidia and AMD compatible. GSync = nVidia only.
It should also be noted that yes, G-Sync is Nvidia only, but it also functions better than FreeSync on an Nvidia card. There are also several tiers of G-Sync, up to Ultimate; the tier influences the supported framerate range and how the monitor behaves at different framerate break-points.
Post edited June 19, 2021 by idbeholdME