I guess everything except GOG is shitty at once, depending on how you look at it.

Many games simply will not support a custom FPS on their own; the driver is not always capable of setting a custom FPS, and the OS may not be able to enforce a desired FPS either... at least not for every game. However, the GOG version is most likely nearly the same build as any other version, so it can not be worse than another version.
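(Mechanically, a frame cap is simple wherever it ends up being enforced; the sketch below only illustrates the general idea, with made-up names, and is not any particular game's or driver's implementation.)

# Illustrative only: the core idea behind any FPS limiter, wherever it lives.
import time

TARGET_FPS = 120
FRAME_TIME = 1.0 / TARGET_FPS

def run_capped(render_frame, seconds: float = 5.0) -> None:
    end = time.perf_counter() + seconds
    while time.perf_counter() < end:
        start = time.perf_counter()
        render_frame()
        # Sleep off whatever is left of the frame budget.
        leftover = FRAME_TIME - (time.perf_counter() - start)
        if leftover > 0:
            time.sleep(leftover)

if __name__ == "__main__":
    run_capped(lambda: None)  # stand-in for real rendering work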

In general I would let the OS run at 120 Hz, so games should sync at 120 Hz, which the monitor is compatible with. Anything above 120 simply kills pretty much any GPU and even the CPU, and with the exception of competitive shooters no game needs more frames than that.

However, that would mean lowering quality settings... there is no GPU strong enough to handle it in modern games at 4K, and in many cases not even at 1080p.

Performance simply always comes at the cost of quality, unless it is synced to 60 FPS; then a mid-range GPU can usually deliver maximum settings at 1080p (and the 5090 can go up to 4K this way). This is what I use, and thanks to my prehistoric plasma it still works well.

For 4K I would need a GPU twice as strong as the one I have, and for 4K 120 even four times as strong (which does not exist...), so it is pretty much game over at such "performance settings", unless I used fake frames... so either native at 60 FPS or fake frames at 120+ FPS... which not every game is fully compatible with, but that's another story.
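(As a rough back-of-the-envelope check of where figures like "twice as strong" and "four times as strong" come from, here is a sketch that assumes GPU load scales roughly with pixels rendered per second and a 1440p/60 starting point; both are simplifying assumptions, and real scaling depends on the game.)

# Back-of-the-envelope only: assumes GPU load scales linearly with
# pixels rendered per second, which real games only loosely follow.
def pixel_rate(width, height, fps):
    return width * height * fps

base    = pixel_rate(2560, 1440, 60)   # assumed 1440p @ 60 FPS baseline
uhd_60  = pixel_rate(3840, 2160, 60)   # 4K @ 60 FPS
uhd_120 = pixel_rate(3840, 2160, 120)  # 4K @ 120 FPS

print(round(uhd_60 / base, 2))   # 2.25 -> 4K 60 needs roughly twice the throughput
print(round(uhd_120 / base, 2))  # 4.5  -> 4K 120 roughly four times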
Post edited April 19, 2025 by Xeshra
The AfterMath Pt.2

-Everything is Shit-

As life has shown us, every shit problem always has at least two solutions and a third unknown. Can we 'trick' a system? Of course we can, since programs are programmed by humans, and humans are both part of life and faulty by design.

We are part of the same system that brought forth pollination, sonar and flight without even twisting a muscle, or so they say. Not on purpose, no, but by silly accidents called mutations... Let that fact sink in for a while... we progress by accident!!!

Now in the case of Grim Dawn, we have been offered a solution out of the blue, without addressing anyone, and we also found a solution ourselves, by deed of memory... this is also a thing; nature came up with 'memory' too. A system consisting of two separate assets: one delivered to you at birth or conception, and the other storing 'important stuff needed for survival'. We also know there is probably one hidden factor that includes a solution as well.

The offered solution, while effective, does have its drawbacks in other situations. For example, in my memory I find earlier experiences proving that this solution only works in certain cases. In fact, there are other titles that behave a bit wonky if you change the desktop refresh rate to anything other than the monitor's supported refresh rate.

Honesty, yet another miraculous result of evolution, requires me to give a similarly painted description of my own find: limiting the power draw on the GPU. I can be quick about this: limiting your GPU may give good results in most situations but might work against you in others.
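(For reference, one common way to cap power draw, assuming an NVIDIA card and the nvidia-smi tool that ships with the driver, is sketched below; the wattage is just an example, not a recommendation, the command needs administrator/root rights, and the card will reject values outside its allowed range.)

# Minimal sketch of one way to cap GPU power draw, assuming an NVIDIA
# card and the nvidia-smi CLI; run with administrator/root rights.
# The 250 W value is only an example, not a recommendation.
import subprocess

def set_gpu_power_limit(watts: int, gpu_index: int = 0) -> None:
    # nvidia-smi -i <gpu> -pl <watts> sets the board power limit,
    # clamped to the range the card actually allows.
    subprocess.run(
        ["nvidia-smi", "-i", str(gpu_index), "-pl", str(watts)],
        check=True,
    )

if __name__ == "__main__":
    set_gpu_power_limit(250)

The limit typically does not survive a reboot, so it has to be reapplied.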

As far as contemplations go

Life needs to be shit, no matter how you put it, or never put it.....
Post edited April 20, 2025 by P. Zimerickus
it's weird, a lot of people go for such brute force.
1.6 kW power house.

and in a week I'll solder a few chips that do the same thing at 200 W

XD
Post edited April 20, 2025 by XeonicDevil
XeonicDevil: it's weird, a lot of people go for such brute force.
1.6 kW power house.

and in a week I'll solder a few chips that do the same thing at 200 W

XD
Seriously?

That is absolutely astonishing! You must be a master of your trade
XeonicDevil: it's weird, a lot of people go for such brute force.
1.6 kW power house.

and in a week I'll solder a few chips that do the same thing at 200 W

XD
P. Zimerickus: Seriously?

That is absolutely astonishing! You must be a master of your trade
Truth is, as some hardware devs go forward, it becomes too easy to press copy-paste over and over... leading to one very hungry GPU/CPU.
Replaced my Sony M3 inline 27" HD with this Alienware model:

https://www.dell.com/en-us/shop/alienware-27-4k-dual-resolution-gaming-monitor-aw2725qf/apd/210-bnjj/monitors-monitor-accessories

- Gains

Both 4K and HD at the push of a button.
DisplayHDR 600
Dolby Vision

My main 2K monitor has already found a new destination, and I'm putting up my old HD monitor for sale. Of course the 165 Hz 4K refresh will probably only serve around movie time or very old games. I found the price to be quite fair, at least not as expensive as an OLED. But yeah, I also need to choose an electric before the end of the year so I can enjoy a lower level of road tax.
Initial feelings

- If this is 4K you can shove it up your **ss.
Seriously, at this size, 27", I'd rather see 2K than 4K. When I open games the wattage almost exactly doubles compared to the previous 2K settings, not to mention the so-called enhanced sharpness etc... but who knows, it took some time to get it right yesterday. It wasn't as simple as plug and play; I tried out different cables and settings.

- I do like the 1080p switch... it feels almost the same as 4K... for one third of the cost of 4K.
4K is massive overkill for 27-inchers. You really need to go 30-32+ for it to even start having a point. Otherwise, it's just tanking your FPS for almost zero gain.

Though on the other end, I'd call 1080p woefully insufficient for a 27-incher, meaning the linked monitor is something I'd never even consider. PPI below 100 is a big no-no.

27 inch 1440p remains the perfect sweet spot to me.
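(For what it's worth, those numbers are easy to check with the usual diagonal formula, PPI = sqrt(width^2 + height^2) / diagonal in inches; a quick sketch:)

# PPI = diagonal resolution in pixels / diagonal size in inches
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(1920, 1080, 27)))  # ~82  -> below the 100 PPI cutoff
print(round(ppi(2560, 1440, 27)))  # ~109 -> the 27" QHD sweet spot
print(round(ppi(3840, 2160, 27)))  # ~163 -> 4K on 27", arguably wasted
print(round(ppi(3840, 2160, 32)))  # ~138 -> where 4K starts to make sense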
Post edited June 13, 2025 by idbeholdME
I for one always wanted to skip FHD completely, so while my trusty old 1280x1024 monitor still works I'm sticking to it. I'm fond of the non-widescreen aspect ratio and always really wanted to upgrade to one with 1600x1200 or 2048x1536, or possibly even 2560x1920 or 2560x2048 if they were really large, but those don't seem to exist anymore even in used electronics stores. So when I occasionally check the market to see what would be available if I needed to replace this one quickly, I tend to look for ~32" 2560x1440 or 2560x1600, since I consider even that resolution too much for 27", and possibly even for 30". 4K strikes me as way too much at any reasonable monitor size, only suitable for home cinema screens or such things.

For example at this point the pick, if necessary, may be https://www.philips.co.uk/c-p/32M2C5501_00/curved-fast-va-gaming-monitor-quad-hd-gaming-monitor though I don't know how I'll deal with a curved one.
Post edited June 13, 2025 by Cavalary
I'm actually still quite fine with how at least the HD function looks and feels. Older titles such as Mass Effect Andromeda or Nier Automata do show a difference between 4K and HD, but newer titles like Forspoken or Dragon Age: Veilguard tend to look quite nice either way.

Also, the colouring is a step up, and I'm starting to notice small differences in how refined the image sometimes looks in 4K.

I was hoping to experiment with DSR (Dynamic Super Resolution), Nvidia's technique for rendering above native resolution and downscaling, but so far DSR seems to have left the building, which may or may not be a result of using HDR... guess I have some researching to do!

The next step does seem clear though: if I ever have enough money and enough desktop space, next time we go for OLED, with of course true blacks and at least 35 inches!

Edit

DSR - as it turns out, DSR is only available at resolutions lower than 4K. I've been rummaging about and had satisfying results in several titles.
Post edited June 14, 2025 by Mr. Zim
Had my fun again testing different situations...

Downgrading resolution from 4K, upgrading resolution from 1080p, I tried out the different genres. I found the Total Wars and other manager games such as Foundation or Railway Empire better suited to HD than 4K, especially if the graphical style is more comical than realistic.

For day-to-day use and streaming, the 4K stance is most certainly preferred, and maybe for gaming too, but not with this GPU (3090 Ti).

All in all the monitor is feature-packed. Take HDR, for example: if I don't enable HDR in Windows, the monitor automatically takes over with its own settings such as Dolby Vision or DisplayHDR 600, and this can be turned off as well. Then we have picture-in-picture, the ability to turn the monitor 90 degrees, of course the resolution switch, and probably many other little features I can't even name.

With this monitor, I'm also better able to understand the whole 'realism' discussion that's been going on concerning fake frames and upscaling techniques. It says enough that you can try modern games at three different resolutions only to find that the differences, at least on a mid-to-high tier GPU, are almost negligible. And while I understand the push to double computing power every generation, it becomes more and more evident that smart choices and well-researched options matter more and more, given the ever-increasing demands developers put on the shoulders of, on average, very young gamers.
Meaning: a console is good enough for children!

edit

Cyberpunk... it works in 4K on reasonable terms... ray tracing enabled, neural network super performance mode, and voila, below 300 W... still hot but, especially compared to the 'others', acceptable to say the least. Disable ray tracing and you have a very enjoyable 4K experience.
Post edited June 15, 2025 by Mr. Zim
The thing I wonder the most: which monitor?!
It is the opposite of advertisement... kinda fun.
Personally, I'm still sitting on a TN display. A high-end one, but TN:
https://www.displayspecifications.com/en/model/f9762045

Reason? I can see everything fine without having the brightness at eye-searing values. I usually operate at around 20-25% brightness (90-110 out of 450 in the OSD), depending on the situation: around 110 for gaming, 90 for non-gaming use.

And the specific linked monitor looks staggeringly good, considering it's "only" TN. Been using it since 2020 with no issues like bleeding or burn-in either. The bad rep that TN has is, in my opinion, vastly overblown.
Post edited June 16, 2025 by idbeholdME
I dunno... I do not care about rep, I only care about facts based on the specs I adore. Among these specs are the grey-to-grey response time and the internal input lag of the screen's electronics (grey-to-grey 1 ms or lower, input lag 10 ms or lower), the gamut coverage, which nowadays should be at least 100% DCI-P3°° (simply the modern demand and standard for high-end screens), and a non-dynamic peak brightness of 1000 nits or more, while offering near-perfect black levels. On Hz I am not too demanding, because in fact I have no game that will benefit from over 120 Hz; but because OLED works differently than plasma (no "picture decay"), an OLED is far more sensitive to motion sharpness when not enough frames are delivered. Although a modern OLED can always interpolate, at the cost of input lag. Sure, more Hz is better, but realistically at 4K even the best GPUs will bite the dust above 120 FPS°°°, and that frame rate is required for syncing = no tearing or other issues.

°° In some very far future maybe even Rec.2020, but the technology is still far from that level.
°°° Unless I lower picture quality drastically, and as a non-competitive gamer that is not an option.

Although I do not use a usual monitor... way too small for my needs. I currently use 50 inches and my next TV will be at least 77 inches, not any less, with the specs I already mentioned. A modern LG OLED can already handle it, but as long as the plasma is still going strong I can wait for an even better TV and an even better GPU that can actually handle 4K at 120 FPS and high settings (which is currently not the case for most modern games).

I never had any interest in LCD and the like. In fact, the only screens I still own that ever used this technology (TN is based on the same basic technology) are a now almost broken Surface 7 notebook and my very first mobile phone, which has been "out of order" for over a year now. As a big TV, I never had any LCD... not even one.
Post edited June 16, 2025 by Xeshra
Xeshra: I dunno... I do not care about rep, I only care about facts based on the specs I adore.
Same.

The most important specs to me are:
- Not having to play panel lottery (especially for IPS)
- Easily viewable at ~20-25% of peak brightness in a well-lit room. As someone with sensitive eyes, I find most displays have to be insanely bright to reach their stated specs. TN happens to have the best low-brightness performance in my experience: at 20% brightness I can see dark areas that other displays (IPS, OLED etc.) need to be at something like 80% to even start showing, and at that point anything non-dark is burning my eyeballs out.

- High refresh rate. I've settled on 240, don't see much point in higher. Still slightly noticed the bump from 165 to 240 in very fast paced games. But yes, slightly, so no point in going higher for me.

- 27 inch max size. Sitting close to me at a table, I already find 27 to be the top comfortable limit. Would have to move my head constantly with anything bigger.

- Minimal input lag. Same for ghosting and other visual artifacts. The Lenovo I have seems to have none. G-Sync Ultimate probably helps a lot in this regard.

- PPI not below 100 (meaning QHD on a 27 incher). Pixel density is one of the most important things when it comes to image quality. Used to have a 24 inch QHD display, but find 27 QHD to be basically perfect. Great clarity without the performance nuke that is 4K.

Don't really care much about what some would call "key specs", like peak brightness, HDR or color space coverage. Colors, I find, are really subjective in the end.
Post edited June 16, 2025 by idbeholdME