As I said in another thread, I got a new gaming laptop built by Asus. So far so good: I'm pretty much in love with the thing, no issues whatsoever. Now, there's something that bugs me: the PC's native screen has a standard 60Hz refresh, but I'm using it with a 144Hz, G-Sync powered gaming monitor by AOC. Therefore I can "push" the GPU way beyond the original screen's refresh, and I regularly play at ~144Hz/fps with almost all the games I'm committed to right now (Diablo III, Portal 2, ...).

As I said, everything works fine with no issues: the temperatures seem to be perfectly OK for this kind of system, no overheating, no throttling, no "screaming" fans, nothing at all... But I fear that, in the future, this 144Hz refresh thing might be too much of a strain for the GPU (a GeForce GTX 1060), with negative consequences for the longevity of my hardware.

So I'm asking: is my doubt legit, or am I just troubling myself with a pile of worthless crap? :-P
Forget about the future, it can't do 144Hz now.

Diablo 3 is 6 years old. Portal 2 is 7 years old. No wonder it has no problems running those games. Both games are from companies that make very well optimized PC games too. Try getting 144Hz on a brand new AAA game, or worse, a brand new unoptimized console port. Maybe with lower graphics settings, but that won't last long.

By all means, I've had the same gaming laptop for the past 5 years and I'm happy with it. I play around medium and get 60fps on well-optimized games, 30fps on the poorly optimized ones. But if you're dead set on 144Hz on a laptop, unless you're only interested in playing games from the last generation, yeah, prospects aren't good.
"So I'm asking: is my doubt legit or I'm just troubling myself with a pile of worthless crap? :-P"

Running it at full tilt may well affect the hardware if you're unlucky, though it usually isn't the sustained load itself that's the problem so much as thermal cycling (heating up and cooling down); thermal throttling ought to kick in before any heat damage occurs. A laptop with a 1060 ought to be designed with that in mind, and there's little point in buying one and then not gaming on it on the off chance that it fails in 48 months' time instead of 44, so I personally wouldn't worry about it.
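If you ever want to check whether thermal throttling is actually kicking in rather than guessing, nvidia-smi can usually report it. Here's a rough Python sketch of the idea (it assumes nvidia-smi is on your PATH and that your driver exposes these particular query fields; if yours doesn't, treat this as illustration only):

# Rough sketch: ask nvidia-smi for the GPU temperature and its thermal-throttle flags.
# Assumes nvidia-smi is on PATH and the driver exposes these query fields.
import subprocess

FIELDS = ("temperature.gpu,"
          "clocks_throttle_reasons.sw_thermal_slowdown,"
          "clocks_throttle_reasons.hw_thermal_slowdown")

def gpu_thermal_status():
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=" + FIELDS, "--format=csv,noheader"],
        text=True,
    ).strip()
    temp, sw_throttle, hw_throttle = [v.strip() for v in out.split(",")]
    return temp, sw_throttle, hw_throttle

temp, sw, hw = gpu_thermal_status()
print("GPU temp: " + temp + " C, SW thermal slowdown: " + sw + ", HW thermal slowdown: " + hw)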

A laptop 1060 will always run worse than a desktop one, and a desktop 1060 (even the full 6GB version rather than the gimped 3GB one) will struggle to get close to 144Hz at 1080p in a lot of games unless graphics fidelity is reduced. That's largely the point of adaptive sync, though: it doesn't matter so much if you don't hit 144Hz.

IMO 100+Hz monitors are worth it for being kinder on the eyes even when on desktop anyway.
KingofGnG: But I fear that, in the future, this 144Hz refresh thing would be too much of a strain for the GPU (a GeForce GTX 1060) with negative consequences for the longevity of my hardware.
Your hardware will last as long as it lasts, not due to any strain caused by a monitor. Also it will help in the future if your monitor is a gsync one. It makes lower frames look like higher ones.
If you have a 144Hz monitor and you use V-Sync, does it go to 60 or 144?

Gonna get one of those 144Hz 4K monitors when they drop, but I always use V-Sync in my games cause I hate screen tearing.
Post edited May 07, 2018 by DreamedArtist
I know I'm in no hurry to get a 144hz 4k monitor because I'm in no hurry to pay for a rig that could effectively use it.
DreamedArtist: If you have a 144Hz monitor and you use V-Sync, does it go to 60 or 144?

Gonna get one of those 144Hz 4K monitors when they drop, but I always use V-Sync in my games cause I hate screen tearing.
I can't say for sure as I don't have one, but it should go to 144 with V-Sync. Some games also have a framerate limit which is usually set to 60, so you would have to turn that off. It's a good idea to use adaptive V-Sync, because with most games you won't hit 144 fps.
I'm in no hurry for either 4K or 144hz, considering a large number of companies are struggling to get stable 1080/60.

That, and I feel diminishing returns are in full effect.
Post edited May 07, 2018 by Darvond
DaCostaBR: Forget about the future, it can't do 144Hz now.

Diablo 3 is 6 years old. Portal 2 is 7 years old. No wonder it has no problems running those games. Both games are from companies that make very well optimized PC games too. Try getting 144Hz on a brand new AAA game, or worse, a brand new unoptimized console port. Maybe with lower graphics settings, but that won't last long.

By all means, I've had the same gaming laptop for the past 5 years and I'm happy with it. I play around medium and get 60fps on well-optimized games, 30fps on the poorly optimized ones. But if you're dead set on 144Hz on a laptop, unless you're only interested in playing games from the last generation, yeah, prospects aren't good.
Yeah, I know, but my backlog is pretty full of games that could go way beyond 144 fps on my hardware, so that's what I was asking about...
DreamedArtist: If you have a 144Hz monitor and you use V-Sync, does it go to 60 or 144?

Gonna get one of those 144Hz 4K monitors when they drop, but I always use V-Sync in my games cause I hate screen tearing.
Without G-Sync, it will always run at 144Hz. If you have G-Sync and enable V-Sync from the NVIDIA Control Panel (and disable it in-game), the monitor and the GPU will stay in sync up to 144Hz (I don't remember the lower limit of G-Sync, maybe it's 40Hz)...
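In case it helps to picture it, here's a purely conceptual Python sketch of that behaviour: the panel's refresh follows the frame rate up to its maximum, and below the G-Sync floor the driver repeats frames to stay in range. The 30Hz floor and the frame-repeating are illustrative assumptions, not the specs of any particular monitor:

# Conceptual sketch only: how an adaptive-sync panel's effective refresh tracks
# the frame rate. The 30 Hz floor and frame-repeating behaviour are assumptions.
PANEL_MAX_HZ = 144
GSYNC_FLOOR_HZ = 30  # assumed floor; real panels vary

def effective_refresh(fps):
    if fps >= PANEL_MAX_HZ:
        return PANEL_MAX_HZ      # V-Sync caps at the panel maximum
    if fps >= GSYNC_FLOOR_HZ:
        return fps               # panel refresh follows the frame rate
    multiplier = 2               # below the floor, frames get repeated
    while fps * multiplier < GSYNC_FLOOR_HZ:
        multiplier += 1
    return fps * multiplier

for fps in (160, 144, 90, 45, 24):
    print("%3d fps -> panel refresh ~%d Hz" % (fps, effective_refresh(fps)))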

Ultra HD monitors are worth shit right now, imo, because they simply have too many pixels to push, even for the most powerful GPUs out there.
Post edited May 07, 2018 by KingofGnG
Darvond: I'm in no hurry for either 4K or 144hz, considering a large number of companies are struggling to get stable 1080/60.

That, and I feel diminishing returns are in full effect.
Well, I can see 4K being a thing instead of 144Hz. I've got a 1080 Ti and use a 4K monitor, and I can almost max out any new game at 60fps on max settings with it. The Witcher 3 no problem, even Crysis 3, Overwatch and so forth.

I think the 11 series from NVIDIA will make the mid range capable of light 4K gaming and the high end capable of true 4K gaming, like the 1080 Ti does now.
Darvond: I'm in no hurry for either 4K or 144hz, considering a large number of companies are struggling to get stable 1080/60.

That, and I feel diminishing returns are in full effect.
DreamedArtist: Well, I can see 4K being a thing instead of 144Hz. I've got a 1080 Ti and use a 4K monitor, and I can almost max out any new game at 60fps on max settings with it. The Witcher 3 no problem, even Crysis 3, Overwatch and so forth.

I think the 11 series from NVIDIA will make the mid range capable of light 4K gaming and the high end capable of true 4K gaming, like the 1080 Ti does now.
4K for computers makes a certain amount of sense, or at least it will make some sense when the hardware gets powerful enough to properly drive it. But 144Hz monitors don't make much sense unless you want to do some strange 3D effects with shutter glasses.

Considering that movie theaters screen things at much lower frame rates, and that 50-70fps has been sufficient even for twitch shooters going back decades, I fail to see how much benefit you could really get from doubling the frame rate, especially when it comes at the cost of reducing other graphical settings to hit the refresh rate.
hedwards: But 144Hz monitors don't make much sense unless you want to do some strange 3D effects with shutter glasses.
You've never seen one in person, have you? A high-refresh monitor is a fantastic piece of hardware, firstly for the overall computing experience and only lastly for gaming. Now that I have a 144Hz monitor, when I have to go back to a 60Hz one my eyes scream in agony - and that's before any gaming even starts...
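The raw numbers are simple enough to check: at 144Hz each frame is on screen for less than half as long as at 60Hz, which is part of why even the desktop feels different. A quick back-of-the-envelope calculation:

# Back-of-the-envelope frame times for the refresh rates being discussed.
for hz in (60, 100, 144):
    print("%3d Hz -> %.1f ms per frame" % (hz, 1000.0 / hz))
# 60 Hz -> 16.7 ms, 100 Hz -> 10.0 ms, 144 Hz -> 6.9 ms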
Post edited May 08, 2018 by KingofGnG
It would be nice if you told us what temps you get for the CPU, GPU and HDD. Personally, I don't feel comfortable running the CPU and GPU at temps above 80°C (the fan noise doesn't help either), even though I know it's perfectly safe up to 90°C. In this case I'm more concerned about the other components than the CPU and GPU: capacitors, fans, cables, connections and other things failing or melting. Ideally, the HDD temps should stay under 40°C; above 45°C I would start to worry.
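If you want hard numbers rather than a gut feeling, even just logging the GPU temperature during a session is enough to see how it behaves. A rough Python sketch of the idea (it assumes nvidia-smi is on your PATH; for CPU and HDD temps a tool like HWiNFO is easier than scripting):

# Rough sketch: log the GPU temperature once a minute to a CSV so you can see
# how it behaves over a gaming session. Assumes nvidia-smi is on PATH.
import subprocess
import time

def gpu_temp_c():
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=temperature.gpu", "--format=csv,noheader"],
        text=True,
    )
    return int(out.strip())

with open("gpu_temps.csv", "a") as log:
    while True:
        log.write("%d,%d\n" % (time.time(), gpu_temp_c()))
        log.flush()
        time.sleep(60)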

The first and easiest thing to do is to use a frame rate limiter, like RivaTuner Statistics Server (which is included in MSI Afterburner) or NVIDIA Inspector, and set it to 144 FPS. For more GPU-demanding games you can set it lower, like 120, 100 or whatever you like.
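For what it's worth, the idea behind a frame limiter is simple: render a frame, then sleep out whatever is left of the frame budget. This is just a conceptual Python sketch of that idea, not how RTSS or NVIDIA Inspector actually implement it:

# Conceptual sketch of a frame-rate cap: sleep out the remainder of each frame
# so the loop never runs faster than the target. Not how RTSS is implemented.
import time

TARGET_FPS = 144
FRAME_TIME = 1.0 / TARGET_FPS

def run_capped(render_one_frame, frames=1000):
    next_deadline = time.perf_counter()
    for _ in range(frames):
        render_one_frame()
        next_deadline += FRAME_TIME
        sleep_for = next_deadline - time.perf_counter()
        if sleep_for > 0:
            time.sleep(sleep_for)
        else:
            next_deadline = time.perf_counter()  # behind schedule; don't try to catch up

run_capped(lambda: None)  # example: cap a dummy workload at 144 fps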

Another thing is to use a program to control the fans, like NoteBook FanControl. I'm using it and am very happy with it; it makes the fans quieter during low and normal load. There might be other similar programs, but the only other one I know is SpeedFan, which is very good, though I'm not sure how well it works with laptops.

If you have the knowledge to do some undervolting, then this is definitely helpful. I managed to do it for the CPU, resulting in lower temps (about 4-6°C for the CPU and 2-3°C for the GPU) while gaining a bit of extra performance. I didn't try or research doing it for the GPU too.

In a few years it's probably worth changing the thermal paste. It's usually said that thermal paste starts losing some of its efficiency after 4 years. I recommend Cooler Master MasterGel Maker Nano or Noctua NT-H1, which usually perform the best among the more common, not-so-pricey thermal pastes.

Some links on the subject for anyone interested.
hedwards: But 144Hz monitors don't make much sense unless you want to do some strange 3D effects with shutter glasses.
KingofGnG: You've never seen one in person, have you? A high-refresh monitor is a fantastic piece of hardware, firstly for the overall computing experience and only lastly for gaming. Now that I have a 144Hz monitor, when I have to go back to a 60Hz one my eyes scream in agony - and that's before any gaming even starts...
I disagree with that. For certain types of twitch games with certain color schemes you might notice a difference, but in general you won't. It's the response time that really matters. These days, the games I'm playing just don't refresh quickly enough to even ghost on my monitors. Spending money on a 144Hz monitor would be a waste, sort of like how spending money on a 4K monitor is a waste for watching TV, regardless of whether the content is in 4K or not.

Back when people were using CRT monitors, where the pixels would start to fade the moment the cathode rays stopped hitting them, it was a much bigger deal.

These days though, I just don't see noticeable ghosting in most games.

Personally, I prefer to focus on the actual game and not obsess over the small amount of ghosting that appears on older monitors.
If your laptop has the ports that will drive a 144Hz monitor, then it is specced to do that.

Now, if you're asking about statistical trends in manufacturing quality versus observed failure rates for this particular kind of use, I can't tell you anything concrete.

I don't believe there would be any propensity for hardware stress. High refresh-rate displays are expensive to make, which is more the reason you wouldn't find one on a laptop. The silicon in that laptop is likely the same as in the desktop cards, just packaged differently, and the rest of the components will have been designed to operate at what the port has been specced for.

That goes for both ends: the ports on both your monitor and your laptop need to be specced for something that will do 144Hz.

Assuming everything is in spec, it will be fine. You're not going to be stressing anything any more than the speeds your CPU, your GPU, and the transistors and voltage regulators that run them are already operating at. Far more dangerous are temps getting out of control.
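If anyone wants to sanity-check the port side of that, rough bandwidth arithmetic is enough. This little Python sketch is only an approximation: the ~20% blanking overhead and the link-rate figures in the comments vary with timings and cable, so treat them as ballpark numbers:

# Rough bandwidth check for the "ports must be specced for it" point above.
# The 20% blanking overhead and the link rates are approximations.
def required_gbps(width, height, hz, bits_per_pixel=24, blanking_overhead=0.20):
    pixels_per_second = width * height * hz * (1 + blanking_overhead)
    return pixels_per_second * bits_per_pixel / 1e9

print("1080p @ 144 Hz: ~%.1f Gbps" % required_gbps(1920, 1080, 144))  # ~8.6 Gbps
print("2160p @  60 Hz: ~%.1f Gbps" % required_gbps(3840, 2160, 60))   # ~14.3 Gbps

# Approximate usable data rates for comparison:
#   HDMI 1.4        ~  8.2 Gbps  (1080p144 is marginal)
#   HDMI 2.0        ~ 14.4 Gbps
#   DisplayPort 1.2 ~ 17.3 Gbps  (handles 1080p144 comfortably)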