avatar
ariaspi: It would be nice if you told us what temps you get for the CPU, GPU and HDD. Personally, I don't feel comfortable running the CPU & GPU at temps above 80°C (the fans' noise doesn't help either), even though I know it's perfectly safe up to 90°C. In this case, I'm more concerned about components other than the CPU & GPU: capacitors, fans, cables, connections and other things failing or melting. Ideally, HDD temps should be under 40°C; above 45°C I would start to worry.
You can see some screenshots with temps in my original post. They are perfectly fine, and I don't want to mess with low-level hardware stuff like CPU voltages after I saw how badly laptops react to changes after some time (my previous system, also a laptop, got wrecked after my failed attempt at a CPU upgrade).
avatar
hedwards: I disagree with that. For certain types of twitch games with certain color schemes, you might notice a difference, but in general you won't. It's the response rate that really matters.
Lol, are you sure we are talking about the same stuff? Twitch?!? I don't give a crap about people watching me play, I just want to play without on-line bullshit and that's all :-P

And my 144Hz, I can assure you, has incredibly fast response times. As a matter of fact, I once saw a comparison with a good CRT and my LCD came out ahead in ms. "Ghosting" is a word I no longer know the meaning of.
Post edited May 08, 2018 by KingofGnG
avatar
johnnygoging: if your laptop has the ports that will drive a 144hz monitor then it is specced to do that.

Now, if you're asking for statistical trends for manufacturing quality against observed longer failure rates for particular use I can't tell you anything concrete.
That's the thing I'm more worried about: the more FPS the GPU spits out, the more chance of "forcing" it and shortening its lifespan.
avatar
KingofGnG: You can see some screenshots with temps in my original post. They are perfectly fine, and I don't want to mess with low-level hardware stuff like CPU voltages after I saw how badly laptops react to changes after some time (my previous system, also a laptop, got wrecked after my failed attempt at a CPU upgrade).
Lol, are you sure we are talking about the same stuff? Twitch?!? I don't give a crap about people watching me play, I just want to play without on-line bullshit and that's all :-P

And my 144Hz, I can assure you, has incredibly fast response times. As a matter of fact, I once saw a comparison with a good CRT and my LCD came out ahead in ms. "Ghosting" is a word I no longer know the meaning of.
He means quick-reaction gaming, like high-level FPS play where your twitch reflexes run high. It's where the Twitch site derived its name.
avatar
hedwards: I disagree with that. For certain types of twitch games with certain color schemes, you might notice a difference, but in general you won't. It's the response rate that really matters.
avatar
KingofGnG: Lol, are you sure we are talking about the same stuff? Twitch?!? I don't give a crap about people watching me play, I just want to play without on-line bullshit and that's all :-P

And my 144Hz, I can assure you, has incredibly fast response times. As a matter of fact, I once saw a comparison with a good CRT and my LCD came out ahead in ms. "Ghosting" is a word I no longer know the meaning of.
I hate to say it, but you fell for marketing bullshit. Unless you're playing one of those twitch games, you're just not going to see any meaningful difference. Especially on mostly static screens like when you're cruising the web. That's just not how monitors work.

Also, this is the 21st century, it's not like looking a single word up is very difficult.

My guess is that most of the things you're crediting the refresh rate for have nothing to do with that and everything to do with the fact that these are more expensive monitors with better electronics than what you'd get out of a cheap monitor.

The refresh rate is just the maximum rate at which the screen can update, not necessarily the rate at which it actually updates while you're using the computer. It's not like in the past, where the two were the same.

But even my decade-old monitors don't have any issues with gaming, apart from the odd ghosting when there's a lot of movement and a large difference in colors.
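A quick note on the numbers behind the refresh-rate point: the rate only caps how often the panel can update, one refresh every 1000/Hz milliseconds. A minimal sketch:

```python
# The refresh rate bounds how often the panel *can* update;
# the interval between refreshes is 1000 / Hz milliseconds.
def frame_interval_ms(refresh_hz: float) -> float:
    """Time between two refreshes, in milliseconds."""
    return 1000.0 / refresh_hz

for hz in (60, 120, 144):
    print(f"{hz} Hz -> {frame_interval_ms(hz):.2f} ms per refresh")
```

So going from 60Hz to 144Hz shrinks the refresh interval from about 16.7 ms to about 6.9 ms; whether that's visible outside fast-moving content is exactly what's being argued here.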
Hey!

Just from a tech point of view: most games will start at 144Hz without issues, but might not properly support this refresh rate. In some cases the framerate is linked to some in-game calculations and might - even in recent games - cause bugs. =)
Post edited May 08, 2018 by Virgile
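On the framerate-linked-calculations point: a toy sketch (a hypothetical game loop, not taken from any real game) of the classic bug where movement coded "per frame" speeds up at higher refresh rates, while movement scaled by delta-time does not:

```python
def simulate(fps: int, seconds: float = 1.0, speed_per_second: float = 100.0):
    """Advance two positions for `seconds` of game time at a given fps."""
    dt = 1.0 / fps
    buggy = 0.0    # fixed step each frame, with a 60 fps assumption baked in
    correct = 0.0  # step scaled by dt, so the fps doesn't matter
    for _ in range(int(seconds * fps)):
        buggy += speed_per_second / 60.0
        correct += speed_per_second * dt
    return buggy, correct

print(simulate(60))   # both positions end up at ~100 units
print(simulate(144))  # buggy races to ~240 units, correct stays ~100
```

Games with logic like the "buggy" branch are the ones that misbehave at 144Hz; capping the framerate (as described below for Oni) effectively restores the assumption the code was written with.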
avatar
paladin181: He means quick reaction gaming like high level FPS play where your twitch reflexes run high. It's where the Twitch site derived their name.
Oh, my bad then. First time I've seen the term :-P
avatar
ariaspi: It would be nice if you told us what temps you get for the CPU, GPU and HDD. Personally, I don't feel comfortable running the CPU & GPU at temps above 80°C (the fans' noise doesn't help either), even though I know it's perfectly safe up to 90°C. In this case, I'm more concerned about components other than the CPU & GPU: capacitors, fans, cables, connections and other things failing or melting. Ideally, HDD temps should be under 40°C; above 45°C I would start to worry.
avatar
KingofGnG: You can see some screenshots with temps in my original post. They are perfectly fine, and I don't want to mess with low-level hardware stuff like CPU voltages after I saw how badly laptops react to changes after some time (my previous system, also a laptop, got wrecked after my failed attempt at a CPU upgrade).
If you're referring to the temps displayed in the systray, they don't tell much if we don't know which is which, and besides that, the system is barely loaded at 14.20%. Look at the temps when the system is under full load: run some benchmarks (Unigine Heaven is a good start) and the most demanding games you have.

Apparently, you have Intel XTU running on your system, so if you're familiar with it, you can safely drop the Dynamic CPU Voltage Offset to -50 mV and do some tests.

Final note: the post on your site is from January. Most certainly, now and in the summer the temperature in your room will be higher, which will increase your system's temps.



avatar
hedwards: I disagree with that. For certain types of twitch games with certain color schemes, you might notice a difference, but in general you won't. It's the response rate that really matters. These days, the games that I'm playing just do not refresh quickly enough to even ghost on my monitors. Spending money on a 144hz monitor would be a waste of money. Sort of like how spending money on a 4k monitor is a waste of money for watching TV. Regardless of whether the content is in 4k or not.
avatar
hedwards: I hate to say it, but you fell for marketing bullshit. Unless you're playing one of those twitch games, you're just not going to see any meaningful difference. Especially on mostly static screens like when you're cruising the web. That's just not how monitors work.
Regarding the 60Hz vs xxxHz discussion: if you've used a high refresh rate monitor and seen no difference, well, then that's just you; not everyone perceives things the same way. If you haven't used one and are saying those things based on your own assumptions, then you are wrong.

I haven't used one either, and I don't plan to until I get a new system with a more powerful GPU, because I know I'd regret having a 100+Hz monitor but not a GPU to push the FPS above 100 in modern games. The high refresh rate is also perceivable in everyday use, not just games.

In case you're wondering what makes me say you are wrong while not using such a monitor myself: first, it's because I've seen enough videos on tech and gaming channels, and related articles on the internet (here are some short, simple ones).

Second, I used to think like you on different computer-related subjects, until I was proved wrong and learned my lesson. And now I know not to argue about a subject that I haven't experienced myself or don't have scientific facts to back up. My 2 cents on the monitors subject, on which I will not continue.
avatar
johnnygoging: if your laptop has the ports that will drive a 144hz monitor then it is specced to do that.

Now, if you're asking for statistical trends for manufacturing quality against observed longer failure rates for particular use I can't tell you anything concrete.
avatar
KingofGnG: That's the thing I'm more worried about: the more FPS the GPU spits out, the more chance of "forcing" it and shortening its lifespan.
if anything was gonna break because of your 144hz monitor I think it would be the gpu, the cpu, and their supporting electronics before anything else. and I think what would break it would be the temperatures and energy being consumed by the computation process first and foremost. it isn't some situation where the cpu, gpu, vrms and associated controllers and chokes and so on are ultimate-tier and everything else is utter shit. they cut it as close as they can everywhere. in general though, it all works like it should most of the time.

I see that this laptop has a miniDP out. that makes it at least displayport 1.2, and might very well be higher than that. the hdmi port is probably 2.0, but no guarantees. displayport was designed as a new standard for high-capability displays and can do 144hz at 1440p.

shoot an email at asus and ask them what the ports are specced for. if they're anything higher than HDMI 2.0 and DisplayPort 1.3 I really think it's unlikely your outputs are going to fail. if they do, it'll be from a defect more than anything. if you're so worried that this halo brand ROG thing can't actually work, why did you buy it? if this monitor is 1080p, then you're not even bleeding edge on the older spec.
Post edited May 08, 2018 by johnnygoging
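On the port question, a back-of-the-envelope bandwidth check. This ignores blanking overhead, so the real requirement is somewhat higher, and the link-rate constants are the usual post-8b/10b data rates for DisplayPort 1.2 (HBR2) and HDMI 2.0:

```python
# Uncompressed video bandwidth needed, ignoring blanking intervals.
def video_gbps(width: int, height: int, refresh_hz: int,
               bits_per_pixel: int = 24) -> float:
    """Raw pixel-data rate in Gbit/s for a given mode."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

DP12_DATA_GBPS = 17.28   # DisplayPort 1.2 HBR2, after 8b/10b encoding
HDMI20_DATA_GBPS = 14.4  # HDMI 2.0, after 8b/10b encoding

need = video_gbps(2560, 1440, 144)
print(f"1440p @ 144Hz needs ~{need:.1f} Gbit/s; DP 1.2 offers {DP12_DATA_GBPS}")
print(f"1080p @ 144Hz needs ~{video_gbps(1920, 1080, 144):.1f} Gbit/s")
```

So 1440p at 144Hz fits within DisplayPort 1.2, and a 1080p/144Hz monitor (the "older spec" case above) is comfortably within both links.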
avatar
Virgile: Hey!

Just from a tech point of view: most games will start at 144Hz without issues, but might not properly support this refresh rate. In some cases the framerate is linked to some in-game calculations and might - even in recent games - cause bugs. =)
Yeah, that can happen. In my experience so far with Windows 10, retro gaming and 144Hz screens, only one game (Oni) forced me to set a lower refresh rate (60-85Hz) to avoid laggy and/or unresponsive controls. Other games (Diablo) are capped internally at very low frame rates, while others (Portal, Portal 2) let you go wild with the fps (I got way over 200 in Portal) but show annoying tearing. Luckily I've got a G-Sync panel and I can use V-Sync without its usual downsides :-P
avatar
hedwards: I disagree with that. For certain types of twitch games with certain color schemes, you might notice a difference, but in general you won't. It's the response rate that really matters.
avatar
KingofGnG: Lol, are you sure we are talking about the same stuff? Twitch?!? I don't give a crap about people watching me play, I just want to play without on-line bullshit and that's all :-P

And my 144Hz, I can assure you, has incredibly fast response times. As a matter of fact, I once saw a comparison with a good CRT and my LCD came out ahead in ms. "Ghosting" is a word I no longer know the meaning of.
twitch, as in games needing fast reactions and responses (a twitch is a quick, short, sudden, jerking movement), not the streaming platform.

twitch gaming is a term
avatar
ariaspi: Regarding the 60Hz vs xxxHz discussion: if you've used a high refresh rate monitor and seen no difference, well, then that's just you; not everyone perceives things the same way. If you haven't used one and are saying those things based on your own assumptions, then you are wrong.

I haven't used one either, and I don't plan to until I get a new system with a more powerful GPU, because I know I'd regret having a 100+Hz monitor but not a GPU to push the FPS above 100 in modern games. The high refresh rate is also perceivable in everyday use, not just games.

In case you're wondering what makes me say you are wrong while not using such a monitor myself: first, it's because I've seen enough videos on tech and gaming channels, and related articles on the internet (here are some short, simple ones).

Second, I used to think like you on different computer-related subjects, until I was proved wrong and learned my lesson. And now I know not to argue about a subject that I haven't experienced myself or don't have scientific facts to back up. My 2 cents on the monitors subject, on which I will not continue.
You're entitled to your opinion. But I hear this kind of BS all the time about technology X being such a massive improvement over the previous one. And yes, sometimes it is true: DVD was a massive improvement over VHS. 4K is generally a wasteful gimmick, and if it didn't come with other things besides extra pixels, I doubt anybody would be able to tell it apart from HD without close examination. And don't get me started on the people who claim to be able to hear the difference between a properly encoded MP3 file and the original source.

In this case, had the claim been restricted to games and possibly video, I wouldn't bother arguing about it. But to extend that to computer work in general is going way too far. Refresh rate for general computing hasn't been an issue in decades. Well, a bit with high-res CRT monitors, but definitely not with modern flat panel monitors.

LCDs and the like don't constantly change pixels the way that the older technology had to. This is why stuck pixels can be an issue in a way that they weren't previously. They're stuck because the shutter mechanism hasn't been moving and got stuck like that.

The difference people see in these monitors has far more to do with the other expensive components than with the refresh rate when it comes to general computing. Certain tasks, especially those that involve scrolling, will see some benefit from increased refresh rates, but claiming that this is somehow game-changing is way overselling it. I'm not sure of the last time I scrolled anything slowly enough to notice or care about that.

EDIT: And BTW, your own links confirm what I've been saying: if you cherry-pick older FPS games and the like and manage to pump the frames to a ridiculous degree, then you see a marginal improvement. I don't think it's unreasonable to note that this isn't a common thing to be doing with such expensive equipment.
Post edited May 10, 2018 by hedwards
avatar
hedwards: You're entitled to your opinion. But, I hear this kind of BS all the time about technology X being such a massive improvement over the previous technology. And yes, sometimes it is true, DVD was a massive improvement over the previous technology. 4K is generally a wasteful gimmick and if it didn't come with other things that weren't just extra pixels, I doubt anybody would be able to tell it apart from HD without close examination.
And this is where I stopped reading: you don't know WHAT THE HECK you are talking about. UltraHD a gimmick? And why not call bullshit on that "useless" Full HD, not that much better compared to SVGA? Or color TV versus the black & white old days? 16-bit graphics are the best graphics, etc. etc. etc.

C'mon, I hate marketing bullshit as much as anyone, but *some* technology improvements are just that: improvements, plain and simple. And a high refresh monitor is one of the best technology upgrades I've experienced in ages. And not just for older games: I can run the new DOOM at over 100 fps without losing a single frame, thanks to the 144Hz refresh. The difference, compared to the laptop's 60Hz monitor, is striking. Maybe not night and day, but pretty noticeable - to say the least.
Post edited May 10, 2018 by KingofGnG