So, this is not exactly a serious issue (so far, at least), but it's been in the back of my mind since I noticed it a couple of days ago, and I thought I might as well ask for the gognards' opinion on the matter. Thing is, some of my games apparently refuse to run with the dedicated graphics card despite my firmly and unequivocally commanding them to do so (or maybe they're just following the evil integrated graphics card's orders against their will). The LED that indicates which card is being used does show the orange colour of righteousness briefly when the game is run, but it quickly changes back to the blue colour of doom afterwards and stays like that.

Which games, you ask? Three of them, specifically, that I've noticed: The Lord of the Rings: The Battle for Middle-Earth, and GOG's own Baldur's Gate: The Original Saga and Sacred. And what do those three games have in common, you're probably wondering? They all run in 4:3 rather than my lappie's native 16:9, meaning that scaling is in order. Seems like scaling is done exclusively by the integrated card on my lappie, as there are no config options for it in the dedicated card's config menu. So I'm guessing the game runs with the dedicated card as instructed, but as soon as the integrated card notices there's some scaling to be done, it takes over and doesn't let go.

Again, not really an issue for me, since all of those games seem to run just fine with the integrated card, but I'm wondering if there's any way around it. Bear in mind that, due to the misadventures I chronicled in this thread (which you can also check for my specs, since I'm too lazy to copy and paste them here), I'm extremely reluctant to mess with scaling options in my setup, lest it bork again. I'd still like to hear your thoughts, since this seems to be a rather common issue, if one is to trust Google and the apparent lack of solutions to it. So yeah...
Disable the integrated card
in BIOS at Integrated Peripherals or
in Control Panel -> Hardware/Device Manager -> Display Adapters -> right click and disable or
change settings in NVIDIA / AMD control panel -> Manage 3D settings -> Power management mode -> Prefer maximum performance
[I can't seem to post a reply for some reason. If there are multiple replies written by me, it's the GOG forum's fault.]

This is a problem I've had for a while, and one that made me realise that the opinion I held a decade ago (there is no such thing as a laptop that can be used for gaming) is correct. It's not a problem for new games, but most things released before 2004 will have this problem. For desktops you can disable the integrated graphics card. For laptops there is no way around it (performance settings are outright ignored in some cases). As a result, Dungeon Keeper 2 runs better on my very old 1.6 GHz desktop with a Radeon 2400 Pro than it does on my current laptop with an i7 and a Radeon R7 M260.
avatar
bela555: Disable the integrated card
in BIOS at Integrated Peripherals or
Will not work for laptops.

in Control Panel -> Hardware/Device Manager -> Display Adapters -> right click and disable or
change settings in NVIDIA / AMD control panel -> Manage 3D settings -> Power management mode -> Prefer maximum performance
Will usually not work for older games.
Post edited August 31, 2016 by Paradoks
avatar
Paradoks: [I can't seem to post a reply for some reason. If there are multiple replies written by me, it's the GOG forum's fault.]

This is a problem I've had for a while, and one that made me realise that the opinion I held a decade ago (there is no such thing as a laptop that can be used for gaming) is correct. It's not a problem for new games, but most things released before 2004 will have this problem. For desktops you can disable the integrated graphics card. For laptops there is no way around it (performance settings are outright ignored in some cases). As a result, Dungeon Keeper 2 runs better on my very old 1.6 GHz desktop with a Radeon 2400 Pro than it does on my current laptop with an i7 and a Radeon R7 M260.
avatar
bela555: Disable the integrated card
in BIOS at Integrated Peripherals or
avatar
Paradoks: Will not work for laptops.

in Control Panel -> Hardware/Device Manager -> Display Adapters -> right click and disable or
change settings in NVIDIA / AMD control panel -> Manage 3D settings -> Power management mode -> Prefer maximum performance
avatar
Paradoks: Will usually not work for older games.
Hmm, maybe if you decrease the RAM allocation used for integrated graphics through the BIOS it will force the system to use the dedicated one, but I'm not sure.
Post edited August 31, 2016 by bela555
Interesting... Well, I guess the statement that a laptop's no good for gaming would be a bold one (I've been doing mostly fine for two years with mine, and okayish with my previous one for several more years). The idea that you can't use your laptop's dedicated graphics card at all for certain games would definitely be a pain in the ass, though, if proven right. I'm curious for someone else to confirm or debunk that idea. I guess there's gotta be a better way to get around this than nuking the integrated card in the BIOS or disabling it in the OS (which would be useful if proven to work, but definitely overkill). I'd try and test around myself but, again, I'm scared :-P.
avatar
Chandoraa: Interesting... Well, I guess the statement that a laptop's no good for gaming would be a bold one (I've been doing mostly fine for two years with mine, and okayish with my previous one for several more years). The idea that you can't use your laptop's dedicated graphics card at all for certain games would definitely be a pain in the ass, though, if proven right. I'm curious for someone else to confirm or debunk that idea. I guess there's gotta be a better way to get around this than nuking the integrated card in the BIOS or disabling it in the OS (which would be useful if proven to work, but definitely overkill). I'd try and test around myself but, again, I'm scared :-P.
Maybe this works, I dunno... I don't have a laptop to test it on, sadly:
http://www.pcadvisor.co.uk/how-to/pc-components/how-set-default-graphics-card-3612668/
avatar
Chandoraa: Interesting... Well, I guess the statement that a laptop's no good for gaming would be a bold one (I've been doing mostly fine for two years with mine, and okayish with my previous one for several more years). The idea that you can't use your laptop's dedicated graphics card at all for certain games would definitely be a pain in the ass, though, if proven right. I'm curious for someone else to confirm or debunk that idea. I guess there's gotta be a better way to get around this than nuking the integrated card in the BIOS or disabling it in the OS (which would be useful if proven to work, but definitely overkill). I'd try and test around myself but, again, I'm scared :-P.
As I said, laptops may be fine (although really overpriced) for new games, but for old ones...

Let me explain why disabling the integrated card in the BIOS will not work for laptops. Dedicated graphics cards in laptops with switchable graphics are not "real" graphics cards. They are not connected to the display and don't have any video output ports. They are, however, connected to the integrated card, and that card handles the display. You could say that whenever the dedicated card is used, its output is streamed through the integrated chip. So, when you disable the integrated chip, you are also disabling the dedicated card. That is obviously not the case for desktops.
But that means that if the game refuses to recognise the dedicated chip despite being explicitly told to use it, you are out of luck unless the game's creator does something about it.
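For what it's worth, the usual "something" a developer can do is export a couple of variables from the game's executable that the switchable-graphics drivers look for. A minimal sketch (the symbol names and values are the ones NVIDIA and AMD publish for Optimus / PowerXpress; this has to be compiled into the game, so it's nothing a player can add after the fact):

```cpp
// Hint the switchable-graphics drivers to pick the dedicated GPU for this .exe.
// These exports only help if the game's developer builds them into the game itself.
extern "C" {
    __declspec(dllexport) unsigned long NvOptimusEnablement = 0x00000001;   // NVIDIA Optimus
    __declspec(dllexport) int AmdPowerXpressRequestHighPerformance = 1;     // AMD switchable graphics
}
```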
Post edited September 03, 2016 by Paradoks
Your graphics driver should have a setting for which card to use; it's usually under 3D settings or similar.
avatar
Paradoks: As I said, laptops may be fine (although really overpriced) for new games, but for old ones...
There are some laptops though which don't have an "integrated" (ie. Intel HD) graphics chip at all. For instance my ASUS G75VW has only the NVidia Geforce GTX 670M graphics, so it is used for everything.

However, my experience with using the Intel HD 4000 for old games has been quite good (on another laptop which has only that). I would say it has better compatibility with old games than the Geforce GTX 670M, or at least that is what it feels like.

Off the top of my head, I can't recall any older game that would work better on the 670M than the Intel HD 4000, but e.g. Gorky 17 and Blood Omen: Legacy of Kain work better on the Intel HD laptop. Gorky 17 has to be run pretty much in SW rendering mode on the Geforce due to graphics problems with HW rendering, while on the Intel HD it works pretty much flawlessly with HW rendering. In Blood Omen, it doesn't display the signposts and other pop-ups with Geforce for some reason, while on the Intel HD it does.
avatar
timppu: There are some laptops though which don't have an "integrated" (ie. Intel HD) graphics chip at all. For instance my ASUS G75VW has only the NVidia Geforce GTX 670M graphics, so it is used for everything.
I don't think they make laptops like that any more. They made some years ago, but everything I've seen in the last 3 years has an integrated chip (including so-called gaming laptops). For example, the ASUS ROG GL752VW (which I assume is a newer version of what you have) has an integrated Intel HD 530 alongside a GeForce GTX 960M.
avatar
timppu: However, my experience with using the Intel HD 4000 for old games has been quite good (on another laptop which has only that). I would say it has better compatibility with old games than the Geforce GTX 670M, or at least that is what it feels like.

Off the top of my head, I can't recall any older game that would work better on the 670M than the Intel HD 4000, but e.g. Gorky 17 and Blood Omen: Legacy of Kain work better on the Intel HD laptop. Gorky 17 has to be run pretty much in SW rendering mode on the Geforce due to graphics problems with HW rendering, while on the Intel HD it works pretty much flawlessly with HW rendering. In Blood Omen, it doesn't display the signposts and other pop-ups with Geforce for some reason, while on the Intel HD it does.
I can't speak of Nvidia, because the GeForce 4 MX that I had effectively "cured" me of anything Nvidia-related for at least 25 years. However, I don't recall having any compatibility issues with ATI/AMD, at least for the games that I've played. Intel, however, was giving me some occasional problems (like in No One Lives Forever), but that was many years ago; maybe it got better. What I can say, however, is that Intel's performance is game-killing in most cases. Even Settlers 4 slows down if I set object rendering to hardware. Other games, like Rome: Total War, are outright unplayable (interestingly, the slightly newer Medieval:TW 2 recognises the dedicated card without problems).

And you have a PC copy of Blood Omen. Lucky you.
avatar
timppu: There are some laptops though which don't have an "integrated" (ie. Intel HD) graphics chip at all. For instance my ASUS G75VW has only the NVidia Geforce GTX 670M graphics, so it is used for everything.
avatar
Paradoks: I don't think they make laptops like that any more. They made ones years ago, but everything I've seen in the last 3 years has integrated chips (including so called gaming laptops). For example ASUS ROG GL752VW (which I assume is a newer version of what you have) has an integrated Intel HD 530 alongside GF GTX 960M.
I am unsure what the "GL"-series is, but at least I don't see any Intel HD GPU mentioned in the specs of this (G-series, not GL-series):

https://www.asus.com/ROG-Republic-Of-Gamers/ROG-G752VY/specifications/

The only GPU mentioned for it is the NVidia Geforce GTX 980M. However, I agree that most laptops have both Intel HD and some dedicated GPU, or the cheapest laptops have only Intel HD Graphics (like my HP laptop). I'm not sure about the AMD chipsets; my only laptops with AMD/ATI graphics are some very old IBM or Lenovo ThinkPads.

avatar
Paradoks: I can't speak of Nvidia, because the GeForce 4 MX that I had effectively "cured" me of anything Nvidia-related for at least 25 years. However, I don't recall having any compatibility issues with ATI/AMD, at least for the games that I've played. Intel, however, was giving me some occasional problems (like in No One Lives Forever), but that was many years ago; maybe it got better.
It actually did get better a year or two ago, with the new Intel HD drivers. Before them I had compatibility issues with quite a few games (e.g. in the GOG version of Empire Earth there were some extra polygons constantly flickering on the screen), but the new drivers fixed that one, and many others, completely.

With NVidia my experience is usually the opposite: newer drivers can break backwards compatibility with some older games. In this case, at least, the new Intel HD 4000 drivers really improved backwards compatibility with old games.

avatar
Paradoks: What I can say, however, is that Intel's performance is game-killing in most cases. Even Settlers 4 slows down if I set object rendering to hardware. Other games, like Rome: Total War, are outright unplayable (interestingly, the slightly newer Medieval:TW 2 recognises the dedicated card without problems).
Yes, but we were talking about the compatibility with older games, where I don't see much of a performance issue. With newer games, the optional NVidia GPU is more likely to kick in and take care of the performance needs.

I've heard newer Intel HD graphics GPUs are actually quite capable performance-wise as well, but I have no personal experience with them, and I presume dedicated NVidia and AMD mobile gaming GPUs are still faster. The Intel HD 4000 I have in one laptop is quite an old GPU, so I wouldn't try to run The Witcher 3, or even 2, with it. The first Witcher game... maybe; maybe I have even done that, I don't recall for sure.
Post edited September 04, 2016 by timppu
avatar
Paradoks: As I said, laptops may be fine (although really overpriced) for new games, but for old ones...
avatar
timppu: There are some laptops though which don't have an "integrated" (ie. Intel HD) graphics chip at all. For instance my ASUS G75VW has only the NVidia Geforce GTX 670M graphics, so it is used for everything.
Frankly, I highly doubt Intel makes any special i3-i7 chips like that. It may simply be disabled in the PC's BIOS. By the way, I happen to have that very same model as my Windows PC. Groovy.
avatar
bela555: Maybe this works, I dunno... I don't have a laptop to test it on, sadly:
http://www.pcadvisor.co.uk/how-to/pc-components/how-set-default-graphics-card-3612668/
Sure, that's how you'd normally do it. The game (or program, or whatever) just picks a card by default, but you can override it like that, except for the games I mentioned. It often goes something like this:

Integrated card: Everybody be cool, I got this.
Me: *maxes out the in-game graphics options, is appalled at the crappy framerate* Yeah, like hell you got this! *sets it to use the dedicated card instead*
Dedicated card: Come get some!
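For illustration, here's a minimal sketch (my own, not taken from any of these games) that lists the adapters Windows exposes through DXGI. Older titles use DirectDraw or early Direct3D rather than DXGI, but the idea is the same: unless told otherwise, they grab the primary device, which on a switchable-graphics laptop is usually the integrated chip wired to the panel.

```cpp
// List every graphics adapter Windows exposes through DXGI.
// On an Optimus/switchable laptop this should print both the integrated
// and the dedicated GPU; adapter 0 is normally the one driving the display.
#include <dxgi.h>
#include <cwchar>
#pragma comment(lib, "dxgi.lib")

int main() {
    IDXGIFactory* factory = nullptr;
    if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory)))
        return 1;

    IDXGIAdapter* adapter = nullptr;
    for (UINT i = 0; factory->EnumAdapters(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC desc;
        adapter->GetDesc(&desc);
        wprintf(L"Adapter %u: %ls\n", i, desc.Description);  // e.g. "Intel(R) HD Graphics", "GeForce ..."
        adapter->Release();
    }
    factory->Release();
    return 0;
}
```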
avatar
Paradoks: Let me explain why disabling the integrated card in the BIOS will not work for laptops. Dedicated graphics cards in laptops with switchable graphics are not "real" graphics cards. They are not connected to the display and don't have any video output ports. They are, however, connected to the integrated card, and that card handles the display. You could say that whenever the dedicated card is used, its output is streamed through the integrated chip. So, when you disable the integrated chip, you are also disabling the dedicated card. That is obviously not the case for desktops.
But that means that if the game refuses to recognise the dedicated chip despite being explicitly told to use it, you are out of luck unless the game's creator does something about it.
I sure am learning a lot in this thread. Shame we haven't found a "solution" (there doesn't seem to be one), but glad the issue wasn't all that pressing to begin with, since, in these cases, my integrated card can take the heat.
avatar
timppu: Off the top of my head, I can't recall any older game that would work better on the 670M than the Intel HD 4000, but e.g. Gorky 17 and Blood Omen: Legacy of Kain work better on the Intel HD laptop. Gorky 17 has to be run pretty much in SW rendering mode on the Geforce due to graphics problems with HW rendering, while on the Intel HD it works pretty much flawlessly with HW rendering. In Blood Omen, it doesn't display the signposts and other pop-ups with Geforce for some reason, while on the Intel HD it does.
Come to think of it, I have seen that kind of behaviour when my ten-year-old XP-toting desktop does a lousy job of running some late-90s-to-early-00s games, even with the latest nVidia drivers. Some have to be run entirely in software mode in order not to look like an acid trip. A tough nut to crack indeed for those games we feel like playing that are unlikely to ever come to GOG :-(. Not that I have an Intel to try on that one.
If these games use DOSBox (at least BG does), make sure to let DOSBox be run with your NVidia card. I don't have it here because I am out of town for another week, but I will check when I get home. Other 4:3 games give me the so-loved orange light on my GS70 ;)
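Which GPU DOSBox itself uses is still picked in the NVIDIA/AMD control panel's per-program settings, but for the scaling side of it, here is a rough sketch of the dosbox.conf entries involved (the resolution is just an example for a 16:9 panel; adjust it to your own):

```ini
[sdl]
fullscreen=true
# A fixed fullscreen resolution at the panel's native size keeps the GPU,
# rather than the laptop's display scaler, in charge of stretching the image.
fullresolution=1366x768
output=opengl

[render]
# Keep the original 4:3 aspect ratio instead of stretching to 16:9.
aspect=true
```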