Hello, let's say I go on a GOG spending spree and buy a bunch of old-school games... of course, any modern graphics card should be able to handle these games just fine. I currently have two GTX 580s in SLI.

My question is: if I wanted to save power/electricity, could I go into nvidia CP, disable SLI, then disconnect the power cables from the secondary 580?
You could pretty much do that anyway. Seriously, do you have a multiple monitor setup or something? Only I have to wonder what possible game on the market is going to tax one 580, never mind two. Witcher 2 runs fine maxed out on my 260 and that's about the best looking game I've seen by a margin.

Edit: oh and I'll add that it's probably wise to completely remove the card rather than just the power to it. Could cause you problems otherwise as it's possible for cards to draw power through the motherboard and that might be what it will try and do if not powered directly.

Probably not, but not the kind of thing you'd want to take a chance on.
Post edited September 21, 2011 by Navagon
Navagon: You could pretty much do that anyway. Seriously, do you have a multiple monitor setup or something? Only I have to wonder what possible game on the market is going to tax one 580, never mind two. Witcher 2 runs fine maxed out on my 260 and that's about the best looking game I've seen by a margin.
Red Orchestra 2 chokes on the 260 I gave to my brother, as well as the 590 that I use with my 2560x1440 monitor. Bad Company 2 also doesn't run very nice with half of my 590, and is acceptable but not completely blown away by the 590 in SLI.

EDIT: But RO2 is probably not fully optimized like many games, to be fair. I'd assume Bad Company 2 at least is a reasonable example.
Post edited September 21, 2011 by PhoenixWright
PhoenixWright: Red Orchestra 2 chokes on the 260 I gave to my brother, as well as on the 590 that I use with my 2560x1440 monitor. Bad Company 2 also doesn't run very nicely on half of my 590; it's acceptable with the 590 in SLI, but I'm not completely blown away.
Really? Bad Company 2, that is. Runs fine for me. Looks damn good doing it. Maybe there's a problem with the code where it starts to chug at certain resolutions no matter what. It's primarily a console game, so that might make sense.

Red Orchestra 2 pretty much sucks regardless of the system, from what I've heard.
Navagon: You could pretty much do that anyway. Seriously, do you have a multiple monitor setup or something? Only I have to wonder what possible game on the market is going to tax one 580, never mind two. Witcher 2 runs fine maxed out on my 260 and that's about the best looking game I've seen by a margin.
PhoenixWright: Red Orchestra 2 chokes on the 260 I gave to my brother, as well as on the 590 that I use with my 2560x1440 monitor. Bad Company 2 also doesn't run very nicely on half of my 590; it's acceptable with the 590 in SLI, but I'm not completely blown away.

EDIT: But RO2 is probably not fully optimized like many games, to be fair. I'd assume Bad Company 2 at least is a reasonable example.
So even if I disable SLI in nvidia CP and disconnect power, it's still not advised to leave the card in there?

Also, no, I am not running multiple monitors... just a single 1920x1200... and there aren't many games out there that can bring two 580s to their knees at 1920x1200... but why does it matter? I got them for a smoking deal.

PS, I highly doubt you can max witcher 2 with a single 260
and I can see bad company 2 running slow on a single 590 with 2560x1440 ALL maxed....that resolution is no simple feat even for a 590
Post edited September 21, 2011 by dukenuke88
Navagon: You could pretty much do that anyway. Seriously, do you have a multiple monitor setup or something? Only I have to wonder what possible game on the market is going to tax one 580, never mind two. Witcher 2 runs fine maxed out on my 260 and that's about the best looking game I've seen by a margin.

Edit: oh and I'll add that it's probably wise to completely remove the card rather than just the power to it. Could cause you problems otherwise as it's possible for cards to draw power through the motherboard and that might be what it will try and do if not powered directly.

Probably not, but not the kind of thing you'd want to take a chance on.
Graphics cards have a tendency to complain if they're present but not hooked up to auxiliary power. Depending on the details, having such a card in your system could cause it to interfere with POST or graphics operation. I agree: if you want to not power it, take it out (and put it back in its antistatic bag).

If you leave it in the system, leave it hooked up to auxiliary power, so it has no cause to complain. If you do not use SLI, it will draw only idle power (30W or so, for the 580).
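If you want to verify that an unused card really is idling, a quick way is to ask the driver. A minimal sketch, assuming a reasonably recent nvidia-smi is installed alongside the Nvidia driver; older GeForce cards and drivers may report "[Not Supported]" for the power reading:

# Query each GPU's performance state, reported power draw, and utilization via
# nvidia-smi. Field support varies by card and driver generation.
import subprocess

def gpu_idle_report() -> None:
    result = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=index,name,pstate,power.draw,utilization.gpu",
         "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    )
    for line in result.stdout.strip().splitlines():
        # Example output line: "1, GeForce GTX 580, P12, [Not Supported], 0 %"
        print(line)

if __name__ == "__main__":
    gpu_idle_report()

A card sitting in a low performance state (P8/P12) with 0% utilization is drawing nothing beyond its idle power.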
dukenuke88: PS, I highly doubt you can max witcher 2 with a single 260
You can't. Not on uber anyway. It still hovers around 25-30 fps average on uber with an overclocked 460, in my experience. Granted, it depends on your definition of a good frame rate, I suppose. Also, if you don't want it drawing power you might as well take it out. It'll probably suck some from the motherboard otherwise, and like you mentioned, unless you're playing Metro 2033 or something, you might as well leave it at one. Save it for Battlefield 3 or something.

P.S. I'm jealous you have this problem and would be willing to keep one in "storage" for you...
Post edited September 21, 2011 by Slump
dukenuke88: PS, I highly doubt you can max witcher 2 with a single 260
and I can see bad company 2 running slow on a single 590 with 2560x1440 ALL maxed....that resolution is no simple feat even for a 590
Yeah, the performance I'm getting seems aligned with all of the benchmarks I looked at for the resolution I'm using.
dukenuke88: I highly doubt you can max witcher 2 with a single 260
It depends on your definition of maxed, I suppose. Naturally I don't have DX11 on a 260, and my monitor tops out at 1680x1050, which is what I run the game at. So that's less demanding too.
PhoenixWright: Yeah, the performance I'm getting seems aligned with all of the benchmarks I looked at for the resolution I'm using.
Dropping the resolution down to something a bit less demanding might make all the difference. That's not the kind of game you want a crappy framerate on.
Post edited September 21, 2011 by Navagon
dukenuke88: PS, I highly doubt you can max witcher 2 with a single 260
Slump: You can't. Not on uber anyway. It still hovers around 25-30 fps average on uber with an overclocked 460, in my experience. Granted, it depends on your definition of a good frame rate, I suppose. Also, if you don't want it drawing power you might as well take it out. It'll probably suck some from the motherboard otherwise, and like you mentioned, unless you're playing Metro 2033 or something, you might as well leave it at one. Save it for Battlefield 3 or something.

P.S. I'm jealous you have this problem and would be willing to keep one in "storage" for you...
It's like a cock thing.

"oh. my penis is too big,
my pants don't fit"


*so envious*
Slump: Also, if you don't want it drawing power you might as well take it out. It'll probably suck some from the motherboard otherwise...
PCI-Express cards can draw a fair amount of power from the motherboard; for many cards the slot is their only power source. The connectors on the card itself are auxiliary. They're needed because the power that can be drawn through an x16 slot is limited to about 75W, so anything a card needs beyond that has to come from those connectors.

If the card's in the system and not used, it will still draw idle power. If it's in the system, not used, and not connected to auxiliary power, it will probably generate errors when its firmware detects that there's no auxiliary power.

So the only two options are connect it fully, or take it out. If you leave it fully connected and not used, it will still draw only idle power.
Post edited September 21, 2011 by cjrgreen
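To put numbers on the budget cjrgreen describes, here's a minimal sketch; the per-connector limits are the standard PCIe figures, and the 580's connector layout and roughly 244W TDP are reference-card values used purely for illustration:

# Rough PCIe power budget: ~75W from an x16 slot, ~75W per 6-pin and ~150W per
# 8-pin auxiliary connector (standard limits; real cards can behave differently).
SLOT_W = 75
SIX_PIN_W = 75
EIGHT_PIN_W = 150

def board_power_budget(six_pins: int, eight_pins: int) -> int:
    """Maximum wattage a card may draw from the slot plus its aux connectors."""
    return SLOT_W + six_pins * SIX_PIN_W + eight_pins * EIGHT_PIN_W

# A reference GTX 580 has one 6-pin and one 8-pin connector and a TDP of about 244W.
tdp = 244
budget = board_power_budget(six_pins=1, eight_pins=1)   # 75 + 75 + 150 = 300W
print(f"With aux power connected: {budget}W available vs. {tdp}W TDP")
print(f"Slot alone: {SLOT_W}W available, so {tdp - SLOT_W}W short without aux power")

That shortfall is why the firmware complains instead of trying to pull the whole load through the slot.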
Navagon: Dropping the resolution down to something a bit less demanding might make all the difference. That's not the kind of game you want a crappy framerate on.
It's fine in SLI; I just wanted to provide a half-590 description for reference. I'd attach a sweet screenshot but the PNGs I capture at that res run about 2.5MB.
cjrgreen: So the only two options are connect it fully, or take it out. If you leave it fully connected and not used, it will still draw only idle power.
And that right there is the question answered.
PhoenixWright: I'd attach a sweet screenshot but the PNGs I capture at that res run about 2.5MB.
Yeah, I bet they do. That's a lot of pixels.
Post edited September 21, 2011 by Navagon
I'm not considering my 580 a waste... because I am still going to be playing modern games maxed out... it's just that IF GOG throws out a big promo soon, I might just end up going old school... but of course I don't want to just waste electricity like that... so that's why I asked.

Okay so it looks like my best bet is to disable SLI in nvidia CP and leave everything as is
dukenuke88: I'm not considering my 580 a waste... because I am still going to be playing modern games maxed out... it's just that IF GOG throws out a big promo soon, I might just end up going old school... but of course I don't want to just waste electricity like that... so that's why I asked.

Okay so it looks like my best bet is to disable SLI in nvidia CP and leave everything as is
Do you have numbers on how much electricity you actually save by taking out the one card? Because it seems to me that the one card isn't going to make a huge difference in the total power consumption of your building. But that just seems logical to me, and so could be totally wrong.
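For a rough answer, here's a back-of-the-envelope sketch. The ~30W idle figure is the one quoted earlier in the thread for a 580; the hours per day and the electricity price are assumptions made up for illustration, so substitute your own:

# Estimate the yearly energy and cost of leaving a second card idling in the system.
IDLE_W = 30           # approximate idle draw of a GTX 580 (figure quoted above)
HOURS_PER_DAY = 6     # assumed time the PC is powered on each day
PRICE_PER_KWH = 0.12  # assumed electricity price in $/kWh; use your local rate

kwh_per_year = IDLE_W / 1000 * HOURS_PER_DAY * 365
cost_per_year = kwh_per_year * PRICE_PER_KWH
print(f"{kwh_per_year:.0f} kWh/year, roughly ${cost_per_year:.2f}/year")

With these assumptions that comes to about 66 kWh and $8 a year: real, but small next to a whole household's consumption, which is the point being made above.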