dukenuke88: I'm not considering my 580 a waste...because I am still going to be playing modern games maxed out...it's just that IF gog throws out a big promo soon, I might just end up going old school...but of course I don't want to just waste electricity like that...so that's why I asked

Okay so it looks like my best bet is to disable SLI in nvidia CP and leave everything as is
SheBear: Do you have numbers on how much electricity you actually save by taking out the one card? Because it seems to me that the one card isn't going to make a huge difference in the total power consumption of your building. But that just seems logical to me, and so could be totally wrong.
If the card is slotted but not used, it will draw the small amount of power needed for it to sit there doing nothing but being ready to respond to commands. The idle power for the GTX 580 is not specified, but has been measured at approximately 30 watts.

So taking it out will save you about 30 watts, or about 1 kilowatt-hour for every 33 hours of operation. The actual savings will be somewhat more than that, because your power supply is not 100% efficient.

The wear and tear on the card from being repeatedly slotted and removed is considerably greater than the cost of electricity for leaving it in. Unless the OP leaves his computer powered up 24/7 and lives in a place with atrociously high electric rates (say, Denmark), his decision to leave it in place and idle is the correct one.
Post edited September 22, 2011 by cjrgreen
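cjrgreen's arithmetic checks out. Here's a quick sanity check, as a sketch assuming the ~30 W measured idle figure quoted above (which is not an official spec):

```python
# Rough check of the savings from removing an idle GTX 580.
# 30 W is the approximate measured idle draw cited above, not an
# official NVIDIA specification.
idle_watts = 30
hours_per_kwh = 1000 / idle_watts  # hours of idling per kWh drawn
print(f"{hours_per_kwh:.1f} hours per kWh")  # prints: 33.3 hours per kWh
```

So roughly one kilowatt-hour for every 33 hours the card sits idle, before accounting for power supply inefficiency.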
thank you cjrgreen,

I don't plan to run it 24/7...it's only on about 6 hours a day, maybe less actually...and during those hours it's gaming for about 4? That's all a rough estimate

I'm just going to go ahead and leave it in its slot...and disable it through nvidia CP. I think constantly removing and inserting it will be a PITA...and like you said, it can possibly increase wear...
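For what it's worth, plugging in those rough numbers (~6 hours a day powered on) and a hypothetical electricity rate of $0.12/kWh (the rate is an assumption; actual rates vary widely by region) suggests the idle card costs only a few dollars a year:

```python
# Yearly cost estimate for an idle second card. The 30 W idle draw
# and 6 hours/day come from the thread; the $0.12/kWh rate is an
# assumed placeholder, not a quoted figure.
idle_watts = 30
hours_per_day = 6
rate_per_kwh = 0.12  # assumed rate in USD
kwh_per_year = idle_watts * hours_per_day * 365 / 1000
cost_per_year = kwh_per_year * rate_per_kwh
print(f"{kwh_per_year:.1f} kWh/year, ${cost_per_year:.2f}/year")
# prints: 65.7 kWh/year, $7.88/year
```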
Ok, that makes sense. Thanks for explaining that.

And at dukenuke88: I do a similar thing with my SLI setup. I usually have only one card running (turned off through the nvidia controls) because my computer is tons quieter with only the one on, actually.
Slump: You can't. Not uber anyway. It still hovers around 25-30 avg on uber with an overclocked 460 in my experience. Granted, depends on your definition of good frame rate I suppose. Also, if you don't want it drawing power you might as well take it out. It'll probably suck some from the motherboard otherwise and like you mentioned, unless you're playing Metro 2033 or something you might as well leave it at one. Save it for Battlefield 3 or something.

P.S. I'm jealous you have this problem and would be willing to keep one in "storage" for you...
lukaszthegreat: it's like a cock thing.

"oh. my penis is too big,
my pants don't fit"


*so envious*
I had that problem. Two words: "parachute pants."
SheBear,

wait...turning off SLI in Nvidia CP turns off the secondary card completely? Including the fan?
The fan is controlled by the card itself, which runs off a profile in the card firmware. It will run only as fast as the card thinks it needs moving air, which is very little (maybe 10% of full speed) at idle. Thus the extra quiet.
Post edited September 22, 2011 by cjrgreen
okay gotcha, thanks cjrgreen