Yes, that's what's always worried me about playing classics on my new PC. I thought these GOG versions were completely optimized, but it doesn't seem they've fixed this high CPU usage behavior. This is something that's happened to me with different old games on different computers.
Maybe it's not something to be worried about, I honestly don't know... but I use this PC for work, so I have to be cautious.
Thanks for your time, unholy.

If you (or anybody else) find out something more, I'd love to know. I've always wanted to play Fallout... darn.
I just looked at my task manager after playing some Fallout. My GPU levels seem normal, but it says I'm using 1.35GB of my memory. I have never seen this before, and I don't know if it has anything to do with running Fallout previously. Other than that my GPU usage looks normal.
flickenmaste: I just looked at my task manager after playing some Fallout. My GPU levels seem normal, but it says I'm using 1.35GB of my memory. I have never seen this before, and I don't know if it has anything to do with running Fallout previously. Other than that my GPU usage looks normal.
Hi. I didn't quite get whether you checked your CPU usage AFTER or WHILE playing Fallout. You should check it while you're playing (alt+tab). CPU usage returns to normal once you close the game.
Regarding the memory, I don't remember Fallout affecting it substantially, but I can double-check.
Could you post the number your CPU usage reaches while playing Fallout and while playing some new game? Thanks.
Post edited March 19, 2011 by gabocaba
gabocaba: Hi. I didn't quite get whether you checked your CPU usage AFTER or WHILE playing Fallout. You should check it while you're playing (alt+tab). CPU usage returns to normal once you close the game.
Regarding the memory, I don't remember Fallout affecting it substantially, but I can double-check.
Could you post the number your CPU usage reaches while playing Fallout and while playing some new game? Thanks.
My CPU usage while playing Fallout averaged about 50%.
While running EVE my CPU hit about 16% but mostly stayed around 5%.

My computer runs fine while playing Fallout; I don't know why it is using so much power.
Post edited March 20, 2011 by flickenmaste
gabocaba: Hi. I didn't quite get whether you checked your CPU usage AFTER or WHILE playing Fallout. You should check it while you're playing (alt+tab). CPU usage returns to normal once you close the game.
Regarding the memory, I don't remember Fallout affecting it substantially, but I can double-check.
Could you post the number your CPU usage reaches while playing Fallout and while playing some new game? Thanks.
flickenmaste: My CPU usage while playing Fallout averaged about 50%.
While running EVE my CPU hit about 16% but mostly stayed around 5%.

My computer runs fine while playing Fallout; I don't know why it is using so much power.
Hi, thanks for the info. It seems like everyone is getting these kinds of results...
My experience is that this is common with old games on new systems, but most gamers aren't even aware of it (not everyone alt+tabs to see how the CPU is responding).
This has happened to me with many old games. I think StarCraft had the same problem, but they released a patch.

Is there any official response from GOG about this?
I don't know why old games cause such high CPU usage on newer systems. I don't think it should hurt your computer if you have dual cores or more and you're not running any other high-usage programs.

I've only had Fallout freeze on me once, when I put items from my inventory into a locker.
Yeah, I don't know about hurting your computer... maybe it won't, but maybe it will. I don't think it can be good to have one of your cores running at max speed for hours and hours...
What the hell is going on here!? I leave this question solved and when I come back I see all this nonsense!? Haha, just kidding, just kidding. Thanks for all your input, it's good to know I'm not the only one experiencing this problem. I actually stopped playing Fallout due to an interesting bug that would not let me save my game. Good thing I save my game every 2 or 3 minutes (10+ years of playing would tell any logical Fallout junkie to do the same), damn these Fallout bugs! I restarted the game and it fixed the problem, but I was getting bored anyhow. I'll play it again after a year or so. Perhaps I will immerse myself in the wonderful world of Arcanum?
Post edited April 02, 2011 by mikenike
I had the same issue with Fallout 2.
IIRC (it was a long time ago), that was under Windows 7 x64 on my laptop (Intel T9300 dual core CPU).
It is important to know that most laptops have poor cooling capabilities. On my laptop, the big stress Fallout put on 1 core out of 2 (50% overall CPU load) was enough to cause unnecessary heat and fan noise.

I downloaded a third-party app to try to throttle the Fallout process.
There exist several of these apps (you may have already heard of CPU grab, for instance). After trying a few of them, I settled on a free and open-source app called "Battle Encoder Shirase" (BES), which worked best for me.
I had to tweak the CPU load settings a little to find the lowest CPU share I could give to Fallout while still keeping it running at normal speed.
Once I found the right settings, my dual-core CPU was still used up to 50% (shared between BES and Fallout), but the heating was much lower, because BES puts a gentler kind of load on the CPU than Fallout does. The game ran a tiny bit slower, but it was very hard to notice.

This didn't kill the source of the problem, but it at least removed the consequences.
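For anyone curious how this kind of throttling works under the hood: as far as I know, tools like BES basically enforce a duty cycle, letting the target process run for part of each interval and suspending its threads for the rest. Here is a rough sketch of that idea on Windows (an illustration of the general technique only, not BES's actual code; the PID argument and the 50% ratio are just examples):

// Duty-cycle CPU throttling sketch (Windows, C++). Illustration only:
// BES's real implementation may differ. PID and ratio below are examples.
#include <windows.h>
#include <tlhelp32.h>
#include <cstdio>
#include <cstdlib>

// Suspend or resume every thread belonging to the given process.
static void setProcessSuspended(DWORD pid, bool suspend) {
    HANDLE snap = CreateToolhelp32Snapshot(TH32CS_SNAPTHREAD, 0);
    if (snap == INVALID_HANDLE_VALUE) return;
    THREADENTRY32 te;
    te.dwSize = sizeof(te);
    if (Thread32First(snap, &te)) {
        do {
            if (te.th32OwnerProcessID != pid) continue;
            HANDLE th = OpenThread(THREAD_SUSPEND_RESUME, FALSE, te.th32ThreadID);
            if (!th) continue;
            if (suspend) SuspendThread(th); else ResumeThread(th);
            CloseHandle(th);
        } while (Thread32Next(snap, &te));
    }
    CloseHandle(snap);
}

int main(int argc, char** argv) {
    if (argc < 2) { std::printf("usage: throttle <pid>\n"); return 1; }
    DWORD pid = static_cast<DWORD>(std::atoi(argv[1]));  // PID of the game process
    const DWORD periodMs = 100;  // one duty cycle
    const DWORD runMs = 50;      // the game may run for 50% of each cycle
    for (;;) {
        Sleep(runMs);                     // let the game run
        setProcessSuspended(pid, true);   // freeze it for the rest of the cycle
        Sleep(periodMs - runMs);
        setProcessSuspended(pid, false);  // and let it continue
    }
}

The point is the same as above: the core still gets loaded, but only part of the time, so the average heat output drops while the game barely notices.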
Post edited April 03, 2011 by Kaede
Mikenike, can you change the SOLVED tag on the forum to UNSOLVED?

Please, if you can, report whether Arcanum or other old games have high CPU usage.
gabocaba: Mikenike, can you change the SOLVED tag on the forum to UNSOLVED?

Please, if you can, report whether Arcanum or other old games have high CPU usage.
I know, I was only joking. :-) But yeah, Arcanum has high CPU usage too. I haven't checked any other games.

I think it's due to my low-quality processor, although it is still strange regardless. I'm eventually going to upgrade to a quad core; I'll see if that changes anything. I don't know if older games like Fallout utilize a quad-core processor, but the one I'm looking at is much higher quality than my current one. It should help.
Post edited April 10, 2011 by mikenike
gabocaba: Mikenike, can you change the SOLVED tag on the forum to UNSOLVED?

Please, if you can, report whether Arcanum or other old games have high CPU usage.
mikenike: I know, I was only joking. :-) But yeah, Arcanum has high CPU usage too. I haven't checked any other games.

I think it's due to my low-quality processor, although it is still strange regardless. I'm eventually going to upgrade to a quad core; I'll see if that changes anything. I don't know if older games like Fallout utilize a quad-core processor, but the one I'm looking at is much higher quality than my current one. It should help.
Hi, yeah, I know you were joking. A shame about Arcanum running with high CPU usage too, it's a great game.
And this is not something that will improve with a better processor. I have an i7, a quad core, and it maxes out one of my cores, so my CPU usage gets to 25% (with four cores)... but it's the same problem of overusing a core.

I wish GOG would address this problem someday.
mikenike: I know, I was only joking. :-) But yeah, Arcanum has high CPU usage too. I haven't checked any other games.

I think it's due to my low-quality processor, although it is still strange regardless. I'm eventually going to upgrade to a quad core; I'll see if that changes anything. I don't know if older games like Fallout utilize a quad-core processor, but the one I'm looking at is much higher quality than my current one. It should help.
gabocaba: Hi, yeah, I know you were joking. A shame about Arcanum running with high CPU usage too, it's a great game.
And this is not something that will improve with a better processor. I have an i7, a quad core, and it maxes out one of my cores, so my CPU usage gets to 25% (with four cores)... but it's the same problem of overusing a core.

I wish GOG would address this problem someday.
Hmm, this is strange indeed. Well, I suppose one could trust a higher quality processor not to get damaged, as a way to justify playing these great games. Or one could play it safe and avoid them altogether. I personally don't think it will run a CPU short of its normal lifespan.
Okay, I've got the same problem (virtual dual core at 50%), so I took a look.

There is one rather simple fact in programming which I think is responsible for this phenomenon of "old" games using up all the CPU.

Most processors run at different speeds: even if the clock rate/MHz is the same, the effective speed can differ due to CPU architecture, bus types, operating system and other factors.

So in order to run at a proper speed, games have to find a way of determining how fast their processor is running. Most do that by measuring how many simulation cycles can be performed within a certain time. As most games' logic processing is directly coupled to their rendering (they use only one main loop, not multiple threads), this is what you experience (more or less) as FPS, aka frames per second.
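A rough sketch of that calibration idea (not from any actual game; doOneSimulationCycle() is just a placeholder for one iteration of logic and rendering) could look like this: count how many cycles fit into a fixed window and derive a per-cycle time from that.

// Sketch: estimate how fast this machine runs one simulation cycle.
#include <chrono>
#include <cstdio>

static void doOneSimulationCycle() { /* game logic + rendering would go here */ }

int main() {
    using clock = std::chrono::steady_clock;
    const auto window = std::chrono::seconds(1);
    const auto start = clock::now();
    long cycles = 0;
    while (clock::now() - start < window) {
        doOneSimulationCycle();
        ++cycles;
    }
    // e.g. 166 cycles per second would mean about 6 ms per cycle on this machine
    std::printf("%ld cycles/s, ~%.3f ms per cycle\n",
                cycles, 1000.0 / static_cast<double>(cycles));
}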

Due to restrictions in the graphics framework (DirectX, Glide, OpenGL), rendering is usually "capped" at 60 or 100 FPS; you won't get more FPS unless you cheat your system. This is actually also a good thing, and we will see why later.

Now let's say that the game could do 80 simulation (logic & rendering) cycles per second, so you would actually get 80 pictures/frames per second, but your CPU and GPU would run at maximum power. Our human eyes and brains can only process about 12 different images per second, but they notice the transitions between those images up to roughly 24.7 (I think) images per second (so you get the feeling that the movie/game is not quite smooth). This is the reason why most video standards settle their FPS somewhere around 25 FPS (PAL etc.).

[Why old CRT monitors ran at higher "FPS" (60/70/72/90/100) had to do with interference from other light sources, light waves cancelling each other out, and how CRTs worked in general. This topic is quite big, but it has become more or less obsolete with newer display technologies like plasma and LCD screens.]

So back to the topic: the game could actually run at 80 FPS, but we (our eyes and brains) already accept 25 or 30 FPS as perfectly smooth. So those additional 50 FPS are only wasted energy (yes, your PC/laptop uses more energy for higher performance; heat production grows roughly with the square of performance, so twice the speed gives about four times the heat, i.e. uses about four times more energy. This is not generally applicable to the whole PC system, but minor effects are still observable).
So instead of wasting energy, and to allow other applications to run in the meantime, games nowadays limit themselves to certain FPS rates. In quite modern games there's usually even an in-game option for this.

The problem is: to limit FPS, the program needs a way to suspend itself for a very short amount of time (milliseconds). Most people know that as sleeping; in multi-threaded environments it can also be implemented by yielding or waiting.

So back to the early days of computers and computer games: there simply were no proper sleep functions around (and nobody cared about multithreading). So either the sleeping function was not accurate enough (your game stuttered), or the sleeping intervals were too big, with the effect that the FPS dropped significantly below 30.
[By the way: even today plain C/C++ sleep functions can have bad accuracy, somewhere between 20 and 40 ms, so you would get a lot of stuttering. This can be remedied by calling operating system functions (APIs) with much finer timer resolution, on the order of a millisecond or below, making them good enough for the sleeping task. But you see, small timing problems exist even today.]
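To make that concrete with one Windows-specific example (just a sketch, and link against winmm for timeBeginPeriod/timeEndPeriod): the granularity of plain Sleep() depends on the system timer resolution, and requesting a 1 ms resolution via the multimedia timer API usually brings Sleep(1) close to an actual millisecond, which is plenty for frame limiting.

// Sketch: measure Sleep(1) with default and with raised timer resolution.
#include <windows.h>
#include <chrono>
#include <cstdio>

static double measureSleep1() {
    using clock = std::chrono::steady_clock;
    const auto t0 = clock::now();
    Sleep(1);  // ask for a 1 ms sleep
    const auto t1 = clock::now();
    return std::chrono::duration<double, std::milli>(t1 - t0).count();
}

int main() {
    std::printf("default timer resolution: Sleep(1) took %.2f ms\n", measureSleep1());

    timeBeginPeriod(1);  // request 1 ms system timer resolution
    std::printf("1 ms timer resolution:    Sleep(1) took %.2f ms\n", measureSleep1());
    timeEndPeriod(1);    // always undo the request when done
}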

So most games just calculated the maximum FPS they could simulate and corrected their logic to account for that. This way they ran at more or less the same perceived speed on every PC, even though the actual FPS varied a lot.

One example which did not take this into account was Command & Conquer, where you noticed the difference especially in multiplayer: if your system ran twice as fast as your friend's, your bases produced at twice the speed, your units moved at twice the speed, and so on.
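The standard fix for exactly that problem is to scale the game logic by the measured length of the previous frame (delta time), so units cover the same distance per real second no matter how many frames the machine manages. A small sketch of the idea (the names and numbers are just examples, not from any particular engine):

// Sketch: frame-rate-independent movement via delta time.
#include <chrono>
#include <cstdio>

int main() {
    using clock = std::chrono::steady_clock;
    const double unitsPerSecond = 100.0;  // unit speed, the same on every PC
    double position = 0.0;

    auto lastFrame = clock::now();
    for (int frame = 0; frame < 5; ++frame) {
        const auto now = clock::now();
        const double dt = std::chrono::duration<double>(now - lastFrame).count();
        lastFrame = now;

        // Advance by speed * elapsed time, not by a fixed step per frame,
        // so a faster PC producing more frames does not speed the unit up.
        position += unitsPerSecond * dt;
        std::printf("frame %d: position %.4f\n", frame, position);
    }
}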

Back to "why does it run at 100/50/25 percent of my CPU"?
Simple: because it is simulating as fast as possible, but adjusting the logic so the user does not feel it run faster. And because it can only use one processor, only this processor runs at max speed, so you get a processor usage percentage of 1/c * 100% (where c is the amount of processors on your system).
Actually, old games on modern system could run at 3000 FPS or more. Due to the intended limitations of the graphics frameworks above, it is unusual to get higher rates than 100 FPS, but some games come up with frame rates of 250 or more.

Modern games do something else: they usually leave the timing completely to the framework and only concentrate on logic speed, because both timing and graphics are handled by the framework.

So what the framework basically does is run a main loop and call the game's logic procedure for every frame. From system calls it knows the time pretty accurately, down to nanoseconds. A game that limits itself to 30 FPS would look something like this:

timePerCycle = 1000ms / 30 FPS; // so around 33 ms per cycle

mainloop {
startTime = now();

// logic and graphics
processLogic(timePerCycle);
renderGraphics();

// timing
lastCycleLength = now() - startTime; // how long the simulation of the last cycle took, say 6 ms
restWaitTime = timePerCycle - lastCycleLength; // 33 - 6 = 27 ms
sleep(restWaitTime); // now sleep those 27 ms

} // go to the start of the main loop

and modern games only implement the processLogic() function.
So the game actually only used 6 ms for processing and rendering, but slept for 27 ms: 6/(6+27) = 6/33 ≈ 0.18 = 18%, meaning the game process would run at somewhere around 18% of the processor.
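For reference, here is the same 30 FPS loop as compilable, portable C++ (just a sketch using std::chrono and std::this_thread instead of the pseudo sleep(); processLogic() and renderGraphics() are empty stand-ins):

// Sketch: the frame-limited main loop above as runnable C++.
#include <chrono>
#include <thread>

static void processLogic(std::chrono::milliseconds /*timePerCycle*/) { /* game logic */ }
static void renderGraphics() { /* drawing */ }

int main() {
    using clock = std::chrono::steady_clock;
    const auto timePerCycle = std::chrono::milliseconds(1000 / 30);  // ~33 ms

    for (int frame = 0; frame < 300; ++frame) {  // main loop (about a 10 s demo)
        const auto startTime = clock::now();

        processLogic(timePerCycle);
        renderGraphics();

        const auto cycleLength = clock::now() - startTime;      // e.g. ~6 ms in a real game
        const auto restWaitTime = timePerCycle - cycleLength;   // rest of the 33 ms budget
        if (restWaitTime > clock::duration::zero())
            std::this_thread::sleep_for(restWaitTime);          // idle instead of burning CPU
    }
}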

Comparing this to an old game:
Assume the old game also needs 6 ms for logic and graphics.
Then it would run at (1 second = 1000 ms) / 6 ms ≈ 166.7 FPS, using all the processing power it can get. That's what usually happens in older games.
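And for contrast, a sketch of the old-style loop, which never sleeps; this is why one core sits at 100% regardless of how fast the machine is:

// Sketch: an uncapped old-style loop. With a 6 ms cycle it spins at ~166 FPS
// and keeps one core fully busy the whole time.
#include <cstdio>

static volatile unsigned long framesSimulated = 0;  // volatile so the loop stays observable

static void processLogicAndRender() {
    ++framesSimulated;  // stands in for ~6 ms of logic + rendering work
}

int main() {
    while (framesSimulated < 1000000UL) {  // no sleep anywhere: run as fast as possible
        processLogicAndRender();
    }
    std::printf("simulated %lu frames flat out\n", framesSimulated);
}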
Post edited December 08, 2012 by JayC667
I limited Fallout 1 and Fallout 2 CPU usage with BES. With 5-10% of allowed CPU, the games run without problems (and with 5-10% CPU usage).

More info here: http://www.synctocloud.net/?p=711

If you don't want to start/stop BES manually every time you start the game, try out the synctocloud app (the Fallout templates already contain a BES CPU fix task).