shmerl: In your tests you should also measure decompression time. xz without multithreading is bad for large packages. Such a huge wait time during installation is just not user-friendly.
vv221: You’re right, I’m going to run new tests measuring the decompression times, as well as compiling the latest version of xz to have a comparison between mono-threaded and multi-threaded xz compression/decompression.
Thanks! Please post the results here and / or in the request about xz (some tests were already listed there in the comments).
Even with parallel decompression, you might find xz to be slow. That's the advantage of tar.gz: it offers a pretty solid compression ratio at high decompression speeds. It's pretty similar to why people still use zip instead of 7zip or rar on Windows.
If you have a 4-core CPU with hyperthreading, parallel decompression can work pretty well. Anyway, as was discussed before, it's a compromise between download time and decompression time. For most people download time is still the major part, so for them reducing download time takes precedence over decompression.
And even if that's not the case, you forget there are still people using lousy dual cores or even single cores (or Atoms, for that matter). So the benefit is limited to only a part of GOG's customers.
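(For reference: with a recent enough xz (5.2 or newer, an assumption on my part since threading support is brand new), compression threads are enabled with -T, for example:

    xz -T4 -6 big.tar    # compress big.tar with up to 4 threads at level 6

The filename is just a placeholder. As far as I know the stock xz decompressor is still single-threaded; getting parallel decompression needs a multi-block archive plus a tool like pixz.)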
shmerl: For most people download time is still the major part, so for them reducing download time takes precedence over decompression.
Sorry, but I don't see it that way. Even in backward parts of the world like my own country, we have high-bandwidth internet access. And whatever the size of the download, it's still faster than downloading 1 MB over dial-up on a 9600 bps modem. So...
Post edited December 23, 2014 by astropup
One way to decrease the Linux tarball download size would be to stop bundling dosbox/wine/dependent libraries in the tarball.

I know GOG has reasons for that, but it would be nice if they could think up a way to avoid it. A little brainstorming on the part of their Linux team. :)
Post edited December 23, 2014 by astropup
Sorry, single-core gaming computers are already antiques. I don't think anyone should hold back on a technology just to accommodate those. It's not the early 2000s anymore.

If it's a 40 GB download at, let's say, 25 Mbps, you'll save tons of time by reducing the size even by 1/3. I already posted comparisons in the related request.
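(Rough numbers: 40 GB is about 320 gigabits, and 320,000 Mb / 25 Mbps = 12,800 s, roughly 3.5 hours. Cut the size by 1/3 and you save well over an hour per download.)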
astropup: One way to decrease the Linux tarball download size would be to stop bundling dosbox/wine/dependent libraries in the tarball.
That's a tiny percentage compared with the size of modern games like The Witcher 2, etc. Practically irrelevant in the grand scheme of things.
Post edited December 23, 2014 by shmerl
shmerl: Sorry, single-core gaming computers are already antiques. I don't think anyone should hold back on a technology just to accommodate those. It's not the early 2000s anymore.
What about Atom-based devices? There are a bunch of those today. And they can run most of the games GOG sells.

shmerl: Practically irrelevant in the grand scheme of things.
Well, it's obvious that we find different things more relevant/important. So I don't think further discussion will make either of us change our minds. :)
astropup: Well, it's obvious that we find different things more relevant/important.
I mean objectively. How large is Wine / DOSBox / ScummVM in comparison with a 20 GB game?

I'd expect gaming computers not to be Atom-based. Certainly not those targeted at high-end games, which are usually that large.
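(This is easy to check on any installed game, e.g.:

    du -sh /path/to/game/dosbox /path/to/game/data

The paths are hypothetical, but a bundled DOSBox or ScummVM weighs in at a few MB, against tens of GB of game data.)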

astropup: So I don't think further discussion will make either of us change our minds. :)
It's up to GOG to decide what to use anyway :) But as the size of games continues to grow, better compression will become more and more useful. How large will The Witcher 3 be, for instance?
Post edited December 23, 2014 by shmerl
shmerl: I'd expect gaming computers not to be Atom-based. Certainly not those targeted at high-end games, which are usually that large.
Not many of those (high-end games) on GOG. This site used to be called "Good Old Games", didn't it? :)

Although not many have high-end gaming machines anyway. Even in the U.S. Not to mention the rest of the world. :)

shmerl: It's up to GOG to decide what to use anyway :)
True. :)
Post edited December 23, 2014 by astropup
astropup: Not many of those (high-end games) on GOG. This site used to be called "Good Old Games", didn't it? :)
It used to, but now GOG sells demanding and quite massive games as well. For small games the compression method wouldn't make a major difference, really. But for large ones the difference can be significant.

astropup: Although not many have high-end gaming machines anyway. Even in the U.S. Not to mention the rest of the world. :)
I'd expect the average gamer to have a more powerful computer than casual users who aren't gamers. Of course, I mean on average; it doesn't mean everyone has a high-end computer.
Post edited December 23, 2014 by shmerl
I'd like to see better compression. For me it really isn't download speed but the fact that I have a monthly bandwidth cap. Better compression means I have more bandwidth to download more games.

My download speed sucks too, but I can wait overnight as needed. It's tougher to wait a couple of weeks when I'm getting close to the cap.
hummer010: I'd like to see better compression. For me it really isn't download speed but the fact that I have a monthly bandwidth cap. Better compression means I have more bandwidth to download more games.
Data caps are really nasty. Usually mobile ISPs have them, but landline ones normally don't. If they do, it's a sign of a very bad market (since they have no reason to use caps besides ripping users off). And even for mobile ISPs, caps are a very questionable practice.
Post edited December 23, 2014 by shmerl
hummer010: I'd like to see better compression. For me it really isn't download speed but the fact that I have a monthly bandwidth cap. Better compression means I have more bandwidth to download more games.
shmerl: Data caps are really nasty. Usually mobile ISPs have them, but landline ones normally don't. If they do, it's a sign of a very bad market (since they have no reason to use caps besides ripping users off). And even for mobile ISPs, caps are a very questionable practice.
I'm on fixed wireless. It's the one and only ISP in the area (other than mobile). The cap is a reasonable 70 GB / month, but I'm generally pretty close to it by month's end.
shmerl: Please post the results here and / or in the request about xz (some tests were already listed there in the comments).
From what I’m seeing so far, multi-threading is a huge improvement in compression times.
I’ll post the results here later today; I still need to run a couple of tests.

Just a bit of teasing before I get back to it. Here are the different (de)compression methods and levels I’m testing (on a 20 GB .tar archive):
gzip -6 (default compression level)
gzip -9 (maximum compression level)
xz mono-threaded -6 (default compression level)
xz mono-threaded -9 (maximum compression level)
xz multi-threaded (2 threads) -6
xz multi-threaded (2 threads) -9
xz multi-threaded (4 threads) -6
xz multi-threaded (4 threads) -9

The values measured for each one (a rough benchmark loop covering these is sketched after the list):
compression ratio
archive size (this one is redundant with ratio, but I find the difference easier to see with file size than with ratio)
compression time
decompression time
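Something like this would cover all the cases above (a sketch only: filenames are placeholders, -T needs xz 5.2 or newer, and the -f flag needs GNU time):

    TAR=game.tar    # the 20 GB archive (placeholder name)
    for args in '-6' '-9' '-6 -T2' '-9 -T2' '-6 -T4' '-9 -T4'; do
        # $args is left unquoted on purpose so '-6 -T4' splits into two options
        /usr/bin/time -f 'compress: %e s' xz $args -c "$TAR" > out.xz
        ls -l out.xz    # compressed size; ratio = original size / this
        /usr/bin/time -f 'decompress: %e s' xz -dc out.xz > /dev/null
    done
    /usr/bin/time -f 'compress: %e s' gzip -6 -c "$TAR" > out.gz    # repeat with -9
    /usr/bin/time -f 'decompress: %e s' gzip -dc out.gz > /dev/null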
Post edited December 23, 2014 by vv221