Mnemon: I have 144 games. They fit easily (including all extras and both the Windows and Linux versions where available) on a 1 TB drive that holds plenty of other things [loads of raw photos / some films / many other bits], with around 400 GB still free.

A lot of the games here are well under 1 GB each.
hedwards: I have over 200 games and they take up less than 200 GB, with the extras and several different languages.

I don't buy new games here in most cases, though, especially since the downloader doesn't work very well for Linux.
Try gogrepo :)
hedwards: I have over 200 games and they take up less than 200 GB, with the extras and several different languages.

I don't buy new games here in most cases, though, especially since the downloader doesn't work very well for Linux.
woolymethodman: Try gogrepo :)
It's gotten somewhat better, but do any of the programs allow you to resume in the middle of a download?
woolymethodman: It's assumed that you want to download whatever's in your manifest from your update command.. so download doesn't take -os or -lang. I could add that though.
No need, I guess. I just originally went by the first message in this thread (updated in June), assuming the commands there are still valid. But it is always a good idea to check the readme file that comes with the script for the up-to-date commands. Currently the script seems to do pretty much what I want it to do, now that I know its logic.

Still downloading, and so far so good: it seems to have downloaded at full speed (10 Mbit/s) pretty much the whole time (I'm monitoring my network usage in Linux with ntop at the same time, just to see if there are any hiccups in download speed). It looks like downloading my whole GOG library (Windows/English only) with extras will take maybe 10-12 days on my home internet connection.

Too bad I need to go abroad next weekend (8th of Aug), so apparently I will not be able to fully complete this download this time. Oh well, I'll have to continue when I come back in September (I don't think I will try to continue the downloads abroad, as I'll probably be on a mobile data connection). I really wish this download script also worked at my workplace (corporate network), because then I could probably download them all over a weekend or so, but then I might also get some attention from the IT service for heavy use of the network. "Sure it is work-related! Over one terabyte of work-critical files!" :) This is the first time I really wish I had the 100 Mbit/s connection at home; I just didn't want to pay extra for it so far (10 Mbit/s has been fast enough until now).

By the way, in what order does the script download the games/extras? It seems to have four active downloads (files) at any time, but how does it choose what to download next? It seems to pick files here and there, instead of downloading the games in succession. If I can't finish this download cycle in time, I guess I need to check with verify which games are still missing installation files; I can't just assume that, e.g., half of the games are fully downloaded already?


Also another clarifying question: what happens if GOG has changed or updated some files while you are downloading, i.e. there is stale information in the manifest file? For example, I can see that after I downloaded the manifest file, there has been at least one GOG update to some game (I didn't check which one; I wouldn't be surprised if it were The Witcher 3).

Will the script try to download something that is no longer on the servers, and how will it recover from that error? Does it just skip it, so that I need to update the manifest file afterwards?
Post edited July 30, 2015 by timppu
There are -skipextras and -skipgames options for the download command.

The manifest is sorted alphabetically by the name of the game.

If a file cannot be found (a 404 is returned), it skips it.
If a file request times out, it will retry five times, with 5 seconds in between, I think.
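
Roughly, the behaviour described above boils down to something like this (a minimal sketch, not gogrepo's actual code; the function name, timeout and exact retry timings are assumptions):

import time
import urllib.request
import urllib.error

def fetch(url, retries=5, delay=5):
    # Skip files that are gone from the server (404); retry other failures
    # a fixed number of times with a short pause in between.
    for attempt in range(retries):
        try:
            with urllib.request.urlopen(url, timeout=60) as resp:
                return resp.read()
        except urllib.error.HTTPError as e:
            if e.code == 404:
                print("not on the server, skipping:", url)
                return None
            time.sleep(delay)
        except urllib.error.URLError:
            time.sleep(delay)
    print("giving up on", url, "after", retries, "attempts")
    return None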
Post edited July 30, 2015 by disi
disi: The manifest is sorted alphabetically by the name of the game.
Is that also the order in which it downloads the games and extras? As far as I can tell from the output of the script, it downloads files here and there, e.g. a Thief Gold installation file and extras for URU at the same time, etc. Is this how it is supposed to download them, in no specific order? How does it decide what to download next when one file download completes?

Just out of curiosity, what is the logic behind it? All that matters, of course, is that it downloads everything, but as said, if the download is interrupted, it currently isn't so easy to tell right away which games/extras are already downloaded, i.e. ready for use. E.g. if I knew the download was interrupted while downloading Thief Gold files, then everything before that alphabetically would already be downloaded.

I guess the verify option tells me that, though.

for item in sorted(items, key=lambda g: g.title):
Sorted by title from the manifest file.

It also checks which files are already there. If you imported downloads, it will just skip those, and that can mix up the queue.

It will also grab the next item if, for example, another thread (worker) is still busy with the previous one.

thread1:
a_game_item1
thread2:
a_game_item2
thread3:
a_game_item3
thread4:
b_game_item1

If thread 4 is finished before thread 2 is, it will grab b_game_item2 or whatever comes next.

Whatever the order, you will eventually have all items from the manifest, and there is no need to download them again.
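
As a toy illustration (not gogrepo's actual code; the dummy manifest and the sleep are made up), the out-of-order completion falls out naturally from several workers pulling from one alphabetically sorted queue:

import queue
import threading
import time

items = {"a_game": ["item1", "item2", "item3"], "b_game": ["item1", "item2"]}  # dummy manifest

work = queue.Queue()
for title in sorted(items):  # same alphabetical ordering as the manifest
    for name in items[title]:
        work.put((title, name))

def worker():
    while True:
        try:
            title, name = work.get_nowait()
        except queue.Empty:
            return
        time.sleep(0.1)  # stands in for the actual download
        print(title, name, "done")

threads = [threading.Thread(target=worker) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()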

If a game was updated, check the advanced usage section in the readme to update/download single games.
Post edited July 30, 2015 by disi
woolymethodman: Try gogrepo :)
hedwards: It's gotten somewhat better, but do any of the programs allow you to resume in the middle of a download?
gogrepo doesn't support resuming yet.. I'll add that soon. It will make the manifest larger to store all the MD5 chunks, but since that data is available from GOG, why not make use of it.
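
For what it's worth, chunk-based resuming could look roughly like this (a sketch under assumptions, not the planned implementation; the 10 MB chunk size, the helper names and the use of an HTTP Range request are all illustrative):

import hashlib
import os
import urllib.request

CHUNK = 10 * 1024 * 1024  # assumed size of the chunks the MD5 sums cover

def verified_bytes(path, chunk_md5s):
    # Count how many leading bytes of the partial file match the known checksums.
    ok = 0
    with open(path, "rb") as f:
        for expected in chunk_md5s:
            data = f.read(CHUNK)
            if not data or hashlib.md5(data).hexdigest() != expected:
                break
            ok += len(data)
    return ok

def resume_download(url, path, chunk_md5s):
    # Continue from the first byte that does not verify, using a Range request.
    offset = verified_bytes(path, chunk_md5s) if os.path.exists(path) else 0
    req = urllib.request.Request(url, headers={"Range": "bytes=%d-" % offset})
    with urllib.request.urlopen(req) as resp, open(path, "r+b" if offset else "wb") as out:
        out.seek(offset)
        while True:
            block = resp.read(64 * 1024)
            if not block:
                break
            out.write(block)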
woolymethodman: It's assumed that you want to download whatever's in your manifest from your update command.. so download doesn't take -os or -lang. I could add that though.
timppu: No need, I guess. I just originally went by the first message in this thread (updated in June), assuming the commands there are still valid. But it is always a good idea to check the readme file that comes with the script for the up-to-date commands. Currently the script seems to do pretty much what I want it to do, now that I know its logic.
My bad.. I should keep that first post up to date! I edited the post.
Post edited July 30, 2015 by woolymethodman
hedwards: It's gotten somewhat better, but do any of the programs allow you to resume in the middle of a download?
woolymethodman: gogrepo doesn't support resuming yet.. I'll add that soon. It will make the manifest larger to store all the MD5 chunks, but since that data is available from GOG, why not make use of it.
That would be awesome. When I tried to download Wasteland 2 that first time, I had to download at least 50 GB of data because it would time out at around 95% complete with no way of resuming. And it didn't seem to matter how I went about it, the connection would just die with no way of resuming it.

I'm still annoyed that GOG is wasting money on Galaxy when such things have yet to be addressed at all.
Thanks a lot again for that script. It works great now, even with multiple languages and operating systems. As mentioned already, the missing "clean" and resume functionality are also my wishes for future enhancement. I bet I have a lot of wasted space now because of outdated files, even though I try to keep my collection clean. Resume would be great because of the daily forced disconnect of my ISP.

This script is pretty much what makes GOG keep their promises; let's see if Galaxy will step up and make offline archiving easy too.
Still working great here too. For the past 2-3 days, I've been downloading my whole GOG game collection over a standard 10 Mbit/s connection, uninterrupted. Around 320 GB downloaded so far, 940 GB still to go. :) How I wish right now I had paid extra for a 100 Mbit/s connection...

I'm quite surprised that the download has just continued uninterrupted: the script hasn't failed, and neither the GOG servers nor my ISP have decided to interrupt the downloads at any point. Or maybe they have, but the script has restarted (or skipped) any failed downloads? I'll know the truth when I run the verification with the script afterwards; who knows if there are some corrupted downloads too. But at least it is pretty straightforward to check for them with this script.

Quite pleased at the moment. :)

schlulou: Resume would be great because of the daily forced disconnect of my ISP.
I understood it should already be possible to perform the download in several parts? I'm not sure if that means you just rerun the download command and it continues where you left off (re-downloading only those files which were not fully downloaded in the earlier session)? I presume so; I haven't tried that yet.

Or did you mean proper resume where it resumes even the partial files?
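
For reference, the usual way such "continue where you left off" behaviour works is a check along these lines before each file; this is only a guess at the general idea, not a description of gogrepo's actual logic:

import os

def needs_download(path, expected_size):
    # Skip files that already exist on disk with the size recorded in the manifest.
    return not (os.path.exists(path) and os.path.getsize(path) == expected_size)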

schlulou: This script is pretty much what makes GOG keep their promises; let's see if Galaxy will step up and make offline archiving easy too.
Frankly, I don't think it ever will, and I kinda understand why. Sadly, it may not be in GOG's best interest that lots of people routinely download everything they have with one push of a button, instead of downloading only those games they intend to play right there and then, one at a time. Less stress on the download servers.

So they probably don't want to make mass-downloading too simple and convenient (the same goes for other stores too; e.g. I wouldn't want to download 1023 games on Steam, Origin or HumbleBundle either).

Or maybe I am just being too cynical and GOG will add similar functionality at some point; that would certainly be a nice surprise. Failing that, I am fine with third-party tools like gogrepo doing it, as long as GOG does not actively try to prevent them from working. I presume the earlier changes to the "JSON APIs" were due to general changes in Galaxy or the GOG service, and not really meant to break gogrepo or lgogdownloader functionality.
Post edited August 02, 2015 by timppu
timppu: Still working great here too. For the past 2-3 days, I've been downloading my whole GOG game collection over a standard 10 Mbit/s connection, uninterrupted. Around 320 GB downloaded so far, 940 GB still to go. :) How I wish right now I had paid extra for a 100 Mbit/s connection...

I'm quite surprised that the download has just continued uninterrupted: the script hasn't failed, and neither the GOG servers nor my ISP have decided to interrupt the downloads at any point. Or maybe they have, but the script has restarted (or skipped) any failed downloads? I'll know the truth when I run the verification with the script afterwards; who knows if there are some corrupted downloads too. But at least it is pretty straightforward to check for them with this script.

Quite pleased at the moment. :)

schlulou: Resume would be great because of the daily forced disconnect of my ISP.
timppu: I understood it should already be possible to perform the download in several parts? I'm not sure if that means you just rerun the download command and it continues where you left off (re-downloading only those files which were not fully downloaded in the earlier session)? I presume so; I haven't tried that yet.

Or did you mean proper resume where it resumes even the partial files?
I meant both here. Every 24 hours my ISP cuts the connection and assigns me a new IP. That's done to make it harder for customers to run server services. It happens so fast I don't notice it at all (and I set it up to happen at 4 am), but gogrepo.py can't handle it: it will just stop doing anything. You need to close it and start the download again, and then gogrepo discards all already-started downloads and can't resume them.
So "retry after a connection loss/IP change and resume incomplete downloads" is maybe a more exact wording.

schlulou: This script is pretty much what makes GOG keep their promises; let's see if Galaxy will step up and make offline archiving easy too.
timppu: Frankly, I don't think it ever will, and I kinda understand why. Sadly, it may not be in GOG's best interest that lots of people routinely download everything they have with one push of a button, instead of downloading only those games they intend to play right there and then, one at a time. Less stress on the download servers.

So they probably don't want to make mass-downloading too simple and convenient (the same goes for other stores too; e.g. I wouldn't want to download 1023 games on Steam, Origin or HumbleBundle either).

Or maybe I am just being too cynical and GOG will add similar functionality at some point; that would certainly be a nice surprise. Failing that, I am fine with third-party tools like gogrepo doing it, as long as GOG does not actively try to prevent them from working. I presume the earlier changes to the "JSON APIs" were due to general changes in Galaxy or the GOG service, and not really meant to break gogrepo or lgogdownloader functionality.
Yes, I understand where GOG is coming from. I just don't like that they are making it harder and harder to not use Galaxy. The GOG Downloader is even more hidden now: the links for it are buried in some drop-down overlay which opens an unformatted, unstructured page that simply looks broken. They "promised" Galaxy would always be optional; now I feel they make even simple individual downloading as hard as they can without breaking their promise. It stinks.

I agree with your point: if they don't actively fight third-party scripts, I can live with that.
schlulou: I meant both here. Every 24 hours my ISP cuts the connection and assigns me a new IP. That's done to make it harder for customers to run server services. It happens so fast I don't notice it at all (and I set it up to happen at 4 am), but gogrepo.py can't handle it: it will just stop doing anything. You need to close it and start the download again, and then gogrepo discards all already-started downloads and can't resume them.
So "retry after a connection loss/IP change and resume incomplete downloads" is maybe a more exact wording.
Oh, OK. I was kind of fearing my ISP did something similar, but apparently not, as I've been downloading for several days without interruptions, near the maximum 10 Mbit/s the whole time (checking it with ntop in Linux, which also shows historical data, so I can see if there are any dips in transfer speed etc.).

Doesn't that also mean that if you start downloading some humongous GOG game overnight with either Galaxy or a web browser, you'll always be greeted by a partial, corrupted download in the morning? That sucks.

schlulou: Yes, I understand where GOG is coming from. I just don't like that they are making it harder and harder to not use Galaxy. The GOG Downloader is even more hidden now: the links for it are buried in some drop-down overlay which opens an unformatted, unstructured page that simply looks broken.
I'm fine with Galaxy replacing the legacy GOG Downloader, as long as it remains possible to download the offline installers with it reasonably easily. And keep the option to download the installers with your web browser, in case everything else fails (e.g. Galaxy doesn't work at my workplace over the corporate network, it will not connect; then again, the same applies to gogrepo and Steam as well). What works there for some reason are GOG web downloads, the legacy GOG Downloader, and e.g. the EA Origin client.

I've always meant to check why e.g. GOG web downloads and even the GOG Downloader connect and work fine on the corporate network, but gogrepo doesn't. What does it do differently? Does it use different ports?

What I meant, though, was that I am not expecting Galaxy to ever have a button to mass-download all your GOG games with one or a few clicks, gogrepo-style. You will most probably still have to download your installers one by one in the future, clicking on each game's and extras' download links. The GOG Downloader didn't allow that either, nor does any other official downloader for digital stores that I've seen. I presume the stores want to push the idea that you only download what you are going to use right then.

But as said, as long as GOG allows third-party tools like gogrepo for mass downloads, I'm fine. When I'm done with this mass download, I won't be bothering GOG download servers that much. Maybe sometimes downloading newer games I haven't downloaded yet, and if anything major has changed in my existing games.

Some optional P2P functionality would be fine too; then GOG could offload a big part of the server stress to users who are willing to help out. HumbleBundle does that already with generic torrent clients, but it doesn't work too well due to a lack of seeders, especially for their Android APK installers.
Post edited August 02, 2015 by timppu
woolymethodman: gogrepo doesn't support resuming yet.. I'll add that soon. It will make the manifest larger to store all the MD5 chunks, but since that data is available from GOG, why not make use of it.
hedwards: That would be awesome. When I tried to download Wasteland 2 that first time, I had to download at least 50 GB of data because it would time out at around 95% complete with no way of resuming. And it didn't seem to matter how I went about it, the connection would just die with no way of resuming it.

I'm still annoyed that GOG is wasting money on Galaxy when such things have yet to be addressed at all.
lgogdownloader does support resuming
hedwards: That would be awesome. When I tried to download Wasteland 2 that first time, I had to download at least 50 GB of data because it would time out at around 95% complete with no way of resuming. And it didn't seem to matter how I went about it, the connection would just die with no way of resuming it.

I'm still annoyed that GOG is wasting money on Galaxy when such things have yet to be addressed at all.
immi101: lgogdownloader does support resuming
That's good to know. I've got so many games in my library that when I'm downloading them, I'm not usually watching the console by the time it gets to the first game it needs to download.
I presume the reason the script can't connect from a corporate network ("Errno 10060") is the proxy.

So a feature request: proxy support?

As said, the GOG Downloader and e.g. the EA Origin client connect fine from behind the proxy, while e.g. GOG Galaxy, Steam and gogrepo.py don't. Automatic proxy configuration (proxy.pac) is in use for the browsers, so I am not quite sure how the GOG Downloader and Origin achieve the connection. Apparently they automatically read the same proxy.pac file the browsers use in order to get the right proxy server settings? At least I don't recall ever configuring any proxy settings for either client, yet they work fine.

I'd be fine with manually editing a proxy server/port value into the script myself if needed; I guess I can look up the correct proxy server in that proxy.pac file. So it doesn't necessarily need to check for a proxy.pac file, if that is harder to implement.

PS. Is this issue with proxying HTTPS in Python still present (see the link below)? At the end, it suggests an insecure way of achieving it.

https://lukasa.co.uk/2013/07/Python_Requests_And_Proxies/
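
For reference, with the requests library a proxy can be passed explicitly (or picked up from the standard HTTP_PROXY/HTTPS_PROXY environment variables); a minimal sketch, where the proxy host/port are placeholders and whether gogrepo itself uses requests is an assumption:

import requests

PROXIES = {
    "http": "http://proxy.example.corp:8080",   # values read off the proxy.pac by hand
    "https": "http://proxy.example.corp:8080",
}

resp = requests.get("https://www.gog.com/", proxies=PROXIES, timeout=30)
print(resp.status_code)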
Post edited August 03, 2015 by timppu