I'll add update resume support and incremental updating as well.
woolymethodman: I'll add update resume support and incremental updating as well.
Thanks so much. This is incredibly helpful. I really appreciate all the work you put into this.
I wish I could ask for even more options, but frankly I don't even know how they would work best without the whole thing getting out of hand.
Like adding an option to exclude certain games (like the original Strike Suit Zero, which has been updated to Director's Cut as a separate game), certain bonus packs (some games have bonuses in certain languages, FLAC and MP3 soundtracks of which I only need one version, etc.), or using other folder naming conventions (some Star Wars games actually have misleading folders - "republic_commando_copy3" is actually the Jedi Knight main game). Again, I know this is probably all just wishful thinking, but if you ever run out of ideas on what to do, I have a lot. :)

One definite wish I can list right here:
After downloading my collection, I had a few MD5 checks fail. It would be cool to know exactly which files were affected. Even during the verification process there could be an OK|FAIL at the end. Then I could pipe this to a log and see exactly what went wrong. (EDIT: Sorry, I see the verify command does indeed do that - I just missed it in the large log. My bad.)
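For anyone curious, a per-file MD5 check with an explicit OK|FAIL marker could be sketched roughly like this in Python. This is a hypothetical helper, not the script's actual verify code; the function names and manifest shape are made up for illustration:

```python
import hashlib
import os

def verify_file(path, expected_md5, chunk_size=1 << 20):
    """Return True if the file's MD5 matches, hashing in 1 MB chunks."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest() == expected_md5

def verify_all(manifest):
    """manifest: {path: expected_md5}. Print OK|FAIL per file, return failures."""
    failures = []
    for path, md5 in sorted(manifest.items()):
        ok = os.path.exists(path) and verify_file(path, md5)
        print("%s %s" % ("OK  " if ok else "FAIL", path))
        if not ok:
            failures.append(path)
    return failures
```

Piping that output to a log and grepping for FAIL would give exactly the "what went wrong" list.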

P.S.
When it comes to the implementation of the "clean" command - I am very excited for this, as I would love to be able to keep old files "gone for good". But I wonder how exactly this works. For some games I downloaded fixes and fan-made patches or mods, and I usually store them in a subdirectory of where I download my games (example: deus_ex/mods/). Would the delete command get rid of this subdirectory, too?
Post edited June 22, 2015 by 8BitChris
woolymethodman: I'll add update resume support and incremental updating as well.
8BitChris: Thanks so much. This is incredibly helpful. ...
Incorrectly named games like the "republic_commando_copy3" you mentioned should be reported to GOG - they'll fix those quickly. I will add support for mapping your own game ID/name to another directory.
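Such a user-supplied mapping could be as simple as a dict consulted when resolving the on-disk folder. A minimal sketch (the mapping entries and function name here are hypothetical examples, not real config):

```python
# Hypothetical user-editable overrides mapping a GOG game ID to the
# folder name you actually want on disk.
FOLDER_OVERRIDES = {
    "republic_commando_copy3": "star_wars_jedi_knight",
}

def resolve_folder(game_id):
    """Pick the on-disk folder for a game, honoring user overrides."""
    return FOLDER_OVERRIDES.get(game_id, game_id)
```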

I plan to make 'clean' somewhat flexible, with options like dryrun, orphan (move to an orphan folder), or delete. I guess exclude filters could be added too, as a way to have your own content ignored (e.g. deus_ex/mods/).
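The three modes plus exclude filters described above could be sketched along these lines. This is only an illustration under assumed names (the real command's options and manifest format may differ); `known_files` stands in for whatever the manifest records:

```python
import fnmatch
import os
import shutil

def clean(root, known_files, mode="dryrun", excludes=("*/mods/*",)):
    """Find files under root that aren't in the manifest (known_files).

    mode: 'dryrun' only reports, 'orphan' moves files into root/!orphaned,
    'delete' removes them. Relative paths matching an exclude pattern are
    left alone, so user content like deus_ex/mods/ survives.
    """
    orphan_dir = os.path.join(root, "!orphaned")
    removed = []
    for dirpath, _dirnames, filenames in os.walk(root):
        if dirpath.startswith(orphan_dir):
            continue  # don't re-process already-orphaned files
        for name in filenames:
            path = os.path.join(dirpath, name)
            rel = os.path.relpath(path, root)
            if rel in known_files:
                continue
            if any(fnmatch.fnmatch(rel, pat) for pat in excludes):
                continue
            removed.append(rel)
            if mode == "orphan":
                dest = os.path.join(orphan_dir, rel)
                os.makedirs(os.path.dirname(dest), exist_ok=True)
                shutil.move(path, dest)
            elif mode == "delete":
                os.remove(path)
    return removed
```

With "*/mods/*" in the excludes, a deus_ex/mods/ subdirectory would never be touched, which answers 8BitChris's P.S. above.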
Thank you! I currently run your script, and the update alone takes ~6-10 seconds per game. Is this normal? :/

Reason: the download extras bit in Galaxy is far too much clicking and I like to have at least all soundtracks locally.

p.s. the GOGDownloader links are buried much too deep in the website now as well.

00:15:26 | request failed: HTTP Error 404: Not Found. will not retry.
00:15:26 | no md5 data found for patch_system_shock2de_2.1.1.20.exe
p.p.s. could we have a switch to only download extras, since I use Galaxy for installations?

-skipextras skip downloading of any GOG extra files
Something like:
http://pastie.org/10281496#19
(created just now, not tested)
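The existing -skipextras flag and the proposed extras-only counterpart could be wired up with argparse roughly like this. The -skipgames name matches the flag disi ends up adding later in the thread; the exact wiring here is a guess, not the pastie's code:

```python
import argparse

def build_parser():
    # Single-dash long options, mirroring the script's existing style.
    p = argparse.ArgumentParser(prog="gogrepo")
    p.add_argument("-skipextras", action="store_true",
                   help="skip downloading of any GOG extra files")
    p.add_argument("-skipgames", action="store_true",
                   help="skip downloading of any GOG game files (extras only)")
    return p
```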
Post edited July 09, 2015 by disi
disi: Thank you! I currently run your script and the update alone takes ~6-10 seconds per game. Is this normal? :/ ...
disi: pps. could we have a switch to only download extras? Since I use Galaxy for installations? ...
disi: Something like: http://pastie.org/10281496#19 ...
Yes, there are a few web requests made for each game (game info, MD5 data, etc.); however, there is a forced 1-second sleep between web requests. This is so you don't get temporarily banned from GOG for making too many requests. You can change this to 0 if you'd like, but you will get banned pretty quickly unless you don't have many games in your collection. I can add an option to skip fetching MD5 data, which should make things faster - but MD5 data is nice for verifying the integrity of your files.
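The throttling idea is just a minimum interval enforced between requests. A sketch of how that can be done (hypothetical class; the script's actual sleep code may be a plain time.sleep(1) between calls):

```python
import time

class RateLimiter:
    """Enforce a minimum delay between web requests so the server
    doesn't temp-ban you for hammering it.

    clock/sleep are injectable so the behavior can be tested without
    actually waiting.
    """

    def __init__(self, min_interval=1.0, clock=time.monotonic, sleep=time.sleep):
        self.min_interval = min_interval
        self._clock = clock
        self._sleep = sleep
        self._last = None

    def wait(self):
        """Block until at least min_interval has passed since the last call."""
        now = self._clock()
        if self._last is not None:
            remaining = self.min_interval - (now - self._last)
            if remaining > 0:
                self._sleep(remaining)
        self._last = self._clock()
```

Calling limiter.wait() before each HTTP request gives the forced 1-second spacing described above; setting min_interval=0 disables it, with the ban risk noted.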

Yeah I can add a flag to only download extras.

Note that the 404 error is due to GOG not publishing MD5 data for the System Shock patch. It can be safely ignored, but I should make it more obvious that it's just missing MD5 data and is considered "normal". GOG's MD5 database isn't perfect.

Thanks for the feedback.
woolymethodman: Yes, there are a few web requests done for each game (game info, md5 data, etc) ...
You're welcome :)
It is great this way, and much, much less to download.

This works like a charm (-skipgames added):
http://pastie.org/10282354
Attachments:
gogrepo.jpg (87 Kb)
Post edited July 09, 2015 by disi
woolymethodman: Yes, there are a few web requests done for each game ...
disi: ... This works like a charm (-skipgames added): http://pastie.org/10282354
I can take a look tonight or tomorrow... do you want to submit this to the project via GitHub instead? I can review it there. Up to you.
woolymethodman: I can take a look tonight or tomorrow... do you want to submit this to the project via github instead? I can review it there.. Up to you.
Done, I hope :)

p.s. I've never used GitHub on Windows before.
Attachments:
cwa.jpg (196 Kb)
Post edited July 10, 2015 by disi
Another update with the -skipknown switch for updates.

Skipgames in action: http://i.imgur.com/BAcdbmI.jpg
Skipknown in action: http://i.imgur.com/BMeGn4v.jpg
Then after the partial update skipgames in action: http://i.imgur.com/imxTZ17.jpg
woolymethodman: I can take a look tonight or tomorrow... do you want to submit this to the project via github instead? ...
Alert :) Can you check the issues, please?
disi: Alert :) Can you check the issues, please?
Thanks for the reports. I will review them as soon as I can.
Just uploaded a fix to the 'update' command. GOG's JSON API changed and caused the script not to find download links. Thanks to disi for reporting it.
Hello. Here is a feature request. I don't know if the JSON data you get makes this easy or hard for you.

I would like to download all of my collection. To do so, I may need to buy a new HD to store it. How large does it have to be? Can you report how large the download will be? This is similar to --dry-run in rsync.

I see two use cases for this functionality:
1) to ensure there is enough free space before downloading, and
2) seeing if it will take too long in one go, and do the downloads in smaller chunks, game by game.

Thank you.
Gede: Hello. Here is a feature request. ... Can you report how large the download will be? ...
You can interrupt the download at any time (Ctrl+C) and continue later. Only the partially downloaded files will be discarded.
Once you start the download, it tells you every second how much is remaining, e.g. 600GB, and then you can check the folder size of what has already been downloaded, e.g. 400GB. So you know it will be ~1TB.

p.s. In your case: run the update, then run the download, and it shows the overall size required on the drive, since nothing has been downloaded yet.
And with -dryrun it will not save anything.
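Summing the sizes recorded by 'update' is all a space estimate needs. A sketch of that, with hypothetical function names and a made-up manifest layout (a list of dicts with a "size" key):

```python
def total_download_size(manifest):
    """Sum the byte sizes recorded in a manifest-like list of dicts,
    so free disk space can be checked before downloading anything."""
    return sum(item.get("size", 0) for item in manifest)

def human_size(n):
    """Format a byte count for display, e.g. 1536 -> '1.5 KB'."""
    for unit in ("B", "KB", "MB", "GB", "TB"):
        if n < 1024 or unit == "TB":
            return "%.1f %s" % (n, unit)
        n /= 1024.0
```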
Post edited July 21, 2015 by disi
avatar
disi: ... Just the partial downloaded files will be discarded.
I see. If there is already a -dryrun option (I missed it, sorry) and it mentions how much is left to download, then I suppose it already does what I was asking for. Thank you for informing me.

BTW, I briefly looked at the code, and I had the impression that some of it was handling incremental downloading of files (via file seek() and a request() with a byte range). But it seems I was mistaken. If the script does not support resume, it could probably be added with relative ease.
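The seek()/byte-range idea is standard HTTP resume: ask the server for only the bytes you don't have yet via a Range header, and append them to the partial file. A minimal sketch (hypothetical function, no retry or MD5 verification; the opener parameter is only there so the logic can be exercised without a network):

```python
import os
import urllib.request

def resume_download(url, dest, opener=urllib.request.urlopen):
    """Resume a partial download by requesting only the missing byte range
    and appending it to the existing file. Returns the final file size."""
    have = os.path.getsize(dest) if os.path.exists(dest) else 0
    req = urllib.request.Request(url)
    if have:
        req.add_header("Range", "bytes=%d-" % have)
    with opener(req) as resp, open(dest, "ab") as out:
        # 206 means the server honored the range; a plain 200 means it is
        # resending the whole file, so drop the partial data and start over.
        if have and getattr(resp, "status", 200) != 206:
            out.truncate(0)
        for chunk in iter(lambda: resp.read(1 << 16), b""):
            out.write(chunk)
    return os.path.getsize(dest)
```

GOG's download servers would also need to honor Range requests for this to work, which is worth checking before relying on it.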