EdhelDil:
mrkgnao: You can find a solution for the xml parsing issue two posts above yours, here:
https://www.gog.com/forum/general/gogrepopy_python_script_for_regularly_backing_up_your_purchased_gog_collection_for_full_offline_e/post2876
thaaanks! This works!

But now, after 80 of 610 updates fetched, it times out on all subsequent ones...

01:31:43 | ( 80 / 610) fetching game details for cultures...
01:31:46 | ( 81 / 610) fetching game details for cultures_2...
01:31:49 | ( 82 / 610) fetching game details for cyberpunk_2077_game...
01:32:27 | request failed: HTTPSConnectionPool(host='gog-cdn-lumen-cp77.secure2.footprint.net', port=443): Read timed out. (3 retries left) -- will retry in 5s...
01:33:08 | request failed: HTTPSConnectionPool(host='gog-cdn-lumen-cp77.secure2.footprint.net', port=443): Read timed out. (2 retries left) -- will retry in 5s...
01:33:48 | request failed: HTTPSConnectionPool(host='gog-cdn-lumen-cp77.secure2.footprint.net', port=443): Read timed out. (1 retries left) -- will retry in 5s...
01:34:28 |
Traceback (most recent call last):
File "C:\Users\Edh\AppData\Roaming\Python\Python310\site-packages\urllib3\response.py", line 438, in _error_catcher
yield
File "C:\Users\Edh\AppData\Roaming\Python\Python310\site-packages\urllib3\response.py", line 519, in read
data = self._fp.read(amt) if not fp_closed else b""
File "C:\Program Files\Python310\lib\http\client.py", line 465, in read
s = self.fp.read(amt)
File "C:\Program Files\Python310\lib\socket.py", line 705, in readinto
return self._sock.recv_into(b)
File "C:\Program Files\Python310\lib\ssl.py", line 1273, in recv_into
return self.read(nbytes, buffer)
File "C:\Program Files\Python310\lib\ssl.py", line 1129, in read
return self._sslobj.read(len, buffer)
TimeoutError: The read operation timed out

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "C:\Users\Edh\AppData\Roaming\Python\Python310\site-packages\requests\models.py", line 758, in generate
for chunk in self.raw.stream(chunk_size, decode_content=True):
File "C:\Users\Edh\AppData\Roaming\Python\Python310\site-packages\urllib3\response.py", line 576, in stream
data = self.read(amt=amt, decode_content=decode_content)
File "C:\Users\Edh\AppData\Roaming\Python\Python310\site-packages\urllib3\response.py", line 512, in read
with self._error_catcher():
File "C:\Program Files\Python310\lib\contextlib.py", line 153, in __exit__
self.gen.throw(typ, value, traceback)
File "C:\Users\Edh\AppData\Roaming\Python\Python310\site-packages\urllib3\response.py", line 443, in _error_catcher
raise ReadTimeoutError(self._pool, None, "Read timed out.")
urllib3.exceptions.ReadTimeoutError: HTTPSConnectionPool(host='gog-cdn-lumen-cp77.secure2.footprint.net', port=443): Read timed out.

etc... :(
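A pattern like the retry loop in that log can be expressed generically. The sketch below is not gogrepo's actual code (the function name and defaults are made up); it just mirrors the retry-then-give-up behaviour shown above:

```python
import time

def retry(func, attempts=4, delay=5.0, exceptions=(Exception,)):
    """Call func(); on failure retry, up to `attempts` tries in total,
    sleeping `delay` seconds between tries, then re-raise."""
    for remaining in range(attempts - 1, -1, -1):
        try:
            return func()
        except exceptions as exc:
            if remaining == 0:
                raise  # out of retries: propagate, as in the traceback above
            print(f"request failed: {exc} ({remaining} retries left) -- will retry in {delay:g}s...")
            time.sleep(delay)
```

For example, something like retry(lambda: session.get(url, timeout=120)) would retry a hypothetical requests call with a generous read timeout; raising the per-request timeout is often the simplest fix for a slow CDN host.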
phaolo: It's already absurd that we have to rely on third-party scripts for that.
I don't think it is "absurd", because it is not necessarily in GOG's best interests that people routinely mass-download all their games. So it is understandable that GOG isn't really promoting such ideas, just like no other digital store does either.

I just hope they won't try to block it either, as gogrepo is one of the main reasons I primarily like buying games here. The good thing about third-party scripts is that we don't have to wait for GOG to fix them, and anyone (with the knowledge) can fix the scripts, add new features, etc.

In a perfect world, GOG would offer an open-source mass-download client that relies on peer-to-peer protocols for downloading, so mass-downloaders wouldn't necessarily put pressure on GOG's download servers, and it would also guarantee better download speeds for those mass-downloaders. But I don't see GOG using resources for such things either, as nice as it would be. The Humble Store comes close, though: they offer the ability to download the offline installers with any BitTorrent client, which is a great option.
In the event gogrepo were blocked, a utility that lists the files that need to be updated would be good (some games or types of files I would update, others I wouldn't bother with).

Having gogrepo write the list of new things (updates) to a file, and allowing us to edit this file to disable some lines before the download phase, would be nice.
Post edited February 18, 2022 by EdhelDil
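The list-then-edit workflow suggested in the post above could look something like this. These are hypothetical helpers, not part of gogrepo; the one-item-per-line file format, where a leading '#' disables a line, is an assumption:

```python
def write_pending(path, items):
    # write one pending download per line; the user can later prefix
    # unwanted lines with '#' to skip them in the download phase
    with open(path, "w", encoding="utf-8") as f:
        for item in items:
            f.write(item + "\n")

def read_selected(path):
    # return only the items that were not blanked or commented out
    selected = []
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if line and not line.startswith("#"):
                selected.append(line)
    return selected
```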
timppu: I don't think it is "absurd", because it is not necessarily in GOG's best interests that people routinely mass-download all their games. So it is understandable that GOG isn't really promoting such ideas, just like no other digital store does either.
Yes, and that is one of the reasons my GUI for gogcli.exe by Magnitus did not initially support full-library downloading, and then, when I finally changed my mind and added it, I put in some limitations that include forced rest periods.

Another big reason I delayed was my own experience with crappy download speeds that had previously been okay. Then speeds improved for me and have continued to be better, so part of my reluctance was lessened.

Another reason, of course, was my belief that any really smart person would have been backing up their games from GOG from the get-go, not letting a huge backlog build up, so I did not have much sympathy for those who didn't do that ... especially if it resulted in hundreds of games not backed up. I have never understood why someone would spend that kind of money and not ensure their purchases are backed up offline. I mean, isn't that the whole point of GOG ... anything else is just an added extra.

Still, I guess you get all types using GOG now, and many have the GOG versus Steam mindset rather than the DRM-free versus DRM mindset.

So, to each their own I guess, but that is also another reason why GOG would not entertain doing mass downloading with Galaxy.
timppu: I don't think it is "absurd", because it is not necessarily in GOG's best interests that people routinely mass-download all their games. So it is understandable that GOG isn't really promoting such ideas, just like no other digital store does either.
Yet, it is a big implied part of their business model.

Until you have a redundant copy of your installers that is independent of your point of purchase, there isn't much of an advantage to purchasing DRM-free installers.

timppu: I just hope they won't try to block it either, as gogrepo is one of the main reasons I primarily like buying games here. The good thing about third-party scripts is that we don't have to wait for GOG to fix them, and anyone (with the knowledge) can fix the scripts, add new features, etc.
Yes, the benefits of open-source in a nutshell.

Otherwise, I didn't get the impression they were blocking it so much as the impression they didn't care about it (post-Gogdownloader, anyway). Do you recall them making any kind of official statement about their intentions for their network API? I don't.

timppu: In a perfect world, GOG would offer an open-source mass-download client that relies on peer-to-peer protocols for downloading, so mass-downloaders wouldn't necessarily put pressure on GOG's download servers, and it would also guarantee better download speeds for those mass-downloaders. But I don't see GOG using resources for such things either, as nice as it would be. The Humble Store comes close, though: they offer the ability to download the offline installers with any BitTorrent client, which is a great option.
Exactly, if they took a more hands-on approach, they could exert more control over how the clients behave.

Even if they just had a separate API for third-party clients, they'd be able to:
- Use separate API-key authentication, so that people wouldn't pull off weird shenanigans to try to automate authentication with the login page and cookies
- Have separate APIs for clients and rate-limit them
- Optimise the API for client access, so that clients don't have to do quite as many requests to get what they need from the server
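Server-side rate limiting of such a client API is commonly implemented with a token bucket. A minimal illustrative sketch (nothing like this is known to exist in GOG's API):

```python
import time

class TokenBucket:
    """Allow roughly `rate` requests per second, with bursts up to `capacity`."""

    def __init__(self, rate, capacity, now=time.monotonic):
        self.rate = float(rate)
        self.capacity = float(capacity)
        self.tokens = float(capacity)
        self.now = now            # injectable clock, useful for testing
        self.last = now()

    def allow(self):
        t = self.now()
        # refill tokens for the elapsed time, capped at capacity
        self.tokens = min(self.capacity, self.tokens + (t - self.last) * self.rate)
        self.last = t
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True           # request admitted
        return False              # request should be rejected or delayed
```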

Timboli: Another reason, of course, was my belief that any really smart person would have been backing up their games from GOG from the get-go, not letting a huge backlog build up, so I did not have much sympathy for those who didn't do that ... especially if it resulted in hundreds of games not backed up. I have never understood why someone would spend that kind of money and not ensure their purchases are backed up offline. I mean, isn't that the whole point of GOG ... anything else is just an added extra.
Honestly, given the state of things, I don't think you can blame most users for not being as diligent as they should be.

Let's look at the state of things. There are three clients I know of:
- One looks pretty mature, but only runs on Linux and requires you to install several dependencies
- One runs on both Linux and Windows, but you have to figure out which fork to use (not the original repo, which is no longer maintained), you still have to install several dependencies, and the login can be quite finicky
- One is a self-contained binary that runs on both Linux and Windows, but you need to do some fairly low-level manipulation of the cookie before you can even use it

All clients require some ramp-up to use properly on the command line. Two of them have optional GUIs (should the potential user do enough research to find out they exist), though those GUIs only run on Windows.

I think that, as far as community efforts go (we're all doing what we can with the time we have and the limitations the GOG API imposes on us), it's fine, but I wouldn't yet label these well-polished solutions (by commercial standards) ready for broader mainstream consumption.
phaolo: It's already absurd that we have to rely on third-party scripts for that.
timppu: I don't think it is "absurd", because it is not necessarily in GOG's best interests that people routinely mass-download all their games. So it is understandable that GOG isn't really promoting such ideas, just like no other digital store does either.
Hmm, last I checked, it was only GOG Galaxy that was unable to automatically download only updates to your library of offline installers (disclaimer: it's been a while since I checked, but I have no reason to believe they've learned anything since then). Even the old gogdownloader was smarter than that (and it had checksumming of individual chunks which is not available with the current APIs).

So, yes, it does appear that they advocate that bandwidth-hogging insanity over solutions which minimise bandwidth use.
Post edited February 19, 2022 by mvscot
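Per-chunk checksumming of the kind the old gogdownloader offered is straightforward once the expected digests are known. This sketch assumes fixed-size chunks and a plain list of md5 hex digests; it does not reproduce the real GOG XML format:

```python
import hashlib

def chunk_md5s(path, chunk_size):
    """Yield the md5 hex digest of each fixed-size chunk of a file."""
    with open(path, "rb") as f:
        while True:
            chunk = f.read(chunk_size)
            if not chunk:
                break
            yield hashlib.md5(chunk).hexdigest()

def bad_chunk_indices(path, expected, chunk_size):
    """Return indices of chunks whose digest does not match `expected`,
    so only those byte ranges would need re-downloading."""
    actual = list(chunk_md5s(path, chunk_size))
    if len(actual) != len(expected):
        return list(range(max(len(actual), len(expected))))
    return [i for i, (a, e) in enumerate(zip(actual, expected)) if a != e]
```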
mrkgnao: I misunderstood. I thought "my downloader" meant "my copy of gogrepo".

Still, since it's such a small change, can you post here the changes you made in your downloader (before and after of the two lines)? I (or someone else) may be able to apply it to gogrepo and share the fix here.
racofer: I think I've got it.

The issue is with how gogrepo grabs the .xml file with the md5 for each file in a game's manifest. If you check line 671, it generates a URL for the .xml file based on the URL of each file it downloads (.exe, .bin, etc.). Something changed on GOG's end, and now this no longer works.

I have managed to get the correct .xml file by changing the following at line 671:

tmp_md5_url = response.url.replace('?', '.xml?')

change it to:

tmp_md5_url = response.url + '.xml'

I have not yet tested for my entire library, but some minor testing with some random games seemed to work.

Edit: small correction, as I had some other changes on my script. The correct line is 671 and not 672 as I had initially mentioned.
Thank you for this!

I wanted to get you something off your wishlist as a thanks, but it's not public. So, here is my hearty public thanks for somehow tracking this down and posting the fix! :)
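For what it's worth, the two expressions differ only when the final (post-redirect) URL still carries a query string, which is presumably what changed on GOG's end. A quick illustration with a made-up URL:

```python
# hypothetical post-redirect download URL with no query string
url = "https://example-cdn.gog.com/secure/setup_some_game_1.0.exe"

old_md5_url = url.replace('?', '.xml?')  # no '?' present, so this is a no-op
new_md5_url = url + '.xml'               # always appends .xml

# the old expression only did something while the URL had a query string,
# e.g. "...exe?token=abc" -> "...exe.xml?token=abc"
```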
racofer: I think I've got it.

The issue is with how gogrepo grabs the .xml file with the md5 for each file in a game's manifest. If you check line 671, it generates a URL for the .xml file based on the URL of each file it downloads (.exe, .bin, etc.). Something changed on GOG's end, and now this no longer works.

I have managed to get the correct .xml file by changing the following at line 671:

tmp_md5_url = response.url.replace('?', '.xml?')

change it to:

tmp_md5_url = response.url + '.xml'

I have not yet tested for my entire library, but some minor testing with some random games seemed to work.

Edit: small correction, as I had some other changes on my script. The correct line is 671 and not 672 as I had initially mentioned.
MatrixRaven: Thank you for this!

I wanted to get you something off your wishlist as a thanks, but it's not public. So, here is my hearty public thanks for somehow tracking this down and posting the fix! :)
Hello

This is my first time posting, but after I made this change I was able to update, though I had issues trying to download. So I looked at the code and made an extra change at line 641, very similar to the change in this post.

So I thought I would share it for people having issues with the download. Again, I'm not an expert, just fiddling around, but it seemed to work for me.

So, at line 641, I changed:
chunk_url = response.url.replace('?', '.xml?')

to:
chunk_url = response.url + '.xml'
racofer: I have managed to get the correct .xml file by changing the following at line 671:

tmp_md5_url = response.url.replace('?', '.xml?')

change it to:

tmp_md5_url = response.url + '.xml'
Thanks for this. I checked the thread after the import command stopped working and my output file kept saying I still needed to download games I had just downloaded.

In the gogrepo.py I have, the line was 668.

After doing an update command with those specific games and another import attempt, it looks like some files weren't downloaded in the earlier run.

Edit:
Keep seeing OverflowError: size does not fit in an int
Post edited February 20, 2022 by SKARDAVNELNATE
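The OverflowError: size does not fit in an int typically appears when a single read of more bytes than a C int can represent is requested from an SSL socket, which can happen with multi-gigabyte installers. The usual defence, sketched generically here (this is not gogrepo's actual code), is to cap the size of each read:

```python
MAX_READ = 64 * 1024 * 1024  # cap each read well below the 2 GiB C-int limit

def copy_stream(src, dst, total=None):
    """Copy from file-like src to dst in bounded reads, so that no single
    read request exceeds MAX_READ bytes; return the number of bytes copied."""
    copied = 0
    while True:
        want = MAX_READ if total is None else min(MAX_READ, total - copied)
        if want == 0:
            break
        chunk = src.read(want)
        if not chunk:
            break
        dst.write(chunk)
        copied += len(chunk)
    return copied
```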
SKARDAVNELNATE: Thanks for this. I checked the thread after the import command stopped working and my output file kept saying I still needed to download games I had just downloaded.

In the gogrepo.py I have, the line was 668.

After doing an update command with those specific games and another import attempt, it looks like some files weren't downloaded in the earlier run.

Edit:
Keep seeing OverflowError: size does not fit in an int
Yes, same issue here. Glad it's not just me. Not glad that it's causing issues for all of us getting our offline installers backed up.
Tried Bart_geens' fix. Waiting for another download run to finish to see if that resolved it.

Edit:
Failed to download 2 files.

Here's an excerpt from the log.

19:41:49 | download setup_shardlight_2.3_(53621).exe
19:41:49 | ------------------------------------------------------------
19:41:49 | preallocating '614634656' bytes for 'C:\GOGrepo\zzDownload\!downloading\shardlight\setup_shardlight_2.3_(53621).exe'
19:41:55 |
Traceback (most recent call last):
File "C:\Users\<User>\AppData\Local\Programs\Python\Python37\lib\site-packages\urllib3\response.py", line 295, in _decode
data = self._decoder.decompress(data)
File "C:\Users\<User>\AppData\Local\Programs\Python\Python37\lib\site-packages\urllib3\response.py", line 77, in decompress
ret += self._obj.decompress(data)
zlib.error: Error -3 while decompressing data: incorrect header check

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "C:\Users\<User>\AppData\Local\Programs\Python\Python37\lib\site-packages\requests\models.py", line 749, in generate
for chunk in self.raw.stream(chunk_size, decode_content=True):
File "C:\Users\<User>\AppData\Local\Programs\Python\Python37\lib\site-packages\urllib3\response.py", line 465, in stream
data = self.read(amt=amt, decode_content=decode_content)
File "C:\Users\<User>\AppData\Local\Programs\Python\Python37\lib\site-packages\urllib3\response.py", line 437, in read
data = self._decode(data, decode_content, flush_decoder)
File "C:\Users\<User>\AppData\Local\Programs\Python\Python37\lib\site-packages\urllib3\response.py", line 300, in _decode
"failed to decode it." % content_encoding, e)
urllib3.exceptions.DecodeError: ('Received response with content-encoding: gzip, but failed to decode it.', error('Error -3 while decompressing data: incorrect header check'))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "gogrepo.py", line 1870, in worker
chunk_tree = fetch_chunk_tree(response,downloadSession)
File "gogrepo.py", line 639, in fetch_chunk_tree
chunk_response = request(session,chunk_url)
File "gogrepo.py", line 229, in request
response = session.get(url, params=args,stream=stream,timeout=HTTP_TIMEOUT)
File "C:\Users\<User>\AppData\Local\Programs\Python\Python37\lib\site-packages\requests\sessions.py", line 525, in get
return self.request('GET', url, **kwargs)
File "C:\Users\<User>\AppData\Local\Programs\Python\Python37\lib\site-packages\requests\sessions.py", line 512, in request
resp = self.send(prep, **send_kwargs)
File "C:\Users\<User>\AppData\Local\Programs\Python\Python37\lib\site-packages\requests\sessions.py", line 662, in send
r.content
File "C:\Users\<User>\AppData\Local\Programs\Python\Python37\lib\site-packages\requests\models.py", line 827, in content
self._content = b''.join(self.iter_content(CONTENT_CHUNK_SIZE)) or b''
File "C:\Users\<User>\AppData\Local\Programs\Python\Python37\lib\site-packages\requests\models.py", line 754, in generate
raise ContentDecodingError(e)
requests.exceptions.ContentDecodingError: ('Received response with content-encoding: gzip, but failed to decode it.', error('Error -3 while decompressing data: incorrect header check'))
Post edited February 20, 2022 by SKARDAVNELNATE
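The incorrect header check error above usually means the server labelled the body Content-Encoding: gzip but sent bytes that are not actually gzip (or were already decompressed somewhere along the way). One defensive workaround, as a generic sketch, is to check for the two gzip magic bytes before attempting decompression:

```python
import gzip

GZIP_MAGIC = b"\x1f\x8b"  # every gzip stream starts with these two bytes

def maybe_gunzip(data):
    """Decompress only if the payload really looks like gzip; otherwise
    return it unchanged instead of raising a decode error."""
    if data[:2] == GZIP_MAGIC:
        return gzip.decompress(data)
    return data
```

Another common workaround is to send an Accept-Encoding: identity request header so the server does not compress the response at all.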
timppu: I don't think it is "absurd", because it is not necessarily in GOG's best interests that people routinely mass-download all their games. So it is understandable that GOG isn't really promoting such ideas, just like no other digital store does either.
mvscot: Hmm, last I checked, it was only GOG Galaxy that was unable to automatically download only updates to your library of offline installers (disclaimer: it's been a while since I checked, but I have no reason to believe they've learned anything since then). Even the old gogdownloader was smarter than that (and it had checksumming of individual chunks which is not available with the current APIs).

So, yes, it does appear that they advocate that bandwidth-hogging insanity over solutions which minimise bandwidth use.
That is not what I was talking about.

Neither Galaxy nor the old gog downloader had an option to download ALL your games (or updates) with one "click". If you wanted to do that, you had (and still have) to click through all your games in your GOG library (in my case, 2242 games at the moment).

That is what gogrepo is for. With one (or a few) commands, it checks what has changed or has been added since the last time you used the tool, and then downloads it all, instead of you going through 2242 games manually, trying to figure out which have changed and which haven't, especially since GOG's own update flag works erratically, so you can't really depend on it either.
Timboli: Another reason, of course, was my belief that any really smart person would have been backing up their games from GOG from the get-go
I mostly agree with you (about the DRM-free benefits of purchasing from the GOG store), but there are obviously several reasons why people don't do it (especially as their GOG library grows):

- Without 3rd-party tools like gogrepo, it is a pain in the arse to download "all your games", and especially to track the updates, i.e. which games you should download again. I felt it was a pain in the arse already when I had only a few hundred games on GOG; I lost track of the updates even back then, when trying to do it manually and with the old GOG downloader client.

- People don't have enough free hard drive space to keep local copies of all their GOG games. I have divided my GOG game installers across two 5TB HDDs, so 10TB is reserved for them for now. Let's see how long those remain enough...

- People still believe GOG will be around, so they are not in a hurry to back up their games.

- Some simply don't necessarily care enough, at least for now. I'm not sure if they'd start caring if GOG were closing down. I've seen some use the argument "If GOG ever closes down, I guess I'll download the GOG games I lost, and still care for, from hazy pirate sites"... I guess there is some logic to that argument, but then I am unsure what the benefit of buying from GOG is in that case... but whatever, to each his own.

When it comes to your limitations, forced rest periods, etc., I just use common sense myself. I do not try to keep my local GOG library 100% up to date all the time, but I run gogrepoc (getting all the updates and new games that have appeared in between) about once every two or three months, and I avoid doing it during busy periods, like big sales.

That is enough for me. If I want to install and play some new game that I haven't downloaded with gogrepo yet, I might download it separately with my browser.

I've read of some others having set up automatic systems where they get the updates and new games automatically, e.g. every day, and frankly I feel that is excessive.
timppu: That is not what I was talking about.

Neither Galaxy nor the old gog downloader had an option to download ALL your games (or updates) with one "click". If you wanted to do that, you had (and still have) to click through all your games in your GOG library (in my case, 2242 games at the moment).
OK, maybe we were making slightly different points, both valid.

I remember gogdownloader being "good enough"; I thought I remembered it having a "select all" option (obviously, it would only update as necessary; either that, or I could just select all games with update flags, which wasn't many if I ran it often enough), but it's been a while (my library is currently just 99 games more than yours, so we're in the same boat)...

Galaxy, on the other hand, would just blindly re-download everything you selected.

GOG support (after a lengthy exchange in which I refuted their claim of the feature parity I was asking about) literally asked me "why would you ever want to download all your games?". Duh!
mvscot: I remember gogdownloader being "good enough"; I thought I remembered it having a "select all" option
I think it had a "select all" option for one game at a time, i.e. you didn't have to click every partial file or extra goodies file separately to add them to your download queue (like you have to do with e.g. web browser downloads).

But I do not think it ever had an option to add ALL your games to the download queue, with one click. If you had 1000 games, you had to click (more than) 1000 times to add them all into the download queue.

Also, I don't have particularly good memories of the tool. It worked fine... when it worked. But specifically in mass-download situations, i.e. where I'd add dozens or even hundred(s) of games to the download queue and come back to check it many hours later or the next morning... quite often the client had failed, all the downloads in the queue were greyed out, etc. The only way to proceed from there was to kill the application, restart it, and add all the games to the download queue again, one by one.

I guess it was still better than mere browser downloads, where you had to click each file separately to add it to the download queue... but not by much. gogrepo was a godsend; I was so happy I didn't have to do it all manually all over again.
Post edited February 20, 2022 by timppu