I have added ymgve's "extra slashes" solution (see here) to my copy of gogrepoc. It solved all my current MD5 XML issues.

I didn't try to add it to the GitHub repository because:
(a) it's a hack
(b) when I retired, I swore I would never use git ever again

So, if you want it, you can add it to your own local copy.

To do so:
1) Search for the following line:
tmp_md5_url = append_xml_extension_to_url_path(response.url)
2) Add the following two lines after it (with the same indentation!):
extra_slashes = '/' * (datetime.datetime.today().minute + 2) ### RS: Setting replacement string based on current minute 09/06/22
tmp_md5_url = extra_slashes.join(tmp_md5_url.rsplit('/',1)) ### RS: Adding extra final slashes to bypass GOG's XML issues 09/06/22
3) Save

What it does is add extra slashes based on the current minute (i.e. between 1 and 60 extra slashes), so if you have an XML failure on a specific game, wait a minute and try just that game again.
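If you want to see the effect in isolation before patching the script, here is a minimal standalone sketch of what those two lines do; the URL below is made up purely for illustration, and only the two middle lines are the actual fix:

import datetime

# A made-up MD5 XML URL, just for illustration (not a real GOG URL).
tmp_md5_url = "https://example.com/downloads/setup_game_1.0.exe.xml"

# The two lines from the fix: build a run of slashes whose length depends on
# the current minute, then swap it in for the last '/' of the URL.
extra_slashes = '/' * (datetime.datetime.today().minute + 2)
tmp_md5_url = extra_slashes.join(tmp_md5_url.rsplit('/', 1))

# At minute 17 this prints the same URL with 19 slashes before the filename.
print(tmp_md5_url)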

I had 6 games with outstanding XML failures (most had been failing for weeks). I ran gogrepoc on these 6 games with the fix and 5 were resolved. I ran it on the 6th game again a minute or two later and it was resolved too.

This handles failures during update. There could also be failures during download, but I have very rarely seen any. If you have those, you can use a similar solution there too. I haven't tried it on real-life failures, but I did add it to my code nonetheless and it doesn't seem to hurt anything.

To do that (optional):
1) Search for the following line:
chunk_url = append_xml_extension_to_url_path(response.url)
2) Add the following two lines after it (with the same indentation!):
extra_slashes = '/' * (datetime.datetime.today().minute + 2) ### RS: Setting replacement string based on current minute 09/06/22
chunk_url = extra_slashes.join(chunk_url.rsplit('/',1)) ### RS: Adding extra final slashes to bypass GOG's XML issues 09/06/22
3) Save

Thank you, ymgve.

P.S. I would appreciate a code review, if someone is up to it.
Post edited June 09, 2022 by mrkgnao
Thanks mrkgnao, I'm going to give that a go.

Shouldn't the ### just be a single # for it to be a comment?
I wonder who still buys here; each year they make everything more annoying.
E.g., why ruin this downloader...
ikrananka: Thanks mrkgnao, I'm going to give that a go.

Shouldn't the ### just be a single # for it to be a comment?
I believe any number of # would do. I use three as a convention in gogrepoc, so that it's easier for me to search for my own comments (I have made a few other QoL changes elsewhere in the script).
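For illustration, a quick sketch of why the number of # characters doesn't matter in Python (the print line is just filler):

# a normal comment
## still a comment -- the extra '#' characters are just part of the comment text
### RS: gogrepoc-style marker comment, easy to grep for later
print("hello")  # everything after an unquoted '#' on a line is ignored by Python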
Post edited June 09, 2022 by mrkgnao
ikrananka: Thanks mrkgnao, I'm going to give that a go.

Shouldn't the ### just be a single # for it to be a comment?
mrkgnao: I believe any number of # would do. I use three as a convention in gogrepoc, so that it's easier for me to search for my own comments (I have made a few other QoL changes elsewhere in the script).
Ah didn't know that - thanks.
mrkgnao: I have added ymgve's "extra slashes" solution (see here) to my copy of gogrepoc. It solved all my current MD5 XML issues.
I did some further testing on my implementation of ymgve's fix.

I ran a full update on my 1900+-game library (~3.5 hours). This yielded 14 games with one or more XML failures. I then ran another update on just these 14 games (using the -ids flag) without changing anything. All 14 games passed without any failure.

Looks good, I'd say.
mrkgnao: I did some further testing on my implementation of ymgve's fix.

I ran a full update on my 1900+-game library (~3.5 hours). This yielded 14 games with one or more XML failures. I then ran another update on just these 14 games (using the -ids flag) without changing anything. All 14 games passed without any failure.

Looks good, I'd say.
Cool, do you have some fork or something that we can check?
mrkgnao: I did some further testing on my implementation of ymgve's fix.

I ran a full update on my 1900+-game library (~3.5 hours). This yielded 14 games with one or more XML failures. I then ran another update on just these 14 games (using the -ids flag) without changing anything. All 14 games passed without any failure.

Looks good, I'd say.
blotunga: Cool, do you have some fork or something that we can check?
It's available here (together with an excuse as to why there is no fork):
https://www.gog.com/forum/general/gogrepopy_python_script_for_regularly_backing_up_your_purchased_gog_collection_for_full_offline_e/post3167
Post edited June 10, 2022 by mrkgnao
mrkgnao: (b) when I retired, I swore I would never use git ever again
Out of curiosity, what code versioning solution do you prefer to use in its stead?

Either way, at this point, it has become such a world-wide standard that I just can't imagine working with anything else (even if another solution were much better, if nobody was using it or integrating with it, what would be the point?).
Post edited June 10, 2022 by Magnitus
mrkgnao: (b) when I retired, I swore I would never use git ever again
Magnitus: Out of curiosity, what code versioning solution do you prefer to use in its stead?
For my own private projects: TortoiseSVN.

When I worked in the industry, I had occasion to try a few (before git became the standard). My relative favourite (despite its drawbacks) was AccuRev.

Magnitus: Either way, at this point, it has become such a world-wide standard that I just can't imagine working with anything else (even if another solution were much better, if nobody was using it or integrating with it, what would be the point?).
It depends on whom you're working with. I have only ever worked on projects with either:
1) people within the same company I was in, in which case all that mattered was the standard within the company (luckily for me, git was only used in my last couple of years in the industry).
2) myself alone, in which case all that matters is my own taste.

I have never worked on open-source distributed projects, and given that git is indeed the standard, I probably never will.
mrkgnao: [..] when I retired, I swore I would never use git ever again
[..] I have never worked on open-source distributed projects, and given that git is indeed the standard, I probably never will.
But why is git so bad?
mrkgnao: [..] when I retired, I swore I would never use git ever again
[..] I have never worked on open-source distributed projects, and given that git is indeed the standard, I probably never will.
phaolo: But why is git so bad?
It's not "so bad". A lot of people seem to like it.

But in my opinion and from my experience, it is poorly designed, poorly documented and much too easy to accidentally misuse. I have seen enough corrupted code bases and enough delayed releases that it's not something that I want to work with.
Post edited June 10, 2022 by mrkgnao
mrkgnao: It's not "so bad". A lot of people seem to like it.

But in my opinion and from my experience, it is poorly designed, poorly documented and much too easy to accidentally misuse. I have seen enough corrupted code bases and enough delayed releases that it's not something that I want to work with.
I don't have a lot of experience with other version control systems. I used one in 2008-2009 (I don't remember the name), then learned some Mercurial and finally learned Git (because Git won).

I know beginners have tripped over some things and it has led to corrupted code bases.

I think that nowadays, git tools with a GUI like GitHub, Bitbucket, GitLab or Gitea have helped a lot to keep beginners from tripping up your codebase (you can set branch protection rules and code review requirements that prevent them from messing with your git history).

Otherwise, I've mastered a subset of git functionality (to be modest, maybe around a third; git is quite vast) and my gitflows have been works of surgical precision for over a decade now. Git has been a strong ally in maintaining analytically tractable codebases, not a liability. Nowadays, with gitops, you can track the vast majority of your operations through version control: reconcilers listen for changes in git repos and adjust the state of your system accordingly, so you can modify your system simply by committing changes to its code. It is a thing of beauty. Really, git is a cornerstone of my automation strategy.

One key strategy for git is this: some key branches in a git repo are authoritative (usually "main", maybe "dev", some "stable" branches if you need to maintain older versions, etc.), meaning they represent a source of truth in your system. You don't mess with the history of authoritative branches, ever. You always add to the history with additional commits, but you never change commits that were previously added to those branches in the central repository. You don't delete commits, you don't alter commits, you don't sandwich new commits between old commits. Those branches are append-only. If you follow that rule, you'll be fine. If you don't, there is a special place in hell for you.

Otherwise, beyond that, devs can push transient branches that are works in progress. To keep a sane workflow, they should always keep the work they add to those branches as an addendum to the work of the authoritative branch they eventually want to merge into (meaning they should rebase their branch onto the authoritative branch regularly). The part of the history that is specific to their branch, they can mess with as much as they like (squash commits, delete commits, whatever) up until the point when they merge it into an authoritative branch. But the base of the branch that is common with the authoritative branch is sacred. If they mess with that, they will have an unpleasant time.
Post edited June 11, 2022 by Magnitus
I guess I'm just lucky to be one of those for whom Git just "clicks".
Being old enough to have worked in DOS for several years, and never since having let go of the command line for any serious work, is no doubt part of it.

I've used Visual SourceSafe, ClearCase, Perforce and Git extensively, and touched CVS, SVN and one more whose name escapes me.
Of all of them, Git is the only one I've enjoyed working with.

The only problem is if you want to version large, changing, binary files. Git *really* doesn't scale well there.
"clone --filter=..." can help, but then you lose the great (IMHO) benefit of complete local history and resulting immediate response.

I understand teams wanting to version control huge binary assets together with their source code are better served by e.g. Perforce.
It's just an opinion, of course, but for text-only (with possibly a few, rarely changing binaries), the workflows Git enables seem absolutely spot on to me.
Even when less Git-enthusiastic colleagues have gotten themselves stuck, it's always been simple enough to untangle.
brouer: I guess I'm just lucky to be one of those for whom Git just "clicks".
Being old enough to have worked in DOS for several years, and never since having let go of the command line for any serious work, is no doubt part of it.

I've used Visual SourceSafe, ClearCase, Perforce and Git extensively, and touched CVS, SVN and one more whose name escapes me.
Of all of them, Git is the only one I've enjoyed working with.

The only problem is if you want to version large, changing, binary files. Git *really* doesn't scale well there.
"clone --filter=..." can help, but then you lose the great (IMHO) benefit of complete local history and resulting immediate response.

I understand teams wanting to version control huge binary assets together with their source code are better served by e.g. Perforce.
It's just an opinion, of course, but for text-only (with possibly a few, rarely changing binaries), the workflows Git enables seem absolutely spot on to me.
Even when less Git-enthusiastic colleagues have gotten themselves stuck, it's always been simple enough to untangle.
We just stick our large binaries in S3 (you can either use a cloud service or MinIO on-prem). You can keep several versions by putting them under differently named paths, and you can back up your S3 store in case someone messes up. You can also use a specialized repository tool for libraries and artifacts, like Nexus.
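For what it's worth, a rough sketch of that "versioned path" idea with boto3; the bucket name, key layout and version below are made up, and it assumes AWS credentials (or MinIO endpoint settings) are already configured:

import boto3

# Assumptions for illustration: a bucket named "build-artifacts" already exists
# and credentials are configured in the environment.
s3 = boto3.client("s3")

version = "1.4.2"                           # hypothetical release version
local_file = "dist/mytool-linux-amd64.bin"  # hypothetical build output

# Store each version under its own prefix so older builds stay retrievable.
s3.upload_file(local_file, "build-artifacts",
               f"mytool/{version}/mytool-linux-amd64.bin")

# Later, fetch a specific version back.
s3.download_file("build-artifacts",
                 f"mytool/{version}/mytool-linux-amd64.bin",
                 "mytool-linux-amd64.bin")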

Otherwise, nowadays, I believe most git services have accompanying functionality to manage release binaries, which covers a lot of the use cases where you want to produce binaries from a codebase (in a lot of my GitHub repos, I just have a release pipeline that generates a release with accompanying binaries, with the release version based on the git tag).

The only additional use case I can see for binaries outside of those solutions is where you have many sizeable binaries that are 50%+ identical and you only want to store the diffs to save space. For that, you would need a more specialized service, but I think that for most people, the above two solutions fit their binary use cases well.
Post edited June 11, 2022 by Magnitus