I'm very interested to see how this pans out, but meanwhile .....

A few days ago, I purchased the latest five free DLCs for DYING LIGHT, which have only just become available here.

Wanting to download those DLCs and any other updates I needed for them, I started up my GOGcli GUI program and attempted to update the manifest entry for Dying Light: The Following – Enhanced Edition. Alas, it failed with an error, and did the same on subsequent retries.

At the moment I am using an older version of gogcli.exe, so not the latest by Magnitus with his work-around.

Now I am well aware that, for a couple of months or so, many have been having such issues, and I have certainly been bitten three times or so myself.

About a month ago I started work on my own kind of possible work-around ... then I went away on holiday for a couple of weeks. In the last few days I have continued my work and now have something to test with.

The program I created is called GetGameInfo, which some of you might be aware of. It is a complementary program to my GOGcli GUI program. GetGameInfo does its own GOG SDK queries, but optionally uses gogcli.exe and a Cookie.txt file to get the file names of downloads, as well as sizes in bytes and any checksums, and creates a manifest file from the results. It is a manifest file that only contains entries for one game.

I was able to use GetGameInfo to create a manifest for Dying Light: The Following – Enhanced Edition.

That manifest does not contain any DLC entries; they need to be added to the GetGameInfo database using their own game page URLs. That also works, grabbing the required Game ID for each.

So essentially I have several manifest files, which could be combined, but I haven't bothered at this point. All of the downloadable file entries are present, with file names and sizes and (Extras aside) checksums.

I have now adapted my GOGcli GUI program to use external manifest files, one game per file. And it can download the game files and verify them.

I am still at the testing stage for this, but all works well so far.

Perhaps someone among you might find this information interesting and maybe even helpful.

You can access what I have been working on via the following link.

https://www.gog.com/forum/general/gogcli_gui_frontend_downloader_and_validator/post126
Post edited May 12, 2022 by Timboli
avatar
Magnitus: Around 2500 games ...
Man you have a lot of games at GOG. I am currently about 3 short of 1500 games at GOG, but over 200 of those would be demos and prologues.
Well my work-around method was working, fully, now it is only partial ... at least with the main DYING LIGHT game.

It is almost like someone at GOG read my post, then decided to play games.

I can no longer get the file name and size and checksum for the following, and yet before I posted about my solution a day ago, I could get them. And I have a backup manifest to prove it.

/downloads/dying_light_the_following_enhanced_edition/en1installer1
/downloads/dying_light_the_following_enhanced_edition/en1installer6
/downloads/dying_light_the_following_enhanced_edition/en1installer7

All the rest I am still getting the detail for, including extras.

Truly annoying, and makes me totally suspicious.
Why am I suspicious?
Well, a few days ago, maybe several, a bunch of new DLCs were added for the game, five of which were free. Nothing has changed since then, as far as the game's version number goes, and no more new DLC has appeared. So why have some changes occurred out of the blue since then that cause an issue for my method? One imagines they would be too busy working on updates for other games, so it begs the question: why this game?

Is this another attempt to push us all into using Galaxy?

P.S. I've also continued to make good improvements to both GetGameInfo and GOGcli GUI, to support my method, including being able to select multiple manifests in one go, to populate the download list. In fact I downloaded all the DLCs for DYING LIGHT using that ... worked great.
Post edited May 13, 2022 by Timboli
avatar
Timboli: Well my work-around method was working, fully, now it is only partial ... at least with the main DYING LIGHT game.

It is almost like someone at GOG read my post, then decided to play games.

I can no longer get the file name and size and checksum for the following, and yet before I posted about my solution a day ago, I could get them. And I have a backup manifest to prove it.

/downloads/dying_light_the_following_enhanced_edition/en1installer1
/downloads/dying_light_the_following_enhanced_edition/en1installer6
/downloads/dying_light_the_following_enhanced_edition/en1installer7

All the rest I am still getting the detail for, including extras.

Truly annoying, and makes me totally suspicious.
Why am I suspicious?
Well, a few days ago, maybe several, a bunch of new DLCs were added for the game, five of which were free. Nothing has changed since then, as far as the game's version number goes, and no more new DLC has appeared. So why have some changes occurred out of the blue since then that cause an issue for my method? One imagines they would be too busy working on updates for other games, so it begs the question: why this game?

Is this another attempt to push us all into using Galaxy?

P.S. I've also continued to make good improvements to both GetGameInfo and GOGcli GUI, to support my method, including being able to select multiple manifests in one go, to populate the download list. In fact I downloaded all the DLCs for DYING LIGHT using that ... worked great.
It's probably not suspicious; it's just that the caching is bad. While no one else was doing that, you were getting a fresh chance, but now that other people are, you have a higher chance of hitting a bad cache.
avatar
Kalanyr: It's probably not suspicious; it's just that the caching is bad. While no one else was doing that, you were getting a fresh chance, but now that other people are, you have a higher chance of hitting a bad cache.
No worries, you sound like you know more about such things than I do.

Alas, the older manifest I had kept as a backup had been created before I did my bugfix for bytes, so while the checksums are proving to be correct, I got a fail for those three BIN files when it came to the size check. I will have to double-check those now using UnRAR in my GOGPlus Download Checker program, and then amend that manifest and my database records, using what Windows Properties reports, if all is okay.

No idea why there is a disparity in values sometimes, between the two different methods of getting bytes.

EDIT
UnRAR could not process the BIN files, so 7-Zip was used instead, and no errors occurred.
Post edited May 14, 2022 by Timboli
avatar
Geralt_of_Rivia: Twelve hours???!!!!?!?!

And I thought 2:15 was slow...

How big is your library?
avatar
Magnitus: Around 2500 games (I have played about 20% of them to various degrees so far, though admittedly I'm unsure if I'll play them all; it's a toss-up at this point, which has led me to become increasingly selective about my purchases here). The bulk of that time was spent computing md5 checksums for files with bad metadata.
That's quite a lot.

But I'm closing in on 1500 and I'm nowhere near 12 hours. You should really rethink your algorithm.
avatar
Geralt_of_Rivia: That's quite a lot.

But I'm closing in on 1500 and I'm nowhere near 12 hours. You should really rethink your algorithm.
It's logistically the simplest algorithm that gives a universal data consistency guarantee.

If you have something faster, you're doing network hops in the cloud or are just discarding the idea of validating your downloads' checksums.

avatar
Kalanyr: It's probably not suspicious; it's just that the caching is bad. While no one else was doing that, you were getting a fresh chance, but now that other people are, you have a higher chance of hitting a bad cache.
That would be strange, though. I would expect that if people scraped the cache in the CDN until the data was good (if that is what is happening), it would reduce the amount of bad data other people got from the cache, not increase it.
Post edited May 14, 2022 by Magnitus
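For anyone curious what the checksum-validation step discussed above actually involves, here is a minimal sketch in Python (an illustration only — gogcli itself is written in Go, and this is not its actual code): the file is streamed through md5 in chunks, so multi-gigabyte installers never need to fit in memory, which is also why validating a large library takes so long.

```python
import hashlib

def md5_of_file(path, chunk_size=1024 * 1024):
    """Compute the md5 of a file by streaming it in 1 MiB chunks,
    so large installers never have to fit in memory."""
    digest = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()
```

A downloader would compare that hexdigest against the checksum recorded in the manifest and re-download the file on a mismatch.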
avatar
Geralt_of_Rivia: That's quite a lot.

But I'm closing in on 1500 and I'm nowhere near 12 hours. You should really rethink your algorithm.
avatar
Magnitus: It's logistically the simplest algorithm that gives a universal data consistency guarantee.

If you have something faster, you're doing network hops in the cloud or are just discarding the idea of validating your downloads' checksums.

avatar
Kalanyr: It's probably not suspicious; it's just that the caching is bad. While no one else was doing that, you were getting a fresh chance, but now that other people are, you have a higher chance of hitting a bad cache.
avatar
Magnitus: That would be strange, though. I would expect that if people scraped the cache in the CDN until the data was good (if that is what is happening), it would reduce the amount of bad data other people got from the cache, not increase it.
No, I'm not network hopping at the moment and I collect all md5 from the XML.

The trick we are using is to add extra slashes at the last slash in the path to create 'new' URLs that point to the same resource on the server. But since such a URL hasn't been used before, the result wasn't cached, so we get an uncached, working XML. But if many people use the same method, then there is a chance the same path was already used before and the result (possibly badly) cached.
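The slash trick described above can be sketched like this in Python (an illustration of the idea, not anyone's actual tool; `slash_variants` is a hypothetical helper name): each doubled slash yields a URL that most servers resolve to the same resource, but that a CDN caches under a separate key.

```python
from urllib.parse import urlsplit, urlunsplit

def slash_variants(url, count=3):
    """Build 'new' URLs by inserting extra slashes before the last
    path segment; each variant points at the same resource on the
    server but gets its own cache entry at the CDN."""
    parts = urlsplit(url)
    head, _, tail = parts.path.rpartition("/")
    return [
        urlunsplit(parts._replace(path=head + "/" * (n + 1) + tail))
        for n in range(1, count + 1)
    ]
```

A downloader would try each variant in turn until one returns metadata that passes a sanity check.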
avatar
Geralt_of_Rivia: No, I'm not network hopping at the moment and I collect all md5 from the XML.

The trick we are using is to add extra slashes at the last slash in the path to create 'new' URLs that point to the same resource on the server. But since such a URL hasn't been used before, the result wasn't cached, so we get an uncached, working XML. But if many people use the same method, then there is a chance the same path was already used before and the result (possibly badly) cached.
I see, so then it doesn't seem to scale all that well, and we'll soon get back to where we started once everyone has incorporated it into their tools, won't we?

Now that you're telling me this, I'm a bit disappointed. It feels like an exercise in futility.

Anyways, I'll still try it and I'll see.

EDIT: Thinking about it though, assuming that each different URL is cached separately and caching gets it right some of the time, I guess there is still a chance that at least one of the URLs is valid even if they are all cached.
Post edited May 14, 2022 by Magnitus
Well, I tried it (both trying to get the modified content and also making a request with the modified URL before falling back to the original URL) and I'm afraid I didn't see much of a difference either way: https://github.com/Magnitus-/gogcli/commit/5b4c32a896af3c9ddf86f49bd30fe5a8cd340e0c


The one thing I observed before is that sometimes, if you retry a lot, bad metadata will get fixed on its own, so perhaps it is just that.
avatar
Magnitus: The one thing I observed before is that sometimes, if you retry a lot, bad metadata will get fixed on its own, so perhaps it is just that.
See my posts here:
https://www.gog.com/forum/general/gogrepopy_python_script_for_regularly_backing_up_your_purchased_gog_collection_for_full_offline_e/post3073
https://www.gog.com/forum/general/gogrepopy_python_script_for_regularly_backing_up_your_purchased_gog_collection_for_full_offline_e/post3093
https://www.gog.com/forum/general/gogrepopy_python_script_for_regularly_backing_up_your_purchased_gog_collection_for_full_offline_e/post3100

I continue seeing the same behaviour, now as before.
Post edited May 17, 2022 by mrkgnao
avatar
mrkgnao: I continue seeing the same behaviour, now as before.
I will keep trying then with - Dying Light: The Following – Enhanced Edition

I first discovered the issue on May 9, 2022 when trying to update my manifest for that game, after buying 5 DLCs, and it still gives the same error just now.

So I will wait and see how long it takes to be corrected.

Thanks for the feedback about that.

P.S. If it is a cache issue, I wonder if there is some other way to refresh that or rebuild it ... other than what has been suggested with the URL trick thing.
avatar
Magnitus: EDIT: Thinking about it though, assuming that each different URL is cached separately and caching gets it right some of the time, I guess there is still a chance that at least one of the URLs is valid even if they are all cached.
That's exactly what I am thinking as well.
avatar
Timboli: P.S. If it is a cache issue, I wonder if there is some other way to refresh that or rebuild it ... other than what has been suggested with the URL trick thing.
Append a query string to the end of whatever the url is

https://gog-cdn-lumen.secure2.footprint.net/token=nva=XXXXXXXXXXXXXXXXXXXXXXXXXXXX/secure/offline/this/that/theotherxxxxxxxxxxxxxxxxxxxx.exe.xml?timestamp=1652962228
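The suggestion above is a classic cache-busting query string. A minimal sketch of the idea in Python (the `cache_bust` helper name is my own, not part of any of the tools discussed here):

```python
import time

def cache_bust(url):
    """Append a throwaway timestamp query parameter so the CDN
    treats the request as a never-before-seen URL and skips its
    cached (possibly bad) copy."""
    sep = "&" if "?" in url else "?"
    return f"{url}{sep}timestamp={int(time.time())}"
```

Because the timestamp changes every second, each request gets a fresh cache key; whether a given CDN honours the query string as part of the key is configuration-dependent, so this may or may not help in practice.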
avatar
lupineshadow: Append a query string to the end of whatever the url is

https://gog-cdn-lumen.secure2.footprint.net/token=nva=XXXXXXXXXXXXXXXXXXXXXXXXXXXX/secure/offline/this/that/theotherxxxxxxxxxxxxxxxxxxxx.exe.xml?timestamp=1652962228
Thanks for the info, but unfortunately I don't have the knowledge to use it.

When I query details for a download, it is by using Magnitus' gogcli.exe, in the following manner.

gogcli.exe -c Cookie.txt gog-api url-path-info -p /downloads/dying_light_the_following_enhanced_edition/en1installer0

That gives me filename, checksum and size.

It works for all the download URLs for that game, except three.

Maybe Magnitus could modify his gogcli.exe to do as you say?
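In the meantime, since the reports earlier in the thread suggest bad metadata sometimes clears up on its own after enough retries, a small retry wrapper around the gogcli.exe command quoted above might help. A hedged Python sketch (the helper and its defaults are my own invention, not part of gogcli):

```python
import subprocess
import time

def run_with_retries(cmd, attempts=5, delay_seconds=10):
    """Run a command (e.g. the gogcli.exe url-path-info query),
    retrying on a non-zero exit, since bad cached metadata
    sometimes clears up between attempts."""
    for attempt in range(attempts):
        result = subprocess.run(cmd, capture_output=True, text=True)
        if result.returncode == 0:
            return result.stdout
        if attempt < attempts - 1:
            time.sleep(delay_seconds)
    return None
```

For example: `run_with_retries(["gogcli.exe", "-c", "Cookie.txt", "gog-api", "url-path-info", "-p", "/downloads/dying_light_the_following_enhanced_edition/en1installer1"])`.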