shmerl: UPDATE: This is the one: https://www.twitch.tv/videos/45398732?t=49m48s
WinterSnowfall: Well, what he actually said is that they would *LIKE* to get the APIs documented someday
They had time to do it. I'd argue that if they had documented the API properly, some Linux client would have been made already (with support for incremental updates, etc.).
Post edited June 07, 2017 by shmerl
HunchBluntley: Do you have a link to this 'promise' you keep speaking of? Just curious.
shmerl: I have to dig it up. Look for one of the community Q&A videos. The old GOG post video was down the last time I looked it up, but I think it might be still available on Twitch.

UPDATE: This is the one: https://www.twitch.tv/videos/45398732?t=49m48s
Yeah, I have to agree with WinterSnowfall -- that was pretty far from being a promise.
Thanks for going to the effort of finding that, though. :)
shmerl: They had time to do it. I'd argue that if they had documented the API properly, some Linux client would have been made already (with support for incremental updates, etc.).
Yeah, I agree. And they also had time to fix the website, forums and other long-standing issues, but we all know the usual pacing GOG employs with things that seem important to us mere regular "tech-savvy" users :).
HunchBluntley: Yeah, I have to agree with WinterSnowfall -- that was pretty far from being a promise.
I don't see it as being that far off. What else could we expect from them, some written contract?
Post edited June 07, 2017 by shmerl
HunchBluntley: Yeah, I have to agree with WinterSnowfall -- that was pretty far from being a promise.
shmerl: I don't see it as being that far off. What else could we expect from them, some written contract?
Him having used a term like pledge, promise or word (as in, "I give you all my word"), for a start. :P
"We would like to" is not "we will", and it's certainly not "we promise to."
HunchBluntley: Him having used a term like pledge, promise or word
Semantics doesn't matter much here if, in the end, they still didn't do it. That is, I agree that waiting for them to do it looks pretty pointless.
HunchBluntley: Him having used a term like pledge, promise or word
shmerl: Semantics doesn't matter much here if, in the end, they still didn't do it. That is, I agree that waiting for them to do it looks pretty pointless.
It's not semantics. Choice of wording is important, especially in interactions with clients or customers (which that Q&A was); in this case, they seem to have chosen their wording well, and not overcommitted (for once). Not every statement of intent is a promise. I know it's disappointing, but your really wanting the API to be released does not transform their statement of intent to do so at some vague future date into a promise.
(The commitment to have a Linux version of Galaxy, on the other hand, was a bit more strongly worded, so you'd be forgiven for being miffed if that never comes out.)

So yeah, Yepoleb and the others working on this are the best bet for now. :)
Post edited June 07, 2017 by HunchBluntley
HunchBluntley: Not every statement of intent is a promise.
I view this as enough of a promise. You can disagree; it doesn't really change anything in practice, so I don't see a point in further theoretical arguing about it. Bottom line: I think there was enough time to do it, and it's not likely they'll do it in the near future.
Post edited June 07, 2017 by shmerl
Just pushed a minor update for account endpoints, mostly tags and settings.

Edit: Also hello @shmerl, I was expecting you :D
Post edited June 11, 2017 by Yepoleb
I just made public my new project called GOG Database, if you're interested in API stuff you'll probably like it: https://www.gog.com/forum/general/gog_database_a_website_that_collects_data_on_gog_games
Very nice work! I really need to do a fresh backup of my collection, and I dread doing it by hand.

Ideally, I'd do a nice GUI in Electron, but given the amount of time I have, it will probably end up being a simple command-line script.

Does anybody have the following API info (I didn't spot it in the link or while fooling around with the provided URLs)?

- How to get a list of games that got updated since you last downloaded them
- How to get a checksum of the game's installer?

I guess I could work around the first item by keeping track of the downloaded versions in a local store (and then polling all the versions for all the games every single time), but short of installing each game, I won't get around the second item.
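That local-store workaround can be sketched in a few lines of Python; the manifest layout (a flat JSON map of game id to version string) is just something I made up for illustration:

```python
import json
from pathlib import Path

MANIFEST = Path("manifest.json")

def load_manifest():
    """Load the saved {game_id: version} map, or start fresh."""
    if MANIFEST.exists():
        return json.loads(MANIFEST.read_text())
    return {}

def needs_update(manifest, game_id, remote_version):
    """True if we never downloaded this game or the version changed."""
    return manifest.get(game_id) != remote_version

def record_download(manifest, game_id, version):
    """Remember the version we just downloaded and persist the manifest."""
    manifest[game_id] = version
    MANIFEST.write_text(json.dumps(manifest, indent=2))
```

You'd still have to poll every game's version on each run, but the comparison itself is cheap.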

Edit: Hmm, it wouldn't be super efficient, but I guess I could download a game twice and make sure the checksums of both downloads match...
Post edited June 17, 2017 by Magnitus
Magnitus: - How to get a list of games that got updated since you last downloaded them
- How to get a checksum of the game's installer?

Edit: Hmm, it wouldn't be super efficient, but I guess I could download a game twice and make sure the checksums of both downloads match...
Reinventing the wheel is not a bad thing, but you can save a lot of time by using good old wheels:
https://www.gog.com/forum/general/any_md5_check_available
https://www.gog.com/forum/general/synchronization_tool_for_game_installers_and_goodies
kbnrylaec: Reinventing the wheel is not a bad thing, but you can save a lot of time by using good old wheels:
https://www.gog.com/forum/general/any_md5_check_available
https://www.gog.com/forum/general/synchronization_tool_for_game_installers_and_goodies
Nice and a Python script too. I guess I'll give that a go first :).

However, in my ideal scenario, I'd ultimately like to send the files to a database instead of directly to the filesystem (maybe PostgreSQL or MongoDB GridFS). I'm about ready to keep my backups in a replicated database and let it handle the lower-level storage, replication and data-integrity details. I've been meaning to do it for a while now.

EDIT: Great effort, but it's a shame the code is not more modular; that makes reusing parts of it for my use case a tad complicated. Thanks for the links though, they're a great reference.
Post edited June 17, 2017 by Magnitus
Magnitus: Does anybody have the following API info (I didn't spot it in the link or while fooling around with the provided URLs)?

- How to get a list of games that got updated since you last downloaded them
- How to get a checksum of the game's installer?
gogrepo.py can be used to back up your collection, if you're looking for an existing solution. Otherwise, use the products API to get the file sizes and check if they match the ones you already have. The version field is very unreliable, and hashing every installer at once would probably get you banned. There's no list of which games have been updated since you last downloaded them. The full checksum can be obtained using the chunklist for a download URL; check the docs for an example.
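A rough Python sketch of that chunklist check, assuming the XML layout the unofficial docs describe (per-chunk 'from'/'to' byte offsets with an md5 hex digest as the element text; verify against the docs before relying on it):

```python
import hashlib
import xml.etree.ElementTree as ET

def verify_against_chunklist(path, chunklist_xml):
    """Check a local installer against a GOG-style chunklist.

    Assumes each <chunk> element carries 'from'/'to' byte offsets
    (inclusive) and holds the md5 hex digest of that range as text.
    Returns True only if every chunk of the file matches.
    """
    root = ET.fromstring(chunklist_xml)
    with open(path, "rb") as f:
        for chunk in root.iter("chunk"):
            start, end = int(chunk.get("from")), int(chunk.get("to"))
            f.seek(start)
            data = f.read(end - start + 1)
            if hashlib.md5(data).hexdigest() != chunk.text.strip():
                return False
    return True
```

Comparing chunk by chunk also tells you where a file went bad, not just that it did.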


Magnitus: However, in my ideal scenario, I'd ultimately like to send the files to a database instead of directly to the filesystem (maybe PostgreSQL or MongoDB GridFS). I'm about ready to keep my backups in a replicated database and let it handle the lower-level storage, replication and data-integrity details. I've been meaning to do it for a while now.
Databases aren't made to store big chunks of binary data. MongoDB is probably even worse than PostgreSQL, because it splits them into a massive amount of tiny chunks. If you want to store files with a unique key, use a filesystem, that's what they're designed for. Metadata can be kept in json files that are much easier to manage than database columns. Whatever you do, please just don't store files in a database.
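A file-per-key store with JSON sidecars can stay very simple; here's a sketch with an illustrative layout (one directory per game id, each installer with a .json sidecar next to it):

```python
import json
from pathlib import Path

# Illustrative root directory for the backup store.
STORE = Path("backup")

def put(game_id, filename, data, metadata):
    """Store installer bytes under the game id, with a JSON sidecar."""
    game_dir = STORE / str(game_id)
    game_dir.mkdir(parents=True, exist_ok=True)
    (game_dir / filename).write_bytes(data)
    (game_dir / (filename + ".json")).write_text(json.dumps(metadata, indent=2))

def get_metadata(game_id, filename):
    """Read the sidecar back; easy to inspect or grep by hand too."""
    return json.loads((STORE / str(game_id) / (filename + ".json")).read_text())
```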
Yepoleb: gogrepo.py can be used to back up your collection, if you're looking for an existing solution. Otherwise, use the products API to get the file sizes and check if they match the ones you already have. The version field is very unreliable, and hashing every installer at once would probably get you banned. There's no list of which games have been updated since you last downloaded them. The full checksum can be obtained using the chunklist for a download URL; check the docs for an example.
I guess I can use the filesize AND the version to determine whether to update. Not a bad idea.

Thanks for the pointer concerning the checksum.

Yepoleb: Databases aren't made to store big chunks of binary data. MongoDB is probably even worse than PostgreSQL, because it splits them into a massive amount of tiny chunks. If you want to store files with a unique key, use a filesystem, that's what they're designed for. Metadata can be kept in json files that are much easier to manage than database columns. Whatever you do, please just don't store files in a database.
In my experience, filesystems do much less content-integrity checking than databases: files on the filesystem get corrupted all the time without the knowledge of the storage engine, until you try to access that particular file. Databases are of course not immune to this (they too store their files on a regular filesystem in most scenarios), but by their nature they are a lot more proactive about letting you know that your content was corrupted, at which point you can take action rather than unknowingly sit on corrupted data.
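To be fair, the kind of manual integrity check a plain filesystem needs isn't hard to write, it's just something you have to remember to run. A sketch of such a "scrub" against a stored checksum manifest:

```python
import hashlib
from pathlib import Path

def sha256_of(path):
    """Stream a file through SHA-256 without loading it all into memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(1 << 20), b""):
            h.update(block)
    return h.hexdigest()

def scrub(root, manifest):
    """Compare every file under root against {relative_path: sha256}.

    Returns the list of paths that are missing or no longer match.
    """
    bad = []
    root = Path(root)
    for rel, expected in manifest.items():
        p = root / rel
        if not p.exists() or sha256_of(p) != expected:
            bad.append(rel)
    return bad
```

A database effectively does this bookkeeping for you, which is exactly the proactivity I'm after.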

Also, overall, I have a lot of pre-existing knowledge of MongoDB (I took a bunch of online courses and have a double certification). It's relatively easy for me to add/remove replicas for storage redundancy, or to deprecate or add storage media.

For me, putting files into and retrieving them from a remote database is a lot less hassle and more portable than having to manage, replicate and move around a filesystem directory structure.

In a previous place of employment, we originally stored media files in a directory structure and that proved to be a mess to maintain. Moving it to a database proved to be a net gain for us (from there, you could remotely access files, replicate them, shard them and overall handle them just like other database documents).

Now, those were not 1 GB+ files, and you are probably right that if you want to do a lot of manipulation on the entire file, the chunks may be a drawback, but for storage and retrieval? Why not?
Post edited June 18, 2017 by Magnitus