
Onox: Hey, I have a question: the external drive I'm now using for backing up the games is formatted with NTFS, which apparently doesn't support posix_fallocate; in that case posix_fallocate seems to fall back to writing zeroes to each sector we want to preallocate. I can see that it is indeed very slow, but it's not really the speed that concerns me... isn't that physically bad for the drive?
Yes, which is why I posted some time ago that we should have a better way of doing it, because I'm in your exact same position. If you're using Linux (like I am), you could presumably mount a ramdisk to solve that issue, which is on my to-do list, but Kalanyr's argument is valid as well: this usually helps prevent (though it doesn't always prevent) fragmentation.

It's not super bad; it just means that if this is all you use the drive for, it'll wear out twice as fast as it would if this didn't happen, because it's an extra write per sector. This is an issue even for non-NTFS drives, even though the whole point of preallocating is for NTFS drives.
kohlrak:
Thank you! Yes, I'm using Linux (Fedora) too. I used RAM disks once in the past, and I have plenty of memory to do so on this computer, but how would you use one here? I'm not sure I understand. I can download to /tmp (which is already in memory), copy the files to my backup folder and then use the gogrepo "download" command, which downloads nothing because the files are already there. But is there a better way to do it?

Another issue I'm having is that "verify" doesn't verify anything.
Onox: It doesn't seem possible to directly update the manifest file; should I just delete the manifest file and use "update" again? Will this break updates for the games I have already downloaded?
Yes, if all else fails you can always simply delete the old manifest file and let "gogrepoc update" rebuild it from scratch.

And no, it will not affect your already downloaded games (I'm not quite sure what you mean by "break updates").
kohlrak:
Onox: Thank you! Yes, I'm using Linux (Fedora) too. I used RAM disks once in the past, and I have plenty of memory to do so on this computer, but how would you use one here? I'm not sure I understand. I can download to /tmp (which is already in memory), copy the files to my backup folder and then use the gogrepo "download" command, which downloads nothing because the files are already there. But is there a better way to do it?

Another issue I'm having is that "verify" doesn't verify anything.
I'm using Fedora, too. Small world.

The files download to a temp directory before later being moved to their final place. Mount the ramdisk at what would be the temp folder, but keep in mind that you'd better have enough RAM for the files, which is why I haven't done it yet. Another option is to go into the Python code and comment out the fallocate bit, so it'll just create the files on the fly, but then you have the fragmentation issues.

Another interesting idea is to have gogrepo download to a cheap flash drive that you can replace as needed. This works because gogrepo doesn't download the games directly to the folder in which they're later stored; by default the staging folder is "!downloading/".

Now if I could convince him to verify the files as they're being downloaded, instead, that'd be great. When I finally get a 64-bit system to run my server on (right now I'm using an old Dell with a Pentium 4), I could get enough RAM to hold a few files and have it use the RAM disk.

And, yes, verify does seem to do stuff. I'm guessing we're using different versions...
Post edited March 01, 2019 by kohlrak
kohlrak: Yes, which is why I posted some time ago that we should have a better way of doing it, because I'm in your exact same position. If you're using Linux (like I am), you could presumably mount a ramdisk to solve that issue, which is on my to-do list, but Kalanyr's argument is valid as well: this usually helps prevent (though it doesn't always prevent) fragmentation.
Frankly, I preferred it when gogrepo didn't do any preallocation; at the very least it should be optional.

I was happily using gogrepo for a long time without preallocation (on an NTFS USB drive), and whatever fragmentation there was never became a problem. What would the problem be anyway, reduced speed? If there was any, I never noticed it. Also, considering we are dealing mostly with quite big files here (removing old files gigabytes in size, then downloading newer versions of them), fragmentation shouldn't be that heavy, and it would affect only new or modified files anyway. Maybe it matters if someone gets anxious about there being any fragmentation at all...

And if I ever wanted to get rid of fragmentation, I could always just run a defragmentation program, or copy the files over to an empty drive and use that as the primary one. Doing so from time to time would be beneficial anyway with archival HDDs, so that all the magnetic data on the disk gets refreshed.

Anyway, I guess I can live with the current preallocation as well, even if I consider it of little or no real benefit. I don't really care about the extra writes either, as I am not using an SSD for archiving my GOG game installers.
kohlrak: Now if I could convince him to verify the files as they're being downloaded, instead, that'd be great.
Why? I prefer that verification is performed separately, because then you can do an integrity check on your GOG archive offline as well, without connecting to GOG's servers at all.
Post edited March 01, 2019 by timppu
kohlrak: Yes, which is why i posted some time ago that we should have a better way of doing it, because i'm in your exact same position. If you're using linux (like I am), you could presumably mount a ramdisk to solve that issue, which is on my to do list, but Kalanyr's argument is valid as well: this usually helps prevent (but doesn't always prevent) fragmentation.
timppu: Frankly, I preferred it when gogrepo didn't do any preallocation; at the very least it should be optional.

I was happily using gogrepo for a long time without preallocation (on an NTFS USB drive), and whatever fragmentation there was never became a problem. What would the problem be anyway, reduced speed? If there was any, I never noticed it. Also, considering we are dealing mostly with quite big files here (removing old files gigabytes in size, then downloading newer versions of them), fragmentation shouldn't be that heavy, and it would affect only new or modified files anyway. Maybe it matters if someone gets anxious about there being any fragmentation at all...

And if I ever wanted to get rid of fragmentation, I could always just run a defragmentation program, or copy the files over to an empty drive and use that as the primary one. Doing so from time to time would be beneficial anyway with archival HDDs, so that all the magnetic data on the disk gets refreshed.
My sentiment exactly.
timppu: Anyway, I guess I can live with the current preallocation as well, even if I consider it of little or no real benefit. I don't really care about the extra writes either, as I am not using an SSD for archiving my GOG game installers.
Turns out SSDs currently have a very similar lifespan to HDDs, except they don't risk the motor going out. All drives ultimately need the data to be rewritten over time, too, which further reduces the lifespan. I haven't actually worked out at how many writes, versus how often I update, it would become a problem, though.

On the flip side, gogrepo has timeout issues waiting for the preallocation to finish, because it connects to GOG before starting the preallocation, so GOG kills the hanging connection. My script for running gogrepo invokes "download" multiple times just to ensure it resumes after timing out during preallocation. GOG might end up taking exception to that, because we are establishing a bunch of hanging connections that aren't active and are effectively degrading GOG's performance (this is actually used in a type of DoS attack called Slowloris, IIRC, so if GOG ends up hiring someone overly zealous, they could convince the company to ban gogrepo on those grounds). I always wanted to bring that point up, but I keep forgetting every time I'm in this thread.
kohlrak: Now if I could convince him to verify the files as they're being downloaded, instead, that'd be great.
timppu: Why? I prefer that verification is performed separately, because then you can do an integrity check on your GOG archive offline as well, without connecting to GOG's servers at all.
Versus checking it as it's downloading, or just checking it before moving it to its permanent directory? I don't see what you're getting at. It gives the computer something else to do while the CPU is in a wait state, rather than trying to crunch my 500+ games all at once on my external drive whenever I have to rebuild the entire manifest after grabbing an updated version of gogrepo. It's not like I'm asking it to connect to GOG and download everything twice or something; just perform the check while the file is downloading, or while it's downloading another file but before moving this one.
That’s a really interesting discussion, thank you guys :)

For now, I think that I will just download a few games per day manually without gogrepo, but still verify them (it's working for me now with Kalanyr's dev branch; I was just misusing the command :D), and I'm not too concerned about fragmentation. I don't really want to download absolutely everything anyway, and there are some games I also have on Steam or elsewhere.
Post edited March 01, 2019 by Onox

timppu: Why? I prefer that verification is performed separately, because then you can do an integrity check on your GOG archive offline as well, without connecting to GOG's servers at all.
kohlrak: Versus checking it as it's downloading, or just checking it before moving it to its permanent directory? I don't see what you're getting at.
Not sure if I understood your suggestion correctly, but I took it to mean that gogrepo should verify each file right after downloading it, as opposed to how it works nowadays, where you run a verification pass over your downloaded files afterwards.

I still want the option to run the verification separately from downloading, for cases like having the USB hard drive disconnected for a year or two and then thinking "hey, I wonder if all the installers are still ok there, or whether some bitrot has occurred?". As long as you have the manifest file available that matches the set you have on that hard drive, you can verify all the files.

With non-GOG archives that I have, I use rhash and dvdsig for the same purpose, to be able to check at any point of time that all the files are still ok.
Post edited March 01, 2019 by timppu
kohlrak: Versus checking it as it's downloading, or just checking it before moving it to its permanent directory? I don't see what you're getting at.
timppu: Not sure if I understood your suggestion correctly, but I took it to mean that gogrepo should verify each file right after downloading it, as opposed to how it works nowadays, where you run a verification pass over your downloaded files afterwards.

I still want the option to run the verification separately from downloading, for cases like having the USB hard drive disconnected for a year or two and then thinking "hey, I wonder if all the installers are still ok there, or whether some bitrot has occurred?". As long as you have the manifest file available that matches the set you have on that hard drive, you can verify all the files.

With non-GOG archives that I have, I use rhash and dvdsig for the same purpose, to be able to check at any point of time that all the files are still ok.
The verification as it's designed right now checks file integrity with respect to the download, which is why it skips files that were already verified previously. Now, if it re-verified all those files, that would be a different story. As it stands, it's functionally a check that the file downloaded correctly.

EDIT: Just a small excerpt:
06:20:14 | skipping previously verified zenith/setup_zenith_2.2.0.3.exe
06:20:14 | verifying zenith/setup_zenith_2.2.0.3-1.bin...
06:20:14 | skipping previously verified zenith/setup_zenith_2.2.0.3-1.bin
06:20:14 | verifying zenith/zenith_wallpapers.zip...
06:20:14 | skipping previously verified zenith/zenith_wallpapers.zip
06:20:14 | verifying ziggurat/setup_ziggurat_2018-05-08_(20608).exe...
06:20:14 | skipping previously verified ziggurat/setup_ziggurat_2018-05-08_(20608).exe
Post edited March 01, 2019 by kohlrak
kohlrak: The verification as it's designed right now checks file integrity with respect to the download, which is why it skips files that were already verified previously. Now, if it re-verified all those files, that would be a different story. As it stands, it's functionally a check that the file downloaded correctly.
Doesn't it have an option to do full verification, not skipping earlier verified files?

If not, then yeah I guess you are right.
kohlrak: The verification as it's designed right now checks file integrity with respect to the download, which is why it skips files that were already verified previously. Now, if it re-verified all those files, that would be a different story. As it stands, it's functionally a check that the file downloaded correctly.
timppu: Doesn't it have an option to do full verification, not skipping earlier verified files?

If not, then yeah I guess you are right.
It might, but I'm unaware if it does. I'd check the docs, but as he said months ago, the docs are apparently out of date and don't match actual functionality. This is precisely why I want to start some sort of small project to better organize information, rather than relying on this thread. There've been times I've asked myself why I didn't make my own fork, just for myself, but then I'm reminded that I hate Python with a passion, and I don't really want to rely on myself to update this thing when someone more in the loop can do it much better than I can. It would also be a total waste, and incredibly disrespectful to Kalanyr, who has done a great job of maintaining it despite the information issues, especially given that he likely has incentives to be doing other things instead; that's another reason why I want to formalize something a little more reliable than this topic. I assume he's a very busy guy with things on his plate far more important than maintaining a simple script like this, and I figure he could use help in the other areas to offload some of the work.
About reCAPTCHA:
I added a GUI login method to lgogdownloader for when reCAPTCHA is encountered on the login form.
If anyone has experience with PyQt5, you could try porting the login implementation from lgogdownloader to gogrepo.

include/gui_login.h
src/gui_login.cpp
and the relevant part of src/website.cpp login code
Sude: About reCAPTCHA:
I added a GUI login method to lgogdownloader for when reCAPTCHA is encountered on the login form.
If anyone has experience with PyQt5, you could try porting the login implementation from lgogdownloader to gogrepo.

include/gui_login.h
src/gui_login.cpp
and the relevant part of src/website.cpp login code
Does this actually apply only on the login page? If so, my cookie approach would be more suitable for people who might be SSHing in to run the script.

Useful information to have, though. Could very well become mandatory in the future, which makes me sad.
I think I managed to install Python (and the stuff the script needs), but when running the update command to build a manifest file, I get quite a lot of error messages like this one: I know nothing about Python, and it tells me pretty much nothing either. Help, please?

Edit: The command I ran is "gogrepo.py update -os windows -lang en pl" (I had a look into the script, and while I don't know how to read Python, I think I managed to figure out that Polish is also defined as a language in there?).
Post edited March 01, 2019 by piranha1
piranha1: I think I managed to install Python (and the stuff the script needs), but when running the update command to build a manifest file, I get quite a lot of error messages like this one:
I think they are irrelevant errors; you can disregard them (all those where it tries to get details for "en1galaxy_installer0"). Yeah, I get them too, as does everyone else, I think.

If I recall right they are related to the game installers that used to have the Galaxy client installer embedded into them. Or something. Kalanyr could explain it better.