kohlrak: Yes, which is why I posted some time ago that we should have a better way of doing it, because I'm in your exact same position. If you're using Linux (like I am), you could presumably mount a ramdisk to solve that issue, which is on my to-do list, but Kalanyr's argument is valid as well: preallocation usually helps prevent (but doesn't always prevent) fragmentation.
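The ramdisk idea could look something like the sketch below: stage the file in a RAM-backed directory (assuming a tmpfs at `/dev/shm`, which most Linux distributions mount by default), then move it to the archive drive in one go so the destination filesystem receives the whole file at once. The function name and the plain-bytes stand-in for the download are assumptions for illustration, not gogrepo's actual code:

```python
import os
import shutil
import tempfile

def download_via_ramdisk(data: bytes, dest_path: str,
                         staging_dir: str = "/dev/shm") -> str:
    """Write a file into a RAM-backed staging dir first, then move it to
    its final location, so the archive disk gets one sequential write and
    is less likely to fragment the file."""
    if not os.path.isdir(staging_dir):
        # No tmpfs available; fall back to the normal temp dir.
        staging_dir = tempfile.gettempdir()
    fd, tmp_path = tempfile.mkstemp(dir=staging_dir)
    try:
        with os.fdopen(fd, "wb") as f:
            f.write(data)  # stands in for the actual chunked download
        # shutil.move copies across filesystems, then deletes the staged file.
        shutil.move(tmp_path, dest_path)
    finally:
        if os.path.exists(tmp_path):
            os.remove(tmp_path)  # clean up if the move never happened
    return dest_path
```

The trade-off, of course, is that the staging area has to be at least as large as the biggest installer file.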
timppu: Frankly, I preferred it when gogrepo didn't do any preallocation; at the very least it should be optional.
I was happily using gogrepo for a long time without preallocation (on an NTFS USB drive), and whatever fragmentation there was never became a problem. What would the problem be anyway, reduced speed? If there was any, I never noticed it. Also, considering we are dealing mostly with quite big files here (removing old files gigabytes in size, then downloading newer versions of them), fragmentation shouldn't be that heavy, and it would only affect new or modified files. Maybe it matters if someone gets anxious over any fragmentation at all...
And if I ever wanted to get rid of fragmentation, one can always just run a defragmentation program, or copy the files over to an empty drive and use that as the primary one. Doing so from time to time would be beneficial with archival HDDs anyway, so that all the magnetic data on the disc gets refreshed.
My sentiment exactly.
Anyway, I guess I can live with the current preallocation as well, even if I consider it to be of little or no real benefit. I don't really care about the extra writes either, as I am not using an SSD for archiving my GOG game installers.
Turns out, SSDs currently have a lifespan very similar to HDDs', except they don't risk the motor going out. All drives ultimately need their data rewritten over time, too, which further reduces the lifespan. I haven't actually worked out how many writes, versus how often I update, it would take for that to become a problem, though. On the flip side, gogrepo has timeout issues waiting for the preallocation to finish, because it connects to GOG before starting the preallocation, so GOG kills the hanging connection. My script for running gogrepo invokes download multiple times just to ensure it "resumes" after timing out during preallocation. GOG might end up taking exception to that, because we are establishing a bunch of hanging connections that aren't active and are effectively degrading GOG's performance. (This is actually the basis of a type of denial-of-service attack called Slowloris, IIRC, so if GOG ends up hiring someone overly zealous, they could convince the company to ban gogrepo on those grounds.) I always wanted to bring that point up, but I keep forgetting every time I'm in this thread.
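The "invoke download multiple times" workaround could be sketched as a small retry wrapper like the one below. This is an assumption about how such a script might look, not gogrepo's actual interface; `run_with_retries` and the idea of re-running until a clean exit are mine:

```python
import subprocess

def run_with_retries(cmd, max_attempts=5):
    """Re-run a command until it exits cleanly, mimicking a script that
    calls `gogrepo.py download` several times so a run killed by a server
    timeout gets another chance to resume its partial downloads.
    Returns the number of attempts used."""
    for attempt in range(1, max_attempts + 1):
        result = subprocess.run(cmd)
        if result.returncode == 0:
            # Download finished cleanly; no further attempts needed.
            return attempt
        # Non-zero exit: the connection likely timed out during
        # preallocation, so loop and let the tool resume.
    raise RuntimeError(f"command still failing after {max_attempts} attempts")
```

Note that this papers over the symptom; it doesn't fix the underlying problem of connecting before preallocating.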
kohlrak: Now if I could convince him to verify the files as they're being downloaded, instead, that'd be great.
Why? I prefer that verification be performed separately, because then you can also run an integrity check on your GOG archive offline, without connecting to GOG's servers at all.
Versus checking it as it's downloading, or just checking it before moving it to its permanent directory? I don't see what you're getting at. It gives the computer something else to do while the CPU is in a wait state, rather than trying to crunch my 500+ games all at once on my external drive when I have to rebuild the entire manifest after grabbing an updated version of gogrepo. It's not like I'm asking it to connect to GOG and download everything twice; just verify while the file is downloading, or while another file is downloading but before the move.
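Verifying during the download amounts to feeding each chunk into a hash as it's written, so the checksum is ready the moment the transfer finishes and no separate pass over the archive is needed. A minimal sketch, assuming an MD5 checksum is what's being compared (the function name, chunk source, and signature are illustrative, not gogrepo's actual code):

```python
import hashlib

def write_and_hash(chunks, out_path, expected_md5=None):
    """Write downloaded chunks to out_path while updating an MD5 digest.
    The hashing happens while the disk/network is the bottleneck, so it
    costs almost nothing compared to a separate verification pass."""
    digest = hashlib.md5()
    with open(out_path, "wb") as f:
        for chunk in chunks:      # chunks would come off the network socket
            f.write(chunk)
            digest.update(chunk)  # hash as we go
    actual = digest.hexdigest()
    if expected_md5 is not None and actual != expected_md5:
        raise ValueError(f"checksum mismatch: {actual} != {expected_md5}")
    return actual
```

The offline-verification use case from the post above still works with this approach, since nothing stops you from re-hashing the archived files later without touching the network.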