ssokolow: It's easier than on Windows.

1. Open up Synaptic Package Manager and select everything you want to install or update (as you normally would).
2. Instead of clicking Apply, stick in a thumbdrive and choose "Generate Package Download Script".
3. Go to another machine with Internet access and run the script. (The script is simple enough that you can use it on Windows too if you rename it to .bat and put a copy of wget.exe next to the script or in your %PATH%; a rough sketch of what the script does is shown below.)
4. Bring the thumbdrive back to your offline machine, make sure the same packages are still selected (they will be if you didn't quit Synaptic), and choose "Add downloaded packages".

All your updates plus any programs you want to install in one simple, automated process.
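
For reference, the script Synaptic writes to the thumbdrive is essentially just one wget call per selected .deb (which is why you need wget.exe on Windows). If wget isn't handy on the online machine, a rough Python equivalent is sketched below; the urls.txt file and the packages folder are placeholders for this illustration, not something Synaptic itself produces.

```python
# Hypothetical stand-in for the wget script Synaptic generates: it reads one
# .deb URL per line from urls.txt and downloads each file into ./packages.
import urllib.request
from pathlib import Path

def download_debs(url_list="urls.txt", dest="packages"):
    Path(dest).mkdir(exist_ok=True)
    for url in Path(url_list).read_text().split():
        name = url.rsplit("/", 1)[-1]
        target = Path(dest) / name
        if target.exists():  # skip packages already fetched on an earlier run
            continue
        print("fetching", name)
        urllib.request.urlretrieve(url, str(target))

if __name__ == "__main__":
    download_debs()
```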

Off the top of my head, I don't remember the process for offline-updating the list of available packages, but it's equally simple.
shaddim: Well, I'm pro-open source myself... but this part of the Linux ecosystem is severely broken (and the design mistakes in fact have nothing to do with being open source but with legacy Unix baggage/thinking, which needs to go away).

You described the (non-obvious) process of getting something from a central repository so that you can run it later locally. The distro-centered approach with centralized repositories (fragmented among the distros, at that) has several downsides. Only what is in the repos is available and recommended to users, which means a limited selection of apps. Because the distro concept treats every library as a system library, apps are tightly coupled to the system, so only the complete system together with its apps gets updated, which means either out-of-date applications or instability. And because everything is assumed to be updated and synchronized with the distro, stable APIs/ABIs across multiple distros, long ago identified as crucial for forming a stable platform, were never developed or enforced in the Linux ecosystem. So we can't offer stable interfaces to external app developers, and we suffer from a severe lack of ISV apps (Photoshop, other Adobe software, games); see also the compatibility problems of external binary apps on Steam for Linux.
Wow, thanks, that's very insightful. So if open source isn't the problem, why doesn't the games industry use OpenGL? John Carmack tried to get OpenGL off the ground by making it the main renderer in the Quake 3 engine, and that engine was once the most used engine in the industry. So what happened?
ssokolow: Inertia, plain and simple. The practice of using DirectX has built up a lot of inertia and it'll take time to change that.

That's what Valve has been trying to do with things like the presentation at Steam Dev Days where they showed trends in the availability of features via DirectX and OpenGL.

(TL;DR: There's a lot of perfectly fine hardware running Steam where outdated drivers or old versions of Windows cripple DirectX while still leaving the newer features accessible via OpenGL... in the former case (drivers), because OpenGL extensions beat DirectX's monolithic update strategy to the punch and, in the latter case (Windows version), mostly because WinXP doesn't go above DX9 and people often don't upgrade whatever they have.)
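
To make the extensions point concrete: on Linux you can ask the installed driver which OpenGL extensions it exposes, without waiting for a monolithic API update. A minimal sketch, assuming Mesa's glxinfo utility is installed; the extension name is only an example:

```python
# Minimal sketch: check whether the current OpenGL driver advertises a given
# extension by scanning glxinfo output (requires the mesa-utils package).
import subprocess

def has_gl_extension(name):
    out = subprocess.run(["glxinfo"], capture_output=True, text=True).stdout
    return name in out

if __name__ == "__main__":
    ext = "GL_ARB_framebuffer_object"  # example extension; pick any you care about
    print(ext, "is", "available" if has_gl_extension(ext) else "missing")
```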
Magmarock: That's what I get for writing in the middle of the night. Yes, there was no need for it to be that long.

However, yes, Valve are trying to move towards Linux, but I have no confidence whatsoever. As time has gone on Steam has gotten worse, and I suspect that Valve's success with Steam was down to luck and good timing. Valve seems to start these amazing projects, but they seem to get bored and move on before anything has really happened. So I don't see a future in SteamOS; I'll just use Mint.
To be honest, the only reason I referenced Valve was because their presentation happened to be the most memorable source.

The main thing I cared about was pointing out how, were it not for industry inertia, OpenGL would be the obvious choice because, due to old drivers or old Windows versions, on average, more features of the same hardware will be exposed via OpenGL than via DirectX.

ssokolow: It's easier than on Windows.
Magmarock: TL;DR: Well, I skimmed it. I did try your Synaptic suggestion, but it was simply too temperamental. Sometimes it worked, sometimes it didn't, and sometimes it half worked, for example installing Wine but not my Nvidia graphics driver. I certainly would never say this was easier than Windows, though. Not by a country mile. However, despite my consistent criticism, I obviously like Linux, just not the attitude of most of its fans. I mostly use the terminal to make Linux do what I want. So far it's proven to be the best method for me. As for Wine, I've actually had better luck getting games to work with VirtualBox, and more luck still with VMware Workstation. I'm thinking of buying it, but I need to know if it uses online activation DRM first.

As for essential apps, I'm not sure what else I can say. Windows just seems to have more of the things that I want, especially regarding system tools.
I'm still skeptical that Windows would be easier for offline use these days, but I'll take your word for it.

I'll definitely agree that there are some Linux fans out there who really don't do it any favours.

All else being equal, the terminal will always be the more direct route by its very nature. A GUI is easier to learn but a command-line is more efficient once you have learned it. That's a universal truth.

VirtualBox and VMware will definitely give you more reliable results, since you're working with real Windows. The main advantage of Wine is that it's free and, when it works, you've got your games running pseudo-natively rather than in a window or a "seamless" mode that gives you two taskbars.

(Sorry for being so terse. It's 2:45AM and I'm about to go to sleep.)
Magmarock: Wow, thanks, that's very insightful. So if open source isn't the problem, why doesn't the games industry use OpenGL? John Carmack tried to get OpenGL off the ground by making it the main renderer in the Quake 3 engine, and that engine was once the most used engine in the industry. So what happened?
Facebook bought Oculus, which is where Carmack is now. I'm not sure if he has anything to do with OpenGL these days.

If you want to see where OpenGL is heading, read this:
http://www.anandtech.com/show/8363/khronos-announces-next-generation-opengl-initiative

It goes way beyond just Valve.
Post edited January 12, 2015 by shmerl
ssokolow: I'm still skeptical that Windows would be easier for offline use these days, but I'll take your word for it.

I'll definitely agree that there are some Linux fans out there who really don't do it any favours.

All else being equal, the terminal will always be the more direct route by its very nature. A GUI is easier to learn but a command-line is more efficient once you have learned it. That's a universal truth.

VirtualBox and VMware will definitely give you more reliable results, since you're working with real Windows. The main advantage of Wine is that it's free and, when it works, you've got your games running pseudo-natively rather than in a window or a "seamless" mode that gives you two taskbars.

(Sorry for being so terse. It's 2:45AM and I'm about to go to sleep.)
I have a legit Windows 7 key; however, I also have a method of activating it offline. Sorry, I can't tell you what it is, but I can tell you that it's as simple as pushing a button. I use WSUS Offline Update to update the system locally, which is legal and free. After that it's just a matter of clicking on little pictures to make things work.

I don't agree that it's a universal truth that the command line is always better.

The main difference between the terminal and CMD is that there's almost no reason to ever use CMD. But in my experience the terminal is a must. Windows has enough built-in programs to render CMD almost completely redundant, but Linux, on the other hand, has a lot of functions that can only be accessed through the terminal.
shmerl: According to the GPL's authors, GPLv3 clarified the intentions of GPLv2. So, according to them, GPLv2 also forbids DRM. But since it wasn't explained clearly enough, they had to make another revision.
Gersen: What their intentions were is totally meaningless; the question is: does the GPLv2 in its current form explicitly forbid DRM? And the answer is no.

Also, GPLv3 doesn't forbid DRM; it just states that in that case anti-circumvention laws won't apply if somebody cracks said DRM.

So even IF DOSBox were using v3, and IF GOG were still using their passworded RAR (and assuming, of course, that it actually counted as DRM), it would still be authorized by the GPLv3; it would just mean that cracking said password would be perfectly legal, even under the DMCA.

http://www.gnu.org/licenses/gpl-faq.html#DRMProhibited
I thought this particular section forbids password protection:

Corresponding Source conveyed, and Installation Information provided, in accord with this section must be in a format that is publicly documented (and with an implementation available to the public in source code form), and must require no special password or key for unpacking, reading or copying.
The entire section is in legalese, so I may very well be misunderstanding it.
ssokolow: I'm still skeptical that Windows would be easier for offline use these days, but I'll take your word for it.

I'll definitely agree that there are some Linux fans out there who really don't do it any favours.

All else being equal, the terminal will always be the more direct route by its very nature. A GUI is easier to learn but a command-line is more efficient once you have learned it. That's a universal truth.

VirtualBox and VMware will definitely give you more reliable results, since you're working with real Windows. The main advantage of Wine is that it's free and, when it works, you've got your games running pseudo-natively rather than in a window or a "seamless" mode that gives you two taskbars.

(Sorry for being so terse. It's 2:45AM and I'm about to go to sleep.)
Magmarock: I have a legit Windows 7 key; however, I also have a method of activating it offline. Sorry, I can't tell you what it is, but I can tell you that it's as simple as pushing a button. I use WSUS Offline Update to update the system locally, which is legal and free. After that it's just a matter of clicking on little pictures to make things work.

I don't agree that it's a universal truth that the command line is always better.

The main difference between the terminal and CMD is that there's almost no reason to ever use CMD. But in my experience the terminal is a must. Windows has enough built-in programs to render CMD almost completely redundant, but Linux, on the other hand, has a lot of functions that can only be accessed through the terminal.
That's due to Linux devs preferring the command line themselves and not seeing the sense in making the effort to write a GUI. It's a shame, but since they are not paid at all, it's no wonder that they don't want all this additional work when the program already runs well. Donating to Linux devs to make GUIs actually helps; I got a nice DOSBox GUI this way for my OpenPandora (which runs a custom Angström), and it's actually a bit easier to use than D-Fend (a common Windows GUI for DOSBox).
Post edited January 12, 2015 by Klumpen0815
Magmarock: I have a legit Windows 7 key; however, I also have a method of activating it offline. Sorry, I can't tell you what it is, but I can tell you that it's as simple as pushing a button. I use WSUS Offline Update to update the system locally, which is legal and free. After that it's just a matter of clicking on little pictures to make things work.

I don't agree that it's a universal truth that the command line is always better.

The main difference between the terminal and CMD is that there's almost no reason to ever use CMD. But in my experience the terminal is a must. Windows has enough built-in programs to render CMD almost completely redundant, but Linux, on the other hand, has a lot of functions that can only be accessed through the terminal.
Klumpen0815: That's due to Linux devs preferring the command line themselves and not seeing the sense in making the effort to write a GUI. It's a shame, but since they are not paid at all, it's no wonder that they don't want all this additional work when the program already runs well. Donating to Linux devs to make GUIs actually helps; I got a nice DOSBox GUI this way for my OpenPandora (which runs a custom Angström), and it's actually a bit easier to use than D-Fend (a common Windows GUI for DOSBox).
Shame is the word, and donating to devs who make GUIs sounds terrific. I'm up for that. In fact, I'd probably go as far as to commission a coder to help me make a program.
Klumpen0815: That's due to Linux devs preferring the command line themselves and not seeing the sense in making the effort to write a GUI. It's a shame, but since they are not paid at all, it's no wonder that they don't want all this additional work when the program already runs well.
It's in fact even worse: since there is no single, unified Linux GUI toolkit, but many competing ones that also break compatibility with themselves between releases, GUI software breaks notoriously often. So even for software where a GUI was written, the GUI is often broken by now and no longer works. The lack of a unified, stable "Linux" widget toolkit (like Windows has) is therefore highly demotivating for the creation of GUI software overall, and too often leads to breakage of the few existing Linux GUI programs. (For example, take a look at Subsurface, where the developers fought with exactly this: https://www.youtube.com/watch?v=ON0A1dsQOV0)

(For the CLI stuff there is some kind of stability thanks to POSIX... another reason why developers prefer the CLI.)
Post edited January 12, 2015 by shaddim
Klumpen0815: That's due to Linux devs preferring the command line themselves and not seeing the sense in making the effort to write a GUI. It's a shame, but since they are not paid at all, it's no wonder that they don't want all this additional work when the program already runs well.
It's not necessarily that they prefer the command line. There are a couple of factors here.

First, Linux can be run without a window manager at all, so if you have a GUI-only application, it won't be accessible on a window-manager-less Linux box. Obviously, this doesn't matter for purely GUI applications, like games, but it is very important for configuration and system applications.

Second, as has already been pointed out, Linux has several graphical toolkits available. It makes perfect sense to decouple the application from the GUI, so supporting multiple GUIs is easily done. If you want a GUI in a specific toolkit and it doesn't exist, it's usually pretty trivial to make one.
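
To illustrate that decoupling, here is a minimal, purely hypothetical sketch: the core function knows nothing about any toolkit, and the command-line front-end (or a GTK/Qt dialog) is just a thin wrapper around the same call.

```python
# Illustrative sketch of core/front-end decoupling: convert_file() is a
# hypothetical core operation with no GUI imports; run_cli() is one thin
# front-end, and a GTK or Qt dialog could wrap convert_file() the same way.
import argparse

def convert_file(src: str, dest: str, quality: int = 80) -> None:
    """Core logic only; no toolkit code anywhere in here."""
    print(f"converting {src} -> {dest} at quality {quality}")

def run_cli() -> None:
    parser = argparse.ArgumentParser(description="demo converter front-end")
    parser.add_argument("src")
    parser.add_argument("dest")
    parser.add_argument("--quality", type=int, default=80)
    args = parser.parse_args()
    convert_file(args.src, args.dest, args.quality)

if __name__ == "__main__":
    run_cli()
```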

In the end though, most seasoned Linux users are at least comfortable with, if not outright prefer, the terminal.
Post edited January 12, 2015 by hummer010
Magmarock: Wow, thanks, that's very insightful. So if open source isn't the problem, why doesn't the games industry use OpenGL? John Carmack tried to get OpenGL off the ground by making it the main renderer in the Quake 3 engine, and that engine was once the most used engine in the industry. So what happened?
For some background info on OpenGL vs DirectX, go read
http://www.tomshardware.com/reviews/opengl-directx,2019.html

Short story: once upon a time OpenGL was Goliath and DirectX was a really shitty David. But Microsoft worked hard, caught up, and eventually overtook OpenGL by actually offering more features (DirectX 9), while OpenGL sort of just lingered around ;)
That's why nowadays most game devs use DirectX.

The article is from 2008. In the meantime OpenGL has caught up, and with the latest version it is pretty much even with DirectX 11 in terms of features.
But as others said, thanks to industry inertia, DirectX is still prevalent. Most game devs simply only know DirectX.
With more and more cross-platform games being developed, that will most likely change (and in parts already has).
The next few years will surely be interesting in that regard.

edit:
Concerning Linux: the longstanding problem with really bad 3D drivers didn't exactly help push OpenGL either :p
Though that has gotten better in recent times. A lot.
Post edited January 12, 2015 by immi101
hummer010: Second, as has already been pointed out, Linux has several graphical toolkits available. It makes perfect sense to decouple the application from the GUI, so supporting multiple GUIs is easily done.
No, this decoupling is unneeded complexity: it offers choice that no one needs (and reduces the choice people actually want), and it prevents the polish and the consistent UX and usability that are required. Especially when you take into account the common position that "GUI toolkits are not important": why have many of them? It just reduces quality and makes support more costly and complicated.

hummer010: If you want a GUI in a specific toolkit and it doesn't exist, it's usually pretty trivial to make one.
Whoa... right before our eyes, the typical assumption of Linux devs: "GUIs are a simple secondary afterthought, done after the real work" :(

A good, consistent UX and usability are not afterthoughts at all but the hard part (harder than the technical implementation), and they require a software design that includes them from the beginning. Bolted on afterwards, on top of existing software, they lead to an unpolished "thing" that feels fumbled together!

hummer010: In the end though, most seasoned Linux users are at least comfortable with, if not outright prefer, the terminal.
Yeah, exactly... in the end, Linux was never meant as a PC operating system, meaning one that empowers end users rather than admins and developers.
Post edited January 12, 2015 by shaddim
shaddim: No, this decoupling is unneeded complexity: it offers choice that no one needs (and reduces the choice people actually want), and it prevents the polish and the consistent UX and usability that are required. Especially when you take into account the common position that "GUI toolkits are not important": why have many of them? It just reduces quality and makes support more costly and complicated.
You have to remember, there are more instances of Linux running without a window manager than with one, so this decoupling does make perfect sense.
shaddim: No, this decoupling is unneeded complexity: it offers choice that no one needs (and reduces the choice people actually want), and it prevents the polish and the consistent UX and usability that are required. Especially when you take into account the common position that "GUI toolkits are not important": why have many of them? It just reduces quality and makes support more costly and complicated.
hummer010: You have to remember, there are more instances of Linux running without a window manager than with one, so this decoupling does make perfect sense.
OK, decoupling without fragmenting the API/ABI, and treating UX and usability as first-class citizens design-wise, would be a start.
hummer010: You have to remember, there are more instances of Linux running without a window manager than with one, so this decoupling does make perfect sense.
shaddim: OK, decoupling without fragmenting the API/ABI, and treating UX and usability as first-class citizens design-wise, would be a start.
I can agree with that. UIs are easy; good UIs are hard. I know it certainly isn't a strength of mine.

I don't find the GUI situation in Linux that bad, but then I guess I also run a pretty minimalist system.
hummer010: You have to remember, there are more instances of Linux running without a window manager than with one, so this decoupling does make perfect sense.
shaddim: OK, decoupling without fragmenting the API/ABI, and treating UX and usability as first-class citizens design-wise, would be a start.
Won't happen (the thing about one toolkit to rule them all).
You need to keep in mind that there isn't "THE Linux", nor are there "THE Linux devs". There are thousands of different projects with different project leads who all follow their very own idea of what is best. If you give people the freedom to follow their own ideas, you can't expect conformity, unless you take away that freedom and force people to follow _one_ concept, as happens in systems developed by a single company (MS, Apple).
The fragmentation of the Linux software ecosystem is systemic.
Enjoy it, hate it, love it; it doesn't matter. Just accept that it won't go away. :)

//edit: error fixed
Post edited January 12, 2015 by immi101