shaddim: I agree you don't need separation in every case: for instance, in the embedded, server, or supercomputer OS use cases. But you NEED it in the single-user/desktop/PC use case; this should finally be hammered into the minds of Linux ecosystem developers!
shmerl: What I meant is that even on the desktop you don't need it in every case, since such separation comes at a price (even though it's lightweight virtualization). So when it's needed, it can be used; when not, there is no point. But at least it's good that such an option is going to be available.

As an example, let's say GOG releases a game that depends on library A. They release it in a container with the idea of long-term support. After some time, library A gets updated in a backward-compatible manner, introducing optimizations and bug fixes. However, the game running in the container will still use the old version (unless GOG keeps track of that, updating these containers with newer middleware, which I doubt they would), while it could run natively and benefit from the improvements. In other cases the changes can be breaking; then a container would be the proper way to continue using the game. So it's a tradeoff. What would be ideal is a way to optionally run something with or without the container, to get the best result when available.
No container is needed: package managers manage dependencies very well, so they can know whether a package can use updated libs, and so on.
Just use the existing package manager, which does the job the right way.

shmerl: Those aren't workarounds; those are release cycles designed for different purposes, and I see no problem in having those differences. Stable is perfect for servers; Red Hat does the same with RHEL. But it's not normally suitable for the desktop. Rolling distros are OK for the desktop, though some might prefer something in between.
shaddim: Those are workarounds, as you describe it yourself above: a user wants bleeding-edge apps AND a stable system. None of the provided options (stable, testing, unstable) provides both. The user can only select a bad compromise, whereas macOS, Android, and Windows actually have a solution: separation and specific upgrade cycles.
Windows, stable? I hope you're joking ;)

shmerl: And other OSes never introduced it successfully. See all the "add Windows XYZ compatibility" threads on GOG. Linux is actually innovating here.
shaddim: Sure, it's not perfect, but at least they try hard and they HAVE policies! Linux is not even trying to provide compatibility, either over time or across distros. Nothing. The layer Windows introduced is so successful and workable that Linux copied it (as nothing similar is available); the result is called WINE.
And this is because lots of app/game developers refuse to build versions for Linux. Nothing else. Compare Win32 to POSIX & co.: the former is a bad joke from a quality point of view. So it was not "copied because successful"; they just had to offer an option to people who need this or that tool/game so they can use it on their favorite OS.
I would love to see this feature on GOG. These guys do great work already. Recently, through a strange turn of events, I ended up with Ubuntu installed on an old laptop of mine, and now I am hopelessly hooked on Ubuntu. While PlayOnLinux works "well enough" for most games, offering a lot of support for GOG versions of games, I still feel that true Linux-based versions and Linux-based support would be great. I also think that while Steam may offer Linux "support" for all the Valve games, GOG might be the true pebble that starts the avalanche, causing next-gen games to be ported to Linux in the next few years. This is a truly dynamic time for Linux, with more users switching over to Linux-based systems every day. I can confidently say that if GOG starts to support Linux now, they will be ahead of the game as things develop over the next few years!
ianneck528: I also think that while Steam may offer Linux "support" for all the Valve games, GOG might be the true pebble that starts the avalanche, causing next-gen games to be ported to Linux in the next few years.
Do you really think that the service that sells old classics, some indies, and hardly any modern AAA titles will be the driving force for next-gen games, and not the service that does sell them? And why is "support" in quotation marks?
Porkepix: No container is needed: package managers manage dependencies very well, so they can know whether a package can use updated libs, and so on.
Just use the existing package manager, which does the job the right way.
Those containers are more than just packages. They provide some process isolation through cgroups and the like; see Docker's documentation: http://docs.docker.io/en/latest/faq/
It's more of a lightweight virtualization than simple bundling. Why this can be useful is explained above: the release cycle of GOG's support can differ from the release cycle of a given distro, let alone be synchronized with multiple ones (the more there are, the harder it becomes). So of course they can make a simple self-contained bundle with all the libs, but Docker goes further than that, providing cleaner and more clearly delineated isolation. And nothing prevents you from using the container together with the package manager (to install some containerized game). They aren't contradictory.
Post edited December 12, 2013 by shmerl
shmerl: What I meant is that even on the desktop you don't need it in every case, since such separation comes at a price (even though it's lightweight virtualization). So when it's needed, it can be used; when not, there is no point. But at least it's good that such an option is going to be available.

As an example, let's say GOG releases a game that depends on library A. They release it in a container with the idea of long-term support. After some time, library A gets updated in a backward-compatible manner, introducing optimizations and bug fixes. However, the game running in the container will still use the old version (unless GOG keeps track of that, updating these containers with newer middleware, which I doubt they would), while it could run natively and benefit from the improvements. In other cases the changes can be breaking; then a container would be the proper way to continue using the game. So it's a tradeoff. What would be ideal is a way to optionally run something with or without the container, to get the best result when available.
Porkepix: No container is needed: package managers manage dependencies very well, so they can know whether a package can use updated libs, and so on.
Just use the existing package manager, which does the job the right way.
Package managers are just managed dependency hell, re-issued with every distro release and multiplied by every distro out there. Completely crazy from an ISV or publisher point of view. There is a reason why the Steam client completely bypasses this crazy infrastructure and builds something of its own on top for games.

shaddim: Those are workarounds, as you describe it yourself above: a user wants bleeding-edge apps AND a stable system. None of the provided options (stable, testing, unstable) provides both. The user can only select a bad compromise, whereas macOS, Android, and Windows actually have a solution: separation and specific upgrade cycles.
Porkepix: Windows, stable? I hope you're joking ;)
Stable over time. They have binary compatibility and inter-version compatibility. 95% of all DirectX games still work on 7/8, and all programs work on all desktop variants; these are qualities the Linux ecosystem is missing.
shaddim: Sure, it's not perfect, but at least they try hard and they HAVE policies! Linux is not even trying to provide compatibility, either over time or across distros. Nothing. The layer Windows introduced is so successful and workable that Linux copied it (as nothing similar is available); the result is called WINE.
Porkepix: And this is because lots of app/game developers refuse to build versions for Linux. Nothing else. Compare Win32 to POSIX & co.: the former is a bad joke from a quality point of view. So it was not "copied because successful"; they just had to offer an option to people who need this or that tool/game so they can use it on their favorite OS.
The Linux ecosystem is still stuck in the 70s with the POSIX feature freeze; it's a joke, not a contemporary, feature-complete, modern API for a desktop- and multimedia-oriented OS. The limited adoption among programmers is therefore understandable. Win32 + DirectX is highly successful, with standardized, existing toolchains and SDKs; Linux has nothing comparable to offer (no, don't call a bunch of libs a substitute).
Post edited December 13, 2013 by shaddim
Porkepix: No container is needed: package managers manage dependencies very well, so they can know whether a package can use updated libs, and so on.
Just use the existing package manager, which does the job the right way.
shaddim: Package managers are just managed dependency hell, re-issued with every distro release and multiplied by every distro out there. Completely crazy from an ISV or publisher point of view. There is a reason why the Steam client completely bypasses this crazy infrastructure and builds something of its own on top for games.
And that's a bad idea. Package managers are a really great thing that's missing on other OSes. On Mac OS there are Homebrew/MacPorts/Fink to fill this gap, but it's easier there, as it's still a Unix-like OS. Windows has nothing like this, and that's really missed: so much space is wasted copying every lib, and so on…
And guess what? I can work with either Linux or Mac OS (I don't care which one), but there's no way I could work on Windows now. Too many problems, too many missing features…

Porkepix: Windows, stable? I hope you're joking ;)
shaddim: Stable over time. They have binary compatibility and inter-version compatibility. 95% of all DirectX games still work on 7/8, and all programs work on all desktop variants; these are qualities the Linux ecosystem is missing.
If that were right, GOG would have no reason to exist. They work on compatibility with more recent Windows versions… because a lot of games don't work on them. And… have you even tried Linux, or is it just something you assert? I'm really curious about that…
Porkepix: And this is because lots of app/game developers refuse to build versions for Linux. Nothing else. Compare Win32 to POSIX & co.: the former is a bad joke from a quality point of view. So it was not "copied because successful"; they just had to offer an option to people who need this or that tool/game so they can use it on their favorite OS.
shaddim: The Linux ecosystem is still stuck in the 70s with the POSIX feature freeze; it's a joke, not a contemporary, feature-complete, modern API for a desktop- and multimedia-oriented OS. The limited adoption among programmers is therefore understandable. Win32 + DirectX is highly successful, with standardized, existing toolchains and SDKs; Linux has nothing comparable to offer (no, don't call a bunch of libs a substitute).
Stuck in the 70s? Right, the Linux kernel was first released in 1991, but why not…
POSIX is still a very good standard, and it's not for nothing that I'm still being taught it at university.
Win32 is just as old, but its documentation is far less useful, and the same goes for usability.
Be curious. Try, for example, to create a little threaded program with:
- POSIX threads
- Boost, if you use C++ instead of C
- Win32
Then tell me which one you find better to use, and which one has the better documentation ;)

About DirectX? OpenGL, SDL, and everything the Khronos Group is working on are a perfect alternative.
You now have Unity 3D too, and probably others that I don't know of, as it's not my specialty at all.
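To make the suggested exercise concrete, here is roughly what the POSIX-threads version of such a little program looks like. This is only an illustrative sketch (the `Task` and `parallel_sum` names are invented for the example); it compiles as C++ with `g++ -pthread`:

```cpp
#include <pthread.h>

// Hypothetical toy task: each worker sums one slice of an array.
struct Task {
    const int* data;
    int count;
    long sum;
};

// Thread entry point, matching the signature pthread_create expects.
void* sum_range(void* arg) {
    Task* t = static_cast<Task*>(arg);
    t->sum = 0;
    for (int i = 0; i < t->count; ++i) t->sum += t->data[i];
    return nullptr;
}

// Sum n values on two POSIX threads, then join and combine the results.
long parallel_sum(const int* values, int n) {
    Task tasks[2] = {{values, n / 2, 0}, {values + n / 2, n - n / 2, 0}};
    pthread_t threads[2];
    for (int i = 0; i < 2; ++i)
        pthread_create(&threads[i], nullptr, sum_range, &tasks[i]);
    for (int i = 0; i < 2; ++i)
        pthread_join(threads[i], nullptr);
    return tasks[0].sum + tasks[1].sum;
}
```

The Boost and Win32 variants express the same idea with different APIs, which is exactly the comparison the post invites.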
Hello, I just registered here... and then I remembered why I never did in the past. I love the concept and that it's all DRM-free, but it's too bad there aren't even a few Linux games in the library.

Linux is now starting to get so many titles (via Steam & indies) that there is enough to choose from without having to dual-boot or mess around with Wine. So I will probably never buy another non-native game. Hopefully GOG will come around and at least offer Linux versions of some games eventually, because I would prefer buying from here over Steam.
ninjalemming: Hello, I just registered here... and then I remembered why I never did in the past. I love the concept and that it's all DRM-free, but it's too bad there aren't even a few Linux games in the library.

Linux is now starting to get so many titles (via Steam & indies) that there is enough to choose from without having to dual-boot or mess around with Wine. So I will probably never buy another non-native game. Hopefully GOG will come around and at least offer Linux versions of some games eventually, because I would prefer buying from here over Steam.
Welcome to GOG

If you like old DOS games, you can buy those from GOG, since they're usually very easy to set up using the Linux-native DOSBox. That's what I do: I buy old games and games that have no Linux versions from GOG, and those that have them I buy elsewhere. Check out the Humble Store if you haven't already; there are many DRM-free Linux games there.
Post edited December 13, 2013 by Daliz
The site is up for the SteamOS beta: http://store.steampowered.com/steamos/beta/

Edit: GOL also has an article about this: http://www.gamingonlinux.com/articles/steamos-beta-now-out-in-the-wild.2827
Post edited December 14, 2013 by adamhm
shaddim: Package managers are just managed dependency hell, re-issued with every distro release and multiplied by every distro out there. Completely crazy from an ISV or publisher point of view. There is a reason why the Steam client completely bypasses this crazy infrastructure and builds something of its own on top for games.
Porkepix: And that's a bad idea. Package managers are a really great thing that's missing on other OSes. On Mac OS there are Homebrew/MacPorts/Fink to fill this gap, but it's easier there, as it's still a Unix-like OS. Windows has nothing like this, and that's really missed: so much space is wasted copying every lib, and so on…
And guess what? I can work with either Linux or Mac OS (I don't care which one), but there's no way I could work on Windows now. Too many problems, too many missing features…
No, they are only a good idea for the core part, the OS. Using package management to glue everything together (OS and apps) is conceptually very wrong. In the words of Debian founder Ian Murdock: "moving everything into the distribution is not a very good option. Remember that one of the key tenets of open source is decentralization, so if the only solution is to centralize everything, there's something fundamentally wrong with this picture." Kernel developer Ingo Molnar also recently wrote an essay with the same focus: "Desktop Linux distributions are trying to 'own' 20 thousand application packages consisting of over a billion lines of code and have created parallel, mostly closed ecosystems around them. The typical update latency for an app is weeks for security fixes (sometimes months) and months (sometimes years) for major features. They are centrally planned, hierarchical organizations instead of distributed, democratic free societies."

shaddim: Stable over time. They have binary compatibility and inter-version compatibility. 95% of all DirectX games still work on 7/8, and all programs work on all desktop variants; these are qualities the Linux ecosystem is missing.
Porkepix: If that were right, GOG would have no reason to exist. They work on compatibility with more recent Windows versions… because a lot of games don't work on them. And… have you even tried Linux, or is it just something you assert? I'm really curious about that…
shaddim: The Linux ecosystem is still stuck in the 70s with the POSIX feature freeze; it's a joke, not a contemporary, feature-complete, modern API for a desktop- and multimedia-oriented OS. The limited adoption among programmers is therefore understandable. Win32 + DirectX is highly successful, with standardized, existing toolchains and SDKs; Linux has nothing comparable to offer (no, don't call a bunch of libs a substitute).
Porkepix: Stuck in the 70s? Right, the Linux kernel was first released in 1991, but why not…
I meant the outdated architecture behind it: Unix. It was a historical mistake to build a free and open-source DESKTOP OS on the model of an (outdated) server OS. As the Unix design is also older than the invention of the PC concept, several innovations that came with the PC are still missing in Linux, for instance the clear separation between OS and apps.

Porkepix: POSIX is still a very good standard, and it's not for nothing that I'm still being taught it at university.
Win32 is just as old, but its documentation is far less useful, and the same goes for usability.
Be curious. Try, for example, to create a little threaded program with:
- POSIX threads
- Boost, if you use C++ instead of C
- Win32
Then tell me which one you find better to use, and which one has the better documentation ;)

About DirectX? OpenGL, SDL, and everything the Khronos Group is working on are a perfect alternative.
You now have Unity 3D too, and probably others that I don't know of, as it's not my specialty at all.
I indeed used pthreads some years ago for cross-platform development. The Linux port caused me no problems besides a performance one: the thread-creation overhead on Linux was much bigger than on Windows, and the pthread lib provides no easy thread-pool/thread-reuse facility to compensate for that. I had to implement one myself, which was a pain in the ass. But this is only one example. Another is the Linux audio infrastructure: even PulseAudio creator Poettering agreed that it is hard to know which library to use. Yet another might be the fragile compatibility of glibc... and so on. Such issues don't exist on real desktop OSes, which are designed as coherent platforms.
Post edited December 14, 2013 by shaddim
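The kind of hand-rolled thread pool described above is not much code with today's C++11 primitives: a job queue, a mutex, a condition variable, and a fixed set of worker threads that are created once and re-used, amortizing the per-thread creation cost. This is an illustrative sketch (class name and pool size are invented), not the code from the post:

```cpp
#include <atomic>
#include <condition_variable>
#include <functional>
#include <mutex>
#include <queue>
#include <thread>
#include <vector>

// Minimal fixed-size thread pool: workers are created once and re-used.
class ThreadPool {
public:
    explicit ThreadPool(unsigned n) {
        for (unsigned i = 0; i < n; ++i)
            workers_.emplace_back([this] { run(); });
    }
    ~ThreadPool() {
        {
            std::lock_guard<std::mutex> lock(m_);
            done_ = true;
        }
        cv_.notify_all();
        for (auto& w : workers_) w.join();  // drains the queue, then joins
    }
    void submit(std::function<void()> job) {
        {
            std::lock_guard<std::mutex> lock(m_);
            jobs_.push(std::move(job));
        }
        cv_.notify_one();
    }
private:
    void run() {
        for (;;) {
            std::function<void()> job;
            {
                std::unique_lock<std::mutex> lock(m_);
                cv_.wait(lock, [this] { return done_ || !jobs_.empty(); });
                if (done_ && jobs_.empty()) return;
                job = std::move(jobs_.front());
                jobs_.pop();
            }
            job();  // execute outside the lock
        }
    }
    std::vector<std::thread> workers_;
    std::queue<std::function<void()>> jobs_;
    std::mutex m_;
    std::condition_variable cv_;
    bool done_ = false;
};

// Example use: run n trivial jobs through a 4-thread pool and count them.
int run_jobs(int n) {
    std::atomic<int> count{0};
    {
        ThreadPool pool(4);
        for (int i = 0; i < n; ++i)
            pool.submit([&count] { ++count; });
    }  // destructor waits for all queued jobs before returning
    return count.load();
}
```

In 2006, when pthreads offered no such facility out of the box, writing this correctly (especially the shutdown path) was indeed the painful part.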
Porkepix: And that's a bad idea. Package managers are a really great thing that's missing on other OSes. On Mac OS there are Homebrew/MacPorts/Fink to fill this gap, but it's easier there, as it's still a Unix-like OS. Windows has nothing like this, and that's really missed: so much space is wasted copying every lib, and so on…
And guess what? I can work with either Linux or Mac OS (I don't care which one), but there's no way I could work on Windows now. Too many problems, too many missing features…
shaddim: No, they are only a good idea for the core part, the OS. Using package management to glue everything together (OS and apps) is conceptually very wrong. In the words of Debian founder Ian Murdock: "moving everything into the distribution is not a very good option. Remember that one of the key tenets of open source is decentralization, so if the only solution is to centralize everything, there's something fundamentally wrong with this picture." Kernel developer Ingo Molnar also recently wrote an essay with the same focus: "Desktop Linux distributions are trying to 'own' 20 thousand application packages consisting of over a billion lines of code and have created parallel, mostly closed ecosystems around them. The typical update latency for an app is weeks for security fixes (sometimes months) and months (sometimes years) for major features. They are centrally planned, hierarchical organizations instead of distributed, democratic free societies."
It depends on which distro you're talking about; they all just have different policies, for example Ubuntu's PPAs, Arch Linux's AUR, and so on.
You'll never find an OS more up to date than Arch Linux: updates land in the AUR at most 24 hours after their release.

Btw, the package manager doesn't glue everything together. People still put packages into the distro repositories because they find it more practical and don't want the bother of creating and managing their own repository. But some companies did it; for example, the Opera browser offers its own repository for Debian (http://deb.opera.com/). More companies or people could just do that, and hence not be tied and glued to a distro/system.

Porkepix: If that were right, GOG would have no reason to exist. They work on compatibility with more recent Windows versions… because a lot of games don't work on them. And… have you even tried Linux, or is it just something you assert? I'm really curious about that…

Stuck in the 70s? Right, the Linux kernel was first released in 1991, but why not…
shaddim: I meant the outdated architecture behind it: Unix. It was a historical mistake to build a free and open-source DESKTOP OS on the model of an (outdated) server OS. As the Unix design is also older than the invention of the PC concept, several innovations that came with the PC are still missing in Linux, for instance the clear separation between OS and apps.
See above: repositories are centralized for the user, but there is no obligation to centralize them on the server side. The package manager presents every repository you need as one centralized view, while itself drawing on a very large number of repositories (tens, hundreds…).

Porkepix: POSIX is still a very good standard, and it's not for nothing that I'm still being taught it at university.
Win32 is just as old, but its documentation is far less useful, and the same goes for usability.
Be curious. Try, for example, to create a little threaded program with:
- POSIX threads
- Boost, if you use C++ instead of C
- Win32
Then tell me which one you find better to use, and which one has the better documentation ;)

About DirectX? OpenGL, SDL, and everything the Khronos Group is working on are a perfect alternative.
You now have Unity 3D too, and probably others that I don't know of, as it's not my specialty at all.
shaddim: I indeed used pthreads some years ago for cross-platform development. The Linux port caused me no problems besides a performance one: the thread-creation overhead on Linux was much bigger than on Windows, and the pthread lib provides no easy thread-pool/thread-reuse facility to compensate for that. I had to implement one myself, which was a pain in the ass. But this is only one example. Another is the Linux audio infrastructure: even PulseAudio creator Poettering agreed that it is hard to know which library to use. Yet another might be the fragile compatibility of glibc... and so on. Such issues don't exist on real desktop OSes, which are designed as coherent platforms.
For my part, I had to use threads for a project. I tried POSIX and Boost threads: they worked fine, with very good documentation and portability, and were pretty easy to use. But then my teacher required me to use Win32's. The documentation was old (this was 4 years ago… he gave me documentation that hadn't been updated since Windows 95), very confusing, and pretty hard to use… and, worst of all, not portable at all.
It's not even easy to find precise documentation on the Internet, because the whole Win32 API is just a huge, confusing pile of things, and finding good documentation for everything is a pain in the ass (I'd like to put it another way, but the vocabulary escapes me ;( ). On my side, I just RTFM: open my terminal, man pthread_create or whatever, and that's fine.

The choice is very easy for my part.
C++11 has a language-level abstraction for threads. Anyway, that or Boost threads translate into the native ones (pthreads or Windows threads) on each system.
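For illustration, here is what that language-level abstraction looks like in practice; `std::async` and `std::future` dispatch onto the platform's native threads (pthreads on Linux, Win32 threads on Windows) under the hood. A minimal sketch with an invented function name:

```cpp
#include <future>
#include <numeric>
#include <vector>

// Sum a vector on two threads using only standard C++11 facilities.
long sum_on_two_threads(const std::vector<int>& v) {
    auto mid = v.begin() + v.size() / 2;
    // The first half runs asynchronously on another thread...
    std::future<long> first = std::async(std::launch::async, [&v, mid] {
        return std::accumulate(v.begin(), mid, 0L);
    });
    // ...while the calling thread sums the second half.
    long second = std::accumulate(mid, v.end(), 0L);
    return first.get() + second;
}
```

The same source compiles unchanged on Linux, Windows, and Mac OS, which is the portability point being made here.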
Side news: Valve is using Debian stable as the base for SteamOS.
shmerl: C++11 has a language-level abstraction for threads. Anyway, that or Boost threads translate into the native ones (pthreads or Windows threads) on each system.
And that's, imho, the better way to do it when the language is a multi-platform one.
shmerl: C++11 has a language-level abstraction for threads. Anyway, that or Boost threads translate into the native ones (pthreads or Windows threads) on each system.
Porkepix: And that's, imho, the better way to do it when the language is a multi-platform one.
Every intermediate library has to be translated into native system threads. I used pthreads syntax on Linux AND Windows to stay portable (http://sourceforge.net/projects/pthreads4w/); despite that additional layer on Windows, pthreads performed better there (creation performance).

A good article about the troubled history of Linux and threads is here: http://www.drdobbs.com/open-source/nptl-the-new-implementation-of-threads-f/184406204

shaddim: No, they are only a good idea for the core part, the OS. Using package management to glue everything together (OS and apps) is conceptually very wrong. In the words of Debian founder Ian Murdock: "moving everything into the distribution is not a very good option. Remember that one of the key tenets of open source is decentralization, so if the only solution is to centralize everything, there's something fundamentally wrong with this picture." Kernel developer Ingo Molnar also recently wrote an essay with the same focus: "Desktop Linux distributions are trying to 'own' 20 thousand application packages consisting of over a billion lines of code and have created parallel, mostly closed ecosystems around them. The typical update latency for an app is weeks for security fixes (sometimes months) and months (sometimes years) for major features. They are centrally planned, hierarchical organizations instead of distributed, democratic free societies."
Porkepix: It depends on which distro you're talking about; they all just have different policies, for example Ubuntu's PPAs, Arch Linux's AUR, and so on.
You'll never find an OS more up to date than Arch Linux: updates land in the AUR at most 24 hours after their release.

Btw, the package manager doesn't glue everything together. People still put packages into the distro repositories because they find it more practical and don't want the bother of creating and managing their own repository. But some companies did it; for example, the Opera browser offers its own repository for Debian (http://deb.opera.com/). More companies or people could just do that, and hence not be tied and glued to a distro/system.
"Distro fragmentation" at its best, as this approach is built around a central distro repository. It is the opposite of portability: it is crazy for an ISV to build and test dozens of packages even for just the major distro variants.

And a rolling release is a kind of Russian roulette for an end user (and not an option at all for commercial software).

About PPAs (and similar hackish backport approaches): the downsides are described in this bug report (and its comments): "It's easier and safer to install the newest versions of popular open source software on Windows than on Ubuntu" (or: why it's high time Ubuntu made upgrading to stable versions of software easier and safer).
Post edited December 15, 2013 by shaddim