kohlrak: You seem to be missing the big picture of what I'm saying, largely because you must not be aware of what's going on.
Magnitus: I haven't done any C++ in close to 10 years so no, I'm blatantly unaware of what is going on. C++ is great for GPU-intensive games, but it's no longer a big driver in networked applications.
I'd argue C++ isn't even relevant for games anymore, because of their attempts to "keep it relevant."
kohlrak: C++ is being updated under ISO, and the representative from Google is trying to change the language to suit Google's needs. The most recent controversy is the demand to break the ABI. Of course, this is a big deal for GOG customers, but trying to explain how this could affect people here is not likely to be well understood. I understand C++ works by sitting on top of C (something else they're desperately trying to change). Python2 vs Python3 is more like C++ of 2001 vs C++ as of the next update or two. The reason for this, obviously, is that Go never caught on, and since C++ is still way bigger, they need to change the language to force people to use Go (by turning C++ into Go). Supposedly Go is more popular, but that's outside the scope of my point.
Go is extremely popular in the cloud. If you're devops and want to extend Kubernetes or write Terraform providers, it really pays to know Go.
Overall, I don't think Go and C++ are really competing. They occupy different niches.
C++ is still the winner for lower-level things that need to squeeze every inch of performance out of the hardware. Golang has taken a lot of the system-level things above that. It's just a lot faster to write something correct in Go.
Well, here's the irony: the arguments from Google's rep are based on the idea that C++ is no longer able to keep up with other languages because the ABI is inefficient (I would agree, but not on the basis that it's because of the C++-specific things). The big picture, now, is to break the ABI to make a C++-specific version. Essentially, this would end up breaking all the DLLs out there that export functions beginning with _Z (the prefix for mangled C++ symbols).
kohlrak: My point is, we, as developers, need to start supporting things that are far, far more stable and reliable for long-term purposes. Projects like GOG repo can't be breaking (or putting the very busy dev under heavy stress) every 2 or 3 years (yes, I know Python's cycle is longer than that, but for how long?) because someone thinks they have a way to change a language's syntax to force coders to "make better choices." While this is great for making busy-work for coders so they can keep jobs on "completed projects," it means people like Kalanyr have to spend more time porting the code than working on the other goals they had. I mean, here we're talking about Python2 vs Python3, but I'm already looking into whether these Python4 projects are real or whether people are pre-emptively saying "Don't you even think about it." (Of course, I'm leaning towards the latter.)
If you want stable and reliable, give people a completely self-contained binary, ideally one for each platform supported with as few assumptions as humanly possible (ideally, a web page for the gui, almost everyone has a browser). Anything else will give you pains.
I know, I've been writing tooling for devs for nearly a decade now. I've seen it all.
Wrong version of Node. Wrong version of Python (sometimes differing only by a minor version). Doesn't install well on Windows because of x. Doesn't install well on macOS because of y. Etc.
And of course, everybody just loves the "He tried to install Python 3 beside Python 2, with pip for both and some libraries, and now it's all a jumbled mess... nobody is quite sure what he did exactly, but he really needs to format his machine, because nobody wants to troubleshoot this mess... formatting is quicker".
For a bit, I thought Docker was the magical solution to all that, until I realised most people are not motivated enough to absorb that bit of complexity (even though it would rock their world, open up countless possibilities for them and make them way better devs than they were before, they just will not learn it).
Try to avoid, as much as possible, writing anything that assumes something (the correct version of an interpreter, a java vm, linked libraries, whatever) is installed on your user's machine. If you do, there will be pain.
Sometimes, you have no choice, but if you have a choice, avoid it.
PS: Python is still awesome for server-side and pipeline scripting, where you have tight control on the environment. I would just not use it anymore for a client that I pass around to users.
That's a similar conclusion I came to, but now we're looking at threats of damaged binaries. Making binaries that are completely self-contained is untenable, because, at the end of the day, something must call up to the OS. What I've opted for (and argued for above) is self-contained code: while I can't prove C is safe, it's also the hardest thing to change without completely breaking everything out there. While I wish I could use some of the automatic features of C++ to help me (constructors and destructors), I can't rely on them. Instead, I've been making my own versions of things like "smart pointers" in C, basically building a C-friendly library of the things I use from C++, because I can't trust anything anymore. I seem to be slowly finding a lot of scattered people out there who are about sick of this crap too, especially in regards to languages like PHP: people are sick of having to rewrite code from scratch for simple projects like gogrepo that shouldn't need much more than minor security and feature updates.
While a lot of the time these things are said to improve the security of code, or to force coders into better practices or something, usually these updates just end up causing coders to rush development and abandon plans to improve certain algorithms, because they also have other things to be coding.
As for Docker, I've been very unimpressed. Don't get me wrong, it's a cool idea, but I foresee it used in horrid ways. How long until, at one extreme, the IDE is only 30% of your screen, and at the other, your boss locks down your Docker container and you have to ask your manager for permission to open Google to search a problem? I think Docker has potential from a user-customization perspective, but my experience with the average user is that they won't spend the time learning how to use it. As for coders, this has the potential to be a nightmare if they find ways (assuming they haven't already) to lock it down for the reasons stated above.
When looking at magical solutions, I think we need to take a step back and look at how these always turn out. I think OOP is easily the clearest example. Object-oriented programming was supposed to be the big thing that ended all complexity, because you would only worry about your current level of abstraction. There were a number of problems with this, and they can easily be explained in terms of "the double-free bug." You see, object-oriented programming was an attempt at a simpler way to address a common problem, not actually a magic solution to said problem. Fundamentally, at the end of the day, someone needs to free the resource, and the question reasonably becomes whose job that really is. The fundamental problem C++ set out to address was to make it so you didn't have to think about such things, but that required someone actually doing the thinking and declaring it in the object description. So, then, someone catches that there is a memory leak, so the coder using the object frees the memory himself; then the library is updated and the maker of the class has decided he needs to fix that bug himself. Now you have a double-free issue, which resulted from the opposite issue: not freeing. Prior to OOP, the question of "whose job is it, anyway?" was always present, and OOP aimed to solve that, but it actually failed to do so. However, it looks really nice in its attempts. I'd argue OOP's actual gains weren't the things it set out to do, but the things it managed to accomplish without setting out to do them: better documentation of available APIs (via the nice little drop-down menus in IDEs after you type a dot), and automatic constructors and destructors (most people don't realize that C++ actually does have a garbage collector, if anyone would ever actually use it).
On the flip side, then we got abstraction hell, which, due to how we compile code, often ends up hogging a lot of RAM (because polymorphism usually results in function pointers to function pointers to function pointers, ad infinitum, well beyond what the cache can take without misses). (I did some googling and some people are complaining specifically about encapsulation, but I believe this is the result of the fundamental issue I stated above, not a lack of encapsulation [because encapsulation would happen naturally].)
That said, contrary to what I wrote above, I actually like OOP. OOP provides wonderful tools for encapsulation, once everyone crosses their t's and dots their i's. To that end, I'm sure Docker's great, but we should also ask ourselves what it'll really cost. That goes for any idea.