kohlrak: And C++ itself is changing, too, so legacy code is no longer compatible with modern toolchains. Granted, the web may not be one of those things, but there are things that should not change. With all the power we have in technology, why is it so difficult to get some sort of reliable standard for backwards compatibility? Why is it that, when a program can take 5 years just to make, we feel that 5 years "is enough"? Why can't we keep a computer for 10 or more years, when they're so expensive? With Wirth's Law in perspective, I really don't see things improving, overall.
Darvond: <marquee>, <blink> and several other elements of HTML are no longer valid for use in today's websites.
(Just look at this Mozilla documentation repository for reference.) Nobody in their right mind uses ActiveX anymore; even Microsoft discontinued support for it. By your logic, serial and parallel ports should be the de facto standard, even though they're a pain to use and develop for, and PS/2 should still be in use even though it can lock a system up if disconnected while in use.
You're making a strawman argument. Believe it or not, these ports are still used, just not by regular people. The idea is that the option to have or use these things should remain available. For legacy hardware ports, a PCI card would suffice (and PCI is still a useful standard that will probably never go away).
Why should someone have to be held to Soviet levels of backwards compatibility when things are improving and progressing? Should I really be held back by a 16-bit address bus when effectively 0% of computers in use today are 16-bit? Surely even you understand that the internet has gone through 3 major eras: Web 1, Web 2, and Web 3. I suppose IoT, if it ever takes off, could be Web 4.
Actually, your computer, when turned on, usually starts in 16-bit real mode with the A20 address line gated off, so addresses wrap at 1 MiB just like on an 8086. It then enables A20 for the full 20-bit address range (classically via a spare bit on the keyboard controller, the same chip behind the PS/2 port) before switching to the higher modes. Yes, your computer actually does this already. And that compatibility doesn't require removing hardware to maintain.
As for your citation of Wirth's Law, do you really understand how hard it is to optimize software when you have a PHB yelling at you to get it done yesterday? At least today, patches can be shipped that optimize or even bring things back into very good shape. (KDE's devs recently went and improved their entire frameworks a ton, and No Man's Sky is practically a new game at this point.)
And with every employee that accomplishes this, the requirements get harder. I understand, but I also understand that not every developer is given those strict deadlines. What about indie game devs? If all I'm doing is buying new hardware so I can do the same things with new software that I could do with the old, am I really experiencing progress? When I can play Skyrim but not Minecraft on my computer, 'cause I don't have the money to upgrade, should I be expecting much progress? Swap Minecraft for just about any new game with nice pixel graphics and a weak engine. The reason requirements grow exponentially is that no one's optimizing.
thealtmaster: this guy gets it
Honestly, I'm just salty that I spent years learning programming and what's under the hood, only to constantly be told that X needs to update, including the programming language itself, and that my tried and true methods are no longer valuable, 'cause someone isn't happy. ISO is in the process of updating both C and C++, and I don't even want to know what else they're stepping on. But I bet they won't update the standard for CDs to make it any easier to implement or read. That one was a real pain in the ass back when I worked with it: they defined the terms for sections and sub-sections exhaustively, and I don't just mean making sure every word in the English language was covered; I mean they actually defined the same words repeatedly. When I finally caught on to this (which was harder than it sounds), I realized the whole thing could've been shrunk by about 80%, which really pissed me off, 'cause I had printed out the standard just so I could read it during my study halls.
EDIT: To be clear, I would not be bothered if they actually made sensible updates to standards to make them much easier to comply with or even use. Certain things are out of control and are long overdue for standards, or for updates to the standards they have. Take video cards, for example: how long has it been since the actual method of communicating with them was updated? Why is there no standard for something as simple as an ISR for vsync? And what about other hardware? UEFI was supposedly going to make big changes that would render (at least some) drivers universal, but that never happened, either.