Posted April 04, 2015
I've been thinking, and I've concluded that console generations are way too short.
Console generations have lasted as little as 3 years, normally 5-6, and 8 for the 360. But if console generations were all 8-10 years, the exponential growth in the cost of making games wouldn't be nearly as severe as it is today. It grew too much, too fast... On top of that, the hardware would have been cheaper to put out at the get-go.
Consider: it takes devs 3+ years to get a feel for the hardware, which leaves them time for only one more game (or, if they're a huge company, 3-4 more) before the console reaches its expected end at 5 years... But if the generation length were always 8+ years, they'd have 3 years of early titles and 5 years of good titles where they really push the hardware.
Yeah, graphical fidelity and quality might be lacking, but it's clear the PC market is now the only place where the best is present. That wasn't always the case; for years and years the console market was ahead of PCs: the Genesis, SNES, PS2, GameCube, and 360 all showed they were way ahead of the curve...
But graphics have never been the big thing that brought people in. Pretty graphics are nice, but every time companies push for new technology and refuse to accept anything else, they leave their strengths behind, take on something new, and usually end up with shoddy games (Earthworm Jim 3D, for example).
Suppose they had set a path back in the SNES days, a projection of where gaming systems were going far ahead of time: "the next generation will have roughly these specs, within 25%; in 10 years, these specs, within 25%," and so on. Then they could start developing and planning the next console long beforehand, and devs could get their hands on the figures and plan accordingly. As hardware grew stronger and cheaper, they could build the devices fairly cheaply.

If the systems stayed on the same basic hardware (say ARM, or x86), they could also have planned for backwards compatibility. Better yet, games could know the system would get better over time: a game with a lower frame rate could take advantage of a 4x-faster processor for smoother gameplay, perhaps auto-upgrading resolution using algorithms in the background, increasing model polygon counts, adding anti-aliasing, etc. 3D games would (hopefully) improve with time, and devs could even sell simple upgrade DLC that enhances a game for future hardware rather than having to remake/remaster it whole.
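The auto-upgrading idea could be sketched roughly like this. This is purely hypothetical: the names (`Settings`, `scale_settings`, the platform-reported speed multiplier) are made up for illustration, and the "spend headroom on resolution first, then detail and AA" policy is just one possible choice, not anything a real console SDK does.

```python
from dataclasses import dataclass

@dataclass
class Settings:
    resolution: tuple      # (width, height) in pixels
    polygon_detail: float  # 1.0 = launch-hardware model detail
    antialiasing: int      # AA sample count (0 = off)

def scale_settings(base: Settings, speed: float) -> Settings:
    """Scale a game's launch settings given a platform-reported speed
    multiplier relative to the original hardware (hypothetical API)."""
    # Doubling resolution needs roughly 4x the fill rate, so resolution
    # scales with the square root of the speed multiplier, capped at 2x.
    res_scale = min(2.0, speed ** 0.5)
    return Settings(
        resolution=(int(base.resolution[0] * res_scale),
                    int(base.resolution[1] * res_scale)),
        # Whatever headroom is left after resolution goes to model detail.
        polygon_detail=base.polygon_detail * min(2.0, speed / res_scale ** 2),
        # Turn on AA once the hardware is at least 4x the launch spec.
        antialiasing=max(base.antialiasing, 4) if speed >= 4 else base.antialiasing,
    )

launch = Settings(resolution=(640, 448), polygon_detail=1.0, antialiasing=0)
print(scale_settings(launch, 4.0))
# → Settings(resolution=(1280, 896), polygon_detail=1.0, antialiasing=4)
```

On 4x hardware the whole budget here goes to doubling the resolution and enabling AA; a bigger multiplier would start raising model detail too.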
This also means each system would bring a huge, noticeable upgrade each generation, like we saw between the 8-, 16-, and 32-bit systems.
Although nobody could have planned for how things have turned out today, Moore's law was pretty accurate for a while and is probably still used as a rough reference. If Moore's law held (hardware doubling every 18 months), a 10-year life cycle would technically allow for enormous growth, so even a conservative plan could target each generation being a solid 4x-6x faster, with resolution doubling each time and graphics accelerators likewise growing 4x-6x per generation. If hardware growth doesn't keep pace with expectations, the consoles can keep growing at their own rate until they catch up to the cutting edge, then stop there.
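For reference, the compounding works out like this. Strict doubling every 18 months over a full 10-year cycle gives roughly 100x, which is why a 4x-6x generational target (about 3-4 years' worth of doubling) is the conservative end of what Moore's law would allow:

```python
def growth_factor(years: float, doubling_months: float = 18) -> float:
    """Compound performance growth under a fixed doubling period."""
    return 2 ** (years * 12 / doubling_months)

print(round(growth_factor(10)))    # ~102x over a full 10-year cycle
print(growth_factor(3))            # 4.0x after just 3 years of doubling
```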
Honestly, we're already there. Seeing how much trouble they're having getting games to run at 1080p/60fps, I seriously doubt they can push for 4K gaming. Maybe you can do it on PC, but higher resolution alone doesn't really seem to do much (although it removes the need for AA and some effects) and takes a lot more processing power; probably so much that a $3,000-or-more rig with high-end graphics cards is needed.
But gaming was always about having fun, not being the most graphically impressive. Honestly, I can live with PS2 graphics or sprites; Diablo 2 looked awesome and isn't too intensive graphically or hardware-wise. Why people are pushing to get past the uncanny valley, I don't know, because as games get pushed harder graphically, the gameplay and mechanics suffer, and it's the mechanics that make the game, so the games themselves get hollowed out. I'll still pick up and play Mega Man 2 and love every minute of it (even if I can beat it in an hour or so); graphically intensive games like Beyond: Two Souls, probably not nearly so much.