I've been thinking, and I've concluded that console generations are way too short.

Console generations have lasted as little as 3 years, normally 5-6, and 8 for the 360. But if console generations were all 8-10 years, the exponential growth in the cost of making games wouldn't be nearly as severe as it is today; costs grew too much, too fast. On top of that, the hardware would have been cheaper to produce at launch.

Consider: it takes 3+ years for devs to get a feel for the hardware, which leaves them time for one more game (or, for a huge company, 3-4 more) before the console's expected 5-year lifespan ends. But if the generation length were always 8+ years, they would have 3 years of early titles and 5 years of good titles where they really push the hardware.

Yeah, graphical fidelity and quality might be lacking, but it's clear the PC market is now the only place where the best hardware is present. That wasn't always the case; for years and years the console market was ahead of PCs. The Genesis, SNES, PS2, GameCube, and 360 all showed they were way ahead of the curve...

But graphics have never been the big thing that brought people in. Yeah, pretty graphics are nice, but each time developers chase the new technology and refuse to build on what already worked, they leave their strengths behind, take on something new, and usually end up with shoddy games (Earthworm Jim 3D, for example).

Assume they had a set path: a projection, made way back in the SNES days, of where gaming systems were going. They could have said, "The next generation will have roughly these specs, within 25%; in 10 years, these specs, within 25%," and so on. Then they could start developing and planning the next console long beforehand, and devs could get their hands on the figures and plan accordingly. As hardware grew stronger and cheaper, they could then build the devices fairly cheaply. If the systems kept the same basic architecture (say ARM, or x86), they could also have planned backwards compatibility. Better yet, games could be written knowing the system would get better over time: games with lower frame rates could take advantage of a 4x faster processor for smoother gameplay, perhaps auto-upgrading resolution using algorithms in the background, increasing polygon counts for models, adding anti-aliasing, etc. 3D games would improve with time (hopefully), and devs could even sell simple upgrade DLC that enhances games for future hardware rather than having to remake/remaster games wholesale.
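
A rough sketch of that "games scale themselves to future hardware" idea (the names and numbers here are hypothetical, not from any real console SDK):

```python
# Hypothetical sketch: a title queries how fast the console revision it is
# running on is relative to the launch hardware, then scales its settings.
BASELINE_GFLOPS = 100.0  # assumed launch-hardware throughput

def scaled_settings(current_gflops: float) -> dict:
    """Pick frame rate, resolution, and AA from the speed multiplier."""
    speedup = current_gflops / BASELINE_GFLOPS
    return {
        "target_fps": 60 if speedup >= 2.0 else 30,
        # Rendered pixel count scales roughly linearly with GPU power, so
        # the linear resolution scale grows with the square root of speedup.
        "resolution_scale": min(2.0, speedup ** 0.5),
        "antialiasing": speedup >= 4.0,  # only with spare power left over
    }
```

On a revision 4x faster than launch hardware, this sketch would pick 60 fps, double the linear resolution, and turn anti-aliasing on; on launch hardware it falls back to the original 30 fps settings.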

This also means each system would get a huge upgrade each generation, a noticeable improvement like the jumps we saw between the 8-, 16-, and 32-bit systems.

Although we couldn't have planned for how things have turned out today, Moore's law was pretty accurate for a while and is probably still used as a rough reference. If Moore's law held (hardware doubling every 18 months), a 10-year life cycle could see solid hardware growth, with each generation 4x-6x faster, resolution probably doubling each time, and graphics accelerators also growing 4x-6x per generation. If hardware doesn't improve as quickly as expected, the consoles can simply keep growing at their own pace until they catch up to the cutting edge, and then stop there.
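
For the arithmetic (a sanity check, not from the post): a literal 18-month doubling compounds to roughly 100x over 10 years, so a 4x-6x gain per 10-year generation actually corresponds to a much slower doubling every 4-5 years, i.e. a deliberately conservative reading of real console gains.

```python
# Compound growth under a "doubling every N years" rule.
def growth_factor(years: float, doubling_period_years: float = 1.5) -> float:
    """Total speedup after `years`, doubling every `doubling_period_years`."""
    return 2 ** (years / doubling_period_years)

# Literal Moore's law over a 10-year generation: 2**(10/1.5), about 100x.
# A 4x-6x gain over 10 years implies doubling every ~3.9-5 years instead.
```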

Honestly, we're already there. Seeing how much trouble they're having getting games to run at 1080p/60fps, I seriously doubt they can push for 4K gaming. Maybe you can do it on PC, but higher resolution alone doesn't really do much (although it removes the need for AA and some effects) and takes a lot more processing power; so much, probably, that you'd need a $3,000-plus rig with high-end graphics cards.

But gaming was always about having fun, not being the most graphically impressive. Honestly, I can live with PS2 graphics or sprites. Diablo 2 looked awesome and isn't too intensive graphically or hardware-wise, so why people are pushing to get past the uncanny valley, I don't know. As games are pushed harder graphically, the gameplay and mechanics suffer, and it's the mechanics that make the game; the games are being hollowed out. I'll still pick up and play Megaman 2 and love every minute of it (even if I can beat it in an hour or so), but graphically intensive games like Beyond: Two Souls, probably not nearly so much.
With consoles I find it better to get in late. With PCs it's the opposite. A friend of mine just got a PS4; right now there are only around 300 games for the PS4 in total. Last year I got a PS3. There are thousands of games available for the PS3, plus I spent under $200 for mine, whereas my friend spent over $500 on his. Less money, more games... or more money, fewer games. And what's the upshot of the newer system? Better graphics and some bragging rights?

Meh. Stick with the basics and wait for a deal. You'll enjoy it more, save money, and feel like a boss knowing you saved when your friends were being tech-chumps.
But if they planned the consoles correctly, with proper backwards compatibility and games that could take advantage of future hardware advancements (CPU/GPU power mostly), then it would no longer be "there are only 200 games for this system"; it would be "there are 200 games, plus 1000 from its previous libraries." Adoption of newer systems would also be instant, because you wouldn't need multiple machines to play the games you have or want. The whole sitting on the fence because you're still playing the previous gen's games wouldn't happen; instead you'd either upgrade, or wait for the hardware to critically fail, then buy the new system and transfer all your saves/prefs over.

The way it's done today is horrible... :( Seriously...

They seriously should also have sat down with the competing platform holders and said: "Hey, we're competing, but we're both selling our systems for $300, and you can only play one at a time. What if you could hook one to the other and use a simple API to take advantage of the other machine's hardware to boost a game, assuming you have both systems?" That would effectively share the CPU, RAM, and GPU, but not the other system's games. It would let the two separate systems help each other if you happen to own both; not quite an upgrade, but some games could even require both systems and ship with two discs, one for each, with one console handling the output while they work in conjunction...

*Sigh*....
...waiting for TinyE to make a joke about length...

I still play PS2 games. I was late to the PS3 market, and it'll probably be another few years before I jump on the PS4 bandwagon. I'd rather have good games than "Look how pretty this is; hopefully you won't notice the game sucks."
Post edited April 04, 2015 by DieRuhe
DieRuhe: ...waiting for TinyE to make a joke about length...
How long are you willing to wait?
DieRuhe: ...waiting for TinyE to make a joke about length...
mrkgnao: How long are you willing to wait?
As long as it takes for him to come.
mrkgnao: How long are you willing to wait?
Elenarie: As long as it takes for him to come.
That's not long.
rtcvb32: But graphics have never been the big thing that brought in people.
.
.
.
But gaming was always about having fun, not being the most graphically impressive.
Maybe not for the users of this site, but for the general games-buying public, I think you are dead wrong.
DieRuhe: ...waiting for TinyE to make a joke about length...
That might be too sensitive a subject for him...