Fenixp: a) i7s are not that common, and my i5 is not handling the game nearly as well.
b) While my VRAM is not capped, my GTX 970's processor is not exactly chill when chewing through Witcher 3.
With power comes responsibility. After all, we are PC Master Race. /grin
If a game is properly made for PC (i.e. not like Project CARS, Assassin's Creed: Unity, or Batman: Arkham Knight), it's only fair to have higher system requirements in exchange for better everything, not just a slightly farther draw distance, a statistically insignificant handful of extra grass patches, and "ultra" settings that somehow stutter-lock your PC while the difference between them and console visuals would take a forensics team to find (hint: they won't find any).
On the serious side, true, i7s are more work-oriented CPUs, and I don't have an i5 at hand to verify it personally, yet none of my buddies with one were complaining about W3 performance. Neither were those with AMD CPUs, and that I can verify myself (FX 6350 and 8350). Even an i3 works fine if coupled with an adequate GPU - the game may use only 2 GB of VRAM, but if you prefer higher settings, a faster GPU is welcome.
As for temperatures, my 290X doesn't really run that hot while playing W3, even in Crossfire mode (which still has issues, and I don't think that's AMD's fault), where the primary GPU runs hotter; temperatures are still lower than in some other games, not to mention stress tests or benchmarks, which really warm GPUs up. In addition, even the single-GPU load graph shown by, say, GPU-Z shows quite uneven load distribution. Given the jumps across various timeframes, this may be a sign of less-than-great optimization. Or so I've heard.
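For what it's worth, you can put a number on that "uneven load" impression instead of eyeballing the GPU-Z graph: log the sensor readings to a file and summarize the trace. A minimal Python sketch, assuming you've already pulled the load percentages out of the log into a plain list (column names and sampling rates vary by tool and card, so the parsing step is left out):

```python
# Minimal sketch: quantify how "uneven" a GPU load trace is.
# Input is a plain list of utilization percentages, e.g. extracted
# from a GPU-Z sensor log; the thresholds here are arbitrary.
from statistics import mean, stdev

def load_unevenness(samples: list[float], low: float = 80.0) -> dict:
    """Summarize a GPU load trace: average, spread, and the share
    of samples where the GPU sat below `low` percent utilization."""
    return {
        "mean": round(mean(samples), 1),
        "stdev": round(stdev(samples), 1),
        "below_low": sum(s < low for s in samples) / len(samples),
    }

# Synthetic trace with the kind of dips you might see in-game:
trace = [99, 98, 97, 60, 99, 55, 98, 99, 40, 97, 99, 96]
print(load_unevenness(trace))
```

A high standard deviation, or a large share of sub-80% samples, would back up the "uneven load" reading; a flat 97-99% trace would not.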
Fenixp: <snip>
The final game needs to handle a lot more than just that.
Yeah, I read that in the Eurogamer interview I mentioned earlier. And that's where I get curious - how big was that world, actually? How many NPCs were tracked, and how exactly are they tracked? Are they all simulated in real time, or is there something similar to the X-universe approach (okay, that was a sector-based game, not "open world"), where calculations run differently for NPCs you can see and for those you cannot? Or is it something GTA V-like, where NPCs are spawned out of thin air? Moreover, I may be wrong, but while observing in-game NPCs I did not exactly see much "life simulation". I tried to follow several NPCs; it's nothing like the Elder Scrolls schedules, where the Skingrad butler could go shopping and fall off a bridge. Judging by the CPU load, I don't really think Witcher 3 has much trouble computing all those unnamed NPCs' schedules and behaviour.
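The X-universe-style split described above - full simulation only for NPCs near the player, a cheap schedule lookup for everyone else - can be sketched roughly like this. All names and numbers here are hypothetical, purely to illustrate the technique; nothing below is CDPR's actual code:

```python
# Hypothetical sketch of distance-based simulation LOD for NPCs:
# nearby NPCs get a full per-tick update, distant ones are merely
# advanced along a coarse daily schedule.
from dataclasses import dataclass

SIM_RADIUS = 100.0  # metres within which NPCs are fully simulated

@dataclass
class NPC:
    name: str
    x: float
    y: float
    activity: str = "idle"

def full_update(npc: NPC) -> None:
    # Pathfinding, animation, combat checks, etc. would go here.
    npc.activity = "fully simulated"

def schedule_update(npc: NPC, hour: int) -> None:
    # Distant NPCs just snap to whatever their timetable says.
    npc.activity = "working" if 8 <= hour < 18 else "sleeping"

def tick(npcs: list[NPC], player: tuple[float, float], hour: int) -> None:
    px, py = player
    for npc in npcs:
        near = (npc.x - px) ** 2 + (npc.y - py) ** 2 <= SIM_RADIUS ** 2
        if near:
            full_update(npc)
        else:
            schedule_update(npc, hour)

npcs = [NPC("blacksmith", 10, 0), NPC("fisherman", 500, 500)]
tick(npcs, player=(0.0, 0.0), hour=12)
print([(n.name, n.activity) for n in npcs])
```

The GTA-style "thin air" approach would go one step further: distant NPCs wouldn't exist as objects at all and would only be instantiated when the player gets close.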
So I think the best proof of their words would be to just give us those visuals, so anyone could enjoy the original palette, accompanied by the earlier smoke FX and draw distance. Offering extended settings, a la the ArmA series, where you can adjust practically anything you want, including draw distance, would also be nice. Maybe we can do the same by digging through various .ini files, but come on, it's 2015 - where is the usability in that? I doubt you go down into the mines to chip some ore when you need nails.
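The ".ini digging" in question looks roughly like this - a throwaway sketch using Python's configparser on a made-up settings file. The section and key names (`Rendering`, `DrawDistance`, and so on) are hypothetical illustrations, not the game's actual config:

```python
# Hypothetical sketch: bump a draw-distance value in an ini-style
# settings file. Section/key names are made up for illustration.
import configparser
import io

sample = """\
[Rendering]
DrawDistance = 1000
GrassDensity = 0.5
"""

cfg = configparser.ConfigParser()
cfg.optionxform = str  # keep key case as-is instead of lowercasing
cfg.read_string(sample)
cfg["Rendering"]["DrawDistance"] = "4000"  # the "ultra" we actually wanted

out = io.StringIO()
cfg.write(out)
print(out.getvalue())
```

Which is exactly the usability point: a slider in the options menu does the same thing without making the player act as their own config parser.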
An easier method would be a real-time video interview; I guess we have a few psychologists around who could easily tell whether CDPR are lying or not. :p
Fenixp: There's a difference between bullshit and enthusiasm.
Enthusiasm is fine as long as it does not involve financial transactions between the parties. That's where enthusiasm gets substituted with something different. You can write a very intricate EULA, but it won't change the point.
Fenixp: Most gaming developers I have ever met loved the games they were developing, regardless of them being called "Mafia" or just some random hidden object games. That was always the unifying factor I felt when conversing with them - they were willing to work under fairly shitty conditions just because they wanted to see their product finished.
The most interesting work does not always offer the best salary.
I've heard from various developers how happy they are to work somewhere, how they can't wait for the next morning so they can come to work again, what anticipation they feel waiting for next Monday, because every weekend is torture spent away from work... And then you ask them why their game is that bad. ¯\_(ツ)_/¯ Oh well.
All I want to say is that things are not all sunshine and bunnies, because, well, they are not. Some abuse their workers' enthusiasm; some just publish shovelware games (hint: some have won two Golden Poos in a row, while others cannot, because they aren't in the US), negating all the positive emotions generated on a personal level.
Fenixp: What that also means, however, is that when they're told their product will be showcased to general public, they'll go nuts over making it look as good as possible. There's no dishonesty involved - only enthusiasm for the product they were working their ass off to create for years, and desire to share their vision with the world. I'm not going to claim PR won't abuse this enthusiasm, but that's another matter entirely, there's very little they can do about the final demo they'll be showcasing.
Yeah, just like in, ah, Aliens: Colonial Marines, right? Or Batman: Arkham Knight? The chipmunked Nvidia GameWorks video? Pure enthusiasm, a desire to make it look as good as possible... To share their vision, or to sell as many copies as possible, knowing that even if the game is not exactly good (okay, it's rubbish), people won't really issue a wave of refunds and will wait until the game is patched, as it usually goes?
Don't get me wrong: true, I'm a sarcastic guy, I'm not hiding that, but I like games. Not like Anita; unlike her, I do like games, and I play them. And I criticize them if I like them (or if there's just a good target for pinching, like The Sims - I don't play them, but that "always online" is just such a good reason to bite them for it), because I want to see them become better. That's why, when I see practices like these, I will shove developers' mugs into the mud, because these practices are bad, unhealthy, and promote negligence and lack of responsibility. Everyone who defends such practices only encourages developers to play this trick again and again. The only thing a developer will get is a loss of trust, and no amount of "free" DLC will change that.
Nobody criticized CDPR for that awesome "Night to Remember" trailer, because it is great, and it is a trailer. Solutions to avoid situations like these? 1. The lazy way: write "this video does not represent the final quality of the product (which will most likely deteriorate because of console hardware limitations)" :P. 2. The honest way: show what you can actually show now, without bullshots or a vertical slice. I'm not a big fan of Bethesda's craftsmanship, but their way of showing rather ugly yet actual visuals is impressive. And if Bethesda manages to make the game prettier later - even better.
Leave visions where they belong. Otherwise, showing something unobtainable is outright misinformation, no matter how much I like and respect a given developer. Yes, Sand is one of my favourite characters in NWN2. /grin
Fenixp: Now, I don't want to put the blame on consumer, but it's also quite difficult for me to fault developers for wanting to create the best product imaginable.
And nobody blames them for it. But you can't have everything: either you provide better visuals, physics, gameplay, and immersion for one platform, or you follow the herd that believes there is far more money on consoles and downgrade or tailor your game for them. You can't have both, unless you really do want the best of both worlds and have created a wonderfully scalable engine that automatically adjusts everything to the platform it runs on. Which is not the case here. Judging by the recently published numbers, the calculations I made more than two months ago on CDPR's revenue from Witcher 3 sales were correct. One more reason for me to doubt the "this game exists only because of consoles" statement.
Fenixp: As you said in your post, one way out of this would be more transparent development of videogames - but that comes with its own share of pitfalls.
But that's fine. It's better to see how a game starts to emerge from lines of code, crude "clay" models, and sketches; how it turns into a playable concept; how new features are tested.
Fenixp: Plainly said, it's just not feasible to know how a final product is going to look 2 years before its release; that's something people need to realize.
Quite the contrary. I was among those who pre-ordered Galactic Civilizations 3 (among other things) early on, when Stardock opened pre-orders on their site, so I got access to the Alpha and Beta versions. Yes, it was around a year, not two, but even then it was clear how the game was going to look, and that is how it looks now.
Of course game development is not something fixed; things change, something new appears, something old disappears, but that's not an excuse to show something unobtainable. This situation resembles a case where you ordered a book, but the first reviews show blurry text and illustrations on worse-quality paper than advertised, with the borders cut and some pages torn out. When you ask them WTF, they start telling you it's the reviewer's bad camera and YouTube compression, so you don't have to worry; but when you finally get the book yourself, they confirm that the original paper was too dark for the pictures, the pages were too thick and heavy for the glue to hold them in place, their printing machine couldn't handle the fidelity of the original fonts and pictures (that's why they're blurry), and they cut the book's borders and tore out some pages because they couldn't fit it into their box.
Fenixp: I'm also not going to claim that CD Project's public relations is good - I sort of ignore it after all the crap they pulled.
While we are not Geth, we have finally reached consensus. ;)