My question is quite simple:
CD Projekt, your customer service and your patching are great.

According to some German magazines, your upcoming games, The Witcher 3 and Cyberpunk, will both use the RedEngine 3, which supports DirectX 11 for better performance and graphics, while The Witcher 2 (PC) still runs on the original RedEngine.

Will you give The Witcher 2: Enhanced Edition an update once your work on the RedEngine 3 is finished?
Of course, The Witcher 2 is already a very good-looking game.
But with the help of DX11 we could get much more performance on mid-range systems, and, thanks to dynamically increased LOD via tessellation and POM, much more quality on high-end systems!

To put it in a nutshell:
Will there be a DX11 patch to improve The Witcher 2 any further?
By the way, when you published your "four-year plan" (a great socialist idea, btw ;) ), I missed the Witcher 1 RedEngine remake!

Did you completely forget the plan to bring out a Witcher remake?!
I think it takes quite a while to port a whole game to a new engine. That's why Duke Nukem Forever took so long: they kept throwing out engines! Probably not worth the development time to CD Projekt.
I seem to remember some tessellated surfaces in The Witcher 2, adding detail to those spikes on the doors in Loc Muinne, not to mention the rocks around... Such things.
You sure it isn't already using DX11?


Edit: Ah, you mean explicitly using it for large-scale LODing rather than the tiny details that are already in there.
Never mind me then.
Post edited May 15, 2013 by Rubbercookie
Rubbercookie: I seem to remember some tessellated surfaces in The Witcher 2, adding detail to those spikes on the doors in Loc Muinne, not to mention the rocks around... Such things.
You sure it isn't already using DX11?

Edit: Ah, you mean explicitly using it for large-scale LODing rather than the tiny details that are already in there.
Never mind me then.
What you mean is not tessellation but POM (parallax occlusion mapping), which can be done in DX9, but not in combination with anisotropic filtering.
That's why some textures look very blurry from certain angles.
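
For anyone curious what that POM ray-march actually looks like, here is a minimal CPU-side sketch of the general technique in C++ (this is not RedEngine's shader; heightAt() is a made-up stand-in for a height-map texture fetch). The comment in the loop marks the sample that, in a DX9/SM3 shader, has to use an explicit-LOD fetch - which is exactly what skips anisotropic filtering and causes the blur:

#include <cstdio>

// Hypothetical height-map lookup; in a real shader this is a texture sample.
static float heightAt(float u, float v) {
    return 0.5f + 0.25f * (u - v); // toy procedural height field
}

// March from the surface point along the view ray in tangent space until the
// ray dips below the height field, then return the displaced coordinates.
void parallaxOcclusion(float u, float v, float viewX, float viewY,
                       float scale, float* outU, float* outV) {
    const int   steps    = 16;            // fixed step count, SM3-style
    const float stepSize = 1.0f / steps;
    float rayH = 1.0f;                    // ray enters at the top of the layer
    float du = viewX * scale * stepSize;  // per-step UV offset
    float dv = viewY * scale * stepSize;
    for (int i = 0; i < steps; ++i) {
        // In an SM3 shader this sample sits inside dynamic flow control, so it
        // must be a tex2Dlod-style fetch with an explicit mip level, and such
        // fetches sample a single mip and do not get anisotropic filtering.
        if (heightAt(u, v) >= rayH) break;
        u += du; v += dv; rayH -= stepSize;
    }
    *outU = u; *outV = v;
}

int main() {
    float u, v;
    parallaxOcclusion(0.5f, 0.5f, 0.3f, -0.2f, 0.05f, &u, &v);
    std::printf("displaced UV: %f %f\n", u, v);
}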
The Witcher 2 itself is using DX11 - you can specify which one it runs on.
Point is, The Witcher 2 won't need a facelift for another two years or so, and by then The Witcher 3 and the other game will most likely be running on DX12+, if DirectX is even still around... I would have hoped for OpenGL, since Mac and *nix users would then be able to play without graphical issues. This is going to bring new waves of GPUs and new ideas for great effects. Until then, we might only think about The Witcher 1 being remade - yet the game looks good anyway and doesn't need improved graphics. CD Projekt RED could release the original full-size textures, though; that would help the modding community, and you'd see fan-made mods with much better graphics, without a money-spending, deadline-driven project.
CyklonDX: The Witcher 2 itself is using DX11 - you can specify which one it runs on.
Point is, The Witcher 2 won't need a facelift for another two years or so, and by then The Witcher 3 and the other game will most likely be running on DX12+, if DirectX is even still around... I would have hoped for OpenGL, since Mac and *nix users would then be able to play without graphical issues. This is going to bring new waves of GPUs and new ideas for great effects. Until then, we might only think about The Witcher 1 being remade - yet the game looks good anyway and doesn't need improved graphics. CD Projekt RED could release the original full-size textures, though; that would help the modding community, and you'd see fan-made mods with much better graphics, without a money-spending, deadline-driven project.
First point:
The Witcher 2 / the RedEngine runs on DX9. Gromuhl already explained it a bit, but if you don't believe me, here is a bit of proof:
http://en.thewitcher.com/forum/index.php?/topic/22288-nvidia-fxaa-for-witcher-2-or-any-dx9-game/
Oh, and all the big hardware sites say the same:
http://ht4u.net/reviews/2013/nvidia_geforce_gtx_titan_asus_gigabyte_test/index27.php

Second point: you didn't read my post very carefully, did you?
I was never talking about graphics:
I was talking about performance on mid-range systems.
With the help of DX11 you can make an engine much more efficient!
That's why, for example, Giana Sisters: Twisted Dreams runs at around 40 FPS in DX9 and at 60 FPS in DX11!
(There is no graphical difference between those APIs.)
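
To illustrate the kind of API-overhead difference being claimed here - purely a toy model in C++, not real Direct3D code; apiCall() is a hypothetical stand-in for a driver entry point with a fixed per-call validation cost - compare many tiny DX9-style per-register updates with one batched, DX11-constant-buffer-style update:

#include <chrono>
#include <cstdio>
#include <vector>

struct Constants { float mvp[16]; }; // per-object transform data

static volatile int sink; // keeps the optimizer from deleting the work

// Simulated driver entry point: fixed overhead plus copying the payload.
void apiCall(const float* data, size_t floats) {
    for (int i = 0; i < 50; ++i) sink += i;             // fixed per-call cost
    for (size_t i = 0; i < floats; ++i) sink += (int)data[i];
}

int main() {
    const int objects = 10000;
    std::vector<Constants> cb(objects); // zero-initialized

    // "DX9 style": four small calls per object (one per constant register).
    auto t0 = std::chrono::steady_clock::now();
    for (const auto& c : cb)
        for (int reg = 0; reg < 4; ++reg)
            apiCall(c.mvp + reg * 4, 4);
    auto t1 = std::chrono::steady_clock::now();

    // "DX11 style": one batched update per object with the same payload.
    for (const auto& c : cb)
        apiCall(c.mvp, 16);
    auto t2 = std::chrono::steady_clock::now();

    using us = std::chrono::microseconds;
    std::printf("per-register: %lld us, batched: %lld us\n",
        (long long)std::chrono::duration_cast<us>(t1 - t0).count(),
        (long long)std::chrono::duration_cast<us>(t2 - t1).count());
}

The same data moves either way; the batched path just pays the fixed call overhead a quarter as often, which is roughly where differences like the Giana Sisters numbers come from.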

Third point:
Yes, I was actually talking about quality: about IMAGE quality. That's not the same thing as graphics!
The Witcher 2 is one of the best-looking games you can get, both in terms of style and in terms of tech.
But that doesn't change the fact that the blurred POM doesn't have to be there:
if they switch to DX11, AF will also apply to the POM textures!

Last point:
The Witcher 1 wasn't ugly at all; it has the same good style as The Witcher 2.
But it suffers from the technical problems of its engine. The Witcher 1 used an updated version of the Aurora Engine, just like Neverwinter Nights 2 did,
and both simply share the same problem:
a poor performance-to-graphics ratio!

BTW:
OpenGL would be nice to have anyway, just to push things forward again!
Post edited May 27, 2013 by RadonGOG
You are wrong if you think DX11 would improve performance - it would kill mid-range PCs.
Putting it onto OpenGL would benefit the future ~ that's where we agree.

OK, more respect to the RED team; at last some normal guys who didn't buy into Microsoft's DX11 marketing.

###
Going back to DX11 performance: you would feel the most improvement on the highest settings with a high-end PC. On mid-range it would be unplayable. Most of the DX11 optimizations are in the call-outs, where you can replace 4-6 lines of code with one argument, but in itself it isn't any better than DX9. I would say DX9 is much lighter and more load-friendly compared to DX11 -- also, everything in DX11 can be done under DX9.
###

RED, go for OpenGL ;) screw Microsoft and its monopoly.

Worth mentioning: on my 7970 DirectCU II I get around 40 FPS on the highest settings in The Witcher 2. I tested The Witcher 2 on TriFire before; 6990 + 6970 wasn't any better, only around 67 FPS. The improvements you are looking for are going to come with better GPUs. (BTW, by forcing vertical sync and running in windowed mode, The Witcher 1 gets over 144 FPS <the max on my monitor>; no performance issues there.)

It would be fun if RED supplied us with all the files required to compile the game under future engines - thus fixing any bugs.

Again, sorry the posts are so long :P and it's too late in my country to actually bother.
Post edited May 27, 2013 by CyklonDX
I'm sure this isn't what anybody wants to hear, but I personally would much rather someone spent the time and effort fixing the existing graphics glitches in The Witcher 2.

I am myself (and have been since launch -- I don't post much, but I did post about it back then; I found a post of my own dating to June 2011 on these very forums) a dual-card Radeon HD user; ATI, or AMD, or whichever name they prefer these days, calls this particular flavour of multi-GPU rendering CrossFireX.

My dual Radeon HD 6790s have been powering the gorgeous graphics of The Witcher 2 just fabulously ever since launch, and I've kept up with AMD (or ATI) drivers plus the so-called "CAP" releases (Catalyst Application Profiles) ever since. And now, almost two years later, it's still the case that you cannot run The Witcher 2 with any kind of decent graphics setting and still see a targeting reticle (or crosshair, or whatever your moniker for this rather vital in-game visual indicator might be). Witcher skill-tree skills like "Riposte" are rather useless when you can't see who you're targeting, but even the basic game suffers right from the outset. On the flip side, you get marvellously surprising moments when Geralt does some kind of long-distance sword spin toward an enemy you never wanted to target (the Vergen wall defence, with two ladders up the wall, especially helped drive this point home).

In short: I'm sure nobody would object to some effort in making an already beautiful game even more visually arresting. But given a list of priorities, I'd feel weird about not nudging the issues that have persisted since launch up the list.
CyklonDX: You are wrong if you think DX11 would improve performance - it would kill mid-range PCs.
No, it would - all things being equal (i.e. the same level of detail) - increase performance. DX11 is a more efficient renderer, period.

But it wouldn't be by much with a game like The Witcher 2, which has a small number of rendered actors/objects. DX9 is fine at this kind of thing; that's why we get so many corridor shooters.

-

Why The Witcher 2's performance is so abysmal, I don't know; it doesn't look good enough for what we get.

On second thought, maybe DX11 would be more of a benefit than I assume, if some horrible DX9 kludge of a complex lighting model is what kills the FPS?
Post edited May 30, 2013 by rahal
If The Witcher 2 used tessellation, you would feel the pain in graphics performance - it would give a big advantage to NVIDIA GPUs. If you were to run some Radeon 4000-5000 series cards, it would kill them; even most 6900-series cards would die, and it's also worth mentioning that DX11 has SM5.0 and more compute, where low- to mid-end GPUs would die again.

If we take a game that doesn't use those DX11 features, just the same ones DX9 uses, we would get a boost in DX11, but it's hardly significant. (That's what I was referring to in my previous post.)

High resolution with ubersampling puts a lot of stress on The Witcher overall (Radeon 6000s can't really handle ubersampling in a CrossFire config); by disabling it you'll gain a lot of performance.

Also worth mentioning: in CrossFire/SLI you only use the memory of one GPU - it's not like two 6970s give you 4 GB or so... same thing with the GTX 680 and other cards. It doesn't work that way yet; the cards actually have to hold the same data -> stay in sync with each other.

Remember, The Witcher 2 has much better geometric and texture detail than most games in such environments, even compared to new titles like Crysis 3.
CyklonDX: If The Witcher 2 used tessellation, you would feel the pain in graphics performance - it would give a big advantage to NVIDIA GPUs. If you were to run some Radeon 4000-5000 series cards, it would kill them; even most 6900-series cards would die, and it's also worth mentioning that DX11 has SM5.0 and more compute, where low- to mid-end GPUs would die again.

If we take a game that doesn't use those DX11 features, just the same ones DX9 uses, we would get a boost in DX11, but it's hardly significant. (That's what I was referring to in my previous post.)

High resolution with ubersampling puts a lot of stress on The Witcher overall (Radeon 6000s can't really handle ubersampling in a CrossFire config); by disabling it you'll gain a lot of performance.

Also worth mentioning: in CrossFire/SLI you only use the memory of one GPU - it's not like two 6970s give you 4 GB or so... same thing with the GTX 680 and other cards. It doesn't work that way yet; the cards actually have to hold the same data -> stay in sync with each other.

Remember, The Witcher 2 has much better geometric and texture detail than most games in such environments, even compared to new titles like Crysis 3.
Well, the tessellation problem only applies to high-factor tessellation, and only on Radeons:
why wouldn't they just use low-factor tessellation for objects and blur-free POM for textures?!
That would give us better performance, less visible LOD pop-in and blur-free textures at the same time!
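
As a rough sketch of that low-factor idea (plain C++ standing in for a hull shader's patch-constant function; all parameter values here are made up for illustration): cap the maximum tessellation factor and fade it with camera distance, so distant geometry stays cheap even on the weaker Radeons of that era:

#include <algorithm>
#include <cstdio>

// Map camera distance to a tessellation factor in [1, maxFactor].
float tessFactor(float distance, float nearDist, float farDist, float maxFactor) {
    float t = (distance - nearDist) / (farDist - nearDist); // 0 near .. 1 far
    t = std::clamp(t, 0.0f, 1.0f);
    return std::max(1.0f, maxFactor * (1.0f - t)); // detail fades with distance
}

int main() {
    // Capping maxFactor at 8 instead of 64 keeps distant patches cheap.
    for (float d : {1.0f, 10.0f, 30.0f, 60.0f})
        std::printf("distance %5.1f -> factor %4.1f\n",
                    d, tessFactor(d, 2.0f, 50.0f, 8.0f));
}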
1. Today both vendors' cards (NVIDIA and Radeon) perform quite nicely with tessellation, so it's no longer an issue for Radeons.
2. I still don't understand; the textures seem fine to me... there isn't much of the blurriness or the other issues you are referring to.

If you have issues with LOD, you can set it to far and switch to 16x AF; this will help your visuals... but then you'll have performance issues, because your PC isn't strong enough and your GPU is far too weak and doesn't have enough memory for it. Again I will state it: SLI & CrossFire do not add up your GPU memory. You still use your master card's memory, and in some strange cases it falls down to the lowest, weakest card you have (including clocks, but that was a problem on older machines).

The problem is that games didn't require much processing power in the past (because all of them were made for consoles from 2005-07), and GPUs didn't need big jumps in their tick-tock improvements... Right now we should be seeing some better GPUs: the Titan (*GTX 780), and the 8970 might be something big, unless AMD decides to hold back and let people buy NVIDIA's 'reheated potatoes'. Still, one 7970 on a good 1080p 144 Hz monitor doesn't need AA or ubersampling - the effects are almost invisible to the naked eye. But if the settings offer you AA, ubersampling and so on, and your PC cannot take all of it, well, that's your problem... Don't blame DX9 and claim DX11 would solve your problems, because it won't! It will create more of them... and the improvements won't be noticed. If you enabled the DX11 feature set, those effects wouldn't run fluidly - not even two 7990s could push the game at max details.
CyklonDX: 1. Today both vendors' cards (NVIDIA and Radeon) perform quite nicely with tessellation, so it's no longer an issue for Radeons.
2. I still don't understand; the textures seem fine to me... there isn't much of the blurriness or the other issues you are referring to.

If you have issues with LOD, you can set it to far and switch to 16x AF; this will help your visuals... but then you'll have performance issues, because your PC isn't strong enough and your GPU is far too weak and doesn't have enough memory for it. Again I will state it: SLI & CrossFire do not add up your GPU memory. You still use your master card's memory, and in some strange cases it falls down to the lowest, weakest card you have (including clocks, but that was a problem on older machines).

The problem is that games didn't require much processing power in the past (because all of them were made for consoles from 2005-07), and GPUs didn't need big jumps in their tick-tock improvements... Right now we should be seeing some better GPUs: the Titan (*GTX 780), and the 8970 might be something big, unless AMD decides to hold back and let people buy NVIDIA's 'reheated potatoes'. Still, one 7970 on a good 1080p 144 Hz monitor doesn't need AA or ubersampling - the effects are almost invisible to the naked eye. But if the settings offer you AA, ubersampling and so on, and your PC cannot take all of it, well, that's your problem... Don't blame DX9 and claim DX11 would solve your problems, because it won't! It will create more of them... and the improvements won't be noticed. If you enabled the DX11 feature set, those effects wouldn't run fluidly - not even two 7990s could push the game at max details.
There were similar topics earlier:
http://static.gog.com/upload/forum/2013/02/129b2af1ab17e2a99b79b67285f278cd0a18c3f3.jpg

Look at the ground: 16x AF in high-quality mode, with everything except ubersampling (aka 2x2 OGSSAA) turned on.
That's the famous "DX9 POM blurriness issue"...
CyklonDX: Don't blame DX9 and claim DX11 would solve your problems, because it won't!
Oh, but it would, if the problem is rendering shadows and a complex lighting model. DX9 is horrible at that stuff.

DX11 isn't just tessellation and multithreading - which, I agree, aren't a big deal for a game like TW2.
Post edited June 05, 2013 by rahal