Posted January 10, 2015
I think there was a similar discussion earlier, but The Witcher 3 requirements discussion sparked this again:
Which are the graphical settings you are willing to sacrifice in order to get an ok framerate?
For me the top three probably are:
1. Depth of field, motion blur and similar (to me, useless) effects, if there are any.
2. (Edge) antialiasing (MSAA, CSAA, whatever there is): I don't mind some jaggies that much, as long as the game isn't ultra-low res. If I have to choose between the two, I'd rather run a higher resolution than use AA.
3. Resolution. Yep, I guess e.g. 1280x720 is fine by me, even if I'd prefer full 1920x1080. I don't even consider 4xHD resolutions at this point, especially on computer monitors.
After that it becomes a bit harder to decide. I wouldn't want to use e.g. poorer-quality textures, but I think that only becomes an issue when there isn't enough VRAM, not so much a matter of GPU or CPU speed. I dislike fuzzy, unclear textures.