Check out these benchmarks

Extrapolating... don't believe anyone who tries to tell you that a 3080 will be overkill for 1080p, and certainly not for the current line of ray-traced games, especially if you want a minimum output of 144 fps.

https://www.youtube.com/watch?v=32GE1bfxRVo
Problem is, aside from a powerful graphics card like the RTX 3080, you want a faster processor than the currently available ones if you want to run 1080p at high frame rates.
I'm mostly going to shrug, given that I don't really see the point in this hyper-consumeristic greed for the newest shiny bauble. Wake me when they have it running under 50 watts in a single-card form.
As Themken said, the difficulty getting to 144fps at 1080p is usually all about the processor. Either that, or it's such a new game with such demanding graphics that even a 3080 couldn't manage it, like Control with ray tracing for example. If you want 144fps you can always lower the graphics. I can run Control at ~100fps on a 2070 at 1440p if I use DLSS without ray tracing, so I'm sure a 3080 can get there somehow.

The 3000 series is a massive leap in performance. Some tech channels are like "it only beats a 2080ti by 30%!" as if that's a disappointment, when it's actually replacing a 2080, which was much less powerful. The increase across the same product line for both the 3070 and 3080 is supposed to be roughly 70%, which is a GREAT improvement. The same number for the 2000 series was less than 20% most of the time.
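To see why "only 30% over a 2080 Ti" can still mean ~70% over a 2080, note that relative uplifts multiply rather than add. A minimal sketch, assuming (purely for illustration) that the 3080 is ~30% faster than a 2080 Ti and the 2080 Ti ~30% faster than a 2080:

```python
# Relative performance uplifts compound multiplicatively.
# The specific percentages are illustrative assumptions, not measured data.
uplift_3080_vs_2080ti = 0.30   # assumed: 3080 ~30% faster than a 2080 Ti
uplift_2080ti_vs_2080 = 0.30   # assumed: 2080 Ti ~30% faster than a 2080

# Combined uplift of the 3080 over the 2080 (same product tier):
combined = (1 + uplift_3080_vs_2080ti) * (1 + uplift_2080ti_vs_2080) - 1
print(f"{combined:.0%}")  # prints "69%", i.e. roughly the quoted ~70%
```

The point is simply that a modest-sounding gain over the previous flagship can still be a large gain over the card it actually replaces.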
avatar
StingingVelvet: which is a GREAT improvement.
Actually no it's not, because the 20 series was basically a scam that was almost the exact same thing as the 10 series, with virtually no progress from the 10 series in any real gaming aspects that matter.

So the real reason why NVIDIA is able to spin-doctor the claim of having a "great improvement" with the 30 series is because they did nothing significant to progress the GPU technology for the last 5 years or so, and then they implemented those 5 years worth of improvements suddenly all at once in the 30 series...instead of staggering them out incrementally over time as soon as they were ready, like a good & honest company would have done.

They could have, for example, not released the 20 series as a ripoff scam that was almost identical to the 10 series, and instead done the ethical & pro-consumer thing, which would be to release a truly improved product as the 20 series.

And if they had done that, then the 30 series would not now be spin-doctored as a massive leap in performance...because in that case, it would only be marginally better than the 20 series.

Massive improvement occurring after 5 years of total stagnation...that's to be expected as a normal matter of course. It's not an 'accomplishment' or something to be proud of.
Post edited September 17, 2020 by Ancient-Red-Dragon
I'm gonna wait for 3090 numbers and see what AMD has to offer. I'm concerned over frame rates in Cyberpunk 2077; I wonder if all the marketing material they showed was with DLSS on... and the game won't run as well without it for 4K users. I hope it can get past 70, to at least 80, without DLSS on a 3080.

I just saw someone show off Witcher 3 maxed out at 4K, and it seems the 3080 ONLY gets like 10 to 15 extra frames compared to a 2080ti on that game, very odd.

Maybe some game engines are not optimized for these types of high-end cards? Even Crysis 3 can't get past 60 to 65 fps, even on a 3080.
Post edited September 17, 2020 by DreamedArtist
I'm gonna wait for the RTX 4080 and buy an RTX 3080 from some crazy overclocker for a quarter of the price. Or maybe not, but I'll wait anyway.
avatar
Ancient-Red-Dragon: Massive improvement occurring after 5 years of total stagnation...that's to be expected as a normal matter of course. It's not an 'accomplishment' or something to be proud of.
Nothing you just said changes the fact it's a great improvement today. Past failings are what they are, but getting a 70% boost in a new GPU from the same class is a GREAT improvement. If it was the first new GPU in 10 years I'd say the same thing.
avatar
DreamedArtist: I'm gonna wait for 3090 numbers and see what AMD has to offer. I'm concerned over frame rates in Cyberpunk 2077; I wonder if all the marketing material they showed was with DLSS on... and the game won't run as well without it for 4K users. I hope it can get past 70, to at least 80, without DLSS on a 3080.

I just saw someone show off Witcher 3 maxed out at 4K, and it seems the 3080 ONLY gets like 10 to 15 extra frames compared to a 2080ti on that game, very odd.

Maybe some game engines are not optimized for these types of high-end cards? Even Crysis 3 can't get past 60 to 65 fps, even on a 3080.
The latest reviews did check the 3080's DLSS 2.0 with 1080p and 1440p upscaled to 4K in several RTX-heavy games, and the verdict so far seems to be that 1440p upscaling sometimes manages to provide more and clearer detail (which can be shown with a 10x zoom on several parts), and that 1080p upscaling still falls well enough within the limits of reason to be considered a usable option.

Still, if you're not bothered in any way with building a new system to go with the 3090, not to mention the operating costs, that might be your best bet.
Post edited September 17, 2020 by Radiance1979
If you check the current and expected number of RTX-ready/DLSS-ready titles, it still seems that in most cases you might be far better off with a console purchase than a new GPU, unless you are already in the process of upgrading // are a PC addict // know for sure you're planning to play at least half of the titles released so far.

https://www.rockpapershotgun.com/2020/09/01/confirmed-ray-tracing-and-dlss-games-so-far/

Personally I'm happy with the 2060 Super, which provides more than enough quality at 1080p 60 Hz, and I guess it would even work if I had the urge to show my war titles, 4X titles and isometric RPGs in stunning 4K, though I would probably get hurt in older open-world titles such as Mass Effect: Andromeda.

It would be nice to see what Intel plans on doing when they enter the fray, not to mention the upcoming Big Navi, of which I guess you can expect the highest tier to perform at least as well as the new consoles.

Anyway, with expectations being that 2022 will be the year RTX really takes over, waiting for the next gen / other model releases / competition still seems a viable option.
Going to wait for the inevitable Ti lines coming out sometime next year.
Nice enough uplift, I suppose: 29% on average at 4K over the 2080 Ti and 20% at 1440p. So three options now remain: buy a 3090, or wait for the 3080 20GB or Big Navi and see how they turn out, since I'm not building a new system before Zen 3 has been out a month or two anyway.
avatar
Swedrami: Going to wait for the inevitable Ti lines coming out sometime next year.
I'm also curious whether Nvidia or the AIBs will do things like add more VRAM too.

As great as the 3070 and 3080 sound performance-wise, yeah, I'd figure we'd see, say, a 12 GB VRAM version of the 3070 and 16 GB on the 3080.

They (Nvidia and their AIBs) might be waiting for AMD to strike with more VRAM, so they can...strike back with similar or greater VRAM amounts.

I held off on the GTX 960 back in the day b/c of the 2GB VRAM version. It didn't feel like much of an increase VRAM-wise over the 560 Ti at 1GB. Though the 4GB VRAM 960 that EVGA came out with was much better and made more sense, IMHO.

I literally, at that time, had games like Batman Arkham Knight and Watch Dogs easily eating around 4GB of VRAM at 1080p on higher settings!
Post edited September 17, 2020 by MysterD
avatar
StingingVelvet: which is a GREAT improvement.
avatar
Ancient-Red-Dragon: Actually no it's not, because the 20 series was basically a scam that was almost the exact same thing as the 10 series, with virtually no progress from the 10 series in any real gaming aspects that matter.

So the real reason why NVIDIA is able to spin-doctor the claim of having a "great improvement" with the 30 series is because they did nothing significant to progress the GPU technology for the last 5 years or so, and then they implemented those 5 years worth of improvements suddenly all at once in the 30 series...instead of staggering them out incrementally over time as soon as they were ready, like a good & honest company would have done.

They could have, for example, not released the 20 series as a ripoff scam that was almost identical to the 10 series, and instead done the ethical & pro-consumer thing, which would be to release a truly improved product as the 20 series.

And if they had done that, then the 30 series would not currently be being spin-doctored as a massive leap in performance...because in that case, it would only be marginally better than the 20 series.

Massive improvement occurring after 5 years of total stagnation...that's to be expected as a normal matter of course. It's not an 'accomplishment' or something to be proud of.
The 20 series wasn't a great improvement performance-wise; it was a prototype with new technology that hadn't even been optimized yet. I agree they shouldn't have released it, but not because it wasn't an improvement; rather because it was bleeding-edge technology, advanced but not complete. Essentially it was the Windows Vista of GPU series, and hopefully the 30 series lives up to expectations. I think it was shifty for them to charge so much for what was essentially an experimental proof of concept, though. Those who bought 20 series cards must have been desperate for something new.
Post edited September 17, 2020 by paladin181
avatar
Radiance1979: Still, if you're not bothered in any way with building a new system to go with the 3090, not to mention the operating costs, that might be your best bet.
I do have a 1000-watt PSU and a 3950X, so I won't have to deal with upgrading; replacing the card is the only factor at this point. And tbh I might just have to spring for a 3090, same price as a 2080ti here in Canada.

I know I'll get crap for my 2080ti, so I'm gonna donate it to my sister so she can experience maxed-out performance at 1080p :) She still uses a 670, and I know that has a hard time making modern AAA games look good. At least now she can play Cyberpunk 2077 at 1080p maxed out! I'm sure she will be very happy.
Post edited September 17, 2020 by DreamedArtist