Just to point out:
Intel's best integrated graphics is much faster than the HD 620. Even Intel thinks their old integrated graphics were too slow. I am not terribly fond of UserBenchmark, but still, the new Intel iGPU is more than twice as fast.

When this came out, it beat AMD's fastest integrated:
https://gpu.userbenchmark.com/Compare/Intel-UHD-Graphics-620-Mobile-Kaby-Lake-R-vs-Intel-Iris-Plus-G7/m320744vsm888370


Also, do you have enough RAM, and is it running in dual channel?
Is your processor a total potato model? Those that run at just 1GHz are awful for gaming on no matter what graphics you have. No need to answer me specifically.


Now, as for whether games should target this or that graphics solution: I think some developers are silly when they use a medium-heavy 3D engine for what is basically a 2D game, but The Witcher games are true 3D games and the kind that are expected to look nice.
avatar
Themken: Those that run at just 1GHz are awful for gaming on no matter what graphics you have.
Not if you choose your games carefully.

I've played Ikenfell on a laptop with chromebook-tier specs and it runs just fine.

(Whereas Ikenfell doesn't run on the HD 4000 series because they don't support some feature the game needs.)
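A rough sketch of how that kind of hard requirement typically shows up (assuming an SDL2/OpenGL setup purely for illustration; Ikenfell's actual engine and the exact feature it needs aren't stated here): the game asks the driver for a minimum capability at startup, and on a GPU that can't provide it, it refuses to run at all rather than running badly.

#include <SDL.h>
#include <cstdio>

int main(int, char**) {
    if (SDL_Init(SDL_INIT_VIDEO) != 0) {
        std::printf("SDL_Init failed: %s\n", SDL_GetError());
        return 1;
    }

    // Ask for a specific OpenGL core profile version before creating the window.
    // Whatever version/feature the game actually needs would go here.
    SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, 4);
    SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 3);
    SDL_GL_SetAttribute(SDL_GL_CONTEXT_PROFILE_MASK, SDL_GL_CONTEXT_PROFILE_CORE);

    SDL_Window* win = SDL_CreateWindow("feature check",
                                       SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
                                       640, 480, SDL_WINDOW_OPENGL | SDL_WINDOW_HIDDEN);
    SDL_GLContext ctx = win ? SDL_GL_CreateContext(win) : nullptr;

    if (!ctx) {
        // On a GPU/driver that can't provide the requested version, creation
        // fails here and the game bails out with a message instead of running.
        std::printf("Required GPU capability not available: %s\n", SDL_GetError());
    } else {
        std::printf("GPU supports the requested feature level.\n");
        SDL_GL_DeleteContext(ctx);
    }

    if (win) SDL_DestroyWindow(win);
    SDL_Quit();
    return 0;
}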
avatar
tomimt: And that's really the problem. Far Cry 2 was released in 2008, so what you are asking is that the games industry turn back time about 10 years or so. That won't happen, not in the AAA industry at least. As long as the performance of onboard GPUs keeps lagging several generations behind what dedicated cards can do, there's just no business there for AAA games.
avatar
thegreyshadow: Good answer, but this part misses my point. FC2 was an example but it's not really a good comparison.
I'm not asking the industry to turn back 10 years.
I was able to run FC2 smoothly at 1080p and with the highest settings.
I'm not asking that.
Make the highest setting as demanding as you want; all I ask is for some "lowest setting" suitable for onboard GPUs.

And I do think there's a business there. Many people have good systems that cannot be expanded with a discrete GPU and would be delighted to play some games. And there's a crunch due to this pandemic.

You have good points. I appreciate your thoughtful answer.
You have to remember now - the NEW consoles (Xbox Series S/X and PS5) are setting the new minimum baseline for PC, basically. So, we can expect that mid-range AMD hardware to become the minimum across the board for new consoles & PC, as more games are going to shoot for x86 hardware of that sort, style and ilk.

So, you're likely to see games pushing say GTX 1660, RTX 2000's or RTX 3000 series cards...and of course your AMD equiv's. Everybody else...is eventually going to get left behind, at some point.

While you do think there's good business in that sector, the problem is that most "core" AAA games that will sell are going to fall into the mid-range to high-end department.

Most gamers in the Steam survey have GTX 1060s, so those are also what we've been seeing pop up in the recommended specs for newer titles. There will probably be games that go even beyond that, making the 1060 merely a minimum.

Steam Survey over here - https://store.steampowered.com/hwsurvey/videocard/?sort=pct

I can't wait to see the requirements for Metro: Exodus' upcoming RTX-only version, which is likely going to force an RTX 2000-series card as the minimum.

EDIT:
With consoles now supporting RT in games like Spider-Man on PS5, I do wonder when we'll see more games forcing RT-based cards as a minimum.

And really, AMD needs to get that DLSS equivalent done and out there, as they're going to need something like that to compete w/ Nvidia; they're already behind what Nvidia's doing in that regard.

Even more so if you can run games w/ both RTX and DLSS turned on to get better performance. AMD needs to get w/ the program.

Especially since AMD is the one mainly powering both the CPUs & GPUs in the new consoles, too. They really need that DLSS equivalent to reach consoles.
Post edited March 08, 2021 by MysterD
avatar
dtgreene:
Is that a turn based game?

Games really vary in what resources and how much of it they need.
avatar
dtgreene:
avatar
Themken: Is that a turn based game?

Games really vary in what resources and how much of it they need.
You fail to include the part of the quote where I mention a specific game.

It often makes sense to cut down the post you're replying to, but please make sure to keep the relevant part intact.
avatar
Themken: Is that a turn based game?

Games really vary in what resources and how much of it they need.
For example, Civ IV is chunky azz in terms of performance at times due to it being on Gamebryo, and Starbound is never going to run right because it renders entirely on the CPU via SDL and is about as well optimized as an indie project gets when simulating a universe on a Netbook.
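A minimal sketch of what "renders on the CPU via SDL" means in practice (illustrative only, nothing to do with Starbound's actual code): SDL will happily hand you a pure software renderer, and the exact same drawing calls then get rasterized on the CPU instead of the GPU.

#include <SDL.h>
#include <cstdio>

int main(int, char**) {
    if (SDL_Init(SDL_INIT_VIDEO) != 0) return 1;

    SDL_Window* win = SDL_CreateWindow("renderer demo",
                                       SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
                                       640, 480, 0);

    // SDL_RENDERER_SOFTWARE = everything is rasterized on the CPU.
    // Swap it for SDL_RENDERER_ACCELERATED and the same calls below go
    // through the GPU driver instead.
    SDL_Renderer* ren = SDL_CreateRenderer(win, -1, SDL_RENDERER_SOFTWARE);
    if (!ren) return 1;

    SDL_RendererInfo info;
    SDL_GetRendererInfo(ren, &info);
    std::printf("Backend in use: %s\n", info.name);  // e.g. "software"

    // A trivial frame: clear to a colour and present it.
    SDL_SetRenderDrawColor(ren, 30, 30, 60, 255);
    SDL_RenderClear(ren);
    SDL_RenderPresent(ren);
    SDL_Delay(1000);

    SDL_DestroyRenderer(ren);
    SDL_DestroyWindow(win);
    SDL_Quit();
    return 0;
}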
avatar
Shadowstalker16: In terms of actually optimizing for recent hardware, I think all devs should do it. But I'd imagine that it would depend on the skill of the people coding it and their publisher's will when it comes to what and where they invest their time. If the design philosophy is just to vomit out open, unoptimized worlds, I'd imagine quantity would take priority over (optimization) quality.
avatar
.Keys: It's interesting. I've seen people here say that OP's example (Far Cry 2) is an old game (2008) and should run well on any onboard GPU nowadays. (Not saying you did that.)

But Metal Gear Solid V: The Phantom Pain is a 2015 game and runs on integrated graphics at the lowest resolution at 30 to 50 fps, depending on your CPU.

The Fox Engine is really well optimized, and if they did it, why can't others? (Rhetorical question.)

And what you said is completely true. (Yes, I'm looking at you, Ubisoft.)
The slow development of iGPUs to the level of being competent enough to play older demanding games is something that happens naturally; at least, that's how it seems to me. It's not out of conscious optimization efforts from any developer and much more the consequence of the latest technology and innovations eventually trickling down the price and product lines.

Yes, the Fox Engine seems to be a consciously made effort to optimize for PC. Unfortunately, the engine is completely in the possession of Konami, so we might as well forget about anyone else getting use out of it. Another well optimized (as per the developer) recent game that popped up on my radar is Space Mercs: https://store.steampowered.com/app/1088600/Space_Mercs/ which, while lacking mouse support, seems to look stunning and should run well looking at the system requirements. I haven't played the game but the developer probably has some great ideas.
avatar
Niggles: I'll chime in with everyone else. Integrated GPUs are built and meant for business. They are cheap and nasty and not meant for anything more than very light gaming if you need to game using them.
That needs to change though. With GPU uses expanding to include more non-gaming work, and games seemingly no longer the primary driver of GPU innovation, I think it's time we get efficient gaming-focused parts, especially in the budget segments, where the big three are competing to withhold what little they developed years ago from slipping into lower price brackets. I may be speaking too much from experience in my own market here, but the budget segment sucks, has sucked for years, and is getting worse.
avatar
Shadowstalker16: Games can only be optimized to a certain extent. There will always be some hardware that will not be powerful enough to run a game, especially if the game is a modern 3D game with the latest visual effects. Models and textures, for example, can only scale down so much, and I doubt the devs of any game can make it so that their game will run on something with outdated specifications without compromising the game elsewhere.

What do you think about CDPR releasing unoptimized versions of CP2077? I think if support for weaker hardware goes in that direction, i.e. making another, inferior version of the game entirely from scratch just to run on it, it would be a huge step backwards. It would be like the early console days, where the PS1 or N64, or PS2 and Xbox versions were different and the player had to choose his poison in terms of optimization and which was the definitive version of the game.

In terms of actually optimizing for recent hardware, I think all devs should do it. But I'd imagine that it would depend on the skill of the people coding it and their publisher's will when it comes to what and where they invest their time. If the design philosophy is just to vomit out open, unoptimized worlds, I'd imagine quantity would take priority over (optimization) quality.
avatar
thegreyshadow: I agree with this.
I'm not thinking about "let's optimize our next game so it can run on both potatoes and high-end rigs equally well".
I'm thinking more along the lines of having onboard GPUs as playable targets when designing their next game engine. Asking otherwise would not be practical, I think.
I agree. I think more devs should ask themselves the question of whether their game is well optimized. My brother plays M&B Warband on his Pentium G2020 and that's a great thing to be able to do considering the scale and depth of the game.

Hopefully, as I said in my earlier post, hardware companies will see the need for gaming-capable lower-end GPUs, both integrated and discrete, and game developers will accordingly realize this and be encouraged to optimize for a wider spectrum of hardware.
Post edited March 09, 2021 by Shadowstalker16
Most of these games likely will not be optimized, though. They are pumping out open worlds, DLCs/expansions, and whatnot like it's water. Release now, and if it's good enough - since people are buying good cards anyways - let the brute force of the card do the work... unless they need to go in and fix the game's performance later, if people are whining about poor performance once the game's released.

Nvidia really can't keep the 3000 series cards on the shelves anyways - so, let the card do the grunt work. [shrug] That seems to be the philosophy here.

Namely, in the AAA space, they are not doing the old-school design in many cases, where levels were broken into modular pieces/chunks - even if you could freely travel b/t smaller hubs, areas, etc. Even if the world was "open", games like Deus Ex still had to break it all up so performance held up w/ all of its fidelity and whatnot.

Much of the design now is one big, huge, seamless, performance-killing open world w/ AI, NPC's, character models, and whatnot everywhere.

You'd think w/ the speed of SSD's and whatnot, they'd go back to the old-school design where areas of an "open world" were broken up into chunks/segments, just to get some more performance. Even if they tossed in load screens in the middle - which, with SSD's, probably won't be in the way for long.

RT isn't going to help here, as it's basically a framerate killer.

And well... now we also have new consoles like the PS5 supporting RT, which are likely the new minimum/baseline for PC gaming's recommended or required specs.
Post edited March 09, 2021 by MysterD
avatar
MysterD: You'd think w/ the speed of SSD's and whatnot, they'd go back to the old-school design where areas of an "open world" were broken up into chunks/segments, just to get some more performance. Even if they tossed in load screens in the middle - which, with SSD's, probably won't be in the way for long.
Games *really* shouldn't require SSDs to run; this is especially true if the game is, for some reason, huge. The game should be optimized to only load what's needed in that case. (Or better, make the game smaller so that it doesn't have to load as much.)
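A rough sketch of what "only load what's needed" can look like (illustrative names, not any particular engine): the world is cut into fixed-size chunks and only the ones within a small radius of the player are kept resident, whether they come off an SSD or an HDD.

#include <cmath>
#include <cstdio>
#include <cstdlib>
#include <map>
#include <utility>

struct Chunk { /* geometry, textures, NPCs for one world cell */ };

constexpr float kChunkSize  = 64.0f;  // world units per chunk
constexpr int   kLoadRadius = 2;      // chunks kept loaded around the player

std::map<std::pair<int, int>, Chunk> loaded;

Chunk loadChunkFromDisk(int cx, int cy) {
    std::printf("streaming in chunk (%d, %d)\n", cx, cy);
    return Chunk{};
}

void updateStreaming(float playerX, float playerY) {
    int pcx = static_cast<int>(std::floor(playerX / kChunkSize));
    int pcy = static_cast<int>(std::floor(playerY / kChunkSize));

    // Load anything near the player that isn't resident yet.
    for (int dy = -kLoadRadius; dy <= kLoadRadius; ++dy)
        for (int dx = -kLoadRadius; dx <= kLoadRadius; ++dx) {
            auto key = std::make_pair(pcx + dx, pcy + dy);
            if (!loaded.count(key))
                loaded[key] = loadChunkFromDisk(key.first, key.second);
        }

    // Evict anything that fell outside the radius, freeing RAM/VRAM.
    for (auto it = loaded.begin(); it != loaded.end();) {
        if (std::abs(it->first.first - pcx) > kLoadRadius ||
            std::abs(it->first.second - pcy) > kLoadRadius)
            it = loaded.erase(it);
        else
            ++it;
    }
}

int main() {
    updateStreaming(10.0f, 10.0f);   // initial load around the player
    updateStreaming(300.0f, 10.0f);  // player moved: new chunks in, old ones out
}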
avatar
dtgreene: Except that the progression is not linear.

You can have a situation where game A can be played on system X but not system Y, while B can be played on Y but not X.

With GPUs, for example, older GPUs might not support standards supported by more recent ones, even if the older one was high end back in its day and the more recent one is the iGPU of a Celeron or Atom.

Also, the naming conventions can be confusing. For example, 4000 > 620, but in terms of GPU performance, Intel HD 4000 < Intel UHD 620 (I *think*, someone correct me if I'm wrong).
Sure, and that's why many people prefer consoles - it's easier and less of a hassle. But on PC you can still use programs like CanIRunIt that will detect if your PC is up to spec for a specific game. But it is confusing if you're not running state-of-the-art hardware. It just comes with the territory of being a PC gamer.
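As a toy illustration (made-up numbers, not how CanIRunIt or any real checker works) of why these checks can't just compare one number: a requirements check has to ask two separate questions - does the GPU support the features the game uses, and is it fast enough - and, as dtgreene points out above, an old high-end card and a recent iGPU can each pass one test and fail the other.

#include <cstdio>
#include <string>

struct Gpu {
    std::string name;
    int featureLevel;   // e.g. highest graphics feature level supported
    int perfScore;      // some relative performance index
};

struct Requirements {
    int minFeatureLevel;
    int minPerfScore;
};

bool meetsSpecs(const Gpu& gpu, const Requirements& req) {
    bool features = gpu.featureLevel >= req.minFeatureLevel;
    bool speed    = gpu.perfScore    >= req.minPerfScore;
    if (!features) std::printf("%s: missing required features\n", gpu.name.c_str());
    if (!speed)    std::printf("%s: too slow\n", gpu.name.c_str());
    return features && speed;
}

int main() {
    // Illustrative numbers only: the old high-end card wins on speed but
    // loses on features, while the recent iGPU is the other way around.
    Gpu oldHighEnd {"old high-end dGPU", 10, 900};
    Gpu newLowEnd  {"recent Celeron iGPU", 12, 400};
    Requirements game {12, 500};   // needs modern features AND some speed

    std::printf("old card ok? %d\n", meetsSpecs(oldHighEnd, game));
    std::printf("new iGPU ok? %d\n", meetsSpecs(newLowEnd, game));
}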
avatar
jepsen1977: Games have their specs listed by minimum and recommended specs, and that's really all they need. If whatever PC you play on meets the specs then you can play the game, and if it doesn't, you can't.

Honesty and transparency is all that's needed here.
avatar
dtgreene: Except that the progression is not linear.

You can have a situation where game A can be played on system X but not system Y, while B can be played on Y but not X.

With GPUs, for example, older GPUs might not support standards supported by more recent ones, even if the older one was high end back in its day and the more recent one is the iGPU of a Celeron or Atom.

Also, the naming conventions can be confusing. For example, 4000 > 620, but in terms of GPU performance, Intel HD 4000 < Intel UHD 620 (I *think*, someone correct me if I'm wrong).
This doesn't have much to do w/ the numbering.
Everything got reset once the U lettering got introduced.

The U and no U are the key.
HD = High Def (regular).
UHD = Ultra High Def.

UHD started the numbering over b/c that line begins w/ much better 4K video support.

HD signifies HD support, which would be the 720p to 1080p arena.
UHD is Ultra Hi-Def, which is 2160p (also known as 4K).

I wouldn't expect games to run well on those b/c they are integrated GPU's, of course - but regular video watching (movies, TV, etc.) should be handled fine at those respective resolutions.

Expect to spend a fortune, if you plan to play games at 4K 60fps.
Post edited March 09, 2021 by MysterD
avatar
MysterD: You'd think w/ the speed of SSD's and whatnot, they'd go back to the old-school design where areas of an "open world" were broken up into chunks/segments, just to get some more performance. Even if they tossed in load screens in the middle - which, with SSD's, probably won't be in the way for long.
avatar
dtgreene: Games *really* shouldn't require SSDs to run; this is especially true if the game is, for some reason, huge. The game should be optimized to only load what's needed in that case. (Or better, make the game smaller so that it doesn't have to load as much.)
Thing is: those new AAA games ARE indeed HUGE. They are aiming for 4K support or better. Expect that to eat up space.

Your Ubisoft games and all of those huge, seamless, open-world games with AI, NPC's, action, special effects, shadows, RTX support, and everything else everywhere - yeah, they're going to need every bit of juice and power they can get.

The new games are utilizing SSD's b/c they are much faster for loading textures, character models, areas, maps, game worlds, etc. It's not even a contest in terms of load times, especially if you have an M.2 SSD. Don't expect any texture pop-in or anything.

Also, consoles are going w/ SSD's now for their main storage, on both Xbox and PS5. So, guess what? That means your new baseline for newer games is going to be THAT. Basically, players are going to have to adapt to these new changes if they want to keep up w/ the Joneses. If the lowest common denominators (i.e. Xbox Series S & X and PS5) are requiring SSD's, so will PC ports from those platforms.

I think regular HDD's are pretty much going to be used for back-up game storage, since games are now getting so big - some won't even fit on one BR disc or BDXL disc. Some games, like some of the newer COD's, are in the 200+ GB range on PC.

Some games, especially when they are also on consoles, go w/ uncompressed audio and video & will also store the exact same textures multiple times (so the game isn't seeking for the same texture file-name) - which also makes performance better. And since consoles aren't super-strong, they're going to use every advantage they can to make sure those games run solid on the mid-range console hardware powering the PS5 and XSX.

EDIT:
Yeah, I wish they'd go back to the old design of load screens in b/t areas & breaking game worlds up into smaller hubs/areas/chunks - b/c RT is really not there yet for 4K. Sure, it's fine for 1080p - but ugh, it's gonna need DLSS and other upscaling tricks to pump out more performance here.

RTX is just a frame-rate killer.
Post edited March 09, 2021 by MysterD
avatar
MysterD: I think regular HDD's are pretty much going to be used for back-up game storage, since games are now getting so big - some won't even fit on one BR disc or BDXL disc. Some games, like some of the newer COD's, are in the 200+ GB range on PC.
That size would preclude me putting the game on SSD at all.

(The only SSD I own that would be big enough I'm not actually able to use at the moment.)
avatar
MysterD:
From the ARK.Intel website, the maximum supported display resolutions for some CPUs I'm very familiar with:

HD 4400: 3200x2000 @ 60 Hz
HD 620: 4096x2304 @ 60 Hz
UHD 620: 4096x2304 @ 60 Hz

As I've stated before, the UHD 620 and HD 620 are the same performance-wise; even the supported video codecs are the same (don't quote me on this). They are very similar to the older HD 520, which is the last one with Windows 7/8/8.1 driver support.

So, regarding which supersedes what, from older to newer:

HD 3000: used on Sandy Bridge 2nd gen CPUs. Underpowered, only old codec support and DX10.

HD 4000: Ivy Bridge, 3rd gen. This marks the start of good Intel integrated graphics, with DX11, DirectCompute, modern codecs (not sure if it accelerates VP9 and H.265) and enough power to run many modern games (of course, not triple-A).

HD 4400/4600: Haswell/Broadwell, 4th and 5th gen. Slight update from the previous line, a small performance improvement but better overall support, including the Quick Sync encoder and Vulkan on Linux (didn't test myself).

HD 520: Skylake, 6th gen. Major update with DX12, modern 4K decoders, Vulkan and better performance.

HD 620: Kaby Lake, 7th gen. Slight update to codecs and the first GPU with Win10-only support.

UHD 620: 8th and 9th gen CPUs and, as far as I know, the same as above.

UHD G1/G4/G7: used on 10th gen 10 nm CPUs. Increased core count, better performance and overall support.


I've only listed the more popular parts; the stuff found on Celerons and Atoms is all over the place. Iris graphics have dedicated memory/cache, more cores, higher power consumption and way better performance, and are usually only found in very expensive devices such as Apple's laptops. (I've posted a couple of NUCs featuring Iris graphics in this thread; it's the first time I've seen it on cheaper computers.)

Also, Intel GPUs themselves have generations: the HD 4000 is Gen7 while the HD 520 is Gen9. In the list above, what's mentioned is the CPU gen, for clarity and simplicity.

Not trying to be a smart azz, but there seems to be a lot of confusion on this topic (who'd wonder, with all these naming schemes). Intel ARK and NotebookCheck are good sources to check this stuff.