avatar
Dark_art_: HD 4400/4600 Haswell/Broadwell 4th and 5th gen. Slight update from previous line, small performance improvement but better overall support, including QuickSync encoder and Vulkan on Linux (didn't test myself).
I get a warning about the Vulkan support being incomplete on Linux when I test it.
avatar
Dark_art_: HD 4400/4600 Haswell/Broadwell 4th and 5th gen. Slight update from previous line, small performance improvement but better overall support, including QuickSync encoder and Vulkan on Linux (didn't test myself).
avatar
dtgreene: I get a warning about the Vulkan support being incomplete on Linux when I test it.
It seems Intel dropped a lot of promised support when things started to go south, including Vulkan on the HD 4400/4600, Atoms and some Celerons. Linux support was probably half done by that time.
avatar
MysterD:
avatar
Dark_art_: From the Intel ARK website, some CPUs I'm very familiar with:

HD 4400: 3200x2000 @ 60 Hz
HD 620: 4096x2304 @ 60 Hz
UHD 620: 4096x2304 @ 60 Hz

As I've stated before, the UHD 620 and HD 620 are the same performance-wise, and even the supported video codecs are the same (don't quote me on this). They are very similar to the older HD 520, which is the last of these GPUs with Windows 7/8/8.1 support.

So, regarding which supersedes what, from older to newer:

HD 3000 used on Sandy Bridge 2nd gen CPUs. Underpowered, only old codec support and DX10.

HD 4000 Ivy Bridge 3rd gen. This marks the start of good Intel integrated graphics, with DX 11, DirectCompute, modern codecs (not sure if it accelerates VP9 and H.265) and enough power to run many modern games (of course, not AAA).

HD 4400/4600 Haswell/Broadwell 4th and 5th gen. Slight update from previous line, small performance improvement but better overall support, including QuickSync encoder and Vulkan on Linux (didn't test myself).

HD 520 Skylake 6th gen. Major update with DX 12, modern 4K decoders, Vulkan and better performance.

HD 620 Kaby Lake 7th gen. Slight update to codecs and the first of these GPUs with Windows 10-only support.

UHD 620 used on 8th and 9th gen CPUs and, as far as I know, the same as above.

UHD G1/G4/G7 used on 10th gen 10 nm CPUs. Increased core count, better performance and overall support.

I've only listed the more popular parts; the stuff found on Celerons and Atoms is all over the place. Iris graphics have dedicated memory/cache, more cores, higher power consumption and way better performance, and are usually only found in very expensive devices such as Apple's laptops. (I've posted a couple of NUCs featuring Iris graphics in this thread; it's the first time I've seen it on cheaper computers.)

Also, Intel GPUs have their own generations: the HD 4000 is Gen7 while the HD 520 is Gen9. In the list above, what's mentioned is the CPU generation, for clarity and simplicity.

Not trying to be a smart azz, but there seems to be a lot of confusion on this topic (who'd wonder, with all these naming schemes). Intel ARK and Notebookcheck are good sources to check this stuff.
HD 4400's 3200x2000 is NOT 4K / 2160p / 3840x2160.

Though it sounds like the HD 620 and UHD 620 do support 4K, as their maximum goes a little bit over 4K aka 2160p aka 3840x2160.
So thanks for clearing that up quite a bit.
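
If anyone wants to sanity-check that in raw pixel counts, here's a trivial sketch (the figures are taken straight from the ARK numbers quoted above):

    # Raw pixel counts of the maximum resolutions quoted above vs. 4K UHD.
    resolutions = {
        "HD 4400 max": (3200, 2000),
        "4K UHD (2160p)": (3840, 2160),
        "HD 620 / UHD 620 max": (4096, 2304),
    }
    for name, (w, h) in resolutions.items():
        print(f"{name}: {w}x{h} = {w * h / 1e6:.1f} megapixels")

(3200x2000 works out to about 6.4 MP, well short of 4K's roughly 8.3 MP, while 4096x2304 is about 9.4 MP, a bit beyond it.)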

Looks like the HD 620 and UHD 620 went through a very smart re-branding process to clear up and un-muddy some waters here, as the U (for Ultra) here signifies 2160p... and actually a bit beyond, in this case.

Thanks for the links.

EDIT:
Regardless, gamers who are really serious about modern games should really invest in a solid CPU/GPU pair:

1. CPUs: Ryzen 3000 or newer for desktops, or Ryzen 4000 or newer for laptops (or better); and/or Intel 10th gen or newer i5s or i7s (laptops or desktops)

2. And for GPUs: go with AMD's RX 5000 or 6000 series (the 6000 series is probably better for those who want ray tracing, and hopefully AMD's DLSS equivalent comes soon, as the 5000 series doesn't do RT); and/or an Nvidia GTX 1660 (no RT) or an Nvidia RTX 2000 or 3000 series card.

They should probably at least aim for having RT support, especially since the consoles (PS5 and XBS/X) support RT now.

Of course, patient gamers playing older stuff, not planning on RT, or waiting for that major performance/RT jump a few gens down the line would probably be more than fine with even a GTX 1660 for now.
Post edited March 10, 2021 by MysterD
What modern benchmarking software is there for comparing iGPUs and GPUs in games?
Some of the games on GOG have built-in methods for testing performance. The Quake games have timedemos, and I think the X space economy games have them too. X3 has a separate rolling demo, which I've linked below, as the game itself doesn't let you do it.
https://www.egosoft.com/download/x3/demos_en.php?download=145
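
For the Quake-style timedemos, you can kick one off straight from the command line and read the fps summary the engine prints when it finishes. A minimal sketch, assuming a Quakespasm-style source port; the executable name, game path and demo name ("demo1") are assumptions, so substitute whatever your install uses:

    import subprocess

    # Launch the engine and play back a recorded demo as fast as it can render it.
    # "+timedemo demo1" passes the console command at startup; the engine should
    # print a frames / seconds / fps summary in its console when the demo ends.
    subprocess.run([
        "quakespasm",                  # assumed engine binary (any port with timedemo support works)
        "-basedir", "/path/to/quake",  # assumed path to the game data
        "+timedemo", "demo1",          # assumed name of a bundled demo
    ])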


avatar
Mortius1: It has been tried in the past
Hybrid Crossfire
LucidLogix Virtu GPU

Results were disappointing and both technologies sank without a trace.

Nvidia came closest to having something passable: dedicated PhysX cards.
One GPU for graphics, the other for physics. Naturally both must be Nvidia cards.
Lucidlogix looked interesting.

I think it would have been better if separate physics cards had remained and gone through a competitive cycle, the same way 3D accelerators did for graphics.

avatar
dtgreene: With virtual machines and GPU passthrough, you can. You won't be using them for the same game, but you could be running a demanding game with one GPU and a *different* demanding game with a different GPU (assuming the CPU is powerful enough to handle both, and every relevant piece of hardware supports this (which means no consumer-level nvidia card for the guest)).

This is, however, a rather unusual configuration, unless we're talking about a cloud hosting provider that offers GPUs (for an extra fee, of course).
It would have been good if you could assign the different GPUs to tasks, the way you can with the cores on the CPU.
Post edited March 11, 2021 by §pectre
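
(As an aside on the passthrough setup mentioned above: on Linux, the usual first check is whether the GPU sits in its own IOMMU group, since a group can only be handed to a VM as a whole. A minimal sketch of that check; the sysfs layout is standard, but it only shows anything if IOMMU/VT-d is enabled in firmware and on the kernel command line:)

    from pathlib import Path

    # List IOMMU groups and the PCI devices in each one. For clean GPU passthrough,
    # the GPU (and its HDMI audio function) should ideally be alone in a group.
    groups = Path("/sys/kernel/iommu_groups")
    for group in sorted(groups.iterdir(), key=lambda p: int(p.name)):
        devices = [d.name for d in (group / "devices").iterdir()]
        print(f"IOMMU group {group.name}: {', '.join(devices)}")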
avatar
dtgreene: With virtual machines and GPU passthrough, you can. You won't be using them for the same game, but you could be running a demanding game with one GPU and a *different* demanding game with a different GPU (assuming the CPU is powerful enough to handle both, and every relevant piece of hardware supports this (which means no consumer-level nvidia card for the guest)).

This is, however, a rather unusual configuration, unless we're talking about a cloud hosting provider that offers GPUs (for an extra fee, of course).
avatar
§pectre: It would have been good if you could assign the different GPUs to tasks, the way you can with the cores on the CPU.
Well, there's Jailhouse, which allows (and I believe even requires) reserving specific CPUs for specific VMs; it's useful for things like real time processing (in particular, hard real time).
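
For ordinary processes (as opposed to whole VMs), Linux already lets you do the CPU-core version of that assignment. A minimal sketch using Python's standard library, Linux-only; the core numbers are just an example:

    import os

    # Pin the current process (pid 0 = "this process") to cores 0 and 1 only,
    # roughly the per-process analogue of what Jailhouse does for entire VMs.
    os.sched_setaffinity(0, {0, 1})
    print("now allowed on cores:", os.sched_getaffinity(0))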
avatar
§pectre: What modern benchmarking software is there for comparing iGPUs and GPUs in games?
If you mean comparing performance in games you own, most people use MSI Afterburner (you don't need an MSI card), and there are plenty of decent guides on YouTube.

If you mean "benchmarking software" as in a downloadable benchmark, the two most popular are 3DMark and Unigine, and they both have options for lighter weight benchmarks for lower end iGPU's (eg, 3DMark Night Raid or older Unigine Heaven / Valley / Tropics). Unigine is probably the better of the two for both offering a Linux version and not needing a Steam client.
Post edited March 11, 2021 by AB2012
avatar
§pectre: What modern benchmarking software is there for comparing iGPUs and GPUs in games?
avatar
AB2012: If you mean comparing performance in games you own, most people use MSI Afterburner (you don't need an MSI card), and there are plenty of decent guides on YouTube.

If you mean "benchmarking software" as in a downloadable benchmark, the two most popular are 3DMark and Unigine, and they both have options for lighter weight benchmarks for lower end iGPU's (eg, 3DMark Night Raid or older Unigine Heaven / Valley / Tropics). Unigine is probably the better of the two for both offering a Linux version and not needing a Steam client.
Is there a (preferably open source) benchmarking program that isn't limited to x86? I think it would be interesting to see how the Raspberry Pi 4 compares to my desktop and small laptop in GPU performance.
avatar
dtgreene: Is there a (preferably open source) benchmarking program that isn't limited to x86? I think it would be interesting to see how the Raspberry Pi 4 compares to my desktop and small laptop in GPU performance.
I don't have a Pi myself, but this seems to be what some people use:
https://www.geeks3d.com/20190930/raspberry-pi-4-vs-raspberry-pi-3-cpu-and-gpu-benchmarks/
You can also use some 3D demo and monitor the fps (not sure if it's open source though), or good old glxgears with v-sync turned off.
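
If you want the glxgears route scripted, a minimal sketch (this assumes Mesa drivers, where the vblank_mode=0 environment variable disables v-sync for just that one process):

    import os
    import subprocess

    # Run glxgears with v-sync off so it reports an uncapped frame rate
    # instead of being locked to the display's refresh rate (Mesa only).
    env = dict(os.environ, vblank_mode="0")
    subprocess.run(["glxgears"], env=env)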

Although not exactly a GPU benchmark, the WebGL samples provide an interesting platform to compare web 3D performance. I've used Aquarium (it takes a while to fully load) many times; for reference, an i3-7100U on Win 8.1 using Firefox can do 60 fps with 500 fish and 45 fps with 1000 fish.
Post edited March 11, 2021 by Dark_art_
avatar
§pectre: What modern benchmarking software is there for comparing iGPUs and GPUs in games?
For what purpose? Are you first going to buy multiple MB/GPUs, then benchmark them? Software that tests your hardware can be 'interesting', but does it serve a purpose?
avatar
Shadowstalker16: In terms of actually optimizing for recent hardware, I think all devs should do it. But I'd imagine it would depend on the skill of the people coding it and their publisher's will when it comes to what and where they invest their time. If the design philosophy is just to vomit out open, unoptimized worlds, I'd imagine quantity would take priority over (optimization) quality.
avatar
AB2012: On the one hand the heavyweight AAA games have never really run well on the iGPU generation on which they launched. On the other hand, you're right that devs have gotten lazy with doing the minimum they can to build a game without ever really digging deep to fully optimise it.
Stop blaming the developers for this. Some devs care more about their craft than others, but there are ultimately time constraints that dictate where the emphasis is put. That is decided by managers or, in the case of very small indie operations, by the need to get something out there ASAP, make some money and pay the bills.

Game development shops are infamous for putting their devs through the wringer and burning them out. As much as I like games, I made a career decision not to work for gaming shops. Those people are not lazy, they are often badly overworked and have to make a lot of tradeoffs to deliver very complex software on time.

As for lone indie developers, I think it's plain crazy that some people out there make a good game with non-trivial graphics entirely from scratch by themselves (programming, design, graphics, the works). Of course they'll have to take shortcuts somewhere to release within their lifetime. You expect those people to do fine-grained GPU optimisation on top of everything else they have to do? Really?!?!
Post edited March 11, 2021 by Magnitus
avatar
AB2012: On the one hand the heavyweight AAA games have never really run well on the iGPU generation on which they launched. On the other hand, you're right that devs have gotten lazy with doing the minimum they can to build a game without ever really digging deep to fully optimise it.
avatar
Magnitus: Stop blaming the developers for this. Some devs care more about their craft than others, but there are ultimately time constraints that dictate where the emphasis is put. That is decided by managers or, in the case of very small indie operations, by the need to get something out there ASAP, make some money and pay the bills.

Game development shops are infamous for putting their devs through the wringer and burning them out. As much as I like games, I made a career decision not to work for gaming shops. Those people are not lazy, they are often badly overworked and have to make a lot of tradeoffs to deliver very complex software on time.

As for lone indie developers, I think it's plain crazy that some people out there make a good game with non-trivial graphics entirely from scratch by themselves (programming, design, graphics, the works). Of course they'll have to take shortcuts somewhere to release within their lifetime. You expect those people to do fine-grained GPU optimisation on top of everything else they have to do? Really?!?!
It's about striking a balance. As I said before, publishers may not care enough to optimize, and in those cases nothing can really be done. In the case of indies, it depends on their budget and skill. But neither indie devs nor AAA studios will get much revenue out of something that looks like Minecraft and runs like Metro. Asking them to optimize isn't unreasonable, and they should only listen to and deliver on what's possible. But don't think optimization is something unjustly demanded; it's in the developers' interest as well, and when you're selling something, it should conform to some minimum standards.
avatar
thegreyshadow: Games should be playable on onboard GPU systems
I second that and your exposition in general @thegreyshadow

And I disagree with those reducing it to:
-South America problem
-A cash problem
-Gaming is exclusive to desktop PCs and/or expensive GPUs
-Integrated GPUs = business only and/or never designed for games
-The rule is to upgrade/replace PC every 3 years or so
-Profit, Market share...
What a wowthread!

Lots of excuses and no real solutions
First off, nice move, you necromancer, you. Here are some hugely outdated love metal lyrics about what you did to this thread.

avatar
cryware: -Integrated GPUs = business only and/or never designed for games
No longer really true - see Vega 8 and Intel Xe LP. You can definitely do some light-weight gaming on these things - or even play AAA games from a dozen years ago maxed out.
And here I thought you were linking to the Powerwolf one :)) (Which thought led me to find out they had a new version of it and a video... but I couldn't see it because of something else that was just mentioned around here. Huh.)