thegreyshadow: You said it. They are in the majority of PCs.
There's untapped potential.
And you missed the rest of the argument. There's NO potential for game developers whatsoever. Integrated graphics are NOT designed for games; they are for business, which is the primary use of PCs worldwide, contrary to what you may think.
Post edited March 08, 2021 by anzial
tomimt: Onboard GPUs are not meant for gaming. They are cheap, low-energy solutions meant for situations where even a mid-level graphics card would make little sense. In other words, they are meant for office use. The AAA games industry is not going to bother with them, because they can lack many of the features that dedicated cards, even low-end ones, support.

It's possible that at some point PCs will evolve to where an onboard GPU is the only thing you need, but as long as there is a thousand-mile gap between their performance and that of dedicated cards, it won't happen.
Good point.
However, there are good (but old) games which play very well and look reasonably good. Case in point: Far Cry 2, a complex game with complex physics, which runs well on onboard GPUs even at its highest settings.
thegreyshadow: You said it. They are in the majority of PCs.
There's untapped potential.
anzial: And you missed the rest of the argument. There's NO potential for game developers whatsoever.
On that point we clearly disagree.
Post edited March 08, 2021 by thegreyshadow
thegreyshadow: On that point we clearly disagree.
Too bad. The rest of the world agrees with me, and you experienced it already. You can't change the world to your wishes, you simply don't have the clout or money to do so. Live with it.
Post edited March 08, 2021 by anzial
thegreyshadow: It simply doesn't make business sense.
As others have mentioned, it is exactly business sense. Up until fairly recently, there wasn't enough data bandwidth for a dedicated external GPU, and even that comes with massive caveats.
Darvond: Up until fairly recently, there wasn't enough data bandwidth for a dedicated external GPU, and even that comes with massive caveats.
lol, I have a 10-year-old notebook with an external GPU :) Is 10 years 'fairly recent'? To be fair, though, I've pretty much never used it, lol; it's only useful as a docking station, not for gaming. I think external GPUs for notebooks are still primarily used for that: docking a notebook into a sort of desktop replacement, with mobility as an option, intended for business use, not gaming.
Post edited March 08, 2021 by anzial
Orkhepaj: but many games are not playable at all with that hardware
thegreyshadow: My point is that they should if game companies want to increase their potential install base.

Orkhepaj: and would make no sense to limit games to that level
thegreyshadow: Read my post and replies again.
I never said anything about limiting anything. Game companies can set the highest tier as high as they want.
If anything, my point is that game companies are limiting their profit and market opportunities by setting hardware constraints too high.
They should expand those constraints, not limit them.
they should print the game out on toilet paper so that even those who have no PCs can "play"
win-win, according to your logic
thegreyshadow: On that point we clearly disagree.
anzial: Too bad. The rest of the world agrees with me, and you experienced it already. You can't change the world to your wishes, you simply don't have the clout or money to do so. Live with it.
Hey, slow down, cowboy :)

I'm not sure the rest of the world agrees with you, and obviously I cannot change the world to my wishes, and I'm living with it.

I'm just making a suggestion, shooting for the moon, call it whatever you want. Let me use my little soapbox...

Anyway, you had good and thoughtful comments. Thanks for that.
thegreyshadow: My point is that they should if game companies want to increase their potential install base.

Read my post and replies again.
I never said anything about limiting anything. Game companies can set the highest tier as high as they want.
If anything, my point is that game companies are limiting their profit and market opportunities by setting hardware constraints too high.
They should expand those constraints, not limit them.
Orkhepaj: they should print the game out on toilet paper so that even those who have no PCs can "play"
win-win, according to your logic
Sorry, hyperbole doesn't work.
I never said there should be no limits, only that such limits should be lowered a little more.
Post edited March 08, 2021 by thegreyshadow
thegreyshadow: Sorry, hyperbole doesn't work.
I never said there should be no limits, only that such limits should be lowered a little more.
Hyperbole works; it made you come up with a limit. Should CPUs with integrated GPUs be the limit?
Of course not, and there goes your whole topic.
thegreyshadow: Good point.
However, there are good (but old) games which play very well and look reasonably good. Case in point: Far Cry 2, a complex game with complex physics, which runs well on onboard GPUs even at its highest settings.
And that's really the problem. Far Cry 2 was released in 2008, so what you are asking is that the games industry turn back the clock a decade or more. That won't happen, not in the AAA industry at least. As long as the performance of onboard GPUs keeps lagging several generations behind what dedicated cards can do, there's just no business there for AAA games.

A good deal of onboard GPUs are in machines that were never designed to be used as gaming devices. They are in small public computers, business laptops and desktops, and other devices like set-top boxes or even household appliances. The market segment for them is vastly different.

Now, do they keep getting better? Yes, absolutely, but the performance increases in them are way smaller than in dedicated cards. There's very little incentive for GPU manufacturers to make them much better, as they know the market segment and the needs of that segment. They are cheap alternatives with purposes outside gaming. That is also reflected in the driver sets for these GPUs: they just aren't a priority.
Games can only be optimized to a certain extent. There will always be some hardware that is not powerful enough to run a game, especially if the game is a modern 3D game with the latest visual effects. Models and textures, for example, can only scale down so much, and I doubt the devs of any game can make it run on outdated hardware without compromising the game elsewhere.
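To put rough numbers on that, here is a toy C++ sketch of why texture presets bottom out (purely illustrative; the preset names, the 2048-pixel source size, and the 256-pixel floor are invented, not taken from any real engine). Each step down drops one mip level, but the result is clamped to a floor below which the asset stops reading as what it is supposed to be:

```cpp
#include <algorithm>
#include <cstdio>

// Invented preset names, for illustration only.
enum class Quality { Low, Medium, High, Ultra };

// Each preset below Ultra drops one more mip level (halving the resolution),
// but the result is clamped to a floor below which the texture no longer
// reads as the object it represents.
int textureSizeFor(Quality q, int fullSize, int minUsableSize = 256) {
    int droppedMips = 0;
    switch (q) {
        case Quality::Ultra:  droppedMips = 0; break;
        case Quality::High:   droppedMips = 1; break;
        case Quality::Medium: droppedMips = 2; break;
        case Quality::Low:    droppedMips = 3; break;
    }
    return std::max(fullSize >> droppedMips, minUsableSize);
}

int main() {
    const Quality presets[] = { Quality::Low, Quality::Medium,
                                Quality::High, Quality::Ultra };
    // Low already sits at the 256 floor: a still lower preset would change nothing.
    for (Quality q : presets)
        std::printf("texture size: %d\n", textureSizeFor(q, 2048));
}
```

Once the lowest preset sits at that floor, adding even lower presets buys weaker hardware nothing; the only way further down is redoing the assets.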

What do you think about CDPR releasing unoptimized versions of CP2077? I think if support for weaker hardware goes in that direction, i.e. making another, inferior version of the game entirely from scratch just so it can run, it would be a huge step backwards. It would be like the early console days, when the PS1 and N64, or PS2 and Xbox, versions were different and the player had to pick his poison in terms of optimization and the definitive version of the game.

In terms of actually optimizing for recent hardware, I think all devs should do it. But I'd imagine it depends on the skill of the people coding it and on their publisher's will when it comes to what and where they invest their time. If the design philosophy is just to vomit out open, unoptimized worlds, I'd imagine quantity would take priority over (optimization) quality.
thegreyshadow: What do you think?
Technology will eventually catch up with game development, even at the lower end. If you're not getting the right level of performance from your integrated graphics, you might want to shell out for a more powerful iGPU/processor combo. AMD certainly has some that can play The Witcher 2 no problem, at least at 1080p. The new Intel Xe should be good enough as well.

Otherwise, just wait. I am currently playing Freelancer on a passively cooled thin-and-light, all maxed out and the potato iGPU is not even breaking a sweat. Back in the day you needed a reasonably powerful graphics card to play it in full detail at 1024x768.

As to developers adjusting detail level for even lower end hardware, sure, there are those that do that. They're generally called mobile game developers :P.
Post edited March 08, 2021 by WinterSnowfall
OP, your argument is twofold.

Yes, there is a large market out there made up of low-end notebooks capable of some light gaming (by today's standards), but it remains to be seen whether it would be profitable.
First of all, not everybody with such a PC is a gamer, and not all of those gamers would buy the game; producers and investors usually don't like to make investments based on assumptions.
As you were already told, such machines are not made and sold for gaming but rather to accomplish business tasks, and to that end they come equipped with low-end GPUs, and in many cases low-end CPUs as well. Additionally, consider that older CPUs lack modern instruction sets.

Add to that (someone already touched on this) the fact that you can scale down a game's code and graphics only to a certain extent, beyond which you are forced to stop or to develop different games for different target machines.
For example, you can simplify geometry, but at some point you run into the basic structure of your meshes: the alternative is to modify the asset or remodel it from the ground up. The same is true the other way around: you can't model something low-poly and expect its topology to scale up automatically.
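To make that concrete, here is a toy C++ sketch of discrete LOD selection (illustrative only; the triangle counts and distances are invented). The renderer can only pick among the meshes an artist actually authored, so beyond the coarsest one there is nothing left to fall back on:

```cpp
#include <cstdio>

struct LodMesh {
    int triangles;      // complexity of this hand-authored mesh
    float maxDistance;  // use this LOD up to this camera distance
};

// A hypothetical asset with three artist-made LODs, coarsest last.
const LodMesh kLods[] = {
    { 50000,  20.0f },  // LOD0: full detail, close up
    { 12000,  60.0f },  // LOD1
    {  3000, 150.0f },  // LOD2: the coarsest mesh anyone modelled
};

const LodMesh& selectLod(float distance) {
    for (const LodMesh& lod : kLods)
        if (distance <= lod.maxDistance)
            return lod;
    // Past the last threshold we are stuck with LOD2: no lower level exists
    // unless someone remodels the asset by hand.
    return kLods[2];
}

int main() {
    const float distances[] = { 10.0f, 100.0f, 500.0f };
    for (float d : distances)
        std::printf("distance %.0f -> %d triangles\n", d, selectLod(d).triangles);
}
```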

All that said, there's no assurance that people owning low-end PCs will buy the game, because they might already own a console as well. Your argument is reasonable, but both technically and economically it's probably not very profitable at the scale of the modern AAA industry.
Post edited March 08, 2021 by Judicat0r
thegreyshadow: What do you think?
On the one hand, the heavyweight AAA games have never really run well on the iGPU generation on which they launched. On the other hand, you're right that devs have gotten lazy, doing the minimum they can to build a game without ever really digging deep to fully optimise it. It doesn't just affect iGPUs but low-end dGPUs as well; e.g., more than a few people have questioned why games like Deus Ex: Mankind Divided should run 4-6x slower than Human Revolution when they're barely a few years apart, with hardly anything to show for it. If there's one side effect of the current GPU shortage, though, it's that devs planning a release may be more incentivised to optimise it a bit better, knowing they can't rely on "just upgrade your GPU then" for a larger than usual percentage of gamers who can't buy one, or risk losing sales this year.

The real issue with expecting high performance on iGPUs from modern AAAs simply by offering lower presets is that the "weight" of modern game engines has a baseline that just doesn't scale down well. E.g., lowering the textures more and more won't get a game running anywhere near as well as older, lighter-weight engines do with the same textures. I saw someone post this on another forum the other day and LOL'd, but it does highlight the point perfectly.
Post edited March 08, 2021 by AB2012
thegreyshadow: What do you think?
A non-issue.

There are lots of games, especially indie games, that run just fine on weaker onboard GPUs. Play them. That is the beauty of PC gaming: there is a vast library of games available, from decades-old shareware games to the latest AAAAAA++++ games, that one can play on various PC specifications. Not all of them, but for any PC, you can find lots of games to play.

Requiring that all AAA games should also be playable on low-spec computers is odd, to say the least.
Idea: Develop a game on a Raspberry Pi 4, then port it to more mainstream hardware. The game should run well on low-end devices if done this way, right?

If compile times are taking too long, develop on a more powerful system, but occasionally test on the Pi.

Bonus point: You're guaranteed that your game will run on Linux, unless you do something Pi or ARM specific, and I don't see that being likely to happen here.

(Before anybody thinks this is a joke, I'm contemplating doing exactly this.)
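One cheap habit worth building in from day one: a frame-budget check in the debug build, so anything the Pi can't handle shows up immediately instead of months later. A minimal C++ sketch of the idea (the 33 ms / roughly 30 fps budget and the sleep standing in for real frame work are placeholder assumptions, not recommendations):

```cpp
#include <chrono>
#include <cstdio>
#include <thread>

int main() {
    using clock = std::chrono::steady_clock;
    using std::chrono::milliseconds;
    const auto budget = milliseconds(33);  // assumed target: roughly 30 fps

    for (int frame = 0; frame < 3; ++frame) {
        const auto start = clock::now();
        // Stand-in for one frame's update/render work; the last iteration
        // deliberately blows the budget so the warning fires.
        std::this_thread::sleep_for(milliseconds(10 + frame * 20));
        const auto elapsed = clock::now() - start;
        if (elapsed > budget)
            std::fprintf(stderr, "frame %d over budget: %lld ms\n", frame,
                         static_cast<long long>(
                             std::chrono::duration_cast<milliseconds>(elapsed).count()));
    }
}
```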