paladin181: I think that hardcore gamers with disposable income are the target audience. People who buy computers on the cheap aren't likely to spend much on games.
Dark_art_: I share the same opinion, and let me add that "those" people have moved to mobile.
I can't recall a single person in my circle of friends who plays primarily on an iGPU laptop. They either have a desktop or a beefy laptop, or they play on mobile and use the computer for other stuff.
I came in here to say almost exactly this.

Anyone "playing games" on a laptop with no discrete GPU... ISN'T playing games on a laptop - they're on their phone or console.
Shadowstalker16: In terms of actually optimizing for recent hardware, I think all devs should do it. But I'd imagine it would depend on the skill of the people coding it and their publisher's will when it comes to what and where they invest their time. If the design philosophy is just to vomit out open, unoptimized worlds, I'd imagine quantity would take priority over (optimization) quality.
It's interesting. I've seen people here say that the OP's example (Far Cry 2) is an old game (2008) and should run well on any onboard GPU nowadays. (Not saying you did that.)

But Metal Gear Solid V: The Phantom Pain is a 2015 game and runs on integrated graphics at the lowest resolution at 30 to 50 fps, depending on your CPU.

The Fox Engine is really well optimized, and if they did it, why can't others? (Rhetorical question.)

And what you said is completely true. (Yes, I'm looking at you, Ubisoft.)
Orkhepaj: I think you are lame.
Guess why people buy those expensive GPUs? Because they want to play games with better/faster graphics.
And that's why gaming companies make those games.
I mean, here the situation is very complicated. For example, a 3060 Ti is around $1,700 USD, and a complete computer is around double that, $3,400 USD. If you take into account that our salaries are around $400-600 USD, it would take months of work to build a basic mid-spec computer. You don't need expensive equipment to play games; maybe you won't be able to play over-the-top graphics games, but there are many games to play, especially classic games... Man, I used to play on an Intel HD 4400 and I was really happy!
KetobaK: I mean, here the situation is very complicated, ...
It's not - there are big GPU shortages all over the world, right now. Avoid buying a graphics card if you don't really need one, until prices settle again.
Or... spend big, I don't care, it's not my money.
KetobaK: I mean, here the situation is very complicated, ...
teceem: It's not - there are big GPU shortages all over the world, right now. Avoid buying a graphics card if you don't really need one, until prices settle again.
Or... spend big, I don't care, it's not my money.
I know, I know, but here we have exorbitant taxes on top of the markup, which makes it even more complicated (65% for purchases in foreign currency, and 50% for going over the limit on the amount of dollars you can purchase). I'm a really small retailer and I can't purchase anything; I'm crossing all my fingers for a quick solution XD
thegreyshadow: While my paltry Intel UHD 620 onboard GPU can play some very good games with great results, there are others that make it struggle (such as The Witcher 2, a game that's already 10 years old!).

Making games that are unplayable (at least at the lowest settings) on systems without discrete GPUs is counterproductive for game companies.

First of all, GPU production is experiencing (and has experienced since the global cryptocurrency explosion) chronic shortages and problems in availability.

Secondly, there is an absurdly large market of PCs with onboard GPUs.

Excluding such a computer base from your latest game simply imposes an unreasonable limit on your potential audience.

Games can have Crysis-like, ray-tracing levels of GPU consumption at their highest tiers; I can get that. But developers should also ensure that their games are playable at the lowest settings on onboard GPUs such as Intel UHD chipsets.

What do you think?
Well, I know that next-gen onboard GPUs will rival the performance of the 1050 Ti.

It seems the tech just wasn't good enough right up until now.

Desktop-wise, a discrete GPU can be bought for a small amount of cash.
thegreyshadow: While my paltry Intel UHD 620 onboard GPU can play some very good games with great results, there are others that make it struggle (such as The Witcher 2, a game that's already 10 years old!).

Making games that are unplayable (at least at the lowest settings) on systems without discrete GPUs is counterproductive for game companies.

First of all, GPU production is experiencing (and has experienced since the global cryptocurrency explosion) chronic shortages and problems in availability.

Secondly, there is an absurdly large market of PCs with onboard GPUs.

Excluding such a computer base from your latest game simply imposes an unreasonable limit on your potential audience.

Games can have Crysis-like, ray-tracing levels of GPU consumption at their highest tiers; I can get that. But developers should also ensure that their games are playable at the lowest settings on onboard GPUs such as Intel UHD chipsets.

What do you think?
Not everybody in the entire world needs or wants a GPU, or the extra cost that comes with it. Many people just need an office machine or Internet machine; they don't NEED all this extra iGPU or discrete-GPU stuff. Having that stuff on board costs money for the maker of the iGPU, and of course that cost gets passed on to the consumer.

Intel is not going to provide a great iGPU that developers can take advantage of. Intel is really in the CPU business, not the GPU business, and they've never had great iGPUs.

AMD might be the crew to look at, as they bought out ATI years ago and they also make CPUs. So, if anyone's going to make a solid iGPU, it would be them.

Nvidia might also be someone to look at, if their deal with ARM goes through, as then they could corner both the CPU and GPU markets.

But even then, most iGPUs are just NOT in the league of what current modern PC discrete GPUs are doing. Discrete GPUs are WAY ahead of iGPUs again, especially now with DLSS and RTX support. We've hit that next-level territory again now that the new consoles are out too (i.e. PS5 and Xbox Series X).

So, with those consoles here, expect PC requirements to soon get very similar to what the PS5 and XSX have as bare minimums.

So, if people really want a good GPU, yep, they're basically going to have to pony up for a PC (desktop or laptop) that includes one and/or pony up for the desktop card itself... provided you can actually find one.

For those buying laptops: you're going to have to buy a gaming laptop with a 2000- or 3000-series card to play some current modern PC games properly, especially with the new consoles here and all.

Shadowstalker16: In terms of actually optimizing for recent hardware, I think all devs should do it. But I'd imagine it would depend on the skill of the people coding it and their publisher's will when it comes to what and where they invest their time. If the design philosophy is just to vomit out open, unoptimized worlds, I'd imagine quantity would take priority over (optimization) quality.
.Keys: It's interesting. I've seen people here say that the OP's example (Far Cry 2) is an old game (2008) and should run well on any onboard GPU nowadays. (Not saying you did that.)

But Metal Gear Solid V: The Phantom Pain is a 2015 game and runs on integrated graphics at the lowest resolution at 30 to 50 fps, depending on your CPU.

The Fox Engine is really well optimized, and if they did it, why can't others? (Rhetorical question.)

And what you said is completely true. (Yes, I'm looking at you, Ubisoft.)
You also have to remember that the Far Cry series' old Crytek-based engine wasn't built for multi-core CPUs either. When Far Cry 1 was built on Crytek's engine, that engine was aimed at single-core PCs and expected CPUs to keep increasing in GHz, not in core count. Far Cry 5 still isn't the best-running game and is still on that same engine; even with improvements and tweaks, it isn't really built for multi-core CPUs.

While many of the Ubi games and Far Cry games are heavy on the CPU, they're also heavy on the GPU. Even FC5 still isn't really built for multi-core CPUs.

Would we really expect Crytek or Ubisoft to go back in and improve performance in an old engine and a big open-world game with AI, explosions, combat, and action everywhere?

Something like the Fox Engine, which is a much more modern engine, was probably built from scratch for multi-core CPUs. So it's going to straight-up run better and utilize CPUs better too.
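
To picture that single-core vs. multi-core difference in the abstract, here's a toy Python sketch. It's purely illustrative (it is not actual CryEngine/Dunia or Fox Engine code, and the entity data is made up): an engine built around one big serial update loop only gets faster when clock speeds go up, while one that splits independent per-entity work into jobs can spread the same work across however many cores are available.

```python
# Toy illustration only -- not real code from any engine mentioned above.
# A serial update loop is capped by single-core speed; a job-based update
# can spread independent entity work across a pool of workers.
from concurrent.futures import ThreadPoolExecutor
import os

def update_entity(entity):
    # Stand-in for per-entity AI/physics work that doesn't touch other entities.
    entity["x"] += entity["vx"]
    return entity

def serial_update(entities):
    # "Single-core era" design: one big loop; only higher GHz makes it faster.
    return [update_entity(e) for e in entities]

def job_based_update(entities, workers=os.cpu_count()):
    # "Multi-core era" design: independent jobs scheduled across a worker pool.
    # (In CPython the GIL limits CPU-bound threading, so the point here is the
    # structure of the work split, not a real speedup measurement.)
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(update_entity, entities))

world = [{"x": 0.0, "vx": 1.0} for _ in range(10_000)]
serial_update(world)
job_based_update(world)
```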
Post edited March 08, 2021 by MysterD
.Keys: It's interesting. I've seen people here say that the OP's example (Far Cry 2) is an old game (2008) and should run well on any onboard GPU nowadays. (Not saying you did that.)

But Metal Gear Solid V: The Phantom Pain is a 2015 game and runs on integrated graphics at the lowest resolution at 30 to 50 fps, depending on your CPU.

The Fox Engine is really well optimized, and if they did it, why can't others? (Rhetorical question.)

And what you said is completely true. (Yes, I'm looking at you, Ubisoft.)
MysterD: You also have to remember that the Far Cry series' old Crytek-based engine wasn't built for multi-core CPUs either. When Far Cry 1 was built on Crytek's engine, that engine was aimed at single-core PCs and expected CPUs to keep increasing in GHz, not in core count. Far Cry 5 still isn't the best-running game and is still on that same engine; even with improvements and tweaks, it isn't really built for multi-core CPUs.

While many of the Ubi games and Far Cry games are heavy on the CPU, they're also heavy on the GPU. Even FC5 still isn't really built for multi-core CPUs.

Would we really expect Crytek or Ubisoft to go back in and improve performance in an old engine and a big open-world game with AI, explosions, combat, and action everywhere?

Something like the Fox Engine, which is a much more modern engine, was probably built from scratch for multi-core CPUs. So it's going to straight-up run better and utilize CPUs better too.
You've got a point. But, to be fair, why shouldn't they be working on a new, well-optimized engine?

We're talking about Ubisoft, not a small studio. They have enough time, workforce, and money to do this if they really wanted to. We can say, without much fear of being wrong, that they're just lazy at this point, imho.

I mean, they may well have the technology, knowledge, and workforce to do this by now. Right?
Post edited March 08, 2021 by .Keys
MysterD: You also have to remember that the Far Cry series' old Crytek-based engine wasn't built for multi-core CPUs either. When Far Cry 1 was built on Crytek's engine, that engine was aimed at single-core PCs and expected CPUs to keep increasing in GHz, not in core count. Far Cry 5 still isn't the best-running game and is still on that same engine; even with improvements and tweaks, it isn't really built for multi-core CPUs.

While many of the Ubi games and Far Cry games are heavy on the CPU, they're also heavy on the GPU. Even FC5 still isn't really built for multi-core CPUs.

Would we really expect Crytek or Ubisoft to go back in and improve performance in an old engine and a big open-world game with AI, explosions, combat, and action everywhere?

Something like the Fox Engine, which is a much more modern engine, was probably built from scratch for multi-core CPUs. So it's going to straight-up run better and utilize CPUs better too.
.Keys: You've got a point. But, to be fair, why shouldn't they be working on a new, well-optimized engine?

We're talking about Ubisoft, not a small studio. They have enough time, workforce, and money to do this if they really wanted to. We can say, without much fear of being wrong, that they're just lazy at this point, imho.

I mean, they may well have the technology, knowledge, and workforce to do this by now. Right?
Sure, they (Ubisoft) have the tech, knowledge, and workforce to do all this - but the question is: will they?

The thing is - eh, probably not; they probably won't.

They haven't really done much of that before - so why would they do it now?

They got the engine from Crytek after everything went south after Far Cry, and they also have the Anvil engine for AC. I'm sure they have other engines for other games - but you get my drift, right?

They are a corporation. They are here for profit, to make money. They are going to milk all of their engines for their games as much as they can... before they truly need a new one.

I mean, they're also putting out AC yearly or every other year and constantly pumping out sequels, right?

They're saving money on their end by not doing all that much with the engine, let alone redoing it entirely - just tweaking it here and there - and passing the cost (of hardware) on to the consumer. The consumer basically has to keep up with the Joneses and is the one spending the money on good hardware to play Ubi's games.

Most PC gamers probably upgrade their PC every 1-3 years, and/or build or buy a new PC every 3-6 years, right?

Given that their games sell really well, and given that they're often unoptimized for tons of reasons - high fidelity, huge seamless open worlds, AI and NPCs everywhere, not reworking engines, putting out new iterations in a series every year or every other year, etc. - yeah, I expect their games to be the most demanding and unoptimized stuff on the market... well, except for probably The Medium.
Post edited March 08, 2021 by MysterD
Every piece of hardware a game company supports costs extra development resources: writing code for it, doing development and quality testing on it, and acquiring the various hardware variants to test on, among other costs. So naturally, software developers will focus their resources on the widest range of hardware they can target with the resources they have available, but they also have to draw the line somewhere.

It's also possible for a developer to attempt to target much too broad a variety of hardware, from very low end to very high end, and end up in a situation that can be disastrous. For an example of that, look at CD Projekt RED's Cyberpunk 2077. They tried to target every computer and gaming console ever made, back to the PDP-11 mainframe, and well, the game didn't work very well on all of that hardware and Bad Things Happened as a result. #TooSoon? <grin>

I'm both joking a bit and being quite serious when I say that. A developer really does have to pick and choose very carefully what hardware they are going to target, and try to determine what hardware will be widely in use when their game launches, not just when they start developing it. For games of a simpler nature it can be easy to support a very wide variety of hardware, especially if the game isn't cutting edge. The more a game pushes the envelope of technology, however, the more it relies on people having higher-end technology, and in some cases that might mean that integrated GPUs, older GPUs, or any particular piece of hardware might not have capabilities that align with where they plan to spend their limited development, quality testing, and other resources.

In the end it all comes down to business math. What percentage more money and manpower do you expend to target what percentage more users on a given piece of hardware? Do you spend 10% more budget to target a GPU that might make up 2% of your userbase? They end up having to figure out where to draw those lines in the sand, and there will always be people whose hardware gets left out as hardware and software technologies move ever forward.
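
To make that business math concrete, here's a rough back-of-the-envelope sketch in Python. Every number in it (budget share, userbase share, sales, price, conversion rate) is a made-up placeholder, and no publisher decides this with four lines of arithmetic, but it shows the shape of the trade-off being described.

```python
# Back-of-the-envelope sketch of the "do we support this hardware tier?" question.
# All figures are invented placeholders for illustration only.

def worth_supporting(extra_dev_cost, userbase_share, expected_total_sales,
                     price_per_copy, conversion_rate=1.0):
    """Return (net gain, extra revenue) from supporting one more hardware tier.

    extra_dev_cost       -- added porting/QA/optimization cost for the tier ($)
    userbase_share       -- fraction of potential buyers on that hardware (0..1)
    expected_total_sales -- projected units sold if everyone could be reached
    price_per_copy       -- net revenue per copy ($)
    conversion_rate      -- fraction of those extra users who would actually buy
    """
    extra_revenue = expected_total_sales * userbase_share * conversion_rate * price_per_copy
    return extra_revenue - extra_dev_cost, extra_revenue

# Example: spend 10% more of a $10M budget to reach a GPU tier with 2% of the userbase.
net, revenue = worth_supporting(extra_dev_cost=1_000_000,      # 10% of a $10M budget
                                userbase_share=0.02,            # 2% of potential buyers
                                expected_total_sales=2_000_000,
                                price_per_copy=40)
print(f"extra revenue: ${revenue:,.0f}, net: ${net:,.0f}")
# -> extra revenue: $1,600,000, net: $600,000 under these assumptions;
#    halve the sales projection and the same tier becomes a net loss.
```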

The same is true for supporting other operating systems or devices. Nothing can be supported without the associated cost of doing the work, and if they don't have the resources to do it, or the bean counters don't predict a worthwhile return on investment for a given amount of work and risk, then it isn't likely to be a priority.

Today's games are rushed to market as-is, with lots of bugs and lacking optimization even on the hardware they do end up officially supporting. Stuffing more hardware onto the list just delays the game's launch further, lowers the quality of the game either outright or on some hardware more than others, or causes features to get dropped in order to meet the release deadlines.

At the end of the day something has to give, and some hardware just isn't powerful enough and/or widely used enough to get on the radar, so it doesn't get supported.
thegreyshadow: It simply doesn't make business sense.

I repeat: on the highest tiers, go all the way you want; studios could even require three parallel jumbo GPUs to render everything at the highest detail. But it makes sense to have a "lowest / potato" tier which makes the game accessible to GPUs such as Intel onboard chips.
Mortius1: Have you forgotten the Cyberpunk 2077 outcry?

CDPR explicitly supported the PS4 and Xbox One consoles, despite those platforms not being strong enough. Could you describe that blur as anything other than potato mode?

For releasing Cyberpunk 2077: Potato Edition, CDPR was pilloried in the press. Sony pulled the game from their online store and offered refunds to anyone that wanted it.

That hardly seems like business sense.
Indeed, they were extremely over-ambitious IMHO. Had they targeted only PC and the newest generation of consoles, think of all the resources poured into the older consoles that could have been redirected into completing the actual game, optimizing it, creating decent AI, etc. Heck, it would have even been more worthwhile if they had made the game NOT work on the laptop I just finished playing it on (1050 Ti), in order to focus on making it work better on the primary targets that are actually powerful enough to run the game properly.
Post edited March 08, 2021 by skeletonbow
thegreyshadow: The scenario you propose of course would not make any sense at all. But that's not what I'm proposing.

Entry level: Intel GPU or similar.
The game should be playable with an acceptable level of detail.
Orkhepaj: But many games are not playable at all on that hardware,
and it would make no sense to limit games to that level.
Indeed, and many games choose not to support Intel integrated video because of that, which makes sense. Personally I think all games would be better off if they did that, as all the development resources that go into supporting potato onboard graphics could be spent optimizing and improving the game on the actual discrete gaming cards that everyone uses.

And as counter-intuitive as it might be, I say that as someone who owns thousands of games and has an outdated 7-year-old AMD GPU and a laptop with low-end (by today's standards) onboard Intel and Nvidia GPUs. So my opinion would actually hurt me more than help me, but it'd be better for gaming overall, keeping things moving forward and with higher quality.

I'd be getting new hardware if it were actually possible to get it in 2021... Perhaps in another year they'll end up having to support old potato hardware because new hardware only exists in PDF datasheets and not on store shelves. :)
KetobaK: I mean, here the situation is very complicated, ...
teceem: It's not - there are big GPU shortages all over the world, right now. Avoid buying a graphics card if you don't really need one, until prices settle again.
Or... spend big, I don't care, it's not my money.
People who have purchased good GPUs haven't suddenly disappeared just because there is a shortage. They still exist. If you look at the Steam hardware statistics, you quickly see that the number of people using something like Intel GPUs is a small minority. The most common cards are from Nvidia's 10** and 16** lines. Those are the mid to minimum specs most AAA devs aim for, leaving the biggest bells and whistles to the better lines of cards.

So based on the numbers, it really makes little sense to try to make AAA games run on onboard GPUs. Besides, I suspect that most people who use Steam on a device with such a GPU are those who play older games, lighter indies, or casual games like the hidden object titles Artifex Mundi makes.
Orkhepaj: But many games are not playable at all on that hardware,
and it would make no sense to limit games to that level.
skeletonbow: Indeed, and many games choose not to support Intel integrated video because of that, which makes sense. Personally I think all games would be better off if they did that, as all the development resources that go into supporting potato onboard graphics could be spent optimizing and improving the game on the actual discrete gaming cards that everyone uses.

And as counter-intuitive as it might be, I say that as someone who owns thousands of games and has an outdated 7-year-old AMD GPU and a laptop with low-end (by today's standards) onboard Intel and Nvidia GPUs. So my opinion would actually hurt me more than help me, but it'd be better for gaming overall, keeping things moving forward and with higher quality.

I'd be getting new hardware if it were actually possible to get it in 2021... Perhaps in another year they'll end up having to support old potato hardware because new hardware only exists in PDF datasheets and not on store shelves. :)
Indeed. It would be best if there were a standard baseline setup defined each year, and game requirements were labeled like "runs on the 2015 baseline, recommends the 2018 baseline". You could then check whether your hardware is as good as those baselines, and they could even show settings and fps numbers for each one.
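
Something like this rough Python sketch, for example. The baseline years and the score numbers are invented placeholders (a real checker would key off actual CPU/GPU models and benchmark results), but it shows the idea of labeling requirements against a yearly baseline and checking your own hardware against it.

```python
# Hypothetical "baseline setup per year" labels, as suggested above.
# The tiers and scores are made up for illustration; they are not a real benchmark scale.
BASELINES = {
    2015: {"cpu_score": 4_000, "gpu_score": 3_500, "ram_gb": 8},
    2018: {"cpu_score": 7_000, "gpu_score": 7_500, "ram_gb": 16},
    2021: {"cpu_score": 11_000, "gpu_score": 14_000, "ram_gb": 16},
}

def meets_baseline(hw, year):
    """True if the machine matches or beats every component of that year's baseline."""
    return all(hw[part] >= required for part, required in BASELINES[year].items())

def best_supported_year(hw):
    """Highest baseline year this machine satisfies, or None if it clears none."""
    supported = [year for year in sorted(BASELINES) if meets_baseline(hw, year)]
    return supported[-1] if supported else None

# Example: a machine that clears the 2015 baseline but not the 2018 one.
my_pc = {"cpu_score": 6_500, "gpu_score": 5_200, "ram_gb": 16}
print(best_supported_year(my_pc))  # -> 2015
# A game labeled "minimum: 2015 baseline, recommended: 2018 baseline" would run here,
# but only at the minimum tier.
```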

Stupid crypto should be outlawed. Does it bring anything good for humanity? I don't think so.
Some lucky people got rich, many others got scammed, and many criminals enjoy its benefits.
tomimt: And that's really the problem. Far Cry 2 was released in 2008, so what you are asking is for the games industry to turn back time about 10 years or so. That won't happen, not in the AAA industry at least. As long as the performance of onboard GPUs keeps lagging several generations behind what dedicated cards can do, there's just no business there for AAA games.
Good answer, but this part misses my point. FC2 was an example, but it's not really a good comparison.
I'm not asking the industry to turn back 10 years.
I was able to run FC2 smoothly at 1080p and with the highest settings.
I'm not asking for that.
Make the highest setting as demanding as you want; all I ask is to have some "lowest setting" suitable for onboard GPUs.

And I do think there's a business case there. Many people have good systems that cannot be expanded with a discrete GPU and would be delighted to play some games. And there's a hardware crunch due to this pandemic.

You have good points. I appreciate your thoughtful answer.
teceem: It's not - there are big GPU shortages all over the world, right now. Avoid buying a graphics card if you don't really need one, until prices settle again.
Or... spend big, I don't care, it's not my money.
tomimt: People who have purchased good GPUs haven't suddenly disappeared just because there is a shortage. They still exist. If you look at the Steam hardware statistics, you quickly see that the number of people using something like Intel GPUs is a small minority. The most common cards are from Nvidia's 10** and 16** lines. Those are the mid to minimum specs most AAA devs aim for, leaving the biggest bells and whistles to the better lines of cards.

So based on the numbers, it really makes little sense to try to make AAA games run on onboard GPUs. Besides, I suspect that most people who use Steam on a device with such a GPU are those who play older games, lighter indies, or casual games like the hidden object titles Artifex Mundi makes.
Yes, and this is the problem for the Linux and anti-spyware community: they don't share their setups with the stores that publishers use to get info from. No wonder publishers don't support those platforms when they can't tell how many people are actually using Linux.