avatar
teceem: "semi-decent" is subjective - it all depends on what you're trying to achieve. My GTX970 can still play A LOT of recent-ish* games on high settings at 1440P. Second hand price (over here): 150-200 euro - Try buying a new, complete system for that.
Actually, my small laptop cost less than $150 when I bought it, and it includes an integrated display, keyboard, pointing device (though I prefer not to use it), and battery.

(Of course, it only has chromebook tier specs, but I still like using it (especially since it's fanless and doesn't get hot), and am even thinking of upgrading its storage (which I've confirmed to be possible; not all laptops with similar specs have that option).)

Edit: Also, looking at Amazon, the first result (from a DuckDuckGo search) is around $375 used, and Newegg has one costing $355, which is too much. (Then again, it's Nvidia, so it's not something I would consider for my Linux desktop anyway.)
Post edited June 10, 2021 by dtgreene
avatar
thegreyshadow: Game studios are shooting themselves in the foot by excluding the millions of PCs with integrated graphics.
That's highly debatable. I'd wager the vast majority that only use a PC with an iGPU aren't interested in playing games at all. Let's face it, as big as the gaming market has become over the years, it's still very much a niche hobby in regards to the general population.
avatar
teceem: "semi-decent" is subjective - it all depends on what you're trying to achieve. My GTX970 can still play A LOT of recent-ish* games on high settings at 1440P. Second hand price (over here): 150-200 euro - Try buying a new, complete system for that.
avatar
dtgreene: Actually, my small laptop cost less than $150 when I bought it, and it includes an integrated display, keyboard, pointing device (though I prefer not to use it), and battery.

(Of course, it only has chromebook tier specs, but I still like using it (especially since it's fanless and doesn't get hot), and am even thinking of upgrading its storage (which I've confirmed to be possible; not all laptops with similar specs have that option).)
"Try buying a new, complete system..." -> Is that laptop new?
Anyway, there are just too many variables in play to have a decent discussion in this topic. e.g. I think ergonomics are very important - so I won't use a laptop's built-in anything for much of its functionality.
Those "chromebook tier" specs might do for most of your gaming purposes - but they won't do for mine.
Etc.

avatar
thegreyshadow: Game studios are shooting themselves in the foot by excluding the millions of PCs with integrated graphics.
avatar
Mr.Mumbles: That's highly debatable. I'd wager the vast majority that only use a PC with an iGPU aren't interested in playing games at all. Let's face it, as big as the gaming market has become over the years, it's still very much a niche hobby in regards to the general population.
Small correction: the "PC gaming market" might be relatively niche; gaming in general certainly isn't.
Then again, what is "gaming"? If someone plays 5 hours of (e.g. the Windows version of) Solitaire every day - that's some serious "hardcore gaming", isn't it?
Post edited June 10, 2021 by teceem
avatar
teceem: "Try buying a new, complete system..." -> Is that laptop new?
It was new at the time.

Just looking online, it's possible to get a (presumably new) chromebook for $180. (Though note that it only has 32GB internal storage, so even if you can find a way to put Windows 10 on it, I wouldn't recommend it.)

avatar
teceem: Then again, what is "gaming"? If someone plays 5 hours of (e.g. the Windows version of) Solitaire every day - that's some serious "hardcore gaming", isn't it?
Gaming takes many forms.

There are certainly forms of gaming that wouldn't realistically take much graphics power: there are TUI roguelikes like NetHack (you can even play such a game over ssh or even telnet in a text terminal; I believe public NetHack servers exist), and I believe there are even audiogames, which have no graphics at all and instead rely on audio for the user interface (making them a good choice for blind players).

Also, games like Cookie Clicker and Progress Quest are certainly games (though Cookie Clicker is more demanding on the CPU than it ought to be).
Post edited June 10, 2021 by dtgreene
avatar
dtgreene: If you want to port your game to mobile platforms, which could be a good source of money, it needs to have relatively low system requirements, or it won't work.
I think it'd be interesting to look at the mobile market, given its variety of different specs, to extrapolate to the PC market. Hardware specs in recent years have improved to the point where you can still use the same phone for 4-6 years with no issues.

But I wouldn't consider the Nintendo Switch because everyone is on equal footing with hardware specs. Everyone is paying for the exact same standardized experience.

avatar
thegreyshadow: No need for research. The mass of available systems with integrated graphics is incredibly higher than those with discrete graphics.
Yes, but the number of iGPUs out there does not mean those users are going to buy games, unless you have research showing otherwise. This is the scope we've already defined. If there were market demand from these systems to reward the effort it takes to optimize games for them, I surmise an AAA game would've done it by now. Because they haven't, the opportunity has either already been considered and dismissed OR hasn't been investigated thoroughly enough. What do you think is the most likely scenario here, when professional analysis groups like Newzoo charge $625-4000/mo USD for their global games market reports on this data?

And yes, we might not purchase a USD 60 game at launch, but still. Look at games such as Control or The Witcher 3 and the prices GOG asks for them right now (summer sale).
But still what? Just because AAA games drop in price years later isn't good enough justification to make them technically less demanding at release. Unless I'm missing something, The Witcher 3 and Control can't run well on iGPUs; I don't think they do, at least. Again, this sounds like all feeling instead of data.
Post edited June 10, 2021 by Canuck_Cat
avatar
thegreyshadow: There's no way around this: Game studios are shooting themselves in the foot by excluding the millions of PCs with integrated graphics. Please make games with reasonable minimum requirements!
This discussion is as old as PC gaming. When games like Strike Commander hit the shelves back then, a machine that could run them in all their glory was really expensive, and even a system that could barely run them wasn't exactly cheap and low end. And that was in a time when devs were still able to optimise and squeeze another frame per second - and worked hard to do it.

The problem nowadays - apart from the insane hardware prices currently - is that optimisation isn't happening any more - devs rely on frameworks and third-party tools. These allow them to create passable results quickly and manage the complexities of development better, but those frameworks - especially if they target multiple platforms - are big resource hogs by themselves. Optimisation is sacrificed for convenience, and yes, creativity. Nowadays you don't need to be a coding guru like John Carmack to create a game. That allows many people to implement their ideas (with varying results, obviously). But this ease and convenience comes at the price of performance issues; there's no way around it.
Nothing beats hand-optimised assembly code in terms of performance, but only a handful of people can manage that. Next best thing is C - still a very technical thing. So layers upon layers of abstraction were added, until we arrived at things like Unity or Unreal engines, which let you focus on the creative side of actual game design. But each of these layers costs performance.

The performance issues are basically a price we pay to have more, and more variety of, games. There are not enough coders out there who can make a game shine even on low-end machines. And it's a lot of work, even for those - on every single platform.
avatar
dtgreene: There are certainly forms of gaming that wouldn't realistically take much graphics power:
Exactly! Like... most/many games, sold on this website.
What makes this topic so vague: nowadays you can buy a huge amount of PC games online: from the late 70s to now. The vast majority of these can be played on integrated graphics from the last decade.

Title: "Games should be playable on onboard GPU systems"
Answer: No problem, most are.
Want to play every and any game (at least on low settings) on any type/model of GPU: never happened, never going to happen.
Post edited June 10, 2021 by teceem
avatar
toxicTom: Nothing beats hand-optimised assembly code in terms of performance
Not necessarily. Sometimes, it's hard or impossible to beat a compiler.

(There's also the fact that porting to different architectures becomes a lot more work when there's code that is architecture specific.)

avatar
toxicTom: Next best thing is C - still a very technical thing.
There's also Rust, which can reach speeds comparable to C. (The trade-off is that compile errors are *very* common in Rust, including things that would have been run-time errors or undefined behavior in C, so it might take some finagling to get your code to compile.)

There's also Zig, though that language doesn't seem to be as popular.

(Also, don't forget shader languages like GLSL that run on the GPU.)
Post edited June 10, 2021 by dtgreene
avatar
toxicTom: Nothing beats hand-optimised assembly code in terms of performance
avatar
dtgreene: Not necessarily. Sometimes, it's hard or impossible to beat a compiler.
That really depends. Though yeah - modern compilers are pretty smart, and architectures way more complex.

avatar
dtgreene: (There's also the fact that porting to different architectures becomes a lot more work when there's code that is architecture specific.)
That's what I meant - frameworks take care of that - at the cost of optimisation for a specific platform. And the PC platform is extremely difficult in itself with its wide range of hardware combinations, drivers and OS's.

avatar
toxicTom: Next best thing is C - still a very technical thing.
avatar
dtgreene: There's also Rust, which can reach speeds comparable to C. (The trade-off is that compile errors are *very* common in Rust, including things that would have been run-time errors or undefined behavior in C, so it might take some finagling to get your code to compile.)
I've not done Rust yet. Never found the time. Heard mostly good things though.

avatar
dtgreene: There's also Zig, though that language doesn't seem to be as popular.

(Also, don't forget shader languages like GLSL that run on the GPU.)
I've never heard of Zig, tbh.
Shader languages are - in my understanding - a perpetuation of the principle. You can either go native and implement your own optimised code for every platform, or you can rely on frameworks like D3D, Vulkan or OpenGL to do the work for you. Correct me if I'm wrong, the stuff I do is all boring business stuff with no use for shaders :-)
avatar
dtgreene: (There's also the fact that porting to different architectures becomes a lot more work when there's code that is architecture specific.)
avatar
toxicTom: That's what I meant - frameworks take care of that - at the cost of optimisation for a specific platform. And the PC platform is extremely difficult in itself with its wide range of hardware combinations, drivers and OS's.
C + OpenGL ES should be portable to any modern system that supports OpenGL ES or full OpenGL. In particular, the same code should run on both a desktop and a Raspberry Pi. (For Android or iOS, you might need to write some code in another language to bootstrap it, but you should still be able to reuse most of the C code.)

Also, don't forget things like SDL.

(Note that these are libraries that abstract out OS/hardware differences, not frameworks that do everything for you, so they should still allow good speed. After all, many emulators (of 8/16-bit systems, not of modern consoles like the PS5) use SDL and still run well on lower-end systems.)
avatar
Mr.Mumbles: That's highly debatable. I'd wager the vast majority that only use a PC with an iGPU aren't interested in playing games at all. Let's face it, as big as the gaming market has become over the years, it's still very much a niche hobby in regards to the general population.
It's supposedly at least 10% of PC gamers using an integrated card - probably far more, with the ongoing explosion in new PC gamers and the GPU shortages.
Post edited June 11, 2021 by Sachys
avatar
toxicTom: Shader languages are - in my understanding - a perpetuation of the principle. You can either go native and implement your own optimised code for every platform, or you can rely on frameworks like D3D, Vulkan or OpenGL to do the work for you. Correct me if I'm wrong, the stuff I do is all boring business stuff with no use for shaders :-)
In modern graphics APIs, you really can't get any lower in level than the likes of GLSL or similar. (Well, you might be able to write SPIR-V by hand, but even that is abstracted away from the actual instruction set used by the hardware.)

D3D (from DirectX 11 or earlier) and OpenGL (3.0 and up) are libraries that handle some lower level details for you, but are still at a pretty low level, much like in C. It can still be quite tedious to write even simple examples with these APIs, particularly since you need both a vertex shader and a fragment shader just to draw a triangle (the hello world of 3d graphics programming). I would compare this to writing in C.

Vulkan (and DX12/Metal) is lower level, to the point where I'd compare it to writing assembly. Even the triangle example I mentioned above takes many hundreds of lines to set everything up. (Looking online, I find an example that's over 1,000 lines of code, though many of those lines are comments meant for those learning Vulkan, so that number is somewhat inflated.) It's bad enough that there exist libraries like V-EZ and vk-bootstrap solely to reduce the amount of boilerplate code needed for Vulkan.

(Older OpenGL versions use the fixed-function pipeline, which is less work for simple examples (since you don't have to write shaders) but less flexible (since you can't write custom shaders, though it appears that 2.x has the beginnings of that).)
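For a sense of scale: this is roughly the smallest GLSL pair needed for that "hello triangle" under OpenGL 3.3 core, shown here as the C string literals you'd hand to glShaderSource. This is only a sketch of the shaders themselves; all the surrounding context/VAO/VBO setup (another few hundred lines) is omitted:

```c
/* Roughly the minimal shader pair for drawing one solid-color
   triangle with modern OpenGL (3.3 core profile).
   Both stages are mandatory - there are no defaults. */

static const char *vertex_src =
    "#version 330 core\n"
    "layout (location = 0) in vec3 pos;\n"  /* vertex position from the VBO */
    "void main() {\n"
    "    gl_Position = vec4(pos, 1.0);\n"   /* pass through, no transform */
    "}\n";

static const char *fragment_src =
    "#version 330 core\n"
    "out vec4 frag_color;\n"
    "void main() {\n"
    "    frag_color = vec4(1.0, 0.5, 0.2, 1.0);\n"  /* constant orange */
    "}\n";
```

Even this "do nothing" pair has to be written, compiled at runtime, linked into a program object, and error-checked before a single pixel appears.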
avatar
toxicTom: That's what I meant - frameworks take care of that - at the cost of optimisation for a specific platform. And the PC platform is extremely difficult in itself with its wide range of hardware combinations, drivers and OS's.
avatar
dtgreene: C + OpenGL ES should be portable to any modern system that supports OpenGL ES or full OpenGL. In particular, the same code should run on both a desktop and a Raspberry Pi. (For Android or iOS, you might need to write some code in another language to bootstrap it, but you should still be able to reuse most of the C code.)

Also, don't forget things like SDL.

(Note that these are libraries that abstract out OS/hardware differences, not frameworks that do everything for you, so they should still allow good speed. After all, many emulators (of 8/16-bit systems, not of modern consoles like the PS5) use SDL and still run well on lower-end systems.)
The problem is productivity. C and its dialects are for techies, let's be honest here. And there simply aren't enough of those to create the masses of games, in all their variety, that we have today.
I mean, I tip my hat to anyone capable of completing a complex project this way and thus producing a performant product - game or not - but those people are rare specialists.

SDL is pretty impressive though.

I think it's actually the mobile market that still drives the need for optimisation. There's money in making things work on those tiny and often outdated machines. Often through garbage business practices, but money no less. Without this, the situation would be even worse.
avatar
toxicTom: Shader languages are - in my understanding - a perpetuation of the principle. You can either go native and implement your own optimised code for every platform, or you can rely on frameworks like D3D, Vulkan or OpenGL to do the work for you. Correct me if I'm wrong, the stuff I do is all boring business stuff with no use for shaders :-)
avatar
dtgreene: In modern graphics APIs, you really can't get any lower in level than the likes of GLSL or similar. (Well, you might be able to write SPIR-V by hand, but even that is abstracted away from the actual instruction set used by the hardware.)

D3D (from DirectX 11 or earlier) and OpenGL (3.0 and up) are libraries that handle some lower level details for you, but are still at a pretty low level, much like in C. It can still be quite tedious to write even simple examples with these APIs, particularly since you need both a vertex shader and a fragment shader just to draw a triangle (the hello world of 3d graphics programming). I would compare this to writing in C.

Vulkan (and DX12/Metal) is lower level, to the point where I'd compare it to writing assembly. Even the triangle example I mentioned above takes many hundreds of lines to set everything up. (Looking online, I find an example that's over 1,000 lines of code, though many of those lines are comments meant for those learning Vulkan, so that number is somewhat inflated.) It's bad enough that there exist libraries like V-EZ and vk-bootstrap solely to reduce the amount of boilerplate code needed for Vulkan.

(Older OpenGL versions use the fixed-function pipeline, which is less work for simple examples (since you don't have to write shaders) but less flexible (since you can't write custom shaders, though it appears that 2.x has the beginnings of that).)
Thanks for the explanation. Yeah, I've seen Vulkan code (without being able to read it) and it reminded me of assembly. I thought it was a mistake actually, that someone had posted intermediate language instead of the real code... I'd have thunk Vulkan more abstracted.

That explains why it's not as successful despite having a good reputation for performance.
Post edited June 11, 2021 by toxicTom
avatar
thegreyshadow: Game studios are shooting themselves in the foot by excluding the millions of PCs with integrated graphics.
avatar
Mr.Mumbles: That's highly debatable. I'd wager the vast majority that only use a PC with an iGPU aren't interested in playing games at all. Let's face it, as big as the gaming market has become over the years, it's still very much a niche hobby in regards to the general population.
I think you don't understand my point. Joe User might not be interested in playing games right now, but if all it takes for him to play a nice DRM-free game is to fork out USD 10, he could very well be interested. With this criterion of excluding onboard GPUs, that's not possible. I submit that game studios are turning down a lot of potential income due to this policy.

The point is not who might be interested, but that the barrier of entry is artificially high and should be lowered. Make the entry point into games as frictionless as possible.

Now think: we are 15 months into this pandemic. Many people who previously were not interested in games (I was one of them) began to play games during this time. The pandemic saw many new gamers. How many more could there have been, had they been able to play their favorite title on their onboard-GPU computer...?
Post edited June 11, 2021 by thegreyshadow
Here's just a quick search for some games that shouldn't need more than the power of integrated graphics - many of them old - yet simply don't support it, as clearly stated in the requirements:

Kao the Kangaroo (first one, 2000)
Medal of Honor: Allied Assault War Chest (2004, 2002 for the base game)
Heretic Kingdoms: The Inquisition (2004)
SWAT 4 (2005)
MX vs. ATV Unleashed (2006)
Warhammer: Mark of Chaos (2006)
Broken Sword 4 (2007, 2006 according to other sources)
(probably) Fallout 3 (2009, 2008 for the base game)
Lethis: Path of Progress (2015, but simple graphics)
Teamfight Manager (2021, but pixel graphics)

I stopped at 10, and didn't include any of those saying that integrated graphics may work but are not officially supported, but it should be a fair sample. In these cases it's not a matter of optimization, but quite obviously of using instructions that specifically cater to dedicated cards...