I have a problem seeing whether a game supports only 32-bit Intel (which does not run on an M1 Mac) or whether it also supports M1 natively. Steam is better than GOG there, although GOG always says it wants to take care of such things...
For 32-bit Intel games, the hardware requirements do note that it's an old 32-bit app that won't work on modern Macs. But between 64-bit Intel and M1 native, you have no way of knowing. If it doesn't say it won't run on 64-bit MacOS, then chances are it will work through Rosetta (i.e. it's a 64-bit Intel build). M1/M2 support from GOG is basically non-existent... however, Neverwinter Nights, for instance, is supposedly M1 native if you don't use Galaxy (the earlier Infinity Engine games run through Rosetta).
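For what it's worth, you can check what a downloaded game binary actually contains yourself from Terminal. A rough sketch (the game name and path here are made up for illustration):

    # point this at the real executable inside the .app bundle;
    # "SomeGame" is only a placeholder
    file ~/Games/SomeGame.app/Contents/MacOS/SomeGame
    # prints e.g. "Mach-O 64-bit executable x86_64" -> Intel-only, runs via Rosetta 2
    # or "Mach-O universal binary with 2 architectures" -> Intel + Apple Silicon

    # lipo gives a terser answer to the same question:
    lipo -archs ~/Games/SomeGame.app/Contents/MacOS/SomeGame
    # prints e.g. "x86_64 arm64" (universal) or just "x86_64" (Intel-only)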
shrug
You try hitting code targets that change this frequently. In what was basically a very short time, they went from 68k to PowerPC, to Intel, to ARM, initially bridging support each time but slamming the door on the old architecture shortly after. The inverse of a POSIX-portable system.
Lebostein: I have a problem seeing whether a game supports only 32-bit Intel (which does not run on an M1 Mac) or whether it also supports M1 natively. Steam is better than GOG there, although GOG always says it wants to take care of such things...
All games listed as Mac-compatible on GOG are by default 64-bit. The only way to spot a 32-bit Mac game is under the Windows system requirements (bizarrely). The Mac system requirements sometimes mention M1/M2 native support and sometimes don't; nobody enforces that.
Darvond: shrug
You try hitting code targets that change this frequently. In what was basically a very short time, they went from 68k to PowerPC, to Intel, to ARM, initially bridging support each time but slamming the door on the old architecture shortly after. The inverse of a POSIX-portable system.
That took decades; it's not a "short time". The last CPU architecture switch before this one was in 2005, almost 20 years ago. Consoles switch architectures far more frequently than that. Reported for yet more off-topic trolling.

Oh, and POSIX is CPU-agnostic, and macOS remains officially POSIX-compliant. Not that you'd actually know anything about what you're blathering on about.
Post edited March 14, 2023 by eric5h5
Darvond: shrug
You try hitting code targets that change this frequently. In what was basically a very short time, they went from 68k to PowerPC, to Intel, to ARM, initially bridging support each time but slamming the door on the old architecture shortly after. The inverse of a POSIX-portable system.
Lol, the 68k-to-PowerPC transition was in 1994, 29 years ago; that's not a "very short time", and it happened long before the BSD-based MacOS existed. In 1994 I was running MS-DOS 6.22 and Windows 3.1 on my PC at home (it was still 2-3 years before I'd discover Linux, and Windows 95 wasn't out yet), while the school Macs ran System 7.
Mac OS X came out in 2001, 22 years ago, still in the PowerPC era.
So: the 68k-to-PowerPC transition 29 years ago; 11 years after that, the transition to Intel in 2005; and then ANOTHER 15 years before the transition to ARM, which for the moment at least will still run x86-64 code just fine.

You can even still create a single binary that runs NATIVELY on EVERY MacOS machine of every architecture, back to the start of the Unix-based Mac operating systems, using the current binary format. Three months ago, as an experiment, I wrote a small C program and compiled it on my M1 MacBook targeting arm64, then targeting x86-64 (you can still cross-compile for that, or compile under an x86-64 shell under Rosetta 2). Then, using QEMU and older OS releases, I compiled the same code for 32-bit x86 (i386), PPC64, and original PowerPC. A couple of terminal commands later I had a single "fat" binary that works natively on any Mac from the last 22 years. When I asked the oldest system what architectures the binary supports, it listed the much later ones as unknown (they didn't exist at the time), but it still ran just fine. Of course nobody would do this for a real app, but it was a fun experiment and it worked.
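In case anyone wants to try the modern half of this, the glue step is just lipo; a rough sketch (hello.c stands in for the small program, and the PowerPC slices are omitted since they need the old toolchains):

    # build one thin binary per architecture with Apple clang
    clang -arch arm64  -o hello_arm64  hello.c
    clang -arch x86_64 -o hello_x86_64 hello.c

    # stitch the per-arch binaries into a single "fat" Mach-O file
    lipo -create -output hello_universal hello_arm64 hello_x86_64

    # confirm which slices ended up inside
    lipo -archs hello_universal    # prints: x86_64 arm64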

MacOS IS POSIX-compliant.

If you can't keep up with a handful of architecture changes over 30 years, each one eased for at least a while with compatibility tools to give devs plenty of time to catch up (we're 2.5 years into arm64 Macs and x86-64 apps are still supported in Rosetta 2 for those apps that haven't yet caught up), idk what to tell you; maybe developing just isn't for you at that point.

People are driving cars now who weren't born before the previous major arch change.
Post edited June 29, 2023 by dangaz84
You forgot about the Intel to M2 architecture change, which is from x86 to ARM. I imagine there's probably going to be another sweeping change in the near future too, as Mr. Cook is not the charismatic cult leader that Steve Jobs was, and without his lead the company is back in headless-chicken mode.

Do you really think Apple is going to keep that Rosetta bridge extended for long? They hate legacy tech, as that's what nearly killed Apple back before 1998.
Darvond: You forgot about the Intel to M2 architecture change, which is from x86 to ARM.
No I didn't; it was clearly mentioned, and like I said, it's the ONLY architecture change they've made in the last 18 years:
dangaz84: the transition to Intel in 2005; and then ANOTHER 15 years before the transition to ARM
dangaz84: we're 2.5 years into arm64 Macs
Darvond: Do you really think Apple is going to keep that Rosetta bridge extended for long?
No, but they don't need to do it forever. Any dev can spend 5 seconds switching targets and recompile. The fat binaries will not disappear; they've existed for 22 years, so you can make the same app run natively on more than one architecture to keep supporting older systems. And the need for Rosetta dwindles once most apps move over; they'll keep it until it starts costing them too much dev time to keep up with OS changes.
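And for code that already builds cleanly on Intel, "switching targets" really is about that simple; a minimal sketch (main.c is a placeholder, and a real Xcode project would set the ARCHS build setting instead):

    # Apple clang accepts multiple -arch flags and emits a
    # universal (fat) binary directly, no separate lipo step
    clang -arch x86_64 -arch arm64 -o myapp main.c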

I think PowerPC-on-x86 Rosetta should have been supported a little longer than it was, but in the end any actively developed software moved over, often continuing support for YEARS with fat binaries compiled for both platforms, and anything popular that had stopped development and didn't support the new platform got replaced by something that did. It might not be ideal, but progress often means breaking compatibility.

There are very often compatibility issues with older binaries under Windows and Linux too, even without major arch changes. Over the same time period, PC users lost direct compatibility for running MS-DOS apps and 16-bit (pre-Win95) Windows apps, plus apps that relied on things that changed between 95, 98, and ME; the move to NT-based kernels in mainstream Windows with Win2000 caused massive compatibility issues; and XP, 7, 8, 8.1, 10, and 11 each broke compatibility with some or many earlier apps and drivers, forcing people to run DOSBox or VMs with older versions of Windows to play their very old games or run other old software.

Under Linux there are compatibility issues very frequently with dynamically linked executables, which end up with dependency problems when some libraries update and others don't, until they're incompatible with the versions available. Yes, most programs under Linux are open source and can just be recompiled, but those same programs are still open source when you're on a Mac and can simply be recompiled there too. Try to run a particularly old dynamically linked Linux binary with even fairly basic dependencies on a current distro and see how well that goes. There's no cross-architecture execution system in Linux, so you cannot run a Linux binary compiled for a different architecture at all without running an emulator and an OS for that architecture inside it.
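You can watch that kind of breakage happen with ldd; a sketch with a made-up old binary:

    # ldd lists the shared libraries a binary wants to load; run it
    # against an old dynamically linked program on a current distro
    # and you'll typically see missing sonames like these
    ldd ./some-old-game
    #   libpng12.so.0 => not found
    #   libssl.so.1.0.0 => not found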

At least with fat binaries and universal apps, devs can support the latest platform and can choose to keep supporting older ones at the same time, with native code for more than one arch, for as long as they like. That doesn't help people who upgraded keep running older software once Rosetta eventually gets killed, but it does give the option of easily supporting people who can't afford to upgrade yet, which is good.

I would have loved it if the original Rosetta had continued on forever, and if the current one would too, but at some point it becomes harder for them to justify, just as full backwards compatibility hasn't been maintained anywhere else, even between software revisions without major architecture changes. I use all three of these platforms (and Linux on multiple architectures) for whatever works best for me on each of them, and I understand that progress sometimes requires compatibility-breaking changes, on any of them.
Post edited June 30, 2023 by dangaz84
dangaz84: No, but they don't need to do it forever. Any dev can spend 5 seconds switching targets and recompile
I can imagine. Game company sets out to find their 20-year-old source code from their dusty archives.. the poor intern tasked with the job faces a homegrown build system that no longer works. But after some monkeying around, they finally get it to the stage where it runs a compiler.. only to see a bunch of errors, because who would've guessed, trying to use a modern compiler on ancient code is like that. The company realizes the Unity bois can't fix these issues and they have to hire a real programmer. A ripe-to-retire 35-year-old C++ developer with grey hairs arrives on the scene, fixes all the compile errors, and spends the next three months fixing segfaults and other horrors caused by the subtle bugs and undefined behavior exposed by the new architecture and the new compiler's more aggressive optimizations..
dangaz84: Under Linux there are compatibility issues very frequently with dynamically linked executables
That is true. Unfortunately static linking is not a solution either..
dangaz84: There's no cross-architecture execution system in Linux, so you cannot run a Linux binary compiled for a different architecture at all without running an emulator and an OS for that architecture inside it.
That's not true! QEMU user-mode emulation is a thing. https://wiki.debian.org/QemuUserEmulation
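For anyone curious, a rough sketch of what that looks like on a Debian-ish system (the binary name is made up):

    # install the static user-mode emulators (also registers binfmt handlers)
    sudo apt install qemu-user-static

    # run a foreign-architecture Linux binary directly, no guest OS needed;
    # dynamically linked binaries need their libraries, e.g. via -L <sysroot>
    qemu-aarch64-static ./hello_arm64

    # once binfmt_misc is set up, plain ./hello_arm64 works too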
dangaz84: No, but they don't need to do it forever. Any dev can spend 5 seconds switching targets and recompile
clarry: I can imagine. Game company sets out to find their 20-year-old source code from their dusty archives..
I see your point, but in that part I was talking about keeping up with the latest arch change, since the other commenter had said "You try hitting code targets that change this frequently." If they had an app that worked perfectly on Intel on the current OS, then it wasn't dusty 20-year-old code that hadn't needed updating since; it would be relatively up to date, and thus mostly a case of changing the target for the new architecture.

Yes, it would sometimes be harder to bring dusty 20-year-old code back to life, but that's not what I was referring to in that part. If the app hadn't been touched for 20 years, it wouldn't still have been running right before this arch change anyway, given the need to keep up with software changes just like on every other OS, plus the removal of 32-bit support etc. (and the previous arch change just under 20 years ago, though they might have gotten away with only a target change back then, without touching code). The developer would have shown no interest in maintaining the app or keeping it available for close to 20 years.
clarry: That's not true! QEMU user-mode emulation is a thing. https://wiki.debian.org/QemuUserEmulation
Well, that is cool! I didn't know that was a thing.
Post edited July 04, 2023 by dangaz84