I'm guessing they don't want to maintain, build, and test x86_64 versions of all the macOS libraries like AppKit and UIKit (including large changes like Liquid Glass) when they are no longer shipping macOS for x86_64 machines. Which is not entirely unreasonable, as I'm sure it takes a lot of effort to keep the whole UI library stack working properly on multiple archs.
Perhaps that's what they're hinting about with the note about a "subset of Rosetta". So maybe there is hope that the core x86_64 binary translator will stick around for things like VM and emulation of generic (linux? wine?) binaries, but they don't want to maintain a whole x86_64 macOS userspace going forward.
Space savings from not shipping fat binaries for everything will probably also be not insignificant. Or make room for a new fat binary for a future "arm64v2" :)
> ...they don't want to maintain and build and test x86_64 versions...
This feels wrong. Apple sold Intel-based Macs until early June 2023. The last one was the 2019 Mac Pro model.
Ending support for Rosetta in macOS around 2028 also means ending support for any x86_64 versions of software. This means that those unfortunate users who bought an Intel Mac Pro in 2023 only got five years of active usability.
Just because the latest OS can't be installed on older hardware does not mean the hardware is no longer usable. I know people to this day who still run the last 2012 cheese grater Mac Pros with Snow Leopard as daily work machines. They still use Final Cut 7 on them to capture content from tapes. At this point, they are very fancy dedicated video recorders, but they still run and are money-making devices.
You're right; I still have a 2010 MBP w/ 8GB of RAM and an SSD upgrade I made years ago. My mother still uses her similar-vintage MBP with the same upgrades. These work just fine for most non-work tasks.
That doesn't mean that I expect these things to be updated or supported 15y after I bought them. I am absolutely certain I made back the $850 I originally paid (edu discount) + the ~$250 in upgrades over the years, and I'm entirely OK with just letting it limp along until it physically dies. I think most people have similar expectations.
I still have my 2011 MBP with very similar upgrades, but unfortunately, it has the known-bad Nvidia GPU that has been repaired multiple times. The last time it was taken in for repair, Apple said they were no longer supporting the repair. It's still usable as long as nothing tries to access the GPU, but since the modern web tries to use the GPU, it would crash the laptop constantly.
I lucked out in that mine never developed any issues with the GPU itself. Though it was stolen in 2014, so who knows longer term. My daughter is still running my (IIRC 2014) model. I've been relatively happy with my 16GB M1 Air, aside from my own vision issues.
Lucky you, so to speak. Back in the day I had the same one, but it would pass their diagnostics, so they wouldn't repair it, though I could literally make it crash in front of the Genius Bar techs reliably and repeatedly (essentially the same way, by trying to do anything that hit the GPU a certain way - websites, Photoshop). "Sorry, our diagnostic tool says it's not the GPU". At one point I even demanded they do a completely fresh install of the OS. On first login, I fire up Safari, go to a certain site, crash. Restart, go to a different site, crash. "Sorry."
Production networks like these are typically not on the internet. That's a bit of information I take for granted but that people unfamiliar with these setups might not know.
Sure! The point is that it wasn't necessary because of Rosetta. For example, I no longer have an Intel-based Mac, but I still want to build and test for x86_64.
I understand where you are coming from and commend you for trying to support your users (I'd do the same!), but I don't think Apple marketed Rosetta 2 as a permanent solution after the transition.
Another aspect is that a Mac stops getting software updates after ~7 years, and then its API level starts to drift behind the latest macOS releases.
So, after the 10-year mark, you already can't get the latest versions of applications, since the features developers use aren't available in the older macOS versions, and you couldn't run the new software anyway.
There’s someone out there who wants to build for PowerPC. At some point you have to say it’s a tiny piece of the market and making a few people spend $300 for old hardware is better than maintaining back compat forever.
The difference is that there is still a lot of x86 software written for Windows, which you need x86 emulation to run through Whisky/CrossOver on a Mac.
More issues generally arise from supporting/qualifying older OS versions than from supporting specific architectures, in my experience, so developers keep around older hardware or VMs for that purpose. In some other circumstances Rosetta may not be sufficient for testing older Intel hardware (one example is GPU work).
Apple always phases out these kinds of technologies after some time to keep the ecosystem tidy and give a last push to developers to abandon legacy code.
In this iteration, it might also allow some simplification of the silicon, since Mx chips have some black magic to mimic x86 (mostly in memory ordering, IIRC) to allow Rosetta to work that fast. IOW, Rosetta 2 is not software-only magic this time.
I remember using the first Rosetta to play Starcraft on my Intel Mac. It also got deprecated after a year or two.
So leaving things behind despite some pains is Apple's way to push people forward (e.g.: Optical media, ports, Rosetta 1, Adobe Flash, etc.).
To clarify, the complete sentence in my mind was "...after a year or two I got my Intel Mac". I got mine in Q3 2008, just before the Unibody ones were introduced.
So, I effectively got 2 years out of Rosetta 1, but I didn't mean to say Apple supported it for only two years.
Sorry for the confusion.
Looks like I can't edit my comment anymore to clarify.
I read a comment somewhere, possibly here, by an ex-Apple engineer who claimed that they optimized the thing mathematically for the performance it exhibits.
So, considering its silicon parts, Rosetta 2 is more of an Apple endeavor and technology.
On the other hand, 5-7 years is a very typical timespan for Apple. So, I don't think licensing fees were that important in the decision to end support for it.
The original Rosetta was based on technology from Transitive which, as I recall, IBM bought. Don't know where Rosetta 2 fits in and any licensing associated with the original Rosetta was a long time ago.
I'm still not sure what's so impressive about the last 25 years of Windows and macOS that means we need an absolute supercomputer by 2000 standards just to open a Word document the same way we did back in Windows 2000.
Didn’t Word use to be installed from 2 floppy disks? Now Calculator.app leaks 40 GB of memory. Software in this sorry state cannot be run on a supercomputer; it needs one of those theoretical ocean-boilers.
This is a false memory. The reason "splash screens" existed with little text banners updating you about the status of the program's initializers was because it took fucking forever to launch Word on a 90's PC.
The steep decline in software stability and usability has been quite impressive, I wasn’t expecting them to screw it up so fast. Alan Dye in particular is a true inspiration for those who subscribe to the Peter Principle.
I'm not very well versed in macOS internals, but I was a tech lead of a Debian derivative. I also write HPC software and manage relevant infrastructure from metal to user, so I believe I know some details about processor architectures, general hardware, Linux and *NIX systems in general.
The user-visible layer of an operating system is generally one of the simpler layers to code and maintain, since it's built upon abstractions. However, the libraries powering these layers, especially the math-heavy and hardware-interacting ones, are much more complex due to the innate complexity of the hardware in general.
Keeping multiple copies of a library for two different architectures (even if the only change is bit-length), where this simple bit change needs different implementation strategies to work correctly, is a pain by itself (for more information, ask the Linux kernel devs, since they're also phasing out 32-bit x86).
Moreover, x86 and x86_64 are completely different modes on the processor. On top of that, the x86-only mode is called "protected mode", the x86_64 mode is called "long mode", and running x86 code under x86_64 is a sub-mode of long mode ("compatibility mode"), which is already complex enough at the silicon level.
Same complexities apply to ARM and other processor architectures. Silicon doesn't care about the ISA much.
We have already seen that the effort to increase performance on superscalar, out-of-order processors opened up a new, untapped family of side-channel/speculative-execution attacks. So processors are complex, software is complex, and multiple architectures on the same hardware is exponentially complex. If you want to see how the sausage is made, you can also research how Windows handles the backwards-compatibility problem (hint: by keeping complete Windows copies under a single Windows installation, in ELI5 terms).
So, the impressive thing was keeping these multi-arch installations running for quite some time. We need to be able to let things go and open some software and hardware budget for new innovations and improvements.
Addendum: Funnily, games are one of the harder targets for multi-arch systems, since they are both math-heavy and somewhat closer to the hardware than most applications, and are very sensitive to architecture changes. Scientific/computational software is another such family, and it interestingly includes databases and office software. Excel also had a nasty floating-point bug back in the day, and 32- and 64-bit installations of Microsoft Office have had some feature differences since the beginning.
> So maybe there is hope that the core x86_64 binary translator will stick around for things like VM and emulation of generic (linux? wine?) binaries
It's mostly for their game-porting toolkit. They have an active interest in Windows-centric game developers porting their games to Mac, and that generally doesn't happen without the compatibility layer.
I'm sure there's lots of x86_64-specific code in the macOS userland that is much more than just a recompile - things like the Safari/JavaScriptCore JIT, the Quartz/Core Animation graphics stack and the video encoder/decoder libraries, as well as various Objective-C low-level pointer-tagging and message-passing ABI shenanigans, and so on. This is probably why 32-bit Intel Mac app support was dropped pretty hard, pretty fast, as the entire runtime and userland probably required a lot of upkeep. As just one example, 32-bit Intel Objective-C had "fragile instance variables", which was a can of worms.
It’s not like they were doing it to make me happy; they are doing it to sell Macs and lock people into the Apple ecosystem. Maybe there is only a negligible % of people using it - possible, since the M1 is 6 yrs old IIRC.
They only just released the Containerization framework[0] and the new container[1] tool, and they are already scheduling a kneecapping of it two years down the line.
Realistically, people are still going to be deploying on x64 platforms for a long time, and given that Apple's whole shtick was to serve "professionals", it's really a shame that they're dropping the ball on developers like this. Their new containerization stuff was the best workflow improvement for me in quite a while.
Yeah, it kind of kills me that I am writing this on a Samsung Galaxy Book 3 Pro 360 running Windows 11 so that I can run Macromedia Freehand/MX (I was a beta-tester for that version) so that I can still access Altsys Virtuoso 2 files from my NeXT Cube (Virtuoso 2 ~= Macromedia Freehand 4) for a typeface design project I'm still working on (a digital revival of a hot metal typeface created by my favourite type designer/illustrator who passed in 1991, but whose widow was gracious enough to give me permission to revive).
I was _so_ hopeful when I asked the devs to revive the Nx-UI code so that FH/MX could have been a native "Cocoa" app....
> running Windows 11 so that I can run Macromedia Freehand/MX
Freehand still works on Windows 11? I’m happy for you, I never found a true replacement for it.
> a digital revival of a hot metal typeface created by my favourite type designer/illustrator who passed in 1991, but whose widow was gracious enough to give me permission to revive
Any reason you haven’t shared the name of the designer or the typeface? That story sounds interesting, I’d really welcome learning more.
Yes, fortunately. I despair of what I'm going to do when I no longer have such an option. Cenon is clunky, Inkscape's cross-platform nature keeps it from having many interface aspects which I depend on, and I'd rather give up digital drawing than use Adobe Illustrator (which despite using since v3.2 on college lab Macs and on my NeXT Cube I never found comfortable).
The designer/typeface are Warren Chappell's Trajanus, and his unreleased Eichenauer --- I read _The Living Alphabet_ (and his cousin Oscar Ogg's _The 26 Letters_) when I was very young, and met him briefly on a school field trip back when he was Artist-in-Residence at UVA and did a fair bit of research in their Rare Book Room, and even had a sample of the metal type (missing one character unfortunately).
It is currently stalled at my having scanned and drawn up one of each letter at each size which I have available, but only having two letters, _N_ and _n_ in all sizes --- probably shouldn't worry that much about the optical axis, since it was cut in metal in one master size and the other sizes made using a pantograph, but there were _some_ adjustments which I'd like to preserve. There is a digital version of Trajanus available, but it's based on the phototype. I've been working at recreating each character using METAFONT, encompassing the optical size variation in that programmatically, but it's been slow going (and once I'm done, I then have to work out how to make it into outlines....)
That's why like 80%+(?) of corporate world runs Windows client side for their laptops/workstations. They don't want to have to rewrite their shit whenever the OS vendor pushes an update.
Granted, that's less of an issue now with most new SW being written in JS to run in any browser, but old institutions like banks, insurers, industrial, automation, retail chains, etc. still run some ancient Java/C#/C++ programs they don't want to, or can't, update for various reasons, but it keeps the lights on.
Which is why I find it adorable when people in this bubble think all those industries will suddenly switch to Macs.
One of my previous companies gave top of the line workstations with 4k touchscreens and i9s to literally everyone junior and below a particular grade. I'm quite sure they could've saved 1000s of dollars per laptop by going with a reasonable MacBook.
(Ironically, windows 11 + corporate bloatware made the laptops super laggy. Go figure.)
>but in general the more tech-forward the company is the less Windows there is at it.
Only if you count food delivery apps, crypto Ponzi scheme unicorns, ad services, and SaaS start-ups as the only "tech-forward" companies, because you're omitting a lot of other tech companies your daily life in the civilized world depends on, which operate mainly on Windows, like where I work now.
Is designing and building semiconductors not "technology"? Or MRI machines? Or jets? Or car engines?
It seems to talk about Rosetta 2 as a whole, which is what the containerization framework depends on to support running amd64 binaries inside Linux VMs (even though the kernel still needs to be arm)
Is there a separate part of Rosetta that is implemented for the VM stuff? I was under the impression Rosetta was some kind of XPC service that would translate executable pages for Hypervisor Framework as they were faulted in, did I just misunderstand how the thing works under the hood? Are there two Rosettas?
I cannot tell you about implementation difference but what I mean is that this only talks about Rosetta 2 for Mac apps. Rosetta for Linux is a feature of the Virtualization framework that’s documented in a completely different place. And this message says a part of Rosetta for macOS will stick around, so I would be surprised if they removed the Linux part.
On the Linux side, Rosetta is an executable that you hook up with binfmt to run AMD64 binaries, like how you might use Wine for windows binaries
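On the host side, the Virtualization framework hands that Rosetta executable to the guest as a directory share. Roughly like this, based on Apple's documented API (just a sketch; the rest of the VM configuration and error handling are omitted, and the function name is my own):

    import Virtualization

    // Sketch: expose Rosetta to a Linux guest as a virtiofs directory share.
    // Inside the guest you then mount the "ROSETTA" tag and register the
    // rosetta binary with binfmt_misc so x86_64 ELF binaries run through it.
    func attachRosetta(to configuration: VZVirtualMachineConfiguration) throws {
        let rosettaShare = try VZLinuxRosettaDirectoryShare()
        let device = VZVirtioFileSystemDeviceConfiguration(tag: "ROSETTA")
        device.share = rosettaShare
        configuration.directorySharingDevices = [device]
    }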
> and given that Apple's whole shtick was to serve "professionals",
When was the last time this was true? I think I gave up on the platform around the new keyboards, which clearly weren't made for typing, and the non-stop "Upgrade" and "Upgrade" notifications that you couldn't disable, just push forward into the future. Everything they've done since then seems to have been to impress the Average Joe, not to serve professionals.
That's literally sponsored content/an ad by a company who makes money managing Apple devices, of course they'll say it's "mission critical", on a website meant to promote Apple hardware.
Happen to have some less biased source saying anything similar, ideally not sponsored content?
This mostly died because Apple's new security framework doesn't allow for unsigned kexts or writable root. This also killed most fuse implementations and therefore kneecapped stuff like SSHFS.
Maybe Apple should consider looking into those sorts of things before they promise they'll be in the OS, and even ship a barely working version. This was sometime around 2008, so I might be misremembering how broken the preview they shipped was.
It was implemented in some of the earlier Leopard betas, IIRC. Possibly speculation on my side, but it was probably removed due to licensing once Oracle expressed interest in acquiring Sun Microsystems.
It doesn’t say if that is going away. The message calls out another part as sticking around:
> Beyond this timeframe, we will keep a subset of Rosetta functionality aimed at supporting older unmaintained gaming titles, that rely on Intel-based frameworks.
Since the Linux version of Rosetta requires even less from the host OS, I would expect it to stay around even longer.
Yes, that was my first thought as well, and as the images aren't designed to be run on a Mac specifically, like a native app might be, there is no expectation for the developers to create a native Apple Silicon version. This is going to be a pretty major issue for a lot of developers.
Case in point - Microsoft's SQL Server docker image, which is x86-only with no hint of ever being released as an aarch64 image.
I run that image (and a bunch of others) on my M3 dev machine in OrbStack, which I think provides the best docker and/or kubernetes container host experience on macOS.
I’ve worked in DevOps and companies I’ve worked for put the effort in when M1 came out, and now local images work fine. I honestly doubt it will have a huge impact. ARM instances on AWS, for example, are much cheaper, so there’s already lots of incentive to support ARM builds of images
In our small shop, I definitely made sure all of our containers supported aarch64 when the M1 hit the scene. I'm a Linux + ThinkPad guy myself, but now that I've got an x13s, even I am running the aarch64 versions!
It depends. Mostly it is choosing the right base image architecture. For rust and golang we can trivially cross compile and just plunk the binary in the appropriate base image. For JVM based apps it is the same because we just need to have the jars in the right place. We can do this on either architecture.
The only hold out is GraalVM which doesn’t trivially support cross compilation (yet).
We're mostly a PHP / JS (with a little Python on the side) shop, so for our own code it's mostly a matter of the right base image. Building our own images is done on an x86-64 machine, with the aarch64 side of things running via qemu.
But Docker images don't necessarily have ARM64 support. If you are exclusively targeting x64 servers, it rarely makes sense to support both ARM64 and AMD64 platforms for development environment/tests, especially if the product/app is non-trivial.
And it looks like Rosetta 2 for containers will continue to be supported past macOS 28 just fine. It's Rosetta 2 for Mac apps that's being phased out, and not even all of that (they'll keep it for games that don't need macOS frameworks to be kept around in Intel format).
Parent doesn't want to merely run ARM64 Linux/Docker images. They want to run Intel images. Lots of reasons for that, from upstream Docker images not being available for ARM64, to specific corporate setups you want to replicate as closely as possible, or that aren't portable to ARM64 without huge effort.
Yep, this is another reason I've needed to use x86-64 images: although they should be technically the same when rebuilt for ARM, they aren't always, so using the same-architecture image that runs in production will sometimes catch edge-case bugs the ARM version doesn't. Admittedly it's not common, but I have had it happen. Obviously there is also the argument that the x86-64 image is being translated, so it isn't the same as production anyway, but I've found that to produce far fewer bugs than running a different-architecture image.
> Obviously there is also the argument that the x86-64 image is being translated, so isn't the same as production anyway
I've never seen this make a practical difference. I'm sure you can spot differences if you look for them (particularly at the hardware interface level) but qemu has done this for decades and so has apple.
I'm aware; I use ARM images all the time. I was trying to indicate that the usual refrain that developers have had years to migrate their software to Apple Silicon doesn't really apply to Docker images. It's only the increase in use of ARM elsewhere (possibly driven by the great performance of Macs running Apple Silicon) which has driven any migration of Docker images to ARM versions.
That's not really the point though right? It means that pulling and using containers that are destined for x86 will require also building arm64 versions. Good news is buildx has the ability to build arm64 on x86, bad news is people will need to double up their build steps, or move to arm in production.
What they talk about is Rosetta's macOS frameworks compiled for Intel being kept around (which macOS Intel apps use, like if you run some old <xxx>.app that's not available for Apple Silicon).
The low-level Rosetta as a translation layer (which is what containers use) will be kept, and they will even keep it for Intel games, as they say in the OP.
It's not just images; any software the images pull down must also support ARM64 now as well. For example, the official Google Chrome binaries used by Puppeteer for headless browsing/scraping don't have a Linux ARM build.
The announcement doesn't actually say they are removing the Rosetta emulation. Rosetta 2 as a complete snapshot of macOS system frameworks is not the same thing as what is now called the virtualisation framework
Generally, speaking of Rosetta means Rosetta 2, since Rosetta 1 is long deprecated. It is very difficult to say what they mean.
The deprecation is mentioned in the context of the Rosetta translation environment [1]. Rosetta for Linux uses the same wording [2].
For example, Docker at least used to use this same binary translation internally years ago (the same tech whose deprecation is mentioned). I don't know how it is today.
How does this work currently? I was under the impression that Docker for Mac already ran containers in an x86 VM. Probably outdated info, but I’m curious when that changed.
Docker on Mac runs containers in a VM, but the VM is native to the CPU architecture and takes advantage of hardware virtualization.
You can of course always use QEMU inside that VM to run non-native code (e.g. x86 on Apple Silicon); however, this is perceived as much slower than using Rosetta (instead of QEMU).
My reasons for leaving Apple had nothing to do with this decision. I was already no longer working on Rosetta 2 in a day-to-day capacity, although I would still frequently chat with the team and give input on future directions.
Just went through that thread, I can't believe this wasn't a team of like 20 people.
It's crazy to me that apple would put one guy on a project this important. At my company (another faang), I would have the ceo asking me for updates and roadmaps and everything. I know that stuff slows me down, but even without that, I don't think I could ever do something like this... I feel like I do when I watch guitar youtubers, just terrible
I hope you were at least compensated like a team of 20 engineers :P
Sometimes (often?), one very dedicated and focused person is better than a team of 20+. In fact companies would do well to recognize these situations and accommodate them better.
This is amazing. I wonder what it took to port Mac OS X from PowerPC to Intel. Every assembly-language part must have been rewritten, that's for sure. Anything else?
Seems premature. My scanner software, ScanSnap, still regularly updated, requires Rosetta. ABBYY FineReader, the best Mac OCR, requires Rosetta. Although they may be related, as the ScanSnap software does OCR with the FineReader engine.
The M1 chip and Rosetta 2 were introduced in 2020. macOS 28 will be released in 2027. 7 years seems like plenty of time for software vendors to make the necessary updates. If Apple never discontinues Rosetta support, vendors will never update their software to run natively on Apple chips.
This is also consistent with Apple’s previous behavior with backwards compatibility, where Apple would provide a few years of support for the previous platform but will strongly nudge developers and users to move on. The Classic environment in Mac OS X that enabled classic Mac OS apps to run didn’t survive the Intel switch and was unavailable in Leopard even for PowerPC Macs, and the original Rosetta for PowerPC Mac OS X applications was not included starting with Lion, the release after Snow Leopard.
I think you probably should not buy Apple hardware. It is not a guarantee they have ever offered that their software would behave consistently across updates. If this mattered to me, I would have done some research and rapidly found out that Apple has done this every few years for the last 30 years.
The hardware isn't (as far as I'm aware) changing. Please don't move the goalposts from hardware ownership (we should just be able to do with our hardware as we please) to also include indefinite support from vendors. That just makes us look like childish crybabies.
If you were instead asking for hardware documentation, or open-sourcing of Rosetta once sunset, then we're on the same team.
I never asked for an infinite window of software support, though. I merely want the features that I had when I bought the laptop, for as long as the OS supports my machine. The response is always "blame the third-parties" when apps break, but oftentimes the devs already made their money and moved on. The onus is on Apple to support their OS' software if they want to have my money.
Open-sourcing is one solution, but knowing Apple, it's not a likely one. Their "we know best" mindset is why I quit dailying Macs entirely - it's not sustainable outside the mobile dev business. A computer that supports 32-bit binaries, OpenGL, or x86 translation when you bought it should be able to retain that capability into the future. Anything less is planned obsolescence, even if you want to argue there's a silver lining to introducing new tech. New tech should be competitive on merits, not because its competitor was forcibly mutilated.
> The onus is on Apple to support their OS' software if they want to have my money
Apple has done this exact same thing for every architecture change and every API they sunset, but you gave them your money anyways. Their history of discontinuing software support and telling users to harangue third-party devs isn't exactly a secret.
At what point in history have you owned a particular piece of hardware for use with a particular piece of never-to-be-updated software and installed a major OEM operating system release a full 7 years after release without issue?
I doubt such a thing has ever happened in the history of consumer-facing computing.
> At what point in history have you owned a particular piece of hardware for use with a particular piece of never-to-be-updated software and installed a major OEM operating system release a full 7 years after release without issue?
Linux users do it all the time with WINE/Proton. :-)
Before you complain about the term 'major OEM operating system'; Ubuntu is shipped on major OEMs and listed in the supported requirements of many pieces of hardware and software.
> I doubt such a thing has ever happened in the history of consumer-facing computing.
Comments like this show how low standards have fallen. Mac OS X releases have short support lengths. The hardware is locked down; you need a massive RE effort just to get Linux to work. The last few gens of x86 Mac hardware didn't have as much of that, but they were still locked down. The M3 and M4 still do not have a working installer. None of this is funded by Apple, whether to get Linux working on it or to get Windows on ARM working on it, as far as I know.
In comparison, my brother-in-law found an old 32-bit laptop that had Windows 7. It forced itself, without his approval, to update to Windows 10. It then had 10 years of support from Microsoft on Windows 10 alone; counting the Windows 7 years pushes that to... hmm... 13+ years of support?
Not sure what you are saying. If you're saying you need the game dev to recompile for ARM, you can run a translation layer, just like on Mac and Windows. My friend has had the best results with: https://fex-emu.com/
Not the same here. The user didn't have to get different binaries when they changed hardware, and that was a big selling point for the hardware. And now it's going to break in an arbitrary software update.
> At what point in history have you owned a particular piece of hardware for use with a particular piece of never-to-be-updated software and installed a major OEM operating system release a full 7 years after release without issue?
> I doubt such a thing has ever happened in the history of consumer-facing computing.
Come on. I've done that and still do: I use an ancient version of Adobe Acrobat that I got with a student discount more than 10 years ago to scan documents and manipulate PDFs. I'd probably switch to an open source app, if one were feature comparable, but I'm busy and honestly don't have the time to wade through it all (and I've got a working solution).
Adobe software is ridiculously overpriced, and I'm sure many, many people have done the same when they had perpetual-use licenses.
> At what point in history have you owned a particular piece of hardware [...] and installed a major OEM operating system release a full 7 years after release without issue?
A few years ago, I installed Windows 10 on a cheap laptop from 2004—the laptop was running Windows XP, had 1GB of memory, a 32-bit-only processor, and a 150GB hard drive. The computer didn't support USB boot, but once I got the installer running, it never complained that the hardware was unsupported.
To be fair, the computer ran horrendously slow, but nothing ever crashed on me, and I actually think that it ran a little bit faster with Windows 10 than with Windows XP. And I used this as my daily driver for about 4 months, so this wasn't just based off of a brief impression.
Yes. Still, there are ways to do it anyway, from DOSBox to WineVDM. Unlike macOS, where having even a 32-bit app (e.g. half of the Steam games that supported macOS to begin with) means you're fucked.
You can use dosbox and x86 virtual machines just fine in macOS (with the expected performance loss) right now, without Rosetta. macOS is still Turing complete.
Technically speaking, you can run anything on anything, since this stuff is Turing complete. Practically speaking, however....
E.g. half of the macOS games in my Steam library are 32-bit Mac binaries. I don't know a way to launch them at any reasonable speed. The best way is to ditch the macOS version altogether and emulate the Win32 version of the game (which will run at reasonable speed via Wine forks). Somehow the Win32 API is THE most stable ABI layer for Linux & Mac.
> half of the macOS games in my Steam library are 32-bit Mac binaries. I don't know a way to launch them at any reasonable speed.
To be fair, it's the emulation of x86-32 with the new ARM64 architecture that causes the speed problems. That transition is also why MacBooks are the best portables, in terms of efficiency, that you can buy right now.
All ARM chips have crippled x86-32 performance, because they're not x86-32 chips. You'll find the same (generally worse) performance issues trying to run ARM64 code with x86-64.
>Windows running on a 64-bit host no longer runs 16-bit binaries.
Which isn't an issue, since Windows 95 was not a 16-bit OS; that was MS-DOS. For 16-bit DOS apps there are virtualization options like DOSBox or even HW emulators.
This isn't a new or unique move; Apple has never prioritized backwards compatibility.
If you're a Mac user, you expect this sort of thing. If running neglected software is critical to you, you run Windows or you keep your old Macs around.
It's a bizarre assumption that this is about "neglected software."
A lot of software is for x64 only.
If Rosetta 2 goes away, Parallels support for x64 binaries in VMs likely goes away too. Parallels is not neglected software. The x64 software you'd want to run on Parallels is not neglected software.
This is a short-sighted move. It's also completely unprecedented; Apple has dropped support for previous architectures and runtimes before, but never when the architecture or runtime was the de facto standard.
Nevertheless, running x64 software including Docker containers on aarch64 VMs does use Rosetta. There's still a significant valid use case that has nothing to do with neglected software.
I seem to remember 68k software working (on PowerPC Macs) until Classic was killed off in Leopard? I'm likely misremembering the length of time, but it seems like that was the longest backwards-compatibility streak Apple had.
There are leftovers from older versions of macOS and severely neglected apps in Tahoe too. Sure, they might have been given a new icon, or adopted the new system styling, but they have not been updated for ages.
There's a lot of Win95 software that you can't run too. Microsoft puts a lot of work into their extensive backlog of working software. It's not just "good engineering"; it's honest-to-god fresh development.
The main problem is not native software, but virtualization, since ARM64 hardware is still quite uncommon for Windows/Linux, and we need Rosetta for decent performance when running AMD64 in virtual machines.
There is lots of existing software (audio plugins, games, etc.) that will never see an update. All of that software will be lost. Most new software has ARM or universal binaries. If some vendors refuse to update their software, it's their problem. Windows still supports 32-bit applications, yet almost all new software is 64-bit.
I think this is exactly what they're issuing this notice to address. Rosetta performs so well that vendors are pretty okay just using it as long as possible, but a two year warning gives a clear signal that it's time to migrate.
One problem from Apple’s perspective is that it continues to cost them money to maintain both the translation layer and the x86_64 frameworks on an ongoing basis.
I mean, is it really an excessive burden to keep a "too popular" feature alive for users? Features users pay for cost money to build and maintain. These aren't unique situations.
It would be different if the feature wasn't popular at all but that doesn't seem to be the case.
It doesn't seem especially popular to me, so... citation needed? It's not being discontinued for being too popular, that's for sure.
Apple doesn't want to maintain it forever, and a handful of legacy apps will never be bothered to update to native Apple Silicon support unless it means losing access to their user base. Apple has given them plenty of time to do it naturally, and now Apple is giving them a stronger reason and a couple more years to get it done. Apple is not randomly discontinuing it with no notice; two years is plenty of time for maintained software to get over the finish line.
At the end of the day, Apple doesn't want to pay to maintain this compatibility layer for forever, and Apple's customers will have a better experience in the long run if the software they are using is not running through an extra translation layer.
There will always be some niche users who want this feature to remain forever, but it's clearly not a significant enough percentage of users for Apple to be worried about that, or else Apple would maintain it forever.
I usually agree with Apple, but I don't agree with this. Rosetta 2 is basically magic; why would they take away one of their own strongest features? If they want big-name apps to compile for Apple Silicon, why can't they exert pressure through their codesigning process instead?
The “big name apps” have already moved to Apple Silicon. Rosetta helped them with that process a few years ago. We’re down to the long tail apps now. At some point, Rosetta is only helping a couple people and it won’t make sense to support it. I just looked, and right now on my M1 Air, I have exactly one x86 app running, and I was honestly surprised to find that one (Safari plug-in). Everything else is running ARM. My workload is office, general productivity, and Java software development. I’m sure that if you allow your Mac to report back app usage to Apple, they know if you’re using Rosetta or not, and if so, which apps require it. I suspect that’s why they’re telegraphing that they are about ready to pull the plug.
1. Choose "About This Mac" from the Apple menu.
2. In the resulting window, click the "More Info..." button. This will open the System Settings window.
3. Scroll to the bottom of that window and click "System Report."
4. In the left side of the resulting window, under "Software," click "Applications." This will provide a list of installed applications. One of the columns for sorting is "Kind"; all apps that are x86 will be listed with the kind, "Intel."
1. Open Activity Monitor.
2. From the CPU or Memory tab, look at the “Kind” column. It’ll say either “Apple” or “Intel.” If the Kind column isn’t visible, right-click on the column labels and select Kind.
In macOS 26, you can see every Rosetta app that has recently run on your machine by going to System Information and then Software / Rosetta Software. It includes the "Fallback Reason" (e.g. if you manually forced the app under Rosetta or if it was an Intel-only binary).
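If you'd rather check from code, Apple also documents a sysctl for whether the current process is being translated. A minimal Swift sketch (assuming macOS 11 or later, where the key exists):

    import Darwin

    // Ask the kernel whether the *current* process is running under Rosetta.
    // "sysctl.proc_translated" returns 1 when translated, 0 when native,
    // and the call fails on systems that don't know about the key.
    func isTranslatedByRosetta() -> Bool {
        var translated: Int32 = 0
        var size = MemoryLayout<Int32>.size
        guard sysctlbyname("sysctl.proc_translated", &translated, &size, nil, 0) == 0 else {
            return false // key unavailable: assume native
        }
        return translated == 1
    }

    print(isTranslatedByRosetta() ? "Running under Rosetta" : "Running natively")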
FWIW, I have zero Rosetta apps on my M1 laptop and I've been a Mac user since the earliest days.
I'm super aware of the issues involved--I oversaw the transition from PPC to Intel at a university back in the day, using OG Rosetta. Even then, we had users who would only stop using their PPC apps when you took them from their cold, dead hands.
How much die area does it use that could be used for performance? How much engineering time does it use? Does it make sense to keep it around, causing ~30% more power usage/less performance?
There are many acceptable opposing answers, depending on the perspective of backwards compatibility, cost, and performance.
My naive assumption is that, by the time 2027 comes around, they might have some sort of slow software emulation that is parity to, say, M1 Rosetta performance.
> One of the key reasons why Rosetta 2 provides such a high level of translation efficiency is the support of x86-64 memory ordering in the M1 SoC. The SoC also has dedicated instructions for computing x86 flags.
While true, we're not talking about the chips losing TSO; Apple plans to keep Rosetta 2 for games and it has to remain fast because, well, it's video games. It also seems like they plan to keep their container tool[1]. This means they can't get rid of TSO at the silicon level and I have not heard this discussed as a possibility. We're only discussing the loss of the software support here. The answer to "How much die area does it use that could be used for performance?" is zero--they have chosen to do a partial phase-out that doesn't permit them to save the die space. They'd need to kill all remaining Rosetta 2 usage in order to cull the die space, and they seem to be going out of their way not to do this.
> We're only discussing the loss of the software support here
Schematically "Rosetta 2" is multiple things:
- hardware support (e.g TSO)
- binary translation (AOT + JIT)
- fat binaries (dylibs, frameworks, executables)
- UI (inspector checkbox, arch(1) command, ...)
My bet is that, beyond the fancy high-level "Rosetta 2" label, what will happen is that they'll simply stop shipping fat x86_64+aarch64 system binaries+frameworks[0], while the rest remains.
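As a quick way to see what's still fat on a given system, you can ask Foundation which CPU slices a bundle's executable contains. A rough sketch (the path is just a placeholder, and it works for app bundles and frameworks alike):

    import Foundation

    // Report which CPU slices a bundle's main executable contains.
    let url = URL(fileURLWithPath: "/Applications/SomeApp.app") // placeholder path
    if let bundle = Bundle(url: url), let archs = bundle.executableArchitectures {
        let hasARM64  = archs.contains(NSNumber(value: NSBundleExecutableArchitectureARM64))
        let hasX86_64 = archs.contains(NSNumber(value: NSBundleExecutableArchitectureX86_64))
        print("arm64 slice: \(hasARM64), x86_64 slice: \(hasX86_64)")
    }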
> Rosetta is a software translation layer, not a hardware translation layer. It doesn't take any die space.
There is hardware acceleration in place that only exists to give it, as you just stated, acceptable performance.
It does take up die space, but they're going to keep it around; they've just decided to reduce the types of applications that Rosetta 2 (and the hardware that exists only for it) will support.
So, seems like they've decided they can't fight the fact that gaming is a Windows thing, but there's no excuse for app developers.
Sure, this seems to be a restatement of my post, which started with "While true...", rather than a disagreement. I was pointing out which one of the "many acceptable opposing answers" Apple had chosen. They can't use that die area for performance because they're still using it even after this phase-out. (I'm not the person who wrote the original post.)
So, the way to "use die area for performance" is to add more cache and branch predictor space. Because of this, anything that costs a lot of code size does consume it because it's using the cache up.
They were pretty quick to sunset the PPC version of Rosetta as well. It forces developers to prioritize making the change, or making it clear that their software isn’t supported.
The one I have my eye on is Minecraft. While not mission-critical in any way, they were fairly quick to update the game itself, but failed to update the launcher. Last time I looked at the bug report, it was closed and someone had to re-open it. It’s almost like the devs installed Rosetta 2 and don’t realize their launcher is using it.
Owning a Mac has always meant not being able to rely on third-party software. Forget printer/scanner drivers. Even if they target macOS perfectly, there will come a day when you need to borrow a Windows PC or an old Mac to print.
It happens to be ok for me as a SWE with basic home uses, so their exact target user. Given how many other people need their OS to do its primary job of running software, idk how they expect to gain customers this way. It's good that they don't junk up the OS with absolute legacy support, but at least provide some kind of emulation even if it's slow.
I spent what I would consider to be a lot of money for a unitasker Fujitsu scanner device and am just astounded by how unmaintained and primitive the software is. I only use it on a Windows machine though, so I'm not in the same boat.
Phasing out Rosetta 2 seems like a reasonable move. Maintaining backward compatibility indefinitely adds complexity and technical debt. Apple has supported Intel-based systems for a long time, and this step aligns with their goal of keeping macOS streamlined for Apple Silicon.
This seems to basically only apply to full-fledged GUI apps and excludes e.g. games, so potentially stuff like Rosetta for CLI isn't going anywhere either
But games are full fledged GUI apps. At a minimum they have a window.
It’s really unclear what it means to support old games but not old apps in general.
I would think the set of APIs used by the set of all existing Intel Mac games probably comes close to everything. Certainly nearly all of AppKit, OpenGL, and Metal 1 and 2, but also media stuff (audio, video), networking stuff, input stuff (IOHID etc).
So then why say only games when the minimum to support the games probably covers a lot of non games too?
I wonder if their plan is to artificially limit who can use the Intel slices of the system frameworks? Like hardcode a list of blessed and tested games? Or (horror) maybe their plan is to only support Rosetta for games that use Win32 — so they’re actually going to be closing the door on old native Mac games and only supporting Wine / Game Porting Toolkit?
Games use a very small portion of the native frameworks. Most would be covered by Foundation, which they have to keep working for Swift anyway (Foundation is being rewritten in Swift) and just enough to present a window + handle inputs. D3DMetal and the other translation layers remove the need to keep Metal around.
That’s a much smaller target of things to keep running on Intel than the whole shebang they need right now to support Rosetta.
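To make "just enough to present a window and handle inputs" concrete, here's a bare-bones sketch of the AppKit surface such a shell might touch (my own illustration, not what any particular game actually does; the actual rendering would go through Metal or a translation layer):

    import AppKit

    // Minimal game-style shell: one NSApplication, one window, an event loop.
    let app = NSApplication.shared
    app.setActivationPolicy(.regular)

    let window = NSWindow(contentRect: NSRect(x: 0, y: 0, width: 1280, height: 720),
                          styleMask: [.titled, .closable, .resizable],
                          backing: .buffered,
                          defer: false)
    window.title = "Hypothetical game shell"
    window.makeKeyAndOrderFront(nil)

    app.activate(ignoringOtherApps: true)
    app.run()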
I don’t agree. My point is their collective footprint in terms of the macOS API surface (at least as of 2019 or so) is pretty big. I’m not just speculating here, I work in this area so I have a pretty good idea of what is used.
As I said in my first comment, it's at least Cocoa (Foundation + AppKit), AVFoundation, Metal, OpenGL, and then all of the lower level frameworks and libraries those depend on (which may or may not be used directly by individual games). If you want a concrete example from something open source, go look at what SDL depends on, it's everything I listed and then some. It's also not uncommon for games to have launchers or startup windows that contain additional native UI, so assume you really do need all of AppKit, you couldn't get away with cutting out something like NSTableView or whatever.
So my point remains, if Apple has to continue providing Intel builds of all of these frameworks, that means a lot of other apps could also continue to run. But ... Apple says they won't, so how are they going to accomplish this? That's the mystery to me.
If you'd like to see an interesting parallel, go look at how Microsoft announced supporting DirectX 12 on Windows 7 for a blessed apps list - basically because Blizzard whined hard enough and was a big enough gorilla to demand it.
That's one implementation, yeah, just have a list somewhere of approved software and make an artificial limitation. But their announcement is so vague, it's hard to say.
And then the next question is why? It's not like they've ever promised much compatibility for old software on new macOS. Why not let it be just best effort, if it runs it runs?
I'd also love to see the source code of the embedded M68K emulator for PPC Macs. I believe there are two versions -- one interpreter style and one dynarec style.
Check out Asahi Linux, they run on Apple Silicon and have translation for 32 and 64 bit x86, so they even go further than what Rosetta achieved. Open Source as well.
For those unfamiliar with Apple’s new version-numbering system, this is the version that will be released in 2027, presumably around September or October of that year.
Hopefully this means macOS 27 will be a Snow Leopard type release to focus on bug fixes, performance, and the overall experience, rather than focusing on new features.
It's a myth that Snow Leopard was a bug fix release. Mac OS X 10.6.0 was much buggier than 10.5.8, and indeed brought several new severe bugs. However, Mac OS X 10.6 received two years of minor bug fix updates afterward, which eventually made it the OS that people reminisce about now.
Apple's strict yearly schedule makes "another Snow Leopard" impossible. At this point, Apple has accumulated so much technical debt that they'd need much more than 2 years of minor bug fix updates.
> It's a myth that Snow Leopard was a bug fix release.
> Mac OS X 10.6.0 was much buggier than 10.5.8
Somebody who worked on Snow Leopard has already disagreed with you here about those things:
> As the person who personally ran 10.6 v1.1 at Apple (and 10.5.8), you are wrong(ish).
> Snow Leopard's stated goal internally was reducing bugs and increasing quality. If you wanted to ship a feature you had to get explicit approval. In feature releases it was bottom up "here is what we are planning to ship" and in Snow Leopard it was top down "can we ship this?".
> During that time period my team and I triaged every single Mac OS X bug coming into the company every morning. Trust me, SL was of higher quality than Leopard.
> Apple's strict yearly schedule makes "another Snow Leopard" impossible. At this point, Apple has accumulated so much technical debt that they'd need much more than 2 years of minor bug fix updates.
I don’t think the schedule matters. They just over-commit every time. I said elsewhere:
> [Apple] were never building and have never built software at a sustainable pace, even before the yearly cadence. They race ahead with tech debt then never pay it off, so the problem gets progressively worse.
> A while back, that merely manifested as more and more defects over time.
> More recently, they began failing to ship on time and started pre-announcing features that would ship later.
> And now they’ve progressed to failing to ship on time, pre-announcing features that would ship later, and then failing to ship those features later.
> This is not the yearly cadence. This is consistently committing to more than they are capable of, which results in linear growth of tech debt, which results in rising defects and lower productivity over time. It would happen with any cadence.
> Somebody who worked on Snow Leopard has already disagreed with you here about those things:
It's instructive to read the entire thread, not just the few sentences you quoted. For example, that person later admits, "So yeah, if you are comparing the most stable polished/fixed/stagnant last major version with the brand new 1.0 major version branch, the newer major is going to be buggier. That would be the case with every y.0 vs x.8."
> I don’t think the schedule matters. They just over-commit every time.
That's a distinction without a difference. Apple has committed to releasing major OS updates every year on schedule. That's a recipe for over-commitment, because they need to produce enough changes to market it as a major release.
The "no new features" gimmick of Snow Leopard was a marketing lie but was also unique. It's a gimmick that Apple pulled only once, and it couldn't be repeated frequently by Apple without making a mockery of the whole annual schedule. Maybe they could do it a second time now, but in general the annual schedule is still a major problem for a number of reasons.
It should also be noted that Snow Leopard itself took 2 years to produce after Leopard.
Not sure why you’re downvoted because you’re right.
Snow Leopard brought a huge amount of under-the-covers features. It was a massive release. The only reason it had that marketing was because they didn’t have a ton of user-facing stuff to show.
That is more or less what users asking for another Snow Leopard want: a release that doesn't have gratuitous UI churn and superficial changes, doesn't break the end user's muscle memory, but instead focuses on deep-seated and long-standing issues under the hood. If the right thing for the OS in the long term is to replace an entire subsystem instead of applying more band-aid fixes, then take the time to do a proper job of it.
lapcat loves his straw man about OS X 10.6.0 having plenty of bugs, but that misses the point of Snow Leopard. Of course a release that makes changes as fundamental as re-writing the Finder and QuickTime to use the NeXT-derived frameworks rather than the classic Mac OS APIs, and moving most of the built-in apps to 64-bit, is going to introduce or uncover plenty of new bugs. But it fixed a bunch of stubborn bugs and architectural limitations, and the new bugs mostly got ironed out in a reasonable time frame. (Snow Leopard was probably one of the better examples of Apple practicing what they preach: cleaning out legacy code and modernizing the OS and bundled apps the way they usually want third-party developers to do to their own apps.)
Fixing architectural bugs is still fixing bugs—just at a deeper level than a rapid release schedule driven by marketable end-user features easily allows for.
> a release that doesn't have gratuitous UI churn and superficial changes
There have actually been quite a few of those releases. Some of the California-themed updates have been practically indistinguishable from the previous versions. Of course Tahoe and Big Sur brought huge UI changes, but those are the exceptions, not the norm.
> focuses on deep-seated and long-standing issues under the hood
Which issues would those be, specifically?
> If the right thing for the OS in the long term is to replace an entire subsystem
Which subsystems need replacement? You claim that this is what people mean by wanting another Snow Leopard, but which subsystems do people want replaced?
> misses the point of Snow Leopard
I haven't missed the point of Snow Leopard. You're conflating two entirely different things: (1) the point of Snow Leopard as conceived by Apple in 2008-ish and (2) why people in 2025 look back fondly at Snow Leopard. My claim is that the fond memories are the result of the quality and stability that were themselves the result of 2 full years of bug fixes AFTER the initial release of Snow Leopard. Whereas the initial quality of Snow Leopard was not great, just like the initial quality of all major OS updates is not great. Major updates invariably make software buggier, and the quality comes only after much time spent refining the new stuff.
My contention is that the marketing lie of "no new features", which is naturally very memorable, is the reason that a lot of people associate Snow Leopard with bug fixes and quality, but that's not actually what 10.6.0 brought, and the quality came much later in time.
I'm not saying that Snow Leopard didn't bring valuable changes. I'm just saying that Snow Leopard existed in various stages over 2 years, and the high quality version of Snow Leopard that we remember fondly now is actually late-stage Snow Leopard, not early-stage Snow Leopard, and those 2 years of minor bug fix releases were crucial. Moreover, that's what we need now, a long series of minor bug fix updates, not any new major updates. The bug backlog has become a mountain.
> Of course a release that makes changes as fundamental as re-writing the Finder and QuickTime to use the NeXT-derived frameworks rather than the classic Mac OS APIs, and moving most of the built-in apps to 64-bit, is going to introduce or uncover plenty of new bugs.
Which is why I think it's very wrong to claim that people want "another Snow Leopard". Snow Leopard II released in 2026 would be much buggier than even macOS Tahoe, which is precisely what people do NOT want, a bunch more bugs.
> But it fixed a bunch of stubborn bugs
Which bugs exactly?
> Fixing architectural bugs is still fixing bugs
Which architectural bugs do you have in mind, or more relevantly, which architectural bugs do people in general have in mind when saying that they want another Snow Leopard?
I lost access to decades of my albums, which can no longer open on my MacBooks. Some open partially running Ableton Live with Rosetta. My record label recently reached out asking for stems for an old song for a sync deal with Rocket League; after spending a week trying to revive the old sessions, I concluded that it was impossible and they were forever lost, thanks to Apple's complete abandonment of backwards compatibility. It's heartbreaking, really.
I've already lost my "studio" (a few appliances in the corner of my room) due to upgrade from windows 7 to 10. Now it will happen again after I migrated to mac. I guess the "studio" should be left alone when it comes to upgrades. I'm starting to believe, that a "studio" is a set of software AND hardware, so I guess I won't sell my mac to buy new, but rather maintain it with given software and hardware on it, just maybe unplug it from the internet.
-- EDIT --
or just move back to windows, but I can't imagine it with the current state of AI bloat
For sure. But I'd be surprised if a significant number of those setups were running recent versions of Mac OS, especially in older studios. Stability is preferable to new features since old studio hardware is often very reliable and studio engineers are wary of ruining compatibility with system upgrades
I can just imagine the Apple statement, like they did with flash/Flash.
‘We fully support the Studio.’
Edit: After hunting around without success, I’m now doubting my memory. I thought I could remember Jobs dismissively replying to a question about Adobe Flash that Apple supported flash (memory).
Maybe I made that up?
I guess this is another way of Apple saying x86 is dead. Would have loved it if Intel and AMD had joined forces to open up x86. Instead they are following the same path as POWER, likely doing it when it is too little, too late.
Aw that bums me out, brings back a lot of memories. Though I assume it’s been effectively dead for a while.
I haven’t dabbled with hackintoshes in nearly a decade, I stepped away around the time iMessage started needing those extensive hacks to work. Things seemed to shift away from driver/bootloader gaps to faking Apple hardware. Years earlier, I had an Asus Eee PC (remember “netbooks”?) that ran macOS without any major issues. I even built a machine that I believed I could hackintosh easily, though it never quite worked as well as I hoped.
The era of random companies selling pre-built Hackintoshes was so cool. Kids these days probably wouldn’t even believe it if you told them, like how Netflix used to actually send you a DVD in the mail. :)
This is very frustrating. As if they couldn't afford to continue it. And at the same time they keep making the system more and more closed, so that you can't even run applications without Apple's permission. I don't understand why people still buy such products.
Ok, then try to run a pre-compiled macOS M1 compatible application on your new Sequoia system, such as https://github.com/rochus-keller/oberonsystem3/ or https://github.com/rochus-keller/leancreator/. Requires quite some tricks so that at least some applications run without Apple's benedictions, but the tricks don't work for all such applications; and as it looks, they will also remove the last remaining work-arounds in future.
From the page, inside a large block marked “Important”:
> Beyond this timeframe, we will keep a subset of Rosetta functionality aimed at supporting older unmaintained gaming titles, that rely on Intel-based frameworks.
This is awful. I love playing games on my MBP, and the latest CrossOver releases have been amazing in their ability to play almost all Windows PC games at full speed. Losing Rosetta means CrossOver is dead.
You would hope that apple would open source it, but they are one of the worst companies in the world for open sourcing things. Shame on all their engineers.
From the OP: "Beyond [the two-year] timeframe, we will keep a subset of Rosetta functionality aimed at supporting older unmaintained gaming titles, that rely on Intel-based frameworks."
What are you talking about? There's "Do I have enough RAM to run Slack, all my Chrome tabs, and a terminal program"; there's "What terminal program shall I run today: Ghost edition"; there's "Can I get Colima to run, now with Docker DLC"; there's "Kubernetes on Mac: Kind edition"; there's "Let's with Tart!"; there's "Nix is for Ops: New and more obtuse config edition". With so many fun games to play, who's got time for anything else?
Just a few days ago something updated and my virtual desktop switching now behaves erratically. I press <Super>+<1>, it changes to desktop 1 with vscode open, and immediately it starts typing "1" into vscode. It seems to be a bug affecting all X applications. I fixed it for vscode by making it run under Wayland, but now it doesn't draw a border around the vscode window. Another irritation, and I have other X apps.
It works, it's free, I love it. But it's so not polished and it'll never be. I miss macOS polish, where basic things just work.
> virtual desktops
> vscode
> wayland
Sounds like you have a misconfigured system. Jokes aside, this looks like a bug in your WM. Macs may be more polished, but my point was not about polish.
Now, it's important to note that people were attempting to resolve issues. The transitions weren't always clean, but the results are usually great. For example, moving to PipeWire is possibly the greatest advancement in Linux audio ever; Linux audio finally doesn't suck. XFree86 to Xorg was likewise great: for the last few years of X11, I usually didn't have to modify the config. I kind of don't care about init systems most of the time. The only major complaint for systemd is that disk I/O on embedded systems is kind of an issue, but things like Alpine are better there, and Alpine doesn't use systemd.
With that said, I think the real issue is that people dislike advancements that break things. Early in Pulse's life, people absolutely hated it. Early in Wayland's life, people absolutely hated it, but it wasn't default so no one complained. With Windows and macOS, stuff changes seemingly constantly and randomly and breaks things, so people hate it. Saying, however, that Linux doesn't change seems a little daft to me. It changes faster than anything else on small levels, and different distributions have breaking changes at different rates.
You don't have to install gnome, kde, wayland or systemd. You are just talking about your preferences masked as something that “had to be done”. I only had to fiddle with audio on the raspberry pi when connecting bluetooth. Everything works out of the box nowadays. If wayland was a good protocol, the user would not have to know about it.
I wasn't saying that anything had to be done, nor was I saying that each change was good or bad (except for the audio and Xfree86 to Xorg). My preferences really don't enter into it. I was saying that Linux systems do indeed change, and the idea of learn once and you're done is nonsense.
I use less than 10 gui programs on linux. They never change. The command line programs do not change either. Unless the devs get a dumb idea to rewrite them in Rust, because they sunk so many hours into learning it.
For a few years now it's been feeling like Apple are pushing devs away and are more interested in catering for general consumers. Just look at what DHH has written and said about it, and his move to Omarchy
> The system prevents you from mixing arm64 code and x86_64 code in the same process. Rosetta translation applies to an entire process, including all code modules that the process loads dynamically.
I've been using this VST from Arturia (Minimoog V) since they distributed it for free back in like 2011 or 2012, and it runs as well on my M1 Mac as it did on my previous Intel Macs.
I mean, it's literally the same DMG from way back when and there's no chance it doesn't run under Rosetta, but I run Ableton natively!
Seems like you're trying to load an Intel-only plugin binary in a native ARM application. This doesn't work. The DAW and its plugins must use the same architecture. You would either have to run Ableton in Rosetta or use a plugin bridge. (This is similar to Windows if you want to run 32-bit plugins in a 64-bit DAW.)
Yes, that's how you do it. I have written a VST plugin host for Pure Data and SuperCollider and it supports sandboxing/bridging. It's not rocket science. I'm not sure why Ableton never bothered to implement this.
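If you're not sure which architectures a given plugin binary actually contains, you can inspect its Mach-O header directly. Here's a minimal sketch in Python; it only handles the common thin 64-bit and fat header layouts, and the path is just whatever binary sits inside the plugin bundle:

```python
import struct
import sys

FAT_MAGIC = 0xCAFEBABE     # universal ("fat") binary, header stored big-endian
MH_MAGIC_64 = 0xFEEDFACF   # thin 64-bit Mach-O, stored little-endian on current Macs
CPU_TYPES = {0x01000007: "x86_64", 0x0100000C: "arm64"}

def architectures(path):
    with open(path, "rb") as f:
        data = f.read(4096)
    if struct.unpack(">I", data[:4])[0] == FAT_MAGIC:
        # fat header: magic, nfat_arch, then 20-byte fat_arch records (all big-endian)
        nfat = struct.unpack(">I", data[4:8])[0]
        return [
            CPU_TYPES.get(struct.unpack(">I", data[8 + i * 20: 12 + i * 20])[0], "other")
            for i in range(nfat)
        ]
    if struct.unpack("<I", data[:4])[0] == MH_MAGIC_64:
        # thin binary: the cputype field sits right after the magic
        return [CPU_TYPES.get(struct.unpack("<I", data[4:8])[0], "other")]
    return ["unknown"]

if __name__ == "__main__":
    print(architectures(sys.argv[1]))
```

Running `lipo -archs` or `file` on the same binary gives you the same answer without any code.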
The last security update for Snow Leopard was in 2013. Friends don't let friends connect software that vulnerable to the internet.
The hardware can be ok, the walled garden is not.
I don't think the ability to cross-compile things will go away when Rosetta is phased out, though.
But how can you test it if your ARM-based Mac cannot run it? Most software vendors will simply stop making x86_64 builds.
Keep older hardware at hand?
Sure! The point is that it wasn't necessary because of Rosetta. For example, I no longer have an Intel-based Mac, but I still want to build and test for x86_64.
I understand where you are coming from and commend you for trying to support your users (I'd do the same!), but I don't think Apple marketed Rosetta 2 as a permanent solution after the transition.
Another aspect is that a Mac stops getting software updates after ~7 years, and then its API level starts to drift from the latest macOS releases.
So, after the 10-year mark, you can't get the latest versions of applications, since the features developers use aren't available in the older macOS versions, and you can't run that software anyway.
There’s someone out there who wants to build for PowerPC. At some point you have to say it’s a tiny piece of the market and making a few people spend $300 for old hardware is better than maintaining back compat forever.
The difference is that there is still a lot of x86 software written for Windows, which you need x86 emulation to run through Whisky/CrossOver on a Mac.
And for x86-64 Windows builds, you should be testing using an x86-64 Windows machine, not Rosetta 2
More issues generally arise from supporting/qualifying older OS versions than supporting specific architectures in my experience, so developers keep around older hardware or VMs for that purpose. In some other circumstances Rosetta may not be sufficient for testing older Intel hardware (one example is work on GPU)
Apple always phases out these kinds of technologies after some time to keep the ecosystem tidy and give a last push to developers to abandon legacy code.
In this iteration, it might also allow some simplification of the silicon, since Mx chips have some black magic to mimic x86 (mostly in memory ordering, IIRC) to allow Rosetta to work that fast. IOW, Rosetta 2 is not software-only magic this time.
I remember using the first Rosetta to play Starcraft on my Intel Mac. It also got deprecated after a year or two.
So leaving things behind despite some pains is Apple's way to push people forward (e.g.: Optical media, ports, Rosetta 1, Adobe Flash, etc.).
> It also got deprecated after a year or two.
It was five years, from 2006 to 2011. Rosetta 2 will have been there for seven years (currently at five).
https://en.wikipedia.org/wiki/Rosetta_(software)
To clarify, the complete sentence in my mind was "...after a year or two I got my Intel Mac". I got mine in Q3 2008, just before the Unibody ones were introduced.
So, I effectively got 2 years out of Rosetta 1, but I didn't mean to say Apple supported it for only two years.
Sorry for the confusion.
Looks like I can't edit my comment anymore to clarify.
Not sure it's only about tidiness. Rosetta 1 was licensed from a third party and Apple didn't want to keep paying the license fees.
I don't know if this is the situation with Rosetta 2.
I read a comment somewhere, possibly here by an ex-Apple engineer who claimed that they optimized the thing mathematically for the performance it exhibits.
So, considering its silicon parts, Rosetta 2 is more of an Apple endeavor and technology.
On the other hand, 5-7 years is a very typical timespan for Apple, so I don't think licensing fees were that important a factor in ending support for it.
The original Rosetta was based on technology from Transitive which, as I recall, IBM bought. Don't know where Rosetta 2 fits in and any licensing associated with the original Rosetta was a long time ago.
If they hadn't deprecated 32 bit we would still be able to play Halo on mac.
This is the perfect comment because 1) it’s true, and 2) it can be read as supportive, a complaint, or just a neutral observation.
The problem is that keeping older architectures alive creates an exponential workload, grinding everything to a halt.
So, even though I feel what you are saying, we can't have every nice thing we want, at the same time.
What has been so impressive about the last 5 years of MacOS releases?
I'm still not sure what's so impressive about the last 25 years of Windows and macOS that means we need an absolute supercomputer by 2000 standards just to open a Word document the same way we did back in Windows 2000.
Didn’t Word use to be installed from 2 floppy disks? Now Calculator.app leaks 40 GB of memory. Software in this sorry state cannot be run on a supercomputer; it needs one of those theoretical ocean-boilers.
Word 4.0 for DOS from 1987, sure.
This is a false memory. The reason "splash screens" existed with little text banners updating you about the status of the program's initializers was because it took fucking forever to launch Word on a 90's PC.
The steep decline in software stability and usability has been quite impressive, I wasn’t expecting them to screw it up so fast. Alan Dye in particular is a true inspiration for those who subscribe to the Peter Principle.
https://en.wikipedia.org/wiki/Peter_principle
I'm not very well versed in macOS internals, but I was a tech lead of a Debian derivative. I also write HPC software and manage relevant infrastructure from metal to user, so I believe I know some details about processor architectures, general hardware, Linux, and *NIX systems in general.
The user-visible layer of an operating system is generally one of the simpler layers to write and maintain, since it's built upon abstractions. However, the libraries powering these layers, esp. the math-heavy and hardware-interacting ones, are much more complex due to the innate complexity of the hardware in general.
Keeping multiple copies of a library for two different architectures (even if they differ only in bit-length), where this simple bit change needs different implementation strategies to work correctly, is a pain by itself (for more information, ask Linux kernel devs, since they're also phasing out old x86 support).
Moreover, x86 and x86_64 are completely different modes on the processor. On top of that, the x86-only mode is called "protected mode" and x86_64 is called "long mode"; running x86 under x86_64 is a sub-mode of "long mode", and is already complex enough at the silicon level.
Same complexities apply to ARM and other processor architectures. Silicon doesn't care about the ISA much.
We have already seen that the push to increase performance on superscalar, out-of-order processors opened a new, previously untapped family of side-channel/speculative-execution attacks. So processors are complex, software is complex, and multiple architectures on the same hardware are exponentially complex. If you want to see how the sausage is made, you can also research how Windows handles the backwards-compatibility problem (hint: in ELI5 terms, by keeping complete Windows copies under a single Windows installation).
So, the impressive thing was keeping these multi-arch installations running for quite some time. We need to be able to let things go and free up some software and hardware budget for new innovations and improvements.
Addendum: Funnily, games are one of the harder targets for multi-arch systems, since they are both math-heavy and somewhat closer to the hardware than most applications, and are very sensitive to architecture changes. Scientific/computational software is another such family, and this interestingly includes databases and office software. Excel also had a nasty floating-point bug back in the day, and 32/64-bit installations of Microsoft Office have had some feature differences since the beginning.
How much worse they make things.
ARM/Apple-Silicon support?
Is there not an emulator at this point?
> Or make room for a new fat binary for a future "arm64v2" :)
Or, one can dream: RVA23
> So maybe there is hope that the core x86_64 binary translator will stick around for things like VM and emulation of generic (linux? wine?) binaries
It's mostly for their game-porting toolkit. They have an active interest in Windows-centric game developers porting their games to Mac, and that generally doesn't happen without the compatibility layer.
System library calls from x86 don’t get converted into arm64 by Rosetta? I coulda sworn Microsoft’s emulator did that
> including large changes like liquid glass
They could just revert all that large change with no loss to the users.
Best take
It’s basically just a recompile though.
I'm sure there's lots of x86_64 specific code in the macOS userland that is much more than just a recompile - things like safari/javascriptcore JIT, various quartz composer core animation graphics stack and video encoder decoder stack libraries, as well as various objective-c low level pointer tagging and message passing ABI shenanigans and so on. This is probably why 32bit intel mac app support was dropped pretty hard pretty fast, as the entire runtime and userland probably required a lot of upkeep. As just one example, 32bit intel objective-c had "fragile instance variables" which was a can of worms.
This is <1% of the total code that Apple writes
Yeah, the most important, least readable, and oldest code. It's exactly the stuff that's expensive to maintain.
Until it isn't
Can you enable TSO for ARM executables?
Yes but I don't see how that is relevant
It’s not like they were doing it to make me happy; they are doing it to sell Macs and lock people into the Apple ecosystem. Maybe there is only a negligible % of people still using it; the M1 is possibly 6 yrs old, IIRC.
Closer to 5 years old
They barely just released Containerization Framework[0] and the new container[1] tool, and they are already scheduling a kneecapping of this two years down the line.
Realistically, people are still going to be deploying on x64 platforms for a long time, and given that Apple's whole shtick was to serve "professionals", it's really a shame that they're dropping the ball on developers like this. Their new containerization stuff was the best workflow improvement for me in quite a while.
[0] https://github.com/apple/containerization
[1] https://github.com/apple/container
Apple has always been like this; there are other options when backwards compatibility is a relevant feature.
Yeah, it kind of kills me that I am writing this on a Samsung Galaxy Book 3 Pro 360 running Windows 11 so that I can run Macromedia Freehand/MX (I was a beta-tester for that version) so that I can still access Altsys Virtuoso 2 files from my NeXT Cube (Virtuoso 2 ~= Macromedia Freehand 4) for a typeface design project I'm still working on (a digital revival of a hot metal typeface created by my favourite type designer/illustrator who passed in 1991, but whose widow was gracious enough to give me permission to revive).
I was _so_ hopeful when I asked the devs to revive the Nx-UI code so that FH/MX could have been a native "Cocoa" app....
> running Windows 11 so that I can run Macromedia Freehand/MX
Freehand still works on Windows 11? I’m happy for you, I never found a true replacement for it.
> a digital revival of a hot metal typeface created by my favourite type designer/illustrator who passed in 1991, but whose widow was gracious enough to give me permission to revive
Any reason you haven’t shared the name of the designer or the typeface? That story sounds interesting, I’d really welcome learning more.
Yes, fortunately. I despair of what I'm going to do when I no longer have such an option. Cenon is clunky, Inkscape's cross-platform nature keeps it from having many interface aspects which I depend on, and I'd rather give up digital drawing than use Adobe Illustrator (which despite using since v3.2 on college lab Macs and on my NeXT Cube I never found comfortable).
The designer/typeface are Warren Chappell's Trajanus, and his unreleased Eichenauer --- I read _The Living Alphabet_ (and his cousin Oscar Ogg's _The 26 Letters_) when I was very young, and met him briefly on a school field trip back when he was Artist-in-Residence at UVA and did a fair bit of research in their Rare Book Room, and even had a sample of the metal type (missing one character unfortunately).
It is currently stalled at my having scanned and drawn up one of each letter at each size which I have available, but only having two letters, _N_ and _n_ in all sizes --- probably shouldn't worry that much about the optical axis, since it was cut in metal in one master size and the other sizes made using a pantograph, but there were _some_ adjustments which I'd like to preserve. There is a digital version of Trajanus available, but it's based on the phototype. I've been working at recreating each character using METAFONT, encompassing the optical size variation in that programmatically, but it's been slow going (and once I'm done, I then have to work out how to make it into outlines....)
That's why like 80%+(?) of the corporate world runs Windows client-side for their laptops/workstations. They don't want to have to rewrite their shit whenever the OS vendor pushes an update.
Granted, that's less of an issue now with most new SW being written in JS to run in any browser, but old institutions like banks, insurers, industrial/automation companies, retail chains, etc. still run some ancient Java/C#/C++ programs they don't want to, or can't, update for various reasons, and it keeps the lights on.
Which is why I find it adorable when people in this bubble think all those industries will suddenly switch to Macs.
they use Windows because it's ostensibly cheap and there's momentum. I don't think any modern tech company is majority Windows.
One of my previous companies gave top of the line workstations with 4k touchscreens and i9s to literally everyone junior and below a particular grade. I'm quite sure they could've saved 1000s of dollars per laptop by going with a reasonable MacBook.
(Ironically, windows 11 + corporate bloatware made the laptops super laggy. Go figure.)
It surely is outside the US and countries with a similar income level.
https://www.accio.com/business/operating-system-market-share...
That's overall market share. Agree Windows use is high but in general the more tech-forward the company is the less Windows there is at it.
> but in general the more tech-forward the company is the less Windows there is at it
Only if you exclusively count food delivery apps, crypto Ponzi-scheme unicorns, ad services, and SaaS start-ups as "tech-forward", because you're omitting a lot of other tech companies your daily life in the civilized world depends on, which operate mainly on Windows, like where I work now.
Is designing and building semiconductors not "technology"? Or MRI machines? Or jets? Or car engines?
So only 13% of the world's desktop users might be employed at a tech-forward company.
Might, because the number is even lower when we differentiate between company and home use.
> more tech-forward
That may be surprising for people here, but technology is not synonymous with software.
The OP says nothing about Rosetta for Linux.
It seems to talk about Rosetta 2 as a whole, which is what the containerization framework depends on to support running amd64 binaries inside Linux VMs (even though the kernel still needs to be arm)
Is there a separate part of Rosetta that is implemented for the VM stuff? I was under the impression Rosetta was some kind of XPC service that would translate executable pages for Hypervisor Framework as they were faulted in, did I just misunderstand how the thing works under the hood? Are there two Rosettas?
I cannot tell you about implementation difference but what I mean is that this only talks about Rosetta 2 for Mac apps. Rosetta for Linux is a feature of the Virtualization framework that’s documented in a completely different place. And this message says a part of Rosetta for macOS will stick around, so I would be surprised if they removed the Linux part.
On the Linux side, Rosetta is an executable that you hook up with binfmt_misc to run AMD64 binaries, much like how you might use Wine for Windows binaries.
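For the curious, that hookup really is just a binfmt_misc rule pointing at the Rosetta binary exposed to the VM. A rough sketch of the registration, assuming the Rosetta share is mounted at /media/rosetta (the path and flags here are assumptions; distros usually wrap this in systemd-binfmt or update-binfmts):

```python
# Hypothetical sketch: register Rosetta as the binfmt_misc handler for x86-64 ELF
# binaries inside an arm64 Linux VM. Must run as root inside the guest.

# Standard x86-64 ELF header pattern (magic + mask), the same one QEMU user-mode uses.
ELF_X86_64_MAGIC = r"\x7fELF\x02\x01\x01\x00\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x3e\x00"
ELF_X86_64_MASK = r"\xff\xff\xff\xff\xff\xfe\xfe\x00\xff\xff\xff\xff\xff\xff\xff\xff\xfe\xff\xff\xff"

INTERPRETER = "/media/rosetta/rosetta"  # assumed mount point of the Rosetta share

# binfmt_misc rule format -> :name:type:offset:magic:mask:interpreter:flags
# "C" passes the original binary's credentials, "F" opens the interpreter at
# registration time so it keeps working inside containers.
rule = f":rosetta:M::{ELF_X86_64_MAGIC}:{ELF_X86_64_MASK}:{INTERPRETER}:CF"

with open("/proc/sys/fs/binfmt_misc/register", "w") as f:
    f.write(rule)
```

After that, any x86-64 ELF the guest kernel is asked to exec gets handed to Rosetta instead of failing with "exec format error".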
Rosetta Linux executable can be used without host hardware/software support; for example, you can run it on AWS's Graviton instances.
However, to get performance benefits, you still need to have hardware support, and have Rosetta installed on macOS [1].
TFA is quite vague about what is being deprecated.
[1] https://developer.apple.com/documentation/virtualization/run...
The "other" part of Rosetta is having all system frameworks being also compiled for x86_64, and being supported running in this configuration.
There are a lot of projects with arm containers on docker hub. It’s not hard to build multi platform containers.
> and given that Apple's whole shtick was to serve "professionals",
When was the last time this was true? I think I gave up on the platform around the new keyboards, which clearly weren't made for typing, and the non-stop "Upgrade" and "Upgrade" notifications that you couldn't disable, only push forward into the future. Everything they've done since then seems to have been to impress the Average Joe, not to serve professionals.
The current MacBook Pro was basically a checklist of the items professionals wanted back, a step away from the consumer new-shiny-thing approach.
And then they introduced liquid glass, because professionals don't need an easily readable UI to work with.
> Everything they've done since them seems to have been to impress the Average Joe, not for serving professionals.
"CIOs say Apple is now mission critical for the enterprise" [1]
[1]: https://9to5mac.com/2025/10/25/cios-say-apple-is-now-mission...
That's literally sponsored content/an ad by a company who makes money managing Apple devices, of course they'll say it's "mission critical", on a website meant to promote Apple hardware.
Happen to have some less biased source saying anything similar, ideally not sponsored content?
Well this kinda screws me over running docker on macos. Not all images I use have an arm version.
https://github.com/apple/container
They released this a while ago which has hints of supporting amd64 beyond the Rosetta end date.
Believing in hints from Apple about software? Sweet summer child.
Still waiting for ZFS on OS X
This mostly died because Apple's new security framework doesn't allow for unsigned kexts or writable root. This also killed most fuse implementations and therefore kneecapped stuff like SSHFS.
Maybe Apple should consider looking into those sorts of things before they promise a feature will be in the OS, and even ship some barely working version of it. This was maybe around 2008 sometime, so I might misremember how broken the preview they shipped was.
It was implemented in some of the earlier Leopard betas, IIRC. Possibly speculation on my side, but it was probably removed due to licensing once Oracle expressed interest in acquiring Sun Microsystems.
It doesn’t say if that is going away. The message calls out another part as sticking around:
> Beyond this timeframe, we will keep a subset of Rosetta functionality aimed at supporting older unmaintained gaming titles, that rely on Intel-based frameworks.
Since the Linux version of Rosetta requires even less from the host OS, I would expect it to stay around even longer.
Yes that was my first thought as well, and as the images aren't designed to be run on a mac specifically, like a native app might be, there is no expectation for the developers to create a native apple silicon version. This is going to be a pretty major issue for a lot of developers
Case in point - Microsoft's SQL Server docker image, which is x86-only with no hint of ever being released as an aarch64 image.
I run that image (and a bunch of others) on my M3 dev machine in OrbStack, which I think provides the best docker and/or kubernetes container host experience on macOS.
I’ve worked in DevOps and companies I’ve worked for put the effort in when M1 came out, and now local images work fine. I honestly doubt it will have a huge impact. ARM instances on AWS, for example, are much cheaper, so there’s already lots of incentive to support ARM builds of images
In our small shop, I definitely made sure all of our containers supported aarch64 when the M1 hit the scene. I'm a Linux + ThinkPad guy myself, but now that I've got an x13s, even I am running the aarch64 versions!
How do you build multi-arch in CI? Do you cross-compile or do you have arm64 runners?
It depends. Mostly it is choosing the right base image architecture. For rust and golang we can trivially cross compile and just plunk the binary in the appropriate base image. For JVM based apps it is the same because we just need to have the jars in the right place. We can do this on either architecture.
The only hold out is GraalVM which doesn’t trivially support cross compilation (yet).
We're mostly a PHP / JS (with a little Python on the side) shop, so for our own code it's mostly a matter of the right base image. Building our own images is done on an x86-64 machine, with the aarch64 side of things running via qemu.
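For reference, the common way to get both architectures out of a single x86-64 CI runner is `docker buildx` with QEMU registered for the foreign architecture. A small wrapper sketch (the image name and build context are placeholders):

```python
import subprocess

IMAGE = "registry.example.com/myapp:latest"  # placeholder image tag

# One-time per runner: register QEMU binfmt handlers so arm64 build steps can execute
# on an x86-64 host. Assumes a buildx builder using the docker-container driver is
# already selected (e.g. created once with `docker buildx create --use`).
subprocess.run(
    ["docker", "run", "--privileged", "--rm", "tonistiigi/binfmt", "--install", "arm64"],
    check=True,
)

# Build a single multi-arch manifest covering both platforms and push it.
subprocess.run(
    [
        "docker", "buildx", "build",
        "--platform", "linux/amd64,linux/arm64",
        "-t", IMAGE,
        "--push",
        ".",
    ],
    check=True,
)
```

Interpreted languages mostly just need the right multi-arch base image; compiled ones either cross-compile outside Docker (as described above) or pay the QEMU emulation cost during the build.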
Apple Silicon is ARM64 which is supported by Linux and Docker.
But Docker images don't necessarily have ARM64 support. If you are exclusively targeting x64 servers, it rarely makes sense to support both ARM64 and AMD64 platforms for development environment/tests, especially if the product/app is non-trivial.
Or, if you just want to create multi-arch images for your project, on your Mac...so that your non-Mac customers can use them.
I guess now it makes sense. Got 3 years to turn on ARM builds.
No, it still doesn't make sense.
And it looks like Rosetta 2 for containers will continue to be supported past macOS 28 just fine. It's Rosetta 2 for Mac apps that's being phased out, and not even all of that (they'll keep it for games that don't need macOs frameworks to be kept around in Intel format).
Parent doesn't want to merely run ARM64 Linux/Docker images. They want to run Intel images. Lots of reasons for that, from upstream Docker images not available to ARM64, to specific corporate setups you want to replicate as close as possible, or who aren't portable to ARM64 without huge effort.
Yeah but many people are using x86-64 Docker images because they deploy on x86-64. Maybe ARM clouds will be more common by that time.
Yep, this is another reason I've needed x86-64 images: although they should be technically the same when rebuilt for ARM, they aren't always, so using the same architecture of image that runs in production will sometimes catch edge-case bugs the ARM version doesn't. Admittedly it's not common, but I have had it happen. Obviously there is also the argument that the x86-64 image is being translated, so it isn't the same as production anyway, but I've found that to produce far fewer bugs than using the different architecture.
> Obviously there is also the argument that the x86-64 image is being translated, so isn't the same as production anyway
I've never seen this make a practical difference. I'm sure you can spot differences if you look for them (particularly at the hardware interface level) but qemu has done this for decades and so has apple.
Many container images are multi-arch, although probably not ones that are built in-house.
We built our in-house images multi-arch precisely for this reason!
I'm aware, I use ARM images all the time, I was trying to indicate that the usual refrain that the developers have had years to migrate their software to apple silicon, doesn't really apply to docker images. It's only the increase in use of ARM elsewhere (possibly driven by the great performance of macs running apple silicon) which has driven any migration of docker images to have ARM versions
That's not really the point though right? It means that pulling and using containers that are destined for x86 will require also building arm64 versions. Good news is buildx has the ability to build arm64 on x86, bad news is people will need to double up their build steps, or move to arm in production.
Without Rosetta I can't build x86_64 images anymore. Today I can set up an OrbStack amd64 Linux machine and build native amd64 images on my Mac to put on my servers.
What they talk about is Rosetta's macOS frameworks compiled for Intel being kept around (which macOS Intel apps use, like if you run some old <xxx>.app that's not available for Apple Silicon).
The low-level Rosetta as a translation layer (which is what containers use) will be kept, and they will even keep it for Intel games, as they say in the OP.
Ask the maintainers to build arm images. Realistically they should be, unless the project uses lots of x86 assembly.
It's not just images; any software the images pull down must also support ARM64 now as well. For example, the official Google Chrome binaries used by Puppeteer for headless browsing/scraping don't have a Linux ARM build.
This isn't about the virtualisation support - it's about all the Mac system frameworks being available in the rosetta environment
The performance that makes containers usable currently depends on Rosetta on Linux as well. Removing the support makes them much less usable.
The announcement doesn't actually say they are removing the Rosetta emulation. Rosetta 2 as a complete snapshot of macOS system frameworks is not the same thing as what is now called the virtualisation framework
Generally, speaking of Rosetta means Rosetta 2, since Rosetta 1 is long deprecated. It is very difficult to say what they mean.
The deprecation is mentioned in the context of the Rosetta translation environment [1]. Rosetta for Linux uses the same wording [2].
For example, Docker at least used to use this same binary translation internally a year ago (the same tech whose deprecation is being discussed). I don't know how it is today.
[1]: https://developer.apple.com/documentation/apple-silicon/abou...
[2]: https://developer.apple.com/documentation/virtualization/run...
Doesn't Orbstack or Colima solve this?
If you run x86 code without Rosetta (probably via QEMU), it will work, but painfully slowly.
How does this work currently? I was under the impression that Docker for Mac already ran containers in an x86 VM. Probably outdated info, but I’m curious when that changed.
Docker on Mac runs containers in a VM, but the VM is native to the CPU architecture and takes advantage of hardware virtualization.
You can of course always use QEMU inside that VM to run non-native code (e.g. x86 on Apple Silicon), however this is much slower than using Rosetta.
Surely, as it is on Linux, QEMU can take over here in running the x86 images on ARM.
Is it slow? Absolutely. But you'd be insane to run it in production anyway.
Wanting it to be fast is not just about "running it on production".
A test suite that becomes 10x slower is already a huge issue.
That said, it doesn't seem like Rosetta for container use is going anywhere. Rosetta for legacy Mac applications (the macOS-level layer) is.
That won’t be going away, none of that requires any support from the host OS.
Are you running this via that travesty of a desktop app?
Back to the QEMU dark ages
That means Steam will release a native Apple Silicon client circa 2028. Exciting!
Steam already has Apple Silicon build in Steam Beta channel.
So it has not been released yet. What is keeping Valve from releasing it? Apple is "giving a hand" now.
Ah, I guess it was wise for the original developer of Rosetta 2 to quit earlier this year. One of the people that I look up to.
https://news.ycombinator.com/item?id=42483895
My reasons for leaving Apple had nothing to do with this decision. I was already no longer working on Rosetta 2 in a day-to-day capacity, although I would still frequently chat with the team and give input on future directions.
Just went through that thread, I can't believe this wasn't a team of like 20 people.
It's crazy to me that Apple would put one guy on a project this important. At my company (another FAANG), I would have the CEO asking me for updates and roadmaps and everything. I know that stuff slows me down, but even without it, I don't think I could ever do something like this... I feel like I do when I watch guitar YouTubers: just terrible.
I hope you were at least compensated like a team of 20 engineers :P
History doesn't repeat, but it does rhyme: the initial (re)bootstrapping of OS X for Intel was done by one person, too.
https://www.quora.com/Apple-company/How-does-Apple-keep-secr...
Sometimes (often?), one very dedicated and focused person is better than a team of 20+. In fact companies would do well to recognize these situations and accommodate them better.
This is amazing. I wonder what it took to port MacOS from PowerPC to Intel. Every assembly language part must be rewritten, that’s for sure. Anything else?
I think a single 10x developer is really good for this kind of system programming projects.
Thank you for the clarification!
Thank you for your work!
Seems premature. My scanner software, ScanSnap, still regularly updated, requires Rosetta. ABBYY FineReader, the best Mac OCR, requires Rosetta. Although they may be related, as the ScanSnap software does OCR with the FineReader engine.
The M1 chip and Rosetta 2 were introduced in 2020. macOS 28 will be released in 2027. 7 years seems like plenty of time for software vendors to make the necessary updates. If Apple never discontinues Rosetta support, vendors will never update their software to run natively on Apple chips.
This is also consistent with Apple’s previous behavior with backwards compatibility, where Apple would provide a few years of support for the previous platform but will strongly nudge developers and users to move on. The Classic environment in Mac OS X that enabled classic Mac OS apps to run didn’t survive the Intel switch and was unavailable in Leopard even for PowerPC Macs, and the original Rosetta for PowerPC Mac OS X applications was not included starting with Lion, the release after Snow Leopard.
Honestly, for apple this is above and beyond. They've killed support with less fanfare and compatibility support than what we see here.
Bully on me for owning hardware and expecting it to behave consistently across OTA updates.
I think you probably should not buy Apple hardware. It is not a guarantee they have ever offered that their software would behave consistently across updates. If this mattered to me, I would have done some research and rapidly found out that Apple has done this every few years for the last 30 years.
But their new hardware is so good though, it's kind of hard to pass up
The hardware isn't (as far as I'm aware) changing. Please don't move the goalposts for hardware ownership (we should just be able to do with our hardware as we please) to also include indefinite support from vendors. That just makes us look like childish crybabies.
If you were instead asking for hardware documentation, or open-sourcing of Rosetta once it's sunset, then we're on the same team.
I never asked for an infinite window of software support, though. I merely want the features that I had when I bought the laptop, for as long as the OS supports my machine. The response is always "blame the third parties" when apps break, but oftentimes the devs have already made their money and moved on. The onus is on Apple to support their OS's software if they want my money.
Open-sourcing is one solution, but knowing Apple it's not a likely one. Their "we know best" mindset is why I quit dailying Macs entirely; it's not sustainable outside the mobile dev business. A computer that supports 32-bit binaries, OpenGL, or x86 translation when you bought it should be able to retain that capability into the future. Anything less is planned obsolescence, even if you want to argue there's a silver lining to introducing new tech. New tech should be competitive on merits, not because its competitor was forcibly mutilated.
> The onus is on Apple to support their OS' software if they want to have my money
Apple has done this exact same thing for every architecture change and every API they sunset, but you gave them your money anyway. Their history of discontinuing software support and telling users to harangue third-party devs isn't exactly a secret.
Sure, then don't update.
At what point in history have you owned a particular piece of hardware for use with a particular piece of never-to-be-updated software and installed a major OEM operating system release a full 7 years after release without issue?
I doubt such a thing has ever happened in the history of consumer-facing computing.
Have you ever heard of Windows? Unlike Apple, they do care about backwards compatibility, and don’t randomly go removing features users depend on.
Also they aren't the only ones, this is pretty much standard across commercial UNIXes (the survivors), embedded OSes, mainframes and micros.
and the consequences are dire
Are they? IMO Windows going downhill has more to do with what is being added to it than what it is preserving compatibility for.
> At what point in history have you owned a particular piece of hardware for use with a particular piece of never-to-be-updated software and installed a major OEM operating system release a full 7 years after release without issue?
Linux users do it all the time with WINE/Proton. :-)
Before you complain about the term 'major OEM operating system'; Ubuntu is shipped on major OEMs and listed in the supported requirements of many pieces of hardware and software.
> I doubt such a thing has ever happened in the history of consumer-facing computing.
Comments like this show how low standards have fallen. Mac OS X releases have short support lengths. The hardware is locked down; you need a massive RE effort just to get Linux to work. The last few gens of x86 Mac hardware weren't locked down as much, but they were still locked down. M3 and M4 still do not have a working installer. None of this is funded by Apple, as far as I know, to get Linux or Windows on ARM working on the hardware.
In comparison, my brother-in-law found an old 32-bit laptop that had Windows 7. It forced itself, without his approval, to update to Windows 10. That gave it 10 years of support from Microsoft on Windows 10 alone; starting from 7 pushed that to... hmm... 13+ years of support?
> Linux users do it all the time with WINE/Proton. :-)
And there’s a near 100% chance you’ll have to recompile/download pre-re-compiled binaries if moving to a completely different architecture. Same here.
Not sure what you are saying. If you're saying you need the gamedev to recompile for ARM, you can run a translation layer instead, just like on Mac and Windows. My friend has had the best results with: https://fex-emu.com/
Not the same here. The user didn't have to get different binaries when they changed hardware, and that was a big selling point for the hardware. And now it's going to break in an arbitrary software update.
> At what point in history have you owned a particular piece of hardware for use with a particular piece of never-to-be-updated software and installed a major OEM operating system release a full 7 years after release without issue?
> I doubt such a thing has ever happened in the history of consumer-facing computing.
Come on. I've done that and still do: I use an ancient version of Adobe Acrobat that I got with a student discount more than 10 years ago to scan documents and manipulate PDFs. I'd probably switch to an open source app, if one were feature comparable, but I'm busy and honestly don't have the time to wade through it all (and I've got a working solution).
Adobe software is ridiculously overpriced, and I'm sure many, many people have done the same when they had perpetual-use licenses.
> At what point in history have you owned a particular piece of hardware [...] and installed a major OEM operating system release a full 7 years after release without issue?
A few years ago, I installed Windows 10 on a cheap laptop from 2004—the laptop was running Windows XP, had 1GB of memory, a 32-bit-only processor, and a 150GB hard drive. The computer didn't support USB boot, but once I got the installer running, it never complained that the hardware was unsupported.
To be fair, the computer ran horrendously slow, but nothing ever crashed on me, and I actually think that it ran a little bit faster with Windows 10 than with Windows XP. And I used this as my daily driver for about 4 months, so this wasn't just based off of a brief impression.
Windows 95 was released... well, in 1995. In 2025 you can run apps targeting W95 just fine (and many 16-bit apps with some effort)
> In 2025 you can run apps targeting W95 just fine (and many 16-bit apps with some effort)
FWIW, Windows running on a 64-bit host no longer runs 16-bit binaries.
Yes. Still, there are ways to do it anyway, from DOSBox to WineVDM. Unlike macOS, where having even a 32-bit app (e.g. half of the Steam games that supported macOS to begin with) means you're fucked.
You can use dosbox and x86 virtual machines just fine in macOS (with the expected performance loss) right now, without Rosetta. macOS is still Turing complete.
Technically speaking, you can run anything on anything since this stuff is Turing complete. Practically speaking, however....
E.g. I have half of the macOS games in my Steam library as 32-bit Mac binaries. I don't know a way to launch them at any reasonable speed. The best way to do it is to ditch the macOS version altogether and emulate the Win32 version of the game (which will run at a reasonable speed via Wine forks). Somehow the Win32 API is THE most stable ABI layer for Linux & Mac.
> my steam library as a 32-bit mac binaries. I don't know a way to launch them at any reasonable speed.
To be fair, it's the emulation of x86-32 with the new ARM64 architecture that causes the speed problems. That transition is also why MacBooks are the best portables, in terms of efficiency, that you can buy right now.
All ARM chips have crippled x86-32 performance, because they're not x86-32 chips. You'll find the same (generally worse) performance issues trying to run ARM64 code with x86-64.
Rosetta 2 is pretty good at running x86-32. There are more registers on the destination, after all.
>Windows running on a 64-bit host no longer runs 16-bit binaries.
Which isn't an issue since Windows 95 was not a 16-bit OS, that was MS-DOS. For 16-bit DOS apps there's virtualization things like DOSbox or even HW emulators.
This isn't a new or unique move; Apple has never prioritized backwards compatibility.
If you're a Mac user, you expect this sort of thing. If running neglected software is critical to you, you run Windows or you keep your old Macs around.
It's a bizarre assumption that this is about "neglected software."
A lot of software is for x64 only.
If Rosetta2 goes away, Parallels support for x64 binaries in VMs likely goes away too. Parallels is not neglected software. The x64 software you'd want to run on Parallels are not neglected software.
This is a short-sighted move. It's also completely unprecedented; Apple has dropped support for previous architectures and runtimes before, but never when the architecture or runtime was the de facto standard.
https://docs.parallels.com/parallels-desktop-developers-guid...
Parallels x86_64 emulation doesn't depend on Rosetta.
> If Rosetta2 goes away, Parallels support for x64 VMs likely goes away too.
Rosetta 2 never supported emulating a full VM, only individual applications.
You're right. It looks like the new full VM emulation in 20.2 doesn't use Rosetta.
https://www.parallels.com/blogs/parallels-desktop-20-2-0/
Nevertheless, running x64 software including Docker containers on aarch64 VMs does use Rosetta. There's still a significant valid use case that has nothing to do with neglected software.
Edited my post above. Thanks for the correction.
The OP only applies to Rosetta for running x64 Mac apps, not running x64 Linux software in aarch64 Linux VMs.
I seem to remember 68k software working (on PowerPC Macs) until Classic was killed off in Leopard? I'm likely misremembering the length of time, but it seems like that was the longest backwards-compatibility streak Apple had.
Just because Microsoft does one thing doesn't mean Apple has to do the same.
That's not a good thing for other reasons; e.g. there are a lot of inconsistencies in modern Windows, like pieces of Windows 3.1 still in Windows 11.
There are leftovers from older versions of macOS and severely neglected apps in Tahoe too. Sure, they might have been given a new icon, or adopted the new system styling, but they have not been updated for ages.
There's a lot of Win95 software that you can't run too. Microsoft puts a lot of work into their extensive backlog of working software. It's not just "good engineering" it's honest to god fresh development.
That's not necessarily a good thing.
The main problem is not native software, but virtualization, since ARM64 hardware is still quite uncommon for Windows/Linux, and we need Rosetta for decent performance when running AMD64 in virtual machines.
There is lots of existing software (audio plugins, games, etc.) that will never see an update. All of that software will be lost. Most new software has ARM or universal binaries. If some vendors refuse to update their software, it's their problem. Windows still supports 32-bit applications, yet almost all new software is 64-bit.
I think this is exactly what they're issuing this notice to address. Rosetta performs so well that vendors are pretty okay just using it as long as possible, but a two year warning gives a clear signal that it's time to migrate.
If it's ok now then what's even the problem with letting it be?
One problem from Apple’s perspective is that it continues to cost them money to maintain both the translation layer and the x86_64 frameworks on an ongoing basis.
I mean, is it really an excessive burden to keep a "too popular" feature alive for users? Features users pay for cost money to build and maintain. These aren't unique situations.
It would be different if the feature wasn't popular at all but that doesn't seem to be the case.
It doesn't seem especially popular to me, so... citation needed? It's not being discontinued for being too popular, that's for sure.
Apple doesn't want to maintain it forever, and a handful of legacy apps will never be bothered to update to native Apple Silicon support unless it means losing access to their user base. Apple has given them plenty of time to do it naturally, and now Apple is giving them a stronger reason and a couple more years to get it done. Apple is not randomly discontinuing it with no notice; two years is plenty of time for maintained software to get over the finish line.
At the end of the day, Apple doesn't want to pay to maintain this compatibility layer for forever, and Apple's customers will have a better experience in the long run if the software they are using is not running through an extra translation layer.
There will always be some niche users who want this feature to remain forever, but it's clearly not a significant enough percentage of users for Apple to be worried about that, or else Apple would maintain it forever.
I usually agree with Apple but I don't agree with this. Rosetta 2 is basically magic; why would they take away one of their own strongest features? If they want big-name apps to compile for Apple Silicon, why can't they exert pressure through their code-signing process instead?
The “big name apps” have already moved to Apple Silicon. Rosetta helped them with that process a few years ago. We’re down to the long tail apps now. At some point, Rosetta is only helping a couple people and it won’t make sense to support it. I just looked, and right now on my M1 Air, I have exactly one x86 app running, and I was honestly surprised to find that one (Safari plug-in). Everything else is running ARM. My workload is office, general productivity, and Java software development. I’m sure that if you allow your Mac to report back app usage to Apple, they know if you’re using Rosetta or not, and if so, which apps require it. I suspect that’s why they’re telegraphing that they are about ready to pull the plug.
How do you check if you're running any x86 apps?
1. From the Apple menu, click "About This Mac."
2. In the resulting window, click the "More Info..." button. This will open the System Settings window.
3. Scroll to the bottom of that window and click "System Report."
4. In the left side of the resulting window, under "Software," click "Applications." This will provide a list of installed applications. One of the columns for sorting is "Kind"; all apps that are x86 will be listed with the kind, "Intel."
You can replace steps 1–3 with “Open /System/Applications/Utilities/System Information.app”.
Adobe Acrobat, Steam, and PDF Reader Pro...
To see what’s running,
1. Go into Activity Monitor
2. From the CPU or memory tab, look at the “Kind” column. It’ll either say “Apple” or “Intel.” If the Kind column isn’t visible, right-click on the column labels and select Kind.
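If you'd rather check programmatically, macOS exposes a per-process flag, `sysctl.proc_translated`, that reports whether the calling process is running under Rosetta. A small sketch via ctypes (this checks only the process running the script, not other apps):

```python
import ctypes
import ctypes.util

libc = ctypes.CDLL(ctypes.util.find_library("c"))

def running_under_rosetta() -> bool:
    # sysctl.proc_translated is 1 when the current process is translated by
    # Rosetta, 0 when running natively; the call fails on systems without the key.
    val = ctypes.c_int(0)
    size = ctypes.c_size_t(ctypes.sizeof(val))
    ret = libc.sysctlbyname(
        b"sysctl.proc_translated",
        ctypes.byref(val),
        ctypes.byref(size),
        None,
        ctypes.c_size_t(0),
    )
    return ret == 0 and val.value == 1

print("translated" if running_under_rosetta() else "native (or unknown)")
```

Tools and launchers sometimes use this check to warn users that they've accidentally installed the Intel build of something on an Apple Silicon machine.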
In macOS 26, you can see every Rosetta app that has recently run on your machine by going to System Information and then Software / Rosetta Software. It includes the "Fallback Reason" (e.g. if you manually forced the app under Rosetta or if it was an Intel-only binary).
FWIW, I have zero Rosetta apps on my M1 laptop and I've been a Mac user since the earliest days.
I'm super aware of the issues involved--I oversaw the transition from PPC to Intel at a university back in the day, using OG Rosetta. Even then, we had users who would only stop using their PPC apps when you took them from their cold, dead hands.
There's this Silicon app that scans your disk for them: https://github.com/DigiDNA/Silicon.
How much die area does it use that could be used for performance? How much engineering time does it use? Does it make sense to keep it around, causing ~30% more power usage/less performance?
There are many acceptable opposing answers, depending on the perspective of backwards compatibility, cost, and performance.
My naive assumption is that, by the time 2027 comes around, they might have some sort of slower software-only emulation that is on par with, say, M1-era Rosetta performance.
Rosetta is a software translation layer, not a hardware translation layer. It doesn't take any die space.
Hardware acceleration [1]:
> One of the key reasons why Rosetta 2 provides such a high level of translation efficiency is the support of x86-64 memory ordering in the M1 SoC. The SoC also has dedicated instructions for computing x86 flags.
[1] https://en.wikipedia.org/wiki/Rosetta_(software)
While true, we're not talking about the chips losing TSO; Apple plans to keep Rosetta 2 for games and it has to remain fast because, well, it's video games. It also seems like they plan to keep their container tool[1]. This means they can't get rid of TSO at the silicon level and I have not heard this discussed as a possibility. We're only discussing the loss of the software support here. The answer to "How much die area does it use that could be used for performance?" is zero--they have chosen to do a partial phase-out that doesn't permit them to save the die space. They'd need to kill all remaining Rosetta 2 usage in order to cull the die space, and they seem to be going out of their way not to do this.
[1] https://github.com/apple/container -- uses Rosetta translation for x64 images.
> We're only discussing the loss of the software support here
Schematically "Rosetta 2" is multiple things:
- hardware support (e.g TSO)
- binary translation (AOT + JIT)
- fat binaries (dylibs, frameworks, executables)
- UI (inspector checkbox, arch(1) command, ...)
My bet is that, beyond the fancy high-level "Rosetta 2" label, what will happen is that they'll simply stop shipping fat x86_64+aarch64 system binaries and frameworks[0], while the rest remains.
[0]: or rather, heavily cull
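On the arch(1) point in that list: it's the user-visible switch for forcing the x86_64 slice of a universal binary through Rosetta. A small Python sketch that demonstrates it, assuming /usr/bin/uname is still shipped as a fat binary (it is on current releases):

    #!/usr/bin/env python3
    # Sketch: run uname -m natively, then force the x86_64 slice via arch(1).
    # With Rosetta installed the second call prints "x86_64"; without it, arch(1)
    # fails with a "Bad CPU type"-style error.
    import subprocess

    native = subprocess.run(["uname", "-m"], capture_output=True, text=True)
    forced = subprocess.run(["arch", "-x86_64", "uname", "-m"], capture_output=True, text=True)

    print("native:", native.stdout.strip())
    if forced.returncode == 0:
        print("forced x86_64:", forced.stdout.strip())
    else:
        print("x86_64 slice unavailable (Rosetta not installed?)")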
> Rosetta is a software translation layer, not a hardware translation layer. It doesn't take any die space.
There is hardware acceleration in place that exists only to give it, as you just stated, acceptable performance.
It does take up die space, but they're going to keep it around; they've simply decided to reduce the types of applications that Rosetta 2 (and the hardware that exists only for it) will support.
So, seems like they've decided they can't fight the fact that gaming is a Windows thing, but there's no excuse for app developers.
Sure, this seems to be a restatement of my post, which started with "While true...", rather than a disagreement. I was pointing out which one of the "many acceptable opposing answers" Apple had chosen. They can't use that die area for performance because they're still using it even after this phase-out. (I'm not the person who wrote the original post.)
So, the way to "use die area for performance" is to add more cache and branch-predictor space. Because of that, anything that adds a lot of code size does effectively consume that area, because it eats into the cache.
You can most likely use VueScan [1]; I use that with an old ScanSnap i500 (or something).
[1] https://www.hamrick.com
Love VueScan for my film scanner!
They were pretty quick to sunset the PPC version of Rosetta as well. It forces developers to prioritize making the change, or making it clear that their software isn't supported.
The one I have my eye on is Minecraft. While not mission critical in any way, they were fairly quick to update the game itself but failed to update the launcher. Last time I looked at the bug report, it was closed and someone had to re-open it. It’s almost like the devs installed Rosetta 2 and don’t realize their launcher is using it.
Rosetta for PPC apps was supported from the first Intel Macs released in January 2006 until 10.7 Lion was released in July 2011.
So just over five years? If Apple phases out Rosetta 2 in macOS 28, then it will have been supported for seven years.
Owning a Mac has always meant not relying on 3P software. Forget printer/scanner drivers. Even if they target macOS perfectly, there will come a day when you need to borrow a Windows PC or old Mac to print.
It happens to be OK for me as a SWE with basic home uses, so I'm their exact target user. Given how many other people need their OS to do its primary job of running software, idk how they expect to gain customers this way. It's good that they don't junk up the OS with absolute legacy support, but at least provide some kind of emulation, even if it's slow.
I spent what I would consider to be a lot of money for a unitasker Fujitsu scanner device and am just astounded by how unmaintained and primitive the software is. I only use it on a Windows machine though, so I'm not in the same boat.
This is Apple's "get your shit together and port to ARM64, you have 2 years" warning.
If you're not willing to commit to supporting the latest and greatest, you shouldn't be developing for Apple.
Phasing out Rosetta 2 seems like a reasonable move. Maintaining backward compatibility indefinitely adds complexity and technical debt. Apple has supported Intel-based systems for a long time, and this step aligns with their goal of keeping macOS streamlined for Apple Silicon.
Backwards compatibility may be many things, but it is never technical debt.
This seems to basically only apply to full-fledged GUI apps and excludes e.g. games, so potentially stuff like Rosetta for CLI isn't going anywhere either
But games are full fledged GUI apps. At a minimum they have a window.
It’s really unclear what it means to support old games but not old apps in general.
I would think the set of APIs used by the set of all existing Intel Mac games probably comes close to everything. Certainly nearly all of AppKit, OpenGL, and Metal 1 and 2, but also media stuff (audio, video), networking stuff, input stuff (IOHID etc).
So then why say only games when the minimum to support the games probably covers a lot of non games too?
I wonder if their plan is to artificially limit who can use the Intel slices of the system frameworks? Like hardcode a list of blessed and tested games? Or (horror) maybe their plan is to only support Rosetta for games that use Win32 — so they’re actually going to be closing the door on old native Mac games and only supporting Wine / Game Porting Toolkit?
Games use a very small portion of the native frameworks. Most would be covered by Foundation, which they have to keep working for Swift anyway (Foundation is being rewritten in Swift) and just enough to present a window + handle inputs. D3DMetal and the other translation layers remove the need to keep Metal around.
That’s a much smaller target of things to keep running on Intel than the whole shebang that they need to right now to support Rosetta.
I don’t agree. My point is their collective footprint in terms of the macOS API surface (at least as of 2019 or so) is pretty big. I’m not just speculating here, I work in this area so I have a pretty good idea of what is used.
Could you give examples at least of what you think that big collective footprint might include?
Bear in mind that a large chunk of Mac gaming right now that needs translation are windows games translated via crossover.
As I said in my first comment, it's at least Cocoa (Foundation + AppKit), AVFoundation, Metal, OpenGL, and then all of the lower level frameworks and libraries those depend on (which may or may not be used directly by individual games). If you want a concrete example from something open source, go look at what SDL depends on, it's everything I listed and then some. It's also not uncommon for games to have launchers or startup windows that contain additional native UI, so assume you really do need all of AppKit, you couldn't get away with cutting out something like NSTableView or whatever.
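If you want to check that claim against a specific title, otool -L will dump every framework a Mach-O binary links against. A quick Python sketch (the game path is just a placeholder; point it at Contents/MacOS/<executable> of any Intel-era game, or at a built SDL dylib):

    #!/usr/bin/env python3
    # Sketch: list the system frameworks a binary links against, using otool -L.
    import subprocess
    import sys

    binary = sys.argv[1] if len(sys.argv) > 1 else "/path/to/Game.app/Contents/MacOS/Game"
    out = subprocess.run(["otool", "-L", binary],
                         capture_output=True, text=True, check=True).stdout

    frameworks = sorted({
        line.split()[0]                      # first token is the install name/path
        for line in out.splitlines()[1:]     # skip the header line (the binary itself)
        if ".framework/" in line
    })
    print("\n".join(frameworks))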
So my point remains, if Apple has to continue providing Intel builds of all of these frameworks, that means a lot of other apps could also continue to run. But ... Apple says they won't, so how are they going to accomplish this? That's the mystery to me.
If you'd like to see an interesting parallel, go look at how Microsoft announced supporting DirectX 12 on Windows 7 for a blessed apps list - basically because Blizzard whined hard enough and was a big enough gorilla to demand it.
That's one implementation, yeah, just have a list somewhere of approved software and make an artificial limitation. But their announcement is so vague, it's hard to say.
And then the next question is why? It's not like they've ever promised much compatibility for old software on new macOS. Why not let it be just best effort, if it runs it runs?
I’ve always been amazed by Rosetta, such an incredible piece of engineering. But I wonder if we’ll ever see its source code opened up.
It feels like keeping it alive could really help long-term x64 support on Apple Silicon, even if Apple decides to move on.
I'd also love to see the source code of the embedded M68K emulator for PPC Macs. I believe there are two versions -- one interpreter style and one dynarec style.
Check out Asahi Linux, they run on Apple Silicon and have translation for 32 and 64 bit x86, so they even go further than what Rosetta achieved. Open Source as well.
For those unfamiliar with Apple’s new version-numbering system, this is the version that will be released in 2027, presumably around September or October of that year.
Hopefully this means macOS 27 will be a Snow Leopard type release to focus on bug fixes, performance, and the overall experience, rather than focusing on new features.
No. Only Steve Jobs could have pulled this off.
Modern day Apple cannot. A bugfix-only release is not going to sell anything.
Why would it mean that?
It's a myth that Snow Leopard was a bug fix release. Mac OS X 10.6.0 was much buggier than 10.5.8, indeed brought several new severe bugs. However, Mac OS X 10.6 received two years of minor bug fix updates afterward, which eventually made it the OS that people reminiscence about now.
Apple's strict yearly schedule makes "another Snow Leopard" impossible. At this point, Apple has accumulated so much technical debt that they'd need much more than 2 years of minor bug fix updates.
https://lapcatsoftware.com/articles/2023/11/5.html
> It's a myth that Snow Leopard was a bug fix release.
> Mac OS X 10.6.0 was much buggier than 10.5.8
Somebody who worked on Snow Leopard has already disagreed with you here about those things:
> As the person who personally ran 10.6 v1.1 at Apple (and 10.5.8), you are wrong(ish).
> Snow Leopard's stated goal internally was reducing bugs and increasing quality. If you wanted to ship a feature you had to get explicit approval. In feature releases it was bottom up "here is what we are planning to ship" and in Snow Leopard it was top down "can we ship this?".
> During that time period my team and I triaged every single Mac OS X bug coming into the company every morning. Trust me, SL was of higher quality than Leopard.
— https://news.ycombinator.com/item?id=43431675#43439348
> Apple's strict yearly schedule makes "another Snow Leopard" impossible. At this point, Apple has accumulated so much technical debt that they'd need much more than 2 years of minor bug fix updates.
I don’t think the schedule matters. They just over-commit every time. I said elsewhere:
> [Apple] were never building and have never built software at a sustainable pace, even before the yearly cadence. They race ahead with tech debt then never pay it off, so the problem gets progressively worse.
> A while back, that merely manifested as more and more defects over time.
> More recently, they began failing to ship on time and started pre-announcing features that would ship later.
> And now they’ve progressed to failing to ship on time, pre-announcing features that would ship later, and then failing to ship those features later.
> This is not the yearly cadence. This is consistently committing to more than they are capable of, which results in linear growth of tech debt, which results in rising defects and lower productivity over time. It would happen with any cadence.
— https://news.ycombinator.com/item?id=43436105
> Somebody who worked on Snow Leopard has already disagreed with you here about those things:
It's instructive to read the entire thread, not just the few sentences you quoted. For example, that person later admits, "So yeah, if you are comparing the most stable polished/fixed/stagnant last major version with the brand new 1.0 major version branch, the newer major is going to be buggier. That would be the case with every y.0 vs x.8."
> I don’t think the schedule matters. They just over-commit every time.
That's a distinction without a difference. Apple has committed to releasing major OS updates every year on schedule. That's a recipe for over-commitment, because they need to produce enough changes to market it as a major release.
The "no new features" gimmick of Snow Leopard was a marketing lie but was also unique. It's a gimmick that Apple pulled only once, and it couldn't be repeated frequently by Apple without making a mockery of the whole annual schedule. Maybe they could do it a second time now, but in general the annual schedule is still a major problem for a number of reasons.
It should also be noted that Snow Leopard itself took 2 years to produce after Leopard.
Not sure why you’re downvoted because you’re right.
Snow leopard brought a huge amount of under the covers features. It was a massive release. The only reason it had that marketing was because they didn’t have a ton of user facing stuff to show
That is more or less what users asking for another Snow Leopard want: a release that doesn't have gratuitous UI churn and superficial changes, doesn't break the end user's muscle memory, but instead focuses on deep-seated and long-standing issues under the hood. If the right thing for the OS in the long term is to replace an entire subsystem instead of applying more band-aid fixes, then take the time to do a proper job of it.
lapcat loves his straw man about OS X 10.6.0 having plenty of bugs, but that misses the point of Snow Leopard. Of course a release that makes changes as fundamental as re-writing the Finder and QuickTime to use the NeXT-derived frameworks rather than the classic Mac OS APIs, and moving most of the built-in apps to 64-bit, is going to introduce or uncover plenty of new bugs. But it fixed a bunch of stubborn bugs and architectural limitations, and the new bugs mostly got ironed out in a reasonable time frame. (Snow Leopard was probably one of the better examples of Apple practicing what they preach: cleaning out legacy code and modernizing the OS and bundled apps the way they usually want third-party developers to do to their own apps.)
Fixing architectural bugs is still fixing bugs—just at a deeper level than a rapid release schedule driven by marketable end-user features easily allows for.
> a release that doesn't have gratuitous UI churn and superficial changes
There have actually been quite a few of those releases. Some of the California-themed updates have been practically indistinguishable from the previous versions. Of course Tahoe and Big Sur brought huge UI changes, but those are the exceptions, not the norm.
> focuses on deep-seated and long-standing issues under the hood
Which issues would those be, specifically?
> If the right thing for the OS in the long term is to replace an entire subsystem
Which subsystems need replacement? You claim that this is what people mean by wanting another Snow Leopard, but which subsystems do people want replaced?
> misses the point of Snow Leopard
I haven't missed the point of Snow Leopard. You're conflating two entirely different things: (1) the point of Snow Leopard as conceived by Apple in 2008-ish and (2) why people in 2025 look back fondly at Snow Leopard. My claim is that the fond memories are the result of the quality and stability that were themselves the result of 2 full years of bug fixes AFTER the initial release of Snow Leopard. Whereas the initial quality of Snow Leopard was not great, just like the initial quality of all major OS updates is not great. Major updates invariably make software buggier, and the quality comes only after much time spent refining the new stuff.
My contention is that the marketing lie of "no new features", which is naturally very memorable, is the reason that a lot of people associate Snow Leopard with bug fixes and quality, but that's not actually what 10.6.0 brought, and the quality came much later in time.
I'm not saying that Snow Leopard didn't bring valuable changes. I'm just saying that Snow Leopard existed in various stages over 2 years, and the high quality version of Snow Leopard that we remember fondly now is actually late-stage Snow Leopard, not early-stage Snow Leopard, and those 2 years of minor bug fix releases were crucial. Moreover, that's what we need now, a long series of minor bug fix updates, not any new major updates. The bug backlog has become a mountain.
> Of course a release that makes changes as fundamental as re-writing the Finder and QuickTime to use the NeXT-derived frameworks rather than the classic Mac OS APIs, and moving most of the built-in apps to 64-bit, is going to introduce or uncover plenty of new bugs.
Which is why I think it's very wrong to claim that people want "another Snow Leopard". Snow Leopard II released in 2026 would be much buggier than even macOS Tahoe, which is precisely what people do NOT want, a bunch more bugs.
> But it fixed a bunch of stubborn bugs
Which bugs exactly?
> Fixing architectural bugs is still fixing bugs
Which architectural bugs do you have in mind, or more relevantly, which architectural bugs do people in general have in mind when saying that they want another Snow Leopard?
RIP a ton of older audio plugins.
I lost access to decades of my albums, which can no longer open on my MacBooks. Some open partially when running Ableton Live with Rosetta. My record label recently reached out asking for stems for an old song for a sync deal with Rocket League; after spending a week trying to revive the old sessions, I concluded that it was impossible and they were forever lost thanks to Apple's complete abandonment of backwards compatibility. It's heartbreaking really.
Could you not open the project on a windows computer or older mac?
I also think the current Native Instruments launcher, "Native Access", still requires Rosetta for the installation :)))
I've already lost my "studio" (a few appliances in the corner of my room) due to the upgrade from Windows 7 to 10. Now it will happen again after I migrated to Mac. I guess the "studio" should be left alone when it comes to upgrades. I'm starting to believe that a "studio" is a set of software AND hardware, so I guess I won't sell my Mac to buy a new one, but rather maintain it with the given software and hardware on it, and maybe just unplug it from the internet.
-- EDIT --
or just move back to windows, but I can't imagine it with the current state of AI bloat
It's just a choice between competent AI bloat (Microsoft) vs. laughable non-functional AI bloat (Apple).
Photoshop plugins also.
macOS has been sending mixed signals to musicians since Catalina. I'd be surprised if people are still seriously using it for studio work.
There are tons of musicians on Mac, and it gets lots of studio use. I'd say at least 50% of music studios are on Macs from what I've seen.
For sure. But I'd be surprised if a significant number of those setups were running recent versions of Mac OS, especially in older studios. Stability is preferable to new features since old studio hardware is often very reliable and studio engineers are wary of ruining compatibility with system upgrades
I can just imagine the Apple statement, like they did with flash/Flash.
‘We fully support the Studio.’
Edit: After hunting around without success, I’m now doubting my memory. I thought I could remember Jobs dismissively replying to a question about Adobe Flash that Apple supported flash (memory). Maybe I made that up?
I guess this is another way of Apple saying x86 is dead. I would have loved it if Intel and AMD had joined forces to open up x86. Instead they are following the same path as POWER, likely doing it when it is too little, too late.
Bring back Rosetta 1.
User mode emulation for PPC and Intel Mac apps.
That means the end of the Hackintosh era: if the OS won't run x86, I imagine it won't install on x86 either.
Tahoe is officially the last version to support x86.
I never liked the idea, either get Apple, or get one of the other OSes.
It was like getting a Fiat Coupe with a Ferrari logo.
Aw that bums me out, brings back a lot of memories. Though I assume it’s been effectively dead for a while.
I haven’t dabbled with hackintoshes in nearly a decade, I stepped away around the time iMessage started needing those extensive hacks to work. Things seemed to shift away from driver/bootloader gaps to faking Apple hardware. Years earlier, I had an Asus Eee PC (remember “netbooks”?) that ran macOS without any major issues. I even built a machine that I believed I could hackintosh easily, though it never quite worked as well as I hoped.
The era of random companies selling pre-built Hackintoshes was so cool. Kids these days probably wouldn’t even believe it if you told them, like how Netflix used to actually send you a DVD in the mail. :)
This is very frustrating. As if they couldn't afford to continue it. And at the same time they keep making the system more and more closed, so that you can't even run applications without Apple's permission. I don't understand why people still buy such products.
> And at the same time they keep making the system more and more closed, so that you can't even run applications without Apple's permission.
This is simply not true.
> This is simply not true.
Ok, then try to run a pre-compiled macOS M1-compatible application on your new Sequoia system, such as https://github.com/rochus-keller/oberonsystem3/ or https://github.com/rochus-keller/leancreator/. It requires quite some tricks to get at least some applications running without Apple's blessing, but the tricks don't work for all such applications; and by the looks of it, they will also remove the last remaining workarounds in the future.
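For context, the "tricks" usually meant here (this is my reading, not something those projects document) are clearing the quarantine attribute that Gatekeeper checks and ad-hoc re-signing the downloaded bundle. A sketch of both steps; the app path is a placeholder, and whether this is enough varies by app:

    #!/usr/bin/env python3
    # Sketch of the common workarounds for running an unsigned/unnotarized app bundle:
    # remove the com.apple.quarantine attribute, then apply an ad-hoc code signature.
    import subprocess
    import sys

    app = sys.argv[1] if len(sys.argv) > 1 else "/Applications/SomeDownloadedApp.app"

    # Strip the quarantine flag recursively so Gatekeeper doesn't block the launch.
    subprocess.run(["xattr", "-dr", "com.apple.quarantine", app], check=False)

    # Ad-hoc sign ("-" identity) so the binary isn't rejected as having an invalid signature.
    subprocess.run(["codesign", "--force", "--deep", "--sign", "-", app], check=False)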
Uhhh, “don’t work for all applications” needs more context here. What the hell are you talking about?
Running Windows in Parallels. Even when running the ARM version of Windows, you still need Rosetta to run Windows x86 binaries.
Ok, application developers will maybe update by then.
But this is another way for Apple to say "do not trust us for your gaming needs no matter what PR says".
From the page, inside a large block marked “Important”:
> Beyond this timeframe, we will keep a subset of Rosetta functionality aimed at supporting older unmaintained gaming titles, that rely on Intel-based frameworks.
Does this mean the end of macOS Wine gaming? From what I know, you need to be using the x86_64 build of Wine on macOS to run x86-built Windows games.
No, follow the link. The article states that a subset of Rosetta 2 will remain for things like games.
Maybe the M7 CPU will run QEMU emulation of x86 fast enough for Rosetta not to be required.
Would be nice if they open sourced Rosetta, so that the community could continue support.
By the time this happens, it will have been a 7 year transition. That isn't too bad considering the original Rosetta only got 5.
I do have sympathy for those that still use this in their daily work flow, but also... this is Apple. This is how they have always rolled.
Meanwhile, I can still run my apps from the 90’s on my ARM laptop on Windows. That’s two architectures back to be clear: ARM64 -> x86-64 -> x86
Hopefully this will finally push Sonos to produce an Apple Silicon binary.
This is awful. I love playing games on my MBP, and the latest CrossOver releases have been amazing in their ability to play almost all Windows PC games at full speed. Losing Rosetta means CrossOver is dead.
You would hope that apple would open source it, but they are one of the worst companies in the world for open sourcing things. Shame on all their engineers.
From the OP: "Beyond [the two-year] timeframe, we will keep a subset of Rosetta functionality aimed at supporting older unmaintained gaming titles, that rely on Intel-based frameworks."
What about the newer games that are maintained, just not supported on anything but windows?
Isn’t that part of Rosetta also used in their own Game Porting Toolkit?
mac for gaming is just not a good idea
What are you talking about? There's Do I have enough RAM to run Slack, all my Chrome tabs, and a terminal program; there's what terminal program shall I run today: Ghost edition; there's Can I get Colima to run, now with docker DLC. There's Kubernetes on Mac: Kind edition; there's Let's with Tart!; Nix is for Ops: New and more obtuse config edition. With so many fun games to play, who's got time for anything else?
Fortunately, whoever has money for a Mac can also afford hardware that will actually run games.
A Macbook (air) is no longer a crazy expensive unobtainable thing, just a perfectly reasonable mid-price choice for most people.
Whereas a good graphics card alone is still insane money.
Just use linux. You learn it once and it works forever.
Linux is so not polished.
Just a few days ago something updated and my virtual desktop switching now behaves erratically. I'm pressing <Super>+<1>, it changes to desktop 1 with vscode open, and immediately it starts typing "1" into vscode. It seems to be a bug with all X applications. I fixed it for vscode by making it work under Wayland, but now it doesn't draw a border around the vscode window. Another irritation, and I have other X apps.
It works, it's free, I love it. But it's so not polished and it'll never be. I miss macOS polish, where basic things just work.
> virtual desktops > vscode > wayland
Sounds like you have a misconfigured system. Jokes aside, this looks like a bug in your WM. Macs may be more polished, but my point was not about polish.
If only this were true.
Stuff in Linux changes. Not quite as frequently, but it does change and in major ways that require significant amounts of relearning.
Example 1: audio
OSS -> ALSA -> random layers on top of ALSA -> Pulse -> PipeWire
Example 2: init
SysV -> OpenRC || runit || s6 || upstart -> systemd
Example 3: desktops
KDE 1/2/3 -> KDE 4/Plasma
GNOME 1 -> GNOME 2 -> GNOME3+
Example 4: networking
ifconfig -> ip
Example 5: display servers
XFree86 -> Xorg -> Wayland
Now, it's important to note that people were attempting to resolve issues. The transitions weren't always clean, but the results are usually great. For example, moving to PipeWire is possibly the greatest advancement of audio ever; Linux audio finally doesn't suck. XFree86 to Xorg was likewise great. For the last few years of X11, I usually didn't have to modify the config. I kind of don't care about init systems most of the time. The only major complaint for systemd is that disk I/O on embedded systems is kind of an issue, but things like Alpine are better there, and Alpine doesn't use systemd.
With that said, I think the real issue is that people dislike advancements that break things. Early in Pulse's life, people absolutely hated it. Early in Wayland's life, people absolutely hated it, but it wasn't default so no one complained. With Windows and macOS, stuff changes seemingly constantly and randomly and breaks things, so people hate it. Saying, however, that Linux doesn't change seems a little daft to me. It changes faster than anything else on small levels, and different distributions have breaking changes at different rates.
Yes, Pulse pushed me to make my first hackintosh and move from Linux on the desktop to Mac OS on the desktop.
Good job, Poettering.
You don't have to install gnome, kde, wayland or systemd. You are just talking about your preferences masked as something that “had to be done”. I only had to fiddle with audio on the raspberry pi when connecting bluetooth. Everything works out of the box nowadays. If wayland was a good protocol, the user would not have to know about it.
I wasn't saying that anything had to be done, nor was I saying that each change was good or bad (except for the audio and Xfree86 to Xorg). My preferences really don't enter into it. I was saying that Linux systems do indeed change, and the idea of learn once and you're done is nonsense.
Linux is by far the OS with the highest amount of churn.
I use less than 10 gui programs on linux. They never change. The command line programs do not change either. Unless the devs get a dumb idea to rewrite them in Rust, because they sunk so many hours into learning it.
My company only hands out Macs. So I use Linux in a VM for embedded development. Works great.
And now I'm getting an Apple Silicon machine in a few months to replace my Intel Mac and I'm out of luck.
Forever only if one never updates.
Depends on the distro.
Using Linux since 1995, I wonder which one from Distrowatch is safe from this.
I did not mean actually forever. The sun will burn out in the future.
For a few years now it's been feeling like Apple are pushing devs away and are more interested in catering for general consumers. Just look at what DHH has written and said about it, and his move to Omarchy
It will be interesting to see whether they keep optional TSO in their SoCs after Rosetta 2 is no longer working.
As is tradition.
Tangentially, this was surprising
I've been using this VST from Arturia (Minimoog V) since they distributed it for free back in like 2011 or 2012, and it runs as well on my M1 Mac as it did on my previous Intel Macs. I mean, it's literally the same DMG from way back when and there's no chance it doesn't run under Rosetta, but I run Ableton natively!
Seems like you're trying to load an Intel-only plugin binary in a native ARM application. This doesn't work; the DAW and plugins must use the same architecture. You would either have to run Ableton in Rosetta or use a plugin bridge. (This is similar to Windows if you want to run 32-bit plugins in a 64-bit DAW.)
AU plugins work, the AU framework itself spins up a separate process to host the translated Intel plugin.
Yes, that's how you do it. I have written a VST plugin host for Pure Data and SuperCollider and it supports sandboxing/bridging. It's not rocket science. I'm not sure why Ableton never bothered to implement this.
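Related to the architecture-matching point above: before deciding whether to run the DAW under Rosetta or reach for a bridge, you can check which slices a plugin actually ships. A rough Python sketch, assuming the standard bundle layout (Info.plist with CFBundleExecutable, binary under Contents/MacOS); the example path is a placeholder:

    #!/usr/bin/env python3
    # Sketch: print the architectures inside an audio plugin bundle via lipo -archs.
    import plistlib
    import subprocess
    import sys
    from pathlib import Path

    bundle = Path(sys.argv[1] if len(sys.argv) > 1
                  else "/Library/Audio/Plug-Ins/Components/SomePlugin.component")

    with open(bundle / "Contents" / "Info.plist", "rb") as f:
        exe_name = plistlib.load(f)["CFBundleExecutable"]

    exe = bundle / "Contents" / "MacOS" / exe_name
    archs = subprocess.run(["lipo", "-archs", str(exe)],
                           capture_output=True, text=True, check=True).stdout.strip()
    print(f"{bundle.name}: {archs}")  # "x86_64" alone means Rosetta or a bridge is needed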