Ubuntu now requires more RAM than Windows 11

(howtogeek.com)

136 points | by jnord 9 hours ago ago

165 comments

  • senfiaj 7 hours ago

    From my understanding this is an official statement, not a benchmark result.

    > The change isn't about the core operating system becoming resource-hungry. Instead, it reflects the way people use computers today—multiple browser tabs, web apps, and multitasking workflows, all of which demand additional memory.

    So it's more about third-party software than the OS or desktop environment. Actually, nowadays 8+ GB of RAM is recommended regardless of OS.

    I just checked the memory usage on Ubuntu 24.04 LTS after closing all the browser tabs. It's about 2GB of 16GB total RAM. 26.04 LTS might have higher RAM usage but it seems unlikely that it will get anywhere close to 6GB.

    • HauntingPin 6 hours ago

      Also, the Windows 11 requirements are ludicrous.

      https://www.microsoft.com/en-us/windows/windows-11-specifica...

      4GB of RAM? What? I guess if your minimum is "able to start Windows and eventually reach the desktop", sure? I wouldn't even use Windows 11 with 8GB even though it would theoretically be okay.

      • winrid 5 hours ago

        Win11 IoT runs great on 4 GB if that matters :) I have a few machines in the field running it plus my Java app, usually with over a gig still free.

      • mpyne 4 hours ago

        > 4GB of RAM? What? I guess if your minimum is "able to start Windows and eventually reach the desktop", sure? I wouldn't even use Windows 11 with 8GB even though it would theoretically be okay.

        Not okay as soon as you throw on the first security tool, lol.

        I work in an enterprise environment with Win 11 where 16 GB maxes out instantly as soon as you open the first browser tab, thanks to the background security scans and patch updates. That's even with memory compression turned on.

    • mpol 7 hours ago

      It's not just the applications: the installer doesn't even start with 1 GiB of memory, though with 2 GiB it does. You could (well, I would :) ) blame the GNOME desktop, but it's very different from what I would have expected.

      I just tested this with 25.10 desktop, default gnome. With 24.04 LTS it doesn't even start up with 2GiB.

      • senfiaj 2 hours ago

        So you mean that with 2 GiB of RAM the 25.10 installer started up but the 24.04 one didn't? And were you able to install and then boot the installed Ubuntu?

    • rdsubhas an hour ago

      That's subjective, and I'd be more comfortable if it were called recommended memory, not minimum memory.

      Calling it the "minimum" sets a completely different expectation.

    • panarky 7 hours ago

      If you run Windows 11 with Microsoft Teams and Microsoft Outlook on a 4GB machine you're gonna have a bad day.

    • Lerc 7 hours ago

      I know 2GB isn't very heavy in OS terms these days, but it's still enough to hold nearly 350 uncompressed 1080p 24-bit images.

      There's rather a lot of information in a single uncompressed 1080p image. I can't help but wonder what it all gets used for.
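
      The back-of-envelope arithmetic checks out:

```python
# How many uncompressed 1080p 24-bit frames fit in 2 GiB?
width, height, bytes_per_pixel = 1920, 1080, 3
frame_bytes = width * height * bytes_per_pixel   # bytes per uncompressed frame
frames = (2 * 1024**3) // frame_bytes            # whole frames in 2 GiB
print(frame_bytes, frames)                       # 6220800 345
```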

      • array_key_first 23 minutes ago

        A lot of it is optimizing applications for higher-memory devices. RAM is completely worthless if it's not used, so ideally you should be running your software with close to maximum RAM usage for your device. Of course, the software developer doesn't necessarily know what device you will be using, or how much other software will be running, so they aim for averages.

        For example, Java applications will claim much more memory than they need for the heap. Most of that memory will be unused, but it's necessary to have a faster running application. If you've ever run a Java app at consistently 90% heap usage, you know it grinds to an absolute halt with constant collection.
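
        For illustration, JVM heap sizing is usually controlled like this (values here are made up and `app.jar` is a placeholder; `-Xms`/`-Xmx` set the initial and maximum heap):

```shell
# Illustrative fragment only: give the heap headroom well beyond the
# expected live set so the collector runs infrequently.
java -Xms512m -Xmx4g -jar app.jar
```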

        The same is true for caching techniques. Reading from storage is slow, so it often makes sense to put stuff in RAM even if you're not using it very often.

      • senfiaj 7 hours ago

        I also believe this memory usage could be decreased significantly, but I don't know by how much (or how much of it would be worth doing). Some RAM usage is genuinely useful, such as caching or things related to graphics. Some is cumulative bloat in applications, caused by not caring much or by duplicated libraries.

        But I remember that in 2016 Fedora GNOME consumed about 1.6 GB of RAM on my PC with 2 GB total. Considering that a decade later stock Ubuntu GNOME consumes only about 400 MB more, and that my new laptop has 16 GB of RAM (the system may use more RAM when more is installed), I think the increase isn't that bad for a decade. I thought it would be much worse.

        • jonhohle 6 hours ago

          But why that much? The first computer I bought had 192 MB of RAM, and I ran a 1600x1200 desktop with 24-bit color. When Windows 2000 came out, all the transparency effects ran great. Office worked fine, Visual Studio too, and 1024x768 gaming (I know that's quite a step down from 1080p).

          What has changed? Why do I need 10x the RAM to open a handful of terminals and a text editor?

          • Someone 4 hours ago

            > and I ran a 1600x1200 desktop with 24-bit color

            > What has changed? Why do I need 10x the RAM to open a handful of terminals and a text editor?

            It’s not a factor of ten, but a 4K monitor has about four times as many pixels. Cached font bitmaps scale with that, photos take more memory, etc.
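
            The ratio works out to roughly four:

```python
# Pixel-count ratio: 4K UHD vs the 1600x1200 desktop mentioned above
uhd = 3840 * 2160   # 8,294,400 pixels
old = 1600 * 1200   # 1,920,000 pixels
print(round(uhd / old, 2))   # 4.32
```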

            > When Windows 2000 came out

            In those times, when part of a window became uncovered, the OS would ask the application to redraw that part. Nowadays, the OS knows what’s there because it keeps the pixels around, so it can bitblit the pixels in.

            Again, not a factor of ten, but it contributes.

            The number of background processes likely also increased, and chances are you used to run fewer applications at the same time. Your handful of terminals may be a bit fuller now than it was back then.

            Neither of those really explain why you need gigabytes of RAM nowadays, though, but they didn’t explain why Windows 2000 needed whatever it needed at its time, either.

            The main real reason is “because we can afford to”.

          • senfiaj 5 hours ago

            Partly because we have more layers of abstraction. An extreme example: when you open a tiny <1 KB HTML file in any modern browser, the tab's memory consumption will still be on the order of tens, if not hundreds, of megabytes. That's because the browser has to load and initialize its huge runtime environment (JS/DOM/CSS, graphics, etc.) even though the tiny HTML file uses only a tiny fraction of the browser's features.

            Partly because increased RAM usage can sometimes improve execution speed / smoothness or security (caching, browser tab isolation).

            Partly because developers have less pressure to optimize software performance, so they optimize other things, such as development time.

            Here is an article about bloat: https://waspdev.com/articles/2025-11-04/some-software-bloat-...

          • tosti 4 hours ago

            Two programmers sat at a table, one a youngster and the other an older guy with a large beard. The old guy was asked: "You. Yeah, you. Why the heck did you need 64K of RAM?" The old man replied: "To land on the moon!" Then the youngster was asked: "And you, why oh why did you need 4 gigs?" The youngster replied: "To run MS Word!"

          • winrid 5 hours ago

            Higher res icons probably add a couple hundred megs alone

        • KronisLV 6 hours ago

          I remember running Xubuntu (XFCE) and Lubuntu (LXDE, before LXQt) on a laptop with 4 GB of RAM and it was a pretty pleasant experience! My guess is that the desktop environment is the culprit for most modern distros!

          • abenga 5 hours ago

            GNOME 50 and its auxiliary services use maybe 400 MB on my machine.

            The culprit is browsers, mostly.

      • adgjlsfhk1 7 hours ago

        Well, to start, you likely have two screen-sized buffers for the current and next frame. Most of the code footprint is drivers, since the modern expectation is that you can plug in pretty much anything and have it work automatically.

        • Lerc 7 hours ago

          How often do you plug in a new device without a flurry of disk activity occurring?

    • CoolGuySteve 6 hours ago

      No, because as far as we know 26.04 won't enable zswap or zram, whereas Windows and macOS both have some form of memory compression. So Ubuntu will use significantly more memory for most tasks when under memory pressure.

      Apparently it's still in discussion but it's April now so seems unlikely.

      Kind of weird how controversial it is considering DOS had QEMM386 way back in 1987.
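
      For reference, one common way to get compressed swap today is the zram-generator package (which Fedora ships by default); a minimal config sketch, assuming systemd and zstd support:

```ini
# /etc/systemd/zram-generator.conf
[zram0]
zram-size = min(ram / 2, 4096)
compression-algorithm = zstd
```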

      • cogman10 6 hours ago

        Zswap is a no-brainer; I have to wonder why the hesitancy.

      • bzzzt 6 hours ago

        QEMM386 for DOS did not have a memory compression feature. Only one of the later versions for Windows 3.1 did.

        • roryirvine 4 hours ago

          CPUs really weren't up to the job in the pre-Pentium/PowerPC world. Back then, zip files used to take an appreciable number of seconds to decompress, and there was a market for JPEG viewers written in hand-optimised assembly.

          That's why SoftRAM gained infamy - they discovered during testing that swapping was so much faster than compression that the released version simply doubled the Windows swap file size and didn't actually compress RAM at all, despite their claims (and they ended up being sued into oblivion as a result...)

          Over on the Mac, RAM Doubler really did do compression, but it (a) ran like treacle on the '030, (b) needed a bunch of kernel hacks, so it had compatibility issues with exactly the sort of "clever" software that needed the most RAM, and (c) PowerMac users tended to have enough RAM anyway.

          Disk compression programs were a bit more successful - DiskDoubler, Stacker, DoubleSpace et al. ISTR that Microsoft managed to infringe on Stacker's patents (or maybe even the copyright?) in MS DOS 6.2, and had to hastily release DOS 6.22 with a re-written version free of charge as a result. These were a bit more successful because they coincided with a general reduction in HDD latency that was going on at roughly the same time.

  • goalieca 8 hours ago

    I hear from a lot of Linux users who found the GTK 2 era on X11 pretty close to perfect. I know I ran Ubuntu then, and after boot it used far less than 1 GB. The desktop experience was perhaps even slightly more polished than what we have today. Not much has fundamentally changed except the bloat, and a regression in UX where they started chasing fads.

    I suppose the biggest change in RAM usage is Electron, and the bloated world of text editors and other simple apps written in it.

    • john01dav 8 hours ago

      Just stick XFCE on a modern, minimal-ish distribution (meaning not Ubuntu, mainly) and you'll have this with modern compatibility. Debian and Fedora are both good options. If you want something more minimal as your XFCE base, there are other options too.

      • mrob 7 hours ago

        XFCE is saddled with its GTK requirement, and GTK gets worse with every version. Even though XFCE is still on GTK3, that's a big downgrade from GTK2 because it forces you to run Wayland if you don't want your GUI frame rate arbitrarily capped at 60 fps.

        For people wanting the old-fashioned fast and simple GUI experience, I recommend LXQt.

        • jstanley 7 hours ago

          What use is there in display frame rates above 60 fps?

          • tuetuopay 6 hours ago

            Outside of gaming, not much. However, now that I'm used to a 144 Hz main monitor, there is no world where I would go back. You just feel the difference.

            So basically: no use if you haven't tasted 120+ Hz displays. And don't, because once you do, you won't go back.

            • bogwog 6 hours ago

              I have a 165hz display that I use at 60hz. Running it at max speed while all I'm doing is writing code or browsing the web feels like a waste of electricity, and might even be bad for the display's longevity.

              But for gaming, it really is hard to go back to 60.

              • tuetuopay 3 hours ago

                Mine supports variable refresh rate, which means that for most desktop tasks (i.e. when nothing is moving), it runs at 48 Hz.

                Incredibly, Linux has better desktop support for this than Windows: Windows' DWM runs full blast and only enables VRR for games (and only games that support it), while sway supports VRR on the desktop. Disclaimer: a Wayland compositor is required.

                It’s not enabled by default on e.g. sway because on some GPU and monitor combos, it can make the display flicker. But if you can, give it a try!
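
                The sway side is a one-liner (the output name is a placeholder; check `swaymsg -t get_outputs` for yours):

```
# ~/.config/sway/config
output DP-1 adaptive_sync on
```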

                • bogwog an hour ago

                  I use KDE + Nvidia, and last I looked into it, it only worked if you had one monitor enabled. That's fine for gaming, not for working.

                  But it has been a while since I've tried it, maybe I should look into it again

          • mrob 7 hours ago

            It makes it easier to treat the computer as part of your own body, allowing operation without conscious thought, as you would a pencil or similar hand tool.

          • TacticalCoder 5 hours ago

            > What use is there in display frame rates above 60 fps?

            On a CRT monitor, the difference between running at 60 Hz and even a slightly better 72 Hz was night and day: unbearable flickering vs. a much better experience. I remember having a little utility for Windows that allowed a 75 Hz display rate (not 72, but 75). Under Linux I was writing modelines myself (those were the days!) to get the refresh rate and screen size (in pixels) I liked: I ran "weird" resolutions like 832x604 @ 75 Hz instead of 800x600 @ 60 Hz, just to gain a little more screen real estate and a better refresh rate.

            Now that monitors are flat panels, I honestly have no idea whether 60 fps vs. 120 fps changes anything for "desktop" usage. I don't think the problem CRTs had of the image fading too quickly at 60 Hz is still present, but I'm not sure about it.

          • Tade0 6 hours ago

            I, for one, lose track of the mouse way less often at 165Hz.

            • M95D 3 hours ago

              You need a bigger cursor.

            • jstanley 6 hours ago

              I lose track of the mouse less often at 1024x768!

      • Imustaskforhelp 8 hours ago

        MX Linux is really great for something like XFCE, and I really loved its snapshotting feature too. Highly recommended.

        • imcritic 7 hours ago

          You spelled Debian wrong.

    • okeuro49 7 hours ago

      I used gtk2, it was ok, but I preferred Ubuntu's Unity interface when it came out.

      Gnome 3 seems similar to Unity nowadays, and it is pretty good.

      I find it much easier to use than Windows or Mac, which is credit to the engineers who work on it.

    • synergy20 7 hours ago

      It's always the browser; each tab is at least 100 MB, and Electron is also a browser. GTK or whatever is nothing next to the browser.

    • shevy-java 8 hours ago

      The whole Linux stack got bigger though; just look at what you need now to compile stuff: cmake, meson/ninja, mesa, llvm, and so forth. GTK2 was great; GTK is now a GNOME-only toolkit, controlled by one main corporation. systemd increased the bloat factor too, and now also gathers users' age data (https://github.com/systemd/systemd/pull/40954).

      I guess one of the few smaller things would be wayland, but this has so few features that you have to wonder why it is even used.

      • curt15 6 hours ago

        >The whole linux stack got bigger though - just look at what you need now to compile stuff, cmake, meson/ninja, mesa, llvm and so forth

        Those are all development tools. Has the runtime overhead grown proportionally, and what accounts for the extra weight?

        • array_key_first 16 minutes ago

          Runtime-wise, we use more garbage-collected languages now. Java and the like are great and can be very high performance; the real cost, though, is memory. GC languages need much more memory for bookkeeping, but they also need much more memory to be performant. Realistically, a Java app needs about 10x the memory of a similar C++ application to get good performance, because GC languages only perform well when most of their heap is unused.

          As a side-note, that's how GC languages can perform so well in benchmarks. If you run benchmarks that generate huge amounts of garbage or consistently run the heap at 90%+ usage, that's when you'll see that orders of magnitude slowdown.

          Oh also containers, lots more containerized applications on modern Linux desktops.

      • goalieca 7 hours ago

        I’ve been using cmake since the early 2000s, when I was hacking on the VTK/ITK toolkits. Compiling a C++ program hasn’t gotten any better or worse. FWIW, I always used the curses interface for it.

      • ScislaC 7 hours ago

        Is the option of legal compliance a bad thing? They have corporate customers.

        If there's no opt-out, that's a different story.

        • GrayShade 7 hours ago

          It's plain FUD. systemd has always had optional fields for full name, email address, and location, just like the new date-of-birth field. Bad systemd!

          • anthk 6 hours ago

            It's not FUD: the full name, email, and the rest weren't mandated by Meta and other corporations, which are lobbying for age data so they can earn money from users' preferences. Take your spyware somewhere else.

            If Meta's business model isn't lucrative, that's not my problem.

            • gruez 5 hours ago

              >which are lobbying for it so they can earn money with users' preferences

              Given it's a field where you can put absolutely anything in (and probably randomize, if you want), how is this different than the situation today, where random sites ask you for your birthday (also unverified)? Moreover Meta already has your birthday. It's already mandated for account creation, so claims of "so they can earn money with users' preferences" don't make any sense.

              • anthk 4 hours ago

                Keep gaslighting:

                https://www.theregister.com/2026/03/24/foss_age_verification...

                Good luck when most libre users toss RH/Debian because of this and embrace GNU.

                • gruez 3 hours ago

                  >Keep gaslighting:

                  This is against HN guidelines: " Please respond to the strongest plausible interpretation of what someone says, not a weaker one that's easier to criticize. Assume good faith."

                  >The contents of the field will be protected from modification except by users with root privileges.

                  So... most users?

    • superkuh 7 hours ago

      Yep. I still develop GTK2 applications today. It's a very snappy, low-resource-usage toolkit aimed entirely at desktop computers; none of that "mobile" convergence. I suppose you could put GTK2 applications into containers of some sort, but since GTK2 has (luckily) been left alone by GNOME for over a decade it's a stable target (like the NES or N64 is a stable target) and there's no need for it.

      Most of the bloat these days comes from containers, and Canonical's approach to Ubuntu since ~2014 has leaned heavily on upstream containers so they don't have to support their software ecosystem themselves. This has led to severe bloat, bad graphical theming, and broken file system access.

      • WD-42 6 hours ago

        Can you point us to some of these gtk2 applications that you’ve been writing recently?

        • superkuh 3 hours ago

          Sure. One is connmapperl, a server/client application where the server is a GUI world map showing all the clients' established IP connections via (local) geoip lookup. It stores everything in an SQLite db and has a bunch of config/filtering options: http://superkuh.com/connmapperl.html Technically it's a fork of the X11 connmap I made because I couldn't get the original to run on my old X11, but with many, many more features (offline whois from raw RIR dumps, the db, the Hilbert mapping, replays of connection history, etc.).

          Another is memgaze, a program to visualize Linux process virtual memory spaces as RGB images and explore them with various binary visualization and sonification tools. I.e., you can click a Hilbert map of all processes, then in the new window click around inside the image of that particular process' virtual RAM and listen to it interpreted as an 8-bit WAV, find and extract images, search for strings, run digraph analysis, etc. http://superkuh.com/memgaze-page.html

          Or feeed.pl, my very quick, low-resource feed reader for 1000+ feeds, written in Perl/Gtk2 and text-only (no HTML, no images, etc.). It's really handy for loading .opml files and for finding and fixing broken feeds via the heuristics I hard-coded for locating feed URLs. http://superkuh.com/blog/2025-09-13-2.html

          Those are a few I made in 2025-26 that other people might care to use. But I have a lot more that just scratch my own particular itches, like a Perl/Gtk2 version of MS Paint that interprets loaded and painted images as sound, or the things I use to monitor my ISP uptime/speed, etc.

    • IshKebab 4 hours ago

      That's rose-tinted. I remember specifically switching to KDE because GTK apps of the day segfaulted all the time. Unfortunately KDE then screwed things up massively with Plasma (remember the universally loathed kidney bean?) and it's really only recovered recently.

      And to say the desktop experience was more polished than what we have now is laughable. I remember that you couldn't have more than one application playing sound at the same time. At one point you had to manually configure XFree86 to make it aware that your mouse had a middle button. And good luck getting anything vaguely awkward like WiFi or suspend-to-RAM working.

      The Linux desktop is in a vastly better position now, even taking the Wayland mess into account.

  • intothemild 8 hours ago

    Two things

    First, it sounds like this 6 GB requirement is more a suggestion/recommendation than a hard requirement. I'm also curious whether it actually uses all 6 GB. From my own usage of Linux over the years, the OS itself isn't using that much RAM; the applications are, and it's almost always the browser.

    Secondly, I haven't used Ubuntu desktop in years, so I have no real idea whether this is specific to it. But I do use Fedora, and I'd imagine the memory footprint can't be too different. While I could easily get away with under 8 GB of RAM, you really don't want to if you're going to be doing anything heavier than web browsing or editing documents: dev work, CAD, design, etc. But this isn't unique to Linux.

    • heelix 8 hours ago

      Ubuntu just raised the minimum RAM requirement from 4 GB to 6 GB. While it might be possible to run a GUI on 4, I can't imagine that's a good experience.

      When they turned CentOS into Stream, I cut my workstation over to Ubuntu. It has been a reasonable replacement; the only real issues were dual-booting Win10 horking my grub, and snap occasionally being unable to sort itself out. When they release 26.04 as an LTS, I'm planning to update. You are spot on: the desktop itself is reasonably lean; 100+ tabs in Firefox, less so. Mind you, the amount of RAM in the workstations I'm using could buy a used car these days.

      • hhh 7 hours ago

        I don't really get it. I have run fleets of thousands of devices running Chrome in a container on Ubuntu Server, and it's a nice experience. It took a lot to make it nice, but once it was there it was rock solid. This was with 1 GB of RAM on a Pi 3. When we swapped to the Pi 4, we suddenly had thousands of gigabytes of RAM and thousands of CPU cores going unused.

      • bee_rider 7 hours ago

        Does Firefox really not unload the tabs in that case?

        • foepys 6 hours ago

          It does. You can also do it by hand via the tab's right-click menu.

    • sunshine-o 7 hours ago

      I happened to install Fedora Silverblue on a computer a few days ago and looked quickly at the memory usage after boot: it was about 6 GB! I usually run Alpine or FreeBSD, so I thought: great, that thing consumes 10x the RAM.

      I believe Fedora and Ubuntu use about the same set of technologies: systemd, wayland, Gnome, etc. so it is about the same.

      Apart from working out of the box, I don't really know what those distros have that I don't. I do have to admit managing network interfaces is really easy in GNOME.

      With the skyrocketing price of RAM this might finally be the year of the Linux desktop. But it is not gonna be Gnome I guess.

  • whatevaa 8 hours ago

    Win11 barely works with 4 GB. Like, you can have a browser with YouTube open and that's it: 90%+ memory usage. I know because that's one of my media PCs (used instead of a smart TV).

    I can't move it to Linux because it's an Intel Atom, and the Intel P-state driver for it is borked and was never fixed.

    • oreally 7 hours ago

      Today's browsers tend to be huge memory hogs too. Software's attitude of "there's always more memory" is coming back to bite them as prices of ram increase.

      • senfiaj 7 hours ago

        IMHO, browsers may prioritize execution speed somewhat over memory. There's a Pareto tradeoff: you're unlikely to optimize all parameters at once; if you optimize one, you likely sacrifice others. Also, higher memory consumption (unlike CPU usage) doesn't hurt power efficiency much, so more memory can even help there by reducing CPU usage through caching.

        • oreally 7 hours ago

          TBH your comments come off as either very misleading or just uneducated on the nature of performance. Troubling indeed.

          • senfiaj 7 hours ago

            Can you enlighten me why it's misleading or uneducated?

  • reilly3000 6 hours ago

    > Linux's advantage is slowly shrinking

    This is garbage writing. Linux’s advantages are numerous and growing. Ubuntu ≠ Linux. WRT RAM requirements, Win 11’s 4GB requirement isn’t viable for daily use and won’t represent any practical machine configuration that has the requisite TPM 2 module. On the other side, the Linux ecosystem offers a wide variety of minimal distributions that can run on ancient hardware.

    Maybe I’m just grouchy today but I would flag this content if sloppy MS PR was a valid reason.

    • leni536 6 hours ago

      FWIW I find even KDE plasma on wayland perfectly viable on a 4 GiB budget notebook. Windows runs horribly on the same hardware.

    • osigurdson 6 hours ago

      Agree. I'm able to do development, run multiple containerized services (including Postgres, NATS, etc), have 10 browser tabs open, all on an 8 GiB laptop running Arch. I have a desktop with 64GiB as well but realized there is no point using it most of the time.

    • listless 6 hours ago

    I agree. And even on Ubuntu, performance on the same specs is ridiculously better than on Windows.

      Apps are still a huge gap on Linux, but as an OS, I choose it every time over Windows and MacOS.

  • bityard 8 hours ago

    Since the dawn of time, Microsoft has published the minimum system requirements needed to run Windows, not what you need to actually do something useful with it.

  • duckmysick 6 hours ago

    For comparison, here are the official hardware recommendations for Debian: https://www.debian.org/releases/stable/amd64/ch03s04.en.html

    "With Desktop" has 1GB minimum and 2GB recommended - along with Pentium 4, 1GHz cpu.

    • NekkoDroid 3 hours ago

      > "With Desktop" has 1GB minimum and 2GB recommended - along with Pentium 4, 1GHz cpu.

      This seems like a recommendation for just getting to the desktop itself, plus maybe some light usage. Anything more than that and the "recommendation" is fairly useless, given what memory hogs commonly used apps are.

  • wrxd 6 hours ago

    The framing of the article is very odd.

    It says that Ubuntu increased the requirements not because of the OS itself but to give a better user experience when people have many browser tabs open. Then it compares to Windows, which has lower nominal requirements but higher requirements in practice for a passable user experience.

  • laweijfmvo 6 hours ago

      > Linux's advantage is slowly shrinking
    
    Ubuntu is not Linux. Also, I would love to see Windows running on 4 GB.

  • hnarn 3 hours ago

    I switched to Debian a long time ago for both desktops and servers. Personally, I don't see what value proposition Ubuntu even has anymore, apart from maybe having ZFS in the kernel. Support, maybe? I've never used it, so I don't know if it's any good, but for any serious shop willing to spend money on support I'd probably go with RHEL anyway.

  • Someone1234 8 hours ago

    Windows 11's 4 GB minimum is dishonest. You cannot reasonably run it on that little, it is far too bloated at this point. Even LTSC benefits from 6 GB, and that is substantially cut-down compared to retail/enterprise.

    I'd say Windows 11's real minimum is 8 GB in 2026, with 16 GB recommended.

    PS - And even at 8 GB, it hits 100% usage and pages under moderate load or e.g. Windows Update running in the background.

  • fxj 6 hours ago

    God, I miss openstep and CDE. They need 16 MB of RAM (yes, MB!) and together with a lightweight Firefox clone you get everything you need. Eye candy is nice to have, but not at that cost.

    • yjftsjthsd-h 2 hours ago

      > I miss openstep and CDE

      Why miss things that are still around? I dunno how close GNUstep is, but the original CDE is still here, open source and ported to most unix-likes.

  • dombiscoff 7 hours ago

    Why is this here? Extreme clickbait for those without tech literacy

  • TheChaplain 5 hours ago

    Many commenters are blowing up here, but I think you have to see this from the uninformed consumer's perspective.

    What I mean is: yes, WE know Win11 barely works with 4 GB and WE know that 6 GB is quite generous for a Linux machine, but they don't.

    The general public isn't as informed as we think they are (which is proven by 75 million people last election).

  • teo_zero 5 hours ago

    > Canonical isn’t making 6GB memory a hard requirement for Ubuntu 26.04. It will still install on machines that fall below the minimum requirement, but users will have to deal with slower performance.

    Then I think we have quite different definitions of "minimum requirement".

  • groundzeros2015 8 hours ago

    But we already know Ubuntu is the "worst" (most like modern Windows, set up for media consumption, etc.).

    You can install Debian and it gives you all that you are familiar with from Ubuntu.

  • opengrass 7 hours ago

    I switched to Devuan; now it uses 75 MB of RAM at idle.

    • yjftsjthsd-h 2 hours ago

      Including GUI? (And if so, what desktop/wm?)

  • bjackman 8 hours ago

    The article itself acknowledges that the headline is bullshit:

    > The change isn't about the core operating system becoming resource-hungry. Instead, it reflects the way people use computers today—multiple browser tabs, web apps, and multitasking workflows

    Basically the change reflects the fact that, at this level of analysis (how much RAM do I need in my consumer PC), the OS is irrelevant these days. If you use a web browser then that will dominate your resource requirements and there's nothing Linux can do about that.

    • crimsonnoodle58 7 hours ago

      Exactly. The headline is clickbait.

      It doesn't matter how efficient your kernel or DE is if users expect to be able to load bloated websites in Chrome.

    • dwedge 7 hours ago

      The headline is clickbait and the acknowledgement is LLM

      • SV_BubbleTime 7 hours ago

        I also feel bad for human em dash fans…

        • dwedge 3 hours ago

          It isn't about the X emdash it's about the Y

          It's slightly off from llm content but reads like someone touched it up afterwards

  • orliesaurus 7 hours ago

    How much RAM does Omarchy use? Anyone running the OS after the media hyped it a couple of months back?

    • tuetuopay 6 hours ago

      It's Arch-based, with (IIRC) Hyprland as its "DE", so really not much memory, I'd guess.

      My desktop runs Arch with Sway (so quite close), three monitors, and uses ~400MB of RAM after boot. Most of that is the framebuffers. All the rest is eaten by Firefox, rust-analyzer and qemu.

  • gchamonlive 7 hours ago

    With arch+hyprland I hit 5GiB for a zen browser instance with 15+ tabs and a kitty instance with 15+ windows across 5 tabs, with codex and vim running.

    If RAM is a problem there are always alternatives. The impediment is always having to rethink your workflow or adopt someone else's opinion.

  • estimator7292 7 hours ago

    Last time I touched an Ubuntu system, I had to diagnose why the machine suddenly had no available disk space.

    1.5TB in /var/log

    All from the Firefox snap package complaining every millisecond about some trivial Snap permission.

    I'm glad I chose an OS without goddamn Snap. It's been unadulterated pain every time I've ever interacted with it.
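
    For anyone hitting the same thing, a quick, distro-agnostic way to find what's eating the disk (nothing Snap-specific here, just plain du):

    ```shell
    # List the largest first-level subdirectories of /var/log, biggest last.
    # -x stays on one filesystem; sizes are in KiB.
    du -x --max-depth=1 /var/log 2>/dev/null | sort -n | tail -n 5
    ```

    On systemd machines, `journalctl --disk-usage` and `journalctl --vacuum-size=500M` (as root) can additionally reclaim journal space.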

    • jgrowl 4 hours ago

      YES! Snap drove me to debian sid and haven't looked back. Snap is probably fine, but don't force me to use it.

    • anthk 6 hours ago

      Trisquel 12 MATE -codenamed Ecne- with the Xanmod kernel to cover proprietary drivers; that's a more libre start than Ubuntu. If everything works with the libre kernel, you can toss the Xanmod kernel on the spot.

  • trekkie99 8 hours ago

    Is this a Ubuntu issue or a Gnome issue? What about Lubuntu, Kubuntu, etc?

    • mcswell 8 hours ago

      The article suggests that Xubuntu (which uses Xfce instead of GNOME) uses much less memory. I don't know how true that is, but it seems reasonable that Xfce uses somewhat less memory.

      • groundzeros2015 8 hours ago

        We expect Xfce to be much more efficient (it has more basic features), but is that the cause? Or are you just subtracting a big chunk from a higher baseline?

      • dathinab 7 hours ago

        Sure, probably even a bit less,

        but I would still recommend 6 GiB,

        no matter the OS.

        The problem here is more the programs you run on top of the OS (browser, Electron apps, etc.)

        Realistically speaking, you should budget at least 1GiB for your OS even if it's minimalist, and to avoid issues make it 2GiB for OS + some emergency buffer, caches, load spikes etc.

        and 2GiB for your browser :(

        and 500MiB for misc apps (mail, music, etc.)

        wait, we are already at 4.5 GiB and I still need OpenOffice ....

        Even if Xfce saved 500 MiB, it IMHO wouldn't matter (for the recommendation).

        And sure, you can make it work: only have one tab open at a time, close the browser whenever you don't need it, don't use Spotify or YT, etc.

        But that isn't what people expect, so give them a recommendation which will work with what they expect. If someone tries to run it with less RAM it may work, but if it doesn't, it at least isn't your fault.

    • bee_rider 8 hours ago

      It is not actually an issue. The article isn’t based on any technical aspects of the OS, just the reported system requirements.

    • alternatex 8 hours ago

      If it was a Gnome issue it would also be a Fedora issue though, no?

      • embedding-shape 8 hours ago

        Depends on the packaging no? I'm not sure you get 100% the same experience even with the same Gnome version across Fedora, Ubuntu and Arch, do you?

    • zekica 8 hours ago

      I think this is a snap issue.

      • trekkie99 7 hours ago

        I’d imagine that all of Canonical’s flavors/spins ship with snap, so if resource usage is lighter on, say, Xubuntu, then it’s probably not snap.

        Snap still kinda egh though ;-D

    • dathinab 7 hours ago

      neither; they didn't measure anything.

      they compared Ubuntu's minimum recommended RAM to Windows' absolute minimum RAM requirement.

      but Windows has monetary incentives (related to vendors) to say it supports 4GiB of RAM even if Windows runs very poorly on it; on the other hand, Ubuntu is incentivized to provide a more realistic minimum for comfortable usage

      I mean, taking a step back, all common modern browsers under common usage can easily use multiple GiB of memory, and that is outside the control of the OS vendor. (1)

      As a consequence, IMHO recommending anything below 6 GiB is just irresponsible (if a modern browser is used), _no matter what OS you use_.

      ---

      (1): If there is no memory pressure (i.e. caches don't get evicted that fast, larger video buffers are used, no fast tab archiving, etc.) then having YT playing will likely consume around ~600-800 MiB. (Be aware that this is not just JS memory usage but the whole usage across JS, images, video, the HTML+CSS engine, etc.) For comparison, web mail like Proton or Gmail is often roughly around 300MiB, Spotify interestingly "just" around 200MiB, and HN around 55MiB.

  • Synaesthesia 8 hours ago

    The amount of people still on less than 8gb of memory is really small.

    • b00ty4breakfast 7 hours ago

      I won't stand for this erasure!

    • mcswell 8 hours ago

      On the contrary, those are mostly really overweight people, so the amount of them is quite large. The number of them is, however, small. :)

  • TacticalCoder 5 hours ago

    I had a machine (an AMD 3700X with 32 GB of RAM and a fast NVMe SSD) on which I used to run Debian. Then about 2.5 years ago I bought a new one and gave my wife the 3700X: I figured she'd be more at ease with it, so I installed Ubuntu on it.

    I couldn't understand why everything was so slow compared to Debian and didn't want to bother looking into it, so...

    After a few weeks I got rid of Ubuntu and installed Debian for her. A simple IceWM setup (I use the tiling Awesome WM, but that's too radical for my wife) and she loves it.

    She basically manages her two SMEs entirely from a browser: Chromium or Firefox (but a fork of Firefox would do too).

    It has worked so well for years now that for her latest hire she asked me to set up the same config. So she's now got one employee on a Debian machine with the IceWM WM. The other machines are still on Windows, but the plan is to keep only one Windows machine (just in case) and move the others to Debian too.

    Unattended upgrades, a trivial firewall (everything OUT allowed; IN only if related/established), and that's it.
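
    That firewall, sketched in nftables (a minimal, illustrative ruleset for the policy described above; not the exact config in use):

    ```
    # /etc/nftables.conf -- allow everything OUT, allow IN only loopback
    # and related/established traffic; drop the rest.
    table inet filter {
      chain input {
        type filter hook input priority 0; policy drop;
        iif "lo" accept
        ct state established,related accept
      }
      chain forward {
        type filter hook forward priority 0; policy drop;
      }
      chain output {
        type filter hook output priority 0; policy accept;
      }
    }
    ```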

    • jgrowl 5 hours ago

      I had used ubuntu back in the day, and when I came back to linux a bit ago I immediately installed it again.

      I don't remember all of my frustrations, but I remember having a lot of trouble with snap. Specifically, it really annoyed me that the default install of firefox was the snap version instead of native. I want that to be an opt-in kind of thing. I found that flatpak just worked better anyway.

      I almost tried making the switch to arch, but I've been pretty happy running debian sid (unstable) since. The debian installer is just more friendly to me for getting encrypted drives and partitions set up how I want.

      It's not for everyone, but I like the structured rolling updates of sid and having access to the debian ecosystem too much to switch to something else at this point.

      I use sway with a radeon card for my primary and have a secondary nvidia card for games and AI stuff.

      It has its warts, but I love my debian+sway setup

  • pharrington 7 hours ago

    Fat chance, Satya!

  • jmclnx 8 hours ago

    >Linux's advantage is slowly shrinking

    Maybe in some ways, yes. But there are distros out there that can easily run in as little as 1GB of RAM. And I've heard of people using them with far less.

    I also remember hearing Ubuntu moved to Wayland by default; if true, I have to wonder if that is part of the problem, because GNOME / KDE on Wayland will use far more memory than FVWM / Fluxbox on X11.

    FWIW, you can do a lot just from the console without a GUI w/Linux and any BSD, in that case the RAM usage will be tiny compared to Windows and Apple.

    • danparsonson 8 hours ago

      Not to mention that 'lower memory usage' is only one of many benefits and, at least before the prices went mad, hardly the most important one on the list.

      • rantingdemon 7 hours ago

        Practically speaking most people would want a GUI though.

    • justsomehnguy 7 hours ago

      > But there are distros out there that can run easily in as little as 1G RAM

      It always makes me chuckle when I hear this. A default server (i.e. no GUI at all) installation of a RHEL derivative just outright dies silently with 1GB of RAM if there is no swap. Sure, with swap enabled it no longer dies, but to say the performance is anywhere near usable is to lie to yourself.

      • b00ty4breakfast 5 hours ago

        RHEL is not the be-all end-all of minimalist linux, even sans GUI. Puppy Linux, with a full WM, is completely usable with a single gig of ram. That's obviously a different use-case from RHEL but the point stands.

        • justsomehnguy 22 minutes ago

          If the point is minimal footprint then MS-DOS would win. Now install and run at least 70% of the available software (even without EPEL) in Puppy?

          RHEL/RHEL-likes, just like Ubuntu, are general-purpose distros, and the minimum requirements are for running them as such, not for exercises in RAM golfing.

  • anthk 8 hours ago

    1: ZRAM exists

    2: Win11 is not usable with 4GB

    3: Trisquel 12 Ecne exists. You might need Xanmod as a proprietary kernel because of hardware, but try to blacklist mei and mei_me first in some .conf file at /lib/modprobe.d. Value your privacy.

    Trisquel MATE with zram-config and some small tweaks can work with 4GB of RAM even with a browser with dozens of tabs, at least with uBlock Origin.
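
    For reference, the blacklist from point 3 is a two-line conf file (the filename here is made up; /etc/modprobe.d is the usual place for local overrides):

    ```
    # /etc/modprobe.d/blacklist-mei.conf (hypothetical filename)
    # Keep the Intel Management Engine Interface modules from loading.
    blacklist mei
    blacklist mei_me
    ```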

    • moogly 8 hours ago

      The fact that I couldn't tell if point number 3 was a joke or not makes me confident we've still not seen the year of the Linux desktop.

  • nickpsecurity 8 hours ago

    I was testing them on a HP laptop I bought for $200 with 4GB of RAM.

    Windows, its default, used so much memory that there was not much left for apps.

    Ubuntu used 500MB less than Windows in system monitor. I think it was still 1GB or more. It also appeared to run more slowly than it used to on older hardware.

    Lubuntu used hundreds of MB less than Ubuntu. It could still run the same apps but had fewer features in the UI (e.g. search). It ran lightning fast with more simultaneous apps.

    (Note: That laptop's Wifi card wouldn't work with any Linux using any technique I tried. Sadly, I had to ditch it.)

    I also had Lubuntu on a 10+ year old Thinkpad with an i7 (2nd gen). It's been my daily machine for a long time. The newer, USB installers wouldn't work with it. While I can't recall the specifics, I finally found a way to load an Ubuntu-like interface or Ubuntu itself through the Lubuntu tech. It's now much slower but still lighter than default Ubuntu or Windows.

    (Note: Lubuntu was much lighter and faster on a refurbished Dell laptop I tested it on, too.)

    God blessed me recently with a person who outright gave me an Acer Nitro with an RTX and Windows. My next step is to figure out the safest way to dual-boot Windows 11 and Linux for machine learning without destroying the existing filesystem or over-shrinking it.

    • heelix 7 hours ago

      Consider a dedicated SSD for each OS. You should have a couple of M.2 slots in the laptop. What you can do is remove (or disable) the Windows SSD, install Linux on the second drive, and then add the Windows drive back. Select the drive you want at startup, and make the drive you spend most of your time in the default. I did that on my XPS and it was trouble-free. Linux can mount your NTFS just fine, without having to consider it from a boot/grub perspective.

      https://community.acer.com/en/kb/articles/16556-how-to-upgra...

      Looks like you've got space for 2 drives.
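
      Once both drives are in, Linux can see the Windows data via a single fstab line using the in-kernel ntfs3 driver (in mainline since 5.15). A sketch with a placeholder UUID (get the real one from `lsblk -f`); read-only is the safe default:

      ```
      # /etc/fstab entry (placeholder UUID) -- mount the NTFS partition
      # read-only; nofail avoids boot hangs if the drive is removed.
      UUID=XXXX-XXXX  /mnt/windows  ntfs3  ro,noatime,nofail  0  0
      ```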

      • nickpsecurity 6 hours ago

        That's a terrific idea. It might address the other problem that I'd have little space for Linux apps. Thanks!

    • j16sdiz 7 hours ago

      > Ubuntu used 500MB less than Windows in system monitor.

      Those numbers mean nothing when compared across OSes. Depending on how they count shared memory and how aggressively they cache, they can look very different.

      The realistic benchmark would be to open two large applications (e.g. Chrome + Firefox with YouTube and Facebook, to jack up the memory usage), switch between them, and see how the system responds when switching between tasks.
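
      On the Linux side, a crude way to put a number on that is to snapshot MemAvailable (which, unlike plain "free" memory, accounts for reclaimable cache) before and after opening the heavy applications:

      ```shell
      # Memory actually available to new applications (Linux only).
      # Run once before and once after loading the browsers, then compare.
      avail_kb=$(awk '/^MemAvailable:/ {print $2}' /proc/meminfo)
      echo "MemAvailable: $((avail_kb / 1024)) MiB"
      ```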

      • nickpsecurity 6 hours ago

        Thanks for the critiques and the tips. I might try that in future testing.

    • keithnz 8 hours ago

      windows always optimistically loads a lot, almost no matter how much ram you have

    • srean 7 hours ago

      For the life of me I couldn't understand why anyone would downvote parent comment. Nothing offensive or disagreeable here.

  • dangus 8 hours ago

    I imagine the choice of desktop environment has most to do with RAM requirements in Linux.

    Unrelated to this, despite Ubuntu’s popularity, I think it’s one of the worst distro choices out there, especially for including old kernels for essentially no discernible reason.

    I wouldn’t go so far as defending Microslop but I do get tired of the Apple fanboys accusing Windows of being bloated and running poorly.

    They seem to defend Apple’s 8GB machines by saying that Apple systems perform better than Windows with the same amount of RAM. This claim is entirely unsubstantiated.

    Windows has a lot of problems but performance and memory efficiency is not one of them. We should recall that Microsoft actually reduced RAM usage and minimum requirements between windows 7 and 8 as they wanted to get into the tablet game, and Windows has remained efficient with memory since then as Microsoft wants Windows to come with cheap Chromebook-like hardware and other similar low-end systems.

    • akdev1l 8 hours ago

      MacOS handles memory pressure better than Linux imo (at least for interactive use cases)

      I have seen MacOS overcommit up to 50% of memory and still have the system be responsive.

      Yesterday I filled up my ram accidentally on Fedora and even earlyoom took several minutes to trigger and in the meantime the system was essentially non-responsive

      • mcswell 8 hours ago

        The plural of 'anecdote' is not 'data'.

        • dingaling 5 hours ago

          It's exactly what it is

          How do you think data is created? It's lots of anecdotes, normalised.

      • DullJZ 8 hours ago

        macOS swaps to the solid-state drive to extend virtual memory. I can run multiple browsers and IDEs smoothly on my 8GB MacBook.

      • worthless-trash 8 hours ago

        This is with earlyoom/systemd-oomd enabled ?

        • akdev1l 3 hours ago

          Yeah, Fedora ships systemd-oomd.

          It did eventually work, but it took a while. Somehow it did not kill the culprit runaway processes, but it killed enough other things for me to regain control of the system.
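
          For what it's worth, systemd-oomd's trigger points can be tightened in /etc/systemd/oomd.conf; these keys come from oomd.conf(5), but the values below are illustrative, not recommendations:

          ```
          # /etc/systemd/oomd.conf -- act earlier than the defaults
          [OOM]
          SwapUsedLimit=80%
          DefaultMemoryPressureLimit=50%
          DefaultMemoryPressureDurationSec=10s
          ```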

        • kergonath 8 hours ago

          From my experience it does not help much, and I still get occasional freezes when a program misbehaves on Linux. It’s not a huge problem, but it is a problem and it exists; I have been dealing with it for about 15 years with no significant improvement.

          • worthless-trash 7 hours ago

            The earlyoom/oomd changes are quite recent.. I've had a 'better' experience, but I guess it's not really fixed yet.

    • morphid-rabbit 8 hours ago

      Best kernel recommendations? Mainly for extremely long-running (2+ years) SaaS applications; stability overall. It's running a handful of Docker containers and some binaries.

      • embedding-shape 8 hours ago

        Linux LTS if stability is the most important. Realistically, a "normal" Linux kernel works for it too; I'm not sure you have to care as deeply about which kernel you use as you seem to.

        On my desktop I use linux-cachyos-bore-lto, which seems to give me a slight performance boost in compilation times compared to the regular kernel, but I've had at least one crash that I've been unable to attribute to any other specific issue, so it could be the kernel, I suppose. I wouldn't use it on a server nonetheless.

      • elcritch 8 hours ago

        FreeBSD is touted for long running and stability.

      • dangus 8 hours ago

        For desktop use, I find that being on the latest stable kernels is best because you get things like recent AMD graphics drivers and support for recent hardware and laptops. I’m in an arch-based distro and my kernel updates all the time. I’ve never had an issue. The stability benefits of LTS seem completely useless in comparison. Just my opinion though.

        If you’re running applications as in a server that’s an entirely different discussion. I have been assuming we are talking about desktop users who are not serving anything.

        E.g., if I go out and buy a 2026 Panther Lake laptop with a new WiFi 7 chip or what have you, I’m going to want a distro with the latest kernels so that I don’t have hardware issues. If I install the default Ubuntu download it’s going to almost certainly have problems.

    • dryarzeg 8 hours ago

      I am not an Apple fan, so I'll just tell my story. And yeah, it may be biased, and I may not understand something important.

      So around ~3 years ago I bought a lightweight low-end laptop (Intel Core i3, 14 inch display, 8GB of RAM) for everyday stuff, so I could easily bring it with me everywhere I'd need it. It came with Windows 11 pre-installed. Now, for context: ~10 years ago I had a Windows 7 system and it was pretty neat. And I remember that when people were switching from Windows 7 to Windows 8 or 10, they blamed the new OS version, just like Windows 11 is blamed right now; yet everyone got used to it, it received fixes, improvements, etc. So I thought "well, maybe Windows 11 is not so bad, I should try it out, at least for the sake of curiosity".

      And now, the clean installation of Windows 11 that came with it took ~20 seconds to fully boot up to the login window. I know my laptop is not the best of the best, but still... After startup, with no apps open, there was ~4 GB of RAM usage just out of nowhere, so effectively I was limited to ~4GB of RAM for anything I wanted to run. The Bluetooth drivers were terrible (at the time): sometimes I was able to connect to my headphones and sometimes I wasn't, while they worked with all of my other devices perfectly. Then there was also this hellish "Antimalware Service Executable", and I know how it sounds, I have nothing against anti-virus software, but when it randomly shows up several times per day, eats all of your processing power (~80% CPU usage, and note that I have 8 cores at ~3 GHz here), and heats up your laptop to the point that the fan starts screaming... that was not very good, to put it mildly. Battery life was also a disappointment: sometimes it couldn't last even 3 hours, while the heaviest thing I was doing during that time was compiling some software.

      I tried, I re-configured, I applied patches... and finally I got fed up with all of this bloat, broken updates and other garbage. So I backed up all of my important files and data to an external drive and installed Linux Mint (because in this particular case I just needed a working laptop). And wow, it just worked! Now at startup I get ~1 GB of RAM usage at most (this actually depends on the DE I use, so the numbers vary), battery life improved, no more weird Bluetooth issues, no more random bloatware... it just works, and that's it.

      I know that distros like Mint are focused on stability and efficiency, so maybe the comparison is a bit unfair. But hell, even though I don't have anything against Windows 7 or Windows 8, the recent Windows 11 is a real combination of bloatware and spyware. So performance and memory efficiency actually are a problem here. Or at least they were the last time I tried it.

      Now, again, I may be wrong somewhere, maybe I missed something out. If I did - please point it out.

      • dangus 8 hours ago

        I just logged in to my MacBook Air M2 (24GB RAM) with no programs open and it’s reserving 8.3GB of RAM and using 500MB of swap.

        My Framework laptop running CachyOS with KDE Plasma with nothing open except System Monitor reserves 4GB with 500MB in swap (I enabled swap for sleep to hibernate, normally there’s no swap).

        Reserving RAM doesn’t mean there’s a performance problem.

        Most of the things you’re talking about in your comment have nothing to do with RAM usage and memory efficiency. You’re complaining about some annoying preinstalled OEM software [1], bad drivers, fan noise, battery life, and windows updates. That stuff isn’t great but a lot of it doesn’t have anything to do with Windows RAM efficiency itself.

        If you download the Windows ISO from Microsoft and clean install you’ll have a pretty nice experience. I think Microsoft needs to crack down on OEM software additions.

        As far as slow boot up times/slow initial setup I’ll remind you that Macs also have that as an issue during first boot and spend a lot of time doing initial indexing.

        Linux mint is a great distro and I also prefer Linux to both Mac and Windows as well. Mostly my commentary is on the subject of people claiming Microsoft Windows is bad with RAM when we now see some Linux distros asking for more RAM than Windows. I think it’s quite clear that RAM isn’t the problem with Windows, it’s a lot of other things and the surrounding ecosystem.

        [1] I have to assume you’re talking about some third party antimalware program because the Microsoft one absolutely does not behave how you describe.

        • dryarzeg 7 hours ago

          > Reserving RAM doesn’t mean there’s a performance problem.

          It does in my own experience (so it may not be a problem for you, I agree, but it is a problem for me). Because when the OS allocates ~50% of RAM for itself and doesn't let it go, other software simply can't use it. Therefore, you're limited. Your potential performance is capped at a certain level just because your OS decided to allocate half or more of your system RAM. Why? Well, just because it wants to.

          > have nothing to do with RAM usage or performance

          Well, to be honest, most of them don't. But would you please explain, then, why it takes around 20 seconds just to boot up, while the aforementioned Linux Mint (and I'll clarify that it's currently 22.3 for me, the latest version; it was 22.1 at the time, as far as I remember) takes only ~3-4 seconds to get me to the login screen and then another second (at most) to load everything after I have logged in? Could you also, please, explain how it happens that even GNOME's Nautilus file explorer takes less RAM and far less CPU than Microsoft's Explorer (and I won't even mention Thunar, that's kinda unfair)? What about the "Start" menu in Windows, which spikes the CPU just by opening/closing? There are a lot of performance issues, with both RAM and CPU usage.

          I'm not saying that these problems are unique to Windows, no; but saying that Windows doesn't have any performance issues is not really true.

          > I think it’s quite clear that RAM isn’t the problem with Windows, it’s a lot of other things and the surrounding ecosystem.

          I agree with you here. That's true. A large part of the problem comes not from the actual operating system but from the application software. I once thought: well, if the RAM shortage lasts longer than just one or two years, that will be bad, but also, maybe, just maybe, some software developers will start to think at least a bit more about optimization...

        • dryarzeg 7 hours ago

          > [1] I have to assume you’re talking about some third party antimalware program because the Microsoft one absolutely does not behave how you describe.

          Editing without specifying that you have edited your reply is not very good, you know. But okay.

          Actually, I'm talking about the Windows-shipped Microsoft Defender process (at least it seems to come from Microsoft Defender). I had not seen anything third-party installed on my laptop at the time, and it behaved exactly as I described. I should also remind you that it is a low-end laptop with an Intel Core i3-N305; it's not the most powerful CPU in the world, just 8 cores, 8 threads and a 3.80 GHz max boost frequency.

          If you think that I'm lying, then just search for "antimalware executable high CPU usage" in any search engine. You will find plenty of complaints and even some guides on how to deal with it.

          • dangus 3 hours ago

            Does it behave like this all the time or just at specific moments?

            I find on my Windows system it's only doing things when specific actions are happening.

            Right now the antimalware executable process is using 196.4 MB of memory and 0% CPU for me as I type this.

            When I download an executable from the Internet and run it, the CPU usage spikes to 8-10% briefly and the RAM usage goes up by 30MB or so.

            I have a much higher-end CPU than that, 6 cores 12 threads (AMD Ryzen 5600X3D)

            In my experience the executable is pretty much doing nothing unless I'm opening up an exe that's trying to elevate privileges or if it's doing an active periodic scan.

    • Alifatisk 8 hours ago

      > I wouldn’t go so far as defending Microslop

      > I do get tired of the Apple fanboys accusing Windows of being bloated and running poorly.

      > Windows has a lot of problems but performance and memory efficiency is not one of them.

      I can't even describe how much your experience differs from mine. I would never have imagined someone uttering such a sentence about Windows in this day and age.

      For everyone else reading this, a couple of pieces of advice I have gotten that made me suffer less with Windows: replace Windows Search with Everything (by voidtools) and replace Explorer with File Pilot (filepilot.tech).

      On a older machine, I switched to Tiny10.

      • dangus 3 hours ago

        I also use Everything. But I'm not sure how Windows Search not being the best has much to do with the system's overall RAM usage.

        Explorer works fine for me but File Pilot does look cool. I'll give it a try. (Good luck replacing Finder on Mac, is that even possible?)

        I only use Windows for desktop and if I was clean installing I'd probably switch it to Linux. My laptop is Linux then I share a macOS system with my partner which I occasionally use for things that require Mac.

        I wouldn't say I suffer at all with Windows. It's fine. It runs, it performs well, it's stable. I can't speak to other people who have different experiences. I usually assume they're using some kind of OEM abomination while I used the plain ISO downloaded from Microsoft, and I've already gone through the ~10 minutes of effort to turn off the annoying stuff.

        I sold my personal Mac and switched to Linux on Framework 13" after Liquid Glass came out. It was almost as jarring and poorly executed as Windows 8. Well, okay, maybe that's going too far.

        (The other problem with my MacBook was the tiny amount of storage was growing difficult to work with, much easier to toss a 2TB SSD into a Framework and finally be done with worrying about storage)

    • anthk 8 hours ago

      Both Ubuntu and Trisquel have backports for mainline and LTS kernels. Also, GNU/Linux has ZRAM; 4GB can work like 6.

  • lokinorkle 7 hours ago

    Switched to SuSE a few years ago, still love it

  • shevy-java 8 hours ago

    Linux needs to go back to engineering again.

  • a155 7 hours ago

    Maybe if FOSS was less focused on reverse engineering proprietary technology they could make products people LIKE. I say this as someone who learned about firmware because of several listeners and one group having the aim of reverse engineering my new Apple ecosystem that is now falling apart after signal traps. My crime was working for an ISP and the media, but I reported on Scienos not techbros. Yawn.

    I knew they were fucking with my virtual memory cause theirs sucks, the partition schemes on this Mac mini were ridiculous and the helpers weren’t stealing my information.

  • curt15 8 hours ago

    Given that efficiency is one of Linux's most touted advantages, what in the world is Ubuntu's PR department thinking? Ubuntu isn't providing any more functionality than when its memory requirement was 4GB. What is hogging all that extra RAM?

    • embedding-shape 8 hours ago

      > what in the world is Ubuntu's PR department thinking?

      The same as any other corporate PR department: "At least now when people run it with N GB of RAM, we can just point to the system requirements and say 'This is what we support' rather than end up in a back-and-forth"

      If you expect them to have any sort of long-term outlook on "Lets be careful with how developers view our organization", I think you're about a decade too late for Canonical.

    • shadowgovt 8 hours ago

      No official reason given, so all the tech press is basically speculating (if someone finds a source that does a teardown, please share; I can't seem to locate one). I think my favorite piece of speculation is that it reflects an anticipated modern workload of using the OS as a vector to launch a web browser and open multiple tabs in it, which is just going to be a memory hog as experienced by most Ubuntu users.

    • imtringued 7 hours ago

      I don't know what Ubuntu is doing with the RAM, but on my work laptop running Ubuntu I'm constantly swapping with all 16GB of RAM filled.

      At home I have a desktop running Arch plus Gnome with 32GB RAM and I am at 7GB on a normal day and below 16GB at all times unless I run an LLM.

    • crest 8 hours ago

      The sad answer is: nobody cares.

    • dangus 8 hours ago

      Besides the correct answer that Canonical sucks, I would argue that “efficiency” is not a selling point to get someone to use a desktop operating system.

      Mainstream users and business organizations don’t really understand that concept and would prefer to see how the operating system enables their use cases and workflows.