Software Rot

(permacomputing.net)

102 points | by pabs3 7 hours ago

85 comments

  • noduerme 2 hours ago

    I wish I could write all the business logic I write on an NES and never have to worry about requirements going bad. I guess the thing is, if you're writing anything on top of a network layer of any kind, eventually it's going to require patches unless you literally own all the wires and all the nodes in the network, like a secure power plant or some money clearing system in a bank that's been running the same COBOL since the 1960s. And since you're probably not writing code that directly interfaces with the network layer, you're going to be reliant on all the libraries that do, which in turn will be subject to change at the whims of breaking changes in language specs and stuff like that, which in turn are subject to security patches, etc.

    In other words, if you need your software to live in the dirty world we live in, and not just in a pristine bubble, things are gonna rot.

    Picking tools, libraries, and languages that will rot less quickly, however, seems like a good idea. To me, that means not chaining myself to anything that hasn't been around for at least a decade.

    I got royally screwed because 50-60% of my lifetime code output before 2018, and pretty much all the large libraries I had written, were in AS3. In a way, having so much code I would have maintained become forced abandonware was sort of liberating. But now, no more closed source and no more reliance on any libs I don't roll or branch and heavily modify myself.

  • forgotmypw17 5 hours ago

    This and the Lindy Effect factor a lot into my choices of what to use for my projects. My choices for a project I want to be as maintenance-free as possible are special subsets of ASCII/txt, SQLite, Perl, Bash, PHP, HTML, JS, and CSS. The subsets I choose are the parts of these languages that have persisted the longest.

    Using the Lindy Effect for guidance, I've built a stack/framework that works across 20 years of different versions of these languages, which increases the chances of it continuing to work without breaking changes for another 20 years.

    • eviks 4 hours ago

      This dogmatic approach means you lose out on ergonomics by using poorly designed tools like bash and perl, so you incur those costs all the time for little potential benefit far away in the future (after all, that effect is just a broad hypothesis)

      • argomo 4 hours ago

        It has to be weighed against all the time spent learning, evaluating, and struggling with new tools. Personally, I've probably wasted a lot of time learning /new/ that I should have spent learning /well/.

        • eviks 3 hours ago

          Right, nothing is free, but switching costs are a different argument.

      • blueflow 29 minutes ago

        It's not "far away in the future". Every other IT job right now is supporting, maintaining, and fixing legacy software. These are the software choices of the past, and you pay for them in manpower.

      • zeta0134 3 hours ago

        Very helpfully, Python has stuck around for just as long and is almost always a better choice than these two specific tools for anything complicated. It's not perfect, but I'm much more likely to open a random Python script I wrote 6 years ago and at least recognize what the basic syntax is supposed to be doing. Bash beyond a certain complexity threshold is... hard to parse.

        Python's standard library is just fine for most tasks, I think. It's got loads of battle-tested parsers for common formats. I use it for asset conversion pipelines in my game engines, and it has so far remained portable between Windows, Linux, and Mac systems with no maintenance on my part. The only unusual package I depend on is Pillow, which is also decently well maintained.
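        Just to illustrate the shape of what I mean, a conversion step in that pipeline looks roughly like this (the paths and the 256px cap are made up for the example; Pillow is the only non-stdlib dependency):

            from pathlib import Path
            from PIL import Image  # Pillow

            SRC, OUT = Path("assets/src"), Path("assets/build")
            OUT.mkdir(parents=True, exist_ok=True)

            for png in SRC.glob("*.png"):
                img = Image.open(png).convert("RGBA")   # normalize pixel format
                img.thumbnail((256, 256))               # clamp to a max texture size
                img.save(OUT / png.name, optimize=True)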

        It becomes significantly less ideal the more pip packages you add to your requirements.txt, but I think that applies to almost anything really. Dependencies suffer their own software rot and thus vastly increase the "attack surface" for this sort of thing.

        • sebtron 3 hours ago

          Python is a very bad example because of the incompatibility between Python 2 and Python 3. All my pre-2012 Python code is now legacy because of this, and since most of it is not worth updating I will only be able to run it as long as there are Python 2 interpreters around.
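          To make it concrete, these lines run on both interpreters but mean different things (illustrative snippets, not from my actual code):

              print(1 / 2)                          # Python 2: 0 (floor division)   Python 3: 0.5
              print(type(b"data") == type("text"))  # Python 2: True                 Python 3: False (bytes/str split)
              # and "print" itself was a statement in Python 2, so plenty of
              # old code doesn't even parse on a Python 3 interpreter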

          I like Python as a language, but I would not use it for something that I want to be around 20+ years from now, unless I am ok doing the necessary maintenance work.

        • forgotmypw17 3 hours ago

          My main problem with python is that a script I wrote 6 years ago (or even 1 year ago) is not likely to run without requiring modifications.

          If it's me running it, that's fine. But if it's someone else that's trying to use installed software, that's not OK.

          • Falkon1313 2 hours ago

            It depends largely on what you're doing with it. True, I would never want to have to talk a customer through setting up and running a python system. I know there are ways to package them (like 37 different ways), but even that is confusing.

            However, a decade ago, a coworker and I were tasked with creating some scripts to process data in the background, on a server that customers had access to. We were free to pick any tech we wanted, so long as it added zero attack surface and zero maintenance burden (aside from routine server OS updates). Which meant decidedly not the tech we work with all day every day which needs constant maintenance. We picked python because it was already on the server (even though my coworker hates it).

            A decade later and those python scripts (some of which we had all but forgotten about) are still chugging along just fine. Now in a completely different environment, different server on a completely different hosting setup. To my knowledge we had to make one update about 8 years ago to add handling for a new field, and that was that.

            Everything else we work with had to be substantially modified just to move to the new hosting. Never mind the routine maintenance every single sprint just to keep all the dependencies and junk up to date and deal with all the security updates. But those python scripts? Still plugging away exactly as they did in 2015. Just doing their job.

          • esseph 2 hours ago

            This is one of the problems that containers help solve - no OS, just the dependencies required to run your code.

            • zeta0134 2 hours ago

              Python even has venv and other tooling for this sort of thing. Though, admittedly I seem to have dodged most of this by not seriously writing lots of python until after Python3 had already happened. With any luck the maintainers of the language have factored that negative backlash into future language plans, but we'll see.
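              A minimal sketch of what I mean, using only the stdlib (the .venv path and requirements.txt name are just the usual conventions, and the pip path shown is the POSIX one):

                  import subprocess, venv

                  venv.create(".venv", with_pip=True)   # stdlib virtual environment (Python 3.3+)
                  subprocess.run([".venv/bin/pip", "install", "-r", "requirements.txt"],
                                 check=True)            # pin deps inside the environment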

              Mostly I recoiled in horror at bash specifically, which, in addition to the bash version, also ends up invisibly depending on a whole bunch of external environment stuff that is also updating constantly. That's sort of bash's job, so it's still arguably the right tool to write that sort of interface, but it ends up incredibly fragile as a result. Porting a complex bash script to a different distro is a giant pain.

              • gbin 2 hours ago

                How many times has virtualenv/pipenv/pyenv/... changed, though? And the package management too, between wheels and setup.py and all the breakages.

                Even for somebody who did not aim to keep Python programs running for 20 years, Python is definitely not a good example of a "PDF for programs".

                • aragilar 28 minutes ago

                  I think if you had chased every single latest hotness then you would have hit lots of breakages, but depending on what you are doing and where you are running (and what dependencies you are using), I think you could easily have something from 10-15 years ago work today. Part of the trick would have been being aware enough to pick the boring long-term options (but at some level that applies to every language and ecosystem); the other part is understanding what the tools are actually doing and how they are maintained.

              • esseph 2 hours ago

                uv means you have to download something and put it together.

                A container has that already done, including all supporting libraries.

                Edit: then ship the bash script in a container with a bash binary ;)

            • cjfd 2 hours ago

              It is also quite possible for old containers to no longer build.

              • esseph 2 hours ago

                That's why your build pipeline alerts you when tests no longer work, and then you have a release of the previous build still available for download at any time. This is how containers are released!

                • cjfd an hour ago

                  Sure. It still is burdensome, though. Now there are lots of nightly builds from old projects that break at random times and require developer attention.

            • saurik 2 hours ago

              But now I have frozen an old language runtime and a bunch of old libraries into my environment, all of which are not just security hazards but interoperability landmines (a common one being lack of support for a new TLS standard).

              • esseph 2 hours ago

                Write a wrapper, don't expose the container.

                These are different problems from the distribution/bundling piece, they won't be solved the same way.

      • forgotmypw17 4 hours ago

        Is it still dogmatic if I consider Perl to be well-designed and have already evaluated more popular tools?

        • eviks 3 hours ago

          If "this and the Lindy Effect" do not "factor a lot", but instead the major factor is that you believe Perl is better designed, then no: the dogmatism of a vague future risk is replaced with the pragmatism of immediate usefulness.

        • jwrallie 3 hours ago

          On the point being discussed, which is not breaking backward compatibility, it is indeed arguably better than more popular tools, and I believe Perl has other advantages too.

      • oguz-ismail 2 hours ago

        > poorly designed tools like bash and perl

        Skill issue, plus what's the alternative? Python was close until the 3.x fiasco

        • eviks 2 hours ago

          Indeed, double skill issue: one with the designers and the other one with the users having poor tool evaluation skills

          • oguz-ismail 2 hours ago

            > poor tool evaluation skills

            Both tools in question are installed everywhere and get the job done. There isn't much to evaluate, and nothing to compare against

            • eviks an hour ago

              > There isn't much to evaluate, and nothing to compare against

              These are exactly the skill issues I meant! Git gud in evaluating and you'll be able to come up with many more sophisticated evaluation criteria than the primitive "it's installed everywhere"

              • legends2k 29 minutes ago

                While there are other parameters I would consider, like maintainability, ergonomics, mind share, ease of deployment, etc., the ubiquitous-availability point trumps most of the others. Installing a new toolchain is usually a hassle when the same task can be done with existing tools. Also, when I present it in a company setting, installing new software and broadening the security attack surface is the first pushback I get.

    • akkartik 5 hours ago

      For similar considerations + considerations of code size and hackability, I lately almost always prefer the Lua eco-system: https://akkartik.name/freewheeling

  • codeflo 2 hours ago

    We as an industry need to seriously tackle the social and market dynamics that lead to this situation. When and why has "stable" become synonymous with "unmaintained"? Why is it that practically every attempt to build a stable abstraction layer has turned out to be significantly less stable than the layer it abstracts over?

    • dgoldstein0 2 hours ago

      So one effect I've seen over the last decade of working: if it never needs to change, and no one is adding features, then no one works on it. If no one works on it, and people quit / change teams / etc, eventually the team tasked with maintaining it doesn't know how it works. At which point they may not be suited to maintaining it anymore.

      This effect gets accelerated when teams or individuals make their code more magical or even just more different than other code at the company, which makes it harder for new maintainers to step in. Add to this that not all code has all the test coverage and monitoring it should... It shouldn't be too surprising that there are always some incentives to kill, change, or otherwise stop supporting what we shipped 5 years ago.

      • codeflo an hour ago

        That's probably true, but you're describing incentives and social dynamics, not a technological problem. I notice that every other kind of infrastructure in my life that I depend upon is maintained by qualified teams, sometimes for decades, who aren't incentivized to rebuild the thing every six months.

    • bloppe 2 hours ago

      At any given moment, there are 6 LTS versions of Ubuntu. Are you proposing that there should be more than that? The tradeoffs are pretty obvious. If you're maintaining a platform, and you want to innovate, you either have to deprecate old functionality or indefinitely increase your scope of responsibilities. On the other hand, if you refuse to innovate, you slide into obscurity as everyone eventually migrates to more innovative platforms. I don't want to change anything about these market and social dynamics. I like innovation

    • Ygg2 2 hours ago

      > When and why has "stable" become synonymous with "unmaintained"?

      Because the software ecosystem is not static.

      People want your software to have more features, be more secure, and be more performant. So you and every one of your competitors are on an update treadmill. If you ARE standing (aka being stable) on the treadmill, you'll fall off.

      If you are on the treadmill you are accumulating code, features, and bug fixes, until you either get too big to maintain or a faster competitor emerges, and people flock to it.

      Solving this is just as easy as proving all your code is exactly as people wanted AND making sure people don't want anything more ever.

      • codeflo an hour ago

        > People want your software to have more features, have fewer bugs, and not be exploited. So you and every one of your competitors are on an update treadmill. If you ARE stable, you'll probably fall off. If you are on the treadmill you are accumulating code, features, bug fixes, until you either get off or a faster competitor emerges.

        Runners on treadmills don't actually move forward.

        • Ygg2 an hour ago

          Kinda the point of the treadmill metaphor. If you are standing on a treadmill, you will fall right off. It requires great effort just to stay in one spot.

  • foxrider 3 hours ago

    The Python 2 situation opened my eyes to this. To this day I see a lot of py2 stuff floating around, especially around work environments. So much so, in fact, that I had to make scripts that automatically pull the sources of 2.7.18 and build them in the minimal configuration to run stuff.
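    Roughly, the scripts boil down to something like this (a simplified sketch, not the actual ones; the URL is python.org's release tarball, and the flags for a truly minimal configure are left out):

        import os, subprocess, urllib.request

        VER = "2.7.18"
        tarball = "Python-%s.tgz" % VER
        urllib.request.urlretrieve(
            "https://www.python.org/ftp/python/%s/%s" % (VER, tarball), tarball)
        subprocess.run(["tar", "xzf", tarball], check=True)

        src = "Python-" + VER
        prefix = os.path.abspath("py2")  # private install, nothing system-wide
        subprocess.run(["./configure", "--prefix=" + prefix], cwd=src, check=True)
        subprocess.run(["make", "-j4"], cwd=src, check=True)
        subprocess.run(["make", "install"], cwd=src, check=True)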

    • Ygg2 2 hours ago

      Python 2 is a warning about making backwards-incompatible changes too late. As soon as you have a few massive libraries, your backward compatibility risks grow exponentially.

      C# made just as big a change by going from type-erased to reified generics. It broke the ecosystem in two (pre- and post-reified generics). No one talks about it because the ecosystem was so, so tiny that no one encountered it.

      • saurik 2 hours ago

        It certainly didn't help that they were annoying about it; like, they actively dropped some of the forward compatibility they had added (a key one being if you had already carefully used u and b prefixes on strings) in Python 3.0, and only added it back years later after they struggled to get adoption. If they had started their war with Python 3.5 instead of 3.0 it would be a lot less infuriating.
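        Concretely, the forward-compat idiom in question was just explicit prefixes that mean the same thing on 2.7 and on 3.3+ (illustrative lines, not anyone's real code):

            name = u"caf\xe9"   # explicitly unicode text: same meaning on 2.7 and 3.3+
            blob = b"\x89PNG"   # explicitly raw bytes: same meaning on 2.7 and 3.x
            # Python 3.0-3.2 rejected the u"" prefix outright (PEP 414 only
            # restored it in 3.3), so code written this carefully still broke.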

        • flomo an hour ago

          I'm not a Python dev, but there must have been some huge superficial 'ick'. Back when, I was talking to a Python guy and mentioned that Python 3 was coming out. He said something like "we're just going to ignore that until they sober up and fix it." Which it seems like a lot of people actually did. (Or they really sobered up and rewrote in Go or something.)

      • vrighter an hour ago

        When did C# have type erasure?

        • Ygg2 40 minutes ago

          I do recall some paper mentioning it. But now I'm not sure if Google is gaslighting me or it never existed. But it seems you are right.

  • icameron 3 hours ago

    Nobody has a better ecosystem of “industrial marine grade code rot resistance” than Microsoft. That I can run the same .NET web app code compiled 20 years ago on a new Server 2025 is an easy experience unequaled by others. Or the same 30 year old VBA macros still doing their thing in Excel 365. There’s a company that knows how to do backwards compatibility.

    • js8 2 hours ago

      As somebody who works on IBM mainframes, I disagree. IBM is probably the best at forward application compatibility.

      People will laugh, but they should really look.

  • account42 17 minutes ago

    The OS or libraries changing is one example of software rot, but another is requirements changing, and you can't completely eliminate that.

  • rgmerk 2 hours ago

    You can't build permanent software in a world where a) everything is connected to everything else, and b) hackers will exploit anything and everything they can get their hands on.

  • Daub 4 hours ago

    As a software user and teacher, I think about software rot a lot. My concern is that it has a tendency to grow by addition rather than replacement. New features are added whilst the fundamental limits of the architecture are left unattended to.

    The reason that Blender grew from being an inside joke to a real contender is the painful refactoring it underwent between 2009 and 2011.

    In contrast, I can feel the fact that the code in After Effects is now over 30 years old. Its native tracker is slow and ancient and not viable for anything but the simplest of tasks. Tracking was 'improved' by subcontracting the task to a sub-licensed version of Mocha via a truly inelegant integration hack.

    There is so much to be said for throwing everything away and starting again, like Apple successfully did with OS X (and Steve Jobs did to his own career when he left Apple to start NeXT). However, I also remember how BlackBerry tried something similar and in the process lost most of their voodoo.

  • xenodium an hour ago

    While Emacs itself is not entirely immune to software rot (external dependencies and all), it's truly amazing how little rot is experienced by Elisp software (packages). If you find an Emacs package written 15 years ago, the chances of it successfully running out of the box are incredibly high.

    • zkry 43 minutes ago

      I was thinking the exact same thing. As long as you're not depending on any external packages, things are very stable. Like, if your package depends on adding advice to some other package's random internal function, then yeah, it could easily break.

      It's a great feeling knowing any tool I write in Elisp will likely work for the rest of my life as is.

    • deafpolygon an hour ago

      That... has not been my experience.

  • joegibbs 2 hours ago

    I think you shouldn't worry too much about trying to avoid it. Think 5-10 years ahead max, rather than 20.

    In 20-30 years there's a good chance that what you've written will be obsolete regardless - even if programs from 1995 ran perfectly on modern systems they'd have very few users because of changing tastes. A word processor wouldn't have networked collaborative editing (fine for GRRM though), an image editor wouldn't have PNG support, and they wouldn't be optimised for modern hardware (who would have foreseen 4K screens and GPUs back then - and who knows how we'll use computers in 2055).

    There are also always containers if the system needs those old versions.

    • whizzter 2 hours ago

      2005 is already 25 years ago, and what the author is hinting at is that the difference in difficulty between keeping 1980s software running and keeping 2005 software running is momentous.

      1980s NES software is "easy" as in emulating a CPU and the associated hardware (naturally there are corner cases in emulation timing that make it a lot harder, but it's still a limited system).

      I used to make demos as mentioned in the article; the ones I did for DOS probably all work under DOSBox. My early Windows demos, on the other hand, relied on a "bad" way of doing things with early DirectDraw versions that mimicked how we did things under DOS (i.e., write to the framebuffer ourselves). For whatever reason, the changes in Vista to the display driver model have made all of them impossible to run in fullscreen (luckily I wrote a GDI variant for windowed mode that still makes it possible to run them).

      Even worse is some stuff we handle at an enterprise customer: Crystal Reports was even endorsed by Microsoft and AFAIK included in Visual Studio installs. Nowadays it's abandoned by MS and almost by its owner (SAP), and we've tried to maintain a customized printing application for a customer, relying on obscure DLLs (and, even worse, on the SAP installer builder for some early-2000s install technology that hardly works with modern Visual Studio).

      Both these examples depend on libraries being installed in a full system. Sure, one could containerize the needed ones, but looking at the problem with an archivist's eyes, building custom Windows containers for thousands of pieces of software isn't going to be pretty (or even feasible in a legal sense, both with copyright and activation systems).

      Now you could complain about closed-source software, but a lot of the slightly more obscure *nix software has a tendency to exhibit a "works on my machine" mentality; configure scripts and Docker weren't invented in a vacuum.

      • saurik 2 hours ago

        > 2005 is already 25 years ago

        o_O

  • b_e_n_t_o_n 4 hours ago

    Is it possible that software is not like anything else, that it is meant to be discarded: that the whole point is to always see it as a soap bubble?

    • mmackh 3 hours ago

      Not if software is tied to infrastructure, buildings, etc.

      • b_e_n_t_o_n an hour ago

        Perhaps software should be designed in such a way that despite it working on infrastructure, it can be swapped and discarded.

      • tanduv 3 hours ago

        but even buildings need maintenance

        • spauldo an hour ago

          Automation costs a lot. The projects I work on are almost always in the millions of dollars, and they're far from being considered "big" projects. The hardware manufacturers will sell you equipment that runs for thirty years. Companies are reluctant to replace working systems.

          I replaced a PLC a couple years ago. The software to program it wouldn't run on my laptop because it used the win16 API. It used LL-984 ladder logic, and most people who were experts in that have retired. It's got new shiny IEC-compliant code now, and next they're looking at replacing the Windows 2000 machines they control it with. Once that's done, it'll run with little to no change until probably 2050.

  • bravesoul2 5 hours ago

    JS is hated, but if you compile to browser JS, that code will run in 2100. If you mainly deal with files/blobs rather than databases, you will have those things in 2100 too. I think a lot of apps can be JS plus a Dropbox integration to sync files. Dropbox may rot, but make that a plugin (separate .js file) and offer local read/write too, and I think you'd be pretty future-proof.

    • jillesvangurp 4 hours ago

      Except of course software rot and javascript code bases go hand in hand.

      You seem to assume browsers have stopped changing and will be more or less the same 75 years from now.

      I think you are right that that code might run. But probably in some kind of emulator. In the same way we deal with IBM mainframes right now. Hardware and OS have long since gone the way of the dodo. But you can get stuff running on generic linux machines via emulation.

      I think we'll start seeing a lot of AI-driven code rot management pretty soon. As all the original software developers die off (they've long been retired), that might be the only way to keep these code bases alive. And it's also a potential path to migrating and modernizing code bases.

      Maybe that will salvage a few still relevant but rotten to the core Javascript code bases.

    • mook 2 hours ago

      Things like E4X, sharp variables, and array comprehensions have already been removed; it's just that the mass of newer developers means the average developer doesn't know about them. Unfortunately, it's not like they never remove things.

    • flomo an hour ago

      You are right that your javascript bundle will probably run forever as-is. However, three years later your toolchain will be totally broken and now you are in NPM Hell trying to fix it. Ten years later, good luck.

      (S3 better example than Dropbox. That will mostly be around forever.)

    • tekno45 4 hours ago

      postgres will be around in 2500 lol

      • Kinrany 2 hours ago

        Postgres will get disassembled into independent composable parts, and some other "distribution" of it will be used for a narrower set of use cases that actually require running the database as a standalone process.

  • Falkon1313 an hour ago

    Over the course of my learning and my career, I've kind of gone back and forth on this a bit.

    On the one hand, software is like a living thing. Once you bring it into this world, you need to nurture it and care for it, because its needs, and the environment around it, and the people who use it, are constantly changing and evolving. This is a beautiful sentiment.

    On the other hand, it's really nice to just be done with something. To have it completed, finished, move on to something else. And still be able to use the thing you built two or three decades later and have it work just fine.

    The sheer drudgery of maintenance and porting and constant updates and incompatibilities sucks my will to live. I could be creating something new, building something else, improving something; instead, I'm stuck here doing CPR on everything that I have to keep alive.

    I'm leaning more and more toward things that will stand on their own in the long-term. Stable. Done. Boring. Lasting. You can always come back and add or fix something if you want. But you don't have to lose sleep just keeping it alive. You can relax and go do other things.

    I feel like we've put ourselves in a weird predicament with that.

    I can't help but think of Super Star Trek, originally written in the 1970s on a mainframe, based (I think) on a late-1960s program, the original mainframe Star Trek. It was ported to DOS in the 1990s and still runs fine today. There's not a new release every two weeks. Doesn't need to be. Just a typo or bugfix every few years. And they're not that big a deal. -- https://almy.us/sst.html

    I think that's more what we should be striving for. If someone reports a rare bug after 50 years, sure, fix it and make a new release. The rest of your time, you can be doing other stuff.

  • userbinator 4 hours ago

    > those written for e.g. Linux will likely cease working in a decade or two

    Have we already passed the era of DON'T BREAK USERSPACE when Linus would famously loudly berate anyone who did?

    I suspect Win32 is still a good target for stability; I have various tiny utilities written decades ago that still work on Win11. With the continued degradation of Microsoft, at least there is WINE.

    • whizzter 2 hours ago

      While the early core Win32 parts are still fine, much COM-based stuff will probably be a pain in the future.

      It's not direct breakage per se (APIs were generated from definition files, and there was an encouragement to build new API versions when breaking APIs); the issue will be that many third-party things had to be manually installed from more or less obscure sources.

      Your Office install probably introduced a bunch of COM objects. Third party software that depended on those objects might not handle them being missing.

      I think I took some DOS-like shortcuts with some of my early DirectDraw (DirectX 3 level?) code; afaik it doesn't work in fullscreen past Windows Vista, but _luckily_ I provided a "slow" windowed GDI fallback, so the software still kinda runs at least.

    • SkiFire13 2 hours ago

      The kernel doesn't break userspace, but userspace breaks itself quite often.

    • cookiengineer 3 hours ago

      You can account for that by using Go with CGO_ENABLED=0. Then you have a self-contained binary that relies solely on POSIX syscalls.

    • atq2119 3 hours ago

      No, the issue in Linux is that userspace has traditionally had a tendency to break itself.

    • jwrallie 3 hours ago

      I am considering learning Win32 for that purpose, not that I plan to do anything complicated with it, but for small single purpose tools it should be useful, at least when some kind of UI is needed.

      • rerdavies an hour ago

        Don't do it! Seriously! Terrible awful stuff to use. Even worse to program for!

    • wolvesechoes 3 hours ago

      I would love ReactOS to succeed.

      • Imustaskforhelp 2 hours ago

        If I remember correctly, ReactOS just uses Wine under the hood, which can be used on Linux and even Mac or BSD too.

        Basically (compile to windows?) seems like a good enough tradeoff to run anywhere, right?

        But I prefer AppImage or Flatpak because of the overhead that Wine might introduce, I suppose.

        • wolvesechoes an hour ago

          > If I remember correctly, ReactOS just uses wine under the hood

          Nah. I think they share some effort and the ReactOS team adds patches to the WINE codebase, but it is a separate thing.

  • Copenjin 4 hours ago

    > Software rot is a big issue for cultures that constantly produce new programs

    Cough cough vibing cough cough

    • alexshendi 3 hours ago

      Common Object File Format (COFF)?

      • Copenjin 2 hours ago

        onomatopoeia, fixed spelling :)

  • alexshendi 3 hours ago

    I think once you get rid of dynamic libraries and GUIs your software rot will be greatly reduced.

  • fuzzfactor 4 hours ago

    There were companies not quite worth a $billion who would have never made it that far if they couldn't convince masses of people that platform rot was good for them.

  • alexjurkiewicz 2 hours ago

    It's hard to take this article seriously. We should write software for DOS because we won't need to maintain it post-release?

    Maybe software written in the age of DOS was relatively trivial compared to modern tools. Maybe there's a benefit to writing code in Rust rather than C89.

  • superkuh 3 hours ago

    Unless explicitly addressed, rot rate is proportional to popularity.

    Unpopular targets, platforms, languages, etc. don't get changed and provide a much-needed refuge. There are some interpreted languages like perl where a program written today could run on a perl from 2001 and a program from 2001 would run on perl today. And I'm not talking about in a container or with some special version. I'm talking about the system perl.

    Some popular languages these days can lose forward compatibility (by gaining features, etc.) within just a few months, and every dev will be using those features within a few more months. In these cultures software rot is really fast.

    • wolvesechoes 3 hours ago

      > Unpopular targets, platforms, languages, etc don't get changed

      Ah yes, Windows, some niche OS for hipsters and basement-dwellers.

  • nektro 3 hours ago

    lovely article aside from this bit:

    > while those written for e.g. Linux will likely cease working in a decade or two

    there's nothing to support this claim in practice. linux is incredibly stable

    • ajuc 3 hours ago

      I also noticed this part of the article, but for the opposite reason (I think 10-20 years is overly optimistic). I wrote a small 2D game for Linux back in the 00s, using C++, SDL, and a few other libraries (for example the now-abandoned libparagui for GUI).

      Any time I tried to run it afterwards, I had to recompile it, and a few times I had to basically port it (because libparagui got abandoned and some stuff in libc changed, so I couldn't just compile the old libparagui version against the new libc).

      It's surprisingly hard to make linux binaries that will work even 5 years from now, never mind 10-20 years from now.

      For comparison, I still have games I wrote for DOS and Windows in the 90s and the binaries still work (OK, I had to apply a patch for the Turbo Pascal 7 200 MHz bug).

      The assumption around Linux software is that it will be maintained by an infinite number of free programmers, so you can change anything and people will sort it out.