56 comments

  • crtasm 3 hours ago

    >When you run npm install, npm doesn't just download packages. It executes code. Specifically, it runs lifecycle scripts defined in package.json - preinstall, install, and postinstall hooks.

    What's the legitimate use case for a package install being allowed to run arbitrary commands on your computer?

    Quote is from the researchers' report https://www.koi.ai/blog/phantomraven-npm-malware-hidden-in-i...

    edit: I was thinking of this other case that spawned terminals, but the question stands: https://socket.dev/blog/10-npm-typosquatted-packages-deploy-...
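
    For anyone unfamiliar, these hooks are ordinary entries in a package's package.json, and npm runs them automatically, with the installing user's full permissions, during npm install. A minimal sketch (the script names are hypothetical):

    ```
    {
      "name": "some-dependency",
      "version": "1.0.0",
      "scripts": {
        "preinstall": "node check-env.js",
        "postinstall": "node fetch-extras.js"
      }
    }
    ```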

    • zahlman 32 minutes ago

      > doesn't just download packages. It executes code. Specifically, it

      It pains me to remember that the reason LLMs write like this is because many humans did in the training data.

    • squidsoup 3 hours ago

      pnpm v10 disables all lifecycle scripts by default and requires the user to whitelist packages.

      https://github.com/orgs/pnpm/discussions/8945
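
      For reference, the whitelist can go under the pnpm key in package.json; a minimal sketch (the two names are just examples of packages with install scripts). v10 also has an interactive `pnpm approve-builds` command for populating it:

      ```
      {
        "pnpm": {
          "onlyBuiltDependencies": ["esbuild", "sharp"]
        }
      }
      ```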

      • sroussey an hour ago

        It’s just security theater in the end. You can just as easily put all that stuff in the package files themselves, since a package is installed in order to run its code. That code can then do all the sketchy stuff.

        What’s needed is an entitlements system so a package you install doesn’t do runtime stuff like install crypto mining software. Even then…

        • Mogzol 19 minutes ago

          A package, especially a JavaScript package, is not necessarily installed to run code, at least not on the machine installing it. Many packages will only ever run in the browser, which is already a fairly safe environment compared to running directly on the machine the way lifecycle scripts do.

          So preventing lifecycle scripts certainly limits the number of packages that could be exploited to get access to the installing machine. It's common for javascript apps to have hundreds of dependencies, but only a handful of them will ever actually run as code on the machine that installed them.

        • theodorejb an hour ago

          I would expect to be able to download a package and then inspect the code before I decide to import/run any of the package files. But npm by default will run arbitrary code in the package before developers have a chance to inspect it, which can be very surprising and dangerous.

      • theodorejb 43 minutes ago

        Bun also doesn't execute lifecycle scripts by default, except for a customizable whitelist of trusted dependencies:

        https://bun.com/docs/guides/install/trusted
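
        The whitelist is a top-level field in package.json; a minimal sketch (the package name is just an example of a dependency with a postinstall script):

        ```
        {
          "trustedDependencies": ["node-sass"]
        }
        ```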

      • chrisweekly 2 hours ago

        One of the many reasons to avoid npm; pnpm is better in every way.

    • j1elo 3 hours ago

      Easy example that I know of: the Mediasoup project is a library written in C++ for streaming video over the internet. It is published as a Node package and offers a JS API. Upon installing, it would just download the appropriate C++ sources and compile them on the spot. The project maintainers wanted to write code, not manage precompiled builds, so that was the most logical way of installing it. Note that a while ago they ended up adding downloadable builds for the most common platforms, but for anything else the expectation still was (and is, I guess) to build from source at install time.

      • exe34 3 hours ago

        how hard would it be to say "upon first install, run do_sketchy_shit.sh to install requirements"?

        • lelandbatey 2 hours ago

          People want package managers to do that for them. As much as I think it's often a mistake (if your stuff requires more than expanding archives into different folders to install, then somewhere in the stack something has gone quite wrong), I will concede that because we live in an imperfect world, other folks will want the possibility to "just run the thing automatically to get it done." I hope we can get to a world where such hooks are no longer required one day.

          • exe34 2 hours ago

            yes that's why npm is for them. I'd rather download the libraries that I need one by one.

    • DangitBobby 2 hours ago

      I seem to recall Husky at one point using lifecycle hooks to install the git hooks configured in your repository when running NPM install.

    • interstice 2 hours ago

      Notable times this has bitten me include compiling image compression tools for gulp and older versions of sass, oh and a memorable one with OpenSSL. Downloading an npm package should ideally not also require messing around with C compilation tools.

    • vorticalbox 3 hours ago

      One use case is downloading binaries. For example, mongodb-memory-server [0] will download the MongoDB binary after you have installed it.

      [0] https://www.npmjs.com/package/mongodb-memory-server

      • 8note 3 hours ago

        why would i want that though, compared to just shipping the binary in the package download itself?

        the npm version is decoupled from the binary version, when i want them locked together

        • jonhohle 2 hours ago

          I think it falls into a few buckets:

          A) maintainers don’t know any better and connect things with string and gum until it mostly works, then ship it

          B) people who are smart, but naive and think it will be different this time

          C) package manager creators who think they’re creating something that hasn’t been done before, don’t look at prior art or failures, and fall into all of the same holes literally every other package manager has fallen into and will continue to fall into because no one in this industry learns anything.

  • ashishb 12 minutes ago

    Here's my `npm` command these days. It reduces the attack surface drastically.

    ```
    alias npm='docker run --rm -it -v ${PWD}:${PWD} --net=host --workdir=${PWD} node:25-bookworm-slim npm'
    ```

      - No access to my env vars
      - No access to anything outside my current directory (which likely is a JS project).
      - No access to my .bashrc or other files.
    
    Ref: https://ashishb.net/programming/run-tools-inside-docker/

  • noosphr 7 minutes ago

    A day ago I got downvoted to hell for saying that the JavaScript ecosystem has rotted the minds of developers and that any tools that emulate npm should be shunned as much as possible - they are not solutions, they are problems.

    I don't usually get to say 'I told you so' within 24 hours of a warning, but JS is special like that.

  • gbransgrove an hour ago

    Because these packages fetch their dependencies in lifecycle hooks, even if they are legitimate at the moment there is no guarantee that will stay the case. The owner of those dependencies could get compromised, or be malicious themselves, or be the package owner waiting to flip the switch and make existing versions malicious. It's hard to see how install-time lifecycle hooks can survive in their current form.

  • robpco 3 hours ago

    Alternate article with more detailed description of exploit: https://www.bleepingcomputer.com/news/security/phantomraven-...

  • 650REDHAIR 3 hours ago

    As a hobbyist how do I stay protected and in the loop for breaches like this? I often follow guides that are popular and written by well-respected authors, and I might be too flippant about installing dependencies to solve a pain point that has derailed my original project.

    Somewhat related, I also have a small homelab running local services, and every now and then I try a new technology. Occasionally I’ll build a little thing that is neat and could be useful to someone else, but then I worry that I’m just a target for some bot to infiltrate because I’m not sophisticated enough to stop it.

    Where do I start?

    • jonhohle 2 hours ago

      There are some operating systems, like FreeBSD, where you use the system’s package manager and not a million language-specific package managers.

      I still maintain that pushing this back to library authors is the right thing to do, instead of making it painful for literally millions of end-users. The friction of getting a package accepted into a critical mass of distributions is the point.

    • Etheryte 3 hours ago

      Use dependencies that are fairly popular and pick a release that's at least a year old. Done. If there was something wrong with it, someone would've found it by now. For a hobbyist, that's more than sufficient.

    • numbsafari 2 hours ago

      Don't do development on your local machine. Full stop. Just don't.

      Do development, all of it, inside VMs or containers, either local or remote.

      Use ephemeral credentials within said VMs, or use no credentials. For example, do all your git pulls on your laptop directly, or in a separate VM with a mounted volume that is then shared with the VM/containers where you are running dev tooling.

      This has the added benefit of not only sandboxing your code, but also making your dev environments repeatable.

      If you are using GitHub, use codespaces. If you are using gitlab, workspaces. If you are using neither, check out tools like UTM or Vagrant.

      • DyslexicAtheist an hour ago

        you had me at:

        > Don't do development

      • suck-my-spez an hour ago

        Are people actually using UTM to do local development?

        I'm genuinely curious because I casually looked into it so that I could work on some hobby stuff over lunch on my work machine.

        However I just assumed the performance wouldn't be too great.

        Would love to hear how people are set up…

    • uyzstvqs 28 minutes ago

      I'm not sure about NPM specifically, but in general: Pick a specific version and have your build system verify the known good checksum for that version. Give new packages at least 4 weeks before using them, and look at the git commits of the project, especially for lesser-known packages.
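
      With npm specifically, a rough equivalent is pinning exact versions and letting the lockfile's integrity hashes do the checksum verification; a sketch (package name and version are hypothetical):

      ```
      # pin an exact version instead of a ^range
      npm install --save-exact some-lib@2.4.1

      # npm ci installs strictly from package-lock.json, which records a
      # sha512 integrity hash per package; a mismatch fails the install
      npm ci
      ```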

    • evertheylen 40 minutes ago

      If you're on Linux, I've tried to build an easy yet secure way to isolate your system from your coding projects with containers. See https://github.com/evertheylen/probox

    • ajross 3 hours ago

      > As a hobbyist how do I stay protected and in the loop for breaches like this?

      For the case of general software, "Don't use node" would be my advice, and by extension any packaging backend without external audit and validation. PyPI has its oopses too; Cargo is theoretically just as bad but in practice has been safe.

      The gold standard is Use The Software Debian Ships (Fedora is great too; Arch is a bit down the ladder, but not nearly as bad as the user-submitted madness outside Linux).

      But it seems like your question is about front end web development, and that's not my world and I have no advice beyond sympathy.

      > occasionally I’ll build a little thing that is neat and could be useful to someone else, but then I worry that I’m just a target for some bot

      Pretty much that's the problem exactly. Distributing software is hard. It's a lot of work at a bunch of different levels of the process, and someone needs to commit to doing it. If you aren't willing to commit your time and resources, don't distribute it in a consumable way (obviously you can distribute what you built with it, and if it's appropriately licensed maybe someone else will come along and productize it).

      NPM thought they could hack that overhead and do better, but it turns out to have been a moved-too-fast-and-broke-things situation in hindsight.

      • zahlman 26 minutes ago

        > PyPI has its oopses too, Cargo is theoretically just as bad but in practice has been safe.

        One obvious further mitigation for Python is to configure your package installer to require pre-built wheels, and inspect the resulting environment prior to use. Of course, wheels can contain all sorts of compiled binary blobs and even the Python code can be obfuscated (or even missing, with just a compiled .pyc file in its place); but at least this way you are protected from arbitrary code running at install time.
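
        A sketch of that configuration, assuming pip and a requirements.txt:

        ```
        # refuse to build from sdists; fail instead of running a build backend at install time
        pip install --only-binary=:all: -r requirements.txt
        ```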

      • paulryanrogers 32 minutes ago

        Didn't Debian ship a uniquely weak version of OpenSSL for years? HeartBleed perhaps?

        IME Debian is falling behind on security fixes.

        • ajross 28 minutes ago

          They did, and no one is perfect. But Debian is the best.

          FWIW, the subject at hand here isn't accidentally introduced security bugs (which affect all software and aren't well treated by auditing and testing). It's deliberately malicious malware appearing as a dependency to legitimate software.

          So the use case here isn't Heartbleed, it's something like the xz-utils trojan. I'll give you one guess as to who caught that.

      • squidsoup 3 hours ago

        Having spent a year trying to develop against only the dependencies provided by a Debian release, I can say it is really painful in practice. At some point you're going to need something that is not packaged, or newer than the packaged version in your release.

        • LtWorf 33 minutes ago

          That's when you join debian :)

        • ajross 2 hours ago

          It really depends on what you're doing. But yes, if you want to develop in "The NPM Style" where you suck down tiny things to do little pieces of what you need (and those things suck down tiny things, ad infinitum) then you're naturally exposed to the security risks inherent with depending on an unaudited soup of tiny things.

          You don't get secure things for free, you have to pay for that by doing things like "import and audit software yourself" or even "write simple utilities from scratch" on occasion.

      • megous 35 minutes ago

        As a hobbyist (or professionally) you can also write code without dependencies outside of Node itself.

  • severino an hour ago

    I wonder what one could do if they want to use NPM for programming with a very popular framework (like Angular or Vue) and stay safe. Is just picking a not-very-recent version of the top-level framework (Angular, etc.) enough? Is it possible to somehow isolate NPM so the code it runs, like those postinstall hooks, doesn't mess with your system, while at the same time allowing you to use it normally?

    • theodorejb 35 minutes ago

      One option to make it a little safer is to add ignore-scripts=true to a .npmrc file in your project root. Lifecycle scripts then won't run automatically. It's not as nice as pnpm or Bun, though, since this also prevents your own postinstall scripts from running (not just those of dependencies), and there's no way to whitelist trusted packages.
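
      That is, a one-line .npmrc next to your package.json:

      ```
      ignore-scripts=true
      ```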

  • edoceo 4 hours ago

    Happy I keep a mirror of my deps that I have to "manually" update. But also, the download numbers are not really accurate for actual install counts - for example, each test run could increment them.

  • worik 14 minutes ago

    This has been going on for years now.

    I have used Node, but I would not go near the NPM auto-install spyware service.

    How is it possible that people keep this service going, when it has been compromised so regularly?

    How's it possible that people keep using it?

  • cxr 3 hours ago

    Imagine if we had a system where you could just deposit the source code for a program you work on into a "depository". You could set it up so your team could "admit" the changes that have your approval, but it doesn't allow third parties to modify what's in your depository (even if it's a library that you're using that they wrote). When you build/deploy your program, you only compile/run third-party versions that have been admitted to the depository, and you never just eagerly fetch other versions that purport to be updates right before build time. If there is an update, you can download a copy and admit it to your repo at the normal time that you verify that your program actually needs the update. Even if it sounds far-fetched, I imagine we could get by with a system like this.

    • chrisweekly 3 hours ago

      You're describing a custom registry. These exist IRL (eg jFrog Artifactory). Useful for managing allow-listed packages which have met whatever criteria you might have (eg CVE-free based on your security tool of choice). Use of a custom registry, and a sane package manager (pnpm, not npm), and its lockfile, will significantly enhance your supply-chain security.

      • cxr an hour ago

        No. I am literally describing bog standard use of an ordinary VCS/SCM where the code for e.g. Skia, sqlite, libpng, etc. is placed in a "third-party/" subdirectory. Except I'm deliberately using the words "admit" and "depository" here instead of "commit" and "repository".

        Overlay version control systems like NPM, Cargo, etc. and their harebrained schemes involving "lockfiles" to paper over their deficiencies have evidently totally destroyed folks' ability to conceive of just using an SCM like Git or Mercurial to manage source the way that they're made for without introducing a second, half-assed, "registry"-dependent VCS into the mix.
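
        A minimal sketch of that workflow with Git (URL, names, and version are all hypothetical):

        ```
        # download a pinned release and "admit" it to the depository
        curl -LO https://example.org/libfoo-1.2.3.tar.gz
        tar -xzf libfoo-1.2.3.tar.gz -C third-party/
        git add third-party/libfoo-1.2.3
        git commit -m "Vendor libfoo 1.2.3"
        ```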

        • morshu9001 15 minutes ago

          Does the lockfile not solve this?

    • kej 3 hours ago

      Now you have the opposite problem, where a vulnerability could be found in one of your dependencies but you don't get the fix until the next "normal time that you verify that your program actually needs the update".

      • edoceo 2 hours ago

        If a security issue is found, that creates the "normal time".

        That is, when a security issue is found, regardless of supply chain tooling one would update.

        That there is a little cache/mirror thing in the middle is of little consequence in that case.

        And for all other cases the blessed versions in your mirror are better even if not latest.

    • zahlman 24 minutes ago

      So, vendoring?

    • lenkite 3 hours ago

      Well, in the Java world, Maven has had custom repositories that have done this for the last 20+ years.

    • anthk 3 hours ago

      You are describing BSD ports from the 90's. FreeBSD ports date back to 1993.

    • edoceo 3 hours ago

      That is exactly what I do.

  • throwaway81523 2 hours ago

    "It's always NPM."

    • LtWorf 19 minutes ago

      Eh, sometimes it's PyPI

  • ghusto 3 hours ago

    When people ask me what's so wrong with lowering the bar of entry for engineering, I point to things like this.