26 comments

  • sebmellen 44 minutes ago

    "Here, install my new 1-day old NPM package that doesn't let you install packages younger than 90 days."

    Pardon me, I couldn’t help myself :D

  • tkzed49 2 hours ago

    Not controlling transitive deps makes this vastly less useful, because direct deps can specify version ranges (e.g. the latest minor version). Personally I'd stick with pnpm's feature.

    • zelphirkalt 12 minutes ago

      This is why one should pin all direct and transitive dependencies with their checksums and not upgrade willy-nilly. There is no need to specify exact version numbers for transitive dependencies if a lock file already pins their exact versions and checksums. Make upgrading dependencies a conscious choice, and perhaps adopt a policy of upgrading at most every X days.
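
      For illustration, a minimal sketch of that discipline with stock npm (save-exact and npm ci are standard npm features, and package-lock.json already records an integrity checksum for every resolved package):

          # .npmrc: record exact versions instead of ^ranges when adding deps
          save-exact=true

          # Install strictly from package-lock.json, which pins exact versions
          # and sha512 checksums for all transitive deps; fails instead of
          # re-resolving if the lockfile is out of sync with package.json.
          npm ci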

  • zelphirkalt 15 minutes ago

    This only works if some other people use a dependency "too early", fall victim to an exploit, and notice it within those 90 days. Imagine everyone only used packages older than 90 days: we would have no frontrunner to run into the issues before us.

    A cooldown time alone is not a sufficient solution. What people really need to stop doing is leaving versions and checksums unpinned and installing whatever newer version happens to be available; that would remain a problem even if the date line were moved 90 days into the future for all packages. If, however, one only updates dependencies as a conscious choice, there are far fewer points in time at which versions change, and therefore a much lower chance of catching something malicious. Combine that with a cooldown time/minimum age for versions, and you have an approach.
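
    For a concrete example of that combination: recent pnpm versions (10.16+) ship a cooldown setting, minimumReleaseAge, which also applies to transitive dependencies; the value below is just this thread's 90 days converted to minutes:

        # pnpm-workspace.yaml
        # Refuse to resolve any version published less than 90 days ago.
        minimumReleaseAge: 129600   # minutes (90 * 24 * 60)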

  • moritzwarhier 2 hours ago

    As someothherguyy already mentioned, this is a default feature in pnpm.

    And as far as cat-and-mouse games go in other package managers, I'd say that pinning dependencies and disabling postinstall scripts is a much better option. Sure, not a foolproof one either, but as good as it gets.

    edit: misspelled someothherguyy's user name
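
    A minimal sketch of that combination in plain npm; both lines are standard .npmrc settings:

        # .npmrc
        ignore-scripts=true   # never run packages' preinstall/install/postinstall hooks
        save-exact=true       # pin exact versions rather than ^ranges

    The trade-off: packages that genuinely need a build step at install time will no longer get one automatically, which is part of the cat-and-mouse game described above.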

    • 2muchcoffeeman 22 minutes ago

      Why is the community persisting with such poor solutions?

      • moritzwarhier 10 minutes ago

        What would be a better solution? Do other package managers reliably restrict access to the host system beyond the scope of the project folder?

        Many quirks come from abilities that were once deemed useful, such as compiling code in other languages after package install.

        But even restricting file-system access to the project's root folder would leave many doors open, with or without foreign languages: Node is designed as a general-purpose JS runtime, including server-side and build-time usage.

        The USP of Node was initially to provide an API that, unlike the web platform, is not sandboxed.

        This not only allows server-side usage, but is also essential to many early dev scenarios. Back in the day, it might have been SCSS builds using node-gyp (wouldn't recommend). Today it's things like the Go-based TypeScript compiler or SSGs.
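
        To make that concrete, here is a hypothetical postinstall hook; everything in it is made up for illustration, but nothing in Node or npm scopes it to the project folder:

            // package.json of a malicious package (excerpt):
            //   "scripts": { "postinstall": "node collect.js" }
            // collect.js runs as ordinary Node code with your full user permissions:
            const fs = require("node:fs");
            const os = require("node:os");

            // Free to wander anywhere your user can read, far outside the project:
            console.log(fs.readdirSync(os.homedir()));
            // ...and it could just as well read ~/.npmrc or ~/.ssh and send the
            // contents over the network.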

  • mrconter11 2 hours ago

    But safe-npm is not 90 days old yet... :/

    • jagged-chisel 32 minutes ago

      Consider this a 3-month head start on being able to use it.

  • someothherguyy 2 hours ago

  • arrty88 34 minutes ago

    With the help of AI, I see no reason to install most deps nowadays besides types, React, and the MUI framework. Everything can be built from scratch quickly.

    • zelphirkalt 11 minutes ago

      You still will have to maintain it then though.

  • ttoinou an hour ago

    If everybody does that, won't it take 90 more days to detect problems/hacks in npm packages?

    • lelandbatey an hour ago

      No, because the folks detecting the problems typically do so by actively scanning new releases (usually security companies do this). Few such problems are detected by people who do a "normal" update, receive compromised code, investigate, and then report the problem. It does happen, but it's not the usual way these supply-chain attacks are discovered, especially not the really big ones.
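
      A sketch of the kind of scanning meant here, assuming Node 18+ for global fetch; the registry's per-version time map is public metadata, while the 90-day threshold and the package name are made up:

          // Flag versions of a package published within the last 90 days.
          const COOLDOWN_MS = 90 * 24 * 60 * 60 * 1000;

          async function freshVersions(pkg: string): Promise<string[]> {
            const res = await fetch(`https://registry.npmjs.org/${pkg}`);
            // `time` maps "created", "modified", and each version to an ISO date.
            const { time } = (await res.json()) as { time: Record<string, string> };
            return Object.entries(time)
              .filter(([version]) => version !== "created" && version !== "modified")
              .filter(([, date]) => Date.now() - Date.parse(date) < COOLDOWN_MS)
              .map(([version]) => version);
          }

          freshVersions("left-pad").then((v) => console.log("fresh versions:", v));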

      • cmckn 16 minutes ago

        This feels like a game of hot potato, and everyone is blindfolded. There’s no way to ensure that an unspecified security company has audited a release (for free). Delayed dependency updates could make sense in a broader strategy, but on its own I think it’s mostly wishful thinking.

  • pr0xyb0i 2 hours ago

    • silverwind 2 hours ago

      Seems like a worse version of `before`, because `before` also handles indirect dependencies, while this module does not seem to.

  • codezero an hour ago

    Does anyone have any statistics on how long a compromised package has been in the wild on average?

  • asdkkthrowaway 2 hours ago

    Doesn't this just mean you're 90 days late on any patches?

    • moritzwarhier 2 hours ago

      Auto-updating is bad.

      Scheduled, audited updates are good.

      Installing random npm packages as suggested here is also bad. Especially with "--global", although I'm not sure that makes any difference, because Node can of course access all of your file system by default.

    • beepbooptheory 2 hours ago

      This article, which discusses the idea behind this, was on the front page recently:

      https://blog.yossarian.net/2025/11/21/We-should-all-be-using...

      Most of the time, you need quick patches because of fairly recent dependency changes, so if you just wait and kind of "debounce" your dependency updates, you can cover a lot of supply-chain vulnerabilities etc.

      • ntonozzi 2 hours ago

        It's not debouncing, it's delaying. Ideally you can still update a specific dependency to a more up to date version if it turns out an old version has a vulnerability.

  • robkop 2 hours ago

    You could dual-brand it as vibe-npm: only install packages that are in your model's training dataset.

  • cheesekunator an hour ago

    Why does elapsed time mean a library is safe? This is so ridiculous. It doesn't protect you against anything. I'm sure there are thousands of old libraries out there with hidden vulnerabilities or malicious code.

    • Waterluvian an hour ago

      Literally nothing can mean a “library is safe.”

      The idea of “safe” in terms of risk and security has misled a lot of people into this wrong idea that there’s a binary state of safe and unsafe.

      It’s all about risk management. You want to reduce risk as inexpensively as possible. One of many inexpensive approaches is “don’t install dependencies that are new.” Along with “don’t install dependencies that nobody else uses.” You might also apply the rule, “don’t install dependencies that aren’t shipped with the OS.” Or “don’t use dependencies that haven’t been formally proven.” Etc.

      Indeed, calling it “Safe-NPM” can be misleading. As if using it achieves some binary state of safety.

    • femiagbabiaka an hour ago

      Most supply chain attacks have a very limited window in which they’re exploitable. This is not a panacea, but it is a good idea.