Every time I see a client importing unsigned code with no evidence anyone they trust has reviewed it, I flag it as a supply chain attack vector in their audit and recommend mitigations.
Some roll their eyes, but I will continue to insist it is a serious issue almost every company has, particularly since I have exploited it multiple times to prove a point by buying a lapsed domain name that mirrors JS many companies import ;)
If projects are importing tens or hundreds of third-party libs without any kind of validation or review, the process is fatally flawed.
Whatever the language or repository system, reusing libraries like React, Requests, Apache Commons, or lodash makes sense after reviewing the pros and cons (functionality, security, size, performance, etc.). But blindly adding small repositories to your packages file without understanding the implications only increases the risk of trouble.
Node and npm for some reason seem to have encouraged this - remember left-pad.
A meta-repository that lists versions reviewed by a trusted group of people? It would add latency to bug fixes and limit the number of available libraries, but it would prevent single developers from taking down the ecosystem on a whim.
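On the consuming side, such a meta-repository could be as simple as checking a lockfile against a list of reviewed versions. A minimal sketch (the allowlist shape and function names here are invented for illustration, not any real tool's format):

```typescript
// Hypothetical allowlist: package name -> exact versions a trusted
// group has reviewed. Both input shapes are assumptions.
type Allowlist = Record<string, string[]>;
type Lockfile = Record<string, { version: string }>;

// Return every locked dependency whose exact version nobody has reviewed.
function unreviewed(lock: Lockfile, allow: Allowlist): string[] {
  return Object.entries(lock)
    .filter(([name, { version }]) => !(allow[name] ?? []).includes(version))
    .map(([name, { version }]) => `${name}@${version}`);
}

const allow: Allowlist = { lodash: ["4.17.21"], react: ["17.0.2"] };
const lock: Lockfile = {
  lodash: { version: "4.17.21" },
  colors: { version: "1.4.44-liberty-2" }, // the sabotaged release
};
console.log(unreviewed(lock, allow)); // flags colors, passes lodash
```

The latency complaint falls out of this directly: a legitimate bug-fix release fails the check until someone in the trusted group reviews it.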
This is what `npm audit` and GitHub's Dependabot are both doing (originally in parallel with their own meta-databases, though now that GitHub owns npm the two are a lot more tightly integrated, it sounds like).
Admittedly:
A) Both of these meta-repository tools are reactive rather than proactive: they flag bad versions rather than known good versions.
B) It doesn't take many HN searches to find that people don't trust `npm audit` or Dependabot either, because both have produced plenty of false positives and false negatives over the years.
C) If someone does trust one or both, often the easiest course of action is to automate acceptance of their recommendations and blindly accept them, leaving us about where we started and merely blurring the line between what is repository and what is "meta-repository". (Even the "bot" in Dependabot's name implies this acceptance automation is its natural state, and the bot's primary "interface" is automated pull requests.)
That is more or less what Arch Linux does. There are official repos (core and extra) maintained by Arch Linux developers, an unsupported package collection (the AUR) where anyone can upload a package recipe, and an intermediary between those two, the community repository, which is maintained by trusted users.
Maybe limit the capabilities of software, e.g. dictate what permissions are reasonable. Maybe require certain "standard libs" for things like console output that limit what can be output.
You don't need to dictate a standard set of permissions, you just need to remove a single very common anti-pattern called "rights amplification".
Why is a program able to turn a random string, which conveys no authority, into a file handle that conveys monstrous authority, potentially over an entire operating system, i.e. file_open : string -> File?
That's just crazy if you think about it: a program that only has access to a string can amplify its own permissions into access to your passwords file. This anti-pattern is unfortunately quite pervasive, but it's a library design issue that can be tackled in most existing languages with better object-oriented design: don't use primitive types, use more domain-specific types, and don't expose stdlib functions that let code convert an object conveying few permissions into one conveying more permissions.
This means deeper parts of a program necessarily have fewer permissions, and the top-level/entry point typically has the most permissions. It makes maintenance and auditing easier to boot.
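In a Node context, the contrast could look like this rough sketch (the class and file names are invented for illustration):

```typescript
import * as fs from "node:fs";

// Anti-pattern: any code that can concoct a string can amplify it into
// filesystem authority.
function appendAnywhere(path: string, msg: string): void {
  fs.appendFileSync(path, msg + "\n"); // nothing stops path = "/etc/passwd"
}

// Capability style: the object conveys exactly one permission - appending
// to the one file it was constructed around - and nothing more.
class AppendOnlyLog {
  #fd: number; // private: the raw descriptor never leaks back out
  constructor(fd: number) { this.#fd = fd; }
  append(msg: string): void {
    fs.appendFileSync(this.#fd, msg + "\n");
  }
}

// Only the entry point touches the filesystem namespace; deeper code
// receives the AppendOnlyLog and cannot widen its own access.
const log = new AppendOnlyLog(fs.openSync("app.log", "a"));
log.append("started");
```

A transitive dependency handed the `AppendOnlyLog` can write log lines but cannot reach anything else on disk, which is exactly the "deeper parts have fewer permissions" property described above.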
Pay the maintainers of the libraries you use, and have a contract with them that states their obligation to maintain and support your use of their code.
Because you haven't solved the problem for FLOSS; you've solved the problem for non-FLOSS that might also happen to be FLOSS, i.e. contribute somehow to a FLOSS version of the project - but the solution doesn't help those under the FLOSS licence, and it complicates incentives to contribute to FLOSS/"community" versions.
Great for corporates who can buy the support contract, but it's also suspiciously similar to the "freemium" model, where FLOSS devs are suddenly incentivised to make the FLOSS offering insecure compared to the paid licence.
In this case, Marak is his own bad actor/saboteur, so how would the support contract help? It would be far more likely to make free users second-class users, and as such it might be better to simply keep the products managerially separate because of that conflict of interest.
And let's be honest here - when does something stop being a reasonable "maintenance fee" and start to become rent-seeking/extortion? I think if you want to get paid, you simply don't work under MIT/GPL, or you fork to a different licence; changing your mind halfway through isn't reasonable IMHO.
MIT basically means "anyone working on this codebase agrees to MIT terms for their code, and as such authorship isn't so important". If you change your mind, you broke your agreement. If suddenly your authorship matters, what about every other author who stuck to their MIT agreement?
Again, it's still under a FLOSS license: you can use it for free as in beer, under the terms of the license. If you want further expectations satisfied (i.e. ongoing maintenance of the software), that's what the money pays for. You're not paying for the software, you're paying for services around it. (And the nice thing with open source is that if the original creator isn't available for whatever reason, you can pay somebody else for them, which is a lot harder with not-open software.)
Do you have advice for projects that use Maven? I know every package on Maven Central has a PGP signature, but as far as I know, Maven doesn't verify them.
Not really; individual package developers don't have as much immediate control over the repository's state as they do with npm. Packages go through a review by one of the trusted developers and sometimes automated QA and testing (including, as of late, reproducibility testing, i.e. does the source match the binary?) before being uploaded to the repository.
If you can't trust the team behind the distro, then sure, your supply chain is compromised, but it's significantly less likely for a single package developer to cause any damage, as all the big distros have rather extensive policy and procedures to prevent such things.
I use Gentoo, which uses the Portage package manager, and the way Portage works is it pulls source and then compiles it. The source is rarely checked by anyone. Small packages exist there as well. Many Linux distros simply borrow binaries from "trusted" sources. The entire ecosystem is really a house of cards.
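For what it's worth, Portage does at least verify checksums recorded in each package's Manifest before building. The core of that fetch-time check amounts to something like this (heavily simplified, and not Portage's actual Manifest format):

```typescript
import { createHash } from "node:crypto";

// Compare a downloaded tarball against the digest recorded in the
// repository's Manifest (real Manifests record several hash types).
function sourceMatchesManifest(data: Buffer, expectedSha512: string): boolean {
  return createHash("sha512").update(data).digest("hex") === expectedSha512;
}

const tarball = Buffer.from("pretend this is a source tarball");
const recorded = createHash("sha512").update(tarball).digest("hex");
console.log(sourceMatchesManifest(tarball, recorded));        // true
console.log(sourceMatchesManifest(tarball, recorded + "00")); // false
```

Of course, a matching hash only proves you got the same bytes the packager saw; it says nothing about whether anyone actually read them, which is the point about source rarely being checked.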
This is a false equivalence brought up every time anyone mentions how vulnerable the npm/gems/pip ecosystems are to supply chain attacks.
Linux code is always reviewed before deployment; it goes through many eyeballs, and people are careful about this. The same is not true of npm or any of the other services (as this event clearly shows).
Are the parts of the Linux stack equivalent to colors and faker really carefully audited and reviewed by multiple people? That sounds to me like elevating them to "important bits" in a false equivalence of its own.
When it comes to security (among other things), one simply cannot say that all the important bits are in the kernel. If that were the case, there would not be an issue to discuss here.