
Having worked with C/C++ projects, I can say managing dependencies is downright painful. About the best option we found was to treat everything as build from source and then git submodule our dependencies in. This still isn't great, but at least it gave us a consistent environment.
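A rough sketch of that setup, with a placeholder URL, path and tag:

    git submodule add https://example.com/dependencyA.git third_party/dependencyA
    git -C third_party/dependencyA checkout v1.2.3   # pin a known-good tag
    git commit -am "Vendor dependencyA at v1.2.3"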



> Managing dependencies is downright painful.

The risk when it is too easy is that you suddenly pull half of the Internet into your project. In C++, the cost of having a dependency makes me choose carefully whether I want a dependency at all, and if I do, whether I want this particular one (Who develops it? Is it maintained? If it stops being maintained, can I easily move to an alternative, or could I take over maintenance? ...).

> About the best option we found was to treat everything as build from source and then git submodule our dependencies in.

If you mean having a git submodule and doing something like `add_subdirectory(dependencyA)` in CMake, please don't do that. Building from source is fine, but you can totally install the dependency locally and have your build system (be it CMake, Meson, ...) find it. I started doing it the way described here [1], and it works well for me.

[1]: https://www.acarg.ch/posts/cmake-deps/
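A minimal sketch of the approach described in [1], assuming the dependency itself builds with CMake (all paths here are placeholders):

    # Build and install the dependency into a local prefix, once:
    cmake -S deps/dependencyA -B /tmp/depA-build -DCMAKE_INSTALL_PREFIX="$HOME/.local"
    cmake --build /tmp/depA-build --target install

    # Then your project locates it via find_package() instead of add_subdirectory():
    cmake -S . -B build -DCMAKE_PREFIX_PATH="$HOME/.local"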


The requirement to "install dependencies locally" is part of the pain with C++ dependency management. Not having to do it makes builds much easier to define, as you don't need to support lots of distros and hope that every one of them is up to date.


I don't know, I feel like it is a bit of a tradeoff.

You can totally say "here is my code, it's open source, and the build system will look for those dependencies on the system" and let the users (who are developers) handle the dependencies the way they want. This way the user gets to choose if they trust the dependencies or not. Ideally the distro ships with those dependencies, and the user can just install them (better: write a package that depends on those dependencies and install that).

It seems like developers don't really know how to do that anymore, and language package managers allow them to not learn about it. But at the cost of control: they don't know anymore what dependencies they are pulling, or even how many they are pulling.

The modern way seems to be mostly about productivity, and therefore devs don't want to learn anything about dependencies, they want everything to "just work". But that is arguably bad for security.


I remember doing that whenever I wanted to build a C program on GitHub from source. “Oh cool, so I’m going to need these 8 dependencies. Let’s see which ones are supported on my distribution. Ok, I have 6 of them but 2 are weird versions from many years ago. No way that will cause problems! The other 2 I can install from source - oh, but the latest version of this dependency in git needs something else I don’t have…” and there goes my whole evening.

Is my computer more secure for it? Am I more in control? Absolutely not. I have weird third-party programs installed in weird places all over my system. I have a computer that is almost impossible to rebuild if my hard drive dies. And finally, I can make fragile builds with bugs that nobody else can reproduce because of the idiosyncratic combination of package versions I’m using.

This is so unbelievably worse than cargo or npm. With cargo I can take any package on GitHub and reasonably install and run it on any computer I own (running any OS) with one command. The only time this breaks down is when part of the dependency tree is C code. It’s miraculous.

Give me that experience in apt and I’m there. But the experience apt currently provides is unbelievably bad for developers compared to cargo/npm/etc. I can only assume nobody who works on Debian has seriously tried npm or cargo.


> oh, but the latest version of this dependency in git needs something else I don’t have…” and there goes my whole evening.

Why not start by building the version of that dependency supported by the project you want to build? It's not like you ask cargo to take the latest versions of all the dependencies, is it?
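(That is the point of the lockfile: Cargo.lock records the exact versions the project was last built and tested with, and you can refuse any silent re-resolution:)

    cargo build --locked   # use exactly what Cargo.lock pins, or fail loudly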

> This is so unbelievably worse than cargo or npm.

I agree that in many cases it is less convenient. But in your example, 6 out of the 8 dependencies come from distro packages, which I think is better. For the last two, you could actually contribute community packages.

Have you ever tried Arch Linux, for instance? There is the AUR (Arch User Repository), where the community can contribute packages. I don't remember ever having had to compile a dependency from source, because everything I need is either in the official repo or on the AUR. It is explicit when I need to get something from the AUR, so I know that I may want to pay more attention to what I am pulling. And usually I do, and have a quick look at the package recipe (and its dependencies). In this case I would have to check 2 of them, and not 8, which is a big win.

That is of course less convenient, but I believe it is more secure. I see it as a tradeoff.


> Why not start by building the version of that dependency supported by the project you want to build?

Because I don’t know which version that is half the time. Does the readme say to use version 2.2 or 2.0.1? Are they meaningfully different? Maybe 2.2 is needed to be compatible with something else on my system. (Nvidia has entered the chat). I have no idea what is in the changelog of some random dependency that I’ve never heard of before - and I just remembered, I don’t care. I never care.

> For the last two, you could actually contribute community packages. Have you ever tried Arch Linux, for instance?

That sounds like an even more effective way to waste an entire evening. Maybe multiple evenings! When I see a cool program someone’s written that I want to play around with, what better use of my time could there possibly be than figuring out how to submit community-made packages to Arch Linux for the random dependencies of some random software someone linked me? All this before I’ve even built the program I want to try out? No thanks.

And how is it more secure? Do programs magically get security audits as part of their addition to Arch Linux community packages?

Your comment reminds me of that infamous response in Dropbox’s “Show HN” thread. “Well, the problem Dropbox solves sounds like something easily done with rsync, which is already part of Arch Linux. Have you tried Arch Linux? Instead of starting a billion-dollar company, you could submit a simple bash script as a community contribution to Arch. Using a bash script in Arch Linux is of course less convenient. But I believe it is a trade-off.”

And to answer the unspoken question: no. Arch Linux doesn’t have the tens of thousands of up-to-date packages that are already in cargo and already available on every operating system under the sun. Manually adding them to Arch sounds like a pointless task that would only serve to make updating my dependencies more difficult and make my software less compatible on all the other systems it already, effortlessly, works on (like Debian, FreeBSD, macOS and Windows).


> Does the readme say to use version 2.2 or 2.0.1? Are they meaningfully different?

If the library follows semantic versioning, then 2.2.0 should work for a program that requires 2.0.1, while the reverse may not (if the program uses a feature that was added after 2.0.1).
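In cargo terms, that is exactly what the default requirement syntax encodes (somelib is a placeholder name):

    # A bare requirement like "2.0.1" means ^2.0.1: any 2.x >= 2.0.1
    # (so 2.2.0 qualifies), but never 3.0. A program that needs 2.2
    # features would declare "2.2" instead.
    cargo add somelib@2.0.1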

> and I just remembered, I don’t care. I never care.

Yeah, I think it is a big reason why language package managers are so popular: most don't care.

> When I see a cool program someone’s written that I want to play around with, what better use of my time could there possibly be than figuring out how to submit community

First, those packages were probably already contributed by someone else. Someone who cared.

Then... for the rare packages that may not already be there, I would hope that you could consider spending a couple of hours contributing something back to the community you are most likely using for free and complaining about. For most libraries it should not even take two hours, except maybe the very first time you do it.
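For scale, the skeleton of an AUR recipe for a hypothetical CMake-based library looks roughly like this (every field is a placeholder):

    pkgname=somelib
    pkgver=2.2.0
    pkgrel=1
    pkgdesc="A placeholder library"
    arch=('x86_64')
    url="https://example.com/somelib"
    license=('MIT')
    source=("$url/archive/v$pkgver.tar.gz")
    sha256sums=('SKIP')  # a real package pins the checksum

    build() {
      cmake -S "$pkgname-$pkgver" -B build -DCMAKE_INSTALL_PREFIX=/usr
      cmake --build build
    }

    package() {
      DESTDIR="$pkgdir" cmake --install build
    }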

> And how is it more secure? Do programs magically get security audits as part of their addition to Arch Linux community packages?

As I said above, the ones that are shipped by the Arch official repo seem more secure. For the community ones, it's less clear, but I would argue that AUR users are at least not less likely to review a package than the cargo users are to review a transitive dependency.

> Using a bash script in Arch Linux is of course less convenient. But I believe it is a trade-off.”

Well, if you want to use this as an argument, you should say why it is a tradeoff. Are you saying that rsync is more secure than Dropbox, but less convenient?


> Then... for the rare packages that may not already be there, I would hope that you could consider spending a couple of hours contributing something back to the community you are most likely using for free and complaining about.

Is me volunteering my time and expertise to write and publish open source code not virtuous enough? “If you really loved open source, you’d also do this other task constantly, which doesn’t even take 2 hours each time”.

Why does Arch even need Rust packages to be added by hand? I’ve already programmatically expressed the contents and dependencies of my Rust package. And so have all my dependencies. Why not just mirror that programmatically if you want my dependency tree in Arch?

> the ones that are shipped by the Arch official repo seem more secure. For the community ones, it's less clear, but I would argue that AUR users are at least not less likely to review a package than the cargo users are to review a transitive dependency.

It seems more secure? You’re making security assessments based on vibes?

I can’t speak for others, but I’m personally much more likely to review my dependencies in Rust or npm than in Debian or whatever, because the source code is available and linked from the cargo package page. And I can ctrl+click into my dependencies (or debug into them) and read their code. And with npm they all live in node_modules, source and all; I dive in there all the time. Does Arch do that too? I have no idea where to look for the source code of a package I installed using apt. I’d probably need to Google the package and hope apt hasn’t patched it in any significant ways that make the version on GitHub out of date.


Just to be clear: I mentioned Arch as an example of a distro that has a very active community repo (the AUR).

> It seems more secure? You’re making security assessments based on vibes?

I shared an opinion, I am not publishing a security paper. You also say a ton of apparently uninformed stuff that expresses your "feeling", like "with Arch I would constantly have to contribute packages myself" (Did you try it? Do you have statistics or proof?). You are entitled to disagree with my opinion, just as I am entitled to have one.

I think that we won't agree here, and for the record, I was not saying that language package managers are fundamentally bad. I was merely noting that I see pros and cons to both approaches. I tend to defend the "distro maintainers" side more often because, in my experience, most developers don't know about it.

> I have no idea where to look for the source code of a package I installed using apt. I’d probably need to Google the package and hope apt hasn’t patched it in any significant ways that make the version on GitHub out of date.

Exactly my point. You don't like the distro way because you don't know anything about it. Not saying it is not a valid point: you are entitled to your opinion.
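As it happens, getting the exact source your distro shipped, patches included, is one command on Debian-likes (assuming deb-src entries are enabled in your sources list):

    apt source <package>   # download and unpack the patched source tree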

My opinion is that there are pros to distro package managers, like the fact that maintainers put together a coherent set of packages that they ship as a distribution (that is the whole point of a distribution).


I agree that we probably won’t agree on this.

Re: security, specifics matter. AUR “feeling” more secure than cargo doesn’t mean it is. If the principal reason to use it is security, tell that story. How is it better?

> You don't like the distro way because you don't know anything about it.

I’ve been using Linux since a version of Slackware I installed off floppy disks. I remember trying apt for the first time and thinking it was an absolute revolution. The best thing since sliced bread. I haven’t tried Arch, but for what it’s worth, Gentoo gets the source thing right: you can see the source of anything on your system with one command, and better yet, it all builds. There’s no stupid -dev version of packages like there is in apt (that drives me nuts). I’ve still never submitted a package to any package manager, and maybe you’re right. Maybe I should.

I’m upset by all of this because I want apt and Arch and all the rest to be better. But as far as I can tell, apt hasn’t improved in decades. And now docker (which I hate even more) has come along and solved a bunch of apt’s problems for it by making something even uglier on top. What I want is a package manager with these features:

- Reproducible, hermetic environments (nix)

- Local dependency installation (nix, docker, cargo, npm)

- Cross-platform packages (docker, cargo, npm. Nix is trying but not there yet)

- Cross language packages (nix, apt, docker)

- Feature flags (cargo)

- Support for the same package to be depended on at multiple different versions in the dependency tree (cargo, npm)

- Semver API compatibility checks (nobody does this, but Cargo is working on it.)

- Simple, cross platform publishing. None of this “here’s how you install it on a few Linux flavours we got working manually”. I don’t want to have to play a guessing game of which distributions package which dependencies and what they call them. (Cargo and docker do a good job of this)

- Support for binaries and libraries (apt, cargo, npm, nix. Basically everything but docker.)

- I can just take a random project on GitHub and run it reliably without spending hours pissfarting around first. (Cargo, npm. Apt gets negative points because software packaged with apt breaks so often on newer Ubuntu/Debian distributions.)

I can’t speak for Arch, but apt doesn’t cut the mustard any more. I wish it did. But it doesn’t. And no amount of “just work around it by manually submitting more packages to this one distribution-specific package manager” will satisfy me. I change computers and Linux distributions all the time. I want something that works reliably everywhere. The other day I ran some Rust code I wrote on FreeBSD. There are about 100 transitive dependencies, and I bet almost none of them test on FreeBSD. But it doesn’t matter. Once I had Rust installed, I checked out the project and ran cargo test. Everything built and worked perfectly. That is how good life is with cargo. Or docker or npm. Or even Python once you get your head around conda or venv.
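(The venv incantation, for reference, with <package> as a placeholder:)

    python3 -m venv .venv        # per-project, isolated environment
    . .venv/bin/activate
    pip install <package>        # installs into .venv, not into the system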

It’s 2023 and that’s table stakes for a package manager. Distribution-specific package managers need to keep up because they don’t pass muster any more.


> Re: security, specifics matter. AUR “feeling” more secure than cargo doesn’t mean it is.

I specifically mentioned the official ones, not the AUR. The story is that the maintainers of a distro check (on a best-effort basis) the packages they ship, and the distros that have a security team patch them when CVEs are announced. Not sure how that would not seem more secure than e.g. pip, where I don't think there is any kind of check (before or after the package has been published).

> Or even Python once you get your head around conda or venv.

For what it's worth, I have this third-party project that used to work and now always fails to install because of some dependency issue in pip. It has been broken for me for years now; I always need to go fiddle with it. `pip install <package>` just doesn't work (it installs something that then complains about a missing dependency, and I can't install that dependency because of some incompatibility). I have never had an issue with Arch or Alpine.

I am not a huge fan of writing apt recipes, I like the pacman/apk ones a lot more.

> It’s 2023 and that’s table stakes for a package manager. Distribution-specific package managers need to keep up because they don’t pass muster any more.

I do agree with you that there are many issues and I would like it to work better. But one thing I would like to see is devs learning how the tools work and trying to "do things properly". Too many people throw their project in a docker container just because they have no clue how to use their build system correctly, or how to install a dependency properly.

I see a lot of "I use this because it just works and I don't need to understand how", and it feels like this brought us stuff like docker-for-absolutely-everything and ElectronJS. Nix seems nice, but if people can't even be bothered to learn how to use SemVer properly, I don't see how they would ever get the motivation to even consider Nix.


My experience of using the AUR is that a decent fraction of the packages I try to install fail to build, which is certainly pretty secure.


Yeah, I guess YMMV; I personally never had issues. Maybe once in a while a package has an issue, and there are already comments on the AUR website, so I don't even need to debug it myself. Maybe I had to debug it once in ten years or something.


> There is the AUR (Arch User Repository), where the community can contribute packages. I don't remember ever having had to compile a dependency from source, because everything I need is either in the official repo or on the AUR

When you install from the AUR you are compiling from source, though. Unless you're using Chaotic-AUR, which only packages a subset of what's available.


Yes, my mistake, and good catch! I meant "having to write the recipe myself".


I am a developer, I know what dependencies I am pulling in my projects, what they do and I even read the code. What good will offloading this to my users do? They have to decide whether they trust my apps with all their dependencies; they can do that by reviewing the code of all deps transitively, but there is not much difference between a dependency from the distro repository and one from crates.io.

> Ideally the distro ships with those dependencies

This is all well and good in a world where there is one single Linux distro, but usually you want to target all mainstream distros, and macOS, and Windows, and whatnot. Depending on system packages becomes brittle in these cases, since who knows which version of the necessary lib is packaged on this ancient Debian installation. If you depend on a newer version, then you are just forcing your users (who are developers) to either spend time packaging it properly or just run `make install` and litter the system with untracked files. Honestly, stuff like `cargo` fixes this elegantly, so I never have to think about it again.


> I am a developer, I know what dependencies I am pulling in my projects, what they do and I even read the code.

I am convinced that you are more the exception than the rule. For node projects that pull in hundreds of packages transitively, I can't believe for one second that the devs even read the list (much less that they would consider reviewing the code).

> but there is not much difference between a dependency from the distro repository and one from crates.io

Who reviews what goes into crates.io? I know for a fact that other package repositories don't have any checks at all: anyone can push anything. Whereas the distro repository is reviewed by the distro maintainers. Big distros even have a security team.

I think that is a noteworthy difference.

> but usually you want to target all mainstream distros, and macOS, and Windows, and whatnot.

Of course, usually you want everything, for free, and for yesterday. But let's be realistic: the vast majority of projects don't have users on all the mainstream distros, macOS and Windows. I would start by maintaining a package for my preferred distro, and maybe one for Ubuntu. But maintaining a package should not just mean "building a .deb": ideally you should use the program on that system, so that you actually test it.

If someone wants it in another distro, they can write and maintain a package for that distro. And again ideally they use it themselves.

I believe that distro maintainers are responsible for distributing software for their distros. And for that they can rely on contributions and community repos, of course.

But projects that target 50 platforms and offer 50 binaries to download even though nobody has ever even installed 48 of them are missing the point, IMO. It is great if cargo allows you to say "builds for 50 different OSes", but if nobody ever tested them, to me it's just marketing.


> Whereas the distro repository is reviewed by the distro maintainers. Big distros even have a security team.

I don't think it globally matters, and it certainly does not scale. Are you saying that Debian devs review all code in their repos? I doubt that, and they certainly let log4j into their repos.

We already have pretty good automated tooling for detecting known vulnerabilities in deps, and updating and rebuilding is not hard. I don't see the added value of Debian managing the build-time deps of my apps.
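In the Rust world, for example, there is the cargo-audit plugin (assuming it is installed):

    cargo install cargo-audit   # one-time setup
    cargo audit                 # check Cargo.lock against the RustSec advisory database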

It should be the devs' responsibility to audit the code they use in their projects (including the toolchain), just as it should be their responsibility to package, maintain and support the final product. I wouldn't want to support copies of my projects which were somehow modified by third-party maintainers before being provided to end users.

> the vast majority of projects don't have users on all the mainstream distros, macOS and Windows

I don't need to go far: if we're talking, for example, about internal corporate tools, then the vast majority of projects have the majority of their users on macOS and Windows. In rare cases some deb-based usage is supported on a best-effort basis, but any other distro? Good luck!

The situation is not much different with some public projects: almost all devs don't go further than "supply a .deb and be done with it", and you're lucky if they support even that.

> It is great if cargo allows you to say "builds for 50 different OSes", but if nobody ever tested them, to me it's just marketing.

Cargo is not an end-user package-management tool; it's a build tool for devs, which is then used for building the deb, yum, or whatever packages you need. It tracks mostly just build-time deps, and for everything else, such as glibc, sqlite, or any other .so, I have no choice other than apt on Debian. This is fine with me.


> Are you saying that Debian devs review all code in their repos?

Well, they certainly do patch a fair number of vulnerabilities faster than I would. Every time I checked because I was concerned about a vulnerability, it had already been patched.

> I don't see the added value of Debian managing the build-time deps of my apps.

I can trust the Debian security team to do a better job than I would. I definitely do not trust arbitrary developers about security. If there is one thing I have learned from the software industry, it is that almost nobody cares about security. Turns out that the people in distro security teams generally do care about security.

> I wouldn't want to support copies of my projects which were somehow modified by third-party maintainers before being provided to end users.

Well, maintainers can do that, and sometimes they do. But of course, if you don't want them to distribute your project, that's your right. Maintainers usually don't run after devs to work for them for free :-).

Also note that some distributions are not the typical general-purpose Ubuntu. Maybe an embedded IoT distribution wants to use your library, and maybe they want to harden it, or something. If you have an open source license, it is totally their right to do whatever they want.


> The risk when it is too easy is that you suddenly pull half of the Internet into your project. In C++, the cost of having a dependency makes me choose carefully whether I want a dependency at all, and if I do, whether I want this particular one (Who develops it? Is it maintained? If it stops being maintained, can I easily move to an alternative, or could I take over maintenance? ...).

There is no such risk, as you're not playing roulette. No one is holding a gun to your head and forcing you to pull in hundreds of dependencies.

You can do exactly the same in Rust as you do in C++ and be conservative in what you use for dependencies. This is what I do, and it works great.

Now, that said, I agree that for the ecosystem as a whole this is a problem, and people do tend to be too trigger-happy when pulling in dependencies.


I understand that you agree with my point, but you disliked my wording? :-)



