Hacker News new | past | comments | ask | show | jobs | submit login

> Why not start by building the version of that dependency supported by the project you want to build?

Because I don’t know which version that is half the time. Does the readme say to use version 2.2 or 2.0.1? Are they meaningfully different? Maybe 2.2 is needed to be compatible with something else on my system. (Nvidia has entered the chat). I have no idea what is in the changelog of some random dependency that I’ve never heard of before - and I just remembered, I don’t care. I never care.

> For the last two, you could actually contribute community packages. Have you ever tried Arch Linux, for instance?

That sounds like an even more effective way to waste an entire evening. Maybe multiple evenings! When I see a cool program someone’s written that I want to play around with, what better use of my time could there possibly be than figuring out how to submit community-made packages to Arch Linux for the random dependencies of some random software someone linked me? All this before I’ve even built the program I want to try out? No thanks.

And how is it more secure? Do programs magically get security audits as part of their addition to the Arch Linux community packages?

Your comment reminds me of that infamous response in Dropbox’s “Show HN” thread. “Well, the problem Dropbox solves sounds like something easily done with rsync, which is already part of Arch Linux. Have you tried Arch Linux? Instead of starting a billion-dollar company, you could submit a simple bash script as a community contribution to Arch. Using a bash script in Arch Linux is of course less convenient. But I believe it is a trade-off.”

And to answer the unspoken question, no. Arch Linux doesn’t have the tens of thousands of up-to-date packages that are already in cargo, and that are already available on every operating system under the sun. Manually adding them to Arch sounds like a pointless task that would only serve to make updating my dependencies more difficult and make my software less compatible with all the other systems it already, effortlessly, works on (like Debian, FreeBSD, macOS and Windows).




> Does the readme say to use version 2.2 or 2.0.1? Are they meaningfully different?

If the library follows semver correctly, then 2.2.0 should work wherever 2.0.1 is required, but the reverse may not (if the program you want uses a feature that was added after 2.0.1).
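In cargo terms, a bare version number is a caret requirement, so this resolution rule can be read straight off a Cargo.toml fragment (the crate name below is made up for illustration):

```toml
[dependencies]
# "2.0.1" is shorthand for "^2.0.1": any version >= 2.0.1 and < 3.0.0
# satisfies it. So a project that asks for 2.0.1 will happily build against
# 2.2.0, but a project that needs an API added in 2.2 won't build with 2.0.1.
somelib = "2.0.1"
```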

> and I just remembered, I don’t care. I never care.

Yeah, I think that is a big reason why language package managers are so popular: most people don't care.

> When I see a cool program someone’s written that I want to play around with, what better use of my time could there possibly be than figuring out how to submit community

First, probably those packages were already contributed by someone else. Someone who cared.

Then... for the rare packages that may not already be there, I would hope that you could consider spending a couple of hours contributing something back to the community you are most likely using for free and complaining about. For most libraries it should not even take two hours, except maybe the very first time you do it.

> And how is it more secure? Do programs magically get security audits as part of their addition to arch Linux community packages?

As I said above, the ones that are shipped by the Arch official repo seem more secure. For the community ones, it's less clear, but I would argue that AUR users are at least not less likely to review a package than the cargo users are to review a transitive dependency.

> Using a bash script in arch Linux is of course less convenient. But I believe it is a trade off.”

Well, if you want to use this as an argument, you should explain why it is a trade-off. Are you saying that rsync is more secure than Dropbox, but less convenient?


> Then... for the rare packages that may not already be there, I would hope that you could consider spending a couple hours contributing something back to the community you are most likely using for free and complaining about.

Is me volunteering my time and expertise to write and publish open source code not virtuous enough? “If you really loved open source, you’d also do this other task constantly, which doesn’t even take 2 hours each time”.

Why does Arch even need Rust packages to be added by hand? I’ve already programmatically expressed the contents and dependencies of my Rust package, and so have all my dependencies. Why not just mirror that programmatically if you want my dependency tree in Arch?
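To make the point concrete: the dependency graph really is already machine-readable. `cargo metadata --format-version 1` emits JSON roughly like the trimmed, hand-written sample below (crate names are just examples), and a distro-side bot could in principle walk that output instead of a human re-declaring every crate by hand:

```python
# A hedged sketch: parse a sample shaped like `cargo metadata` output and
# list the direct dependencies a packaging bot would need to mirror.
import json

sample = """
{"packages": [
  {"name": "myapp", "version": "0.1.0",
   "dependencies": [{"name": "serde", "req": "^1.0"},
                    {"name": "rand",  "req": "^0.8"}]}
]}
"""

meta = json.loads(sample)
for pkg in meta["packages"]:
    for dep in pkg["dependencies"]:
        print(f"{pkg['name']} -> {dep['name']} {dep['req']}")
```

The real output has far more fields (source, checksum, resolved versions), which is exactly the information a distro package would otherwise duplicate by hand.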

> the ones that are shipped by the Arch official repo seem more secure. For the community ones, it's less clear, but I would argue that AUR users are at least not less likely to review a package than the cargo users are to review a transitive dependency.

It seems more secure? You’re making security assessments based on vibes?

I can’t speak for others, but I’m personally much more likely to review my dependencies in Rust or npm than in Debian or whatever, because the source code is available and linked from the cargo package page. And I can control+click into my dependencies (or debug into them) and read their code. And with npm they all live in node_modules, source and all. I dive in there all the time. Does Arch do that too? I have no idea where to look for the source code of a package I installed using apt. I’d probably need to Google the package and hope apt hasn’t patched it in any significant ways that make the version on GitHub out of date.


Just to be clear: I mentioned arch as an example of a distro that has a very active community repo (AUR).

> It seems more secure? You’re making security assessments based on vibes?

I shared an opinion, I am not publishing a security paper. You also say a ton of apparently uninformed stuff that expresses your "feeling". Like "with Arch I would constantly have to contribute packages myself" (Did you try it? Do you have statistics and proofs?). You are entitled to disagree with my opinion, just as I am entitled to have one.

I think that we won't agree here, and for the record I was not saying that language package managers were fundamentally bad. I was merely noting that I see pros and cons on both approaches. I tend to defend the "distro maintainers" side more often, because in my experience, most developers don't know about it.

> I have no idea where to look for the source code of a package I installed using apt. I’d probably need to Google the package and hope apt hasn’t patched it in any significant ways that make the version on GitHub out of date.

Exactly my point. You don't like the distro way because you don't know anything about it. Not saying it is not a valid point: you are entitled to your opinion.

My opinion is that there are pros with distro package managers, like the fact that maintainers put together a set of packages that they ship as a distribution (that is the whole point of a distribution).


I agree that we probably won’t agree on this.

Re: security, specifics matter. The AUR “feeling” more secure than cargo doesn’t mean it is. If the principal reason to use it is security, tell that story. How is it better?

> You don't like the distro way because you don't know anything about it.

I’ve been using Linux since a version of Slackware I installed off floppy disks. I remember trying apt for the first time and thinking it was an absolute revolution. The best thing since sliced bread. I haven’t tried Arch, but for what it’s worth, Gentoo gets the source thing right. You can see the source of anything on your system with one command. And better yet, it all builds. There are no stupid -dev versions of packages like there are in apt (that drives me nuts). I’ve still never submitted a package to any package manager, and maybe you’re right. Maybe I should.

I’m upset by all of this because I want apt and arch and all the rest to be better. But as far as I can tell, apt hasn’t improved in decades. And now docker (which I hate even more) has come along and solved a bunch of apt’s problems for it by making something even uglier on top. What I want is a package manager with these features:

- Reproducible, hermetic environments (nix)

- Local dependency installation (nix, docker, cargo, npm)

- Cross-platform packages (docker, cargo, npm. Nix is trying but not there yet)

- Cross language packages (nix, apt, docker)

- Feature flags (cargo)

- Support for the same package to be depended on at multiple different versions in the dependency tree (cargo, npm)

- Semver API compatibility checks (nobody does this, but Cargo is working on it.)

- Simple, cross platform publishing. None of this “here’s how you install it on a few Linux flavours we got working manually”. I don’t want to have to play a guessing game of which distributions package which dependencies and what they call them. (Cargo and docker do a good job of this)

- Support for binaries and libraries (apt, cargo, npm, nix. Basically everything but docker.)

- I can just take a random project on github and run it reliably without spending hours pissfarting around first. (Cargo, npm. Apt gets negative points because software packaged with apt breaks so often on newer Ubuntu / Debian distributions.)
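Of that list, the feature-flags item is maybe the least familiar to non-cargo users; a minimal sketch of what it looks like (crate and feature names here are hypothetical):

```toml
[dependencies]
# an optional dependency that is only compiled in when a feature asks for it
serde_json = { version = "1.0", optional = true }

[features]
default = []                 # lean build by default
json = ["dep:serde_json"]    # `cargo build --features json` pulls it in
```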

I can’t speak for Arch, but apt doesn’t cut the mustard any more. I wish it did. But it doesn’t. And no amount of “just work around it by manually submitting more packages to this one distribution-specific package manager” will satisfy me. I change computers and Linux distributions all the time. I want something that works reliably everywhere. The other day I ran some Rust code I wrote on FreeBSD. There are about 100 transitive dependencies, and I bet almost none of them test on FreeBSD. But it doesn’t matter. Once I had Rust installed, I checked out the project and ran cargo test. Everything built and worked perfectly. That is how good life is with cargo. Or docker or npm. Or even Python once you get your head around conda or venv.

It’s 2023 and that’s table stakes for a package manager. Distribution specific package managers need to keep up because they don’t pass muster any more.


> Re: security, specifics matter. AUR “feeling” more secure than cargo doesn’t mean it is.

I specifically mentioned the official ones, not the AUR. The story is that the maintainers of a distro check (on a best-effort basis) the packages they ship, and the distros that have a security team patch them when CVEs are announced. Not sure how that would not seem more secure than e.g. pip, where I don't think there is any kind of check (before or after the package has been published).

> Or even Python once you get your head around conda or venv.

For what it's worth, I have this third-party project that used to work and now always fails to install because of some dependency issues in pip. For me it's been broken for years now, I always need to go fiddle with it, `pip install <package>` just doesn't work (it installs something that then complains about a dependency missing, and I can't install the dependency because of some incompatibility reason). I have never had an issue with Arch or Alpine.
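For anyone stuck in the same pip hole, the usual escape hatch is to give the troublesome project its own environment so pip resolves its pins in isolation. A minimal sketch using only the standard library (the directory name is arbitrary, and `with_pip=False` just keeps the example fast; normally you would keep pip):

```python
# Create an isolated environment and confirm the interpreter inside it
# really uses the venv's own prefix, not the system one.
import subprocess
import venv

venv.create(".venv-demo", with_pip=False)   # build the isolated environment
result = subprocess.run(
    [".venv-demo/bin/python", "-c", "import sys; print(sys.prefix)"],
    capture_output=True, text=True, check=True,
)
print(result.stdout.strip())   # a path ending in .venv-demo
```

With pip enabled, installing the broken project inside that environment at least confines the dependency fight to one directory you can delete and recreate.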

I am not a huge fan of writing apt recipes, I like the pacman/apk ones a lot more.

> It’s 2023 and that’s table stakes for a package manager. Distribution specific package managers need to keep up because they don’t pass muster any more.

I do agree with you that there are many issues and I would like it to work better. But one thing I would like to see is devs learning how the tools work and trying to "do things properly". Too many people throw their project in a docker container just because they have no clue how to use their build system correctly, or how to install a dependency properly.

I see a lot of "I use this because it just works and I don't need to understand how", and it feels like this brought us stuff like docker-for-absolutely-everything and ElectronJS. Nix seems nice, but if people can't even be bothered to learn how to use SemVer properly, I don't see how they would ever get the motivation to even consider Nix.



