
How should Flatpak change that? These two (DEB and Flatpak) are, at least currently, entirely parallel, each with its own set of advantages and problems. I don't think it would make any sense to migrate a distribution such as Debian to Flatpak entirely.



Sorry, I meant to write AppImage not flatpak.

The point was that a lot of packaging would be done by the devs, allowing maintainers to do more important stuff.


I don't think having developers doing the packaging is an ideal situation. I often hear developers complaining about the multitude of GNU/Linux distros because they think it's somehow their responsibility to provide binaries. It's not. The role of the upstream developer is to make their build system easy to use so that other people (like distro maintainers, but also just "regular" users) can compile from source without feeling like they're pulling teeth. If your software is easy to build, it will naturally flow into the distros when its users want it. I package a lot of software and a lot of software is difficult to build without tons of hacks.

For interested readers, here are some best practices for being a good upstream:

- Just use the GNU autotools. Users expect `./configure && make && make install` to work. Too often people roll their own configure scripts and Makefiles and they always miss something important. Distros expect there to be certain knobs to tweak, and configure scripts and Makefiles generated by the autotools have all of them.

- Don't bundle third-party dependencies. For security (and for better documentation of the true dependency graph) distros often must go through extra trouble to unbundle third-party libraries when present. Some projects even add their own custom patches to their bundled sources. Resist the urge to do this.

- Include accurate copyright information. Put a license header at the top of every source file. Any serious distro will need to do at least a cursory inspection of licensing info to make sure it meets requirements.

- Make proper source release tarballs. Do not depend on your version control tool being available at build time. Do not depend on the autotools being available at build time. Use 'make distcheck' to make a fully bootstrapped tarball to distribute.

- Do not make any use of the Internet during a build. That means no downloading third-party libraries, pre-built binaries, etc. It's imperative that a build can succeed without network access, and some distributions isolate builds from the network to ensure they don't misbehave.

- Do not hardcode absolute paths to binaries. No /usr/bin/bash or etc. Your assumption will surely fail on a non-trivial number of systems. Find the location of a binary at configure time by inspecting $PATH. GNU autoconf can do that and substitute the absolute file name where it's needed, such as in a script's shebang. The same advice can be applied for anything else you need an absolute file name for.

- Do not assume /usr exists. The Filesystem Hierarchy Standard is not as popular as it used to be. This is a more generalized form of the previous point. Again, if you use the Autotools you will be doing the right thing by default.
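To make a couple of these points concrete (the autotools skeleton and the configure-time binary lookup), here is a minimal sketch; the project name "myapp" and script name "myscript" are placeholders:

```
# configure.ac -- a minimal sketch
AC_INIT([myapp], [1.0])
AM_INIT_AUTOMAKE([foreign])
AC_PROG_CC
# Find bash in $PATH at configure time instead of hardcoding /usr/bin/bash.
# A template "myscript.in" whose first line is "#!@BASH_BIN@" gets the
# absolute path substituted in when configure generates "myscript".
AC_PATH_PROG([BASH_BIN], [bash])
AS_IF([test -z "$BASH_BIN"], [AC_MSG_ERROR([bash not found])])
AC_CONFIG_FILES([Makefile])
AC_CONFIG_FILES([myscript], [chmod +x myscript])
AC_OUTPUT

# Makefile.am -- automake supplies install, uninstall, dist, distcheck, etc.
bin_PROGRAMS = myapp
myapp_SOURCES = main.c
```

With just this much, `./configure && make && make install` works with the standard knobs (--prefix, DESTDIR, CC, CFLAGS), and `make distcheck` produces the self-contained release tarball mentioned above.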

There's surely more, but that's what I can think of right now. Surely a Debian developer or someone else has compiled a more thorough list. Anyone know of one?

I think that today's software being so difficult to build is making practices that are frowned upon by distributions (for very good reason) seem like acceptable solutions, which leads us to the growing popularity of Docker, Snappy, and Flatpak. "This software is nearly impossible to build, so just use my {Docker,Snappy,Flatpak} bundle!"

tl;dr - Make your software easy to build, don't just package up a mess.


> tl;dr - Make your software easy to build, don't just package up a mess.

You've pretty much hit the nail on the head. You forgot one additional bit though, please for the love of god don't have a crazy web of dependencies.

I see a lot of Node and Ruby apps online that I think would be incredibly useful in the Fedora package collection, and I have considered contributing them on more than one occasion. What always stops me is the 50+ dependent NPM packages or Gems they require that aren't already packaged by someone else.

The incredibly annoying part is that most of these packages provide minimal functionality you could have just implemented yourself, or shouldn't in turn need another 3-10 transitive dependencies of their own. I get that not re-inventing the wheel is generally a good idea, but please try to pick your dependencies wisely if you want to see a distribution include your package, because a volunteer maintainer likely doesn't want to be responsible for your package + a dozen or more dependencies if they can avoid it.


As a Fedora user, the same problem used to exist for Perl packages: some you could install via yum, and some you'd have to get from CPAN. It's been probably a decade since I did anything serious in Perl, so I'm not sure if the issue still exists, but I suspect it does because I have seen the same issue with Python packages and the pip tool.

I see more and more programming languages trying to bundle their own dependency management tools. Ten years ago I thought CPAN was great; nowadays I'm not as sold. It's basically reinventing distro-style package management, but in a way unique to each programming language.


To be fair, all of these language-specific package managers make distribution packaging easier as well. It's super easy to make a package for anything distributed as a CPAN package, Ruby Gem, NPM package, distutils/setuptools package, etc.

The real problem comes when developers start using these package managers with reckless abandon and letting their dependency tree grow out of control. I don't mind packaging an extra library or two, but a dozen or more is pushing my patience.


CPAN packages can be translated into distro perl packages automatically in many cases, or with few modifications. The same goes for Python, Ruby, and Node packages. See fpm[1] for an example of one such tool.

[1]: https://www.digitalocean.com/community/tutorials/how-to-use-...


That's great, but distributions rightfully don't allow fpm-generated packages. For all of these languages we've already got easy-to-use infrastructure, but maintaining a dozen or more packages just to get a single app in the repository is a huge commitment.


fpm is a bad example. Sure, it makes .deb or .rpm formatted things, but they are not proper packages by any means. Bundling up something pre-built from another packaging system is not what packaging is about. To do it right you need to build your own binaries from source code using only your own packages to provide the dependencies.


Of course, automatic translation of binary packages is a bad thing that will produce wrong results in a lot of cases, but automatic translation of source packages, with build instructions and meta-information, is a time saver. For RPM, I will have a .spec file, which I can then edit further, or use tool options to fill fields with proper values. When the .spec is ready, in most cases I will only need to update the version and changelog to upgrade to a newer version.


Tools for other languages are here:

https://wiki.debian.org/AutomaticPackagingTools


Yes, I should've mentioned that. I've been down the rubygems and npm rabbit hole a few times myself...


Got any advice for ways to bundle stuff written in Rust, Go, Ruby, or Node? All of these languages come with package-managers that encourage reusing packages from their respective ecosystems.

I packaged a GitHub clone called Gogs (written in Go) for Debian/Ubuntu, complete with Lintian support. But I had to compromise on the 'rules' file and add a "get-orig-source" target that uses Go's package manager to grab all of the dependencies. I used that rule to grab all of the source files required to create the source package (which can then be built in isolation).

But if I understand Debian's official packaging rules, this is verboten because it winds up including a bunch of interconnected third-party libraries. Since I didn't write Gogs or any of its dependencies, I can't exactly go through and eliminate all external dependencies. And even if I could, much of Go's standard library exists only in ecosystem form.
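For the curious, the compromise described above looks roughly like this debian/rules fragment; the fetch command and version variable are illustrative only, and this is exactly the part that runs afoul of the no-bundling policy:

```make
# debian/rules fragment (sketch): pull dependencies once, at tarball-creation
# time, so the resulting source package can later be built offline.
get-orig-source:
	go get -d ./...    # illustrative: fetch all Go dependencies into the tree
	tar --exclude-vcs -caf ../gogs_$(UPSTREAM_VERSION).orig.tar.xz .
```

(UPSTREAM_VERSION is a placeholder; a real rules file would derive it from debian/changelog.)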

How should a prospective package maintainer handle these kinds of ecosystems? Trying to distro-package every library (Perl-style) would be a Herculean effort, and could conceivably be met with hostility by the upstreams.

There is so much software being written in Go/Rust/Ruby/Node/etc. How can we go about packaging it?


There's no easy road. We need to convince upstream to change their ways. The proliferation of language-specific package managers is a problem that many people don't yet understand is a problem. I often get hostile reactions when I advocate for general-purpose systems package managers over language-specific ones.

In the meantime, we can use the information available in these language package managers to help bring that software to the systems package managers. How easy it is all depends on the language. If the language/package manager is sane and the build system isn't conflated with the package manager, we can make quick progress by writing importer scripts that automate most, but not all, of the work. Node, Go, and Java are utter nightmares for various reasons. Python is pretty good. Ruby is somewhat annoying but doable. It seems that Rust is decent but I haven't used it. All I know is that someone recently wrote a Crate package importer for GNU Guix (the package manager I contribute to and recommend highly) that seems to work. [0]

[0] https://www.gnu.org/software/guix/manual/html_node/Invoking-...


I watched the Rust packaging effort via the mailing list, and that seems to be going quite well.

Would you happen to know a good video or writeup on why language-specific package managers are a bad idea? I mean, the situation with C and C++ libraries seems significantly worse to me, and I personally really enjoy having the full Crates.io index at my disposal on any box that runs Cargo.


I think language-specific package managers are just fine for sharing source code for that language with other developers of that language. But as soon as you need to do more than that, they become extremely problematic. The dependency tree for a Crate (or a package in any other similar system) ends where the Rust dependencies end. It cannot describe the full dependency graph for any package that depends on a program or library written in another language. I'm more familiar with Ruby, so here's a real-life example: the mysql2 gem needs to compile a C extension and link against libmysqlclient. However, the 'gem' utility only handles Ruby dependencies, so in order to 'gem install mysql2' you need to use your system package manager to provide libmysqlclient. There's always going to be this disconnect, and you'll have to use multiple package managers in order to get anything working. It's very error prone. Wouldn't it be great if a single package manager could describe the complete dependency graph? This is a big reason why I advocate for GNU Guix. I do devops for a living, and much of the difficulty I face is due to problems trying to glue multiple package managers together.


Not your parent, but I personally believe that using "package managers" to describe both of these things conflates the issue. That is, both are valuable, for different reasons. The shortest way that I can describe it is this: I use my system package manager to install things for software that I'm not developing, but language-specific managers for software that I am actively developing. When I used to write a lot of Ruby, I had my own install, but now that I mostly write Rust, I have Ruby installed via apt-get.

The two styles of managers just have different goals. That is, a package manager such as apt has the goal of creating a single, harmonious system from stuff written in many languages. But a language-specific package manager like Cargo has the opposite goal: to provide a good way of writing software written in one language across multiple systems. This is where most of the tension comes from. The rest of it is from the same general structure, but with different specifics: the goals of these kinds of systems are very different, and conflicting.

Software is hard.


There's no need for two styles of package managers. GNU Guix and Nix can serve both purposes (and more) very well, for example.

I think language-specific package managers are fine for easily sharing source code amongst developers using the same language, but they shouldn't be used in a production system.


I don't necessarily disagree. I think your third sentence still implies two styles, which is contradicting your first.

Nix and Guix don't work on Windows, right? They're still not close to a solution until they do.


If you're on Windows, then fine use whatever is available. A language specific package manager is about as good as it gets there. There are no good package managers for Windows, and I don't think there can be. I don't even know if you can isolate builds in a container like you can on GNU/Linux. That's a crucial OS feature. Besides, I aim to liberate users, not enslave them, so I develop for the GNU system, not Windows.

The third sentence is not a contradiction. I'm just saying that I can live with people using language-specific package managers, but really they would be better off with a general-purpose one.


There are several package managers for Windows:

https://en.wikipedia.org/wiki/List_of_software_package_manag...


Debian has a package for rustc and cargo in Stretch, specifically for helping package stuff written in Rust. They also have a way of converting crates.io packages to Debian packages for this purpose. Asking about this on http://lists.alioth.debian.org/pipermail/pkg-rust-maintainer... is probably the best way to get advice, that's where the people doing this work congregate.


The problem is, in my experience "use autotools" and "don't bundle third-party dependencies" make my software much harder for my users to build, and distributions are generally going to have out-of-date versions.

I'm not saying packaging is easy, and I do try to make it friendly for distributions, but don't pretend that doesn't make it worse for general users in the process.


You don't have to use autotools, but it makes doing the right thing easy. If you want to use CMake (I much prefer it myself!) just make sure you use pkg-config, same with scons or whatever. These are all included in Fedora's (my distro of choice) package collection and there's no trouble using them to build.
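As an illustration, a CMake fragment along these lines finds a system library through pkg-config; "libfoo" and "myapp" are placeholders for a real dependency (one that ships a .pc file) and your target:

```cmake
# CMakeLists.txt fragment -- a sketch using CMake's FindPkgConfig module
find_package(PkgConfig REQUIRED)
# Fails at configure time, not link time, if libfoo's .pc file is missing
pkg_check_modules(FOO REQUIRED IMPORTED_TARGET libfoo)
target_link_libraries(myapp PRIVATE PkgConfig::FOO)
```

The IMPORTED_TARGET form needs CMake 3.6 or later; older releases can use the FOO_CFLAGS and FOO_LDFLAGS variables that pkg_check_modules sets instead.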

But please, if you decide to bundle third-party libraries, make sure you can build without them and use the ones provided by the system instead. It's a political nightmare to include packages with bundled libraries because it makes security updates a huge headache: we can't simply rely on Anitya (https://release-monitoring.org/distro/Fedora/) to notify us that a new release of the library is available, and then there's the extra work of actually updating the bundled copy once we do find out an update has been published.


It is best not to bundle third-party libraries; instead, have your build system check whether the libs are installed and, if they aren't, download and build them. rleigh mentions below that CMake can do that.


It absolutely does not make it worse for general users in the process. The whole point is to help the user! I'm not saying there aren't problems on the distro side. Distributions like Debian do have the problem of moving much too slowly, and apt and the other "imperative" package managers are severely flawed, but the basic best practices I outlined make things better for all users.

And I'm not saying that you should never provide some prebuilt binary to your users if their distro is lagging behind. And if you really feel the need to bundle third-party libs then just make sure there are configure switches that can be flipped so that system libs are used instead. The best thing for users is for them to be able to get all of their software from their distro, and that requires distros and upstreams to each do their part.


> I don't think having developers doing the packaging is an ideal situation. I often hear developers complaining about the multitude of GNU/Linux distros because they think it's somehow their responsibility to provide binaries.

It is in many scenarios. If I have a little app I want to package, then it becomes my responsibility. For commercial software I make, it always is, and this is part of the reason that Linux sucks for commercial software.

And then there's issues like security patches. Developers need to know what branches are used downstream.


>It is in many scenarios. If I have a little app I want to package, then it becomes my responsibility. For commercial software I make, it always is, and this is part of the reason that Linux sucks for commercial software.

You are talking about proprietary software, where developers have unjust power over users. If you want to distribute such software then yes, you have to do the work of making binaries for each distro you want to support by yourself. I would argue that it's not GNU/Linux that sucks here. If instead you gave your users freedom by using a free software license on the source code, then others may package the software for use on the system of their choosing.


> - Don't bundle third-party dependencies. For security (and for better documentation of the true dependency graph) distros often must go through extra trouble to unbundle third-party libraries when present. Some projects even add their own custom patches to their bundled sources. Resist the urge to do this.

If any of the dependencies aren't currently packaged in Debian, how would one follow this guideline?


Install them separately, but don't embed. CMake provides features like the external project stuff which lets you fetch and build other sources. But even then, you don't need to embed that in your source tree either--do it at a higher level which builds all the dependencies plus your own sources. This keeps your sources free of embedded junk, giving it the flexibility to be packaged, or used standalone.
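A sketch of that higher-level "superbuild" layout, with placeholder names, URL, and hash:

```cmake
# Top-level CMakeLists.txt, kept OUTSIDE the project's own source tree
include(ExternalProject)
ExternalProject_Add(foo_external            # the third-party dependency
  URL        https://example.org/foo-1.0.tar.gz   # placeholder
  URL_HASH   SHA256=0000000000000000000000000000000000000000000000000000000000000000
  CMAKE_ARGS -DCMAKE_INSTALL_PREFIX=${CMAKE_BINARY_DIR}/stage
)
ExternalProject_Add(myapp                   # your project, built afterwards
  SOURCE_DIR ${CMAKE_CURRENT_SOURCE_DIR}/myapp
  CMAKE_ARGS -DCMAKE_PREFIX_PATH=${CMAKE_BINARY_DIR}/stage
  DEPENDS    foo_external
)
```

The project's own tree stays free of embedded copies, so it can still be packaged normally against distro-provided libraries.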


In addition to what rleigh said, you could package them for Debian :)

https://mentors.debian.net/intro-maintainers


That's a fascinating link.

If I wanted to package my third-party dependency, the first thing I would do is "learn about personal interests of sponsors" and see if my third-party dependency and a sponsor's interests intersect. There's a link to a page describing the sponsoring process, where apparently I'd file a bug against a "sponsorship-requests" pseudo-package and then, I guess, wait.

Next (or perhaps concurrently) I'd file a separate "Intent to package" bug against the "Work-Needing and Prospective Packages" pseudo-package. There's a whole page about WNPP and format guidelines for submitting said bug using the "reportbug" tool. Those format guidelines are longer than the JSON spec.

Then I'd still need to make the package, after all. That link you gave lists five important reference materials, one of which is said to be "must read" and has 12 chapters and 7 appendices. There's also a "New Maintainer's Guide".

Then I need to publish my package. There's an account to sign up for. Plus I'll need to create, keep up with and sign stuff with a GPG key because uploads are http/ftp only.

Once that is finished I apparently get an email response. Finally... I am done!

Now it's time to find a sponsor.

There's a whole section on what to do if you can't find a sponsor. The first is to follow up on the WNPP request I was supposed to make six paragraphs ago. The other is apparently to look up sponsors in a sponsor search-engine on the Debian website and bother them.

Then there's another section on actually getting the package into debian through an ftpmaster. (Both the sponsor and the non-Debian Debian-package maintainer are ominously reminded here that the ftpmaster's _opinion_ on inclusion is binding.)

And then maintaining it.

I would be, for the life-time of my application, maintaining the Debian package of one of my third-party dependencies. This, in response to my query about how to be a good upstream citizen in the hopes that downstream maintainers can more easily package my application! :)


The amount of documentation there suggests it's not so easy. Especially when you're trying to get something unrelated done.


If you are doing something unrelated you probably aren't interested in packaging some dependency for Debian, so you may as well just manually compile and bundle your deps into a container format like docker/appimage.


Some more best practices:

https://wiki.debian.org/UpstreamGuide


Thanks!

