All the wonderful people who package software for their distro!
We take for granted the ability to install anything via our package managers, but it's made possible by the efforts of a vast number of contributors who often just liked a piece of software and stepped up so that everyone else could use it too. Too often they get zero recognition for their efforts.
I'm doubly grateful because I'm also an upstream maintainer who doesn't have to try to keep up with all the various packaging processes.
If you want to support them, the next time some software is unavailable or lagging behind in your distro, see what you can do to help!
As someone who has packaged a lot of software for Debian, CentOS, Gentoo, NixOS, and GNU Guix, I have to agree -- Debian is by far the most difficult platform.
I actually believe much of the Docker hype can be attributed to this. Why bother learning packaging when you can just script something and throw it in a container?
Now I just use Guix on any distro for my up-to-date/custom software needs.
I do recall that Debian packaging, coming from the RPM world, had a very steep learning curve, but that mostly felt like an outsized up-front investment rather than ongoing pain. (RPM was too long ago for me to say if it was similar.)
I'm curious [1] which aspects you find make the Debian (and Ubuntu?) system particularly difficult. Specifically, I wonder if those aspects are difficult by design, in order to "force" effort on the part of the packager to extract benefit on the part of the user or distro, if they're arbitrary, or, perhaps, if they're by-design but misguided.
[1] No horse in this race... mostly an end user whose packaging work is limited to internal use only.
There's a big gap between making a working .deb file and getting something to pass QA to get into the archive.
It's been quite a while since I looked at it, but the documentation was fragmented and not up to date. Things have gotten better now that you can use git-buildpackage [0], but it still feels arcane.
Challenge - take a package like wget and try to update it to the latest version from upstream Git (or whatever). It should be one or two commands, but I've never been able to make it work easily. It's even harder if you're trying to port something across from Ubuntu or to use a fork like wget-lua [1].
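For reference, here's roughly what that "one or two commands" flow is supposed to look like, sketched as a small Python wrapper around git-buildpackage (assumptions on my part: a gbp-style packaging repo with a working debian/watch file, and gbp/uscan installed; in my experience it's exactly these steps that tend to fall over):

    #!/usr/bin/env python3
    # Minimal sketch of the "one or two commands" upstream-update flow
    # with git-buildpackage. Assumes a gbp-style packaging repo with a
    # working debian/watch file; any of these steps can fail in practice.
    import subprocess

    def run(*cmd):
        print("+", " ".join(cmd))
        subprocess.run(cmd, check=True)

    # Fetch and import the latest upstream release located via debian/watch
    run("gbp", "import-orig", "--uscan", "--pristine-tar")
    # Open a debian/changelog entry for the newly imported version
    run("gbp", "dch")
    # Rebuild the package from the updated packaging branch
    run("gbp", "buildpackage", "--git-ignore-new")

When it works, that really is the whole loop; the usual failure points are the watch file and new upstream dependencies.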
> There's a big gap between making a working .deb file and
I assume none of us is even talking about this case. That is, a trivially working .deb could provide little more functionality than a tarball, so that's hardly interesting.
> getting something to pass QA to get into the archive.
The last part I have no knowledge of, since I only ever put my packages into internal archives. Were there political aspects, or only technical?
For the technical, I'm still looking for (ideally from the GP, but from anyone who shares the opinion), specifics.
> the documentation was fragmented and not up to date.
This is a common complaint I hear/read, and I recall it was part of my initial learning investment, figuring out where to go for what information. Now it's mostly bookmarked, and changes tend not to be drastic.
> Things have gotten better now that you can use git-buildpackage
One thing I found, repeatedly and consistently, was that every single external (i.e. not from Debian) utility that tried to make the process "better" or "easier" ultimately did the opposite, since it would hide or abstract away an important aspect of the packaging process.
> Challenge - take a package like wget and try to update it to the latest version from upstream Git (or whatever). It should be one or two commands, but I've never been able to make it work easily.
I can't recall if I've done wget specifically, but I've done this kind of thing before without much issue, assuming that latest version actually builds and runs on that platform without porting work and without needing a ton of new dependencies (or new versions of existing ones).
I tended to find most of my time was spent chasing down and re-packaging various libraries whose older versions were no longer good enough.
> It's even harder if you're trying to port something across from Ubuntu
I'm not sure what you mean, since Ubuntu is already using Debian packaging.
> or to use a fork like wget-lua
Certainly forks are going to be more effort, but my experience is that this is true even in their unpackaged state, if they require more exotic (or specific) dependencies be pre-installed, a custom build process, or any porting work. If someone had already done all that work, comprehensively documented it, but merely not translated it into debianization, it would save me a ton of time in creating a custom package.
Sure. These were in approximately chronological order (though Firefox's Sync has occasionally swapped a couple). They're also at least a couple of years old, so some may be much older than that and are now dead links without the use of archive.org.
You can somewhat see the evolution of the python package debianization ("python policy").
I know I'm going to get downvoted to hell for this, but am I the only one who thinks of these people's efforts as a huge amount of wasted time? Why should anyone have to package software aside from the developer who made it? Why should it have to be done for so many different package repositories and packaging formats? Why are people ok with this?
Because the software is inconsistent (and thus buggy) in that regard. Also, barely any developer possesses the knowledge and infrastructure necessary to build their software for dozens of arcane architectures.
Packagers create the glue between the software (which is heterogeneous) and the distro (which is internally consistent). The world without this glue is a horrible mess and a huge waste of time; the old Slackware is a good example. It's thanks to the distro makers that you don't have to spend days collecting requirements and acquiring arcane knowledge just to install a piece of software.
So: yes, those people devote a huge amount of time - so the countless users around the world don't have to. The net time value is positive.
I think we need to distinguish between software that is used as a component in a larger system (e.g., the core OS, which is the core business of a distribution) and an application that is not part of the larger system (distribution) but merely runs on top of it. For putting together a distribution (with tightly integrated components), the traditional packaging philosophy is probably very well suited. For add-on third-party software that is not an integral part of the distribution but merely wants to run on it, not so much.
An independent software author (e.g., Ultimaker or Prusa) just wants to reach all "Linux" users at once without dealing with different distributions' policies, and without being stuck with whatever version of, e.g., Qt happens to be in a given distribution. And as a user of their software, I want it on my "Linux" system at the same moment Windows and macOS users can have it.
A lot of the strength of open source comes (much like in computation in general) from chopping tasks into smaller and smaller pieces. The upstream developer is unlikely to be skilled at packaging for a number of different platforms. If platform users can rely on others who are skilled at packaging, we all benefit.
The biggest reason, both historical and present, that we have many different package repositories and packaging formats is the concept of shared libraries and its desired benefits: security, stability, speed, freshness of volatile data, and reduced disk usage.
And from what I have seen, there is only one primary alternative being proposed: containers. Have a copy of all needed libraries in every package and leave it to the developer to patch and fix security issues. Occasionally put a few things into the operating system, but then you have to hope that those parts don't change, or you have to make different packages for different versions of the operating system.
What we need is a clear separation between the Core OS a.k.a. base system which should be provided by every "Desktop Linux" distribution, and the rest.
Applications should only use those shared libraries that come with the Core OS a.k.a. base system, and either link statically to or bundle the rest.
Just as an iOS application can only consume what iOS provides, or must bundle any additional dependencies privately.
The result would be a much simpler and more resilient system (at the expense of some storage and memory overhead, which is the lesser evil imho).
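If you wanted to enforce that rule today, a check could be as simple as comparing a binary's dynamic dependencies against the blessed Core OS set. A toy sketch (the whitelist, and indeed the very idea of a standardized "Core OS" library set, are hypothetical; the ldd parsing is simplified):

    #!/usr/bin/env python3
    # Hypothetical sketch of the proposed rule: an app may only link
    # dynamically against a fixed "Core OS" library set; everything else
    # must be bundled or linked statically. Usage: python3 check.py /usr/bin/wget
    import subprocess
    import sys

    # Illustrative only; no such standardized set exists today.
    CORE_OS_LIBS = {"libc.so.6", "libm.so.6", "libpthread.so.0", "libdl.so.2"}

    def dynamic_deps(binary):
        """Yield the shared-library names a binary links against (via ldd)."""
        out = subprocess.run(["ldd", binary], capture_output=True,
                             text=True, check=True).stdout
        for line in out.splitlines():
            parts = line.strip().split()
            if parts and parts[0].startswith("lib"):
                yield parts[0]

    for lib in dynamic_deps(sys.argv[1]):
        status = "core OS" if lib in CORE_OS_LIBS else "should be bundled"
        print(f"{lib:30} {status}")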
> The result would be a much simpler and more resilient system (at the expense of some storage and memory overhead, which is the lesser evil imho).
On balance, I agree with the conclusion. However, coming from a non-desktop viewpoint (server, not embedded, though I do sympathize with the latter), I don't think it's obvious that "some" overhead is worth it, nor has it historically been worth it.
At scale, size can matter; though, like I said, I think today nobody would even notice.
It's tough to "fight" that history, though, so we go through the pain of even more overhead with full virtualization before cutting it back with OS-level virtualization (a.k.a. containers) and (re-)declaring victory.
I disagree that it's a waste of time. (But I upvoted, because I think it's an important question and it adds to the conversation.)
> for so many different package repositories and packaging formats?
How many are there, though, really? In theory, the number is unbounded, but, in practice the number of distros is modest, the number of popular ones is smaller, and the number of unique packaging formats even smaller.
Although a sibling comment alluded to it with distros being internally consistent, I wanted to unpack that a bit more.
Specifically, one benefit I've found as a "user" (sysadmin/devops) is that of well-defined dependencies. This isn't inherent to packaging, but it tends to be a feature of the more mature systems and distros.
The other benefit is that it provides a more universal mechanism of traceability and, thereby, at least a path to reproducibility of builds. This has implications for security, of course, but also for debugging.
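To make the "well-defined dependencies" point concrete: on a packaged system, the dependency graph is machine-readable metadata you can query, not tribal knowledge. A minimal sketch using python-apt (an assumption on my part: a Debian-family system with the python3-apt package installed; wget is just an arbitrary example):

    #!/usr/bin/env python3
    # Minimal sketch: read a package's declared dependencies straight
    # from the package manager's metadata (python-apt, Debian/Ubuntu).
    # "wget" is an arbitrary example package.
    import apt

    cache = apt.Cache()
    candidate = cache["wget"].candidate  # the version apt would install
    print(candidate.package.name, candidate.version)
    for dep in candidate.dependencies:
        # Each Dependency is a list of alternatives ("a | b")
        print("  depends on:", " | ".join(alt.name for alt in dep))

That same metadata is what makes traceability, and in turn reproducible builds and debugging, tractable at all.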
Systems like the Open Build Service can ease the pain a bit by building for different distributions and versions, but it is a pain nevertheless. Luckily the Open Build Service instance at https://build.opensuse.org/ can also do AppImages, which run on most "Desktop Linux" systems.
Upvote from me, I fully agree with you. I'm sick and tired of people re-packaging python/npm/ruby/etc applications as distro packages. As a maintainer of a few high-traffic python libraries, it wastes my time. I am not ok with it.
Paul Davis of the Ardour project. Ardour is probably one of the best digital audio workstations ever made. Sad thing is that Ardour is such niche software that it doesn't get as much monetary support as it should.
Indeed. I make an automatic small monthly contribution of a few dollars to the project (they make it super easy to do it).
BTW since Harrison Consoles have a commercial version of it (MixBus/MixBus32C) and are actively contributing to Ardour, I wonder if they are also supporting it financially.
It's light years beyond it, it's a full fledged digital audio workstation capable of high quality studio recordings and mixing. A good analogy would be Paint vs. Photoshop.
EDIT: Not to downplay Audacity which is also awesome, it's just not meant for the same use case.
Ardour is an NLE (non-linear editor); Audacity is not. Also, Ardour's plugins act in real time. Ardour is used for recording and mixing music, film, and other things. Audacity is mainly an audio editing program.
Consider supporting Henry Zhu, the maintainer of Babel. Henry decided to dedicate himself 100% to open-source earlier this year and is one of the main reasons Babel is such an indispensable (albeit invisible) tool. Henry welcomes contributions at https://www.patreon.com/henryzhu/memberships
Seriously. It's insane that some startups raising VC money would 100% not be able to exist without Babel being maintained. Some of that money should go to Babel but won't. How do we fix this so that Henry doesn't have to keep begging? It's really broken.
Start from the projects you are using. You might find that many of them are primarily one-man efforts and could use some support.
For example, we use verdaccio (https://www.verdaccio.org/) as a private NPM server. It was so mind-blowingly simple to get a private server working that, when the opportunity presented itself, we felt compelled to donate: https://opencollective.com/verdaccio
In this case, it seems that there was an older, abandoned effort (sinopia) whose original developer, Alex Kocharin (https://github.com/rlidwka), stopped working on it. Juan Picado (https://github.com/juanpicado) picked up the mantle, and given how NPM is moving fast and breaking things, it's great to see someone keeping up with the open source solution.
The people building Micro-Manager (and ImageJ) - the (best/only/reasonable) open source software capable of driving most microscopes used in biology - and then processing that data. It's raw, it's complicated, and (non-technical, demanding, one-off) biologists are the end users. But piped through that software is nearly every piece of live/functional raw data used in all your cancer/genetics/genomics research. It's also core to many other downstream, forked, or scripted sets of machine control and data processing. It's a thankless job that literally drives humanity forward.
Urban from LibrePCB, who has been developing a free/libre EDA suite (for PCB design) mostly by himself for over 4 years now: http://librepcb.org/ The first release will hopefully be out this year.
If you think a FOSS KiCAD "competitor" with a solid and well thought-out library design and good usability should be supported, then check out his Patreon page (or his BTC address). Or – of course – contribute code.
Interesting! I'm a KiCAD user but have never heard of this. Why doesn't Urban just contribute to KiCAD instead of maintaining a separate open source project?
Apologies if he mentions it in the video -- I'm at work.
>> Why doesn't Urban just contribute to KiCAD instead of maintaining a separate open source project?
Not every open source project is built the way a person thinks it should be. The only way to know if an alternate viewpoint is better is to build it and find out. Other times a person just wants to build it for themselves for their own reasons. Either way, variety can be a good thing. One day an alternative may just replace your favorite piece of software and then you might ask why the creator started from so far behind all those years ago...
There are probably as many reasons as there are developers.
Do you know anything about this project or is this just a generic diatribe?
I was hoping to hear about his opinions of the shortcomings of KiCAD and where LibrePCB improves on them. It would certainly help someone like me decide whether to fund him on Patreon or not, seeing as I've donated to CERN for KiCAD development.
As of a couple years ago development of ntp was largely performed by one Harlan Stenn. He (and the funding issues around ntp) made the news a while back, but the guy probably deserves even more visibility, given the crucial nature of his work.
"Notorious" does mean "famous", but it carries the added implication of being famous for being bad. I suspect the meaning has changed over time, just like "awful" used to mean "awe-inspiring".
It's had its share of bugs, but the amount of work he has put into it over the years is pretty incredible. Most other video editors for Linux don't even come close, and it recaptures a lot of the lost glory of Windows Movie Maker before it was rewritten and made ugly.
Oh yes! OpenShot is so great. I haven't used it in a couple of years but when I did it was a pleasure to use as an occasional user with 0 knowledge of video editing software.
Jen Fong-Adwent, Jeff Lindsay, Dominic Tarr, James Halliday and Ben Lupton are the five I really admire and whose ideas I think don't get enough exposure (I realize they've all had successful projects, but having talked to them at length, their ideas generally speaking are fantastic and should be encouraged).
Shay Banon - Founder of Elasticsearch. He's been in the open source space for years, building frameworks like Compass, which influenced Red Hat to build Hibernate Search. He's done other work as well.
Ross Mason - Founder of MuleSoft. He built Mule, which is open source and has helped many companies better integrate services.
Joe Walnes - Joe is a beast. He's contributed to and created many open source projects: websocketd, smoothie, xstream, webbit, and many more.
Rick Hightower - A very smart guy who used to blog about technology in ways that helped many, many people. He also created Boon (a JSON parser), qbit, and many other frameworks. Because of all his hard work, he was also made a Java Champion.
I could name so many more folks. Honestly, it's amazing that more people aren't acknowledged for their efforts.
My partner Mike Schwartz founded Gluu because he was tired of recommending proprietary access management platforms like SiteMinder and IBM Tivoli that were locked behind six-figure licenses.
We're now a team of about 30 people with 10 years of development into the Gluu Server, a free open source software platform for SSO, 2FA, access management:
I've always been mystified about what the heck Tivoli is. You indicate it is an access management system. Can you please provide more details ... I'm intrigued.
Chris Hobbs / RX14 (major Crystal language contributor) is 17 years old, is a genius, and is behind a lot of the recent work on the language, including Windows support, parallelism, and other things. You can follow him on GitHub or watch him work on Crystal on Twitch:
An amazing self-hosted, cross-platform file syncing/sharing service. Being based on git allows for (somewhat) unlimited version history. Simple and elegant; I'm really surprised more people don't know about it.
Thanks! I was even going to post myself (SQLAlchemy). However, our deepest areas of need are the most boring and soul-crushing: documentation and bug triaging. Ideally someone else with full push/release access other than me. We get lots of great patches and pull requests, but I'm the only one moving it all through. I'm probably not easy to work with (but I'm open to improvement!)
iTerm2 seems to be the favourite terminal of macOS users… at least those that spend a large amount of their time there (especially with its native tmux support).
There are a lot of engineers and financial quants using Matlab, and the Octave project makes a lot of that Matlab code usable without an expensive Matlab license.
I guess it's not clear to me why (at least in the 'first world') it's wrong for quants and engineers to pay something for the tools they use. (I have spent a fair amount of time contributing to open source, so I've no problem with OS generally, but I also don't think there is anything wrong with the folks behind Matlab getting paid for their efforts. Quants are sure as hell getting paid for theirs.)
The better the free software, the greater the exploitation by other people making a bunch of money on it, while the devs work nights and weekends for nothing. I don't have a solution, but there it is.
The ones who, undeterred, maintain software they haven't built themselves.
For instance, the maintainers of Mithril.js (pygy, tivac, isiahmeadows) have been doing great work for a long time, with nothing in return. There are many like them out there on other projects, I'm sure, who don't get the thanks they deserve.
A guy called Alexey Tulinov maintains an excellent but very little-known, lightweight Unicode processing library called libnu. I really wish it had more recognition: https://bitbucket.org/alekseyt/nunicode
That's an app I use daily. A couple of years back I tried to send him some money, as a way of showing appreciation for his work. He wasn't interested so I kind of assumed money wasn't an issue for him.
Gael Guennebaud. The Eigen library powers tons of numerical heavy code -- including Tensorflow -- but doesn't get much spotlight. Watching Eigen evolve under Gael's stewardship has been amazing.
Great framework for building highly scalable, low-latency services (and developer friendly). It would be great to have some detailed tutorials and documentation about its internals.
Jonathan Westhues, author of Solvespace. There are a number of hard things to implement in that code and he did them all initially. Others have been making good contributions too, but it could use more developers. I've been digging in the code myself but have not made a pull request yet. We shall see...
This isn't a Show HN post with an open bandwagon. It's about nominating underrecognized achievers, and that recognition can only be done objectively by someone other than the subject in question.
I'm ok with being tactless, if it highlights something important. :)
eg I just pointed out the project (DB Browser for SQLite) that I've been helping out for years. We're widely used, people say very good things about us, and we could definitely use more funding. ;)