Debian's 'synaptic'. Help improve it and port it to other platforms, if needed. Less pain all around.



Please god no.

Apt itself is barely good enough to handle libraries written in C, much less a dynamic language with multiple potentially-incompatible runtimes, and Debian's policies are dead set against making anything remotely wholesome:

  * As an author, you're at the mercy of middlemen who have a stranglehold on easy distribution
  * License wankery (fuck debian-legal)
  * Teenagers randomly patching upstream software without review
  * Shipping non-standard configurations, often with features randomly disabled
  * Rearranging everything to fit their naive 'filesystem hierarchy'
    (this completely fucks up a decent packager like Ruby Gems)
  * Breaking off features into separate packages whenever possible
  * Shipping ancient versions of software with a selection of patches picked
    specifically to introduce no features, just cherry-pick 'bug-fixes'
  * Shipping multiple versions of a runtime with mutually-exclusive depgraphs
  * FUCKING RELEASE FREEZES
    There's no goddamn reason for any non-system software to be frozen ever
Ubuntu is making a decent stab at unfucking all this (at least on their turf) with PPAs: https://help.launchpad.net/Packaging/PPA
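
For anyone who hasn't tried a PPA, the whole dance is roughly this (a sketch from memory, assuming the add-apt-repository helper from software-properties is installed; the PPA and package names here are made up):

  # fetches the PPA's signing key and drops a sources.list entry for it
  sudo add-apt-repository ppa:some-author/some-ppa   # hypothetical PPA
  sudo apt-get update
  sudo apt-get install some-package                  # hypothetical package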


These are all problems with Debian's use of synaptic; the program itself is a very good package manager.


Synaptic is a major improvement over the original apt tools -- while it's slower, it actually handles dependencies correctly most of the time, and doesn't throw anal-retentive errors.

Unfortunately, the underlying apt system has plenty of shitty behaviors that aren't even related to Debian's shitty packaging policies or the hostile original implementation:

  * Only one process can even read the db at a time! (quick demo below)
  * It's extraordinarily fragile
    * Loves to crap out at the slightest network failure
    * Will corrupt its database on SIGINT
  * Hamhandedly muddles up installation and configuration
  * Does not handle optional dependencies well
  * Poorly handles only part of the 'alternatives' problem
A lot of its failings are rooted in the assumption that installation will be fast enough to be interactive, so why bother?
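
To make the locking point concrete, here's the sort of thing that bites you (sketch only; the package names are made up):

  # the first transaction holds the dpkg/apt lock for its entire run
  sudo apt-get install -y some-big-package &     # hypothetical package

  # a second invocation started in the meantime doesn't queue up politely,
  # it just bails because it can't take /var/lib/dpkg/lock
  sudo apt-get install -y another-package        # hypothetical package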


Actually I believe a few of these are problems with the parent poster: "* Rearranging everything to fit their naive 'filesystem hierarchy' (this completely fucks up a decent packager like Ruby Gems)"

The FHS is a known standard which works with every language. What specifically about Ruby makes it unique amongst all other software?


Gems (like NeXT / OS X .app bundles) are self-contained, with the documentation and data resources alongside the code in a standard way. This makes it very easy to support having multiple versions of the same software installed simultaneously, with an optionally qualified import statement to disambiguate.
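
Roughly, and with arbitrary version numbers picked for illustration:

  # several versions of the same gem install side by side
  gem install rack -v 1.0.1
  gem install rack -v 1.1.0
  gem list rack        # shows both

  # the 'gem' call activates a specific version before require
  ruby -e 'gem "rack", "1.0.1"; require "rack"; puts Rack.release'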

The FHS inspires maintainers to large amounts of useless and regressive tedium in re-separating the peas and the carrots into global piles. It's not so bad with traditional C libraries, but the brokenness is immediately obvious when dealing with the libraries of a language that has anything resembling a module system.

What's specific to Ruby is that their community somehow managed to not fuck up their packaging medium.


Yes, but native package managers already allow multiple versions to be installed simultaneously.

'What's specific to Ruby is that their community somehow managed to not fuck up their packaging medium.'

Overwriting global binaries in /usr/bin seems pretty fucked to me, and I don't think I'm alone in that. Say I'm using Puppet or oVirt or other Ruby-based system apps -- I wouldn't want Gems breaking them. If Python did this (being the basis for most Linux distros' tooling), or Perl did this on older Unix, there would be hell to pay.
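
To be concrete about the multiple-versions point: the native way is separate, version-suffixed packages that install cleanly alongside each other (package names from memory, lenny-era):

  # each interpreter version is its own co-installable package
  sudo apt-get install ruby1.8 ruby1.9
  ruby1.8 --version
  ruby1.9 --version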


    There's no goddamn reason for any non-system software to be frozen ever
What about, I don't know, developers who release early and often without good test coverage? If you're putting your seal of approval on a bunch of software, you probably want to make sure that it works. This cannot be done instantaneously.


That doesn't mean you should be shipping the version of Firefox released when you last did a freeze, whether that's 6 months or 3 years ago. Even worse, with the way the freeze cycle is done in practice, they need to stop all updates even for bleeding edge users for several months (Ubuntu) or 6-18 months (Debian).

In a rolling release system you don't have to have a single imprimatur of package approval. Everyone implements it with at least 'stable', 'new', and 'fucked' markers for packages, and you can go way further with multiple repos and overlay semantics.
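
Apt can at least fake the overlay part with preferences/pinning, for whatever that's worth. A sketch, with arbitrary priorities and a made-up package name:

  # write /etc/apt/preferences: prefer stable, keep testing available
  printf '%s\n' \
    'Package: *' \
    'Pin: release a=stable' \
    'Pin-Priority: 900' \
    '' \
    'Package: *' \
    'Pin: release a=testing' \
    'Pin-Priority: 400' | sudo tee /etc/apt/preferences
  sudo apt-get update
  sudo apt-get install -t testing some-package   # pull one thing from testing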


Have you heard of Debian stable, testing, unstable, and experimental?


testing is the only one of those that's remotely usable, and it gets frozen as soon as they start thinking about releasing -- etch was frozen for five months, lenny for seven. Sid/unstable is generally frozen whenever testing is, is still pretty laggardly the rest of the time, and is constantly broken anyway. I've never gotten anything useful done with experimental.

At least volatile has been around for a few years, so people aren't fucked on tzdata and such because of bullshit policy, though I think it's still off by default.


  * Rearranging everything to fit their naive 'filesystem hierarchy'
    (this completely fucks up a decent packager like Ruby Gems)
Well, who made Ruby Gems install to /usr/bin by default and potentially interfere with or overwrite system-installed software in the first place? Sorry, but I think the "Linux Standard Base" is a good thing.


To be fair, nearly all package management systems install to /usr/bin by default, and run roughshod over anything underneath them. At least gem can be configured to install in your home directory, and utilize dependencies that are already present. Apt is completely incapable of supporting anything like that.
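
e.g., roughly (from memory; exact flags may vary across RubyGems versions):

  # install gems under your home directory instead of system-wide
  gem install --user-install rake

  # or point GEM_HOME somewhere you own
  export GEM_HOME=$HOME/gems
  export PATH=$GEM_HOME/bin:$PATH
  gem install rake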

The LSB is trivial spec-wank. A whole bunch of effort that doesn't address any of the actual real-world portability issues.


There's no goddamn reason for any non-system software to be frozen ever

You may want to re-think this, from a perspective other than the single-user system.

Consider an organization that uses its own software, which may not be very good but does the job productively. Imagine that one piece produces output that a browser must present. The tested and supposedly stable system is updated. With a major new version of the browser. Which now renders the output as a blank page.

Hilarity ensues. Or perhaps not.


No such organization would have its desktops update themselves straight from the internet. You either have them connect to your own internal update server, or you do regular full system imaging; both approaches are straightforward and widely used on all major platforms, with explicit support from upstream.
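
Pointing boxes at an internal mirror is a one-liner (sketch; the hostname and suite here are made up):

  # replace the stock sources with the internal mirror, then update
  echo 'deb http://apt.internal.example.com/ubuntu hardy main restricted universe' \
    | sudo tee /etc/apt/sources.list
  sudo apt-get update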


I sincerely doubt porting that to other platforms would be very easy, but perhaps you and I have different definitions of "pain." :)

Plus, you'd have to worry about minor-yet-annoying namespace issues (e.g., a Python package and a non-Python package sharing a name).

Am I the only one not offended by the idea of package managers for each programming language? They always work better when they're tailored to the language.


> Am I the only one not offended by the idea of package managers for each programming language?

Learning multiple tools to do a similar job is a bit of a nuisance. If you know one tool, you can learn its details over time. That's much harder with multiple tools. Does tool X remove configuration files when uninstalling? Can it even uninstall? Do I need to update some configuration files manually?

There's also the part where you sometimes need integration with packages from other package managers: system libraries (libcurl, etc.), header files if something gets compiled at install time, make, a compiler, and so on.

But since the problem is already being solved by multiple tools (cpan, pear, apt, yum, just plain ./configure + make, etc.), maybe we should work more on integrating those package managers and less on replacing each other's. It seems unlikely to happen with so many incompatible personal preferences around...


There are cases where they don't work better: when your non-Python program needs something from Python for scripting, or when a Python module requires a 'classic' library. A global system is quite good in those cases.

I'm not sure what you mean by the namespace issues. Everything that's installed as a Python package in Debian is prefixed with "python-"...
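
You can see the convention for yourself:

  # Debian's Python modules all live under the python- prefix
  apt-cache search --names-only '^python-' | head
  sudo apt-get install python-simplejson   # vs. upstream's plain 'simplejson'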


About the porting and the pain: I believe the hard parts are defining the metadata, and refining the actual program logic. In my experience over the last decade, 'synaptic' has been the most trouble-free system to use.

I think it would be less work to clone/port the needed logic bits to Windows/whatever, and share most of the metadata defined for Debian/Ubuntu/etc, instead of redoing (and debugging) everything from scratch.


Or perhaps start with something simple, like Arch's 'pacman'...



