
Ah, he gets it. Shared libraries are a good idea if you only consider performance and idealized scenarios. In the real world, where users just want their programs to keep working, vendoring absolutely everything (the way Windows apps do it) is essential.



Huh - are DLLs not a thing anymore?

IMO the best reason to use shared libraries has always been cases where updates are likely critical and there's a well-defined interface that is unlikely to have breaking changes. Like libssl. If there's a security fix, I'd love to have `apt update` apply that fix for everything, and not worry about verifying that everything that might have statically linked it gets updated too. Although ideally the latter would be easy anyway.
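
For a concrete illustration (a minimal sketch, assuming OpenSSL 1.1+ and building with something like `cc demo.c -lcrypto`), a dynamically linked program picks up whatever libcrypto is on disk when it launches, so replacing the shared library patches it without a rebuild:

    /* demo.c -- build with: cc demo.c -lcrypto -o demo */
    #include <stdio.h>
    #include <openssl/crypto.h>     /* OpenSSL_version() */
    #include <openssl/opensslv.h>   /* OPENSSL_VERSION_TEXT */

    int main(void) {
        /* The header macro is baked in at compile time... */
        printf("built against: %s\n", OPENSSL_VERSION_TEXT);
        /* ...but this reports the shared library actually loaded at
           runtime, which an `apt update` may have replaced since. */
        printf("running with:  %s\n", OpenSSL_version(OPENSSL_VERSION));
        return 0;
    }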

It's also nice for licensing - being able to have dependencies that you don't have to redistribute yourself.


> Huh - are DLLs not a thing anymore?

There are some shared DLLs too, perhaps most notably the OS APIs. But 99% of the DLLs on my system sit at Programs\specificprogram\x.dll

"Dynamically linked", doesn't imply "Shared between multiple executables". The trend on windows just like on Linux and Mac is for bundled-everything even for the thing you'd think was last to get there on windows: things like C++ runtimes and .NET frameworks are now bundled with applications.

The tradeoff between patching in one place and being forced to maintain backwards compatibility (effectively making the runtime an OS component) was won hands down by the bundle-everything strategy.


Many apps on Windows bring their own local DLLs that no other apps on the system use.


'Dynamic linking' and 'shared' are different things though.

You can have DLLs bundled with your app, so that the app links at runtime, but they're not really for 'sharing' at the OS level, so to speak.
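
Something like this (a rough sketch; "plugin.dll" and its "do_work" export are made-up names for illustration) is how an app links a bundled DLL at runtime. The default search order tries the application's own directory first:

    #include <windows.h>
    #include <stdio.h>

    typedef int (*do_work_fn)(int);

    int main(void) {
        /* Under the default DLL search order this finds the copy
           sitting next to the .exe before any system-wide one. */
        HMODULE lib = LoadLibraryA("plugin.dll");
        if (lib == NULL) {
            fprintf(stderr, "LoadLibrary failed: %lu\n", GetLastError());
            return 1;
        }

        /* Resolve the (hypothetical) exported function by name. */
        do_work_fn do_work = (do_work_fn)GetProcAddress(lib, "do_work");
        if (do_work != NULL)
            printf("do_work(21) = %d\n", do_work(21));

        FreeLibrary(lib);
        return 0;
    }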

I think the number of shared DLLs should be very small and, as you say, limited to things with a very stable ABI, like OS-level components.

Everything else - in the bundle.


As others mentioned, most now sit alongside the executables.

To give some historical perspective on why, check out this Wikipedia article on "DLL hell", more specifically the "DLL stomping" section:

https://en.wikipedia.org/wiki/DLL_Hell


To expand on the other answer you got: there's nowhere to put shared DLLs by default. You install what you need in your own directory under Program Files, and at that point you may as well statically link them.


Or you may as well not. DLLs sitting next to your executable are pretty much like static linking, except with one crucial difference: they can still be swapped out by the end user if the need arises. For instance, to apply a fix, or to replace or MITM the DLL for any number of reasons. It's a feature that's very useful on those rare occasions when it's needed.


Or your program's DLL can get replaced with some crappy malware, and now you can't figure out why Super Editor 2013 is mining cryptocurrency.


There are three ways this could happen:

- An installer you used could've been compromised. For example, the attacker swaps out a DLL for a bad one, uploads the modified installer to a file sharing site, and gets you to download it from there.

- The application has its DLL swapped or modified on the fly, before or during installation, by pre-existing malware on your system.

- The DLL is replaced at some point post-installation.

All of these attack vectors can be pulled off against a statically linked program too, and the privileges they require also allow for more effective attacks, like modifying a system component or shipping in a separate malware process. A crypto miner will be more effective if it's not tied to the execution of Super Editor 2013, even if it's delivered by its installer :).

Problems with malware have little to do with dynamic linking; they stem from the difficulty of managing the execution of third-party code in general.


> All of these attack vectors can be pulled off against a statically linked program too

Yeah, but then the attacker would have to pull them off against a bazillion apps, instead of just infecting a bunch of more or less generic DLLs and then replacing all copies of those wherever he finds them.


Which is why I said, "the privileges they require also allow for more effective attacks". If you can scan my system and replace popular DLLs in every application that bundles them, you may as well drop a new Windows service running your malware. Or two, and make them restart each other. Or make your malware a COM component and get the system to run it for you - my task manager will then just show another "svchost.exe" process, and I'll likely never notice it.


> If there's a security fix, I'd love to have `apt update` apply that fix for everything, and not worry about verifying that everything that might have statically linked it gets updated too. Although ideally the latter would be easy anyway.

While it is one failure mode, and a well-understood one, to find that you've been owned through an application's hidden dependency with a security hole you didn't know about, the `apt update` approach means your ability to fix a security hole is now waiting on the time it takes to update thousands or tens of thousands of programs, 99% of which you don't and never will use, against the One True Copy of the library.


Do security fixes break general usage of stable APIs often? That is not my experience. That's the whole point of semantic versioning: you should be able to follow a maintenance release line (or even a minor release line) without any incompatibilities. I can't remember a security issue or other critical fix, certainly not many, where I've had to wait for a major release or break the contract.
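
As a sketch of that contract (illustrative names and version numbers, not any real library's API): within a major version, a newer minor/patch release is supposed to be a drop-in replacement, which is what lets a maintenance line deliver security fixes safely:

    #include <stdbool.h>
    #include <stdio.h>

    struct semver { int major, minor, patch; };

    /* Compatible iff the copy on disk has the same major version as
       the one the program was built against, and is at least as new. */
    static bool semver_compatible(struct semver have, struct semver need) {
        if (have.major != need.major) return false;
        if (have.minor != need.minor) return have.minor > need.minor;
        return have.patch >= need.patch;
    }

    int main(void) {
        struct semver built_against = {1, 2, 3};
        struct semver on_disk       = {1, 2, 7};  /* maintenance fix */
        printf("safe to swap in: %s\n",
               semver_compatible(on_disk, built_against) ? "yes" : "no");
        return 0;
    }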





