
It cannot be done as easily with interpreted languages (Python, PHP, Ruby, Node.js), though--and those are precisely the languages from which Go has been stealing users.



Vendoring in your dependencies is definitely not hard in a dynamic language; for some reason, most projects just never bother. I doubt it's a significant reason why they've been moving to Go.

I think the reasons are more prosaic, like it's simply the language their job uses. People who use dynamic languages tend not to think of themselves as specialists in their stack.


> Vendoring in your dependencies is definitely not hard in a dynamic language; for some reason, most projects just never bother.

Not to take a side in the static-vs-dynamic-linking or language argument, but that is absolutely incorrect. A single static binary is a very significant reason lots of people are moving to Go: that, plus a good cross-compilation story, makes a lot of problems go away.
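
To make that concrete, here is a minimal sketch of the workflow (file name and program are purely illustrative; the build commands are just the standard go toolchain, not anything from a specific project):

    // main.go - a trivial program to illustrate the single-binary, cross-compile story.
    //
    // Native build:
    //   go build -o app .
    // Cross-compile for a Linux/amd64 deployment target (no cgo) from any dev machine:
    //   CGO_ENABLED=0 GOOS=linux GOARCH=amd64 go build -o app .
    //
    // The output is one self-contained file; there is no interpreter, site-packages,
    // gem path, or node_modules tree to reproduce on the deployment target.
    package main

    import (
        "fmt"
        "runtime"
    )

    func main() {
        fmt.Printf("built for %s/%s\n", runtime.GOOS, runtime.GOARCH)
    }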

Vendoring dependencies is hard in general. It's much harder in dynamic languages.

Some evidence/examples:

- Look at how many different ways Python has of packaging things.

- Ruby, which most people consider to be one of the scripting languages that got vendoring right out of the gate, still struggles with system libraries used to bootstrap the Bundler process on deployment targets.

- Node.js, another one which is considered to have gotten vendoring right out of the gate, has massive problems with its implementation: package assets in node_modules take forever to fetch and unpack, inflate deployment times and artifact sizes, and put strain on the filesystem. People argue that the difference between "my node_modules directories have so many files I ran out of inodes" and "my golang binary is really big" is just a difference of degree, but it's a big difference regardless.

- Vendoring/deploying compiled/native dependencies is a massive hassle in dynamic languages as well: better make sure that you compiled those deps in a way compatible with your target system (a big hassle if you are, say, building an old Perl C/XS extension on OSX and targeting Linux for deployment), make sure they all link correctly once there, and, if they do link, hope they link against system libraries that don't have behavior differences from wherever you tested the code. And a lot of popular libraries have a native component.

- There's also the problem of dependency resolution. Several dynamic languages have hard-coded system library paths, which means that if your vendoring misses a spot, you might load an unexpected version of something, or fail to start (a sketch of the Go-side contrast follows this list). The "just put everything in the system lib path" approach ignores the reality of multitenant/multi-use systems, and managing that is a whole 'nother piece of expertise.

- The popularity of Docker/containers is largely driven by the fact that they let you "statically link" your whole stack. That demand indicates that some folks, at least, found the vendoring story for dynamic languages difficult.
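
For contrast with the problems above, here is a minimal sketch of the Go-side vendoring workflow (the module path is hypothetical; github.com/google/uuid is just a real, well-known package picked for illustration). The point is that the build resolves third-party imports only from the checked-in vendor/ directory, so there is no system library path to miss:

    // main.go - a sketch of Go's vendoring workflow (module path is hypothetical).
    //
    //   go mod init example.com/vendordemo
    //   go get github.com/google/uuid
    //   go mod vendor            # copies every dependency into ./vendor
    //   go build -mod=vendor .   # third-party imports come only from ./vendor
    //
    // Since Go 1.14 (with go 1.14+ declared in go.mod), the presence of vendor/
    // makes -mod=vendor the default, so a plain "go build" on a fresh clone uses
    // the vendored copies and never consults a system-wide or per-user library path.
    package main

    import (
        "fmt"

        "github.com/google/uuid"
    )

    func main() {
        fmt.Println("deployment id:", uuid.New())
    }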

> People who use dynamic languages tend not to think of themselves as specialists in their stack.

This sounds suspiciously like "if you use $language you're an idiot/inferior". Spare me your arrogance and language elitism, please. There are specialists, generalists, experts, and idiots on every platform ever invented--in very, very similar proportions.




