> Except that, in this analogy, you're a carpenter.
Alright, and if you're a junior carpenter, are you going to be tasked with chopping down the perfect trees and whittling those down into the perfect legs and tabletops from which you will assemble your perfect end tables? Probably not; you're almost certainly gonna start off by, at most, going to your local hardware store, buying whatever shitty wood they've got, and putting that together into something that might look god-awful but at least functions approximately like an end table. Only after you've gotten good at building end tables with off-the-shelf wood are you likely going to have the requisite understanding of the limitations of said off-the-shelf wood to have any interest (let alone understanding) in going with custom wood.
Similar deal with software. If you're a junior programmer, yeah, you might be encouraged to develop your skills by developing libraries, if you're lucky, but in most shops you're probably gonna be tasked with, you know, actually delivering software, and that software will probably need to use existing libraries for expediency's sake. It's exactly why a hefty chunk of web development nowadays involves full-blown frameworks like Rails or Django (or even lighter-weight alternatives like Sinatra and Flask, respectively) instead of people hand-crafting HTTP request parsers / response generators and hooking 'em up to raw sockets.
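To make the "hand-crafting HTTP request parsers / response generators" point concrete, here's a minimal sketch of the work a framework normally does for you. All function names here are illustrative, not from any library; the Flask comparison in the comment is just a route decorator and a return value:

```python
# Hand-rolled HTTP plumbing: parse the request line and build a raw
# response by hand. This is the drudgery Flask/Sinatra hide from you.

def parse_request_line(raw: bytes):
    """Split b'GET /hello HTTP/1.1\\r\\n...' into (method, path, version)."""
    line = raw.split(b"\r\n", 1)[0].decode("ascii")
    method, path, version = line.split(" ")
    return method, path, version

def build_response(body: str, status: str = "200 OK") -> bytes:
    """Assemble a bare-bones HTTP/1.1 response with correct framing."""
    payload = body.encode("utf-8")
    headers = (
        f"HTTP/1.1 {status}\r\n"
        f"Content-Length: {len(payload)}\r\n"
        "Content-Type: text/plain; charset=utf-8\r\n"
        "\r\n"
    )
    return headers.encode("ascii") + payload

# The framework equivalent is roughly two lines:
#   @app.route("/hello")
#   def hello(): return "hi"
```

And this toy version doesn't even touch chunked encoding, keep-alive, header folding, or URL escaping, which is exactly why most shops reach for the framework first.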
"Store-bought" libraries, like store-bought wood, ain't perfect by any stretch. They still provide a reasonable starting point for getting a project done reasonably quickly instead of getting stuck in the weeds (literally) of growing your own trees.
I think we're probably extending the analogy too far, but I'll play along.
We don't craft programs out of machine code any more. We use high-level languages. So that's where we're not growing our own trees any more. I think the rest of your analogy agrees with me: even junior carpenters need to know how to build an end table by actually building them rather than buying them from Target.
It's a judgement call, I'll agree. But only allowing your junior devs to write glue code for imported dependencies means that's all they know how to do. They'll get the idea that that's all coding is - find the right dependency and write some glue code, and you're done.
I've seen this attitude so much recently, especially in startup product development, and it's actually dangerous. No one audits their dependencies; they just read the description, import it, and try to work out how to make it do the thing. If it works, who cares if it's huge, or has 34539473 other dependencies, or has been taken over by a malicious maintainer? Velocity, right? Get it into production and on to the next feature.
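The "34539473 other dependencies" problem is easy to underestimate because the explosion is transitive. A toy sketch of the arithmetic, using a made-up package graph standing in for a real lockfile (all package names here are hypothetical):

```python
# Walk a package -> direct-deps map and collect everything you actually
# pull in. Two direct imports can easily drag in many more packages.

def transitive_deps(graph, root):
    """Return the set of all packages reachable from root (excluding root)."""
    seen, stack = set(), [root]
    while stack:
        pkg = stack.pop()
        for dep in graph.get(pkg, []):
            if dep not in seen:
                seen.add(dep)
                stack.append(dep)
    return seen

graph = {
    "my-app": ["left-pad-ish", "http-client"],
    "http-client": ["url-parse", "tls-shim"],
    "url-parse": ["querystring-lite"],
}
# Two direct imports, five packages installed -- and every one of them
# is code you're shipping without having read it.
```

Real-world tools exist for exactly this (e.g. `npm audit` for Node projects), but the point stands: nobody runs them if velocity is the only metric.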
> We don't craft programs out of machine code any more. We use high-level languages.
A.k.a. we chop down existing trees instead of growing our own.
> This is going to end badly.
While I don't disagree, the alternative is more often than not for nothing to ever get shipped because the development team is busy reinventing the universe, especially when a "don't use libraries" mentality gets taken to its logical conclusions ("we need perfect custom hardware to run our perfect custom operating system written to run our perfect custom programming language/runtime to run our perfect application" - just because Google has those sorts of resources doesn't mean a lone developer trying to get a side project done has those sorts of resources).
There is, however, a reasonable middle ground: start with those dependencies, and then as you encounter their limitations start working to replace them. The vast majority of development teams don't have the resources to make "do everything in-house" the default, but they might have the resources to selectively fork or replace a critical library once they've built an initial implementation of their project on top of said library. To your point about junior devs and learning opportunities, this is a perfect way to learn: "alright, so you built this app with this library, great, now write a replacement for that library that does A, B, and C instead of X, Y, and Z".
I agree with that. I think I just draw the line further down the scale than you. Possibly that's because I'm an old fart, more used to having to write everything from scratch, and therefore less liable to consider that wasted effort.