What “full stack” really means to the job market (chrismm.com)
40 points by dasmoth on July 26, 2016 | hide | past | favorite | 53 comments



I couldn't disagree more with this entire article.

> the self-taught web developer knows surprisingly little about the web’s underlying technology.

I know very few developers who aren't self-taught. In reality the opposite seems to be true. It's the developers who take a course and don't continue learning that stall. This is something technologists (even non-developers) absolutely _need_ to do -- self-teaching. The article even contradicts itself with the very next line.

> Language-oriented courses cannot cover the complete web stack, and students will end up clueless about what an htaccess file does, or how to restart a Unix daemon, or how the different types of POST encoding work.
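For the record, the "different types of POST encoding" point is concrete enough to sketch. A form body is usually sent either urlencoded or as multipart/form-data; a rough illustration using only the standard library (field names and the boundary string are made up for the example):

```python
# Sketch of the two common POST body encodings the article alludes to.
from urllib.parse import urlencode

fields = {"user": "alice", "note": "hello world"}

# 1) application/x-www-form-urlencoded: key=value pairs joined by '&',
#    with spaces and special characters percent/plus-encoded.
urlencoded_body = urlencode(fields)

# 2) multipart/form-data: each field gets its own boundary-delimited part;
#    this is the encoding required when a form uploads files.
boundary = "xYzBoundary123"
parts = []
for name, value in fields.items():
    parts.append(
        f"--{boundary}\r\n"
        f'Content-Disposition: form-data; name="{name}"\r\n\r\n'
        f"{value}\r\n"
    )
multipart_body = "".join(parts) + f"--{boundary}--\r\n"

print(urlencoded_body)
print(multipart_body)
```

In a real request the `Content-Type` header tells the server which of the two the body uses (for multipart it also carries the boundary string).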

Then, there's this...

> There was a time when looking things up on Stack Overflow whenever you had a problem just wasn’t an option, and many pieces of software had unreadable documentation, if they had any at all ... That is the environment where hackers thrived, and that’s what we are going back to, sooner or later.

It's almost as if he's nostalgic for the way things used to be. Like things were better when the documentation was terrible and you couldn't look things up. To all of that, I say good riddance.

> If every time you find yourself dealing with a complex issue that affects multiple technologies your first instinct is to search on Google, you should reconsider your working habits.

I'd actually argue the opposite. If you ever find yourself with a complex issue that affects multiple technologies, your first instinct _should_ always be to google it. Sure, you should understand the problem and all, but there's no need to make your life difficult just for the sake of "being a hacker".

> I have met many programmers that don’t like to code in their spare time, and that has reliably revealed them to be sub-par developers.

Disagree 100%. Some of the best developers I've met understand very well how to separate their work. The stereotypical hackers who work all through the night on obscure problems tend to pigeonhole themselves and lose sight of the purpose of programming. They also tend to be the ones that burn out to epic proportions. It's a very unhealthy mindset to have, and to encourage in others.


> Disagree 100%. Some of the best developers I've met understand very well how to separate their work. The stereotypical hackers who work all through the night on obscure problems tend to pigeonhole themselves and lose sight of the purpose of programming. They also tend to be the ones that burn out to epic proportions. It's a very unhealthy mindset to have, and to encourage in others.

Yes.

I'd add: anyone smart enough to be considered an expert in any subject will eventually find other subjects to explore. I'd argue this helps them become a better developer, more than coding in their spare time ever would.


And conflating full stack with just front end and back end doesn't help. To me, full stack means layers 1-7.


"Full-stack" was never a reference to the OSI model.

You can spend your whole career deep in the layer 7 weeds nowadays. A legit "full-stack" technologist will dive into layers 6 and 5 occasionally.

Below that you're talking about TCP/IP, BGP, Ethernet, and fiber. The beauty of the model is that each layer doesn't have to care about the implementation details of other layers, and that's reflected in hiring as well.

Nomenclature aside, some of us really do span layers 1-7. But layer 1 is pretty boring, and there's not a lot of value to add in layers 2-4 honestly, except in extraordinarily demanding environments.

Even the people who think they work at layers 3 and 4 are usually just using layer 7 tools to manipulate configurations for layers 3 and 4.


It's how I perceive it - if you can't design and implement a full 7-layer stack for simple applications (i.e. CCNA level) you should GTFO

"The beauty of the model is that each layer doesn't have to care about the implementation details of other layers"

Err, having worked on international OSI interconnects, I'd take that statement with a bucket of salt :-)


The 'full-stack' trend is a reflection of rising, dare-I-say unrealistic expectations, one which the author supports by their recommendations in their blog post. By perpetuating the notion that the only 'true way' to be a good developer is to structure their lifestyle around understanding implementation details behind all the layers of a modern tech stack, they place an unnatural reverence on the mythos of hackerdom while ignoring that software development is not solely a creative pursuit.

As it stands now, 'full-stack developer' is a euphemism, which in hip new places means 'we want you to live and breathe code, because you will be given vague requirements and expected to deliver the entirety of the solution from the bits moving across the wire to the UI espousing the latest visual design language in less than a month', and in established places means 'we want an infusion of new blood to bring sanity to some legacy code and we're counting on you to debug and fix everything by yourself'.


A lot of it is bullshit.

One of the most common recurring posts on HN is "it is impossible to estimate software projects"

Well yes, if you refuse to use a stable stack you will need 300 hours to chase your tail when you could have solved your client's problem in 30.

Writing the average bizapp as an SPA would be like your doctor cutting off your leg because your knee hurts. In any other field we'd call it malpractice, but it is called being one of the cool kids in computing.


> One of the most common recurring posts on HN is "it is impossible to estimate software projects"

It's not so much that it is impossible to estimate software projects as it is that the analysis necessary to estimate software projects with reasonable accuracy involves doing a substantial portion of the work required to produce the solution -- which means that accurate estimation that is useful for decision-making is elusive, not that estimation is impossible.


I heartily agree. This view is actually supported mathematically. See "Large Limits to Software Estimation" (http://scribblethink.org/Work/Softestim/kcsest.pdf) by J.P. Lewis. Supplementary materials: http://scribblethink.org/Work/Softestim/softestim.html


Is writing "the average bizapp" as an SPA a bad trade off? Depends on the circumstances and the people involved, but it's certainly not clear to me that this is going to take more time, or entail more risk, than writing an "old school webapp".

Doctors are expected to stay up-to-date too, and at least keep an open mind about new treatments and procedures.


Medicine's been a work in progress for centuries, doctors have done and believed all kinds of stuff we can call crazy today to figure out what works.

Your sore knee analogy... probably really did end in wholesale unnecessary amputation and death until we shot and hacked up enough people during wars to learn more accurate ways to treat a sore leg. We learned a lot from crude, violent and deadly experiments by Nazi doctors doing stuff that would be intuitively recognized as illegal in most countries today.

Programming is also in a state of evolution. Maybe one day we'll look back at SPAs and everyone will agree they were as crap as java apps in browsers. We used to think asbestos, lead, oil, cigarettes, opium etc were great solutions too.


> Your sore knee analogy... probably really did end in wholesale unnecessary amputation and death

Really? It doesn't seem likely to me that even a primitive medical practitioner would amputate a leg because someone's knee hurt.


We know amputations date back to stone age times - https://saesferd.wordpress.com/2010/01/26/prehistoric-ampute...

http://health.howstuffworks.com/medicine/modern-treatments/a....

Lots of medical history (and all history, really) will surprise you, right into the 20th century and beyond -

http://www.cracked.com/article_15669_the-10-most-insane-medi...

#10 morphine cough syrup for kids

#9 mercury for ... everything

#8 heroin for coughs

#7 electric shocks to cure erectile dysfunction

#6 lobotomies for mental illnesses

#5 urine therapy ... lives on

#4 bloodletting for ... everything

#3 tapeworms for weight loss

#2 holes in skulls for headaches

#1 vaginal massages for female hysteria


I proudly call myself a 1/2 stack developer. I didn't put it on the resume, but I used to tell that to people who were interviewing me; it makes for a good joke to break the ice.

So I know one half, but I know it well. Yes, I've heard of front end frameworks, some design ideas, some CSS, but I might as well say I haven't. Well, so far it has worked pretty well for me.

I also heavily interviewed for my company before, and would usually be pretty leery of anyone who claimed to be "full stack". Yes, some candidates were really good and knew both sides, but most just knew superficial stuff on both ends (which is ok too, just not for what we were hiring).


>I have met many programmers that don’t like to code in their spare time, and that has reliably revealed them to be sub-par developers.

Whatever little credibility you had up to this point you've just lost.


I quit reading here. This idea that programmers have to work 18-hour days just to avoid becoming substandard is idiotic.

ps. if you have to spend all your free time studying to avoid losing your job, you never actually had any free time to begin with.


> This idea that programmers have to work 18-hour days just to avoid becoming substandard is idiotic.

Is it stupid because it isn't true or is it stupid because you don't want it to be true?


It isn't true. If you use your time wisely then you will be just fine. If your employer doesn't give you the time/budget for on-going education, then you need to look for a new employer.


It's stupid because it isn't true.

The best and most productive programmers I know have lots of other hobbies and interests, both cerebral and physical. They are intellectually gifted people.

If you need to try that hard, maybe you're just not that smart. I learned that early, because I'm not that smart, and working 18 hours a day sure as hell didn't fix that.


>if you need to try that hard

Do people who program in their spare time usually work 18 hour days?


What does programming in my spare time have to do with studying for my job? I think the idea that the article was trying to convey was that the best programmers love the work enough that they'd be doing it on their own, even if they weren't paid for it.

I work an 8 hour day. I may spend another several hours coding on some days, but I would characterize that as play, not work. I try to do things that I don't think I could convince anyone to pay me for.


Not exactly, because someone who explores other technologies unrelated to his/her "work" field will inevitably pick up experience and knowledge - which the employer can grab for free without sending the employee to expensive courses or having to hire highly specialized freelancers.

I have done a lot of weird things as "hobby" projects with even weirder technologies, including interfacing with smartcards in PHP (don't ask), working with cheap-ass Chinese thermal printers, electronic chips, devkits, FPGAs, operating a mixed OpenLDAP/AD stack, ... and I have very often profited at my job from the knowledge I gained while developing them.

Oh, and someone who has never heard of Apache rewrite rules shouldn't call himself a full-stack developer. Ops (and being able to formulate ops requirements to a hoster!) is part of the stack too, not just deploying some nodejs stuff to a container in the cloud.
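The htaccess knowledge being invoked here is small but representative. A minimal sketch of a common rewrite block (the filenames and query parameter are hypothetical, but the pattern is the standard "front controller" idiom):

```apache
# Route all requests that don't match a real file or directory
# to a single entry-point script (hypothetical example).
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.*)$ index.php?path=$1 [QSA,L]
```

The `QSA` flag appends the original query string, and `L` stops further rewriting for the request.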


> Not exactly, because someone who explores other technologies unrelated to his/her "work" field will inevitably pick up experience and knowledge - which the employer can grab for free without sending the employee to expensive courses or having to hire highly specialized freelancers.

I don't think there is any debate that, in general, someone who tinkers, explores, and codes outside of work will be relatively better than someone who doesn't. That is not the same as saying the former is above some absolute line while the latter is below it. This is the characterization that people here, including me, take issue with. You can easily be competent and effective while not spending time outside of work writing code.


>I don't think there is any debate that, in general, someone who tinkers, explores, and codes outside of work will be relatively better than someone who doesn't.

I've found that the truth of this depends almost entirely on the type of coding one is doing at work.

If you're doing challenging work all day and then do some challenging work after hours, you'll likely be better than most based on the sheer amount of practice.

If your day job is mostly a joke and you are maintaining systems that you don't really understand, coding all night won't make you better than someone who is doing challenging work all day and logging off entirely at closing time.


Hence why I said "in general", though after considering what you wrote here there may be more exceptions than I thought.


You're not wrong. But some people don't like to stare at a computer and program for 8 hours a day and then go home and keep doing it. They may have other responsibilities or other hobbies, or just a highly active social life.

There was a time when I was younger where I spent a lot of time outside of work programming, but I also didn't have any other hobbies (well, playing video games doesn't really count), no responsibilities, and almost no social life. Now that I'm older and have all three, programming at home is a luxury I often choose not to do. But that doesn't make me a bad programmer.


One could just as easily argue that you're not a "bad programmer" specifically because you used to spend so much of your free time programming.


But the point is that it totally depends on the person and it's not this hard and fast rule.

It's also vague to the point of meaninglessness considering all the various kinds of technologies and depths within those technologies one might be toying with in their spare time.

I would argue that the general intuition that many programmers have about people who code in their spare time is simply a too-narrow view of the picture. It's more about people who are really good at LEARNING, and people who code in their spare time tend to become very good at self-teaching new things and figuring out how different systems work and interact.

A developer can be good at learning as long as he or she is practicing learning something in their spare time, but I don't think it has to be a side project (or even technology-related at all) to see a benefit in professional productivity.


> A developer can be good at learning as long as he or she is practicing learning something in their spare time, but I don't think it has to be a side project (or even technology-related at all) to see a benefit in professional productivity.

This is an excellent point. I have lots of side projects outside of work. At the moment, none of them involve code but they all involve learning things; some of them are even fairly technical.


People used to have "hobbies". Now it needs to be a side project.


This opinion needs to fucking die in a fire already.


Is it really surprising that programmers who practise an additional 10 to 15 hours a week improving their skills, outside the scope of whatever project or tech stack they are limited to at work, were better in this person's experience?

It would be pretty incredible if spending a lot of free time on challenging personal projects made someone a worse developer or had no impact at all.


Not denying your points.

It's not the only way to be a good developer though, that's the idea I hate. It's about teaching yourself to be a good learner, which can be accomplished through a variety of hobbies that don't even need to be tech related. If programming at home is your jam, then more power to you. Saying that any developer who doesn't code on the side is, categorically, "sub-par" is ludicrous though.


The ability to automate is powerful. Anyone who doesn't find a use for it in their private life isn't just a mediocre developer; they're a mediocre human.


I wouldn't go so far as to call people who don't see the universal relevance of computation "mediocre humans," but they're certainly missing out on a lot. Computers generalize cognition and memory at scales far beyond what humans are capable of. It's ludicrous to only ever write code to shuffle around cyberwidgets.


I'm blunt because I feel strongly about it. They could be so much greater, but they willingly choose not to be. Then they lie to themselves about being just as good as anyone else despite being denied jobs.


Sarah has a job as a web developer. She doesn't like programming very much, and avoids it at home. Her real passion is acting, so she joined a local theater troupe that performs plays at the community center. Every year they do a little Rocky Horror Picture Show event at the local theater.

Frank is a programmer at a big bank. He does most of his work in COBOL. The work is tedious and the hours are long, but Frank doesn't mind. Work is supposed to be hard. Besides, he makes plenty of money, enough money to help his only son meet his tuition. Frank never really knew what he wanted to do with his life. He just sort of ended up where he is, but his wife and son are happy, and that makes Frank happy.

Both of these people are mediocre developers and mediocre humans. They waste forty hours every week developing skills that they won't use at home and only want so they can continue working at jobs they don't like. Turns out most people are mediocre. Those two aren't upset with what I posted, because they don't care, because they don't visit Hacker News, because they have lives.

Let's talk about Jeff. Jeff is a sixteen-year-old atheist who loves science. He gets high grades in all his math and science classes. He can even handle English pretty well. He just does the readings in class! He loves talking about how great science is. When he's not talking about science or playing League of Legends he's browsing the internet looking at things like the "I Fucking Love Science" Facebook page or r/atheism. Jeff is a happy, intelligent, scientific young man. He also doesn't do any science. Turns out looking at inspiring quotes on a space background isn't science.[0] Jeff's a pretty useless kid, but he's sixteen, so we'll give him a break.

Mark is a student finishing his undergraduate degree in Computer Science, or maybe he's a man who's worked five years doing front-end development; doesn't matter. Mark is a programmer. He visits Hacker News to get the latest news, spends at least a couple hours a week on Reddit, keeps a small blog on Medium, hangs out with plenty of friends on the weekend, and enjoys a good beer. He doesn't use any of the stuff he's learned at school or work in his personal life. He doesn't host his resume on a personal website. He doesn't use regular expressions at home. He doesn't build a program to split games up into binary chunks so his kids can move games between computers without having to waste one of his expensive CDs.[1] He doesn't write any kind of automated backup scripts. He doesn't use anything like LaTeX or org-mode for writing professional documents. He doesn't do anything like programming or related to programming in his spare time because it's boring and hard. Mark doesn't like it, but Mark doesn't care. He's making good money and there wasn't anything else he really wanted to do, but unlike old man Frank, Mark doesn't have an identity. Frank's family is what guides him. Frank thinks of himself as a family man and a provider. Mark thinks of himself as a programmer. Mark spends all his time on Hacker News, Reddit, and Medium reading and writing about software, but never ever coding for fun.

When I talk about mediocre humans, I'm talking about the Marks of the world. If you're wasting your time reading my spiteful comments and naive Medium blog posts, then you identify as a software guy, and if you're a software guy you should be honing your craft instead of whining about unfair expectations from employers. Don't think for a second you're just as good as everyone else. You're not.

[0] http://thebestpageintheuniverse.net/c.cgi?u=youre_not_a_nerd

[1] This is an example program from the book Programming Python


Just had to get all that judgement off your chest, huh?


Good luck with any future interviews.


Most full stack developers have breadth but not depth. It's incredibly time-consuming to have a complete understanding of the entire stack, and no matter what you do, even if you live and breathe code and do it all the time, you will not be a complete engineer; there is just too much, and you will never retain it all. Rather than try to be this ninja superstar that start-ups all look for these days, it's more realistic to be able to work full-stack but specialize in a particular field. Machine learning and a solid academic background in stats, math and an understanding of concurrency will get you just as far as a guy who knows React, Sass, HTML5, Python, Ruby, Go, MongoDB, MySQL, AWS, Redis, Celery, Kafka and Hadoop.


I recently moved from a full-stack position into a more focused one. I felt like a "jack of all trades, master of none", which really irked me. The time I was given for training was good, but it had to be split among all the different layers of the stack, which meant I never really got an opportunity to go deep into any particular layer.


> Most full stack developers have breadth but not depth.

I have to disagree with this; it's a generalisation at best. In my experience the "full stack" developers have had a much deeper knowledge of a subject than the "specialists". This is simply because the layers in the stack are not as discrete as people would like you to think, and the full stack developer has a much better understanding of why and how something works.


Context matters. Small startups will usually benefit a lot more from an engineer who can be flexible and create a lot of immediate value as opposed to an ML expert, at the outset. Large corporations are a different story of course.


The article never answers the question in its title.

The basic job of management is to organize the division of labor. In practice, "full stack" seems to mean "we don't know how to divide up the problem, so we want people who can cover for this management failure".


Also - "full stack" can also mean "we don't want to hire another developer, we want one person to (poorly) do the job of three other coders"


>>> Whenever you have to google some error message or problem, read all the answers. Get as much context as possible on your problem, and do not be satisfied just with having come across a solution.

This is a fatal flaw.

Considering a TON of JS frameworks are so new, there just isn't a whole body of data on issues people are experiencing.

I don't know how many times, when React started getting traction, I'd google an issue I was having and there was one Stack Overflow question about it, with one answer, and neither had been upvoted. No tutorials available, and posting something on the Google group was about as effective as lighting my hair on fire.


And then there's the converse problem with older frameworks like Rails: you google a question and you find three different answers, and now you have to figure out which of them work in your context (because your context can't be summarized by which version of Rails you have).

If you're googling something about Linux it's even common to find that none of the answers fits your context.

The more "full-stack" I get, the more I just hate all stacks. Everything turns to shit if you give it long enough.


"Full Stack" == VHDL / BPMN / ITIL / PDDL / .this !!! ;-) (and of course, IGMP)


oops sorry I forgot to mention sharding!


I don't agree with the author.

But full stack became a thing in job ads for a reason. And in my opinion the reason isn't that full-stackers are better. It is the dissonance between being able to see what a candidate needs to become your employee and being able to reach those candidates (marketing skills, interview skills, communication skills, finance skills). Using the word "fullstack" in job ads just tries to minimize this skill gap.


There is an astounding level of arrogance and developer elitism permeating the whole post. I think lojack's top-upvoted comment captured most of my grievances.

The only thing I would add is that as high quality (and progressively cheaper or even open-sourced) layers of abstraction continue to be developed, it will favor the pragmatic problem solver who seeks breadth in knowledge over the pedantic hacker who DFSes into esoteric issues that are often nonessential to creating core value. And that's because the main ingredient in successfully writing software that creates real value is finding the simplest solution that works well enough. Very often that means biting the bland bullet and setting aside your burning desire to do something complicated.


> "Language-oriented courses cannot cover the complete web stack..."

On Coursera, every course I've taken has used the programming language as a means to an end, not an end in and of itself. Perhaps the author should Google online courses more extensively.


> I have met many programmers that don’t like to code in their spare time, and that has reliably revealed them to be sub-par developers.

I dunno. When I spend 8 hours a day coding (paid), spending another 2-4 working on my own stuff isn't all that appealing. I mean, I want to do other things with my life than just write code. If that makes me a shoddy developer so be it.

When I'm not busy with paid work I do explore new technologies and techniques. Perhaps this is what the author is referring to? However I still consider this to be "work", as it's a way to shore up my skills. It's unpaid, but it makes my skillset and my time more valuable.





