The other day my wife and I were at a baby clothing store. We were chatting with the young woman at the counter, and she was complaining about her student loans. I asked her what her major was, and she said it was physics. I said "oh" with a degree of surprise that made me cringe. I mentioned my brother had majored in physics. She asked me where he went to school, and I told her he went to Yale. She said "oh, my parents went to Yale, but I went to MIT."
I felt bad about the encounter, and I couldn't figure out why. Statistically, you're safe in assuming that any random person working at a clothing store isn't an MIT grad making pocket money while working on her PhD. I realized later that I felt bad that it mattered to me. That when she revealed she was smart and educated, in my head I moved her from one class of people into another.
I have a hard time reconciling the idea that we need more STEM graduates with the difficulty that actual STEM graduates have finding work in their fields.
Even if it's part time work while pursuing a PhD, if there was anything resembling a real shortage, these things wouldn't happen. And it's not a rare story.
The problem is one of miscommunication between tech and media/government.
There is definitely a shortage of developers, a shortage which tech companies have regularly been complaining about.
Unfortunately, politicians and journalists just hear "we need more nerds" and assume there is a STEM shortage (in their non-technical minds, all nerds are equivalent). Thus the broad and pointless push for STEM graduates, when what's really needed is a specific subsection of tech.
I wouldn't even say we have a shortage of "good developers." As commenters pointed out a few days ago on the "how to make it in the tech mecca" post, there is a shortage of "ideal" developers willing to work for less, and a strong employer willingness to pass on 100 or 1,000 "ok" or "good" candidates they could train in favor of that "ideal" candidate.
So I would argue that it's a problem with an asymmetrical job marketplace and poor long term decision making on the part of employers.
Training doesn't always turn an "ok" or "good" candidate into a great one. The requirement is for developers. Either junior developers or senior developers. Hiring a trainee actually costs you developer time in the short run, for no guaranteed output in the long run as many of the trainees will just wash out.
Besides, all you need is some books and open source software and you can train yourself, at least to the level that you're hirable. OK, some things you have to learn from experience in ways that are hard to accomplish outside of working for a big company (large-scale distributed systems), but most of the companies that have large-scale distributed systems are willing to hire people straight out of college and, effectively, train them!
So, the rational thing to do is to let the "ok" developers develop their own skills and then hire the ones that end up being good. But then, I don't think the standard is unreasonably high in the first place.
You're correct in that most "good" developers can self-train on whatever stack the employer uses. The problem is that many employers are not even willing to buy one book and wait two weeks. If the candidate cannot be instantly profitable from day 0, they are deemed to be "not qualified" for the position.
There are too many tech fads and trends for a developer to self-train on everything a potential employer could possibly want before knowing exactly what that is.
In my opinion, tech employers should stop being so specific with their requirements and simply allocate some time for the new people to adjust to their in-house way of doing things. As many of us know well, every development team has its own slightly different way of doing things, which has to be learned in order to work more effectively. Nobody "hits the ground running", because the tech skills are hardly ever the limiting factor in hitting full productivity.
And I assure you that any company that wants to hire people to "hit the ground running" will forget to mention their new hires will be doing that running as a steeplechase over mismanagement hurdles.
Training doesn't improve the quality of a candidate. It ensures that the candidates you can find at your chosen price level are familiar with the specific technologies and processes your company uses.
Then there's a pretty big disconnect that I've seen. The bigger companies that try to hire all the best programmers--Google, Amazon, Microsoft--don't actually care much about tech stacks unless they're hiring for a senior specialist in some specific technology, and even then it'll have more to do with a problem domain than a programming language. The employers who care about tech stacks tend to be either mediocre large companies (who by definition aren't very clueful) or small companies and startups (who really do need you to hit the ground running because they really do have limited resources).
I think it comes down to whether the person directing the hiring process knows anything about programming or not. At large, development-heavy companies, the person hiring may actually have been a 100% software developer at some point, before taking on management responsibilities. Then there is a big doughnut hole, where the company is large enough to need developers but too small to want to promote any of them, and that is a wide wasteland of frustration before you get down to small companies and start-ups whose business is so strongly focused on tech that they can afford to have a full-time developer, but not so much that they can afford a bad one. Those tend to be split between companies that hire for competence and those that want to hire someone they can use up fast and burn out. Even if the latter are fewer in number, they churn through people and so hire more often, which gives them outsized influence in the hiring market.
The mix is very heavily influenced by geographical locale. Around me, the market is mostly the large-but-not-huge companies that hire a lot of software developers but never promote them. So there's my bias. Google, Amazon, Microsoft, et al. do not have a presence here, so they cannot improve the behavior of other companies with their competitive pressure.
Very few people want to train developers. Essentially, that is the problem in the first place. Employers want educators to provide job training, when that's not what is supposed to happen in higher education.
In the 1990s (and probably earlier), a lot of for-profit educational institutions, in collaboration with a number of large tech companies, produced a lot of "certifications" which tried to fill this gap. It failed pretty miserably for everyone but the for-profit educational institutions.
What's left? Most of the employers want to hire at a lower pay level than most of the qualified applicants are willing to take, especially since regular pay increases have dried up in many markets.
I've been on both sides of this, though. I know it's difficult to hire people even when you have plenty of good applicants, because, even when you're willing to train people on the specifics of the position you're hiring them to fill, truly qualified applicants may not work out after 6 months.
In a perfect world (sometimes called "once upon a time"), companies take all of this into account and consider it a part of doing business. In the modern world, companies have found that they can get more work out of their existing employees and out of new-hires by behaving as if all training can be done on-the-job. You might give the new-hires a little less work to compensate for the extra load of learning the job, but the rest of the team has to make sure the job is still getting done AND train the new-hire.
Honestly, the better hiring processes I've seen focus on basic algorithms and data structures as the foundation of judging a candidate's technical ability. That is the stuff you learn in CS. That's also the foundational kind of stuff where, if you don't know it, it doesn't really help if you learn a specific tech stack.
I've also seen C++ shops that quiz you on C++ trivia. Either they were doing a poor job, or they were very small and worked only in C++, so it was reasonable for them to ask questions you'd know the answers to if you'd just read a book before the interview.
Your post seems to have shifted its goalposts from good to great and back again a couple of times. I would not say that someone becomes great at what they do given only books and practice at home.
I dunno, it feels to me that there are lots of okay developers, but few good ones. Given the nature of the tools available, I am incurring technical debt at a significant rate unless I hire a good developer.
Even this isn't quite true - I'd qualify 'developers' with 'good'. There are a lot of average/bad developers out there and companies don't really want to hire them...but many don't have a huge pool of candidates like Facebook and Google.
I see this miscommunication extend into the schools as well. Both my daughter's schools (elementary and middle) have STEM-specific programs, but they are entirely focused on the science or math aspects (where job prospects are not easy to come by), with virtually no emphasis on technology or engineering (where job prospects are more available).
When I asked my daughter's teacher, who is one of the STEM chairs, how they would better incorporate technology, she said she hadn't thought about it. We've created this acronym when really we need to be bumping up the T and E; I feel the S and M already available in primary education are sufficient.
I feel a lot of the STEM stuff in K-12 boils down to hypothesis testing. Do X a hundred times, plug and chug into this formula, and presto! the outcome is successful with confidence Y%. The variation is just on what X is.
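To make that "plug and chug" concrete, here's a toy sketch in Python of the kind of exercise I mean (the coin-flip numbers are invented for illustration): run a trial a hundred times, plug the counts into a canned formula, and declare the result significant at some confidence level.

```python
import math

# "Do X a hundred times": suppose 62 of 100 coin flips came up heads.
n, heads = 100, 62

# "Plug and chug into this formula": a one-sample z-test against p = 0.5.
p_hat = heads / n
se = math.sqrt(0.5 * 0.5 / n)   # standard error under the null hypothesis
z = (p_hat - 0.5) / se          # test statistic

# "Presto": significant at the 95% level if |z| > 1.96.
significant = abs(z) > 1.96
print(f"p_hat = {p_hat:.2f}, z = {z:.2f}, significant = {significant}")
```

The variation between assignments really is just what X is; the formula and the confidence threshold stay the same.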
A professor I worked with absolutely hated the acronym STEM. The fields behind each letter are very different from one another, and lumping them together brings an oversimplification of education and the issues around it. It does make for a handy political acronym, though.
STEM as a term might have made sense if it had only been used to refer to middle and high school teaching. Getting "more nerds" pre-career-path would mean more people would at least be prepared to pursue whatever subsection of tech is in demand by the time they reach college.
We don't need more STEM graduates; we need economic incentives for STEM graduates, including companies actually willing to take on and train green grads (everything requires a few years of very specific experience these days), willing to pay them good wages for part-time work, willing to pay them good wages at all... etc.
The problem is managers aren't scientists, few companies care about pure research and long term benefits coming from it, and nobody is willing to spend effort and resources training inexperienced average graduates any more.
It is also, in many ways, too easy to get a STEM degree by learning how to be a good University student as opposed to learning your subject well.
The core problem is university needs to be something special and uncommon, not something necessary. Most jobs have little to do with the University training that comes with them and there's not the budget for 1/3 of the population to do science.
Nowadays the economic incentives are the other way around. It used to be that pharmaceutical companies did everything in-house, from early discovery to clinical testing, and Central Research did discovery and lead optimization. That system provided stability for the employees.
Compare that to the pharmco carousel that you can see in action today in Boston and San Diego. The major players are IP brokers that buy startups with promising compounds (good if you are a lawyer or in marketing), but the science is done by university startups which are either bought out or fail, and the number of very capable people on postdocs (because they are between startups) is plain staggering. Try putting your kids through college doing that.
Is anyone surprised that I'm very suspicious of anyone preaching the startup gospel? They are inevitably venture capitalists or marketeers, or fresh out of college and still very green.
Add to that companies like Valeant whose business model is to buy companies that already have drugs on the market and fire most of the research staff. This is predicated on the belief that in-house R&D is nearly always a long term money loser. Better to crowdsource R&D and only pay for the winners.
"the number of very capable people on postdocs ... is plain staggering"
What's wrong with doing a postdoc? I'm in the UK, but here a postdoc is a normal part of the science career path. It pays reasonably well for academia and is a first step towards being a lecturer (assistant professor).
We are not talking about the academic career ladder, we are talking about the career path of med. chemists in general. To an increasing level, early discovery happens at academic startups, and when the company fails (because their two or three compounds fell through) all the scientists are out of work. The fastest new gig available is a postdoc, and the salary is a real problem when you try to put kids through college on that.
This model takes the "scientists are cogs in a machine" paradigm to a new level. Of course med. chemists are cogs in machines, lead discovery is nothing but finding ligands for targets (perhaps it's also finding the proper target), and your scientific training hopefully equipped you for that. If one project fails, there will be others, and it's management that chooses those that will go forward. But it used to be that you didn't change your employer with your project, Big Pharmco always had plenty of projects going. Nowadays it's the employee that bears all the risks, and that includes the risk of poor management. It's no longer cogs in machines, it's cogs with legs.
Have you heard of the postdoctoral treadmill? That's what happens in the States. The postdoc treadmill is another source of cheap labor for labs, with no stability and no job prospects.
> there's not the budget for 1/3 of the population to do science
Then what should they do? There is no industrial demand. We could theoretically replace almost the entire service industry overnight with autonomous vehicles and online ordering.
It sounds like a record on repeat nowadays, but labor is dying. It takes one person to feed a thousand, and if you average all the labor required to supply one person's needs, it comes out to far less than one whole person. That is, the sum of the plumbing, electrical, power production, road maintenance, car construction, furniture building, home building, medical, agricultural, etc. production necessary per person is less than a whole person, and thus you have an excess of people with no real value-adding labor to do.
Science is really the only thing you can propose in a labor vacuum: well, we don't need welders or farmers or auto mechanics or factory workers or people digging holes, so why not go figure out the next big thing? Too bad that, one, not everyone can do that, and two, research has to be paid for by someone. Considering the gross wealth concentration in the US, it's either philanthropy or the government paying for it, and as the actual demand for science shows, nobody is.
We need new industries to emerge, and new industries that hire lots of scientists tend to come from big government spending initiatives: the ramp-up in research during WWII, the moon race, and so on. Plenty of industries came out of those.
We need to push further into space, attack cancer, deal with antibiotic resistance, dig into groundbreaking physics, etc. R&D is only 3% of the US government budget.
Isn't it funny how we chastise companies for treating workers like replaceable cogs, and then we do the same in these discussions? If I need an expert in neurobiology, a glut of physics majors is unlikely to be helpful.
1. It says they don't work in STEM fields, not just their fields. My guess would be that the percentage would be much higher for people who don't work in their field, but in many cases it's still just a statistic.
2. There is no path for job growth in many STEM fields except out of STEM and into management. Most people staying in a STEM job for 20+ years find themselves under pressure to move to management, in part because it is somehow easier for companies to pay the salary in middle management than in STEM for someone with a lot of time in the field.
Still, some people just change their minds, or actually want to work in business or some field considered non-STEM, but do their undergraduate work in STEM. Additionally, many STEM fields require a level of specialization that usually requires an advanced degree or long-term work in the field to acquire that specialized knowledge. Many people get discouraged by this when they see their student debt and their lack of employment options after getting a Bachelor's degree, so, if they continue in higher education at all, they look at getting a degree in something else (like an MBA).
There is no fixed number of STEM jobs... and it's probably true that the dearth of job candidates actually reduces the number in the long run. If you can't find the workers you need, in the long term you shift strategy to that which doesn't require them.
After all, the United States isn't the only place to find them.
I thought the argument for increasing the number of H-1B visas is that we need more technology people? The current limit is 65,000 people every year, and the claim is that's too low.
There is a shortage... Of tech workers willing to indenture themselves, or to work for a fraction of what they're worth. But the real elephant in the corner of the room is ageism.
I am 27 (well, I will be in a few hours; right now I'm 26).
I have been jobless for a while, and in my country, industry research showed that companies are hiring fewer and fewer people in their 20s and more people who are 50+.
Every position I see with reasonable pay (i.e., pay covering more than my rent + food) requires 20+ years of experience. Since I am 27, I cannot have 20 years of experience (I learned to code at 6, but I didn't get my first job at 7 :P).
This is likely because you've been receiving the message that it's a sin to have preconceived notions and to even formulate a mental model of someone that contains any negative elements.
It's not. Instead, I suggest taking pride in the fact that you are not restricted to initial impressions and were able to move her to a different box upon receiving additional data points.
I don't want to reveal her identity, but there was a woman in a group I'm familiar with who did a BS in physics and a PhD in astronomy/astrophysics at Yale. She works as a data scientist at one of the valley companies, but what she does most of the time is write SQL queries against Hive, a SQL interface to Hadoop/big data.
If she is indeed an MIT grad, there are many companies that would want to hire her just to write SQL queries.
I am actually in need of an underemployed physicist. I'm writing a book about quantum mechanics and I need a consultant to make sure I get the math right. I was going to put an ad on Craigslist, but maybe someone here has a lead?
I don't know if he does this sort of thing, but I recommend Jonathan Walgate. He has a Ph.D. from Oxford in quantum physics and a business helping people to write grants. He's also a great guy! (I used to do quantum physics.)
That doesn't seem odd to me. If there's a young person working a retail or service job, I usually assume they're doing it for extra cash while they finish their degree. 9 out of 10 times it's true and completely predictable, as these sorts of jobs provide far more of the necessary flexibility than traditional "professional" jobs.
On the other hand, if they're 30+ my preconception is completely different.
If she's smart enough to get a physics degree from MIT, she can probably teach herself enough to become a developer and get a big salary boost. Heck, with the way MIT STEM programs work, she probably already can program and pass most technical interview questions.
(She also might not have been entirely truthful with you.)
Couldn't it be because her skills are obviously being underutilized? Society could benefit from her a lot more, and she could command more resources at the same time, but for whatever reason that's not happening. And that is rationally upsetting.
It was upsetting to me because I don't think we should value and respect people who are MIT grads more than anyone else. Someone who is working retail because that's all they can do is just as valuable as someone who does it while working on something better. I was upset because I realized how classist I was being.
That's over $160,000 in college tuition. If it doesn't pay for itself, it's not a very effective use of capital, regardless of whether her parents paid for it.