We are overpaid for incredible working conditions, and devs have basically become capricious divas, despite the fact that 90% of them are plumbers, and many not very good ones at that.
If professionals in any other field did the same, wasting as many resources as we do, changing part of the tech stack every month, debating vocabulary on Twitter ad nauseam instead of coding, and whining that their first-world problems should be the focus right now rather than doing their job, they would get laughed at.
But we were incredibly lucky: IT is the most amazing productivity cheat code humanity has come up with so far, so all this BS was accepted as the cost of doing business.
Well, here is the wake-up call.
No, we are not paid to rate the best cappuccino in the valley, to convert the most stable software in your org to Elm, or to write a commit hook so that nothing can be pushed before the diversity committee has validated the change set.
We are paid to solve problems.
If you don't solve problems, when the hard times come, and they always do, you become part of the problem.
>But we were incredibly lucky: IT is the most amazing productivity cheat code humanity has come up with so far, so all this BS was accepted as the cost of doing business.
Aren't you seeing it backwards? Other professions should have the same "luxuries", in a humane working world. Instead people are treated like cattle.
IT being an "amazing productivity" boost means those "divas" you describe get only a small part of the value they create, especially since a heck of a lot of the rest ends up as unprecedented profit levels...
Under a capitalistic labour market theory, employees are not and can't be "overpaid" (and surely not for decades on end). They are paid exactly what they are worth, as price is based on supply and demand equilibrium.
Under a Marxist labour market theory, employees still can't be overpaid if the company makes a profit on top of what it would re-invest in itself (infrastructure, etc.).
So, under neither theory can workers be overpaid under any normal circumstances.
At worst, under capitalism there can be a higher market rate that gets adjusted down to a lower one, but at each point in time neither of those was "overpaid"; it was the best the market knew to set. The higher rate was appropriate supply-and-demand-wise for when it was applied (and it might rise again as IT companies compete for developers, and so on).
Let's not forget the "US/Europe needs X millions more developers" reports churned out every few months...
> 5. and lets not forget the complete anti-union sentiment and lack of "professionals" to unionize
I would hate for my bonus to be negotiated by the union. Generally the union gets new funding and the employee gets €14.89 a year as a “hard environment” bonus.
In fact I used to work in a unionized company as a developer, and not only was there groupthink, but the groupthink was bullying the weaker elements.
And as developers, we’re all socially inept, so the weaker elements were most people except the union.
Worst working conditions ever, it was ATOS in France.
As someone who spends every day in a state of the art FAANG office, I don’t think anyone deserves or needs this level of comfort. I think it’s wasteful and unnecessary. I can make my own coffee, I don’t need on-site baristas doing it for me.
The point of company perks is to convince people to forgo more salary than the perks cost the company. On-site baristas are cheap, but they seem really expensive. In the end it’s a subsidized company cafeteria, which has been a thing for the last 100 years or so.
The amount people care about free snacks is vastly out of proportion to the actual cost of providing them. That’s great, because young people making ~200k don’t really care much about slightly more gold-plated health insurance coverage relative to how expensive it is.
> The point of company perks is to convince people to forgo more salary than the perks cost the company.
It's also that the company can provide these things to all employees for less than it would cost for every individual employee to buy them by themselves.
On the other hand, if you're going to have a communal coffee machine, might as well have a bunch of other stuff and then pay dedicated staff to run it, for the convenience of everyone else.
Maybe it's copycatting of the college dorm startup vibe, too.
Ok but cutting benefits just means that baristas are out of work and more $ goes into the pockets of billionaire shareholders who need it even less than you.
The only reason those perks exist is the unfathomable amounts of profit those companies make compared to others.
Tax the shit out of these companies and use the money to fund more socially useful jobs in healthcare, caring for elderly people, writing code for public services and a million other things.
The fact that luxurious living wastes labour is not a plus for luxurious living. There are tons of things society could be doing with that labour that are more useful, they just don't get done because we allow pseudo-monopolies to retain too much rent.
I agree with that. But until that happens (if ever) I'd rather companies spread the $ around to their employees rather than keep it all for the shareholders.
You don't need fancy baristas and exotic blends, but cafeterias with people making coffee for workers have been a thing in offices and even factories for a century or so; it's not some unique privilege...
A lot of bad takes in response to this. An actual reason is that if you are as productive for FAANG as the numbers imply, $1M+ in revenue per employee per year, then having a barista make your coffee instead of taking 10 minutes to do it yourself is well worth the cost. It might cost the company $80 of your productivity when you make a coffee but only $13 to pay the barista. That delta, over thousands of employees and hundreds of work days, adds up.
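The arithmetic above can be sketched in a few lines. The revenue-per-employee figure is the one cited in the comment; the hours worked, coffees per day, and headcount below are assumed round numbers for illustration, not company data.

```python
# Back-of-the-envelope cost of a developer making their own coffee,
# under the assumptions stated above.
revenue_per_employee = 1_000_000   # $/year, the "FAANG" figure cited
work_hours_per_year = 2_000        # assumed: ~250 days * 8 hours
minutes_per_coffee = 10

revenue_per_minute = revenue_per_employee / (work_hours_per_year * 60)
dev_cost_per_coffee = revenue_per_minute * minutes_per_coffee  # roughly $83

barista_cost_per_coffee = 13       # the commenter's estimate
delta = dev_cost_per_coffee - barista_cost_per_coffee

# Scaled over an assumed 5,000 employees, 2 coffees a day, 250 workdays:
annual_delta = delta * 2 * 5_000 * 250
print(round(dev_cost_per_coffee, 2), round(annual_delta))
```

Even with generous error bars on every input, the per-coffee delta stays large enough that the conclusion doesn't change.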
> I can make my own coffee, I don’t need on-site baristas doing it for me.
Hey, some FAANG offices require you to make your own coffee. I had to make my own espresso on one of the $5k+ espresso machines in the microkitchen. Of course, I'd do that after company-provided breakfast in the cafeteria. My cash compensation alone was what I imagined my total equity would be worth if my first startup "hit it big" and IPO'd.
I agree that it's totally unnecessary, and would cringe when people would complain about the quality of the coffee purchased for the machines, that office X got free laundry service and we didn't, or that it was lame that there was no free dinner on Friday nights.
When I started my career, some Hot Pockets in the freezer seemed like an extravagance. I want a decent pay, a clean work environment, IT infrastructure that isn't a burden, and to feel like I'm making a real impact. Beyond that, it's all gravy.
I don't see why not, it's simple division of labour. A barista churning out coffees all day saves a full day of developer time, while being more efficient and skilled at the task. Same with canteens.
When those benefits get stripped, the savings won't be used to feed the hungry or the homeless. It'll get capitalized into equity values or profits raked in by the 0.1%.
OK, but the costs aren't coming from fancy coffee. They come from salaries of engineers. So the question is really about how many employees there are and how much those employees are paid. And maybe, secondarily, how many facilities are needed to provide space for a certain employee headcount.
If the US/EU get even a significant fraction of those developers needed, software dev salaries will drop to the level of other engineering disciplines, say mechanical, construction, or materials engineering. Employers would be very happy with that. And let's be honest here, it's not that different a job, despite what some bigger egos think about their work and the echo chambers that keep them in that dream.
I don't agree that those other engineers are treated as cattle, merely as normal employees. IT, mostly in SV and other places cargo-culting e.g. Google, is often really treated like divas. Bear in mind that this was never true in corporations where IT was treated as a cost center, e.g. banking. But from various HN posts it seems the typical young SV dev has no experience with that, and many feel they are changing the world for the better, when in reality it's at best a zero-sum game, and often not even that (e.g. optimizing the ad revenue stream for your corporation is a definite loss for mankind as a whole, as is anything that makes societal parasites that breed depression, like Facebook/Insta/TikTok, more effective at their goals).
But folks for some obscure reason need to feel that their work has a good purpose and the high moral ground, hence the sometimes quite advanced mental gymnastics seen here too.
As you hint, incredible demand for full-stack developers created high salaries at businesses that viewed software as their core product. Businesses that built things that merely had software mostly kept paying what they always had. That voracious demand seemed to feed on itself a bit and grow on its own. When times got rough, those businesses asked themselves whether they were getting the expected ROI on all those positions, and whether they could be more efficient and do almost as well with fewer heads. They apparently decided they could. I don't find that surprising; software isn't infinitely improvable. Once you have a product done and refined, there simply becomes less to do over time. If the hours spent maintaining a site never decrease from initial development levels, you are doing something wrong.
> IT, mostly in SV and other places cargo-culting e.g. Google, is often really treated like divas. Bear in mind that this was never true in corporations where IT was treated as a cost center, e.g. banking.
It's interesting to look at fintech, where banks purchase tech-focused companies with great engineers because their internal IT simply can't compete, thanks to a cost-center culture.
Any industry that doesn't have a regulatory moat around it has tech circling it, looking for the parts of the business with the biggest margins. Even Amazon and Apple are timidly entering highly regulated markets such as health.
The 75th percentile for Software is $151k vs. $120k for Mechanical. Median is $120k vs. $95k.
So about 26% higher for Software.
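Taking the quoted percentile figures at face value (they are the parent's numbers, not independently verified here), the premium works out to roughly 26% at both points:

```python
# Salary figures as quoted above, in $/year.
software = {"p75": 151_000, "median": 120_000}
mechanical = {"p75": 120_000, "median": 95_000}

for point in ("p75", "median"):
    premium = software[point] / mechanical[point] - 1
    print(point, f"{premium:.0%}")  # both land around 26%
```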
I would have expected the difference to be higher, based on levels.fyi and anecdotal salaries shared on HN. I guess the average software engineer isn't working for a top-5 tech company that pays well above average.
This is just academic gymnastics.
If market conditions dictate that a worker produces 10 dollars in value so you hire people for 9 dollars, and then the value of their production is reduced to 7 dollars you need to fire someone.
In the real world there is uncertainty in the projected value of their work (if it can even be measured, and not taking into account non-economic considerations). So someone can be overpaid for a while until it is discovered, at least in the way most people think about the meaning of overpaid.
Most developers don't live in the Valley, most of us didn't earn astronomical salaries during the past decade, most of us didn't have to run code reviews past a "diversity committee". We mostly just got on with our job like everyone else.
Of course we debated languages and frameworks but so what? Every profession has its arguments over tools and processes. Everyone complains about their manager and endless meetings.
Of course, it's a caricature dedicated to HN readers.
I have met plenty of competent devs, working seriously and providing value to humanity, well enough to justify the not-so-huge salary they get.
But stating the obvious is not going to make people think.
However, the message that part of the community is growing into entitled spoiled kids, shocked that their shining arses can even be touched by a layoff, is something I did want to convey :)
Sure of course some may be entitled spoiled kids. But do you know who is even more entitled and spoiled? The VC investor and C-Suite class, who are driving these layoffs. We saw a bit of this entitlement over the weekend as these armchair libertarians bleated for a government bailout.
By feeding the narrative that tech workers are spoiled, entitled divas who exactly are we helping?
Tech workers themselves. Hear me out for a minute. Tech workers have been partially insulated from the hard-nosed realities of “They pay me $5 because I make them $6 or $7.”
“They pay me because I know how computers work and computers make them $100K times N somehow, and someone else can make $500K per SWE, so that company established the SWE comp at $300K, but I don’t want to work at that company because they turned evil and have a hiring system that would take me unpaid time on my own to crack, so I’m willing to work at this other company I believe in more for only $150K, and since they make $100K times N from software, they’re able to hire N/2 of us and barely scrape by. Then, when my company’s profits take a hit from soaring input costs and we pull back on advertisements as a result, and the $500K company sees a drop in ad revenues and cuts back on their employees as a result, making that $300K reference now up for debate, and these other places pop up on the radar where people just as good as me are eager to work for half of what I cost, that possibility is appealing to businesses who are struggling with uncertain demand from their customers, who are stretched by their own soaring costs for energy, food, and housing. Hmm, maybe as a tech worker, I should try to understand whether this is another temporary tech recession and, if it is, how I should react to position myself and protect myself, my family, and my friends.”
This tech recession won’t last forever; we’re not going the way of ice harvesters or buggy whip makers, but I think we’re going to see a large scale re-alignment and even if tech salaries merely fall back to 2015 levels, it’s going to feel catastrophic to many.
2020 and 2021 were outlier years. Expect more non-outlier years than outlier years, and expect a reversion back to needing to connect value to costs. If you don’t know how your company makes money by employing you, try to figure that out, try to increase it, and try to be sure that they know how they make money from you.
We all do better when we create $X of value and take a high but fair fraction of that home for ourselves, leaving everyone in a nice, durably desirable trade of our skills for their needs.
I don't disagree, and I am not part of the SV bubble. Most developers in my country can barely even dream of anything close to US compensation. But we didn't create this climate; VC money and decades of low rates must have played a huge part.
Calling developers, one of the most well-paid and privileged groups of employees in human history, tech workers is an insult to every labor movement in human history. As soon as you have a posh white-collar office job, you stop being a worker.
Do you own capital or work for it? If the latter, congrats, you're a worker.
And again, HN people living in their little US coastal bubble thinking that the rest of the tech world is as privileged as them. When I started I earned less than a factory worker in my town. Presumably because they wore blue overalls instead of a hoodie they were "real" workers and I wasn't...or does there have to be some level of salary where I can be declared "posh" and no longer a worker?
I'd say Marx would also have his issues with the most privileged "worker" class outright refusing to unionize and even decrying unions as bad for everyone.
Sure, Marx and Engels both talk about their frustration with class treason, but they were also aware of the pervasive false consciousness in the working class.
Software has huge margins that this guy feels devs deserve less of. Reminds me of the Twitter employee who slept on the office floor getting sacked and saying she has no regrets. I guess it takes all kinds to make the world go round, and if we were all the same (i.e. lacked diversity) we’d be sheep, but at least the folks with the money to fail would feel a little better.
Developers are dramatically underpaid, just like all non-exec labor. It's ridiculous to complain about devs making 300k-500k a year when execs are pulling in millions or tens of millions a year with 10M-100B worth of equity under their belt. Focusing on anything other than that is a complete distraction and counterproductive. Devs should be getting 10X their current salary and execs should be getting 1/100th theirs.
I'll add that if you can't easily afford a family house next to your office, you're not overpaid.
Some time ago I used to work in a large organization at the very bottom of the food chain. I was making, say, $100k a year, which was quite decent money. Sitting there, doing the same thing, I could've grown to a "senior bottom of the food chain" at $150k. That was the limit. A soft one, but still the limit.
The organization was quite picky in selecting its workforce. Think FAANG. So in every team you have a bunch of quite smart, opinionated folks who somehow have to be steered in the same direction. With that, I see it as kinda reasonable for the team lead to make at least $200k a year. Give or take.
Now we move one step up. Someone has to pull all these "creme de la creme" cats together and herd them, so that at the very minimum teams don't work against each other, and ideally work together toward some common goal. And I can understand team leads who are not willing to go into this snake pit for a 30% salary increase. Why would they? 50%? Maybe. 70%? That sounds interesting and worth considering.
Bottom line: according to my humble experience in "the organization", the salary roughly doubles each time you step up the ladder. And having been on different steps of this ladder, I can understand why.
Can a family of 4 purchase a home within walking distance of the office? Is the CEO of the company making 1000X more than those at the bottom? How much equity do the execs have and how much is it worth?
Devs are very underpaid, it's just so much value is captured by those at the top that we're all used to fighting for the scraps that happen to fall off the real dinner table.
You could argue that devs are being paid the money that is being extracted from lower-class workers, too. Companies get to underpay for physical production labor, construction labor, etc., and some of that gets fed into developer salaries.
More gets fed into their profits and the executive salaries, but I think you get the idea.
I don’t think you can make the argument that devs are extracting money from lower paid employees when:
1. Exec pay is 100-1000x that of dev pay. I’d also include huge war chests of cash and stock buybacks as evidence that devs aren’t the problem.
2. The vast majority of “high paying” dev roles don’t pay enough money to buy a house in the real estate market that their office is in. A metric that I’d consider bare minimum to call someone adequately paid.
I guess my point is more that devs are a symptom, not the problem. We get the runoff from their exorbitant hoards, which are derived from whole swaths of people getting hosed.
I think if you account for inflation, devs are basically one of the only roles that have been treading water to keep a middle class lifestyle while everything else has dropped through the floor. I do think it's debatable that even developer pay gives you a middle class lifestyle if you can't afford housing.
I do agree that the entire system is broken and devs are mostly working on things that add negative value to humanity but effectively hoover up capital from the masses and deposit it in the accounts of the wealthy.
Because the capital market is opaque and corrupt. Execs aren't selected based on their ability to perform, they're selected based on how acceptable they are to the upper classes that control the capital. Even for those (few) who work their way up the ranks, jumping up levels will be based on politics and ability to be perceived as upper class and not based on performance (which is basically impossible to measure in a managerial role unless you want to give all of the credit of the worker's output to the manager).
So shareholders are dumb and execs are overpaid because they are good at duping them out of their money? Wouldn't a company that does NOT overpay their execs outcompete a company that does?
Shareholders are not dumb, but they have created a rigged game of castles in the sky paid for by greater fools. They look for execs that will successfully con the next bag holder, not people who create innovation, great products, healthy teams...
There's a lot of backscratching going on at that level as well. As an example, more often than you would think, VCs will invest in the parasitic children of potential LPs despite their terrible startup idea because they know it will make raising their own funds easier.
So the CEO is the face of the company, and companies are willing to pay a lot for a face that projects a certain image? If that's the case, then couldn't you argue that it's worth a lot for the company to pay to improve its brand image?
It's worth a lot to the shareholders to manipulate the stock price upwards, that's of no benefit to the customers though. Quite the contrary, choosing executives based on how well they sell equity opposed to how well they can operate a company produces worse results for the market (of goods/services, not of capital -- that's the distinction).
You're confusing the market for equity with the market for the goods and services produced by the company. The shareholders are optimizing for the former at the expense of the latter.
Well, all developers should be WFH so that shouldn't be an issue. For those companies backward enough to force their devs onsite, if there aren't enough homes to house the workforce local to the office, then the office is in the incorrect location.
If people need supervision from someone so extraordinary they will be paid half a million a year, in order to stop them from working against each other, maybe they're not really the "creme de la creme", and maybe organizations should be picky about the right things when selecting their workforce.
Or maybe developers are being overpaid compared to most workers, but underpaid compared to most "decision-makers" (most workers being so vastly underpaid it's not even funny anymore). Overpaid / underpaid is relative.
There's this ridiculous idea that management is partially responsible for the productivity of every developer, which means their salary should be increased commensurately.
You could just as easily argue the reverse: every developer is partially responsible for the productivity of management (whose productivity is zero if the devs get nothing done), so the devs' salary should be increased commensurately.
A hierarchy based on promotion and with increasing salaries isn't based in ironclad logic.
(In fact, you can see this in (in my opinion) two of the most important roles on a development team: the agile lead and the product manager! Neither of them is a hierarchical manager, yet both influence the direction of the team in critical ways. It works really well; far better than if either or both of those roles were filled by a hierarchical manager.)
This is like pure corporate propaganda: "of course execs do much harder work than devs, so of course they deserve 10x the paycheck!"
It's completely silly. If the average employee at Google is generating $1.5M in revenue per year but only gets paid $250k, then there is a massive delta of missing value in that equation that is going to Sergey Brin or Sundar Pichai instead, or to some other Stanford MBA in the massive web of "totally useful and productive" middle management at Google. Surplus value 101.
This comment right here is proof of the bubble: normalising 300-500k per year salaries. Developers don't have a skillset that is harder to develop than that of university academics or mechanical/electrical engineers in industry. Out in the real world, for equivalently skilled professionals in other industries, a 100k salary is great and a 150k salary is fabulous. I'm a 16-year-experienced engineering team leader responsible for process safety on multiple billion-dollar oil and gas facilities, and I'm on a great base salary of 140k, which is more than many of my peers. There are engineering professors who outstrip my skill level by a factor of ten who are on half my salary.
The exponential elevation of dev salaries into the stratosphere was a natural overshoot of demand vs supply of suitably skilled labour. Now the overshoot is resolving itself, starting with decreased demand. Anyone who banked on 300k-500k salaries being the norm has made the same mistake oil and gas folk have made in every boom throughout the 20th century.
$300k a year will not buy you a 3bd house in the geographies that pay $300k a year. That's a middle (not upper) class life milestone. That others are paid even less is a travesty and a sign of a deeply broken society. Again, the problem is not with those making $XXXk a year, it's those making $XXXM+ a year, or not working at all but controlling $XB. The problem is also that capital has distorted the housing, health and education markets to the point that basically no worker can afford what used to be a middle class lifestyle.
>I'll add that if you can't easily afford a family house next to your office, you're not overpaid.
You are correct, though I won't say that underpayment per se is the problem. It's more along the lines of: housing is very expensive due to a whole combination of government interference, like taxes, inflation, regulations, etc.
If you truly think that this radical change will make the company more productive, what's stopping you from raising a seed round for your new company built on this innovation as competitive edge?
I have done that and it's a great management strategy. You can hire world class developers and talent if you just treat them as peers and don't jack your own compensation up to astronomical levels. It's like a "cheat" to win at management. People will also be less likely to quit and much more likely to join you at your next role.
> what's stopping you from raising a seed round for your new company
Knowing what’s true and being able to convince some investors, who are not universally omniscient and reasonable and open to conversation from people they don’t already know, are different things.
I have been coding for 20 years now, pretty much on every continent, in many different projects.
At least half of them have failed.
I've seen an NGO move to Docker-based microservices for its data collection tools and burn the donors' money for 2 years before dying under the weight of the complexity it had just created.
I've seen startups work for months on ideas that made no sense, only to run out of cash with bellies full of useless lines of code.
I've seen corporations hire 7 people to do the work a single senior dev could do, then, from meetings to audits, proceed to ensure the budget would explode and the product would never be released.
I've seen fancy operations, with free food and fashionable people full of color and style. And they spent two thirds of their day virtue signaling, fighting over how to do things, what should have been done instead, and what we should do in the future. But certainly doing nothing right now.
I don't believe the free market is pricing the value of most things fairly. Just like I don't believe diamonds have much value, and never did, even back when people were still buying them for their wedding rings.
Yes, most of us are overpaid, given the problems we actually solve.
But we are in the IT golden age: if you throw computers at things, you get a 100x return. And money was cheap for so long.
So it was OK to hire 100 people and have 12 of them be actually useful, because you didn't have time to figure out which ones.
However, today the economy is contracting. The low-hanging fruit has been harvested. And it's no longer economically viable for companies (which were never our friends in the first place) to care about how special devs are.
man - if you've been at it 20 years, this isn't your first rodeo...
I've been working 40 years in Tech..
Saw the Wall St. ups/downs in the 90s - NYC was feast/famine back then
Saw the Y2K 'end of the world' ... We're still here
Saw the 'dot com' bust ... Tech/Internet didn't die
Saw the 2008 crash ... We're still here
and that's just what I can remember at 2 am :-) I fully expect us to be here after this current stuff blows over. And I expect there to be a bigger shortage of programmers due to retirement, changing careers, etc.
In fact, I expect there to be a shortage of programmers even if none of those things happen (and they will), solely because the need will increase as we automate even more things with AI. The people qualified to do the jobs we partially replace will not suddenly turn into devs.
So we will play divas for a long time. And good for us. Not going to complain that we have such a good situation.
But the fact that so many people are surprised by being fired is hilarious. They lived in the illusion that they were so special. As if it were perfectly normal to have as many privileges as we have.
We are just very, very lucky. I know doctors who work twice my hours to save human lives, doing so in terrible conditions and sleep-deprived, for half my salary.
> In fact, I expect there to be a shortage of programmers even if none of those things happen (and they will)
Did you see GPT-4 demo yesterday?
Seems to me that after an LLM has been trained to the level of GPT-3.5, not only are we programmers not needed, but most of what we can do can be automated.
So how exactly are programmers needed to automate things with AI? What am I missing?
The Fifth Generation project, Prolog, etc. - didn't change much.
Expert systems were all the rage for medicine back in the 90s. Don't see doctors out of work...
IBM Watson passed the bar in the UK and was hired as a 'lawyer' - don't see lawyers going away...
The hype cycles keep cycling ... Yeah, ChatGPT is neat - but it's still got a long way to go...
Edit: oh - and let's not forget, we were all supposed to be programming in Ada by now. It was going to 'take over' the world, first in the DoD... then spreading to general usage. I can't remember the last time I saw an Ada job.
Developers aren't getting replaced until we have strong AGI. GPT still hallucinates its output much of the time. You need someone who can review that output before you can trust it.
Maybe LLMs get to the point where they really help dev productivity and we wind up needing fewer devs overall, but that hasn't happened in the past, when easy-to-use frameworks like Rails made it super easy to spin up a Twitter clone. In fact, employment exploded.
TLDR: Devs won't be out of a job until we run out of things to automate, sometime around the heat death of the universe, or when we're being crushed by Skynet.
Beyond the very tiny portion of people who work on base necessities (the ones who would be given permission to roam while everyone else is supposed to be locked down), jobs are only about fanciful human whims.
"How many people will shortly die of starvation or medical troubles if the job is not performed?" might be a relevant metric here.
So, past these considerations, everyone is going to do needless work. Obviously, from some perspective, they are all overpaid to do so to various degrees. IT people, HR, the C-suite, and many others are in the same boat here; they just don't navigate with the same level of luxury.
> But we are in the IT golden age: if you throw computers at things, you get a 100x return. And money was cheap for so long.
Check back in five to ten years; this is a slight reversal in a continued bull run. Why? Because people with money want to make more money, and saving money, while important, is not making money. Worst case scenario, we all get transitioned into defense contractors to fight a future where our current overlords are supplanted by a rapidly rising, better educated and better executing CCP.
I agree with all that was said, besides the last two paragraphs. But I'd rather not debate the economic situation.
How do you see IT from the other perspectives?
For example, from the accounting perspective: the profitability of the companies, the cost/benefit ratio, profit per employee, etc.
How are the other industries holding up? Sports? Music? Shows & movies? Marketing? Law?
We are fortunate that we made the stack very complicated. And the stakeholders believe us that it's organic. Once the stakeholders become IT-proficient, we are doomed (fortunately, as the things stand, that's about never)
The stakeholders won't become any more IT-proficient than they have become accounting-proficient. CTOs are IT-proficient, and they are enough of a stakeholder.
Pay is a function of value. If you produce a lot of value, you get paid a lot. This is true for all jobs unless the business has figured out a way to get away with keeping most of the value for itself.
When a dev is producing code that generates $x millions either they're going to end up getting paid a lot or the business is going to end up making more profit. To suggest that a dev is 'overpaid' is just saying that you think tech companies should keep more profit from the value they create instead.
It's important to remember that despite all the layoffs, talk of recessions, and 'bad news' around at the moment, company profits and revenues are still way up. Meta's revenue went up about $32bn in 2021 ($85.96bn in 2020, $117.92bn in 2021). It fell to $116.6bn in 2022. Maybe the revenue will plunge back down to 2020 levels again this year, which would justify slashing the headcount, but I doubt it. I think it's a lot more likely that Meta will announce record profits at the end of the year.
A very free-market take. Do you believe that a fresh graduate at FAANG is really generating 6 figures in value? Do you believe that US employees on average generate 3x the value of their European counterparts?
I don't believe US employees generate 3x the value of their European counterparts, but the justification is right there -- they're paid 3x (or whatever multiple) more.
The actual problem is how you measure "value". We're used to valuing technical work on technical merits, but the market just thinks in money, dollar/euro amounts. And of course it's no fault of European employees that their "value" is significantly lower than their US counterparts -- it's probably due to company leadership, differences in regulatory environment, geopolitical and macro-economical factors, and history (the fact that modern computing started in Silicon Valley is significant).
I agree with the OP that US devs are producing more value - pound for pound - than European-based devs, if what we mean by value is money, and I'm a thirty-something European.
The reasons are that the European dev is probably working for a business that's addressing a much smaller market, and that the smaller market probably has a much lower GDP per capita than the US.
It doesn't mean the American dev is better at what they do, or smarter.
> from which they can afford to pay the devs 3x more doesn't actually mean they create 3x more value.
If these two companies are of equal size, or if we can find a ratio/cost per engineer/etc. that we can say is "equal enough", and we say that Company A makes 3x what Company B makes using this metric, wouldn't it be right to say that Company A's employees produce 3x the profit of Company B's?
Granted, Company A may have a stranglehold on the market, government handouts, etc., that are _causing the 3x profit increase_. I for sure agree with that. But, at the end of the day, aren't the employees, the ones producing the output, generating output worth more than Company B's?
European work culture is extremely weak. Where are all the European tech giants? The only reason American companies hire Europeans is because 1/3rd of the cost justifies the low productivity.
> Nope, he says this because Americans have a thing called the Protestant work ethic, whereas many Europeans have exactly the opposite of this.
This divergence is greatly exaggerated. Where do you think those hard working Protestants came from?
The main reason European countries are not as wealthy as the US is that they are all much smaller, and linguistic, cultural, and bureaucratic differences make pan European business much harder. You generally have very few large companies that operate in multiple European countries, in comparison to the US.
I'd argue that the free-market take would be that pay is a function of supply and demand, not value. As an example, musicians provide a ton of value, but the free market dictates that they aren't paid a lot, because people who would like to make music for a living are a dime a dozen.
Given that US and EU wages have about a 3-5x disparity, you might think a) the US worker is worth that premium and will fulfil 3-5x as much of your demand for work as the EU worker, b) you have a specific need to hire in the US (certainly understandable in some cases), or c) there is a mismatch between supply/demand and salaries - why hire in the US if there is so much cheap supply elsewhere?
Monopoly of US firms in the tech industry, their (possibly irrational) reluctance to go remote (until COVID at least), and US work-visa (H1B) quota restrictions.
Now that chasing growth has gone out of fashion and cutting costs is on the table, I suspect companies might lean more towards hiring cheaper, non-US workforce after the layoffs have settled down.
I couldn't afford my lifestyle if pay was a function of value; it'd cost too much. Pay is a function of how hard management believes it is to replace someone.
The value you produce is merely an upper limit on your salary. Your salary is based more on supply and demand, meaning how much your management believes it would cost to replace you.
This is way, way too simplistic and implies that you somehow can know what the value is... The whole discipline of ethics within philosophy has been grappling with this for thousands of years... because it's hard!
Now, just to give you a counter-example, where it would be very hard to argue that the job compensation was based on value: there was an article about a year ago about some government bureaucrat in either Spain or France who had been dead for seven years; the government failed to notice and kept sending him a paycheck month after month. -- How's that for generating value?
Now, of course, there's also a debate on the universality of value. Some believe that value is universal, but the reality is such that it's hard to justify this belief. For example, the value provided to a dictator of a fascist state by his bodyguards doesn't seem to align well with the value those bodyguards provide for the rest of the state (and especially for the neighbors of that state). So, how can you argue that the bodyguard's pay is justified? Why do you have to take the perspective of the dictator rather than the people under their thumb?
And of course, there is a debate about how to measure the magnitude of value: is it absolute or, again, proportional to the subject of it. Those who believe in universal value also tend to believe in absolute value, but they don't have to. So, again, a thousand dollars might be a difference between being able to make good on rent and becoming homeless for a poor person while for a rich person a thousand dollars might be so insignificant that they don't even notice spending it. And then again, you need to work hard to convince others that the value is absolute (i.e. that dollars don't capture the value and so on).
----
Independently of the above:
> When a dev is producing code
I worked for my previous employer for three years. It was before they had any paying customers (a start-up). After I quit, for various reasons, they decided to get rid of my code, and replaced it with something else. In other words, while I worked for my former employer, none of my code generated any value, but I still got paid. They still aren't even breaking even (they have something like five customers), and none of my code is in use anymore. How do you explain my salary then?
And I'm definitely not the only example. Large companies are known for throwing money on something that ends up being a flop, speculatively. So, it looks like there's more to it than simply writing code for the product that generates profits. Don't you think?
Let me put it this way: why should companies use value based pricing and extract ridiculous amounts of profits from consumers for their mediocre services, that goes into the pockets of just a few shareholders, whereas the labor of workers who actually produced the stuff be rewarded by a completely different logic (akin to cost based pricing)?
When workers have some actual leverage, suddenly the free market is a problem and not a magical fairy anymore.
If companies on the market can sell overpriced garbage to enrich their shareholders, surely you can't seriously complain about laborers being overpaid, even if they are. It's all the same game.
The whole system is totally unfair and unreasonable, developers being overpaid is not the problem and such a minor thing to focus on.
Stating that those people are overpaid doesn't negate the fact that dev is an incredibly well-paid position with luxurious work conditions, which people manage to complain about while being bad at it.
If you accept the premise that we should mock incompetent diva shareholders and CEOs, then you should accept the premise that we should mock our own community.
Plus a bit of self-deprecating humor is good for the soul.
Nurses? Our job is a lot easier and less important, and our working conditions are much better. And yet, a fresh grad straight out of school gets a few times more money than an experienced nurse. And, what is worse, a lot of devs honestly believe that they deserve it.
Are you so naive you think compensation is actually tied to societal value and not revenue generated per employee? Oh what a world that would be, Pre-K Teachers would be living in gated Mansions and garbage men would be driving Bentleys, it certainly isn't ours though.
Ah yes, I remember them. The students that were taking the easy chemistry instead of hard chemistry, the easy physics in 2 classes instead of hard physics in 3 classes, and who mostly seemed to be fairly clueless and just doing nursing because someone told them that health care is a growth field. Reminded me a lot of aimless business majors.
Their jobs are also on average easier, their education was easier, and their ongoing training investment is lighter, and maybe (although this I don't know) they have more job security.
Edit to reply:
No, this is not a joke. I remember this impression I had very clearly. I also still roughly remember the courses each major needed to take.
We don't pay people (much more) based on the physical difficulty of their job. If we did, I'm sure the migrant labor that picks our fruit would be first in line for a raise.
My RN friend graduated and got her RN a full 3 years before I graduated with my CS degree. Nursing might be a more demanding job, but it has a much lower barrier to entry than CS, and thus it has a much larger supply (despite nursing orgs complaining about shortages), and a fairly fixed demand that's only increasing as the population ages.
Most programming jobs neither require nor actually need a CS degree. I don't have one and I'm doing pretty ok. Gluing frameworks together is not exactly difficult and has nothing to do with lambda calculus or compiler internals
In countries with socialized care, there's one employer that dictates working conditions (the state). Over-licensing means it's a hassle to port a license to another jurisdiction (to be subject to a single employer again).
In America, same issues mostly except it's not the state that employs but an organization that's given a monopoly over a region thanks to certificates of need. [0]
If almost all the skilled jobs are similarly “overpaid”, maybe that’s just what ordinary good pay looks like.
Some people might be underpaid, and that would be a different angle.
As it is, in the Bay Area, a typical startup software engineer only earns 4-5x what a person who makes burgers at a fast food restaurant makes (typically $20+ per hour, so $40k+ per year). Despite years of education, monetary investment in that education, constant retraining, long hours, etc. It doesn't seem an unreasonable multiple to me; I'm just trying to get a sense of the situation.
Those arguing that we should make less than or around $100k, are we really producing no more value than 2 people making burgers, or one especially fast burger maker?
Also note that wait staff at good restaurants in SF can earn $80-100k.
And even decent outsourced developers in low cost of living countries charge $30-50 per hour (60k-100k per year). With again, less impressive education, and assuming a 40 hour work week which is lower than many engineers in the Bay Area work, and that’s not even for top skilled outsourced work. Try paying less than this now, you’ll pay for it multiple times over later.
> We are overpaid for incredible working conditions and devs basically became capricious divas, despite the fact 90% of them are plumbers, and many not very good ones.
Maybe in the US, but in the UK a highly paid software developer can barely buy a 2 bedroom flat in a suburb where 30 years ago the salary of a junior factory worker was enough to buy a house.
> If you had any professional doing the same
Other professionals are less productive by orders of magnitude; just check what's going on in the accountancy department of a random company. You can double their productivity by introducing Excel's pivot tables and a VLOOKUP. Or try to get a plumber to fix your toilet within time and budget.
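For what it's worth, the two Excel features named above are conceptually simple: a pivot table is a group-and-aggregate, and a VLOOKUP is a key lookup against a second table. Here's a minimal sketch in plain Python, with entirely made-up ledger data, of what each one automates:

```python
from collections import defaultdict

# Hypothetical ledger rows an accountant might reconcile by hand:
# (department, month, amount).
ledger = [
    ("sales", "Jan", 100.0),
    ("sales", "Feb", 150.0),
    ("ops",   "Jan",  80.0),
    ("ops",   "Jan",  20.0),
    ("ops",   "Feb",  60.0),
]

# The "pivot table": sum amounts grouped by (department, month).
pivot = defaultdict(float)
for dept, month, amount in ledger:
    pivot[(dept, month)] += amount

# The "VLOOKUP": enrich each row with a value looked up by key
# in a second table (here, a made-up budget per department).
budgets = {"sales": 300.0, "ops": 200.0}
enriched = [(dept, month, amount, budgets[dept])
            for dept, month, amount in ledger]

print(pivot[("ops", "Jan")])   # 100.0
print(enriched[0][3])          # 300.0
```

The point stands either way: both operations are mechanical, and anyone doing them by hand row-by-row is leaving an easy multiple of productivity on the table.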
Where do you find these plumbers? When we owned our house, we never were able to find any contractor that was reliable, upfront about costs, etc. Hell, getting them on the phone and to show up was hard enough.
Word of mouth from neighbors and colleagues, mostly. I mostly DIY small things around the house, but when I have a plumbing task I can't or don't have time to do, I've had good luck in calling those plumbers, getting an upfront quote (often, "here's our trip charge and hourly rate, and a toilet repair is going to be less than the one-hour minimum and around $75 in parts")
The reason I do it myself is that $75 in parts is actually $20-30 in parts and I can order parts and then do 45 minutes of plumbing work faster than I could figure out what plumber to call, arrange a time, and then stay home to let them in.
Those plumbers probably don't have to deal with a "not invented here" cistern and related plumbing with bespoke thread pitch and piping diameter either.
IT workers are divas the way basketball players are (to some extent): why on earth does LeBron James earn millions? Answer: because the team that owns him makes much more than that by selling his image. Easy. Same for IT workers: IT companies make millions per employee, so they pay us hundreds of thousands in exchange. Easy.
Oracle makes millions, but is it because of their amazing devs?
Doctors don't make millions, is it because saving human life is not valuable?
Correlation is not causation.
Not to mention LeBron James actually ships. Indeed, he plays the matches he is paid for. And doesn't cry on TV that the coffee is not good enough in the stadium.
I don't understand. You complain about coworkers getting high pay and complaining while not producing any value. Do you think a downturn and potentially getting fired will educate them in these matters, and that getting less pay will make them solve actual problems?
Re: Oracle, yes actually it is due to amazing devs. OCI paid FAANG salaries so they could get FAANG talent. Now they're the only cloud provider with robust growth in their cloud products
> Oracle makes millions, but is it because of their amazing devs?
Meaning? If it's because of their sales people, they are making a good salary as well.
> Doctors don't make millions, is it because saving human life is not valuable?
As you pointed out, we don't value them enough. People nowadays prefer to spend their time thinking TikTok/Twitter/GitHub... so that's where the moneys goes.
> Not to mention LeBron James actually ships. Indeed, he plays the matches he is paid for. And doesn't cry on TV that the coffee is not good enough in the stadium.
It's not about shipping. It's about selling. Sad, but true.
It's hard to read because of the lack of structure, but in the end they're complaining that some code golf competition's winner wasn't legible code that's great to maintain.
If your job is to work on code that looks like that, that really sucks and I'm sorry for you, unless you love doing it, in which case cool. But for most of us it's really not and while I often feel close to being broken, I'm 100% convinced that plenty of jobs are much more taxing, and if I reduced my hours, it would be a breeze. But there's features to build and bugs to fix, and I'm going to retire in a year or three in my forties because that's how good tech pays even people like me who can figure stuff out but are by no means geniuses.
Frankly, because the meaning and purpose of "diversity" has been corrupted to the point that it actually causes issues like the one described. Similar to the corruption of the word "racism".
I think you're missing the point by assuming git branch renaming is the primary (only?) cost. Diversity (meaningful diversity of thought and opinion, that is) actually has business value. That's the reason it was pushed so hard from the top way back in the day, it was seen as a means to improve product offerings and problem solving processes.
These days "diversity" is nothing more than a euphemism for weird racial, sexual orientation, and gender fetishes. These "diverse" workplaces are, ironically, echo chambers for the same patterns of thought based around the idea that superficial characteristics like gender and skin color are meaningful predictive traits
Every new movement has a hype cycle. It's fine. The hype will eventually die down, the useful stuff will remain, the rest will go.
I still can't find any black people to hire for some basic positions. This is a problem. Research continues to show ML models to be biased in favor of the people who create them, and against the people who were never in the room. This affects society in a huge way when that model is used for law enforcement, health care, education, etc.
Why is that a problem? If you can’t find black people to hire, it means black people are not having problems finding employment.
Why do we expect the population to perfectly segment itself in such a way that every workplace has the correct percentage relative to some interpretation of the population?
This idea that black people are needed to provide some sort of black perspective on every team is so weird to me. You’re not your race and the type of black people who are qualified to work in top tier tech companies have more in common with the upper classes and college elites than they have with the stereotypical inner-city poor kid raised by a single mom.
> If you can’t find black people to hire, it means black people are not having problems finding employment.
Yeah, no, that's not how that works dude. That's so incredibly far from reality I don't even know where to begin.
> This idea that black people are needed to provide some sort of black perspective on every team is so weird to me.
Of course you can't fathom why an outside perspective is necessary... You don't suffer discrimination. You've never needed to care about how black people are treated, or what opportunities they have, or how their lack of representation in all walks of life hurts them. It's hurting them, not you.
The reason we care and want black representation on our teams, is we care about black people. That's pretty much it. There are other reasons ("diverse experience provides a wider perspective", "better serving customers who aren't white", etc) but the main one is just.... Wanting to be less shitty to black people. Expending slightly more effort to find and hire them is the very least thing we can do to move toward a more equitable world.
> the type of black people who are qualified to work in top tier tech companies have more in common with the upper classes and college elites than they have with the stereotypical inner-city poor kid raised by a single mom.
Um. No... So much wrong with that paragraph... Please don't repeat that at work... You would probably get reprimanded, possibly fired. Holy crap dude.
Besides all the weird racial and class assumptions there... Just for your edification, from my own anecdotal experience, the black people I work with have very little in common with me, culturally, class-wise, etc. We get along fine, but we do not have anywhere near the same background or experience, and I have a million times more privilege.
And besides all that, a lot of people in tech, who aren't black, don't come from an upper class or elite colleges. Maybe they do at FAANG? (If they do, I really don't wanna work there)
I work like a dog and have been since engineering school. Actually was working like a dog in school too. I think I genuinely deserve the pay considering the day to day stress of the job and the complex workload.
Lots of people work like dogs and it seems to have no strong correlation with their pay. It's never been about "deserving", it's just supply and demand.
Developing code (duh),
leading a team,
reviewing PRs,
monitoring the continuous release pipelines,
providing production support,
recruiting (takes a surprising chunk of my time),
sprint ceremonies,
fixing bugs and talking to QA.
And yet that's written by someone who is working for 10 times minimum wage from his home on his adjustable motorized standing desk. While ordering sushi for lunch (because I can afford to every day and the maid is off, because I can afford a maid), I received a text message from my friend, a nurse, saying that she will do 50 hours this week and can't come to the dinner we planned.
I'm French, lived in Africa, worked in Asia and the US, and, no, this is not a twitter thing.
I already considered myself amazingly lucky just for having hot water, electricity, and always have heat/sleep/food.
Well, yes. But the solution to most of the stuff you point to is to have better management to rein in the divas, better hiring, better training for those managers and developers, more stable stacks so there's no excuse to rewrite into Elm, etc.
Which our industry also seems to have given up on, because it costs money. So we'll have divas, bad devs, wasted resources, Elm.
Of course, the alternative to spending money on fixing this is much more expensive: just hire more devs... but who said our industry is rational.
I mostly agree that we are divas and that 2023 is a wake up call.
I agree that tech stack churn is a big problem, although I feel that it's a sign of immaturity of the industry and domain themselves rather than an issue with individual developers. Much of our job is to find more efficient/robust ways to fix issues, progressively compressing the job of N people into something that can be done by o(N) people. That means that we are trained and selected to spot inefficiencies and take them personally. That also means taking new technologies for a drive.
I'm sure that there are better ways to do this than trying them in production, but as long as we operate in an industry that cannot differentiate between self-training-for-next-job, research, industrial prototype and industrial product, we won't be able to fix this problem.
I hope that, as the industry matures, this dust will settle a bit. Perhaps one aspect of this will involve letting developers take a few paid weeks once in a while to work on prototypes with whichever technology they want, as a form of self-training. Another will be making companies actually liable for the damage they cause when they try and pass early prototypes as products.
As for diversity, I tend to disagree. In my experience as a dev, a tech lead and sometimes a CTO, diversity is not just a political choice; it is also practically useful, for the same reason nepotism and consanguinity are bad. Diversity is what saves you from blind spots (aka "acquired stupidity"). Diversity is also what lets you hire brilliant people who have been overlooked by other companies.
Basically, in my book, it means "don't build a team of clones of yourself", because everybody having the same profile is how you create blind spots.
Hire people who are going to have a different perspective, perhaps because they have a different reasoning (e.g. neuro-divergent candidates), different origins, different curriculum, different life experience, etc.
In the often adversarial world of exchanging labour and creativity for money, shares, and prestige - and because we programmers care - please don't blame us for not being able to assess how much the business should pay us.
Our job is to attempt to know our expected potential worth and then negotiate for that recompense.
There are things we often don't know, i.e. megacorps may hire us to stop us working elsewhere.
If a business chooses to neglect to know this, because it is either not worth it to them to find out or they are incapable, that is not on the worker.
This is essentially an attempted value crash, or hyperdeflation, of the monetary value of labour.
This meme that we can do vastly more with vastly less talent is a spreadsheet fancy of MBAs far from the coal face of code or customers.
The real cost is all the top talent people bail and only those who can't or won't leave remain.
But management think they can properly assess the value of people finessing a process they can't do themselves, little understand, and are probably insulated from - one they only see through metrics to which Goodhart's law applies, meaning the stats are juked.
> We are overpaid for incredible working conditions and devs basically became capricious divas, despite the fact 90% of them are plumbers, and many not very good ones.
Overpaid relative to what? Certainly not the value created.
There's this meme that software engineers are just capricious and don't "deserve" the money (often it's a weird form of jealousy from other fields who simply can't match an engineer's output) but it completely ignores the enormous gains in productivity enabled by the field. Is there any other field where such an impact can be had by a small team in the Valley?
> If you had any professional doing the same, wasting so much resources as us, changing part of the tech stack every month, debating vocabulary on twitter ad nauseam instead of coding, and whining about how their first world problem should be the focus right now rather than doing their job, they would get laughed at.
You seem to have a very particular view of the profession.
> But we were incredibly lucky that IT is the most amazing productivity cheat code humanity has come up with so far, so that all this BS was accepted as the cost of doing business.
Is it luck? People saw the writing on the wall decades ago.
While your post was certainly hyperbolic I was uncomfortably surprised by how many people disagreed with you.
For me, the thing that disappoints me today in our industry is the number of mediocre professionals who grant themselves an elevated status just because they get paid more than the average person in their society.
At the end of the day we’ll have bigger problems if the trash doesn’t get picked up or if that power line doesn’t get fixed than we will if some developer doesn’t solve some abstract problem.
We don’t “deserve” anything and maybe we earned it through hard work but not enough of us appreciate our good fortune of being professional programmers.
I mean no disrespect, but a lot of this sounds like stereotypes of what software engineers do. Yes, having free coffee and maybe even free lunch are nice perks, but those are very small compared to salary + office space cost. And RTO has shown the latter to be negligible anyway.
200 years ago it would’ve been crazy talk to get Saturdays off. In some countries it still is. I don’t think not having to work 996 is a sign of waste or laziness, it’s a sign of progress.
> We are overpaid for incredible working conditions and devs basically became capricious divas, despite the fact 90% of them are plumbers, and many not very good ones.
I once had a job where we created and supported a bunch of Golang microservices in the backend. Like you alluded to, I couldn't help but feel that I was a glorified, overpaid plumber.
> We are overpaid for incredible working conditions
I have to disagree with this for a number of reasons. By and large, people in tech work overtime, meaning per-hour pay is lower than what you think. To be decent at your job you need to constantly learn, which is also not factored into wages.
The working conditions are horrible. You sit at a desk, usually in a crowded, loud office, which is highly detrimental to your health.
Often you need to commute long hours meaning you are detached from social and family life. Many struggle with starting a family.
Moreover, you have to constantly chase small tasks, constantly shifting focus and have to deal with obnoxious managers.
All things considered, tech is not a well paid job. Not by a long shot. While we "enjoy" sitting in offices and an apparent high income - at an enormous cost for us - the guy next door owning a corner shop enjoys a family life, likely owns a property and doesn't need to worry about keeping pace with daily changes.
I'm going to be honest with you, working in tech myself, these points feel incredibly out of touch.
> The working conditions are horrible. You sit at a desk, usually in a crowded, loud office, which is highly detrimental to your health.
In many professions you're expressly forbidden from sitting, even when carrying out tiring physical work. Most tech employers are willing to purchase an adjustable standing desk should you request it.
Outside of tech, people are frequently prevented from hydrating, nourishing, or relieving themselves unless given permission to do so by their employer.
> Often you need to commute long hours meaning you are detached from social and family life. Many struggle with starting a family.
This is in no way exclusive to tech workers. If someone earning a tech salary cannot afford to live reasonably close to their place of employment, how long do you think the commute is of the person serving them coffee or cleaning their office? We also benefit from having the option to work remotely.
> Moreover, you have to constantly chase small tasks, constantly shifting focus and have to deal with obnoxious managers.
Again, not unique to tech workers and certainly less impactful. Tech managers have relatively little power in comparison to other sectors. In the service industry your manager can effectively fire you with no oversight by simply not scheduling you. A server will be expected to manage 4 or more tables at a time, remembering who ordered what, even when interrupted by requests from other customers.
> All things considered, tech is not a well paid job. Not by a long shot. While we "enjoy" sitting in offices and an apparent high income - at an enormous cost for us - the guy next door owning a corner shop enjoys a family life, likely owns a property and doesn't need to worry about keeping pace with daily changes.
Most people working in tech enjoy those things as well. If you can't, you might consider re-evaluating your situation.
So... exactly like most every other job in other sectors? With the added benefits of zero physical danger, no exposure to the elements, no personal costs for tools, and comfy office chairs? There are innumerable tradespeople who envy such perks.
Long commutes, time away from family, not having family, skipping vacations, and working in crowded spaces are all par for the course in the modern economy. I'd say more but it is 5am and I'm already late.
Exhibit A of the kind of thinking the OP is talking about. Try exchanging jobs for a week with 5 other randomly selected people in the U.S., let alone in third-world countries, and see who doesn't fight tooth and nail not to go back.
Please rattle off this list to a hotel employee. Gardener. Flight attendant. Bike mechanic. Store clerk. Teacher. Builder. Some poor fuck slaving away in a dead-end back office job.
I guarantee you that in the absence of witnesses half the people will give you a good walloping.
Lol. When I was a student, I worked all sorts of shitty jobs. Here's how it looks when I compare the overtime hours I worked in different places.
1. Factory work: no overtime, ever. It's against the safety regulations. When the bell rings, you must go home, or punitive measures will be taken against you.
2. Working as a waiter / room service: you fight for overtime because you get paid extra, especially if you do overtime on holidays. It's hard, but it's totally worth it.
3. Other shift-based work, such as night guard, cleaner, cab dispatcher: overtime usually happens if the next shift is tardy, stuck in traffic, etc. It's annoying, but doesn't happen a lot.
4. Bakery. Holy hell! You have to show up at work at like five in the morning, and you get two breaks during the day when you can sit down. Your day ends around five in the afternoon, unless it's a holiday when everyone wants extra donuts / cakes / pastry, in which case you go home at eight in the evening. No pay can possibly justify this, but you work for pennies.
5. Newspaper. Every now and then you need to sit in the office and wait for the important game to finish so that you can publish the score the day after. Meh. It's fine. You spend the time sipping tea and chatting to the other person staying with you.
6. As a programmer: you switch your status in Slack to WFH. Also, the amount of overtime I ever put in as a programmer was negligible. I know people in game development work their butts off and do a lot of overtime, but that's unique to that field. The rest of the programming world just doesn't see overtime at all. Well, maybe NOC, but they aren't really programmers.
----
7. I'm not a doctor, but my wife is. Doctors work the most overtime of all professions I know. They don't even really have it as a concept: if you are a surgeon, for example, you just keep going until the surgery is done. If it takes days, then it takes days.
Do you mean "IT" as in the people who handle the technical issues and equipment a company faces (they might work in shifts)? The other way to use "IT" is to refer to the whole group of people who operate computers at more than a user level, mostly people who'd self-identify as programmers.
If you mean the former, then yeah, these people might work in shifts, but usually don't. They work in shifts only if the organization they support needs 24-hour technical support, or a NOC, as I've already mentioned, and most organizations don't need that. This technical-support interpretation of IT very rarely involves programming anything (exceptions are SREs or PEs in large companies where there's so much complexity in the in-house infrastructure that the tech support needs to program).
In the latter case, there's no need for most of IT to work in shifts, and definitely not at night, just as there's no need for accountants to work in shifts (and definitely not at night).
My side gig, which is not tech, made more money than I ever did as a tech “worker”. That's when I realised that we are living in a bubble. Step outside of it and you'll be surprised at how much your mind is being wasted sitting in a chair all day chasing Jira tickets like a drone (no insult intended). Our minds are designed to be analytical and organised. That's precisely what can give you a massive competitive edge.
We like to discover, learn and master. Just as you can learn a new technology in no time, so can you learn how to build a small business and be damn good at it.
The big tech giants are paying enough that most engineers don't consider the risk of opening their own business to be worth it.
Owning the shop provides intangibles like being your own boss, somewhat less sedentary lifestyle, more community interaction, etc. But the financial rewards are most likely no better than being a developer at a FANG.
That corner shop may have been started decades ago. The opportunity to buy it at a reasonable price has probably long passed.
> We are overpaid for incredible working conditions and devs basically became capricious divas, despite the fact 90% of them are plumbers, and many not very good ones.
This is a great answer. And there's a niggling worry in my mind as to whether the sudden capability explosion of LLMs might eliminate the bottom 50% of us completely.
Yes, but that may take 5 years as everyone learns the OpenAI API (how to call it), integrates the response into a text box, and releases AI autocomplete features across the product suite.
Boomers lived in a society where everything was not yet sold, where every piece of land was not yet occupied and optimized.
Now the theater is full.
It's not that boomers earned a lot and today people earn too little.
It's that back then things were less scarce relative to demand.
Also, I earn way more than my father and grand-father, and my life style just cannot be compared, nor the energy I consume or waste I produce.
So as an IT guy, no, this is not enough to explain it. And boomers were certainly not divas; their working conditions were definitely not as good as in IT today.
The country isn’t really full even if the Bay Area, NYC, and other large metros are.
At a certain point, the cost-of-living disparities will move people to up-and-coming cities. Some of the teleworking gains from the pandemic are being wound back, but it's still a view of the future.
Are we overpaid? The average software company is pulling in half a million per employee. That includes relatively worthless hr, admin, and other business people. Individual engineers prove over and over again that they can start multi-million dollar businesses on their own without any of those people.
Software, and the engineers who make it, proves itself to be very valuable; engineers should be compensated accordingly.
Individual engineers who start multi million dollar businesses also prove over and over again that they need those “worthless” other business people by immediately hiring them once they get to a certain size.
Not to mention that the engineers who do manage to start these companies are usually not top of the engineering field but rather the ones who are also good at sales and marketing and product management.
It's kind of the same for me: our product solved a problem that was not annoying enough for people to consider buying it, so we never found product-market fit and never managed to sell a single license. The company still exists today, but it's mainly surviving on us doing freelance work and hoping one of our side projects might generate some revenue.
Maybe as open-source, the product could have some success, but I don't know how to "sell" open-source or make a revenue from it.
... but aren't job postings a poor measure of actual hiring? Once you reach the point of not being able to hire the people you want in a timely manner, you can create _lots_ of postings, on different channels, at different levels of seniority, with different kinds of focus, etc, as a desperate and ineffective measure of getting some qualified applicants in your pipe.
I noticed there were a lot of job listings for crypto which just stayed up for ages. I suspect they had trouble hiring, as they need someone smart enough to work on this stuff, while also lacking the ethics to avoid scamming people, and often dumb enough to be scammed themselves into having part of their payment be rug-pull tokens with conditions blocking selling for x months/years (after the pull).
These no hope listings probably inflated numbers a fair bit.
Crypto companies have a more sinister reason for keeping job postings up.
It's all about marketing for crypto companies that have tokens on exchanges. Their goal is to raise the token price so they can dump on retail. Having many active job postings show that they're active in development and they have plenty of money to keep the project going. If they don't have any job postings, it can be interpreted that they've already rug pulled.
It's all part of how they fool the gullible.
There are literally analytics companies that use job postings as a data point for crypto companies to see if they're active or rug pulled.
Keeping job postings up is free.
Source: Worked in and consulted for a few crypto companies years ago and analyzed many crypto scams.
I have coworkers who were offered tokens in a previous engagement. They were given the option to sell the tokens as soon as they received them (and receive cash instead), and they made quite a lot of money from them.
I subscribe to a couple RSS feeds for some remote job sites.
A very large % of jobs are for crypto (ok "fintech" but look closely it's crypto) in March 2023. I wonder how these companies are still going, or maybe the listings just haven't been taken down yet.
But that would always be the case and wouldn’t explain the pandemic bounce. You need there to be a sudden relative increase in this type of behavior at the beginning of the pandemic.
> You need there to be a sudden relative increase in this type of behavior at the beginning of the pandemic.
1. Increased funds available to VC due to low interest rates and hot tech companies.
2. Many tech companies having a pandemic-fueled field day.
3. VCs pushing startups to hire aggressively to meet current demand (smart) and prepare themselves for the remote work, tech revolution paradigm shift (smart in moderation, less smart when leaned into too aggressively).
If you're looking for jobs on Indeed I'm confused as to what you're doing. I found nothing useful there when looking in February, meanwhile LinkedIn was incredibly useful.
Did about 15 interviews and accepted an offer. No issues.
Indeed never had a single attractive company to apply to in that time: It was all consulting.
It may be a location thing, but in the upper midwest I see just absolute garbage on LinkedIn and mostly garbage on Indeed. Both are full of consultancy garbage, but LinkedIn refuses to show me local jobs. I suspect they don't want to return a mostly empty search page, so they fill the results with Accenture consulting listings that are supposedly for my smallish metro. Indeed at least shows me local listings.
Although my situation might be a bit niche for this board, since I'm in a small area and I'm not interested in remote work, for mental health reasons. But it's odd that a geography change causes such a complete reversal of which site is better. LinkedIn is still worth it for the professional network features, but honestly I'd kind of prefer the Craigslist interface, since that is actually searchable.
I'm a tech lead in France with a detailed resume and LinkedIn sometimes shows me positions as a nurse in the US. No clue how their algorithm works, but I'm not highly confident in its quality :)
My last search, Indeed was great. They did free, decent resume consultation with me, and assigned me some rando to point out some good listings to apply for here and there (that surprisingly were generally well curated). Got several interviews and a job in short order.
(no affiliation, they just were surprisingly helpful... I still don't know why, my situation at the time did not make me an amazing candidate or anything)
I’ve noticed that many applicants for positions at my company from Indeed.com are entirely unqualified. Often truck drivers, etc. for a software engineering role and the resume will have no engineering related experience.
I’ve been told that many of these applicants are people applying to jobs they have no hopes of getting interviewed for. In many states you must prove you are actively looking for a job to collect unemployment benefits.
Man I thought it was just me, but yeah Indeed is truly terrible. My only way of getting interviews has been linkedin (both applying to and getting messaged) or directly applying on company's websites thus far in my career.
Are there any other job boards that are actually authentic?
Agree with this. Forgot they removed the board as it’s been kind of a while now.
One thing I didn’t really like was having your stack overflow profile be relevant in hiring decisions - probably good for some/many devs though
I heard GlassDoor also has jobs but I never used them, only LinkedIn and Indeed (I'm EU-based).
Never got hired through Indeed but they have a few jobs that aren't on LinkedIn.
Also, is it me not knowing how to use Indeed, or does their interface not support the most basic thing: EU-wide search? With LinkedIn, I type "European Union" in the location box and it shows me all of the EU. With Indeed, I have to select every individual country; it's much more limiting and I get bored fast.
Sure, I can select "United States", but dudes, having EU countries listed individually with no option of doing a whole-area search is like having to manually search "Arkansas" or "Oklahoma" jobs in the US; it's myopic, to say the least.
I can’t figure out why there seems to be a new heavily advertised jobs/employment site every couple of years. The first one I remember was monster.com. Now it’s Indeed. There have been a number of others. Why do these sites spring up, seem to be everywhere, and then get replaced after a couple of years?
They’re a terrible measure. From a comment I made about this assumption elsewhere on HN a few weeks ago:
I am hiring at an early stage startup myself. There is absolutely no way we’d extend offers for all of the roles we have open if a candidate arrived for all of them next week. Growing that much so quickly would be a disastrous onboarding experience for most of the new hires and potentially risk our ability to build a consistent culture with the team. So we’ll hire whichever candidate(s) successfully complete the process first and then pause the other roles until we’re ready to onboard more people.
Then there’s also the reality at this stage you need people who can wear multiple hats. And there’s a bunch of roles where your ideal candidate doesn’t have a neat pre-established label. So sometimes we’ll post the exact same role with different titles to try and make sure it gets the attention of someone who most strongly aligns with one of those job titles.
At a previous place we worked remotely, and so the same job would be posted as both remote and then also as a dozen different specific city locations too. But if we were hiring multiple of any roles you wouldn’t multiply that, you’d just post the one listing(s) and keep it open until all the positions were filled.
A jobs page is a marketing artefact for potential hires, it’s not a financial reporting/forecasting tool.
TLDR: don’t make any assumptions from a job board about how many roles a company is realistically hiring for.
This would also explain part of the uptick in postings post-pandemic, as companies that became more open to hiring remotely are more likely to post dozens of identical job postings across a variety of cities.
The reality for most of us I suspect was sitting at home on our couch fixing an obscure bug in a CI pipeline while half-listening along to an all-hands meeting.
Yeah, the "day in the life of a software engineer at X name brand company" has been on YouTube for years and definitely predates the TikTok trend. It dates to at least a few years prior to the pandemic.
They don't necessarily match, but the aggregate directional numbers are meaningful. Indeed postings aren't free. The mismatch between tech and overall also matches the broader narrative about a tech recession, but the rest of the economy is doing sorta ok, at least until January 2023.
So you're right, but it matches enough things that I wouldn't dismiss it.
Actually I used to work producing data from job posting stats. I think there’s a strong correlation between job demand and job posts (job posts cost money, after all!) but as you point out, it may not be linear, as the more desperate people are to hire, the more they’ll post.
People were appalled by the Twitter firings, but seriously, how many people does it take to run an old, established (micro)blogging platform?
I don't know the number, but the number before the firings was too high. From what I've gathered watching from the sidelines, they've probably now overshot too far below the correct number.
However, the lesson to learn is not "never fire people". Yes, of course it sucks when you lose your job, especially if it was through no fault of your own, just because some middle-manager jack-off got too big a budget one year and decided to hire everyone with a body temperature above room temperature. But we need to start culling the herd.
We have too many noobs who are just wreaking havoc across our code bases, making products and user experience worse. Hardware gets faster and faster every single year, but end users don't see any benefits. We see some benefit since we are on the top of the food chain enjoying our $5K max specced Macs, but even then we should see much faster and way more reliable software than we see today. It is so fucking sad when I try to load a website and it takes 30-60 seconds for all the javascript bloat just to get to me and I'm even running all kind of blockers so I won't be getting nearly as much bloat as a normal end user.
Tech used to be cool. Now it is just money all the way down. Just think how fucking cool the current AI wave would be without all the neutering that is going on since none of the big companies dare to take a risk and actually let people have even a bit of fun.
Are you seriously trying to put the blame for software bloat and bad UX on "noobs" implementing things poorly?
Popular web pages being choked with ads and modal banners is the result of business decisions to put those there. Jira being notoriously laggy and confusing is the result of prioritizing feature breadth over UX. Apps being built in heavy frameworks is the result of founders and CTOs deciding that they can ship faster and be more likely to make it to the next funding round if they color within the lines than if they focus on speed. Major company products often being poorly designed is the result of organizational fragmentation and lack of a global perspective.
Does everybody always do the best job they could within the constraints they have? Obviously not. Are the problems with technology in general primarily the result of _bad engineers making implementation mistakes_? Absolutely not!
In the last 20 years, if there is one thing that really changed in this profession, it's the amount of leeway developers have to criticize product and UX decisions. We're approaching "none".
The amount of political capital you have to spend just to avoid bloat is not available to most senior developers, let alone to "noobs".
It seems like you're making a point of outrage out of something that's evident. If you hire way more than the market is used to, you'll hire less-qualified people, and unless tooling improves enough to compensate for the reduced skills and experience, lower-quality software will be produced. I agree the OP was being rather dramatic about "noobs ruining everything", which is a stretch, but this negative effect should be self-evident.
It's not actually self-evident: even if you take for granted that less-qualified people get hired, that could mean that companies invest more in training, or that the marginal bit of software is equally good but slower to produce than if more-qualified engineers had written it (because it took more code reviews and produced more technical debt along the way), etc.
Besides, that's barely pertinent to my point, which is that if you see bugs, you can probably blame them on engineers, but if you see design and product choices you don't like, attributing it to "noob" programming is bizarre.
>Are you seriously trying to put the blame for software bloat and bad UX on "noobs" implementing things poorly?
>Popular web pages being choked with ads and modal banners is the result of business decisions to put those there
Who do you think implemented all that shitty ad and banner code? It sure wasn't a professional. If ads had initially been implemented by competent programmers, they wouldn't have contained nearly as many avenues for abuse and actual malware-spreading mechanisms.
What's more, professional programmers can pick their work. Noobs have to settle for writing shitty ad code.
>bad engineers making implementation mistakes
Most of these noobs aren't even "engineers"; they're a bunch of near tech-illiterate "normies" who have just graduated from a two-week code camp. These are the people who unironically say "my job is just copy-pasting from Stack Overflow". I am sorry to say this, but you (whoever is reading this and hard-disagreeing) might be the problem.
> but seriously, how many people does it take to run old, established, (micro)blogging platform?
How many people does it take to run an international advertisement platform with a microblogging service?
I mean, it is all a matter of framing here. I hardly believe most of Twitter's staff were developers and systems people; many were people who talk with potential customers, deal with specific problems that arise from being present in 200+ countries, and so on. If you're doing that seriously, it takes a certain number of people.
Also: of course a company can downsize. But it matters how it is done. Defending the disregard for human lives in the way this was done wins you nothing, except the chance to be subjected to similar methods yourself in the future.
That aside: tech used to be cool in a time when tech people didn't have to care about any of that, when it was just us building our Lego towers. But now Lego towers run the world, and building them the same way, with disregard for the outside world, would be truly deranged.
Panicked demands to literally print out code, along with other arbitrary hoops that sometimes materialize; encouraging employees to sleep at the office; a general climate of fear. The list goes on.
Engineers were hired for growth. Just maintaining a platform will result in the slow decline you are seeing at IBM. You need a culture of innovation to tap into new opportunities and for that you need to have engineers that are not working on just maintaining the core platforms.
I agree that the balance of tech hires was skewed in recent years. That was mostly because tech companies naively assumed the COVID hyper-growth was sustainable.
> how many people does it take to run old, established, (micro)blogging platform
Not everything scales. Compliance doesn’t, for example. You need people to handle compliance across regions. That means lawyers, product managers, and ultimately engineers.
There was a tech bubble, period. Still is. Lots of tech with inflated value, driving massive hiring that would never result in enough value to justify the jobs.
I know multiple companies hiring for teams with dead or dying projects. They hire anyway because if they don't they'd have to explain to the higher ups how the projects those higher ups green lit are useless. They need more people to make it seem like the stupid shit they're doing will generate revenue, when they know it never will. Keep the gravy train going as long as possible.
Most of it was/is around other bubbles, like the crypto bubble, AI bubble, B2B bubble. Crazy ad spends have been making it worse. The pandemic inflated some sectors and deflated others, but mostly tech was fine while other industries floundered.
I'm not convinced that we are in an AI bubble. In June 2007, Apple released the iPhone. Its stock price was $4.45 adjusted at that time. A year later, it was $6.50. 15 years later, it's now $153.
The iPhone is a real product, though. AI is more like an automated fortune teller. Just because your fortune sounds really convincing doesn't mean you should use it to buy lottery tickets.
I think you just haven't seen the potentially revolutionary applications.
My side project is developing a tool to help me learn Chinese. Last night I asked gpt-3.5-turbo to:
1. Take the existing text which had Arabic numerals in it (e.g., "2006年" for "the year 2006" and "30万" for "300,000") and replace them with Chinese characters as they would be read (i.e., "二零零六年" for the year, "三十万" for the number)
2. Segment the text with some specific parameters (i.e., segment "二零零六" as four separate words, rather than one big word)
3. Convert the whole thing to pinyin, with specific parameters (don't group multi-character words and don't capitalize proper nouns).
I got the output I wanted with 5-10 minutes of playing around with it; I could theoretically use those prompts to process my entire library of study content. Do you have any idea how long it would take me to code something like that up?
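To give a feel for why hand-coding this is nontrivial, here's a minimal Python sketch (the function name is hypothetical) of just step 1, and only its easiest case: years are read digit by digit, so "2006年" becomes "二零零六年". Quantities like "30万" need genuine number reading ("三十", not "三零"), and the segmentation and pinyin steps need dictionaries on top of that, which is why handing the whole job to a single prompt is attractive.

```python
import re

# Per-digit Chinese readings, used for year-style numbers (2006 -> 二零零六).
DIGITS = {"0": "零", "1": "一", "2": "二", "3": "三", "4": "四",
          "5": "五", "6": "六", "7": "七", "8": "八", "9": "九"}

def year_digits_to_chinese(text: str) -> str:
    """Replace year-style digit runs (e.g. '2006年') with their
    digit-by-digit Chinese readings ('二零零六年')."""
    def repl(match: re.Match) -> str:
        return "".join(DIGITS[d] for d in match.group(1)) + "年"
    return re.sub(r"(\d+)年", repl, text)

print(year_digits_to_chinese("2006年"))  # 二零零六年
```

Even this toy only covers one numeral convention; each additional case (quantities, ordinals, dates) would need its own rules, while the prompt-based approach handled them all in one go.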
> Just because your fortune sounds really convincing doesn't mean you should use it to buy lottery tickets.
You shouldn't buy lottery tickets anyway (and ChatGPT will tell you so, if you ask it). I have difficulty understanding where this negativity (not speaking of other downsides) regarding AI comes from. Yes, it talks BS sometimes, but so do people, all the time. And no, it's not the solution to all our problems, but for some things it is damn useful.
I think there’s negativity because it gets hyped too much.
We were supposed to have self-driving cars all over the roads 2 years ago. We still don't have anything close, but now people are all "How is the way AI works any different from the way the brain works? Looks the same in my non-expert opinion!".
> I think there’s negativity because it gets hyped too much.
You can only recognize the true revolutions in hindsight. I remember well the people who said the Internet was overhyped and just a novelty that would go away. In hindsight, it was not hyped enough. My personal opinion is that what we see in AI is more like the early Internet than like the hype about self-driving cars.
What I see more as an issue is that some people expect AI to solve all their problems, and when they are disappointed they overlook that it solves some problems better than anything we had before.
This is true. I think it was hard to reason about how important the internet would be because it was expensive to get on and it needed Google to make discovery exponentially better. It took a long time for access to become cheap enough to be almost ubiquitous in a developed country.
ML, AI, whatever, is clearly usable now. But it has limits. And I think it's clear that it's not fully baked relative to the promises. We certainly don't have a Killer App yet.
What we have isn't good enough, and I say that as someone who in 2009 expected cars that didn't have or need steering wheels to be purchasable by normal people in 2019[0], but the tech does exist and can drive you around.
(And derive you around, as autocorrupt decided to inject a suitably ironic twist that I almost didn't spot).
> but now people are all “How is the way AI works any different from the way the brain works. Looks the same in my non-expert opinion!”.
In general: yes, totally. (Though I've only heard that specifically being used for "why stable diffusion is no worse than art school WRT copyright issues").
Always the way, though: "Consciousness is mysterious, quantum mechanics is mysterious, therefore they are the same thing".
> You shouldn't buy lottery tickets anyway (and ChatGPT will tell you so, if you ask it).
I was curious so asked..
ChatGPT:
As an AI language model, I cannot endorse or encourage gambling activities, including playing the lottery. However, I can provide some information about the lottery and the best strategies that people use to increase their chances of winning:
- Play regularly: The more you play, the better your chances of winning. But remember, playing the lottery can be addictive, so it is important to set a budget and stick to it.
- Play less popular games: Less popular lottery games tend to have smaller jackpots but better odds of winning.
- Join a lottery pool: Joining a group of people who pool their money together to buy lottery tickets can increase your chances of winning.
- Choose your numbers carefully: Some people believe in choosing numbers that are meaningful to them, such as birthdays or anniversaries. Others prefer to choose random numbers. It's important to note that the odds of winning are the same no matter what numbers you choose.
- Consider the odds: It's important to understand the odds of winning the lottery. The odds vary depending on the game, but in general, the odds of winning the jackpot are very low.
It's important to remember that playing the lottery is a form of gambling, and there are no guarantees of winning. It's important to play responsibly and never spend more than you can afford to lose.
I also asked it before I wrote my comment and the answer I got was much more diplomatic than yours. I used the brand new GPT-4 and here is the answer I got:
As an AI language model, I don't have personal opinions, but I can provide some information to help you make a decision.
Buying a lottery ticket is a personal choice, and it depends on your reasons for wanting to purchase one. If you're buying a lottery ticket for fun or entertainment, and you can afford to lose the money you spend on it, then it might be an enjoyable experience for you. Keep in mind that the odds of winning a large lottery prize are extremely low, so it's important to have realistic expectations and not rely on the lottery as a means to solve financial problems or achieve long-term financial goals.
If you are considering buying a lottery ticket as an investment or as a primary means to improve your financial situation, it's essential to understand that lotteries are not a reliable or sensible investment strategy. The odds of winning are incredibly slim, and you are much more likely to lose money than gain any significant returns.
In summary, if you want to buy a lottery ticket for fun and can afford the expense without any negative impact on your finances, go ahead. However, if you're relying on the lottery to improve your financial situation, it's important to explore more reliable and effective methods for achieving your goals, such as saving, investing, and financial planning.
I thought it fairly obvious that the poster you're responding to was making an analogy about using GPT/LLM models for decision-making, not actually endorsing playing the lottery.
And they're correct. Don't use generative models for extractive tasks. That's like using the handle of a screwdriver as a hammer. Sure, it might work for a while, up until there's a screwdriver in your eye.
Good point, and I wholly agree. It's not only that GPT/LLM models are not the solution to everything, as some like to pretend - some applications are even potentially dangerous.
You're saying this as if people weren't saying worse things about the iPhone than you're saying about AI. Steve Ballmer said it was an expensive piece of junk without a keyboard that nobody would want. You just have to google iPhone and iPad criticism from that time period to see that you're having your own Ballmer moment. Whether you're right or not, only time will tell.
I use ChatGPT, GitHub Copilot, Midjourney, Fireflies, Cleanup.Pictures, and a host of other AI-based tools in my daily workflow. I pay money for much of that. Anyone saying that these tools aren't products are, to be frank, deluded.
For some reason, people are equating inaccuracy with redundancy. I can't fathom why people do so.
I have a copy-writer, editor, artist, co-coder, photo editor, reporter, all at my finger tips. I need to give them advice on how to iterate their designs, and I know not to ask them for explicitly "correct" information, but they save me hours of my life, and hundreds in hiring costs. I can get templates, corrections, suggestions, tips, etcetera, in seconds. It's incredible.
Most of my non-programmer white-collar friends use ChatGPT daily for their work. They are aware that it hallucinates stuff, so they don't ask it factual questions. Instead, they use it to transform existing data. It's wonderful at that task and saves them a lot of hours.
If you want to use that logic, why aren't we talking about banana farming? Also sold, also a product.
The biggest difference is that one is an ongoing service of questionable utility to most people in its current form. The iPhone was pretty instant. I have an iPod. I have a cellphone. Now I have one device. ChatGPT can't really help me on the day to day right now. I'm sure as hell not paying for it. In 2008, if I could have paid for it, the iPhone solved a real thing I was facing and I would have purchased it.
You might not pay for GPT-4 but my employer would gladly fork over many thousands for each of our tens of thousands of employees if we could train it on our own dataset and run it in-house.
"In its current form" is also a weasel word. GPT-4 will replace hundreds of thousands of jobs over the next few years. It's already disrupted search, it'll soon disrupt most fields.
I see the iPhone being used by everyone. Trains, buses, even beggars have iPhones. Do you honestly think AI will have that level of penetration? As in, not piggybacked on another platform? That’s the only way to extract a lot of value.
There are a few nice improvements here and there, like iOS handwriting recognition (similar functionality existed before AI) and clipping pictures, but those products have existed for decades and AI only brings a few incremental novelties.
On the other hand, algorithmic feeds and ads have made social networks way worse than they were a decade ago, and spamming is now on another level thanks to GPT.
So for now the positive contributions of AI have been tiny and balanced out by the negative ones.
• A photos app with an AI that identifies image subjects and allows search by content
• Computational photography
• 17 language translation built-in, more if they get Google Translate
• OCR when you tap on an image containing text
• Voice-to-text transcription in the default keyboard
• Fall detection and step counting powered by machine learning
I think AI is likely to be like "3D" or "multimedia" in the 90s: it will be a sales buzzword until it's everywhere and therefore no longer a differentiator.
3D acceleration itself is still a big market, it's just that when everyone had a GPU (unlike my Performa 5200), the marketing shifted to which specific brand — NVIDIA, AMD, or whatever.
I'd say a fundamental expansion in how much people use computers is what made smartphones a big shift. Maybe conversational generative AI will have the same effects, though I'm not too sure about that; people already use computers a lot.
Of course it was a bubble; 2021 was nuts. People with just a few years of experience were getting senior jobs earning hundreds of thousands of dollars, then leaving soon after for a new job earning even more. All the while WFH doing sweet FA. I've never seen anything like that job market.
I’m baffled how so many smart people bought into the “new normal” narrative and believed that our pandemic era weirdness would last forever.
It was across the board. People in my hometown set up restaurants, cafes, coworking spaces - businesses that don’t see an ROI for a few years at least - all because they thought the young people who had returned home during the pandemic would stay there forever.
Of course, now almost everyone has gone back and these businesses are dead or struggling.
No they aren’t. They got paid large joining bonuses and 4 months of severance. They probably made 2 years' worth of money in a single year. I’m recommending most of my friends at FAANG and FAANG-adjacent companies go join a late-stage startup that has high cash comp, since the equity value is pretty shitty everywhere right now. Once the market improves in a year or two, their resumes will start getting shortlisted again because they have FAANG on them.
I think the days about caring about your employer are now in the past, given how companies have treated employees.
For the love of god, you can get actual senior engineers from Europe for hundreds of thousands of dollars. It's baffling that US companies are not doing it.
Every mass layoff article posted on HN (Microsoft, Google, Meta, etc) had people posting comments showing the staff counts sometimes doubled in size in the few years prior, then were reduced via layoffs by 1/5th (or less) of the pre-growth number. A bubble would typically cannibalize all the growth, not a small subset.
I can see how an overeager company that bought into its short-term growth numbers as long-term realities might overhire during unexpected growth, when you don't really know what the upper limit is.
A short term (relative) explosion in demand growth can put businesses in a difficult spot where they can't handle the growth presently and they don't know if the 10% increase will turn into 20%. Hiring is a longer term investment you can't just turn on/off since on-boarding and training is a big investment, and humans are involved, so there's a fundamental timeline disconnect.
There's also the flood of capital from the public markets. Even before COVID tech stocks increased quite a bit.
This is pretty different than a "bubble" which is a completely fake boom that explodes 100% then falls back to zero when people realize it's bullshit. This is more like a boom that goes from +20% to +10%. Which yes still ultimately left a sizeable group in the crosshairs.
Finally someone who isn’t over eager to catastrophically moralize the situation instead of being level headed.
I was a 9-year FAANG veteran whose pay grew 70% without my input over Covid. Everyone was hiring except me; I felt this would cool. I tried to stay level headed and build a money-making division. I succeeded. And was laid off for my efforts.
So I hope the people who like telling people like me we were just dumb idiots overhyping ourselves never get a taste of how things really work.
I don't know the specifics but think it is distinct, since there's a pool of labor that people go in and out of rather than through a sequence of steps. Bullwhip might describe connecting labor to education, though.
Still counts as a bubble. The bulk of the tech hiring done during that period wasn't just to meet the increased pandemic demand but rather based on lofty corporate projections saying that the demand was here to stay. But then a couple years later the exponential growth didn't continue, didn't slow down, didn't even stabilize – it completely reversed back to pre-pandemic levels. And then all those companies found themselves with significantly higher payroll costs than they could afford.
Not every company is Google or Microsoft. You have to remember that the vast majority of the ones that had layoffs, including $100B+ "unicorns", have never made a single dollar of profit in their entire existence.
If there weren't protests, then we would still be locked down. Government loves lockdown. It is wrong to refer to it as pre-pandemic as opposed to pre-lockdown. But you probably are still wearing a mask in your car when driving by yourself.
One feeling I'm stuck with after reading the comments here is that every commenter is "above average". It's "the others" who are the bad developers, just doing plumbing, or changing frameworks every month, or just cashing a paycheck. What I'm doing is of course good and valuable work...
To be fair, I suspect there is a correlation between being interested in things like HackerNews and being an above-average developer. I've met some pretty poor developers who couldn't really give a stuff about innovations, best practice, etc, they just churned out mediocre code they'd learned at university, or copypasted, then clocked off at 5pm.
I get what you're saying, but there was an equally unconfirmed supposition in the parent comment that all types of developer (good and bad) were equally likely to use Hacker News. I'm just saying there are at least some data points in a small anecdotal sample that did not confirm this equal distribution and so it might be worth questioning whether that is true. Sure, to be certain, we'd need to actually measure it somehow.
I am half kidding; probably we are on the same page. I tend to ask where people get their technical news fix during SWE interviews and observe a strong correlation between the quality of answers (won't speculate on which sources are better than others) and the strength of developer skills.
I mean, if you just use the metric of "care enough about my industry to follow news and discussion threads about it", HN is already going to filter bad out somewhat, I'd imagine.
I don't assume it's the top half only, but it should weed out the worst kind, right?
I wonder if we’ll go through an irrational counter bubble. Where the prevailing wisdom will say “don’t hire too many devs, they create trouble with their activism” and “look at what <startup> did with 10 people” and “with copilot/ChatGPT we don’t need hardly any devs”
I'm honestly surprised this hasn't been more prevalent. Freshdesk [0] did this way back in 2011. There's certainly no lack of talent in India, I'm not sure if it's a cultural or systemic issue that has prevented Indian startups from getting more traction.
India has seen some of the most exciting growth in the startup sector recently. If anything they're going through a massive bull market, buoyed by the modernization of their online infrastructure. Their digital finance (UPI vs Zelle) and digital identification (Aadhar vs SSN) systems are the envy of other countries now.
I think just about anything would be an improvement over SSN if the system were actually designed as a national identification system instead of being improvised into one.
There is a lot of talk about Google as the role model of what not to do. Their employee base became a center of activism that bullied the management around. Not much came of it at the time because the company continued to generate tons of cash. But now everyone is seeing how mismanaged Google has been and likely distracted.
It’s not the devs' fault, but rather that of bringing in too many people who believe the workplace is a political place.
> too many people that believe the workplace is a political place.
I heard a kid ask what politics was all about. It's an interesting question for anyone to stop and think about.
Everyone will come up with their own conclusions, but I personally would ask what in a workplace is _not_ political ?
From hiring practices, environmental impact, customer/community management (minorities, disabled people, freedom of speech, etc.), social mission, where the funds come from, where the lobbying money goes... your argument could be that employees should ignore everything a company does except what they are explicitly ordered to care about, but that doesn't look like what we socially expect from employees, and people will be personally affected by how their company behaves as a whole.
I'm not arguing for extreme activism everywhere you work, but saying the workplace is free of politics goes way too far on the other extreme.
> From hiring practices … [to] where the lobbying money goes.… What in a workplace is _not_ political?
These are all great examples of "not your job" precisely because the people responsible for these concerns are the owners, and by extension the board, and by even further extension the "officers" appointed to concern themselves with these issues.
To take just one role as an example, it's certainly within a tech lead's purview to write a memo if she observes something that can be improved, but much beyond that and I think the tech lead would be straying from their mandate.
No matter how good their intentions, they're not going to be effective because they're not charged with those duties and so they rarely have the tools and authority to implement the changes they might be advocating. That lack of agency leads to frustration and discontent, and it distracts those looking to you for direction.
What is _not_ political as a tech lead? Making sure tickets are scoped well. Identifying poorly tested parts of the codebase. Anticipating future requirements and designing systems that will accommodate them. Setting expectations and ensuring that your team meets them, such as good commentary, an appropriate level of testing, or the quality of code reviews given amongst your team.
Of course, you need to be a good leader to accomplish those things, and there's a political aspect to good leadership, but those duties don't have much to do with, say, the environmental impact of the company.
To be clear, I'm not saying any of the causes you brought up aren't important, but I do feel like a lot of people have taken up those causes in venues where they're personally unlikely to be very effective (and content).
On the other hand, perhaps I have been formatted by being in too many small/start-up companies, but my experience is that there is no such thing as "not your job". You can entrust someone to do that job while you're working on something else, but if you realize that a tool or process is broken, you should probably do something about it.
There definitely is such a thing as "spending way too much company time on politics", though.
I don't disagree with you. Maybe a more nuanced way of putting it than "not my job" is to ask "what is my role to play here?" In a smaller company/start up, your role might well be to look for things that need doing and do them.
> To take just one role as an example, it's certainly within a tech lead's purview to write a memo if she observes something that can be improved, but much beyond that and I think the tech lead would be straying from their mandate.
It's trickier than that, I think.
Imagine, for instance, that as a tech lead you realize the resumes of some specific minority all get shut down by HR before interviewing. As a tech lead you make a polite inquiry, and they tell you nothing's wrong, that they have their reasons they can't tell you, and that you should mind your own business.
You could argue any move from there could be outside your mandate... but as a tech lead you're told to bring the most technical value possible to your team, and hiring the best people fits into that. So you'd talk with some colleagues about how you think it makes your team worse that you couldn't hire that specific person you pushed as a candidate. The discussion extends to hiring criteria in general. More people come to you to share ideas. And now you have a group chat about hiring ethics that makes management uncomfortable.
> ... as a tech lead you're told to bring the most technical value possible to your team.
This feels like a rationalization for a crusade.
This falls into the "write a memo" bucket. You notice a trend (like all applications by a specific minority get shut down). You document it by pulling a report. You identify the potential ramifications. Hit send. If it gets round filed and the trend continues, it's time to vote with your feet and explain why in your exit interview.
I think if you start bringing it up with your colleagues after you've been told "there are reasons and mind your business", and you encourage a conversation about the company's hiring criteria, and finally start an informal, unsanctioned working group on a matter you have no authority to change... You've definitely exceeded your mandate and distracted your colleagues.
It's only political because of the choice to build a faction and enter into a power struggle with HR, vs. calling attention to it so that those whose job it is to worry about it can worry about it, if they weren't already aware.
> This falls into the "write a memo" bucket. You notice a trend (like all applications by a specific minority get shut down). You document it by pulling a report. You identify the potential ramifications. Hit send. If it gets round filed and the trend continues, it's time to vote with your feet and explain why in your exit interview.
Thing is, I don't think this attitude would apply to many other subjects.
For instance, you notice your company has difficulty hiring, so you talk to HR about getting more visibility at tech conferences. They drag their feet and don't want to bother, but you start looking around, discuss internally, and see the idea has traction and people are willing to volunteer to do it, so you come up with a realistic proposal for meetups on Friday evenings. The plan is greenlit by your boss, you talk to your office manager, get the ball rolling, and 6 months later your company officially has a meetup event every month, and everyone's pretty happy you did it.
So where's the line between expanding the company's hiring practices and expanding the company's PR practices?
My point is, at some level (I assume "tech lead" is not some grunt worker) you're supposed to interpret your mandate as broadly as it still makes sense from a practical point of view, and will be rewarded for moving things in the right direction.
Saying "getting the right people for the job in your team" should stop at the memo level doesn't fit my experience of what is expected from that kind of role.
Everything is also math, or physics, or chemistry, or biology, or computation, or philosophy, or economics, or psychology.
It's not profound to put on blue-tinted glasses and marvel at how blue everything is. The only thing it tells me is the POV you chose for yourself.
"Everything is politics" is usually used as a wedge against uninterested people. Either you join my cause, or you're responsible for all current evil. It's certainly not an enlightened neutral observation. It's a call for culture war.
Ex-Googler here. I'm kind of surprised Google leadership lets themselves get bullied so hard. E.g. when Diane Greene lied several times about the nature of Google Cloud's involvement with the Air Force, a ton of employees complained, but it's not like they quit the next day. The retention numbers were always very smooth curves seemingly unaffected by the drama of the day. If you went around and talked about the latest thing, 80% of employees had no idea what you were talking about.
It's kind of cynical, but if Google management doesn't want activism, they should ignore their employees, wait for the activists to burn themselves out and quit, and then be happy with all the paycheck collecting cogs that remain.
It's because they don't "get bullied". They _want_ this kind of stupid identity activism and encourage it from the top with their DEI propaganda officers. Any "activism" and identity outrage is an opportunity to make it look like they're doing something "progressive" without actually doing anything meaningful (i.e. expensive) and precludes any material change that would actually help people and threaten the upper classes' power or wealth. The instant that "activists" aren't useful to them anymore, they get axed immediately. Just look at Microsoft's AI ethics team. An entire team dedicated only to whitewashing their corporate image.
I don't think this is true. Why would they spend millions of dollars making elaborate surveillance and censorship tools for a Chinese version of Google search if they wanted everyone to accidentally find out about it later and throw a hissy fit? Why would they try to make deals with the Air Force, DoD, and ICE if they want their employees to sour on them? Why create an underclass of contractors if they want their employees to advocate for them? There have been so many political conflicts between employees and management, and they mostly end up in the NYTimes.
DEI is kind of a different beast. Largely, the employees and management are aligned. There isn't a political war going on.
The AI ethics teams and DEI teams are often powerless appendages that make it seem like they are doing something, but they aren't activists. They are just doing their job.
Not all of the office politics is in their favor. I think modern companies have realized that it's much easier to refocus employees' attention than to outright forbid any dissent. Surveillance and contractor abuse are actual issues that materially affect people, so I suppose it's good that the employees called them out on it. Other issues that get pushed by upper managers... not so much. Also, I don't want to say that all the DEI officers and upper management are conspiring to evilness. Many of them do believe in the cause, but in the grand scheme it works out that way. Political parties certainly know how the ball rolls and abuse identity politics to the maximum extent. Private companies are then all too glad to go along with that, even if the well-meaning employees don't know what they're really doing.
> they aren't activists. They are just doing their job
But they try to make their job some sort of activism (especially academics who should strive the most to be unbiased), which usually leads directly to discrimination in some form or the other, since that is the only thing they're allowed to do.
A lot of people were talking 5-10 years ago about what would happen to Google when the original devs cashed out and mediocre company suits took over. We are living through that reality now. For the first time in 25 years Google actually looks vulnerable- the opportunity to win in internet search is up for grabs now in a way that it hasn't been since the '90s.
And when the new search giant steps onto the scene, you can bet that they will be paying top dollar for their devs.
Google's wounds were self-inflicted and totally public: introduce a product, let it linger for 2 years, kill it, over and over again, dozens of cycles of this.
No political explanation needed, just very obvious senior level mismanagement.
There is a ton of noise in data like this. Also, large companies don't hire the same way and for the same reason as smaller companies.
1. A lot of job postings are fake. They are meant to keep recruiting pipelines open. There can be a long delay between posting an ad, getting applicants, filtering them, doing phone screens, doing interviews, doing team-matching, extending an offer, negotiating that offer, having it accepted, and that candidate starting. It can easily take months. You may start recruiting before you have a position to fill. Or you may hire proactively;
2. You can't discount the effect of visa job postings. This is where you have to "prove" you can't fill a job with a US resident or citizen. One thing you have to do is post the job. There's a whole cottage industry in posting these jobs in obscure places (e.g. newspapers, cork boards in out-of-the-way corners of an office) and in finding legally valid ways to reject applicants.
3. Smaller companies will have positions to fill. Larger companies have hiring targets and budgets, fill positions, and then figure out what to do with them.
4. Part of all the above is large companies definitely over-hire as normal practice in case people are needed for something and to deny those people to other employers.
What we're seeing now is a reduction in that over-hiring buffer plus some virtue signaling layoffs all while having a useful opportunity to attack labor costs. It's a double bonus.
When we talk about tech hiring we often focus on demand and don't think about supply. The number of IT/software graduates has been steadily increasing as well. And they have to work somewhere. If there are no jobs these new grads can "flip burgers" or something, but more likely they'll find someone willing to hire them to crank out code. Of course the cause and effect are intertwined. More code means more apps get made, which means more investors think the market will keep going up, which makes them invest. That's the "bubble".
I've been coding for 10 years. The coding profession has been a thing for maybe 40 years. The industry still hasn't worked out how to plan for these labor shortage and surplus cycles.
What I've noticed, at least in New Zealand, is that a lot of the work being done is replacing legacy systems, with many organisations having kicked the can down the road only to reach the point where they've run out of road. The IT cycle appears to come in waves: moments of transition, then things die down as organisations move to maintenance mode until the next technology arrives and a new transition begins. A good example is how browsers have become more sophisticated, making it possible to push tasks that used to require bespoke programs into the cloud. At my workplace, for instance, we moved to using Amazon Connect via Chrome, went from a locally run Exchange server to Google Workspace, and migrated macros used in Microsoft Office into Google Docs. It'll be interesting to see whether there is a bounce back as the big players embrace AI, or whether it'll be about improving what already exists rather than entirely new software platforms.
> And they have to work somewhere. If there are no jobs these new grads can "flip burgers" or something, but more likely they'll find someone willing to hire them to crank out code.
If there are no jobs, then how? The market doesn't have to use up the entire supply of skilled workers. It's not going to create jobs just to give people jobs.
1) create enough of a safety net or give people enough cash that they can start businesses taking advantage of the increased productivity of ai
2) austerity: let them get fired from jobs in failing industries with no safety net. They will not have the opportunity to create businesses, and they will work in crappy, tedious jobs a robot or AI could easily do; because companies can afford cheap humans and don’t have capital for robots, they will hire people desperate to do anything to keep their families from homelessness.
No they don’t. Plenty of professions generate way too many graduates. Law is a good example: probably 2-3x as many graduates as are really needed.
It leads to a bimodal distribution of salaries. The big firms pay 200k to grads with decent grades from good schools. But a lot of people can’t even find a 60k job in law.
I think it doesn't nearly tell the story though, the tech market is highly stratified and specialized.
Was there over-hiring of sales people, HR, middle managers, and underqualified engineers? Yeah, definitely. I saw bartenders becoming "tech recruiters" or "front end engineers" overnight with no formal technical education. I'm not a credentialist, but I am a realist, and to most companies the safer bet is someone with a BS in CS.
Is there still massive demand for experienced and qualified engineers? Absolutely. I'm cautiously optimistic about the future of tech. There is a massive flight to value that usually accompanies turmoil in the markets, and it's time that tech focused more on value.
That's a weirdly-specific conclusion to make when even the article's own graph shows that general job postings stagnated at the same time programmer and broader IT postings stagnated. Tech jobs declined sooner, but it looks as if general job postings are starting to decline as well.
It's becoming increasingly apparent that we're either on the verge of a recession or already in one. This is more than a particular job sector being a bubble; the whole economy is arguably a bubble.
General long term workforce participation by those not in specialized or career fields has been overall declining since the mid 2000s. 2012-2017 saw a spike in specialized field jobs thanks to online media, but career positions continued to decline in overall numbers while still being occupied [1]. In general labour the trend has been to reduce staff sizes, drive more hours per day per worker, and reduce overall weekly hours [2]. Which naturally results in fewer job positions.
There is no recession coming up, because the truth is we never escaped the one that occurred in 2008 to begin with. We've just been covering it up.
Career positions in government and finance were stable in hiring and retention if slightly reduced month over month, while non-career positions such as retail reduced in number.
No, you're not supposed to notice. I mean, just last week there was another "tweak" to the CPI calculation[1], but we still came in absurdly hot at 6% YoY. It's about the 4th change in two quarters, if I'm remembering correctly.
> Starting with January 2023 data, BLS updated weights annually for the Consumer Price Index based on a single calendar year of data, using consumer expenditure data from 2021. This reflects a change from prior practice of updating weights biennially using 2 years of expenditure data.
Well, a lot of small corrections seem better than a few large ones to me. Maybe it captures some noise, and it's worse, but I don't think one year of data is small enough to go past the optimum.
It is healthy to have some amount of mistrust of official data, but those adjustments are very well defined, and there's no evidence that they deviated from the usual procedure on making them. It's not them happening that taints the data.
We're in an employee shortage, recession or no recession, the largest segment of workers in the US ever (the boomers) are retiring and the largest cohort just turned 65 the last quarter of 2022. They've been squawking about the impending recession since 2015 and here we are, they keep saying it but it doesn't happen.
> They've been squawking about the impending recession since 2015 and here we are, they keep saying it but it doesn't happen.
Mostly because the Fed has been creative about keeping the economic bubble going and preventing the numbers from "officially" indicating whether a recession has started. Those folks "on the ground" are more exposed to the reality of the situation: wealth inequality is expanding, wages haven't kept up with rents in decades, inflation and plain old profiteering are resulting in higher prices for worse quantities/qualities... for anyone other than those at the top of the economic ladder (who are largely insulated from the effects of their own economic policies), it's readily apparent that the current economic system is unsustainable, and it's entirely a matter of "when", not "if", the Fed and the broader financial system runs out of tricks to maintain our suspension of disbelief.
True, but tech has grown a lot in the past 30 years, so the worker age distribution skews younger. At older tech companies that are big, boring, and still do things (Intel, Cisco), it probably skews a little older.
A chart I saw shows that despite massive layoffs recently, total Microsoft headcount has only just returned to pre-COVID-19 levels. On the long-term trend from 2008 to present, it's still a significant increase.
That's nice. But it doesn't help those of us with lots of experience and actual previous work ('not just sitting around/overhired') who lost our jobs and now can't find anything.
The odd thing is that it's not (just) these recent "overhires" getting laid off, but a lot of us more experienced employees, as we are more expensive.
Can somebody explain what's happening? If I just scroll a bit, then try to go back I get another graph. Then if I try again to go back it works. It's probably accidental, but I'm curious what is causing this behavior.
It would be interesting to see some stats on the types of engineers that have been laid off in recent months, unfortunately, I don't think there's any data that really breaks it down to that level of granularity.
Postings don't necessarily correspond to hires, but where did all these people come from?! I can't imagine there were that many more CS grads around that time, or that a lot of CS grads were immediately going to a different job after graduation.
Not sure I agree with the "bubble". If something decreases during a crisis it doesn't automatically mean it was a bubble before that crisis.
And if it ever was a bubble, it was just a FAANG and startup bubble, but the world of tech is far, far greater than that. That guy that writes Java for a tyre manufacturer is a tech guy. So is that guy that writes C# at a bank.
I feel BLS is a bit of a lossy dataset - their classification for professions is a bit meh (CIS for all types of software-adjacent roles, for example).
USCIS datasets might be useful as a proxy, but anecdotally, white-collar immigration to the US basically tanked from 2020 onwards due to a mix of COVID restrictions and xenophobia, so I'm not sure if such a proxy would be useful in the 2020-2023 timeframe compared to anytime between 2010-2020.
Was there a tech-hiring bubble, or has everyone just realized that posting on Indeed gives you garbage results, and has adjusted their posting activity accordingly?
This was confirmed for me back in 2021 when a personal trainer at my gym decided to do a quick coding boot camp and immediately got hired in the industry.
Tech companies have done a lot of aggressive hiring expecting a rise in business, but with the economic downturn coming, the business opportunity is dwindling, so they have to cut costs. And of course, as the corporate logic goes, instead of reducing pay or withholding certain benefits of the top-paid executives, it's the lower-paid employees who are expendable and hence on the firing line first.
We are overpaid for incredible working conditions and devs basically became capricious divas, despite the fact 90% of them are plumbers, and many not very good ones.
If you had any other professional doing the same - wasting as many resources as us, changing part of the tech stack every month, debating vocabulary on twitter ad nauseam instead of coding, and whining about how their first-world problems should be the focus right now rather than doing their job - they would get laughed at.
But we were incredibly lucky that IT is the most amazing productivity cheat code humanity has come up with so far, so that all this BS was accepted as the cost of doing business.
Well, here is the wake up call.
No, we are not paid to rate the best cappuccino in the valley, to convert the most stable software in your org to Elm, nor to write a commit hook so that nothing can be pushed before the diversity committee has validated the change set.
We are paid to solve problems.
If you don't solve problems, when the hard times come, and they always do, you become part of the problem.