
Wait, so if we give the foreign workers the same at-will employment rights as Americans, then they are no longer interested?

I thought they needed these foreign workers because no American could do the job?


No, what they wouldn't be interested in is paying $100,000 to help someone enter the country, with no compensation if they ditch you on day one.


The idea would be that you pay that employee above-market rates, so they wouldn't ditch you on day one, because you'd be paying them more than any of their other alternatives.

Right now, the H1B system is used to bring over cheap labor willing to work for compensation and conditions worse than native labor will accept. That is not the stated goal of the program; the idea was to bring over highly skilled labor to do jobs that no one native is able to do. The system detailed above is supposed to be a way to change the program from how it currently works to what it was supposed to be.


It should be marshaled into {} by default, with an opt-out for special use cases.

There’s a simple reason: most JavaScript parsers reject null. At least in the slice case.


Not sure what you mean here by "most JavaScript parser rejects null" - did you mean "JSON parsers"? And why would they reject null, which is a valid JSON value?

It's more that when building an API that adheres to a specification, whether formal or informal, if the field is supposed to be a JSON array then it should be a JSON array. Not _sometimes_ a JSON array and _sometimes_ null, but always an array. That way clients consuming the JSON output can write code against that array without needing to be overly defensive.
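
To make that concrete, here is a minimal Go sketch (the Response type and "items" field are made up for illustration): with encoding/json, a nil slice marshals to null, while an explicitly initialized empty slice marshals to []:

    package main

    import (
        "encoding/json"
        "fmt"
    )

    type Response struct {
        Items []string `json:"items"`
    }

    func main() {
        var a Response // Items is nil, the zero value for a slice
        out, _ := json.Marshal(a)
        fmt.Println(string(out)) // {"items":null}

        b := Response{Items: []string{}} // explicitly empty, never nil
        out, _ = json.Marshal(b)
        fmt.Println(string(out)) // {"items":[]}
    }

A client that expects an array has to special-case the first output; always initializing the slice (or normalizing before marshaling) avoids that entirely.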


My opinion is that AI isn’t actually the root of the problem here.

It’s that we are heading towards a big recession.

As in all recessions, people come up with all sorts of reasons why everything is fine until it can’t be denied anymore. This time, AI was a useful narrative to have lying around.


I think a kind of AI complacency has set in. Companies are just in chill mode right now, laying off people here and there while waiting for AI to get good enough to do actual work.


Everyone is bracing for a labor supply shock. It will move in the direction opposite what investors expect.

2030 will be 2020 all over again.


Why?


If (a) companies lay too many people off because the magic robots will supposedly make engineers unnecessary, (b) the pipeline collapses because software engineering looks like an undesirable career that is about to be replaced by robots, and (c) it emerges that the robots are kinda bullshit, then there's going to be one hell of a shortage.

When I started a CS degree in 2003, we were still kinda in the "dot com crash has happened, no-one will ever hire a programmer again" phase, and there were about 50 people in my starting class. I think the same course two years ago had about 200. The 'correct' number, for actual future demand, was certainly closer to 200 than 50, and the industry as a whole had a bit of a labour crunch in the early 2010s in particular.


I believe we are vastly underestimating the number of programmers needed, because some companies reap unusually high rewards from hiring programmers. Companies like Google can pay huge sums of money to programmers because they make even higher sums of money from those programmers' work.

This means that they inflate programmer salaries, which makes it impossible for most companies that could benefit from software development to hire developers.

We could probably have five times as many software developers as we have now, and they would not be out of work; they would only decrease average salaries for programmers.


But if only Google or similarly sized companies can pay that well, and there are tons of programmers, obviously the average salary will balance out lower than what Google pays but will still be competitive for the thousands of programmers who didn't get hired at Google.


>but will still be competitive for the thousands of programmers who didn't get hired at Google

Why would this be the case? Many programmers join Google or Meta (or similar tier companies) and immediately double or triple their income. Software salaries are famously bimodal and people often transition from the lower mode to the higher mode practically overnight.

In fact (and I'm not an economist), I conjecture that the lower mode exists because the upper mode exists. That is, people purposely don't really care what their salary is (i.e., they don't exert upward wage pressure) when they're at lower-mode companies, because they know one day they'll make the leap to the upper mode. In other words, the fact that Google-tier companies pay well allows other companies to pay poorly, because those guys are just padding their resumes to get a $350k job at Google and don't really care whether Bank of Nowhere pays them $90k or $110k.


People absolutely do care what their salary is. And most people never work at Google...


Well clearly not enough to make the two modes meet.


You could make this argument about almost literally every field.


If a company could benefit from software developers but can't afford them, then they can purchase SaaS offerings written by companies that can afford developers. I don't think we've run out of opportunities to improve the business world with software quite yet.


The fact that there is a market for these products, but they are almost universally terrible, supports my point.


I think it might be worse than that, as staff reductions are across the board, not just in software development roles. My hope is that startup creation will be unprecedented, to take advantage of the complacency. They will wonder why AI deleted their customers when they thought it was supposed to delete their employees.


Holding on for that sweet sweet pay bump after the coming AI winter


Combine a bunch of factors:

1) Fewer students are studying computer science. I'm faculty at a top CS program and we saw our enrollment decline for the first time ever. Other universities are seeing similar slowdowns in enrollment [1]

2) Fewer immigrants are coming to the United States to work and live; the US is perhaps looking at its first population decline ever [2]

3) Current juniors are being stunted by AI; they will not develop the necessary skills to become seniors.

4) Seniors are retiring faster because they don't want to have to deal with this AI crap, taking their knowledge with them.

So we're looking at a negative bubble forming in the software engineering expertise pipeline. The money people are hoping that AI can become proficient enough to fill that space before everything bursts. Engineers, per usual, are pointing out the problem before it becomes one, and no one is listening.

[1]: https://www.theatlantic.com/economy/archive/2025/06/computer...

[2]: https://nypost.com/2025/09/03/us-news/us-population-could-sh...


1. OBBB rolled back the Section 174 R&D deduction changes that (allegedly) triggered the layoffs and froze hiring in 2022-2023.

2. It looks like rates will keep going down.

3. Fewer people are going into CS due to the AI hysteria. You might say there's a 4-year lag, but not quite: we should see an immediate impact from career changers, from CS grads choosing between a career and professional school, and from people switching out of CS careers.

The AI hysteria in tech is so widespread that I've even heard of people avoiding non-SWE tech careers like PM.


The first thing I thought of was Benioff saying he cut thousands of customer support roles because AI could do it better, then turning around and giving a lackluster earnings report with revised-down guidance, and the stock tanking.


I have never, ever seen SVPs, CEOs, and PMs completely misunderstand a technology before. And I agree with you: I think it's more of an excuse to trim fat; actual productivity is unlikely to go up (it hasn't at our Fortune 500 company).


>> productivity is unlikely to go up

I wonder how that would even be measured? I suppose you could do it for roles that do the same type of work every day, i.e. perhaps there is some statistical relevance to the number of calls taken in a call center per day, or something like that. On the software development side, however, productivity metrics are very hard to quantify. Of course, you can make a dashboard look however you want, but it is essentially impossible to tie those metrics to NPV.


Productivity = profit / employees


> I have never, ever seen SVPs, CEOs, and PMs completely misunderstand a technology before.

I'm legit not sure if that's sarcasm or not


> we are heading towards a big recession

Who is "we"? One country heading into a recession is hardly enough to nudge the trend of "all things code".


The last US recession that didn't also pull in the rest of the western world was in 1982, over 40 years ago. Western Europe, Aus, NZ, Canada, and the US all largely rise and sink on the same tides, with differences measured in degrees.


Enough of the tech industry is America-based that a US recession would do much more than nudge the trend of "all things code". Much as I would prefer that it were not so.


America's recessions are global recessions.


Sadly yes - "When America sneezes, the World catches cold"


If that “one country” is the US and not, say, Burkina Faso, it has a major impact on financing, and software has an unusually high share of positions dependent on speculative investment for future return rather than directly related to current operations.


Traditionally, there are two strategies:

1) Use the network thread pool to also run application code. Then your entire program has to be super careful not to block or do CPU-intensive work. This is efficient but leads to difficult-to-maintain programs.

2) The network thread pool passes work back and forth with a separate application executor. That way, the network thread pool is never starved by the application, since there are essentially two different work queues. This works great, but now every request performs multiple thread hops, which increases latency.

There has been a lot of interest lately in combining scheduling and work-stealing algorithms to create a best-of-both-worlds executor.

You could imagine, theoretically, an executor that auto-scales, maintains different work queues, and tries to avoid thread hops when possible, but ensures there are always threads available for the network.
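
As a rough sketch of strategy 2 in Go (names like appWork are invented, and Go's runtime already multiplexes goroutines onto threads, so this only illustrates the hand-off pattern rather than a production design): the connection readers never run application code, they only parse and enqueue, while a bounded worker pool runs the slow stuff:

    package main

    import (
        "bufio"
        "log"
        "net"
    )

    type request struct {
        conn net.Conn
        line string
    }

    // appWork stands in for blocking or CPU-heavy application logic.
    func appWork(req request) {
        // ... expensive work here ...
        req.conn.Write([]byte("done\n"))
    }

    func main() {
        work := make(chan request, 1024) // hand-off queue between the two pools

        // Application pool: bounded, so it can back up without starving the readers.
        for i := 0; i < 8; i++ {
            go func() {
                for req := range work {
                    appWork(req)
                }
            }()
        }

        ln, err := net.Listen("tcp", ":8080")
        if err != nil {
            log.Fatal(err)
        }
        for {
            conn, err := ln.Accept()
            if err != nil {
                continue
            }
            // "Network pool": a reader per connection that only parses and enqueues.
            go func(c net.Conn) {
                defer c.Close()
                scanner := bufio.NewScanner(c)
                for scanner.Scan() {
                    work <- request{conn: c, line: scanner.Text()}
                }
            }(conn)
        }
    }

The thread hop is the send on the work channel: every request crosses from a reader to a worker, which costs latency but keeps the readers responsive even when appWork is slow.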


That really seems to be the defining characteristic of the 21st century elite: they’re shameless and proud of it.


Only 21st century? Have you read any history at all?


This has actually been one of the ideas floated by regulators.

The idea is that merit-based admissions is actually pretty complicated, so we can allow individual universities to continue to experiment with their own implementations and approaches.

However, we can hold them accountable by grading them based on retrospective data.


I think there are two things to keep in mind.

1) Apple and Firefox have enough resources to implement the most recent web standards. When you see a feature which goes un-implemented for too long, it’s almost surely because nobody was even working on it because of internal resourcing fights.

2) Devs aren’t created equal. It’s possible for a team of 8 people to be 10x more productive than another team of 8.


> When you see a feature which goes un-implemented for too long, it’s almost surely because nobody was even working on it because of internal resourcing fights.

Or because they are reluctant to implement it for technical reasons? Not every "standard" that gets thrown on the table and implemented by Google is a brilliant idea.


I think a problem with AI productivity metrics is that a lot of the productivity is made up.

Most enterprise code involves layers of interfaces. So implementing any feature requires updating 5 layers and mocking + unit testing at each layer.

When people say “AI helps me generate tests”, I find that this is usually what they are referring to: generating, in a few minutes, hundreds of lines of mock and fake-data boilerplate that would otherwise take an entire day to write manually.

Of course, the AI didn’t make them more productive. The entire point of automated testing is to ensure software correctness without having to test everything manually each time.

The style of unit testing above is basically pointless, because it doesn’t actually accomplish that goal. All the unit tests could pass, and the only thing you’ve verified is that your canned mock responses and asserts are in sync in the unit test file.
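
To illustrate what I mean (a sketch with invented names, using Go’s standard testing package): the mock returns canned data and the assertion just restates that same data, so no production logic is actually exercised.

    package greet

    import "testing"

    // UserStore is the interface the code under test depends on.
    type UserStore interface {
        GetName(id int) string
    }

    // mockStore is the kind of fake that gets generated by the hundreds of lines.
    type mockStore struct{}

    func (mockStore) GetName(id int) string { return "alice" }

    // Greeter is the "unit" under test; it barely does anything.
    type Greeter struct{ store UserStore }

    func (g Greeter) Greet(id int) string { return "hello " + g.store.GetName(id) }

    func TestGreet(t *testing.T) {
        g := Greeter{store: mockStore{}}
        // The expected value is just the mock's canned value echoed back:
        // the test passes as long as the mock and the assert stay in sync.
        if got := g.Greet(1); got != "hello alice" {
            t.Fatalf("got %q", got)
        }
    }

Multiply that by five interface layers and you get a green test suite that proves almost nothing.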

A problem with how LLMs are used is that they help churn through useless bureaucratic BS faster. But there’s no ceiling to bureaucracy: I have strong faith that organizations can generate pointless tasks faster than LLMs can automate them away.

Of course, this isn’t a problem with LLMs themselves, but rather with the organizational context in which I frequently see them being used.


I think it's appropriate to be skeptical of new tools, and to point out failure modes in an appropriately respectful, prosocial way. Kudos.

Something that crosses my mind is whether AI-generated tests necessarily end up being tests with fakes and stubs that exercise no actual logic, what expertise is required to notice that, and whether it is correctable.

Yesterday, I was working on some OAuth flow stuff. Without replayed responses, I'm not quite sure how I'd test it without writing my own server, and I'm not sure how I'd develop the expertise to do that without, effectively, just returning the responses I expected.

It reminds me that if I eschewed tests with fakes and stubs as untrustworthy in toto, I'd be throwing the baby out with the bathwater.
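
For concreteness, here is roughly how that replay looks in Go (using net/http/httptest; the token endpoint and fields are made up for illustration): a throwaway server that just returns the canned token response the code under test expects.

    package oauth_test

    import (
        "encoding/json"
        "io"
        "net/http"
        "net/http/httptest"
        "testing"
    )

    func TestTokenExchange(t *testing.T) {
        // Throwaway server that replays a canned token response.
        srv := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
            w.Header().Set("Content-Type", "application/json")
            io.WriteString(w, `{"access_token":"abc123","token_type":"Bearer"}`)
        }))
        defer srv.Close()

        // The code under test would normally read the token URL from config;
        // here it is pointed at the fake server.
        resp, err := http.Post(srv.URL+"/token", "application/x-www-form-urlencoded", nil)
        if err != nil {
            t.Fatal(err)
        }
        defer resp.Body.Close()

        var body struct {
            AccessToken string `json:"access_token"`
        }
        if err := json.NewDecoder(resp.Body).Decode(&body); err != nil {
            t.Fatal(err)
        }
        if body.AccessToken != "abc123" {
            t.Fatalf("got %q", body.AccessToken)
        }
    }

Which is exactly the circularity in question: the assertion is only as good as the response you chose to replay, but it still beats having no test around the flow at all.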


But this is the problem. Our premier academic institutions shouldn’t merely exist as job training programs for big tech.

If anything, tech is still one of the better off fields in the university.

Look at history or literature programs for where this is heading. I’d imagine that most literature majors don’t even read at all these days. As recently as 50 years ago, the requirement involved hundreds of pages of reading per week, over a sustained 4-year period.

Honestly, just close down the university at this point, if all it wants to do is print out degree certificates for social signaling in the job market.


oldpersonintx2, your account is shadowbanned.

Which colleges did you send your kids to, what kind of degrees (just bachelors? undergrad and grad?), and how many kids?

The $800k figure without that context tells us nothing. If that's for 2 kids to get a BA/BS/BE, you got ripped off. If it's for 4 or 5 kids it makes much more sense when examining current costs.


[flagged]


I understand your feelings about this but on HN we still need you to follow the guidelines, which include avoiding uppercase for emphasis and avoiding personal swipes like this:

> I'm laughing at your naive take

https://news.ycombinator.com/newsguidelines.html


What's more interesting is that it's their second account, because the previous one was banned. What's HN policy on ban evasion?

Previous account: https://news.ycombinator.com/user?id=oldpersonintx


Thanks for that. If an account is banned but the user signs up a new account and starts contributing positively and respecting the guidelines, that’s a good outcome. If they just pick up where they left off, that’s what we call a “serial troll” and we’ll ban the new account with fewer or no warnings.


The C++ standard committee is definitely smart. But language design requires sense beyond just being smart.

They didn’t do the best with what they had. Sure, some problems were caused by C backwards compatibility.

But so much of the complexity and silliness of the language was invented by the committee themselves.


As someone who really enjoys C++, I would say that the current issues are caused by the lack of preview implementations before features are voted into the standard. This is just broken, but there are not enough people around to be able to turn the ship around.

Those folks eventually move to something else and adopt "C++ the good parts" instead.

