These results are to be expected if IQ is normally distributed but we push more people into obtaining university degrees.
The value of a university degree has severely deteriorated since every white-collar job essentially requires one, and it will continue to get worse.
I would assume that it's worse in the US than in Europe, because in the EU it seems that education is less commercialized and you can get a degree for free if you are above average.
We also no longer have as clear a signal that can be used to get high-IQ people to associate with each other and then move into jobs that benefit from a very high IQ. On the other hand, the idea that military officer intelligence is dropping (as presented in the article) is pretty concerning. If there is one place we want unusually high IQ, it'd be there.
It is interesting to look at history and see how episodes of progress often require little clusters of geniuses to make them work. In the worst case, it might be harder to set that up now. Although it is hard to tell. Maybe it'd be good to normalise just testing IQ directly if that isn't already a thing.
> in the EU it seems that education is less commercialized and you can get a degree for free if you are above average.
"Above-average" students (as measured by, for example, the SAT, ACT and/or GPA) are offered free university educations in many/most states in the US. It's just that average and even below average students can and do also gain admission, though obviously the completion rate is lower. College mostly only costs money in the US if you are not academically excellent and/or you elect to go to a private or out-of-state university without a scholarship.
Doesn't this mean the value of a university degree has increased?
If you now have to have a degree just to make it into the top 40% of earners, that surely means it is more required than ever, and thus worth more.
The degree is worth what it can "buy". If a generation ago the degree got you into the top 20% of earners, but today it only gets you into the top 40%, then that degree is "buying" you less earning power overall.
Concrete example: a generation ago you could get your foot in the door at a bank as a teller with a high school diploma. Today? You need a 4-year degree in finance. Further, if you had walked into that same bank with that same 4-year degree a generation ago, you'd probably have started much further up the ladder than teller.
I'm not sure I agree. If more people have a university degree, then it's no longer the differentiator that it used to be. To obtain this degree, you have to take on a substantial student loan in the States, from what I understand. To me it seems like a big financial risk, which might not pay off in many cases.
So if we define value as the expected lifetime return of obtaining a degree relative to the person's net worth, and plot it over time, I would assume that this value has been decreasing.
The trends for HS and Dropout are also negative. To me it implies education across the board is getting worse (assuming IQ is a good measure of “good” education).
My comment is not about changing the language but about the fact that whenever we rewrite it, we improve the quality because of the lessons learned during past iterations.
> My comment is not about changing the language but about the fact that whenever we rewrite it, we improve the quality because of the lessons learned during past iterations.
So many developers have said this over the years but it almost never comes true.
You just end up with different bugs in a different language.
I dunno why I misread "write in any language" as "write in another language".
I'm still skeptical about rewriting - the Bluetooth spec is notoriously buggy itself, and many "bugs" and glitches in BT are due to how poorly the spec is written.
I'm not so sure I agree that the spec itself is buggy; certainly the implementations vary wildly, from Sony almost doing their own thing to Chinese off-the-shelf parts copy-pasting whatever makes a noise.
That said I have worked extensively with Bluetooth within Ericsson and while there is a learning curve, I never found the spec to be lacking.
Your last sentence is exactly what I was thinking. The problem with BT isn't necessarily in the kernel's Bluetooth driver. The spec is buggy, and a lot of makers of Bluetooth devices don't implement it properly either; it isn't a spectacular spec to begin with.
A rewrite might simply make it more resilient through changes in the base architecture. However, I know nothing about Linux's Bluetooth stack, and I assume it probably already takes a lot of those glitches into account.
That highly depends on who is doing the rewriting and whether they were involved in writing the current system. If someone new starts rewriting the system in a memory-safe language, then it's quite likely they will make many of the same mistakes the original author did.
Debatably, as language rewrites can bring their own problems, especially with a newer language like Rust that lacks experienced eyes to review. You'd get more benefit from rewriting the Bluetooth drivers in their current languages.
Agree wholeheartedly with the conclusion of the article.
But the post makes it seem that there was no real query-level monitoring in place for the Postgres instance, other than perhaps the basic CPU/memory metrics provided by the cloud provider. Using an ORM without this kind of monitoring is a sure way to shoot yourself in the foot with n+1 queries, queries not using indexes, missing indexes, etc.
The other thing that is amazing is that everyone immediately reached for redesigning the system without analyzing the cause of the issues. A single Postgres instance can do a lot!
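For illustration, a minimal sketch of the n+1 shape an ORM can quietly generate. The table names, connection string and use of plain JDBC are my own assumptions, not anything from the post:

```java
import java.sql.DriverManager;
import java.util.ArrayList;
import java.util.List;

public class NPlusOne {
    public static void main(String[] args) throws Exception {
        // Hypothetical connection details and schema, purely for illustration.
        try (var conn = DriverManager.getConnection("jdbc:postgresql://localhost/app", "app", "secret")) {
            // 1 query to fetch the parent rows...
            List<Long> orderIds = new ArrayList<>();
            try (var st = conn.createStatement();
                 var rs = st.executeQuery("SELECT id FROM orders WHERE created_at > now() - interval '1 day'")) {
                while (rs.next()) orderIds.add(rs.getLong("id"));
            }
            // ...then n more queries, one per row. An ORM's lazy loading hides this loop.
            for (long orderId : orderIds) {
                try (var ps = conn.prepareStatement("SELECT count(*) FROM order_items WHERE order_id = ?")) {
                    ps.setLong(1, orderId);
                    try (var rs = ps.executeQuery()) {
                        rs.next();
                        System.out.println(orderId + ": " + rs.getLong(1) + " items");
                    }
                }
            }
            // A single JOIN with GROUP BY order_id would return the same data in one round trip.
        }
    }
}
```

Each loop iteration is an extra round trip, which is exactly what query-level monitoring surfaces as hundreds of identical single-row queries.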
What's your recommended way of implementing this in a simple App Server <> Postgres architecture? Is there a good Postgres plugin or do you utilize something on the App side?
I've used pganalyze, which is a non-free SaaS tool. It gives you a very good overview of where the DB time is spent, with index suggestions etc. There are free alternatives, but they require more work from you.
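One of those free alternatives is the pg_stat_statements extension that ships with Postgres. A minimal sketch of pulling the top queries by total execution time; the connection details are hypothetical, and the column names match recent Postgres versions (older ones use total_time instead of total_exec_time):

```java
import java.sql.DriverManager;

public class TopQueries {
    public static void main(String[] args) throws Exception {
        // Assumes the extension is enabled: shared_preload_libraries = 'pg_stat_statements'
        // and CREATE EXTENSION pg_stat_statements has been run in the database.
        String sql = """
                SELECT query, calls, total_exec_time, mean_exec_time
                FROM pg_stat_statements
                ORDER BY total_exec_time DESC
                LIMIT 10""";
        try (var conn = DriverManager.getConnection("jdbc:postgresql://localhost/app", "app", "secret");
             var st = conn.createStatement();
             var rs = st.executeQuery(sql)) {
            while (rs.next()) {
                System.out.printf("%10.1f ms total, %8d calls, %.1f ms avg: %s%n",
                        rs.getDouble("total_exec_time"),
                        rs.getLong("calls"),
                        rs.getDouble("mean_exec_time"),
                        rs.getString("query"));
            }
        }
    }
}
```

Tools like pganalyze largely build on this same data, adding history, dashboards and index suggestions on top.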
The outsized role of the Soviet Union was pushed by the Soviet Union itself as part of its Great Patriotic War narrative and is continued to this day by Russian propaganda. Without US Lend-Lease, the situation could have turned out much differently for the Soviets (as even Stalin himself admitted).
I wasn't trying to claim that the USSR was the sole contributor to the defeat of the fascists, sorry if it sounded like that.
I just wanted to give a pretty strong example of the winners rewriting history and of how this propaganda becomes fact for society.
I'm sure everyone here agrees that it's good that the fascists lost the war and that the USA enabled Europe to stay democratic. It was a very brutal period of time in which human life was sadly undervalued.
Ironically, a better example of this would be your own claim, which focuses solely on the crimes of Nazism without any mention of Communist crimes. Since the USSR was on the Allied side, Communism never quite turned into the embodiment of evil that Nazism has become. Because of this, today many academics are proud to call themselves Communists, whereas you would be hard-pressed to find any self-proclaimed Nazis, at least in the mainstream of academia. All because history is written by the victors.
For the record, I never claimed anything beyond that public opinion changed over the years, likely because of propaganda. A lot of factors were at play, including the never-ending resistance by good people, both in the Reich and in occupied territory, the categorical extermination of educated people who didn't subscribe to Nazi beliefs, and more. Summing everything up as being because of "n" has always been incorrect.
Which is why I didn't attribute truth to either of these claims; instead, I pointed out that the USSR would likely have taken over Europe without the direct involvement of the USA. I don't think that this is controversial at all. Or do you believe they would've stopped in the middle of today's Germany if they weren't forced to?
Beyond that, the Nazis are hardly unfairly vilified, because they were literally villainous. Please remember that Nazism isn't the same as National Socialism, which the Nazi regime was based on; it was one specific version of it, one that included mass murdering vast numbers of people. You can make an argument that national socialism isn't necessarily bad, but making the same argument for the Nazis means you also condone mass murder. As a simple example: exterminating queer people and people with disabilities was core to their doctrine.
I'm utterly at a loss how I'm supposedly ignoring the crimes of the USSR. Would I have been glad that the USA protected Europe from it if I considered them faultless?
I'd like to add a bit of historical context, which is that communism as an ideology, a field of study, and a set of ideas was quite widespread across Europe and beyond (way before the Russian revolution and after).
Karl Marx's ideas were very revolutionary for their time (a good primer can be found in the book The Value of Everything) and set a lot of ideas in motion, and there was also debate about how to institute communism. So communism never acquired the "evil" moniker the way Nazism did, even though it was apparent that the Soviet Union was quite a brute. The ideology was kind of cognitively separated from its implementation by the Soviet Union.
In the Cold War period, the Soviet Union was definitely seen as something that had to be defeated. There was a lot of fear of nuclear escalation between the superpowers. It was seen as different at best, and as evil at worst. Just watch some action movies from that time to get a general idea.
Yes, this is exactly the way it is. Most people are perfectly fine with throwing their ideals out of the window when those ideals apply to people they don't like.
Perhaps it was run by some introverted developer-types? Or there were some cultural differences at play?
The reaction seems a bit excessive and I'm not sure if it characterizes the company as a whole.
To the people who hate running: run slower! Most people who hate running are conditioned to go way too fast. Once you build up a base, then you can speed things up.
There is little enjoyment if you are constantly struggling. A smartwatch can give you a pretty good idea of how difficult a run is based on your pulse.
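For a rough sense of what "slower" means in numbers, here is a sketch using the common 220-minus-age rule of thumb; the age is hypothetical and individual maximum heart rates vary a lot, so a watch-measured max is better if you have one:

```java
public class EasyZone {
    public static void main(String[] args) {
        int age = 35;                          // hypothetical runner
        int maxHr = 220 - age;                 // crude rule-of-thumb estimate of max heart rate
        int easyLow = (int) (maxHr * 0.60);    // easy/aerobic runs are often kept around
        int easyHigh = (int) (maxHr * 0.70);   // 60-70% of max heart rate
        System.out.printf("Easy zone: roughly %d-%d bpm%n", easyLow, easyHigh);
    }
}
```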
I've come to the conclusion that such inept statements can only come from people who have lived their lives in very safe environments where they have never needed police protection. Such environments do not represent the safety of the country or the world in general.
It strikes me as odd that someone as familiar with history as yourself would single out the US and the practice of slavery, when slavery has existed almost everywhere in the world, including here in Europe. Yet nobody in Europe thinks that the distant descendants of formerly enslaved peasants are substantially worse off because of it, or that this would somehow translate into today's policing.
In conclusion, abolishing the police is a completely stupid idea that the American people in general do not even support [1], and one that hurts not the tech elites of Silicon Valley but normal people living in high-crime areas.
No, it's the other way around. Advances in computer hardware have allowed the use of more inefficient programming languages, which allow more inexperienced and unskilled programmers to create programs, leading to more resource-hungry software. When there are few resource constraints, the only real constraint becomes developer time.
I don't see how having IDEs implemented in browsers has anything to do with security, the speed of light or compatibility. It's just the lack of constraints allowed by advances in computer hardware.
Most software is written with no performance considerations in mind at first and the performance issues are addressed only when they become visible. However, if there is abundant memory available, why bother?
> I don't see how having IDEs implemented in browsers has anything to do with security, the speed of light or compatibility. It's just the lack of constraints allowed by advances in computer hardware.
This isn't a compatibility issue? We've seen about 8-16 branches of the write-once, run-everywhere tree over the past 25 years; I'm not sure how that isn't seen as a constraint on programmers. JWT, Swing, Web, Cordova, Qt, React/<web front-end> Native, Xamarin, Electron, Flutter and even quirky ones like Toga have all attempted to solve this problem. The only unifying thread has been that managers follow greedy algorithms and choose the lowest-common-denominator platforms wherever possible. Qt, the Java tools and Xamarin at least can't be lumped into the inefficient-language bucket, though the UX is just awful. Other than hardware drivers, it's hard to think of a clearer example of compatibility constraints.
> JWT, Swing, Web, Cordova, React/<web front-end> Native, Xamarin, Electron, Flutter and even quirky ones like Toga have all attempted to solve this problem, and
... and for the most part they have. You can write your app right now and the only thing you need to worry about is screen size. If you use Bootstrap, even that is mostly solved. Your app is accessible on Windows, Linux and Mac; Chromebooks and tablets; iPhones, Android and even the one Symbian user. Of course it's not perfect yet; there are edge cases and you cannot do everything, but let's not act like things have gotten worse.
> The only unifying thread has been that managers follow greedy algorithms and choose the lowest-common-denominator platforms wherever possible.
Yes, I agree. But for nearly every use case, it's good enough. Take HN as an example: Does it need anything more?
Of course, if you need access to specific hardware, you'll have to go deeper. But if you don't, you'd simply be picking the lowest common denominator yourself, and I'd argue that the framework probably did a more thorough search than you would.
I basically agree, with the caveat that I'd still prefer a world where our write-once-run-everywhere lowest common denominator at least required native widgets and an ability to integrate new platform-specific capabilities at the expense of writing a small amount of native code, rather than barfing up web UI/UX at users (e.g. the execrable MS Teams).
While there is certainly more complexity in modern software, it does not necessarily need to translate to increased memory, CPU usage and increased latency for the user. Are you saying that this increase in software complexity definitely increases these requirements?
Java is definitely not a good example of a memory-efficient language when compared to its non-GC alternatives.
It all comes down to economics: software is written as inefficiently as possible, as long as it does its job and is not hindered by this, and this is actually the crux of "Wirth's law".
The “Java bloat” mostly has nothing to do with the language itself, but is caused by the ecosystem around it. On one hand you have overly abstract frameworks and on the other you have inexperienced programmers who don't understand how such frameworks are supposed to be used and write code that actively fights against the framework...
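As a hedged sketch of what that framework-fighting looks like in practice, here is a hypothetical Customer entity with a Spring Data JPA repository (the names and the jakarta.persistence imports are my assumptions; older Spring setups use javax.persistence):

```java
import java.util.List;

import jakarta.persistence.Entity;
import jakarta.persistence.Id;
import org.springframework.data.jpa.repository.JpaRepository;

@Entity
class Customer {
    @Id Long id;
    String status;
}

interface CustomerRepository extends JpaRepository<Customer, Long> {
    // Working with the framework: a derived query, so the filtering happens in SQL.
    List<Customer> findByStatus(String status);
}

class CustomerLookups {
    // Fighting the framework: load the whole table into the JVM, then filter on the heap.
    static List<Customer> activeCustomersTheBloatedWay(CustomerRepository repo) {
        return repo.findAll().stream()
                .filter(c -> "ACTIVE".equals(c.status))
                .toList();
    }

    // Same result with one WHERE clause and no objects allocated for inactive rows.
    static List<Customer> activeCustomersTheIntendedWay(CustomerRepository repo) {
        return repo.findByStatus("ACTIVE");
    }
}
```

The bloat people blame on Java here comes from materializing the whole table as heap objects, not from the language or even the framework itself.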