Hardware engineers are pushing the absolute physical limits of getting state (memory/storage) as close as possible to compute. A monumental accomplishment as impactful as the invention of agriculture and the industrial revolution.
Software engineers: let's completely undo all that engineering by moving everything apart as far as possible. Hmmm, still too fast. Let's next add virtualization and software stacks with shitty abstractions.
Fast and powerful browser? Let's completely ignore 20 years of performance engineering and reinvent...rendering. Hmm, sucks a bit. Let's add back server rendering. Wait, now we have to render twice. Ah well, let's just call it a "best practice".
The mouse that I'm using right now (an expensive one) has a 2GB desktop Electron app that seems to want to update itself twice a week.
The state of us, the absolute garbage that we put out, and the creative ways in which we try to justify it. It's like a mind virus.
I want my downvotes now.
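The "render twice" jab above refers to server-side rendering plus client-side hydration: the server renders the component tree to HTML for a fast first paint, then the browser renders the same tree again to attach event handlers. A minimal sketch, assuming React; the App component is hypothetical:

    // server.ts -- render #1: produce the HTML string on the server
    import { createElement } from "react";
    import { renderToString } from "react-dom/server";
    import { App } from "./App";                      // hypothetical component, illustration only

    const html = renderToString(createElement(App));  // full render, output is just markup
    // ...embed `html` in the page shell that gets sent to the browser...

    // client.ts -- render #2: the browser walks the same tree again to wire up interactivity
    import { createElement } from "react";
    import { hydrateRoot } from "react-dom/client";
    import { App } from "./App";

    hydrateRoot(document.getElementById("root")!, createElement(App));

Whether that second pass is waste or a reasonable trade-off is exactly what gets argued about in threads like this.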
Actually, those who push for these cloudy solutions do it in part to bring data closer to you. I am talking mostly about CDNs; I don't think YouTube and Netflix would have been possible without them.
Google is a US company, but you don't want people in Australia to connect to the other side of the globe every time they need to access Google services; that would be an awful waste of intercontinental bandwidth. Instead, Google has data centers in Australia to serve people in Australia, and they only hit US servers when absolutely needed. And that's when you need to abstract things out. If something becomes relevant in Australia, move it in there, and move it out when it no longer matters. When something big happens, copy it everywhere, and replace the copies with something else as interest wanes.
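That "copy it when it becomes relevant, drop it when interest wanes" policy is, at its core, a cache with expiry sitting near the user. A toy sketch, assuming a per-region cache; the region name, TTL, and fetchFromOrigin callback are made up for illustration:

    // Serve hot keys from a nearby copy; fall back to the origin only when needed.
    type Entry = { value: string; expiresAt: number };

    class RegionalCache {
      private entries = new Map<string, Entry>();

      constructor(readonly region: string, private ttlMs: number) {}

      async get(key: string, fetchFromOrigin: (key: string) => Promise<string>): Promise<string> {
        const hit = this.entries.get(key);
        if (hit && hit.expiresAt > Date.now()) {
          return hit.value;                        // served locally, no intercontinental hop
        }
        const value = await fetchFromOrigin(key);  // only now do we cross the ocean
        this.entries.set(key, { value, expiresAt: Date.now() + this.ttlMs }); // replicate locally
        return value;
      }

      evictStale(): void {
        const now = Date.now();
        for (const [key, entry] of this.entries) {
          if (entry.expiresAt <= now) this.entries.delete(key); // interest waned, free the space
        }
      }
    }

    // const sydney = new RegionalCache("au-southeast", 60_000);

Real CDNs obviously do far more (routing, invalidation, tiered caches), but the principle is the same.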
Big companies need to split everything; they can't centralize, because the world isn't centralized. The problem is when small businesses try to do the same because "if Google is so successful doing that, it must be right". Scale matters.
Agreed, and I think it's easier to compare tech to the movie industry. Just look at all the crappy movies they produce with IMDB ratings below 5 out of 10, that is, movies that nobody's even going to watch; then there are the shitty blockbusters with expensive marketing and greatly simplified stories optimized for mindless blockbuster moviegoers; then there are the rare gems, true works of art that get recognized at festivals at best but usually not by the masses. The state of the movie industry is overall pathetic, and I see parallels with the tech here.
> Software engineers: let's completely undo all that engineering by moving everything apart as far as possible. Hmmm, still too fast. Let's next add virtualization and software stacks with shitty abstractions.
That's because the concept that is even more impactful than agriculture and the computer, and that makes them and everything else in our lives possible, is abstraction. It makes it possible to reason about large and difficult problems, to specialize, and to have multiple people working on them.
Computer hardware is as full of abstraction and separation and specialization as software is. The person designing the logic for a multiplier unit has no more need to know how transistors are etched into silicon than a javascript programmer does.
Billions of people are on the internet now, compared to 20 years ago. I dare say millions of lives have been saved in the past 20 years, in various ways, by the things built and deployed on the web.
We may have failed at some abstract notion of craftsmanship or performance efficiency. But we as an industry shipped. We shipped a lot, actually. A lot of it also sucked. But not enough to say the whole industry was a failure, IMHO.
What are you having difficulty understanding? I'll be happy to try help.
> The web is slower than ever.
No it isn't.
> Desktop apps 20 years ago were faster than today's garbage.
Some are, some aren't. For the same thing, they clearly aren't. A typewriter makes your PC of 20 years ago look like glacial garbage, if that's your standard.
> We failed.
Speak for yourself. Computers are used far more often, for more things, and by more people than they were 20 years ago, and nothing they used to be used for has been replaced by something else. You'll always have the get-off-my-lawn types, but you had them in the 2000s too, from the curmudgeons stuck in the 80s.
Your assessment has no impact. Nobody disagrees with the notion that programmers trade performance for a reduction in complexity or better productivity. This isn't some astounding discovery; it's a tired old gripe that doesn't add anything.
Heh, there's a mention here of Andy and Bill's Law, "What Andy giveth, Bill taketh away," which is a reference to Andy Grove (Intel) and Bill Gates (Microsoft).
Since I have a long history with Sun Microsystems, upon seeing "Andy and Bill's Law" I immediately thought this was a reference to Andy Bechtolsheim (Sun hardware guy) and Bill Joy (Sun software guy). Sun had its own history of software bloat, with the latest software releases not fitting into contemporary hardware.
> The mouse that I'm using right now (an expensive one) has a 2GB desktop Electron app that seems to want to update itself twice a week.
I'm using a Logitech MX Master 3, and it comes with the "Logi Options+" app to configure the mouse. I'm super frustrated with that cranky, slow app. It updates every other day and crashes often.
The experience is much better when I can configure the mouse with an open-source driver [^0] while using Linux.
I use Logi Options too, but while it's stable for me, it still uses a bafflingly high amount of CPU. But if I don't run Logi Options, then mouse buttons 3+4 stop working :-/
It's been like that for years.
Logitech's hardware is great, so I don't know why they think it's OK to push out such shite software.
Let me add fuel to the fire. When I started my career, users were happy to select among a handful of 8x8 bitmap fonts. Nowadays, users expect to see a scalable male-doctor-skin-tone-1 emoji. The former can be implemented by blitting 8 bytes from ROM. The latter requires an SVG engine -- just to render one character.
While bloatware cannot be excluded, let's not forget that user expectations have tremendously increased.
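For reference, the "blit 8 bytes from ROM" path looks roughly like this; the glyph bytes and the framebuffer layout are invented for illustration:

    // One 8x8 glyph is 8 bytes: each byte is a row, each bit a pixel (MSB = leftmost).
    // These bytes approximate an 'A' and are made up for this example.
    const GLYPH_A: readonly number[] = [0x18, 0x3c, 0x66, 0x66, 0x7e, 0x66, 0x66, 0x00];

    // Copy the glyph into a 1-byte-per-pixel framebuffer that is `stride` pixels wide.
    function blitGlyph(fb: Uint8Array, stride: number, x: number, y: number, glyph: readonly number[]): void {
      for (let row = 0; row < 8; row++) {
        const bits = glyph[row];
        for (let col = 0; col < 8; col++) {
          if (bits & (0x80 >> col)) {
            fb[(y + row) * stride + (x + col)] = 0xff;  // set one pixel
          }
        }
      }
    }

    // const fb = new Uint8Array(320 * 240); blitGlyph(fb, 320, 10, 10, GLYPH_A);

Compare that to parsing, shaping, and rasterizing a color emoji glyph, and the gap in what users now expect becomes obvious.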
We're not a very serious industry, despite it, uhm, pretty much running the world. We're a joke. Sometimes I feel it doesn't even earn the term "engineering" at all, and rather than improving, it seems to get ever worse.
Which really is a stunning accomplishment in a backdrop of spectacular hardware advances, ever more educated people, and other favorable ingredients.
We're much more like artisans than engineers, in my opinion (maybe with the exception of extremely deep-in-the-stack things like compiler engineering).
The problem seems to be that because there's no "right way", only wrong ways, discussions end up being circular. I'm not a civil engineer, but I imagine there is a "best way" to build a bridge in any landscape, where the decisions and tradeoffs have well-defined parameters, gained through trial and error and regulation over literally thousands of years of building bridges.
Us "Software Artisans" spend almost as much time arguing as lawyers do because, like law, it's all made up. Information, and human-to-human communication via CPU instructions abstracted to the point of absurdity.
I also get the vibe that greybeards like Uncle Bob and Martin Fowler understand this very intuitively.
I get what you're saying but I reject the notion that some of these tech choices are 100% subjective and that there's no "right way" at all.
If hardware has increased in speed/capacity by a factor of 10-100 in a decade and our "accomplishment" is to actually make software increasingly slow, shitty, and bloated with no new added value to the user, you'll have an idea of the absurd waste and inefficiency of our stacks.
When you add lanes to a highway, it generally does not improve congestion or travel times. Drivers adjust and fill up the new lanes, until travel times are roughly the same as before (but with slightly more throughput now).
So it is with hardware and software. I don't see any reason to correlate faster/better hardware with an expectation that software must also get better. It would be economically irrational for the software industry (whatever that means) to spend resources/energy on improving efficiency when the "gains" from hardware are essentially a free lunch to eat... Who would pay for lunch or spend time making their own, when hardware guys are giving you bigger portions for free?
That doesn't mean you have to like the outcome, but at least it should be perfectly predictable, given what we know about economics and game theory and incentives.
Software engineers don't want to be managing physical hardware and often need to run highly available services. When a team lacks the skill, geographic presence or bandwidth to manage physical servers but needs to deliver a highly-available service, I think the cloud offers legitimate improvements in operations with downsides such as increased cost and decreased performance per unit of cost.