Hacker News

That's a perpetual complaint.

The truth is that a lot of conveniences we take for granted have a cost that adds up. A 4K screen has about 17 times as many pixels as an 800x600 one, and uses 32-bit color instead of 8-bit. So the raw size of graphics made for a modern display is roughly 68 times bigger.
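The arithmetic behind those figures can be checked directly (a rough sketch, assuming the older display used 8-bit color, i.e. one byte per pixel):

```python
# Back-of-the-envelope framebuffer sizes: 4K at 32-bit color vs 800x600 at 8-bit.
old_bytes = 800 * 600 * 1        # 8-bit color: 1 byte per pixel
new_bytes = 3840 * 2160 * 4      # 32-bit color: 4 bytes per pixel

pixel_ratio = (3840 * 2160) / (800 * 600)   # ~17.3x the pixels
size_ratio = new_bytes / old_bytes          # ~69x the raw bytes

print(f"{pixel_ratio:.1f}x pixels, {size_ratio:.1f}x raw size")
```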

Where before a static picture was acceptable, the norm now is a high-quality, high-framerate animation.

Arial Unicode is a 15 MB font, which wouldn't even fit in the memory of most computers that used to run Windows 95.

Spell checking everywhere is taken for granted now.

And so on, and so forth. That stuff adds up. But it makes computers a whole lot more pleasant to use. I don't miss 16 color video modes, or being unable to use two non-English languages at once without extremely annoying workarounds.



This doesn't explain it, since within the very same constraints of 4K and Unicode etc. there are apps that are orders of magnitude more efficient.


It does have a cost. But maybe the cost isn’t worth it sometimes. I’ll take the 4K screen please. I’m willing to pay for those pixels.

But animations for no reason other than to make it seem like waiting for something is less of a chore? Nope.

Unicode is something I’m willing to pay for.

But we should be able to draw the glyphs on the screen in single-digit ms like we did in 1981. Yes, there are more pixels and more glyphs, but it's possible; it's just not a priority.


> But we should be able to draw the glyphs on the screen in single-digit ms like we did in 1981. Yes, there are more pixels and more glyphs, but it's possible; it's just not a priority.

In 1981 we drew fixed-width bitmap fonts at low resolution. In 2023, a font is a complex program that produces a vector shape, which is carefully adjusted to the display device it's rendered on for optimal results and antialiasing. That said, performance isn't bad at all.
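To illustrate the difference: a 1981 bitmap glyph is copied straight out of memory, while a TrueType glyph is an outline of quadratic Bezier curves that has to be evaluated and rasterized for each size and device. A minimal sketch of that evaluation step (the control points here are made up; real renderers also do hinting and antialiasing on top of this):

```python
def quad_bezier(p0, p1, p2, t):
    """Point at parameter t on a quadratic Bezier curve, the segment
    type TrueType glyph outlines are built from."""
    x = (1 - t) ** 2 * p0[0] + 2 * (1 - t) * t * p1[0] + t ** 2 * p2[0]
    y = (1 - t) ** 2 * p0[1] + 2 * (1 - t) * t * p1[1] + t ** 2 * p2[1]
    return (x, y)

# Flatten one outline segment into short line segments before rasterizing;
# this happens for every curve of every glyph, at every size.
points = [quad_bezier((0, 0), (50, 100), (100, 0), i / 16) for i in range(17)]
```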

Just resize the comment field back and forth with a bunch of text in it, and you'll see that text rendering performance is perfectly fine. I see no slowness.


So okay, all these artifacts take far more resources. But even after consuming 100 times more compute, why is software still excruciatingly slow?

This comment is like a BMW engineer telling a customer that, even after they've paid $100K for a performance car, it will take 30 seconds to go 0-60 mph. And since the user isn't a performance expert, they have to take it at face value.


> So okay, all these artifacts take far more resources. But even after consuming 100 times more compute, why is software still excruciatingly slow?

It's not?

Software used to be way, way slower. I had a 386. I experienced things like watching the screen redraw from top to bottom in games running at the amazing quality of 320x200x8 bits. I've waited hours for a kernel build. I've waited many seconds for a floppy drive to go chunk-kachunk while saving a couple of pages of a Word document. I've waited minutes for a webpage to download. I remember the times when file indexing completely tanked framerates.

Today all of that is pretty much instant.



