
I grew up on the Commodore 64 (1 Core, 1 hyper-thread :-), almost 1 MHz clock freq, almost 64 K usable RAM).

The machine was usually pretty responsive, but when I typed too quickly in my word processor it sometimes got stuck and ate a few characters. I used to think: "If computers were only fast enough so I could type without interruption...". If you'd asked me back then for a top-ten list of what I wished computers could do, this would certainly have been on it.

Now, 30 years later, whenever my cursor gets stuck I like to think:

"If computers were only fast enough so I could type without interruption..."




Similar in sentiment to "I have always wished for my computer to be as easy to use as my telephone; my wish has come true because I can no longer figure out how to use my telephone."


I'll definitely steal this!


Bjarne Stroustrup said that[1].

[1]: https://en.wikiquote.org/wiki/Bjarne_Stroustrup


The best part is he said it in around 1990.


The worst is when this happens when you're not doing anything that should be computationally intensive, just entering text in a web app that abuses JavaScript.


Even just entering text into the search bar uses loads of resources. Every character entered does a full search of your history and bookmarks, sends a request to Google for autocomplete, prefetches the web page whose address you're typing, applies spellcheck, etc.
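
For what it's worth, the usual trick for keeping that per-keystroke work from stalling input is to debounce it: wait until typing pauses before kicking off the expensive part. A rough Python sketch of the idea (the names, the 300 ms delay, and the fake fetch_suggestions function are all made up for illustration; this isn't what any particular browser actually does):

    import threading
    import time

    class Debouncer:
        """Delay an action until input pauses, so each keystroke doesn't trigger expensive work."""

        def __init__(self, delay_s, action):
            self.delay_s = delay_s
            self.action = action
            self._timer = None

        def keystroke(self, text):
            # Cancel whatever was pending; only the latest input survives.
            if self._timer is not None:
                self._timer.cancel()
            self._timer = threading.Timer(self.delay_s, self.action, args=(text,))
            self._timer.start()

    def fetch_suggestions(text):
        print(f"expensive work (history search, autocomplete request) for {text!r}")

    d = Debouncer(0.3, fetch_suggestions)
    for fragment in ["h", "he", "hel", "hell", "hello"]:
        d.keystroke(fragment)   # rapid typing: earlier timers are cancelled
    time.sleep(0.5)             # only "hello" reaches fetch_suggestions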


Sure, but I for one have yet to experience input lag typing into a URL bar when my computer wasn't otherwise under extremely heavy load.


I have almost the same background (VIC-20; if you thought programming was hard, try having only 3.5K of RAM) and feel the same irritation... but I don't wonder why computers aren't faster, I wonder why operating systems and applications are so utterly shit.

I have a Moto X phone (not too old) and watching it slow down is almost comical. Sometimes I'll decide to close a few apps and hitting the selector button brings up screenshots of all the open apps. Then I have to wait a few seconds for the X to appear on the app frames so I can close them.

If I had the inclination and temperament to be one of those YouTube personalities who makes a schtick out of complaining about things, I would have no shortage of material for a 'designed by idiots' channel.


I think most of the shit is concentrated on a couple of notable platforms: Windows and Android. Outside of that space I've noticed fewer productivity-hampering nightmare tools and OS features. Really I get perhaps one or two bits of stupid from Linux a year on the server side, usually, ironically, when integrating it with Windows, and on the macOS/iOS front I haven't had a notable issue since I switched about a year ago.


I've been using Linux for my daily work for more than ten years now and have been developing for MacOS since around 2000, and I honestly cannot confirm this. If you have a fast and well-tuned machine, the sluggishness of modern applications might not be so noticeable, but it surely is there, and then there are also many usability issues of desktop software on Linux. Not to speak of browser-based applications, which mostly have unusable user interfaces anyway. For MacOS, usability is still high, but the multithreading and API layering in Cocoa have always felt sluggish to me. I no longer use Mail.app but remember it as a particularly bad example.

I agree with the OP: for actual use, computers can do more powerful things than they used to be able to, no doubt about that, but programs and operating systems continue to feel slow and clumsy. Android in particular, but in the end all operating systems.

Bloated GUI frameworks, use of unoptimized images in GUIs, and non-optimal multicore programming are to blame, I guess.


Good luck on improving the situation as long as you have people running around who consider this all fine and normal.

https://ptrthomas.wordpress.com/2006/06/06/java-call-stack-f...


Idiomatic Java is idiotic.

Most of the hate for Java that I see is really hate for the idioms. The nice thing about idioms is that you don't have to follow them. But for whatever reason, Java devs stick to them. And that's how you get monstrosities like that stack trace and Fizz Buzz Enterprise Edition.


Looking through that graph like "you know what we need? More abstractions!"


This actually annoys me badly. One of the problems I see regularly is applications that fail to log enough of the stack to show the entry point of whatever actually went wrong, because the syslog packet size is set to 512 bytes. The problem is clearly syslog then, not the 12 KiB of stack your app throws when something goes pop!?!?


Um, yeah, the problem is someone setting an arbitrary limit because it was easier to implement. To argue otherwise is basically to claim that there is no possible justification for a deep stack, which, well, good luck proving that. Anyway, even if I don't like all that abstraction (I don't), I may not have a choice in platform (or logging system), so blaming useless syslog messages on the app developer is adding insult to injury. Not that blame is terribly useful when the best course of action is to just burn the whole thing down and start over. :)


The arbitrary limit is a performance thing. Fits neatly in one UDP datagram and passes through everything unhampered by MTU etc.
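
To make that concrete: the whole message, stack trace and all, gets crammed into a single UDP datagram and silently cut at the limit, which is exactly why deep stacks show up with their tails missing. A toy Python sketch (the address, the "myapp" tag, and the 512-byte cap are just the example values from this thread, not anything your logger is guaranteed to use):

    import socket
    import traceback

    SYSLOG_ADDR = ("127.0.0.1", 514)   # assumed local syslog listening on UDP
    MAX_PAYLOAD = 512                  # the per-datagram limit discussed above

    def log_current_exception():
        # "<11>" = RFC 3164 priority (user-level facility, error severity).
        # One message, one datagram: bytes past the cap are simply dropped.
        msg = "<11>myapp: " + traceback.format_exc()
        data = msg.encode("utf-8", "replace")[:MAX_PAYLOAD]
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.sendto(data, SYSLOG_ADDR)
        sock.close()

    try:
        raise RuntimeError("something went pop")
    except RuntimeError:
        log_current_exception()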

Agree that burning the whole thing down is the right thing when you get to this place :)


Logging the stack in what format? 500 bytes could fit as many as two hundred numeric entries, or as few as five fully detailed text entries.


If you think Java or Ruby is bad, don’t even try looking at the JS ecosystem.


I still can't face looking at the JS ecosystem after dealing with Netscape 4 back in the day. It did me some psychological damage which will never go away.


The nice thing about GNU/Linux is that you can almost completely avoid all of the "desktop applications." I only ever start X when I need Firefox or mupdf. Everything else is a nice lightweight TTY app that I run in tmux. My 1.3 GHz Celeron netbook is incredibly responsive set up this way.


Similar but with XFCE, it rarely feels sluggish and I nearly always know why.


I had a laptop with a 600 MHz Celeron running XFCE and AbiWord (I think it was Xubuntu 14.04 or something) with no networking (only a dial-up modem was available). It was a great, responsive typewriter with formatting and backspace!


Aye, for me XFCE is the sweet spot: it does what I want without getting in the way, and it's fast.

It's funny really because I mostly use it on an 8 core 32GB Ryzen 1700 desktop.


Expect Firefox to demand a potent GPU just to load soon enough, thanks to GTK3...


Gtk3 literally uses the exact same rendering API as Gtk2, Cairo, which is accelerated by your Xorg driver the same as it has been for decades.

None of this even matters because Firefox isn't a traditional application and renders most things itself using Skia.


Isn't the current version of Firefox GTK3? I run it on my laptop with just the EFI framebuffer, and while it's /the/ most sluggish app installed it's still usable.


It is, and it may work for now. But as I become familiar with the mentality of the GTK devs I worry how long that will be an option.


Same, but I don't want to play text adventures for the rest of my life either. Graphics are a Good Thing.


> I no longer use Mail.app but remember it as a particularly bad example.

Huh. I used to think that my Mac Mini was just too puny for my huge mailboxes.

I now use Evolution on openSUSE running on a Ryzen 1700 with an NVMe SSD, and it still feels kind of slow-ish. So maybe that program is in need of some loving optimization, too (would not surprise me if it did), or my mailboxes are just unreasonably big (would not surprise me, either). Probably a bit of both.


That's just Evolution. It's a big, complicated turd that uses a thread pool with no priorities. So you can sit there waiting to read a message while it is blocked checking your other folders. Also its keyboard shortcuts are stupid.

Thunderbird is a lot better.
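
That "no priorities" detail is the crux. This isn't how Evolution's internals actually look, but the idea it's missing is easy to sketch in Python: a shared worker where interactive jobs jump ahead of background folder checks instead of queueing behind them.

    import queue
    import threading

    # Toy sketch: one shared worker, but jobs carry a priority, so
    # "render the message the user just clicked" never waits behind
    # "poll every other folder in the background".
    jobs = queue.PriorityQueue()

    def worker():
        while True:
            priority, description = jobs.get()
            print(f"running (priority {priority}): {description}")
            jobs.task_done()

    # Background work queued first, the interactive request arriving last...
    for folder in ("Archive", "Spam", "Mailing lists"):
        jobs.put((10, f"check folder {folder}"))
    jobs.put((0, "render the message the user opened"))

    threading.Thread(target=worker, daemon=True).start()
    jobs.join()   # ...yet the priority-0 job runs first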


Unfortunately, Thunderbird does not like to talk to Microsoft Exchange Servers in their native tongue. If it weren't for that, or if my employer did not run on Exchange, I would be using Thunderbird already.


Let me just clarify that I don't use Linux on the desktop. I find it quite horrible to work with, particularly since the demise of Gnome 2. On the server and for development, it is fast and efficient. Most of the development work I do is from the Mac desktop to Linux machines.

I find macOS the "least bad" desktop experience. I think that's the true assertion from my initial comment. But I have a very high end MBP.

As a point of comparison which holds true across different applications: if I take the Photos application on macOS and port the data to the Photos app on Windows 10 and on Android, both of the latter are unusable with a dataset of 30 GiB. I have tried this (in reverse!)

I haven't had any problems with Mail.app but I only use that for trivial home email. I use Outlook. Now that's a turd.


The OP was building Chrome. Building on OS X exhibits similar issues; there are instructions in the readme on how to configure macOS not to chug and lose all responsiveness when building Chrome.


You don't have to wait for the X. Just swipe them away.

Or just don't close activities ever, it's pretty pointless. It doesn't actually kill the app if it's still open.


The point I'm making is that the core UI shouldn't be limping along like this. Bad performance for an individual app is understandable, but when the basic UI is no longer efficient or responsive it makes everything worse.


Way more naive, but my first own desktop was a P75. A few years down the line a friend of a friend got his hands on a batch of P233 MMX machines, and I started to fantasize that these could be so fast the menus might appear before you even finished clicking. I didn't know how non-linear computers were at the time.

A few other things: I'm often surprised by how fixed the responsiveness of systems is. Hardware grew 3-4 orders of magnitude, but the cruft and chaos, plus the change in resolutions, keep latency around the same value. It's sometimes even worse. Psychologically, when I boot a 64 MB Windows machine and can enjoy Word / IE / Winamp, I feel very weird (video could be offloaded to a Chromecast for that matter).


Except without interrupts, your computer would miss all the keystrokes. :-)


That's so deep, it hit me in the IDT


Why I refuse to indulge in the latest eye candy and bling from the desktop world if possible.

If I didn't need a GPU to do window switching back then, why do I need it now?


FVWM and VTWM switch windows and workspaces instantaneously on my crappy netbook with just the EFI framebuffer.

I always laugh a little inside watching people try to do the same on Windows 10 and OS X with all the hardware acceleration and waiting for multiple seconds.


To save battery power.


Then stop drawing all the eyecandy in the first place.


It's all just trade offs. However fast or powerful your machine is, software will use as much of that resource as possible, up to the point where it occasionally interferes with input (but not too much or you'll switch to something else).


But why? What good is an operating system on a multi-core device that allows anything to get that close to the performance envelope? This is a fine example of competition driving change for change's sake rather than real innovation and everything ending up worse as a result. I like new features as much as the next person, but not when they compromise core functionality. Not being able to type is inexcusable.


I agree. However at this point, I don't see anything an OS can do to help.

I see plenty of typing slowdowns every other day now. But I'm not sure just how many of them are the OS's fault. When your typing seems to lag, there are two places that can be slowing it down - the input side (reacting to hardware events) and the output side (drawing and updating the UI).

I suppose keyboard buffers are pretty well isolated, and native UI controls tend to work fine too. The problem is, everyone now goes for non-native controls. You type in e.g. Firefox, and it is slow not because of your OS, but because Firefox does all its UI drawing by itself. And God help you if the application you want to use is done in Electron. There are so many layers of non-nativeness on top of that that the OS has close to zero say in what's being done. There's no way to help that - resource quotas will only make the problem worse, and giving such a program free rein will only take everything else down.

All in all, it's just - again - problem of people writing shitty software, because of laziness and time-to-market reasons. Blame "Worse is Better".


Agreed, abstractions seem to be exploding these days and I'm not even sure we are at the end of the road yet! Linux and Windows never had any trouble with essentially realtime keyboard feedback in their terminal windows. It's not the OS.


IntelliJ still freezes while indexing on my work desktop, which has 8 cores and 16 threads; at least they finally allowed pause/resume for that, so that's a win.


It's hard to believe the OS couldn't help when Windows 10 has the problem but Windows 7 doesn't.


In this particular case, yes. But this subthread was about a more general principle of letting an app exhaust the system's performance. What I'm saying is that, when facing a crappily coded app, the OS can at best choose between letting it suck or letting the performance of everything suck.


You have alternatives though. I've been using basically the same linux environment for about a decade now.

I don't have a proper desktop environment like Gnome or KDE, just Xorg and StumpWM as a window manager. Then I have Firefox, Emacs and urxvt running tmux. I use a handful of GTK applications when the need arises like Gimp, Inkscape, Evince and maybe a couple others. Done.

It boots up in a few seconds from a SSD, it's always snappy. It worked fine on a core2duo and HDD 10 years ago, it works even better on an i5 and SSD now.


Yeah I have a Linux environment on a keychain USB device that I now use often enough to think seriously about abandoning Windows (though I don't hate Win as such, but kept using it because of some applications I relied upon).

Linux has (sometimes) had sort of the opposite problem, insufficient innovation in user interface design. I'm looking forward to Gnome 3 now; I felt that when CSS took over the web a lot of UI innovation moved to the server end and stalled at the client (think the really long hiatus in the development of Enlightenment, which was at one time the cutting edge of UI design/customizability while still being fast and responsive).

If you want ideas for where Linux capabilities should be going, please go check out Flowstone, which I think is criminally under-appreciated. The current version uses Ruby, but previous incarnations allowed you to deploy code in C or assembler(!) within the visual programming environment. It's doin' me a heckin' confuse that this isn't a standard development environment option for everything from shell scripts to large-scale applications. Once you go to flow-based programming, text-only IDEs look masochistic and pointless, and text-only is a terrible way to teach programming to people because discovery and syntax are inaccessible and really better done by computers. I like NoFlo for JS development, but the Linux desktop is crying out to be brought into a flow-based paradigm.

Sorry about going a bit off-topic but when I see exhortations to go with extremely simple solutions like StumpWM I have the opposite-but-similar reaction to the OP: why am I running some beast of a computer (at least by historical standards) so I can have a 20 year old user interface? Surely there is some middle ground between cancerous levels of abstraction/feature creep and monk-like asceticism.


That's better rewritten as "however fast or powerful your machine is, software will waste as much of that resource as possible".


"...if you let it". Some of us have decided that the value of marginal eye candy et al isn't worth interrupting our UI flow. I suspect that many people would have decided the same if most modern OSes weren't built around removing so much control from the user.


There's a talk by Guy Steele, 'Growing a Language', where he says that ideally a language should shrink as it gets smarter semantics.


I remember writing C code on my Commodore 64 in my early teens using Power C from Spinnaker Software. If memory serves I had to put in 3 different 5 1/4" disks to compile / link. There were compiler 1, compiler 2, and a linker diskette.


Plus ça change, plus c'est la même chose...


"The more it changes, the more it’s the same thing." [0]

[0]: https://en.wiktionary.org/wiki/plus_%C3%A7a_change,_plus_c%2...


The common idiomatic translation is "the more things change, the more they stay the same."


It's the same meaning but the tone seems completely different to me.


It does?


For English the equivalent meaning would be, "the more things change, the more they remain the same".

I don't think I've ever heard the literal translation in English.


Weird that no-one here is mentioning how Windows is the only platform that still has this problem.

UI threads on a graphical desktop should always be the most privileged processes.


Your Commodore 64 had one hyper-thread? Wow, that's forward thinking.



