Of course they aren't, but most people aren't like Drew DeVault either. Most people don't use ancient hardware to prove some sort of point. The "Drew DeVault will approve of it" argument generally doesn't show up in a business case.
In a perverse way, I am glad that programmers come up with better and better ways to waste hardware resources. In doing so, they ensure the continued progress of the semiconductor (and battery) industry through mass market demand.
I do not believe it is a coincidence that Moore's law slowed significantly right around the time computers became "good enough" for the everyday user. If it weren't for gamers spending hundreds of dollars on pushing more pixels on-screen, we'd be way behind on deep learning.
I've purchased a number of computers over the last several decades, yet using Facebook/Slack is still slower and less responsive on a computer with a Core i7 processor, 16 GB of system memory, and a 100 Mbps downstream connection than using Usenet/IRC was on a 486 with 16 MB of system memory and a 28.8 kbps downstream connection.
Buy a new computer, Drew.