
I remember when power wasn't 51c/kWh and a powerful PC didn't suck down >1000W. I know we get way more perf/watt these days, but as a user I don't feel that translates into a notably better experience. For most things at $work, Slack, Spotify, GMail, Google Docs, and VSCode feel a lot less responsive than irssi, Winamp, Sylpheed, OpenOffice, and $ide_of_choice did a decade or two ago.

What Dr. Lisa Su giveth, Satya taketh away I suppose.




We also used to write code in lower-level languages, and system designs were simpler than our current approach: writing a lot of stuff in JS, using Electron everywhere, carrying a huge dependency tree, and in general optimizing for fast development rather than efficient runtime.

Having said that, optimizing for perf is rarely worth it because people don't usually care. When it is, you can often make the biggest difference by figuring out why some step requires O(n^3) instead of O(n) like it should and fixing that, rather than rewriting the entire stack to get some 10-20% improvement...
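To make that concrete: the usual culprit is an accidental superlinear loop, and the fix is a data-structure swap, not a rewrite. A minimal sketch in Python (the function names are hypothetical, and it shows the O(n^2) -> O(n) version of the same idea):

    # Hypothetical example: deduplicating items while preserving order.
    # None of this is from a real codebase; it illustrates the pattern.

    def dedupe_slow(items):
        seen = []                  # list membership check is O(n)
        out = []
        for item in items:         # n iterations -> O(n^2) overall
            if item not in seen:
                seen.append(item)
                out.append(item)
        return out

    def dedupe_fast(items):
        seen = set()               # set membership check is O(1) on average
        out = []
        for item in items:         # n iterations -> O(n) overall
            if item not in seen:
                seen.add(item)
                out.append(item)
        return out

    assert dedupe_slow(list("mississippi")) == dedupe_fast(list("mississippi")) == list("misp")

Same output, same code shape, and the speedup on large inputs dwarfs what a full native rewrite would buy.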


Someone should produce an accurate figure for compute power usage, including things such as cars or media centres, and then try to analyse how the existing trend of "wasteful development" affects it.

I bet all the upset over Bitcoin power usage concerns a tiny thing compared to, for example, the whole Node ecosystem.

Even something such as Gmail is massively overengineered. Heck, there are very few improvements over the interface of a decade ago, yet I bet it used 5% of the power it does now.

Yet think of how much Gmail is used.

And it's not just raw compute: 3D on a page for no reason, massive JS bundles for no reason, i.e. wasted RAM and bandwidth.

A modern phone could last weeks as a browsing device if it were looking at static HTML.

But all of this requires more development skill, less fluff and junk, care for the size of page loads, and so on.

Put another way, people concerned about power usage should realise that using Node means burning more coal/NG. Literally.

Because that's where the excess power comes from.


Hit the nail on the head. Inefficiency, which has reached gargantuan proportions and then blown past them, doesn't just mean page loads so slow that a news article takes upwards of 20s on a pocket supercomputer (whereas it took barely a second or two on a Pentium II over a 56kbps connection), all the while text and buttons jump around as assets load. It also means very real waste of energy and battery degradation.


I mostly agree with you, however:

> A modern phone could last weeks as a browsing device if it were looking at static HTML

That's what I do. Weeks is a bit of a stretch; maybe one, if screen time is limited. That said, Discord (the website) drains my laptop battery almost as fast as gaming does. They spawn a separate WebGL context for each animated emoticon. Crazy, crazy thing.

> Put another way, people concerned about power usage should realise that using Node means burning more coal/NG. Literally.

Sure. At the same time, though, a single car trip to the movies burns orders of magnitude more energy than all your gadgets running JS for a year.
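Back-of-envelope, in case anyone wants to check the claim; every figure below is an assumption you can swap out:

    # Rough energy comparison; all numbers are assumptions.
    gasoline_kwh_per_litre = 9.5                          # ~34 MJ/L of chemical energy
    trip_kwh = (30 / 100) * 8 * gasoline_kwh_per_litre    # 30 km round trip at 8 L/100 km

    phone_battery_kwh = 0.015     # ~15 Wh battery, one full charge per day
    js_overhead_share = 0.2       # guess: JS bloat eats 20% of each charge
    js_kwh_per_year = phone_battery_kwh * 365 * js_overhead_share

    print(f"one car trip:        {trip_kwh:.1f} kWh")        # ~22.8 kWh
    print(f"a year of phone JS:  {js_kwh_per_year:.1f} kWh") # ~1.1 kWh

On those guesses the trip is roughly 20x a phone's annual JS overhead; add a laptop and the gap narrows, so "orders of magnitude" depends on which gadgets you count.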


Shit like Slack and other Electron abominations are not just a 10-20% improvement away from native. We are talking about a simple chat application, glorified IRC that loads images and previews, and it somehow takes more memory and CPU than 100 times the total resources I had in 1995.

These systems are so abominable, and people have just gotten used to them; hell, many users have never even tried anything else. But saying 10-20% is just ridiculous.


Man, I want a native TUI client for Teams.



Make a Windows XP VM, install some of those old versions, and see how fast they run on modern hardware (even while emulating the entire PC to run them!). Many modern devs choose ease of development over end-user efficiency/performance.


51c/kWh?? I pay 20c in Spain, even now with the energy crisis.

I think back in those days the price was around 10c.

I know these are euro cents, but the euro-dollar exchange rate isn't far enough from parity to make a huge difference.


> a powerful PC didn't suck down >1000W

My Ryzen Threadripper 2950X, which isn't a top-of-the-line system at this point but is still a 16-core beast-ish machine, sucks down (much) less than 400W.


My 3990X is almost 500W. My last dual-socket Xeon workstation was closer to 750W.

That's before you add a GPU. My 3080 uses 350W under load.


Are these power figures measured at the wall? If so, both are astonishing. My file server with an old Xeon, 16GB RAM, and 5x 7200RPM HDDs uses about 95W when idle and about 150W when I throw something at it that works the drives. My desktop with an i7-4770K, 32GB RAM, some SSDs, and a single spinner is about the same. IIRC, if I push something compute-intensive at it, it can break 200W.

OTOH, a Pi 4B with two 7200RPM HDDs uses about 25W, and another with an SSD uses about 7W.


I don't have a Kill A Watt meter handy anymore, but I do have a smart meter. Guesstimating with that, I hit ~850W with my Threadripper when compiling my work project and running a game at the same time, and about 650W when _just_ compiling (so presumably the difference is the GPU being under full load).

They're obviously extreme examples, but on the flip side, I've got an M1 MacBook Pro which lasts an entire day on a single battery charge, including running Docker, Slack, and some "mild" compiling (not large C++ projects). I'm not sure how to measure its energy consumption on battery, but suffice it to say it's "very low".
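One way to put a number on "very low" without a meter is to divide battery capacity by observed runtime (macOS also ships a powermetrics tool that reports live CPU package power, if you want more detail). A sketch with assumed inputs:

    # Estimate average power draw from battery capacity and runtime.
    # Both inputs are assumptions; check your machine's spec sheet.
    battery_wh = 70    # assumed ~70 Wh battery, ballpark for an M1 MacBook Pro
    runtime_h = 10     # "lasts an entire day" read as a 10-hour workday

    avg_draw_w = battery_wh / runtime_h
    print(f"average draw: {avg_draw_w:.0f} W")   # ~7 W, vs ~650-850 W above

That puts it two orders of magnitude under the desktop numbers in this thread.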


Power is around $0.10/kWh for residential here, even lower for industrial.


Power prices vary dramatically within the United States, and are even higher in most of the rest of the world.

Here are the prices for the United States - https://www.eia.gov/electricity/monthly/epm_table_grapher.ph...

Even these data run noticeably low for my area, where a _massive_ rate hike in January brought total electricity prices (generation + delivery), without an alternative supplier, up to around $0.48/kWh.


Big cities in the US tend to have very expensive power ($0.50/kWh or higher), while rural areas can have nearly free power, particularly if you are near a dam or a nuclear power plant. The price you're quoting is definitely not a big city price.


It's closer to 50c/kWh here in the UK [0]

[0] https://www.gov.uk/government/publications/energy-bills-supp...


I'd pay €0.77 per kWh if the government hadn't introduced a €0.40 per kWh price cap.



