
From the technology point of view, running-real-software-as-JS-in-your-browser is always impressive and entertaining to watch. Cheers to the developers for such fun projects!

When we talk about the efficiency of (other) Electron apps made for productive use… the web page says this project runs Doom (cool) at a resolution of 640x480. So Electron cripples my 2022 machine down to something like a 100 MHz Pentium. That is totally fine for a fun tech demo, but I hate needing to use such programs for my daily work.

But sorry, Pentium, I have to take that "crippling" remark back: I had chat apps in 1998 with much better responsiveness and faster start-up times than the Electron chat apps I have to use for work today.

You, dear reader, may have already guessed my point. Today, software development is often done with many layers of bloat and glue, just to save some developer time and required expertise in the short term. The price we pay for this is a tremendous waste of resources.



> just to save some developer time

That's actually the #1 reason to use any of these bloated development environments, and it's also why they're so popular. But I agree. Here we are, getting angry at Bitcoin because it uses so much energy, but if I had to guess: could it be that Discord and Slack _alone_ use more energy world-wide than the entire Bitcoin network? They're running 100% of the time on most PCs.


> But I agree. Here we are, getting angry at Bitcoin because it uses so much energy, but if I had to guess: could it be that Discord and Slack _alone_ use more energy world-wide than the entire Bitcoin network? They're running 100% of the time on most PCs.

No. The answer is clearly no.

Let's do some quick back-of-the-napkin math to see why...

Discord reports ~150 million monthly active users. Let's assume they're all online all the time.

Let's assume most of these users are on hardware running in the 50-watt range (very high for mobile, high for a laptop, lowish for a desktop). Further, let's assume that Discord uses ALL of it (and we know this is false; it's likely under 5%).

So we have 150 million machines using 50 watts each. That's ~7,500 megawatts of power for Discord.

Now we run the same numbers for Slack. We'll keep the hardware estimate the same, but use the latest Slack figure of ~12 million active users.

So we have 12 million machines using 50 watts each. That's ~600 megawatts of power for Slack.

Combined, assuming Discord and Slack use the ENTIRE power budget of the machines they run on, they represent about 8,100 megawatts of power draw.

The best number I can find for Bitcoin puts its usage at about 150 terawatt-hours a year.

If we take our 8,100 megawatts and run them for a year, we get... drumroll: about 71 terawatt-hours of annual consumption (8,100 MW × 8,760 h ≈ 70.96 TWh).

That's roughly half the power consumption of Bitcoin, even when we assume Discord and Slack consume the ENTIRE power budget of the machines they're running on and that they're running literally all the time.

A more realistic guess might be that Slack and Discord use roughly 1% of the electricity that Bitcoin does. And personally, I think they're providing a boatload more value for the watts.

Fair warning: all my numbers are back-of-the-envelope, and I did the math quickly and without paper. Feel free to point out if I missed something or dropped a step in a calculation.
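If anyone wants to poke at the arithmetic, here is the same estimate written out as a small Python sketch. Every input is the same rough guess as above (150M Discord users, 12M Slack users, 50 W per machine, ~150 TWh/year for Bitcoin), not measured data:

    # Back-of-the-envelope: Discord + Slack vs. Bitcoin energy use.
    # Every input is a rough guess copied from the comment above.
    DISCORD_USERS = 150e6      # monthly active users, assumed online all the time
    SLACK_USERS = 12e6         # monthly active users, assumed online all the time
    WATTS_PER_MACHINE = 50     # whole machine's power budget attributed to the app
    HOURS_PER_YEAR = 24 * 365  # 8,760

    discord_mw = DISCORD_USERS * WATTS_PER_MACHINE / 1e6   # ~7,500 MW
    slack_mw = SLACK_USERS * WATTS_PER_MACHINE / 1e6       # ~600 MW
    combined_mw = discord_mw + slack_mw                    # ~8,100 MW

    combined_twh = combined_mw * HOURS_PER_YEAR / 1e6      # MWh -> TWh, ~71 TWh/year
    BITCOIN_TWH = 150                                      # rough published estimate

    print(f"Combined draw: {combined_mw:,.0f} MW -> {combined_twh:.1f} TWh/year")
    print(f"Fraction of Bitcoin's ~{BITCOIN_TWH} TWh: {combined_twh / BITCOIN_TWH:.2f}")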


Agreed, Discord & Slack alone are probably not the culprits. But if you expand this to all existing computers (including datacenters) and all the programs using unnecessary abstraction layers, I wouldn't be surprised if we heavily surpassed Bitcoin.

All that to say that IMO there is a fairly big incentive to rethink our software stack. Part of the issue is that it is hard to optimize software for every platform and hardware combination, which creates the need for abstraction. I wonder how this would all look if we suddenly stopped developing new hardware and put everyone on the same page: would software slowly become more efficient?


I think it would be far worse if we stopped developing new hardware.

The focus on mobile devices, and battery life in particular, has done a tremendous amount to reduce power consumption.

A typical desktop computer from the 2000s could easily use 1,000 W. Some of the Pentium 4 variants burned through more than 100 W alone, not counting the rest of the hardware in the machine or the monitor.

A modern laptop will usually come in under 50 W, and can usually idle at under 10 W.

Servers 20 years ago were mostly big, power-hungry machines running in closets somewhere; now they're heavily power-optimized machines running in AWS/Azure/GCP/other data centers.

CPU cycles have gotten ridiculously cheap (both monetarily and in terms of power usage). Single-threaded performance hasn't gotten all that much better.

The market is already optimizing for reducing power usage, because power isn't free and because batteries are a constraining factor in mobile devices.


I know, my point was about how software developers could be forced to make more efficient software instead of relying on more and more powerful hardware.

Also, how many CPU generations back could we go and still match our current experience, if applications were more optimized (or simply less bloated)?


And is the juice even worth the squeeze? We're using astronomically more computing power to enable developer conveniences that are (however you define or measure it) not astronomical improvements.

I've heard the developer time argument since I started coding in the 90s and have been skeptical of it just as long. I think it's a post hoc justification for practices we already do.

Software development is partly fad and fashion driven, is passed on by imitation, and code is often easier to write than to read. Almost all of the effort has gone towards expanding the numerator in that write/read ratio, and there's neither sufficient incentive to make tools that cut out layers nor widely accepted practices for doing so, comparable to those we have for adding layers.

All this adds up to a strong entropic gradient toward software inefficiency, independent of whether or not it actually makes developers more efficient, but leading to a worse experience for users either way. Bucking the trend to make software more efficient can also improve developer time, as in Linus Torvalds' argument for why it was worthwhile to make git fast. I contend that the justification that producing bloated software saves developer time only holds when selectively applied.

My axe to grind here: I question whether we wouldn't have seen similar improvements if we'd put the same effort into better tooling, education, and API design for native (and/or compiled) software ecosystems that we put into things-we-use-because-they-are-there (whether or not they're the best tool for the job in a broader sense).


Exactly this.

I have a brand-new laptop, and I could swear that any chat app I used 15 years ago was more responsive than MS Teams is now.

Everything is delayed and "not immediate", while 15 years ago it was immediate in Skype!

How much power is being wasted and how much CO2 generated globally because of this?


Microsoft Teams is a disaster, and its slowness is due not to Electron but to Angular.


Two crème de la crèmes of modern software.


Not really; AngularJS is obsolete technology.


The new Angular, which is based on TypeScript, is not.


Computing resources were scarce, which made us better code-writers. We had to think about this. Scarcity leads to efficiency. Abundance leads to complacency.

The new generation just assumes that all database/network calls and UI plumbing are magical and free.


Bitcoin / Ethereum mining will peg whatever device performs the calculation, whether an ASIC, CPU, or GPU, at 200 watts or more. Discord and Slack don't even peak a CPU or GPU while they're running, and they don't run 24 hours a day.


Then again, they run on orders of magnitude more computers.


WARNING: lots of handwaving arithmetic, but in the spirit of "if it's worth doing, it's worth doing with made-up statistics" (https://slatestarcodex.com/2013/05/02/if-its-worth-doing-its...)

I was curious, so I checked on my computer. I don't have Discord, but I do have Slack on and use it all the time (calls, screen sharing, the whole shebang). All Slack-related processes in total have used about 0.6% of my total CPU time (a bit of a high estimate because of how that calculation accounts for things like wakeups, but let's go with it). My computer uses probably about 15 watts on average as a high-end estimate. That gets us to about 0.09, call it 0.1, watts on average. My computer is on for about half the day, so let's say 0.05 watts.

If we consider Slack's current active user count (it looks like ~10 million?), that gets us about 500 kilowatts. Most sources seem to agree that Bitcoin's power usage is around 10 gigawatts, so based on Slack's current usage stats I'm guessing there are around 4-5 orders of magnitude of difference between Slack's power usage and Bitcoin's.
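Writing the same hand-waving out as a tiny Python sketch (all inputs are the guesses above: 0.6% CPU share, ~15 W average draw, machine on half the day, ~10M users, Bitcoin at ~10 GW):

    # Per-user Slack power guess, scaled up to the user base.
    # Every number is either a rough local observation or a guess from above.
    import math

    slack_cpu_share = 0.006    # ~0.6% of total CPU time used by Slack processes
    avg_machine_watts = 15     # high-end average draw for this machine
    duty_cycle = 0.5           # machine is on for about half the day

    watts_per_user = slack_cpu_share * avg_machine_watts * duty_cycle  # ~0.045 W
    users = 10e6                                                       # ~10M active

    slack_watts = watts_per_user * users   # ~450 kW (the ~500 kW above, rounded)
    bitcoin_watts = 10e9                   # ~10 GW estimate for Bitcoin

    print(f"Slack fleet: ~{slack_watts / 1e3:.0f} kW")
    print(f"Orders of magnitude below Bitcoin: {math.log10(bitcoin_watts / slack_watts):.1f}")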


ASICs can pull 3 kilowatts. A MacBook Air M1 pulls 15 watts, and a gaming computer is around 750 watts.


The difference is that they handle something like 10,000,000x more transactions per second than Bitcoin. People are able to talk, have multi-person voice and video conversations, even run entire communities. On blockchains, no matter how many nodes join, the capacity stays the same. In fact, it's the only tech in history that promises to become slower and more expensive over time.


On Discord or Slack I can copy a screenshot and send it to loads of people with a simple Ctrl-V. I don't have to manually upload it to some random image-hosting website and copy and paste links that usually lead to a page with a thousand ads.

I can join a voice channel seamlessly, talk to my friends, listen to music together, stream my screen, watch others streaming their screen, build custom interactive bots that manage rich media, send messages from my apps to a channel with webhooks which I can create in seconds, see what other people are playing or listening to, reply with custom emotes, create threads, add syntax highlighted code blocks, etc. I could go on for an hour.

None of the old-school 1998 IRC channels had these capabilities, and I'm sick of everyone on HN pretending it's the same thing. If it were, people would use it.
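As an aside, the webhook part really is just a single HTTP POST. A minimal Python sketch, with a placeholder URL where the one Discord generates for your channel would go:

    # Post a message to a Discord channel via an incoming webhook.
    # The URL is a placeholder; Discord generates the real one when you create
    # a webhook in the channel's integration settings.
    import json
    import urllib.request

    WEBHOOK_URL = "https://discord.com/api/webhooks/<id>/<token>"  # placeholder

    payload = json.dumps({"content": "hello from a script"}).encode()
    req = urllib.request.Request(
        WEBHOOK_URL,
        data=payload,
        headers={"Content-Type": "application/json", "User-Agent": "webhook-demo"},
    )
    urllib.request.urlopen(req)  # Discord replies 204 No Content on success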


You can do those things in Telegram, which is fast and efficient and built with Qt.


You are quoting it incorrectly. The web page says: "I can recommend that you switch to a resolution of 640x480 @ 256 colors before starting DOS games - just like in the good ol' days."

Maybe I am nitpicking, but it also runs Doom at 1024x768 or higher. He just recommends the lower resolution for the "good ol' days" feel.


I think you might be misunderstanding what this post is about.

That version of Doom is not running on Electron directly. It is running on v86, an x86 emulator running in WebAssembly, so essentially the entire graphics stack of the game and the entire OS it's running in are emulated.

The resulting performance characteristics have absolutely nothing to do with a normal Electron app.

> When we talk about the efficiency of (other) Electron apps made for productive use

Other Electron apps for productive use are what the post is specifically NOT about. I find it highly irritating when comments like yours are used to inject the never-ending and frankly tiring Electron-performance discussion into anything Electron-related, drowning out discussion that would actually be relevant and on topic.


I can see the mountains of coal being used to power Electron apps in my head…


It's all super fuzzy. Let's say I want to build a native app instead. I start Visual Studio, which doesn't feel faster than Electron, then I start a new C# project, which doesn't feel faster than Electron. But wait... C# is also not native code, it runs on some sort of VM, so I should switch to C++. But that supposedly isn't much faster, it uses the same GUI controls and libraries that C# does, and it also doesn't feel any faster.

How do you get so much better than Electron that it matters?


> I start Visual Studio, which doesn't feel faster than Electron

Well... Visual Studio is part of the horribly-written-software collection.

Here's an example of actually good software: https://www.youtube.com/watch?v=r9eQth4Q5jg&t=2s


Looks fast, but that's just a debugger. Do you know of a similarly fast native IDE that makes fast native programs?


There are no fast native IDEs, which is why I just use vim. Fast native programs are up to the programmer, not necessarily the tools.


I wonder why that is.


C# can easily be compiled to native code. JIT compilation is just a convenience that helps with development; you don't have to use it in production.


> C# is also not native code, it runs on some sort of VM

This is not true. .NET code is JIT-compiled into native code.


I think code running on the JVM is JIT-compiled too. But it's still a VM.


Can experts weigh in on the feasibility of Vala/Lua?


> Electron cripples my 2022 machine down to something like a 100 MHz Pentium

Yes, every Electron app you use runs x86 emulation inside. Very observant of you to notice this dirty secret.



