From a technology point of view, running real software as JS in your browser is always impressive and entertaining to watch. Cheers to the developers for such fun projects!
When we talk about the efficiency of (other) Electron apps made for productive use… the web page says this project runs Doom (cool) at a resolution of 640x480. So Electron cripples my 2022 machine down to something like a 100 MHz Pentium. That is totally fine for a funny tech demo, but I hate having to use such programs for my daily work.
But, sorry, Pentium, I take that "crippling" back. I had chat apps in 1998 that were far more responsive and started up faster than the Electron chat apps I have to use for work today.
You, dear reader, may have already guessed my point here. Today, software development is often done with many layers of bloat and glue, just to save some developer time and required developer expertise in the short term. The price we pay for this is a tremendous waste of resources.
That's actually the #1 reason to use any of these bloated development environments. And it's also the reason they're so popular. But I agree: here we are, getting angry at Bitcoin because it uses so much energy, but if I had to guess, could it be that Discord and Slack _alone_ use more energy worldwide than the entire Bitcoin network? They're running 100% of the time on most PCs.
> But I agree: here we are, getting angry at Bitcoin because it uses so much energy, but if I had to guess, could it be that Discord and Slack _alone_ use more energy worldwide than the entire Bitcoin network? They're running 100% of the time on most PCs.
No. The answer is clearly no.
Let's do some quick back-of-the-napkin math to see why...
Discord reports ~150 million monthly active users. Let's assume they're all online all the time.
Let's assume most of these users are on hardware running in the 50-watt range (very high for mobile, high for a laptop, lowish for desktop). Further, let's assume that Discord uses ALL of it (and we know this is false; it's likely under 5%).
So we have 150 million machines, using 50 watts. This is ~7500 megawatts of power for Discord.
Now we run the same numbers for Slack. We'll keep the hardware estimates the same, but use the latest Slack numbers of ~12 million active users.
So we have 12 million machines, using 50 watts. This is ~600 megawatts of power for Slack.
So combined, for Discord and Slack, assuming they use the ENTIRE power budget of the machines they run on, they represent about 8100 megawatts of power usage.
The best number I can find for Bitcoin puts its usage at about 150 terawatt-hours a year.
If we take our 8100 megawatts and run them for a year, we get... drumroll: 70.956 terawatt-hours of annual consumption.
Or roughly half the power consumption of Bitcoin, even when we assume Discord and Slack are literally consuming the ENTIRE power budget of the machines they're running on, and that they're running literally all the time.
A more realistic guess might be that Slack and Discord use roughly 1% of the electricity that Bitcoin uses. And personally, I think they're providing a boat-load more value for the watts.
Fair caution: all my numbers are back-of-the-envelope, and I did the math quickly and without paper. Feel free to point out if I missed something or dropped a step in a calculation.
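For anyone who wants to poke at the assumptions, here is the same napkin math as a few lines of Python. The user counts, the 50-watt figure, and the Bitcoin estimate are the rough guesses from above, not measured values:

```python
HOURS_PER_YEAR = 24 * 365  # 8760

# Rough guesses from the comment above, not measured values.
discord_users = 150e6      # monthly active users, assumed always online
slack_users = 12e6
watts_per_machine = 50     # generous: assume the app eats the whole machine's budget

total_watts = (discord_users + slack_users) * watts_per_machine
annual_twh = total_watts * HOURS_PER_YEAR / 1e12

bitcoin_twh = 150          # commonly cited estimate, TWh/year

print(f"Combined draw:    ~{total_watts / 1e6:,.0f} MW")     # ~8,100 MW
print(f"Annual energy:    ~{annual_twh:.1f} TWh")            # ~71 TWh
print(f"Share of Bitcoin: {annual_twh / bitcoin_twh:.0%}")   # ~47%
```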
Agreed that Discord & Slack alone are probably not the sole culprits. But if you expand this to all existing computers (including datacenters) and programs using unnecessary abstraction layers, I wouldn't be surprised if we heavily surpassed Bitcoin.
All that to say that IMO there is a fairly big incentive to rethink our software stack. Part of the issue is that it is hard to optimize software for every platform and piece of hardware, which creates the need for abstraction. I wonder how this would all look if we suddenly stopped developing new hardware and put everyone on the same page: would software slowly become more efficient?
I think it would be far worse if we stopped developing new hardware.
The focus on mobile devices, and battery life in particular, has done a tremendous amount to reduce power consumption.
A typical desktop computer from the 2000s could easily use 1000 W. Some of the Pentium 4 variants burned through more than 100 W alone, not including the rest of the hardware in the machine or the monitor.
A modern laptop will usually come in under 50 W, and can usually idle below 10 W.
Servers 20 years ago were mostly big, power-hungry machines running in closets somewhere; now they're heavily power-optimized machines running in AWS/Azure/GCP/other providers' data centers.
CPU cycles have gotten ridiculously cheap (both monetarily and in terms of power usage). Single-threaded performance hasn't gotten all that much better.
The market is already optimizing for reducing power usage, because power isn't free and because batteries are a constraining factor in mobile devices.
I know, my point was about how software developers could be forced to make more efficient software instead of relying on more and more powerful hardware.
Also, how many CPU generations back could we go and still match today's experience if applications were more optimized (or simply less bloated)?
And is the juice even worth the squeeze? We're using astronomically more computing power to enable developer conveniences that are (however you define or measure it) not astronomical improvements.
I've heard the developer time argument since I started coding in the 90s and have been skeptical of it just as long. I think it's a post hoc justification for practices we already do.
Software development is partly fad and fashion driven, is passed on by imitation, and code is often easier to write than to read. Almost all of the effort has gone towards expanding the numerator in that write/read ratio, and there's neither sufficient incentive to making tools to cut out layers nor widely accepted practices for how that works comparable to those we have for adding layers.
All this adds up to a strong entropic gradient toward software inefficiency, independent of whether or not it actually makes developers more efficient, but leading to a worse experience for users either way. Bucking the trend to make software more efficient can also improve developer time, as in Linus Torvalds' argument for why it was worthwhile to make Git fast. I contend the justification that producing bloated software saves developer time is only true when selectively applied.
My axe to grind here is that I question whether we wouldn't have seen similar improvements if we'd put the same effort into tooling, education, and API design for native (and/or compiled) software ecosystems that we put towards things-we-use-because-they-are-there (whether or not they're the best tool for the job in a broader sense).
Computing resources were scarce, which made us better code-writers. We had to think about this. Scarcity leads to efficiency. Abundance leads to complacency.
The new generation just assumes that all database/network calls and UI plumbing is magical and free.
Bitcoin / Ethereum mining will max out whatever device performs the calculation, whether an ASIC, CPU, or GPU, at 200 watts or more. Discord and Slack don't even peak the CPU or GPU while they're running, and they don't run 24 hours a day.
I was curious so I checked on my computer. I don't have Discord, but I do have Slack on and use it all the time (calls, screen sharing, the whole shebang). All Slack-related processes in total have used about 0.6% of my total CPU time (a bit of a high estimate because of how that calculation takes into account things like wakeup, but let's go with it). My computer uses probably about 15 watts on average as a high-end estimate. That gets us to about 0.09, say 0.1 watts on average. My computer is on for about half the day, so let's say 0.05 watts.
If we consider Slack's current active user stats (it looks like 10 million?) that gets us 500 kilowatts. It seems like most sources agree that Bitcoin's power usage is probably around 10 gigawatts. So based on Slack's current usage stats I'm guessing there's around 4-5 orders of magnitude in difference between Slack power usage and Bitcoin power usage.
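Same napkin-math style, written out so each step is easy to poke at. The CPU share, wattage, duty cycle, and user count below are my rough local observations and guesses, not official figures:

```python
# Rough local observations / guesses, not official figures.
cpu_share = 0.006          # ~0.6% of total CPU time attributed to Slack processes
avg_machine_watts = 15     # high-end estimate of my machine's average draw
duty_cycle = 0.5           # machine is on roughly half the day
slack_users = 10e6         # rough current active-user figure

watts_per_user = cpu_share * avg_machine_watts * duty_cycle   # ~0.045 W
slack_total_watts = watts_per_user * slack_users              # ~450 kW

bitcoin_watts = 10e9       # ~10 GW, a commonly cited estimate

print(f"Slack, worldwide:      ~{slack_total_watts / 1e3:.0f} kW")
# A bit over 20,000x, i.e. roughly 4-5 orders of magnitude:
print(f"Bitcoin / Slack ratio: ~{bitcoin_watts / slack_total_watts:,.0f}x")
```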
The difference is that they handle something like 10,000,000x more transactions per second than Bitcoin. People are able to talk, have multi-person voice and video conversations, even entire communities. On blockchains, no matter how many nodes join, it's the same capacity. In fact, it's the only tech in history that promises to become slower and more expensive with time.
On Discord or Slack I can copy a screenshot and send it to loads of people with a simple Ctrl-V; I don't have to manually upload it to some random image-host website and copy and paste links that usually lead to a page with a thousand ads.
I can join a voice channel seamlessly, talk to my friends, listen to music together, stream my screen, watch others streaming their screen, build custom interactive bots that manage rich media, send messages from my apps to a channel with webhooks which I can create in seconds, see what other people are playing or listening to, reply with custom emotes, create threads, add syntax highlighted code blocks, etc. I could go on for an hour.
None of the old-school 1998 IRC channels had these capabilities, and I'm sick of everyone on HN pretending it's the same thing. If it were, people would use it.
You are quoting it incorrectly. The web page says:
"I can recommend that you switch to a resolution of 640x480 @ 256 colors before starting DOS games - just like in the good ol' days."
Maybe I am nitpicking, but it also runs Doom at 1024x768 or higher. He just recommends 640x480 for the "good old days".
I think you might be misunderstanding what this post is about.
That version of Doom is not running on Electron directly. It is running on v86, an x86 emulator that runs via WASM, so essentially the entire graphics stack of the game, and the entire OS it's running in, is emulated.
The resulting performance characteristics have absolutely nothing to do with a normal Electron app.
> When we talk about efficiency of (other) electron apps made for productive use
Other Electron apps for productive use are what the post is specifically NOT about. I find it highly irritating when comments like yours are used to inject the never-ending and frankly tiring Electron-performance discussion into anything Electron-related, drowning out discussion that would actually be relevant and on topic.
It's all super fuzzy. Let's say I want to build a native app instead. I start Visual Studio, which doesn't feel faster than Electron, then I start a new C# project, which doesn't feel faster than Electron. But wait... C# is also not native code; it runs on some sort of VM, so I should switch to C++. But that supposedly isn't much faster, it uses the same GUI controls and libraries that C# does, and it also doesn't feel any faster.
How to be so much better than Electron that it matters?
C# can easily be compiled to native code (e.g. with .NET's Native AOT). The JIT compilation is just a convenience that ultimately helps with development; you don't have to use it in production.
In the mid-2000s I used Lotus Sametime at work. We could send text, emoji, pictures, and more. It had a plugin system that would extend the client side to do things like automatically look up issue numbers found in the chat and expand them into hyperlinks with full descriptions.
It was cross platform with a native UI thanks to Eclipse RCP, Java, and SWT. We moaned at the time that it sucked up 200 megabytes of memory and had a long startup time.
I use Microsoft Teams daily and sometimes chuckle that in some areas we seem to only reinvent the wheel with dubious outcomes.
Just ran it to find out. On my machine, about 292 MiB, with the default machine, which seems to give about 128 MiB to the VM. That's measured using the sum of private working sets of all processes, since Electron/Chromium is multi-process.
This actually doesn't seem like such a bad outcome, though; I don't know how it will scale if you increase the amount of RAM available to the VM.
To be fair, there was no memory pressure, and as such nothing was swapped out. (I checked. The active private set was equal to the private set.)
Counting shared memory would not have dramatically changed the value, if I am not mistaken, but Windows Task Manager does not have a good built-in way to do so without counting the same shared bytes more than once. I do not believe it has a proportional set size option. Perhaps there is a nice third-party utility for it.
If it changes the order of magnitude of the memory usage I would be surprised since the size on disk of all of the binaries is less than the private set anyway, but if it does then I stand corrected.
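If anyone wants to reproduce the measurement outside Task Manager, a small psutil script can sum unique (private) memory across the app's processes. The process-name filter below is just an illustrative guess; adjust it to whatever the app's processes are actually called:

```python
import psutil

# Sum unique (private) memory across all processes whose name matches the app.
# "windows95" is an illustrative guess at the process name, not a known value.
TARGET = "windows95"

total_bytes = 0
for proc in psutil.process_iter(["name"]):
    try:
        if TARGET.lower() in (proc.info["name"] or "").lower():
            # uss ("unique set size") approximates the private working set.
            total_bytes += proc.memory_full_info().uss
    except (psutil.NoSuchProcess, psutil.AccessDenied):
        continue  # process exited or we lack permission; skip it

print(f"~{total_bytes / 2**20:.0f} MiB private across matching processes")
```

Note that uss ignores shared pages entirely rather than apportioning them, so it roughly matches the private-working-set approach above, not a proportional set size.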
Agreed, that doesn't seem too bad for the RAM they gave the system. Wonder how stable that number is when things are running? My assumption is it should be about the same, since Electron is running the emulation program regardless of what the guest OS does.
I know replying to this is wrong, but what the hell.
1. The underlying machine has 128 MiB of RAM, not 8 MiB. Windows 98 can run with 8 MiB of RAM, but that's not what they did here. My family had a Windows 9x computer when I still lived with my parents, and before we retired it, it had 512 MiB of RAM.
2. Dividing the memory usage by the guest machine's amount of memory does not seem like a good approach to calculating the overhead. There may be some per-byte-of-RAM overhead, but I bet it's more likely that the memory overhead does not scale much with guest machine RAM, since the guest RAM is probably just one giant ArrayBuffer.
3. Compared to what? I'm going to go out on a limb and say that most virtual machine software has RAM overhead of some kind, so we need to establish at least some reasonable baseline if we're really going to try to figure out how within reason this is.
(Expounding on the last point, people are happy to spend a lot of resources on emulation, certainly well exceeding "35x overhead" in some cases. And at that point, we are talking about overhead that would scale.)
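To make the arithmetic in point 2 concrete, here is the overhead computed against both candidate baselines, using the ~292 MiB figure measured above (the 8 MiB baseline is what the earlier "35x" framing appears to assume):

```python
measured_mib = 292       # private working set measured above
guest_mib_actual = 128   # RAM actually given to the VM by the default machine
guest_mib_assumed = 8    # baseline the "35x overhead" framing appears to assume

print(f"vs an 8 MiB guest:    ~{measured_mib / guest_mib_assumed:.1f}x")  # ~36x
print(f"vs the 128 MiB guest: ~{measured_mib / guest_mib_actual:.1f}x")   # ~2.3x
# If guest RAM is one big up-front buffer, the fixed overhead is closer to:
print(f"fixed overhead:       ~{measured_mib - guest_mib_actual} MiB")    # ~164 MiB
```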
I am pretty sure v86 allocates all of the memory ahead of time, using one big array buffer. It’d probably be easy to find the code.
However, that doesn’t mean that Chrome is actually allocating it all at once. It could very well be paging it in on demand. I have no idea.
In that case, to measure the overhead, you’d need to fill the guest memory.
I didn’t bother looking into it that deeply because I don’t really think it’s that interesting; an Electron x86 emulator is pretty impractical no matter how you shake it. It’s still pretty funny, though.
This is great, can you install other software on it?
If so, it would actually be quite useful in science, where old computers are kept around because of the old software needed to run old instruments. The old GC-FID I used during my PhD ran on Windows 95. If that software could run on Electron, labs could actually get some modern hardware to run those old machines.
In the README it says in a trolling way "should it have been a native app?" but actually it's not a bad concept at all.
The reason to use Electron is to not have to take care of app portability and also to limit the number of expensive C++ developers you need to hire.
That's also why doing a mobile website / Progressive Web App / Bundled Web App is a hack that makes sense, though the "right" way is to develop a native app with one or two additional developers (which is absurd, because then you have roadmaps that get out of sync between platforms).
Hardware is cheap, especially when you are not the one paying for it. Engineers are expensive for your company.
I'm not sure if you're trolling or not, but I'm thinking the same for my fictive automotive manufacturing corporation. Engineers are expensive; I'd rather quickly cobble up something with chinesium parts that somewhat rolls and gets 50MPG with a terrible safety record, so I can transfer the costs to society. Gas and insurance policies are cheap, especially if you're not the one paying for them. Engineers are not.
Let's do the same with aircraft and software and everything else and see how things turn out.
We absolutely do this, in every possible industry. The first cars weren't anywhere near the efficiency they have today, and the ones we have today aren't anywhere near theoretical max efficiency either.
In the real world, you always have to make a tradeoff between delivering some value now and delivering the maximum theoretical value at some potentially non-existent point in the future. And after some time, throwing more money at the problem won't increase the value output by that same amount, and so you have to accept that your product is imperfect, but still adds some value.
Add to that some market mechanisms that exist in the software world, where it's often better to move fast and capture a large share of the market than to polish your product and be left with no users. And suddenly building your app with Electron doesn't seem so bad from the point of view of someone who just wants to solve their users' problems.
If the app turns out to add real value, you can always polish it later and do a rewrite in assembly. If it doesn't, at least you didn't spend five years writing that chat app when you could have done something more interesting.
You're joking, but I've read that's not far off from how a lot of modern car manufacturing works. The development of most parts gets outsourced to various companies and the car company just assembles everything. I learned about this in an interview which claimed that Tesla was bucking the outsourcing trend by increasingly inhousing more and more car parts in order to have a greater competitive advantage.
>Let's do the same with aircraft and software and everything else and see how things turn out.
Exactly. Who needs a quality steak when you’ve got McDonald’s around the corner? So much cheaper, too. In fact let’s just replace all restaurants with McDonald’s. It makes a profit so it must not be wrong.
Wow, it's based on v86, which "emulates an x86-compatible CPU and hardware. Machine code is translated to WebAssembly modules at runtime in order to achieve decent performance."
I'm not sure I understand what it is exactly. The page mentions QEMU, so I assume it's emulating a PC and booting Windows on it? Does that mean they've compiled QEMU for a JavaScript target and packaged it in Electron?
If so, does it mean that it's a generic PC emulator in Electron that can run any kind of PC-compatible software, including other OSes?
I had the opportunity to make some modest contributions to the underlying project (v86). I wanted to implement the full SIMD instruction sets, but it was almost impossible due to the current limitations of JavaScript.
The only way to do it properly would have been to wait for support for SIMD instructions in WebAssembly. I had a chat with the main contributor, Fabian; he had plans to port v86 to WebAssembly. I wonder if he ever succeeded.
So this is not a complete emulation as far as I know; there is room for improvement.
Hello, so occasionally you might get this white screen that prevents you from being able to stop, restart, or reset.
I'm shocked that no one else running Windows 95 noticed this problem.
Hacker News is the worst. You could kill yourself for two years on a personal project that is a towering achievement of innovation and technical accomplishment, and the Show HN thread will fixate on the font kerning in your announcement blog post.
I’m old enough to remember jokes about it when it was new: "Windows 95: a 32 bit shell for a 16 bit patch for an 8 bit OS originally written for a 4 bit processor by a 2 bit company that can’t stand 1 bit of competition".
I would bet that by porting an x86 emulator to Electron, it actually runs on _fewer_ platforms than native x86 emulators. I mean, emulators are practically some of the most ported software and, as a consequence, some of the most portable software overall.
There are some fairly recent versions of Gecko that have been backported to Windows 95. I played around with RetroZilla a few years ago on Windows 95 and was able to get many modern sites to work. The K-Meleon project has even more recent work in this regard. It'd probably be easier to start with Gecko for this work than Chromium.
Maybe I'm just going insane, but I have extreme deja vu about this comment and its replies. Did this post and its comments get refreshed or something to look new? I swear I read all of these verbatim a while ago.
I've had the same sense of deja vu on various topics, like Electron. I'm pretty sure the HN crowd revisits favorite themes regularly, and repeats basically the same arguments back and forth, usually adding new information or ideas on every iteration.