
I don't specifically miss SETI@home as much as I miss the spirit of the internet back then, the "what will they think of next!?" The idea that all of us together could crack RSA keys wasn't just a thought exercise; people signed up and did it. Now computers are so complex and task-heavy and fraught with problems that it seems like more hassle than it's worth. I liked when, wondering "why is my machine so slow", "top" or "ps auxww" had a chance of fitting on a screen or two. Now I wouldn't look forward to unleashing an eternal, 24-7, intensive task with a connection to the internet, even though I have a lot more bandwidth.



The last time I ran SETI@home (and Folding@home), I had a fixed desktop machine that I always left on, and I think it was old enough that the fan speeds weren't variable, so it was just constant background white noise.

In that context, running those apps in the background all day and all night was fine, and if they ever noticeably interfered with performance while I was using the machine, pausing them for a bit was easy to do.

I haven't had a desktop computer in at least 14 years. When I'm not using my laptop, I close it and it goes to sleep. Running CPU-intensive apps on my laptop spools the fans up to their loudest, and makes the bottom of its chassis uncomfortably warm. The CPU cores will sometimes hit thermal throttling too. I just don't want to deal with that sort of thing all day.

Then again, I do have a couple Raspberry Pis, an old Mac Mini (running Linux), and a CM4-based NAS in various places in the house running 24/7. I could probably run an @home-type app on one or more of those. But for whatever reason, doing so just never really occurred to me.


I remember when power wasn't 51c/kWh and a powerful PC didn't suck down >1000W. I know we get way more perf/watt these days, but as a user I don't feel like that translates into a notably better experience. For most things for $work, Slack, Spotify, GMail, Google Docs, and VSCode feel a lot less responsive than irssi, Winamp, Sylpheed, OpenOffice, and $ide_of_choice did a decade or two ago.

What Dr. Lisa Su giveth, Satya taketh away I suppose.


We also used to write code in low-level languages, and system designs were simpler compared to our current approach: writing a lot of stuff in JS, using Electron everywhere, having a huge dependency tree, and in general optimizing for fast development rather than efficient runtime.

Having said that, optimizing for perf is rarely worth it because people don't usually care, and when it is, you can often make the biggest difference by figuring out why some step requires O(n^3) instead of O(n) like it should and fixing that, rather than rewriting the entire stack for some 10-20% improvement...
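To make that concrete, here's a toy sketch of that kind of fix (hypothetical function names, not from any real codebase): the same deduplication logic, first with a list membership test that makes the loop quadratic, then with a set that makes it linear.

    # Quadratic: "x not in seen" scans the whole list on every iteration.
    def dedupe_slow(items):
        seen, out = [], []
        for x in items:
            if x not in seen:   # O(n) scan per item -> O(n^2) overall
                seen.append(x)
                out.append(x)
        return out

    # Linear: identical logic, but set membership checks are O(1) on average.
    def dedupe_fast(items):
        seen, out = set(), []
        for x in items:
            if x not in seen:
                seen.add(x)
                out.append(x)
        return out

A one-line data structure change like this routinely dwarfs whatever a full rewrite in a faster language would buy.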


Someone should put together an accurate accounting of computing's power usage, including things such as cars and media centres, and then try to analyse how the existing trend of "wasteful development" affects it.

I bet all the upset over bitcoin power usage is a tiny thing compared to (for example) the whole node ecosystem.

Even something such as gmail is massively overengineered. Heck, there are very few improvements over the interface of a decade ago, yet I bet the old one used 5% of the power the current one does.

Yet think of how much gmail is used.

And it's not just raw compute. 3D on a page for no reason, massive JS bundles for no reason, wasting RAM and bandwidth.

A modern phone could last weeks as a browsing device, if looking at static html.

But all of this means more development skill, less use of fluff and junk, care for size of page loads, and on and on.

Put another way, people concerned about power usage should realise that using node means burning more coal/NG. Literally.

Because that's where the excess power comes from.


Hit the nail on the head. Inefficiency, which has reached gargantuan proportions, and then blown past that, doesn't just mean page loads so slow that loading a news article takes upwards of 20s on a pocket supercomputer (whereas it took barely a second or two on a Pentium II on a 56kbps connection), all the while text and buttons jump around as assets are loaded. It also means very real waste of energy and battery degradation.


I mostly agree with you, however:

> A modern phone could last weeks as a browsing device, if looking at static html

That's what I do. Weeks is a bit of a stretch; maybe one, if screen time is limited. That said, Discord (the website) drains my laptop battery almost as fast as gaming does. They spawn a separate WebGL context for each animated emoticon. Crazy, crazy thing.

> Put another way, people concerned about power usage, should realise that using node, means burning more coal/NG. Literally.

Sure. At the same time, though, a single car trip to the movies burns orders of magnitude more energy than all your gadgets running JS for a year.


Shit like Slack and other Electron abominations are not just a 10-20% improvement going native. We are talking about a simple chat application, glorified IRC that loads images and previews, and it somehow takes more memory and CPU than 100x the total resources I had in 1995.

These systems are so abominable, and people have just gotten used to it; hell, many users never even tried anything else. But saying 10-20% is just ridiculous.


Man I want a native TUI client for Teams.



Make a Windows XP VM, install some of those old versions, and see how fast they run on modern hardware (even while emulating the entire PC!). Many modern devs chose ease of development over end-user efficiency/performance.


51c / kWh?? I pay 20 in Spain even now with the energy crisis.

I think back in those days the price was around 10c

I know this is euro cents, but the euro-dollar exchange rate isn't far enough from parity to make a huge difference.


> a powerful PC didn't suck down >1000W

My Ryzen Threadripper 2950X, which isn't a top-of-the-line system at this point, but is still a 16 core beast-ish machine, sucks down (much) less than 400W


My 3990X is almost 500W. My last dual socket Xeon workstation was closer to 750W.

That's before you add on a GPU. My 3080 uses 350W under load.


Are these power figures measured at the wall? If so, both are astonishing. My file server with an old Xeon, 16GB RAM and 5x 7200RPM HDDs uses about 95W when idle and about 150W when I throw something at it that works the drives. My desktop with an i7-4770K, 32GB RAM, some SSDs and a single spinner is about the same. IIRC if I push something compute intensive at it, it can break 200W.

OTOH a Pi 4B with 2 7200RPM HDDs uses about 25W and another with an SSD uses about 7W.


I don't have a kill-a-watt meter handy anymore, but I do have a smart meter. Guesstimating using that, I hit ~850W with my Threadripper when compiling my work project and running a game at the same time, and about 650W when _just_ compiling (so presumably the difference there is the GPU being under full load).

They're obviously extreme examples, but on the flip side, I've got an M1 Macbook Pro which lasts an entire day on a single charge of battery including running docker, slack, and some "mild" compiling (not large C++ projects). I'm not sure how to measure the energy consumption of it on battery, but suffice to say it's "very low".


Power is around $0.10/kWh for residential, even lower for industrial.


Power prices vary dramatically in the United States - and are even higher in most of the rest of the world.

Here’s the prices for the United States - https://www.eia.gov/electricity/monthly/epm_table_grapher.ph...

Even these data are noticeably low for my area, where we had a _massive_ rate hike in January that resulted in total prices for electricity (generation + delivery) without an alternative supplier being up around $0.48/kWh.


Big cities in the US tend to have very expensive power ($0.50/kWh or higher), while rural areas can have nearly free power, particularly if you are near a dam or a nuclear power plant. The price you're quoting is definitely not a big city price.


It's closer to 50c/kWh here in the UK [0]

[0] https://www.gov.uk/government/publications/energy-bills-supp...


I'd pay 0,77 euro per kWh if the government hadn't introduced a 0,40 euro per kWh price cap.


I just can't get a laptop I feel comfortable working on; they all have that slow feel, whereas a desktop is just crisp and I can always swap out anything I don't like about the current build.


Part of the problem is just raw TDP. There's only so much horsepower you can squeeze out of 45-65W laptop chips. The brand new Ryzen 9 7945HX is damn fast though, with 16 real cores on a way better node size than Intel is using for their 13th gen parts that, at most, get you 8 real cores plus a 16-core Atom on the same die. Stuff like the Asus Zephyrus Duo has RAID-capable M.2 (and that one has two screens, one of which is a 16:10 with high refresh). Get something with a dedicated GPU as well, as that makes a huge difference even for desktop compositing. Despite what Intel says, integrated graphics continue to be a joke compared to even budget dedicated graphics. These mobile workstations are not going to rival 1kW desktops, but they get within 40% or so. The Intel stuff, 12th and 13th gen in particular, also gets bad battery life, like 4ish hours. Blame the FAA for their 100Wh limit on batteries: not many laptop mfgs want to make something that you can't technically take on a plane, even if the mall cop TSA agents would never notice.


I lucked out with getting one of these on sale just before they went away: https://www.bhphotovideo.com/c/product/1746503-REG/lenovo_82...

The 300W power brick provides enough juice to feed both the CPU and GPU, and the laptop variant of the 3070 ti isn't too much slower than the desktop version in practical terms (I don't feel that 30% less performance). I've never tried it on battery or on the 135W USB-C power limit, but reviews report getting sufficient performance.

I don't have room for a desktop, so it's nice to not feel like I'm compromising on power with something that fits. The memory and SSDs are user accessible, too.


My MacBook Pro with M1 Pro is the first laptop I've had that hasn't had that slight hesitation that laptops have always had. I think this is why people praise Apple Silicon so much even though the benchmark numbers may not always back it up - so many other people are also coming from similarly-hobbled laptops.


I have a great desktop, and every day I curse Apple and Tim Cook because I cannot get an iPad Pro that runs full macOS.

That would be the ideal device for me when I'm on the go, that I can still use to watch YouTube videos in the bathroom. I do not need a full laptop when I work at home with a $5,000 desktop PC, nor do I want to buy a crappy $300 laptop that always feels slow and underpowered.

It is really sad to have the tech to do something, but it doesn't exist because the bean counters said no.


> nor do I want to buy a crappy $300 laptop that always feels slow and underpowered.

Meanwhile, the expensive laptops with actual desktop-class components have unacceptably shitty peripherals, so you can't even pay more money for what you want; it simply doesn't exist.

This is what upsets me the most about laptops.


> what you want; it simply doesn't exist.

Once the patents on Apple's trackpad expire, then we'll see.


I love Force Touch trackpads more than anything. I specifically got an Apple trackpad to use with my desktop because it's so much better than anything else out there.

But having a good trackpad doesn't make up for other blunders like an annoying keyboard layout, a low-DPI screen, or etc., and you bet nobody who makes the thick and heavy high-performance machines is going to care about a good trackpad anyway.

You don't really see OEMs that put top-of-the-line hardware inside a thick and sturdy chassis, and then... add a nice 4K screen and a numpad-less keyboard and put the charging/USB ports on the left instead of the back.

The list of possible blunders is so long that you'd be hard-pressed to find a machine that hasn't been subject to most of them. That's just not how laptops work. Laptops are a package deal, all or nothing. You can't ensure every requirement individually like you can with a desktop.

And the reason I don't like the market is because nobody makes a really good one.


What do you mean by shitty peripherals?


Low-DPI screens, bad webcams, keyboards with the wrong arrows/numpad layout, off-center trackpads, etc. It's different for each person what they consider "shitty", but laptops are always a total package deal (except for Framework), and it can just so happen that nobody makes powerful laptops with the exact right permutation of peripherals.

You can find all sorts of beautiful 4K ultrabooks with the perfect keyboard and trackpad layout, and great webcams too—if you don't mind putting up with a Celeron and 4GB of RAM... If you want a really modern and powerful CPU with lots of RAM, you'll be looking at the likes of Clevo, AORUS, etc. that put desktop-class parts in a thick chassis, but... that bodes terribly for the peripherals they end up using.

Some of the Clevo machines have individual gimmicks like 300Hz displays, but none of them seem to have all of "high DPI, good keyboard layout, good port arrangement, high performance". And this is true for basically all laptop vendors.

OEMs think good peripherals always equate to "thin and light", which results in bad performance, either through bad parts or bad thermals. Maybe Frore Systems' new cooling solution will help slightly with ultrabook thermals, but until that happens, laptops just suck.

You may notice that this only matters because I even care. You may also notice that this is exactly what building a desktop is for. I am building a desktop, but I also have to build a laptop from scratch for the desktop so I can operate it from bed. Which would have been avoidable if I could've found a laptop in the first place.

The point anyway isn't to say that there aren't any good laptops out there, but rather that if you want high performance (e.g. you are bothered by slowness), you can only get it as a package deal with the rest of the laptop... obviously. But the package deals all suck. The laptop market is so annoying sometimes.


I haven't really been in the market for a non-Apple laptop for a while (previously had to buy MacBook bc of specific software), so this gives me a great idea of what to watch out for when I get my next laptop. Thanks a lot!


For what it's worth, the ROG Flow X16 (2022)[0] has some good specs (8-core CPU with 3070 Ti), a great keyboard layout (with correct arrow keys!), and other niceties like a mux switch and upgradeable RAM. It is so close to perfection—but doesn't have a 4K screen, ruining the whole package.

Big reason why I dislike the market so much: the existence of that machine with only a single flaw that nonetheless ruins the value of the entire product. There's nothing else like it. It seems to be the absolute best machine the market has to offer... and it just barely misses. :(

However, the screen resolution seems to be the only problem with it, so anyone who doesn't require specifically 4K (most people) would probably find it worth considering.

[0]: https://rog.asus.com/laptops/rog-flow/rog-flow-x16-2022-seri...


Another option is to get the iPad (and magic keyboard and pencil) and remote desktop into your home machine.

Put it behind a VPN, use wake-on-LAN, and boom, the iPad is essentially a full-blown computer.
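For what it's worth, the wake-on-LAN "magic packet" is simple enough to send yourself: 6 bytes of 0xFF followed by the target MAC repeated 16 times, broadcast over UDP. A minimal Python sketch (the MAC address is a placeholder):

    import socket

    def wake_on_lan(mac: str, broadcast: str = "255.255.255.255") -> None:
        # Magic packet: 6 bytes of 0xFF, then the target MAC 16 times.
        mac_bytes = bytes.fromhex(mac.replace(":", ""))
        packet = b"\xff" * 6 + mac_bytes * 16
        with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
            s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
            s.sendto(packet, (broadcast, 9))  # UDP port 9 ("discard") by convention

    wake_on_lan("aa:bb:cc:dd:ee:ff")  # placeholder MAC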


No, that would turn it into a full-blown thin client.

What is the point of having half a TB of storage, however many gigs of RAM, lidar, radar, sonar, and the best processor architecture money can buy to render a glorified, interactive video?

I would like to be able to write code without having to pay the price of internet latency. I don't need a $1000 tablet to render RDP.


Termux on Android does what you're asking for, but I doubt it's what you actually want. shrugs


On the other hand, there's a lot of great tech (including, I think, the iPad pro and Apple silicon) that just wouldn't exist if the bean counters hadn't said yes, it's a good idea to invest a few billions of dollars into this...


Uhm, allocating money to a budget doesn't create great products any more than wishing very hard for something before blowing out the birthday candles.

It's the engineers that actually create the cool things.


> Uhm, allocating money to a budget doesn't create great products any more than wishing very hard for something before blowing out the birthday candles.

I didn't claim that it was sufficient, but it is necessary.

> It's the engineers that actually create the cool things.

Of course. With other people's money (typically).


Agree, I’ll probably never buy another Intel laptop as long as AMD and Apple keep killing it with these things. I was hopeful that the Surface laptops would be good with ARM too but they’re not using the latest chips, they went for low/mid-tier performance, and Windows on ARM still sucks / relies too much on emulation to be fast. Some day maybe we’ll get REAL ARM hardware with dozens of cores, tons of memory bandwidth, USB4, and decent freaking GPUs. Sorry, but the idea of spending $2k on a laptop that runs last year’s mid-tier smartphone processor just doesn’t cut it. Maybe that’s fine for a Chromebook for schools or something.


Don't worry. Apple will figure out how to put that hesitation back in with a future MacOS upgrade. Probably with yet more phoning-home daemons in the name of security theater.

Sorry for the snark. I'm just bitter because Apple is about to ruin Dropbox because they don't allow KEXTs any more.

https://tidbits.com/2023/03/10/apples-file-provider-forces-m...


> that slight hesitation that laptops have always had

For me just getting a decent low-latency mouse (e.g., wired g203) and correctly configuring 100hz+ monitors has vastly improved my experience on my M1 Air and the shitty Thinkpad I use for work -- to the point where I haven't used my desktop in a long time.


Amen brother. It's like driving on a mountain highway with the windows down compared to commuting in a subway.


Personally I no longer use laptops, and instead do 99% of my computing on a fast desktop with a great keyboard, great monitors, etc. If something breaks or needs to be upgraded, I can pop it open and get it done, usually on the same day. On the rare occasion I need to travel I'll take my smart phone and a note pad if there is some need to take notes.


I run Folding@home and I'm at 207M points... Sucks that I'm in the 14K range... I'll probably never hit #1, haha.


I'm at 466M points and within the top 10k donors... that's just thanks to running Folding@home on my own machines for the last 15 years, lol.

https://stats.foldingathome.org/donor/id/9171


Wow, that's nuts... I don't know how many months it'd take me to catch up, and by that point you'd be ahead again.

15 years is insane though... I only recently started with an i9 and 3080Ti on medium/highest priority


Power and cooling were at fixed levels and computers couldn’t really sleep, so cpu cycles were just wasted without seti@home. Nowadays computers idle efficiently, so something like that would make your computer louder and more expensive to run.


I'm probably about median age for HN, and when I was looking into Folding@home and SETI@home I had no idea about power costs and didn't worry about them, because they were subsidised by my parents or university. Plus energy was cheaper then. Now, as an adult, seeing a month of a few GPUs running on the utility bill really hurts.


The fact that energy hasn't gotten more affordable really demonstrates a deficiency of planning.

We've had access to a source of energy that is cheap, zero-carbon and essentially limitless, since the 1950s: nuclear power.


After looking at my utility bill, less than half of the cost is electricity and natgas; the rest is delivery charges -- that is, paying to maintain the power grid itself. Assuming the utility is breaking that down more or less honestly, reducing the cost of electricity here probably wouldn't decrease my utility bill as dramatically as one might guess. (And I live in California, where electricity prices are notoriously high.)


What doesn't make sense to me is why this delivery charge keeps going up? Is it just PG&E throwing a fit about having to modernize its lines so they don't cause fires anymore?


Or, alternatively, that they were only so low before this because someone decided to skip maintenance for 50 years.


Ah, the good old pass-the-buck-to-my-future-self strategy that is leaving the US with trillions in infra repair bills to be paid by my (millennial) generation and younger.


*waves in British*

• Long-term debt used to buy off the slave owners in 1837 and not getting fully paid down until 2015: https://en.wikipedia.org/wiki/Slave_Compensation_Act_1837

• Napoleonic war debt was also still being serviced until 2015: https://www.gov.uk/government/news/repayment-of-26-billion-h...


Everything in California is explained by boomers voting themselves out of paying taxes. In this case the regulator never let them raise prices for maintenance so it's all coming due now.

(PG&E is basically nationalized already, they don't make their own decisions here.)


> Assuming the utility is breaking that down more or less honestly

That's a bit of an odd one, as the amount of electricity you consume has little direct effect on maintenance costs. Now, it might be in some sense "fair" to charge heavier consumers more, and there's a second order effect whereby if total demand exceeds some threshold, the grid must be upgraded to cope, but if that "delivery fee" scales with your consumption then you're definitely looking at some "creative" accounting.


>Assuming the utility is breaking that down more or less honestly

...

>And I live in California,

This made me laugh out loud. (Worked with utilities all my life, including in CA.)


I have had around 13 power outages this week. I live right in the middle of Silicon Valley. Most of the charges from PG&E are indeed the grid charges; the rest is supposedly a mandatory green energy charge. This is depressing.


Fellow SV resident (does Cupertino count?) and my eyes water when I look at my energy bill. Then I lose power for two days and learn that people REALLY dislike PG&E. And I thought National Grid was bad.


Let's put all our power lines up in the air, they said. It'll be fine, they said.

While other countries have stable and reliable power in the face of hurricanes and ice storms, our power is routinely knocked out by cars, trees, and basically any act of nature.

I love how expensive everything is for how behind we are from the entire rest of the world. Too bad the expenses keep me trapped and poor, unable to do anything about it—even just move to any other country.


I'm guessing much of that delivery cost is due to the additional energy transmission and storage infrastructure needed to incorporate intermittent energy from solar/wind.

Solar and wind are extremely expensive when all costs are taken into account. One study found that the "true cost of using wind and solar to meet demand was $272 and $472 per MWh" (PDF): https://web.archive.org/web/20220916003958/https://files.ame...

Mind you, this could change if we get an economical way to generate hydrocarbon fuels from renewable electricity, so I'm certainly not pessimistic about solar/wind's long term prospects.


With articles like "Save North Dakota Oil and Gas!" I'm sceptical of that study.

Wind and solar were 30% of Great Britain's electricity last year. The National Grid is a private company, so their costs are clear: 3% of the average bill.


The study advocates nuclear and hydroelectric as the means of getting to zero-carbon, so doesn't seem motivated by pro oil/gas bias.


Nuclear power has proven to be a lot of things, but cheap is not one of them. In theory it is possible to build nuclear power plants at scale and at reasonable cost, but in practice that has not worked out in the real world.


It has actually, in places and times where regulations were reasonable.


You mean the places and times and regulatory environment that brought us Chernobyl, Three Mile Island, and Fukushima?

I'm not going to get into a debate over the safety of nuclear over other generation options, but is it conceivable that, if the modern regulatory environment were in place when those three plants were built and operated, they never would have suffered major incidents?


Having a Fukushima and a Three Mile Island happen every year would easily be worth it if it made nuclear 20% cheaper (feel free to do the maths if you don't believe me). And it would actually make things fourfold cheaper, judging by the difference in price per kW between the 80s and now.


Those power plants wouldn't have been allowed to be built... so yes? With the modern regulatory environment, those three nuclear power plants would never have melted down (because they wouldn't exist).


But you can't have both cheap nuclear and the current regulations, since they are in conflict.

I do think molten salt thorium plants can offer superior safety, though.


Molten salt reactors might provide superior safety (the thorium is irrelevant to that). However, they also would likely imply higher operating costs, because now instead of being sealed in fuel elements your fuel is flowing through a larger part of your reactor, and volatile fission products are escaping into the off gas and have to be trapped in another system (which will have to be cooled to prevent melting). So you now have complex systems that are too radioactive for anyone to touch for maintenance. Reactor structural elements are also now exposed to more neutron radiation as well.


There were fewer than 100 deaths from Chernobyl, and only one person died directly from Fukushima [1]. I suspect more people have died from falling off roofs while fitting or cleaning solar panels than that.

You're correct that nobody builds rbmk reactors any more, but others of the design continued to operate safely for many years.

[1] https://ourworldindata.org/what-was-the-death-toll-from-cher...


> There was less than 100 deaths from chernobyl

There may well be tens of thousands of deaths from Chernobyl. No, we can't state that these deaths absolutely occurred/will occur -- they are spread throughout large populations, mixed into the vast number of naturally occurring cancers -- but technology regulation is not like criminal prosecution where things must be proved beyond reasonable doubt. Technologies are not innocent until proven guilty.


> Technologies are not innocent until proven guilty

That depends entirely on the technology, and who is paying who. Fossil fuels have been proven guilty of numerous sins time and again and are still relentlessly defended as innocent. Leaded fuel in particular enjoyed an undeserved benefit of the doubt. Food additives are another category of chemical with very permissive regulation, provided they don't cause lab rats to immediately keel over.

Technology regulation is unprincipled at best and outright corrupt at worst, and even with the most pessimistic estimates nuclear power has the best safety record per watt of any power generation technology (and it's not close).


That's nice, but it's no justification for that earlier poster lying about the number of people killed by Chernobyl. There's very good reason to think the number is much larger than 100.

Fossil fuels being bad is an argument for getting off fossil fuels, not an argument for nuclear in particular.


Did you read the source I linked? It gives a good estimate.

    "2 workers died in the blast.

    28 workers and firemen died in the weeks that followed from acute radiation syndrome (ARS).

    19 ARS survivors had died later, by 2006; most from causes not related to radiation, but it’s not possible to rule all of them out (especially five that were cancer-related).

    15 people died from thyroid cancer due to milk contamination. These deaths were among children who were exposed to 131I from milk and food in the days after the disaster. This could increase to between 96 and 384 deaths, however, this figure is highly uncertain.

    There is currently no evidence of adverse health impacts in the general population across affected countries, or wider Europe."


Yes, and you totally ignored the cancers that could occur due to low levels of additional radiation in the population at large. You are simply presuming that such cancers cannot occur.

What you are doing here is demanding that the cancers actually be demonstrated, requiring that radiation be treated as innocuous until it can be conclusively shown it isn't. This is the mindset behind those ranting against the linear no threshold theory of radiation carcinogenesis.

But regulation doesn't work this way. Nuclear isn't something that must be presumed innocent until proven guilty.


which times and places were those?


80s USA.


First, the costs of nuclear power are quite substantial AND they are front-loaded, making it a bad investment. You will need to go through a lot of red tape and spend all the money right away to get some benefit later when it's built.

Second, it's not zero carbon if you include building the plants, since the plants have a limited lifespan, so there's still a carbon footprint for the materials.

Third, if you switch all of our energy to nuclear today with a magic switch, we would run out of all easy to mine uranium in a few decades. While that is not a concern in our lifetimes, some progress on thorium plants and so forth must be made for nuclear to be a long term solution.


> Second, it's not zero carbon if you include building the plants, since the plants have a limited lifespan, so there's still a carbon footprint for the materials.

Nothing is zero carbon either. Batteries have a very long supply chain that involves a lot of carbon heavy steps, and their disposal also creates pollution. The same goes for solar panels, but people conveniently choose to ignore that.

> we would run out of all easy to mine uranium in a few decades

We don't need uranium to run on nuclear power. There are several alternatives already.


>>Third, if you switch all of our energy to nuclear today with a magic switch, we would run out of all easy to mine uranium in a few decades.

There are vast quantities of uranium in ocean water in concentrations that while low, still allow it to be extracted. More importantly, breeder reactors can harvest 100X more energy from uranium than currently deployed reactor types, and in the process eliminate all of the long-lasting nuclear waste, i.e. those with a half life of millions/billions of years.

With known supplies of uranium and thorium on Earth, nuclear fission can provide 100% of humanity's current energy needs for something like 2 billion years. For reference, the sun's rising luminosity is expected to boil Earth's oceans away in 900 million years. We could 1000X current energy consumption with nuclear, and have enough fuel on Earth for over a million years. Long before the fuel runs out, we will be able to exploit resources beyond Earth.


All those options are significantly more expensive than what we've been building. I mean, we've built fast reactors, and thermal breeders with thorium, and had pilot efforts for U extraction from sea water. None of them could compete with LWRs with a once-through fuel cycle on conventionally mined uranium.

Fast reactors also present the possibility of prompt fast criticality (that is, an actual nuclear explosion) in a serious accident if enough of the core melts and moves around.


Thanks for the sobering information.

In any case, costs can be brought down significantly with 1. more reasonable regulations and 2. economies of scale, which just requires building more plants.

So if - due to massive expansion of nuclear power - we get to the point that uranium supplies are dwindling and causing the price of uranium ore to rise, we may still see breeder reactors with lower construction and operating costs than current generation reactors.

I suspect modularity in reactors, that enables a genuine global market to emerge in reactor manufacturing, would be instrumental in bringing down reactor construction costs.

>>Fast reactors also present the possibility of prompt fast criticality

I'm quite the layman in this subject, but my understanding is that there are fourth generation fast reactor models, with passive safety features, that effectively eliminate that possibility.


The claims that costs can be brought down with better regulations and economies of scale are often made, but the evidence for that is marginal.

For regulations: how does one identify "reasonable" regulations? How is this to be determined? The usual way this is done in other industries is by allowing accidents to happen, then add regulations that would have prevented them. I doubt that would be acceptable for nuclear: unlike (say) aviation accidents, the cost of an individual accident can be extremely large.

As for economies of scale, the evidence for that in nuclear is slight. Maybe S. Korea? But elsewhere costs seem to have increased with experience rather than decreasing. Nuclear involves large units that take a long time to build; there's not much room for anyone to gain much experience over their career. If anything, experience decay (as organizations and individuals age) seems to overwhelm experience growth.

> fast reactors

Fast reactors inherently have this problem, since a fast chain reaction occurs without a moderator. In a LWR, concentration of the fuel in an accident reduces reactivity, as the moderator is reduced. In a fast chain reaction, concentration of the fuel will increase reactivity. Putatively safe 4th generation reactors merely assert that such rearrangement cannot happen. If the analysis that led to that conclusion is found to be in error then anyone who has built such a reactor will be in deep trouble. Perhaps molten salt fast reactors will have sufficiently credible analysis, but those have their own issues (such as need for large amounts of isotopically separated chlorine in molten chloride reactors and exposure of the walls of the reactor to unshielded/unmoderated fast neutrons.)


I'm only including things that are viable. It seems it's just going to be cheaper to build out significant wind and solar, while keeping nuclear around for the times there's neither wind nor sun.


That's a completely terrible way to use nuclear. The cost per unit of produced energy would be so large that other systems (like storage, including burning hydrogen in turbines) would be much more economical to cover those dark, calm periods.

Nuclear should either be operated all the time, or it shouldn't be used at all.


I believe you can scale nuclear up in times of demand, so you'd be running it at lower capacity when it's not needed.


I agree that nuclear power would have been great.

However it's no deficiency of planning.

It's down to public opinion and misguided regulation.


That's like a project manager blaming the missing of deadlines on "the opinion of certain people in the company and the bureaucracy within the company".

Not sure if that would fly.


Not sure that analogy holds.

No heroics from any project manager would, e.g., let 1980s Ford stop selling internal combustion engines and go all in on electric cars. So any deadlines would be missed.


Better not to build one at all than to build a nuclear power plant and then just not turn it on.

Zwentendorf was such a disaster.


Another way to look at it: there is no miracle solution.

It's easy to look at any situation and say "If only they'd _just_ XXX". It's never that simple.

Even if you deeply believe your solution is flawless and people "just" need to be convinced, the very fact that the situation is happening at all is an indication it will be a long and dire fight...

> essentially limitless

oh well


Counterpoint: France is doing extremely well in Europe and is the only country that is doing well (I'm talking energy-wise). France is a nuclear energy country.

So yes, there is one pretty great solution to the energy problem, and yes, it has existed since the '50s.


France had the same energy price rises as the rest of Europe last year, since so many of its reactors weren't working. Most of them are also aging.

Their average prices are only a little below the EU average.

https://ec.europa.eu/eurostat/statistics-explained/index.php...


Sure, when you convert energy you need to do maintenance and keep things up to date and if you don't do that bad things happen. That goes for all methods of energy conversion.

How does that have anything to do with the discussion about energy conversion methods and their pros/cons?


France is one of Europe's top energy exporters, despite consuming a significant amount of energy itself. Domestic energy prices levelling with prices in neighboring countries, while energy exports rise, is what you'd expect from a country that has a relative abundance of energy along with free trade with its neighbors.


I'd take your point if the French were massively (like 90%) in favor of more nuclear, and saw it as their main energy source for the coming centuries. That's definitely not the case [edit: a poll commissioned by a nuclear power organisation puts it at 50% of people seeing nuclear plants as a positive asset for the country:

https://www.ouest-france.fr/environnement/nucleaire/sondage-... ]

You can then think what you want of the French people and why they could be dead wrong for not thinking like you, but it's another debate: they'll still want to get out of it if given the time/choice.


Uranium and expensive plant resources are much too valuable to waste on this application. Buy some solar panels and do your hobby/discretionary computing using the much cheaper and unlimited power from our sun.


Watts per unit of fuel are cheap, but only to the power companies, who mark it up. We're in the middle of record gouging in energy and food, and you think a power company won't gouge you?

They still charge us for the electricity PLUS extra charges for a fund to eventually decommission the nuclear plant. It’s so expensive.


I don't think it's cheap, at least current costs of building new nuclear tell us that.


And why do you think current ones cost 4x as much as 80s ones even after you adjust for inflation and power output?


Well, construction is a lot safer now. The death rate of construction workers has fallen by a factor of 5 since the 1960s in the US. It's possible those lower construction costs were at least in part due to a cavalier disregard for worker safety.


It'd be valuable to determine the additional lives lost due to increased energy costs from these higher construction standards, and the lives saved from these standards.

My wholly intuition-based guess would be that the net effect of these higher construction/safety standards is massively more lives lost.

But if that is the case, it's worth remembering that the lives lost from higher construction standards are a result of second order effects. Populist, government-interventionist political ideologies are generally quite poor at factoring in second order effects, as we saw with COVID lockdown policies, which resulted in far more lives lost from their second order effects than lives saved from mitigating the spread of COVID.


Another possibility is that the US simply isn't building as much these days, and heavy construction might have lost economies of scale. The Baumol Effect is another possible contributor. Renewable construction is not of the same kind, especially PV.

Specific construction expertise for nuclear very likely has degraded (the US can no longer forge the large reactor vessels for PWRs, for example.)

Nuclear is going to be further negatively affected if there's a transition away from fossil fuels, since the industrial infrastructure for making steam turbines will no longer have as much support. Bespoke turbines from smaller makers will be more expensive.


Building out transmission lines for low-intensity power sources like wind/solar is labor intensive, like nuclear power plant construction, so I suspect factors like the Baumol Effect similarly affect the cost of both.

Nuclear plant construction has the potential to become much less labor intensive if it moves to smaller, standardized nuclear reactors that can be manufactured in factories, using mass-production techniques.

>>Nuclear is going to be further negatively affected if there's a transition away from fossil fuels, since the industrial infrastructure for making steam turbines will no longer have as much support.

True. However this is a surmountable problem. It's just a matter of using subsidies to achieve economies of scale in nuclear plant construction.


I am skeptical of the SMR-built-in-factories schtick. It sure isn't helping NuScale control costs -- they're up to $15/We now, and it will not be surprising if that goes even higher.

The problem with the subsidy argument is there's no good experience to show that subsidies would do more here than just burn money. Nuclear has not had good experience effects. This is in sharp contrast to renewables and storage.

Nuclear is weighed down by the boat anchor of its historically poor performance.


I have a datacenter at home (that is, the top shelf of a cupboard) with my fiber connection, router, a 10-year-old tower with a mix of HDDs and SSDs, a small computer that acts as the router/DHCP/... and a few other pieces of equipment, all behind a UPS.

I measured the consumption at some point and it is a steady 60W.

I expected more, this setup has a marginal consumption compared to other appliances - and it gives me happiness.

Without doing advanced research, I would guess that a lot of consumption is fixed and hardly compressible (fridge, dishwasher, water boiler, induction plate, typical lights) - at least I hope so.


That's a home server, not a datacenter.

The definition of a datacenter includes that it's complex, which your example just isn't.


Sorry, I was trying to be funny with the datacenter, using that word together with a cupboard's upper shelf and my equipment :)

This may have been funny for people who know me and my 30 years of work with real datacenters.

This is also an example of how badly written words on the internet convey emotion (especially humour).


I just dislike purposefully misused technical terms, including in jest, as that's how we've gotten to this point where every second disagreement boils down to people not agreeing on the same definition.


Solar panels.


So now you need additional capital expenditure to handle the extra load of running your CPU and GPU 24/7, above your base load.


At grid scale, providing that is still likely cheaper than doing everything with new nuclear. The difference in LCOE is just that large.


Rental apartment.


Nah, HLT is pretty old and even back then had a meaningful impact on power consumption. If you were running Win9x, though, you would've had to install an additional driver for it, as those really just busy-waited when nothing was happening.


> The idea that all of us could together could crack RSA keys wasn't just a thought exercise, people signed up and did it.

Hasn't changed that much. Here we are in 2023, where most of this distributed power goes to cracking SHA-256 hashes. Just for speculation and profit instead of leaderboards.


Bitcoin has nothing to do with "cracking SHA256 hashes", for what it's worth. The only thing remotely related is how it brute-forces inputs until it generates hashes that begin with a certain number of bits set to 0.


It's continually testing the strength of SHA256. If you can find a way to even partially crack it, you win money. Edit: Also, way more hashes are happening all the time now, increasing the chance that someone finds a collision if it were possible.


That's still not what the Bitcoin miners are doing. They're manipulating inputs, trillions of times a second, just to match a "magic" target for a SHA256 hash starting with a certain number of bits set to 0.

Cracking hashes would imply that they're taking pre-existing hashes and reversing them to find the input.


Hashes have multiple requirements for security. You're talking about only one facet, which is that you cannot be given a hash and produce a matching input without consulting a rainbow table (I believe the term is preimage resistance).


Chungy is correct though. Bitcoin has absolutely nothing to do with "cracking sha256 hashes". Bitcoin hashes until the correct output is found. There are double hashes, but that is solely to avoid collisions and has nothing to do with determining the plaintext from a known digest.


It's kinda sad. If we persisted all the hashes that have been tried, we'd have the mother of all rainbow tables.

Be a bitch to search. But damn would we know the topography of the function space.
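Back-of-envelope on why persisting them is hopeless, assuming a network hashrate of roughly 300 EH/s (my assumption, approximately the early-2023 figure):

    hashrate = 3e20                        # hashes per second, ~300 EH/s (assumption)
    entry = 80 + 32                        # 80-byte header input + 32-byte digest
    per_year = hashrate * entry * 3.15e7   # seconds per year
    print(f"{per_year:.1e} bytes/year")    # ~1.1e30 bytes: about a million yottabytes per year

So the topography stays unmapped; nobody is storing a million yottabytes a year.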


Why hasn't anyone made that crypto currency yet?


Because it would actually solve a problem?

...I'll see myself out.


>and has nothing to do with determining the plaintext from a known digest.

So why'd you bring it up?


User bitcoinistrash is the one that brought it up, and they're spreading incorrect information about how it works.

Cracking sha256 hashes implies trying to reverse a specific hash. There are literally thousands (millions?) of potential inputs that hash to a valid bitcoin mining block output. It's basically a race to find the first one matching the rules of the current block difficulty. The goal isn't to produce any one specific input/hash pair.


In what context would somebody try to reverse a hash? That would be like arbitrarily enhancing a low-resolution image to see something that cannot be seen.

Hash collision should always be about finding matching content for a hash.


And if you cracked the math of SHA-256 you wouldn't need to search; you would just calculate one of the inputs to produce whatever output you want. All zeros would be as easy as one.


> hashes that begin with a certain number of bits set to 0.

This myth was always intriguing to me. Where does it come from?

It's not a number of 0s (which would imply difficulty could only be doubled or halved), it's just finding a hash that is less than the target.


"Less than the target" in effect is "more leading zeroes" depending on how low the target is. But the lower, the more leading zeros, Right?


It's in the original Bitcoin whitepaper


Thanks, this is likely where I read it first. I hadn't realized the actual implementation got more complicated than that. :)


Interesting. Even though I read the paper many times, I didn't catch that. I guess I always skipped that part because I assumed I already knew it.


Yeah, cryptocurrency stuff is the closest I've ever felt today to the spirit of the old internet.


> cryptocurrency stuff is the closest I've ever felt today to the spirit of the old internet

Cryptocurrency feels like the exact opposite of the old internet to me - it feels like a bunch of anonymous bros looking for the next sucker to scam out of money and get rich quick.


I agree with you. The early web was full of amazing opportunities but had trouble monetizing things. Cryptocurrency is the opposite, it tries to monetize everything but doesn't have practical uses.


Bitcoin was "old internet" tech at some point. Just for reference, I bought my first BTC at 0.20 €/BTC. Back then it looked like the future to a lot of us ... "hey! We/Hackers can make our own money. And it is superior to state issued money!" ... looking back, it honestly it is somewhat to see the state of cryptocurrencies today.


> Looking back, it honestly is somewhat sad to see the state of cryptocurrencies today.

Steam and pretty much every other regular retailer dropped Bitcoin support five years ago. The state of Bitcoin feels even more hopeless today than it did back then. It is impressive how you can run a trillion dollar pyramid scheme on pure digital nothingness, but I always hoped for cryptocurrency to eventually turn into real digital cash, bypassing the fees and incompatibilities between regular payment schemes. But none of that has happened. Bitcoin is still slow and expensive, and there is hardly anything you can buy with it anyway. Worse yet, most Bitcoin still gets stored in "banks" instead of being self-hosted, so it failed on the whole P2P cash thing as well.

It's impressive tech, but for all the wrong reasons.


It's still taking time, I think. We understood that L1 should never have been the place for end consumers, and that's why all the chains (Bitcoin Lightning; Ethereum's various systems like StarkNet, zkSync, Loopring and so on) are currently in the process of finding out how to make everything that happens on L1 possible on L2. On Ethereum, one big part still left out is the general ability to run EVM code on L2, but that is happening like right now.

Biggest problem after that will be user experience, but I am currently in the Ledger Connect (now Ledger extension) Testflight beta and it makes using dApps on iOS with a hardware wallet a really good experience. No cables, no app switching, no weird abstraction barrier. The new Stax also seems like a well thought out wallet that was created with a focus on UX. Only thing I don't yet see is a good NFC integration for existing payment terminals.

I still think the industry is really early. Layer 3 is now a topic for privacy-preserving user interactions, which is super interesting. It doesn't then stop at being your own bank; with it you will be able to really control your own identity without anyone standing in the way, needing less trust when sharing your data, and being self-sovereign. Using the chains just as a highly accessible, authorized data store.


I like cryptocurrency overall, but the bar for convenient purchases is really high. For anyone to care, it'd better be as easy as tapping my credit card on the terminal... which even Apple Pay isn't.


I thought it was a dumb idea at the time, that nobody would be stupid enough to invest in.

I still think it's a dumb idea, but I've learnt not to underestimate human stupidity.


See ... Back then we thought that it would replace cash. Some friends and I were thinking about having wallets in smart cards (technically possible) and distributing BTC terminals to local businesses.

The whole crazy investment thing came much later and basically destroyed any hope of using BTC on a daily basis.


If you bought $10 USD of it "at the time", you'd have a giant pile of USD now. Who's the stupid one for passing on that purchase?

Point is a lot of things are stupid and turn out to work really well. USD is backed by a computer at the Federal Reserve. Sounds pretty stupid but it works.


Hindsight is 20/20 ... I bought my BTC at 0.20€ ... Sold most of it at 20€, the rest at 200€.

I could have been rich ... but how was I supposed to know?


Oh, I definitely feel stupid for underestimating human stupidity.

>USD is backed by a computer at the Federal Reserve. Sounds pretty stupid but it works.

Pretty much why I think it's stupid as an investment, it's no better than the USD if SHTF. You can't hold it like gold or silver.


You're right that it's not meant for SHTF. It's for holding value under normal circumstances instead of passively investing in whatever other bubbles (stocks etc).

Btw, not very many people are putting their money into gold and silver. Returns on that have been lackluster for decades. If BTC truly became "digital gold," I wouldn't be very interested.


Once the OG Bitcoin cypherpunks shifted from "here is an amazing new technology that could do entirely new things" to tweeting about the USD price of Bitcoin, I knew that era was over. I think it was done by ~2015.

The Ethereum research community is still doing very neat cutting-edge things, but they’re a younger generation and didn’t claim to believe in the same principles (only to later disappoint us all.)


Yep ... I just looked it up; I bought my first BTC in 2010/11 ... back when they were just handing out 0.01 BTC to mess around with.


How did the Ethereum research community disappoint us all?


They just said that they didn't.


I think before ~2014ish it felt like the old internet. Using bitcoin to buy drugs online really felt like the internet was free and empowering again.


The 'old internet' was chock-full of hucksters and scammers and 'futurists' selling snake oil, at least from circa 1993 onward and in full bloom since around the time of the Netscape IPO. It was pretty much exactly like the crypto industry: a core of real technologists doing amazing stuff, and then a whole industry of scammy salespeople on top of it.

If you are talking about pre-web, I'd say crypto had that era too, up until around 2012 when bitcoin really took off.


The nature of the scam has evolved, and possibly the parties involved have changed, but the scams were always there.


The recent neural network "revolution" is giving me those vibes, even though it'll ironically kill the old internet with the upcoming spam bots.


If it were based on open ended tooling that anyone interested could experiment freely with, I would feel the same. Unfortunately these giant LLMs are almost the complete opposite of that.


I was just thinking that it would be great to do something like SETI@home but for an open, libre super-LLM: install a client that helps run/train the huge LLM and you get access to the network to do queries. Kind of like how Napster/Kazaa worked in their time.

I would love to work on doing something like that.


That is actually a great idea. One of the issues with such a network, however, is how the output would be produced, how the dictionary would be controlled, and how the training data could be managed. The transformer model allows different RNNs to be separated, but it would be challenging to 'put it all together'. Furthermore, bad actors could intentionally poison the network during the training process (one classic mitigation is sketched below).

Furthermore, how would upgrades work? When new techniques are discovered, it would be hard to upgrade it. We see these challenges already when it comes to major changes in cryptocurrency networks.
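On the poisoning point, the classic mitigation borrowed from BOINC-style volunteer computing is redundancy: issue each work unit to several volunteers and accept a result only when a quorum agrees (a real training setup would compare results within a tolerance rather than exactly). A rough sketch, with all names hypothetical:

    import random
    from collections import Counter

    REPLICAS = 3  # each work unit goes to this many volunteers

    def assign(work_units, volunteers):
        # Send every unit to REPLICAS randomly chosen volunteers.
        return {u: random.sample(volunteers, REPLICAS) for u in work_units}

    def validate(results_for_unit):
        # Accept a result only if a strict majority of replicas agree on it.
        result, votes = Counter(results_for_unit).most_common(1)[0]
        return result if votes > REPLICAS // 2 else None  # no quorum -> reissue

    volunteers = ["alice", "bob", "carol", "dave"]
    print(assign(["unit-0001"], volunteers))
    print(validate(["update_a1f3", "update_a1f3", "poisoned"]))  # -> update_a1f3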


What you’re describing is http://chat.petals.ml/


Thanks a lot for the link. It is an interesting project. I tried the Google Notebook and it was a trip:

https://ibb.co/qpbzBZJ

Somehow, the AI believes that Alan Shepard was the first person in space.

Doing this session reminded me A LOT of Asimov's short stories in "I, Robot". I feel that future AI "debugging" sessions will be like that. It's been a while since I read the book, so I'll have to re-read it.


Excellent. I will check that out and see if I can contribute.


I get the opposite vibe because the forerunner is ClosedAI.


True, but to be fair, it was only recently that OpenAI really took over with ChatGPT and their API cost. Before that (just 3-4 months ago), the open source models were excellent competitors.


Yeah but it's a huge gap. The other chat bots were just for research or fun, while ChatGPT is truly useful for average people.


Totally. Now it feels like the internet is more about exploiting people for profit.


>I miss the spirit of the internet back then

The classic SETI@home website [1] is a blast to browse.

[1]: https://seticlassic.ssl.berkeley.edu/


What was the RSA crack effort you mention? It doesn't seem feasible, considering finding a SHA-256 collision would take so much compute power we don't have words to describe it, and even coming up with thoughts to frame the problem is hard (https://www.youtube.com/watch?v=S9JGmA5_unY). I can't imagine cracking even an RSA2048 key with everyone working on it.


distributed.net ftw.. those truly were the days.


> I wouldn't look forward to unleashing an eternal, 24-7, intensive task with a connection to the internet

Just use another machine for stuff like this? Spare computers are cheap nowadays.


Look up Project Argus by the SETI League:

http://www.setileague.org/argus/


It was my machine that found the key in the very first RSA competition.


The spirit is alive with AI.


yes, after I posted that, I had the same thought.

but we spend a lot of time here talking about the lack of openness in AI, and some justified fear of the companies that own it smothering many other sectors the way search and cloud computing got smothered and centralized. a chatGPT@home-style large-scale group effort would be sort of cool to participate in... but then I think, so many opportunities for bad-actor exploitation


:'-( I installed SETI@HOME on both my office and home machines back in '98/'99. I know my contributions were not very much, in the sense of just how much can one laptop and a desktop do, but I watched (and hoped) as SETI@HOME progressed (:-) and then there's always that wishful thinking that someone somewhere would hit the jackpot)...

Hoping you guys come back again.

Best.



