Fusion energy breakthrough by Livermore Lab (ft.com)
1065 points by zackoverflow on Dec 12, 2022 | hide | past | favorite | 801 comments




Current thread, about the actual announcement:

US Department of Energy: Fusion Ignition Achieved - https://news.ycombinator.com/item?id=33971377


Very disappointed by the discourse in this HN thread. The same old quips over and over. "NIF is just a nuclear stewardship program", "it's not actually generating power", "fusion still 30 years away".

I think it's very clear, given the past year that NIF has had, that they are very rapidly approaching a point where we have the tech to "solve" inertial fusion.

https://lasers.llnl.gov/news/papers-presentations

Getting fusion right is done a magnitude at a time. Right now NIF would be within 1 magnitude if it were built with modern laser tech. Many fusion designs are 10 magnitudes away or more.

Their most recent article has a ton of great data and next steps:

https://lasers.llnl.gov/news/magnetized-targets-boost-nif-im...

This includes:

- Cryo-cooling the main target

- New alloys

- Magnetic compression of targets

The recent advancement that helped reach ignition (in the last article) boosted performance 40%.

The advancement between then and now: nearly 60%.

Within the past 6 months, NIF has nearly doubled energy output of the reaction.

Plus, if you know anything about fusion research, you'd know that energy output tends to scale non-linearly with energy input and size, typically on the order of the third or fourth power. Hence the existence of ITER.
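As a toy illustration of that scaling (the cubic exponent here is just an assumption for illustration, not a claim about any specific design):

```python
# Toy illustration only: assume yield scales as (driver energy)^3.
# The real exponent varies by design; the comment above just says
# "on the order of the power 3 or 4".

def fusion_yield(driver_energy, exponent=3):
    """Hypothetical yield ~ E^exponent, arbitrary units."""
    return driver_energy ** exponent

# Doubling the driver energy multiplies the yield by 2^3 = 8x,
# which is why modest driver upgrades can close whole magnitudes.
print(fusion_yield(4.0) / fusion_yield(2.0))  # 8.0
```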

NIF has uncovered some new science, closed the magnitude gap, and made it actually realistic for inertial confinement to be a feasible tech for a power producing plant.


>Their most recent article has a ton of great data and next steps:

That device in the photo is great. Looks to be about 16 AWG magnet wire. Guessing a 10 mm ID for the coil, and about 25 mm in length. To get to 26 tesla, looks like you'd need to push about 33,000 A through that coil. Coil inductance might be about 1 uH, and if the test lasts ~1 us, then you'd need 33 kV to push that 33 kA through the coil. At ~30 kV/inch dielectric strength for the insulation, you might not get arcing between the wires in air. Probably running the thing in vacuum? Looks like things check out.

https://www.eeweb.com/tools/magnetic-field-calculator/
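For anyone who wants to redo the back-of-envelope, here's a quick sketch using ideal-solenoid formulas; the geometry (turn count, ID, length) is guessed from the photo, just as in the comment above:

```python
import math

# Rough reproduction of the back-of-envelope above using ideal-solenoid
# formulas. The geometry (16 turns, 10 mm ID, 25 mm length) is guessed
# from the photo, exactly as in the parent comment.
MU0 = 4 * math.pi * 1e-7        # vacuum permeability, H/m

turns = 16                      # guessed turn count for 16 AWG over 25 mm
length = 0.025                  # coil length, m
radius = 0.005                  # inner radius, m (10 mm ID)
b_target = 26.0                 # field to reach, tesla

# B = mu0 * (N / length) * I  ->  I = B * length / (mu0 * N)
current = b_target * length / (MU0 * turns)

# Ideal-solenoid inductance: L = mu0 * N^2 * A / length
area = math.pi * radius ** 2
inductance = MU0 * turns ** 2 * area / length

# Voltage to ramp that current in ~1 us: V = L * dI/dt
voltage = inductance * current / 1e-6

# Prints roughly "32 kA, 1.01 uH, 33 kV" -- consistent with the estimate.
print(f"{current / 1e3:.0f} kA, {inductance * 1e6:.2f} uH, {voltage / 1e3:.0f} kV")
```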

>NIF has uncovered some new science

What is the new science? Seems like they are working on making the fuel pellets closer to perfect, which makes sense if you are trying to use the implosion shock wave inside the fuel as the source of heat and pressure needed for further fusion. I'm imagining that the laser initiates the surface fusion, and then you want that fusion to propagate inward, and you need things perfect so the fuel doesn't go squirting out the sides (so to speak), stopping the chain reaction.


Before ignition, very little science had been explored in the "burning plasma" regime. Now that ignition has been figured out, an entire new field of experimentation has been opened up.

https://www.llnl.gov/news/three-peer-reviewed-papers-highlig...

In particular, the original article talks about magnetic compression hypothesis being a byproduct of white dwarf simulation. With this new regime, they were able to apply the same ideas to fusion, resulting in the breakthrough.

With ignition being a regular thing in laser fusion going forward, I suspect many groups will have some slightly varied approach or some technique improvements.

If you're into fusion and lasers, there's a lot of areas that are still ripe for magnitude leaps.

- Laser power, timing, materials, and cost

- Metallurgy of the target canister

- Construction of the target and perfecting it as you mentioned

- Absorption of energy

I believe the NIF will focus on #2 and #3, as of course they focus more on making the "boom bigger" rather than making it cost effective or useful. IMO another group (startup or otherwise) will step in to pursue an actual power project in this space.

One area to innovate here is to use a different fuel mixture that doesn't produce neutrons. We wouldn't need liquid lithium/lead, breeding, or any of the complexities people very commonly complain about.

- https://en.wikipedia.org/wiki/Aneutronic_fusion

- https://en.wikipedia.org/wiki/Direct_energy_conversion

It's entirely within the realm of possibility that the technique used to achieve ignition will open the door to 5:1 or 10:1 Q with neutron-free fuels.

Even a total Q of 2:1 or 3:1 is a huge win, and that's within a magnitude of the modern tech.

--

Something I want to mention here too - the easiest aneutronic fuel mixture available is deuterium (²H) + He3. It hasn't been explored much since He3 is hard to come by on Earth (though you could mine it from the Moon!).

But Helion has patented a way to generate He3 from deuterium (D-D) fusion. We don't need to mine He3 to achieve neutron-free fuels; we just need to transmute it from deuterium in seawater.


"magnetic compression hypothesis being a byproduct of white dwarf simulation"

That is the coolest thing I'll read all month.


Well, it does happen sometimes - a Nobel Prize laureate who worked on pulsars also developed a popular amateur-radio modulation that works under the noise floor and makes regular trans-continental contacts possible with essentially a cheap SDR and a piece of wire.

https://en.m.wikipedia.org/wiki/FT8

https://en.m.wikipedia.org/wiki/Joseph_Hooton_Taylor_Jr.


I wish I could remember more specific details and terminology, but 10-15 years ago a friend working at an economic development agency asked me to look over a patent invention submitted by some Los Alamos researchers. I was in no way properly qualified to understand the physics, but it nominally involved a faster than light component, and he thought he'd run it past me to reduce the chance he looked foolish when putting it to more qualified people.

The patent was based on research analyzing the propagation of wavefronts on neutron stars[1]. I forget the term(s), but the critical aspect related to features which travel along the wavefront faster than light. This feature couldn't actually be used to communicate faster than light, obviously, however the patent claimed to be able to use it to defeat active radar jamming--more specifically, radar deception. Because this controllable wavefront feature (modulation? polarization?) could be FTL, and the waves themselves light-speed, it was thus intrinsically impossible to fake a correct return signature.

From a lay geek's perspective, I told my friend that AFAICT this aspect of the invention seemed not obviously flawed: it was FTL only in the sense that you could swing a flashlight across the moon and the apparent motion of the reflected beam could be faster than light. Normally such a phenomenon is merely a curiosity, but apparently the inventors had put it to some practical use, at least in theory.

[1] IIRC, either physical or magnetic waves generated by starquakes.


I'm guessing they were referring to phase velocity:

https://en.wikipedia.org/wiki/Phase_velocity

You can see the effect shown in the first animation of this article if you keep track of the wave peaks in the wake of a boat. They start at the back of the group of waves, move through it, and fade away as they reach the front.

https://www.youtube.com/watch?v=lWi_KpBy8kU
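For the boat-wake example specifically, deep-water gravity waves obey ω = √(gk), which makes the phase velocity exactly twice the group velocity; a quick numerical check (g and the 10 m wavelength are just illustrative values):

```python
import math

# Deep-water gravity waves: omega(k) = sqrt(g * k).
# Phase velocity v_p = omega / k; group velocity v_g = d(omega)/dk.
# Analytically v_g = v_p / 2, so individual crests overtake the group --
# exactly the boat-wake effect described above.
g = 9.81  # m/s^2

def omega(k):
    return math.sqrt(g * k)

def phase_velocity(k):
    return omega(k) / k

def group_velocity(k, dk=1e-6):
    # central-difference derivative of omega(k)
    return (omega(k + dk) - omega(k - dk)) / (2 * dk)

k = 2 * math.pi / 10.0  # wavenumber for a 10 m wavelength
print(phase_velocity(k) / group_velocity(k))  # ~2.0
```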


Is it possible to leverage these power gains to feed back in via increased laser power?


Here's a video that shows a machine at Berkeley Lab that makes high-performance magnet wire. Not sure if it's related to this project, but it's still pretty interesting to see how complex just the wire is: https://www.youtube.com/watch?v=FmmNRaKpBTI&t=2138s


"It looks a bit Rube Goldberg esque coming out the back"

Yes it does indeed. Amazing amount of trial and error to build such a machine.


Well, Goldberg was an engineer and supposedly did construct his drawings to be 'technically' feasible.


Ironically, vacuum, unless nearly perfect, is not that good an insulator. Any HVAC tech who has started a compressor under vacuum (albeit a much less perfect one) can tell you :)


Yeah, the whole target bay is in vacuum, which helps with the arcing.


Vacuum is supposed to increase arcing.

People who want to prevent arcing flood the cavity with sulfur hexafluoride. Fun fact: it has about 25,000 times the "greenhouse potential" of CO2. Dunno if that includes lifetime in the atmosphere, or just instantaneous opacity to IR. All the wind turbines are pumped full of it. There is an effort underway to switch to something else of proprietary composition.


They're not pumped full of it though, are they.

"Data from Vattenfall suggests leakage emissions from Europe’s 100,000 wind turbines were about 900kg of SF6 over the last six years. This is equivalent to 3,525 tonnes of CO2 a year. This includes the release of gases during the reclamation and recycling process. At end-of-life the turbine switchgears are collected and the sulphur hexafluoride gas is reclaimed and reused in new equipment.

By comparison wind energy avoids the emission of 255 million tonnes of CO2 in Europe a year by generating 336TWh of electricity displacing fossil fuels. The SF6 leakage therefore represents around 0.001% of the emissions avoided thanks to wind energy every year."


That is only true up to a point. Look up Paschen's law. Once you go below a certain pressure-gap product, around 10 Torr·cm depending on the gas, decreasing pressure increases the breakdown voltage.

It's probably not feasible to pump to such low pressures in wind turbines, so they probably don't even try. But for NIF, it's common.
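Here's a sketch of the Paschen curve with rough textbook constants for air (A, B, and γ below are approximate literature values, not from the thread; with these numbers the minimum for air comes out under 1 Torr·cm, though it varies by gas):

```python
import math

# Paschen's law, V(pd) = B*pd / (ln(A*pd) - ln(ln(1 + 1/gamma))).
# A, B, gamma below are rough textbook values for air; illustrative only.
A = 15.0       # ionisation coefficient, 1/(cm*Torr)
B = 365.0      # V/(cm*Torr)
GAMMA = 0.01   # secondary-electron emission coefficient

def breakdown_voltage(pd):
    """Breakdown voltage in volts for pressure-gap product pd in Torr*cm."""
    return B * pd / (math.log(A * pd) - math.log(math.log(1 + 1 / GAMMA)))

# Non-monotonic: below the Paschen minimum, lowering pressure *raises*
# the breakdown voltage, which is why a hard vacuum can insulate.
for pd in (0.4, 0.8, 5.0, 50.0):
    print(pd, round(breakdown_voltage(pd)), "V")
```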


I don’t know about the actual turbines, but there is GIS switchgear and it’s sealed. Modern GIS don’t leak much(extremely little).


Sorry, this is just FUD.

Provide links? Also, 25,000 times might be OK if the amount of gas released into the atmosphere is, say, 1/10^6 of the amount of CO2.


SF6 is 23,900 CO2e.


A breaker can use SF6 gas, but usually that is at 60 kV and above. For 35 kV and below, vacuum-bottle breakers are much cheaper than SF6.

Individual wind turbines are not generating at more than 13.8 kV, so I would be surprised if they had any SF6 in them.


> Very disappointed by the discourse in this HN thread. The same old quips over and over.

To be fair, that's not entirely the fault of HN. It's hard to get excited about fusion research when I almost always feel misled, because it is almost never explicitly stated that we are talking about Q-plasma here. I don't expect much from science journalism, but I feel that the fusion scientists have no problem silently playing along this misconception, which they are perfectly aware of.


"but I feel that the fusion scientists have no problem silently playing along this misconception"

That is the way to keep the money flowing.

See

"How close is nuclear fusion power?" https://www.youtube.com/watch?v=LJ4W1g-6JiY

and (different area but about how high investment physics works, "just around the corner")

https://www.youtube.com/watch?v=9qqEU1Q-gYE


Plus, this is about the tenth time in the last 10 years we've been told "we are around the corner, clean energy will save us". Please tone down the message.


I like how this graph describes why nuclear fusion technology always seems to be "just around the corner": https://imgur.com/a/59NO34a


I really hate what hippies did to nuclear energy. We would've long solved the clean energy problem by now if they hadn't violently opposed anything with the word nuclear


nuclear energy is an existential threat to the green movement


green movement is an existential threat to nuclear energy


If they could explain how fusion energy can kill more people than fusion bombs funding would be unlimited.



Maybe the corner is just a very long one, but one that's so important to turn that any material progress is important news? I think post-scarcity is monumental.


>Very disappointed by the discourse in this HN thread. The same old quips over and over.

We see these comments on every science thread because almost all of these people lack the requisite expertise to weigh in on the actual details, so instead they make a high-level criticism to give the appearance of having some kind of knowledge on the subject. Moreover, they think that crapping on things equates to being a critical thinker, and have convinced one another that this is so.


> because almost all of these people lack the requisite expertise to weigh in on the actual details

You mean, starting from the journalists writing the articles?


They're probably going by past experience with other tech products, which get development news but then you never hear about them again, or, if you're lucky, they hit consumer markets like 10 years later.

I have no experience in fusion, so can't comment on that either way.


I have thirty years of experience with people telling me we’re really close on fusion. Next year or the one after will mark my 30th anniversary of having heard the phrase, “fusion is 30 years away and always will be.”

If the people working on it today want to have beef with someone, they should complain to the people who have been lying for long enough that a joke about it is at least three decades old. If the real problem is that the only way we've gotten funding for this is to keep misleading politicians on how long the remaining road is, that's fine, but once you're known for lying - intergenerationally - how on earth do you expect the general public to take anything you say seriously?


Quite true. And this is something I have been quite guilty of myself.


I regret that I have but one upvote to give.


Here's the thing:

In order to make inertial confinement work, this process needs to occur multiple times per second

All the fancy stuff with the hohlraum, magnetic compression, target cryo cooling must be accomplished accurately and repeatedly, BUT ALSO shot out of an "injector" to fall precisely into alignment with the lasers, in vacuum within a plasma field...

When you write it all out! Yikes!

Then! This has not included any capture of energy, so that part must be implemented as well, which would effectively mean placing the entire NIF target chamber inside a thermal heat exchanger.

So, no, inertial confinement is probably the furthest of any approach from being a suitable arrangement for power production.

Physicists have an uncanny ability to ignore engineering.

As a proponent of fusion and fusion research, I think it's important to keep the focus on what is valuable about the work being done and not mislead the general public with flights of fancy.

If you want to understand radiative pressure and plasma characteristics, this is the place to be, for sure


All of the problems you just described are solved for in EUV light sources for chip lithography at 100 kHz or more. These are less hard than you make them sound.

https://www.youtube.com/watch?v=5Ge2RcvDlgw

Probably the hardest part is making sure the droplet is cost effective enough that we care.

Again, to op's point, this is an incredibly shallow analysis. The question I would be pushing towards is:

What are the hardest remaining engineering problems? How likely are we to overcome them? At the end of that process will it be a cost competitive outcome?


These processes are not similar.

The laser does not hit the target; it hits the interior of the hohlraum, generating x-rays. This creates radiation pressure on the cryogenically cooled target, which is inadequate for fusion without also pulsing a high magnetic field to confine the plasma (this is now inertial/magnetic confinement fusion).

The energy output is not sustained for any duration; rather, it is nearly instantaneous. The plasma and debris must be cleared from the beam path for the next laser pulse and target injection.

These are the hardest remaining engineering problems if you discount the fact that energy must then be absorbed and put to use with constant material degradation of the target chamber and the necessary high output production of tritium filled beryllium capsules.

Overcoming these challenges will likely take substantially longer than the time it took to produce a functioning EUV machine.


This is unlike an EUV light source in that once the fusion begins, ideally you don't even need the lasers any more; the fusion from pellet (n) ignites pellet (n+1). Getting that to work is probably orders of magnitude more difficult than what just happened.


Why? You just feed the energy generated by the process back into the lasers. Most power plants have an initial negative net energy.


> BUT ALSO shot out of an "injector" to fall precisely into alignment with the lasers, in vacuum within a plasma field...

It does sound like magic, but doesn't EUV involve some process similar to this? Something about shooting drops of tin with a laser? That sounds like magic to me too but is apparently a thing. Obviously two totally different things, but the level of magic to me is the same.


The tin is reaching 400,000 K, a fusion reactor has to attain more like 100,000,000 K.


I am not a physicist. Most here aren’t but fancy themselves experts on things they know precious little about. I’m not surprised but I agree it’s disappointing.


Even a 100x fusion gain wouldn't make NIF a net power producer, because last time I checked the lasers produce ~99% heat and <1% coherent light. The relevant metric is the ratio of energy input into the facility vs. its output.


NIF was never intended to produce net power output. It's a research facility. The latest result is a major milestone demonstrating that net energy gain is achievable. LLNL, and other labs, are developing diode-pumped laser systems which are closer to 25% efficient, rather than 1%, and can operate at the high repetition rate required for power production.

That means the reaction would only need a gain of 4, rather than 100, to generate net power.
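The break-even arithmetic can be sketched as a wall-plug ("engineering") gain; the 1% and 25% laser efficiencies are from this thread, while the thermal-conversion factor is a generic assumption, not a NIF figure:

```python
# Wall-plug ("engineering") gain vs. plasma gain Q -- a rough sketch.
# laser_eff values (1% for NIF today, ~25% for diode-pumped systems)
# come from this thread; thermal_eff is a generic assumption.

def engineering_gain(q_plasma, laser_eff, thermal_eff=1.0):
    """Energy out of the plant per unit of electricity fed to the lasers."""
    return q_plasma * laser_eff * thermal_eff

# NIF-style ~1% efficient lasers: even Q=100 only breaks even.
print(engineering_gain(100, 0.01))  # 1.0

# Diode-pumped lasers at ~25%: Q=4 hits the same break-even point.
print(engineering_gain(4, 0.25))    # 1.0
```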


Couldn't you use waste-heat recovery to recoup some of the lost heat from the lasers? I haven't ever ventured down that route, and I don't know the scale or quality of the heat here. Think combined-cycle gas plants. I recognize it's totally different tech, but just offering ways to get some value back.


> Many fusion designs are 10 magnitudes away or more.

JET Tokamak was within a factor of 2 and ITER will overshoot by a factor of 10.


I think you and the parent poster may be talking about different "goals".

My lazy quick skim of the main article here is that NIF has achieved a Q value of 1.2 presently.

ITER is aiming for a Q of 10 [0]; i.e. Q=10 means the fusion outputs 10x the input energy, which is (considered by some) roughly enough to break even in energy production [1], i.e. you only need to recapture ~10% of that energy (as heat or as electricity, not sure...)

So parent poster saying NIF is an order of magnitude away means Q=1.2 -> ~Q=10

And ITER seeking Q=10, means that's the goal that NIF is an order of magnitude away from, according to the parent poster.

[0] Q=10 for Iter: https://www.iter.org/sci/Goals#:~:text=ITER%20is%20designed%....

[1] Q=10 is a rough minimum for energy production (quick and dirty source from google) https://www.powermag.com/fusion-energy-is-coming-and-maybe-s....


There is nothing disappointing in skepticism about the importance or value of this work.

1. It is, purely, bomb research dressed up as civilian activity for funding purposes. Everyone working on it has top-secret clearance.

2. It has no consequence for any civilian project. The target that produced a couple of MJ cost $10M. (2.4 MJ is <0.7 kWh.) A real plant would need to feed them in at a high rate. Q is not the important measure. Dollars out / dollars in is the right measure, and everyone is still at exactly zero, with no plausible prospect of ever exceeding 1.

3. Extracting useful energy would require capturing hot neutrons in a "blanket", heating it up, and running fluid through it to boil water to drive a steam turbine. The minimum practical size for such a "blanket" exceeds that of a large fission plant. To collect enough neutrons to be useful requires a huge volume of plasma, as even compressed plasma is very diffuse vs. fissiles.

4. Compressing the plasma with superconducting magnets could increase density, but then the neutron flux through the smaller surface area of the chamber wall would destroy it that much more quickly.

5. The hot neutron flux would also quickly weaken the structural parts required to contain the enormous forces exerted by the electromagnetic coils. Superconducting coils would impose even larger stresses. No research has gone into identifying a viable material, in decades, despite that none is known. After a short time the reactor parts would all become weak and (also) fiercely radioactive. Repairs would need robots not yet designed.

6. Civil fusion would require a large amount of tritium, which no one knows how to make economically.

7. Steam turbines cost a lot to operate, regardless of heat source. No other generation method relying on such steam turbines -- coal, fission, geo -- is competitive today vs. renewables. As the cost of renewables continues on down, those methods get less competitive by the day.

Fusion is intrinsically interesting, just not for power generation.
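To put the dollars-out/dollars-in point (item 2 above) in concrete terms, using only the figures quoted there ($10M per target, 2.4 MJ yield):

```python
# Dollars per kWh using only the figures quoted above:
# a target costing ~$10M that yielded ~2.4 MJ.
target_cost = 10e6      # dollars (figure from the comment)
yield_j = 2.4e6         # joules per shot (figure from the comment)

kwh = yield_j / 3.6e6   # 1 kWh = 3.6 MJ
print(round(kwh, 2))    # 0.67 -- i.e. "<0.7 kWh" as stated
print(f"${target_cost / kwh:,.0f} per kWh")  # $15,000,000 per kWh
```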

One company, Helion, is trying to make a fusion device that does not emit many hot neutrons. However, achieving conditions for this process, D-3He, is even harder than for D-T fusion. They hope to breed their own tritium, which would eventually decay to the 3He they actually need, but it is not clear how they will produce enough. (Fun fact, 3He loves to turn back into tritium.)

If they cannot, but they do get it working, it might end up usable for outer solar system exploration, which is difficult to power.

This "milestone" provides exactly zero meaningful information for the magnetic confinement fusion that is the only avenue being pursued for civil power.

Fusion offers no prospect of "unlimited free energy". It offers instead very expensive energy, or possibly none at all. We already have access to unlimited free energy, and need only build out the solar, wind, and maybe tidal systems to collect some as it goes by.


1. The same research that brought us the atomic bomb brought us so much more. Who cares about the motivation of the research? This is such a bad take considering the history of technology evolving out of military development.

2. Again, all technology is expensive in the beginning. Who cares? The important thing here is to climb magnitude by magnitude. NIF climbed many magnitudes in recent history, making it notable.

3. As you mentioned, Helion and direct energy capture. D+He3 + DEC might not be feasible with a tokamak, but the scaling laws of fusion (size, current, B field) are in favor of experiments that get close.

4. See 3

5. See 3

6. See 3

7: See 3

I think you have a very negative take on what is an amazing breakthrough accomplishment. Even if the NIF doesn't end up converting their research into a commercial powerplant, they have at least demonstrated experimental viability of inertial confinement fusion. It's only a matter of time before the next generation shows viability of D+He3 fusion and then we'll have even more options.


"Amazing breakthroughs" have notable consequences. This thing has no prospect of any consequences. It is just one number inching past a second number, the second one of dubious provenance.

You may call it a "milestone" if you like, even though it is a milestone on the way to nowhere. The only reason fusion gets any attention is because megaton bombs worked, and already demonstrated Q>1.

People are building out solar and wind power systems that stand some chance of fending off climate catastrophe, each day pushing fusion, like fission, farther from any prospect of competitiveness.


> "Amazing breakthroughs" have notable consequences. This thing has no prospect of any consequences.

The consequences of a breakthrough are often not noted at the time. It is only when the consequences have happened that we recognize them and grasp the change which has been catalyzed by the discovery. Who, other than Mathematicians, cared about Number Theory when it was discovered?

I don't know enough about plasma physics to characterize this one way or the other, but I think looking for notable consequences is very slippery ground to stand on when dismissing a result.


If there are ever any notable consequences of any value, we can whoop about it then. There are dozens of advances of equal magnitude, in as many fields, every day, without attracting a glance.


> Everyone working on it has top-secret clearance.

along with 1.3 million others [0]. just sayin'

[0] https://news.clearancejobs.com/2022/08/16/how-many-people-ha...


Yes, that is bad.

But the fact that they don't hire anybody without one still tells us something.


I've read some very well informed critiques on HN. The general sentiment will always be super negative absent an easy topic or a clickbait headline which over simplifies a problem to tap into a commonly held feeling.

These sorts of hard questions about scaling it up into real life have been some of the most persuasive fusion critiques.

And the point you tapped into, about how detached the government research world is from having to justify something with hard $$ and all that comes with it, is very legitimate.


> 3. Extracting useful energy would require capturing hot neutrons in a "blanket", heating it up, and running fluid through it to boil water to drive a steam turbine. The minimum practical size for such a "blanket" exceeds that of a large fission plant. To collect enough neutrons to be useful requires a huge volume of plasma, as even compressed plasma is very diffuse vs. fissiles.

I worked as a control systems engineer in the nuclear power industry for 8 years back in the '90s. I worked on Lungmen in Taiwan, ABWRs (Kashiwazaki 6/7) and even Fukushima (power uprates) in Japan and Grand Gulf and Pilgrim in the USA.

Fusion is billed as a "clean" alternative to fission reactors, but I think this is (another) false hope for the technology. I still recall my nuke prof telling me that fusion (if it ever works) is going to be an even bigger waste problem than fission reactors. The 14 MeV neutrons are going to neutron-activate tons of shielding or heat-extracting blankets, which would be a huge disposal problem. He also thought that neutron embrittlement of plant structures was going to be a serious problem; it is already a problem in fission plant cores, which use thermal neutrons. Carting that waste away at EOL will kill the economics.

We should start building fission plants now which are safe and clean enough and can provide power until something better comes along.


Something better has already come along. Solar and wind produce power more cheaply than the world has ever seen.

Fission is distinguished by being the most expensive power in current heavy use. All the nukes will be mothballed, soon enough, unable to find takers at a price that would pay for continued operation.


> 1. It is, purely, bomb research dressed up as civilian activity for funding purposes. Everyone working on it has top-secret clearance.

Mostly agreed: there may be applications such as experimental astrophysics, but that's certainly not the main motivation.

> 2. It has no consequence for any civilian project. The target that produced a couple of MJ cost $10M. (2.4 MJ is <0.7 kWh.) A real plant would need to feed them in at a high rate. Q is not the important measure. Dollars out / dollars in is the right measure, and everyone is still at exactly zero, with no plausible prospect of ever exceeding 1.

The cost of just about anything new regarding fusion experiments is not very meaningful: likely it had to be invented, designed and manufactured just for them. Of course it's crazy expensive, but it doesn't mean prices won't go down after an industry around fusion has been established. Just look at the price of a c-Si solar cell in the 70s.

> 5. No research has gone into identifying a viable material, in decades, despite that none is known. After a short time the reactor parts would all become weak and (also) fiercely radioactive.

I don't know about inertial confinement, but in magnetic fusion that is completely false. Materials with low activation, radiation damage resistance and good plasma properties have been continuously researched for the last 30 years: the current candidate for ITER is EUROFER97 for which you can find almost 800 publications. [1]

> 5. Repairs would need robots not yet designed.

Also false. Not only are there several remote-handling designs for DEMO power plants, but they have also existed for a long time. For example, JET has been operated remotely since 1997, during the DT1 campaign. [2]

> 6. Civil fusion would require a large amount of tritium, which no one knows how to make economically.

Well, this is dishonest. Of course no one knows how to breed tritium economically: we don't know what the economy will look like in 40-50 years. But we surely know how to do it.

Fusion power plants are designed for self-sufficiency, producing more tritium than they consume by a factor of at least 1.05 (called the tritium breeding ratio). Very briefly, this involves a breeding blanket that converts lithium to tritium and a complex chemical plant that extracts newly produced tritium from the blanket and also recovers it from the unburnt plasma fraction. See [3] for an overview of the various designs that will be tested in ITER.

For as long as there are a few CANDU reactors around, the current tritium supply will be enough to bootstrap future fusion plants without expensive ad-hoc production. [4]

[1]: https://www.journals.elsevier.com/nuclear-materials-and-ener...

[2]: https://yewtu.be/watch?v=hg6MnjG7m6U

[3]: https://doi.org/10.1016%2Fj.fusengdes.2011.11.005

[4]: https://doi.org/10.1016/j.fusengdes.2013.05.043


1. Tritium from CANDU is expensive ad-hoc production. The expense is well-documented.

2. Robots for fuel handling in a JET reactor are as far from robots for fusion plant repair as a Sopwith Camel is from Artemis. All they share is the word "robot".

3. Nobody can suggest any way to separate, every day, a few grams of tritium at ppb concentration from a thousand tons of molten FLiBe and lead.


7. It doesn't matter, even if renewables were free. There is no way to store energy from intermittent sources that would allow us to power human civilization. You are going to end up like Germany, replacing nuclear power with coal and gas.


You can pretend there is no storage while the world goes ahead building and using it, at utility scale, paying no attention to you insisting they cannot.


I get disappointed that I can't get interest in stories like

https://www.world-nuclear-news.org/Articles/First-Light-team...

Fusion has the strong advantage (and disadvantage) that it is a powerful neutron source. Even a very low performance reactor can be useful as a neutron source

https://ats-fns.fi/images/files/2019/syp2019/presentations/T...

Fusion might be useful for making isotopes long before it is competitive as an energy source. In the 1980s I know scientists were looking at hybrid systems that convert ²³²Th to ²³³U and ²³⁸U to ²³⁹Pu, as fusion reactors produce so many high-energy neutrons that they could be better than fast breeders for manufacturing fuel for thermal fission reactors. In fact, it is very possible a fusion reactor could be used to make fuel for nuclear weapons.


"and made it actually realistic for inertial confinement to be a feasible tech for a power producing plant."

Before this is not solved:

"it's not actually generating power"

I simply would not talk about a real power plant yet, because a real power plant has economic constraints. As long as the current approach is not even generating energy, all the scepticism is warranted, if we are talking about something that is supposed to solve energy generation and climate change. This is why people are upset with it: we need not promises of unsolved tech, but solutions now. So fusion remains exciting and cool tech, and I love to read about its recent progress, but please, without illusions. Even if they could generate power tomorrow, it would still be a very long, unknown way until it actually helps us.


HN is generally more excited in a minor release of well-known libraries or of GNOME, than in medical, space, or biology breakthroughs.


In my youth, I was familiar with the facility that came before NIF, built in the '70s. The goal at the time was to use a smaller target to demonstrate that the foundational principles that would underpin the success of NIF would work. As far as I know, they never succeeded. NIF was built anyway because this type of device is well suited to allowing access to certain types or demonstrations of physics that are otherwise inaccessible and important in a specific field of research not directly related to the flourishing of our race.

The article was old news before it was written. The article mentions the previous 'success' (yield was higher than previous experiments), and that was over a year ago now. They haven't been able to reproduce the previous experiment even knowing as precisely as they can what they perceive to be the preconditions necessary for an effective reaction. It also seems that this article was written about a single experiment. They will not be able to intentionally repeat the experiment. The manner in which they're exploring the Pareto front is like groping in the dark to find a light switch that has an unknown texture and conformation. It's a classic Monte Carlo simulation, but they have one iteration every several weeks or months, and they cannot even possibly identify all the controlling parameters, nor do they have the necessary throughput or bandwidth to succeed in their pursuit without windfall.

The low hanging fruit providing the basic harmonics of the solution were discovered well before I was even introduced to this technology (in the 70's and 80's. Coincidentally around the moment of the genesis of many of our modern treaties on weapons testing).

You are overly optimistic; a 40-60% increase in nearly nothing is still nearly nothing. The PR campaign around this event is, I think, more significant in its political convenience, and in whitewashing the purpose of the facility. There are significant discoveries that still need to be made to even make the reactions consistent, and they will not come conveniently or quickly. Once the reactions are better understood and the mechanisms can be manipulated with intent, the distance between the science and a practical industry / commercial product will require even more hurdles that stretch the imagination to be overcome. For instance, I cannot conceive of a practical mechanism for actually utilizing any fraction of the massive amount of energy released in a fraction of a second in a chaotic murder of wavelengths and particles. The most practical way we've yet discovered for converting neutrons to electricity is through boiling water. Grossly inefficient in other contexts, I'm not sure it has even marginal utility in this scope.

I for one am 100% sure I barely know what I'm talking about. My disclaimer is that I'm not a physics guy, and high energy density physics was only a hobby of mine at one brief point in my life. Through perspicacity and access to papers and people, this is my honest mental model of the whole thing. You're welcome to your perspective, but although you seem well informed you sound very inexperienced.


Do you realize that this is 50% of the problem? The other 50% is how to "compress" fusion.

https://coldfusionnow.org/power-equivalent-to-the-sun-we-alr...

This is one aspect of the problem. The other aspect is that we should create decentralized technology that everybody can use at home. It would make our world a much better place, much more resistant to many things (including terrorism). I think a small amount of virtually infinite energy is a much better option.

https://ndb.technology/


Fusion energy is a holy grail in the field of nuclear power. The claims throughout the years have been ridiculous. I think it is just natural that people are skeptical.


    > Very disappointed by the discourse in this HN thread. The same old quips over and over. "NIF is just a nuclear stewardship program", "it's not actually generating power", "fusion still 30 years away".
The interesting thing here is that every part of what you said I completely agree with. My behavior, however, indicates otherwise. I didn't read the article[0]. I went directly to the HN comments mostly because I wanted to cut through the hype.

Basically, I came to the comments to hear from the skeptics. Of course, most of the skeptics fall victim to a mental trap.

I think there is one key difference between "what I was looking to read from skeptics" and "what most skeptics used as arguments". The information I was seeking: to understand the difference between what the "breakthrough" was being reported as and what the breakthrough actually was. The information I received was: "this is impossible for (reasons)"

An equally important thing I was seeking was to understand how this work might affect other industries (before it results in "fusion power").

The thing I'm least interested in is hearing "why it will never happen." I think most of us know many of the reasons this is a "Marsshot" problem[1]. I think most of us get annoyed when the media presents news in a manner that provides the general public with extremely unrealistic expectations and are sensitive to the dangers of that, but we get frustrated by those kinds of comments because, likely, none of us need to be told that! :)

It's impossible to make a useful argument to a skeptic that "this technology will exist in (insert timeframe)." The point at which (timeframe) is a trustworthy estimate usually coincides with the technology maturing to the point that the skeptics fall off (or turn out to be right if "timeframe" is never). And there's a long way to go (I think I saw a list of 10 or so "extremely hard problems") but this certainly appears to be something that is chipping away at one of the "impossible problems." Over-simplifying as this is, the rate at which technology advances is not linear; it accelerates. The next problem may not be as difficult or knowledge we attain from solving this one may be able to be used to solve related problems[2].

[0] There were several others on the topic and being ft.com, I assumed it would require a subscription that I do not have.

[1] Maybe far more difficult, but I never liked "moonshot" when describing something that hasn't been done, yet.

[2] Again, not a physicist, but reading through various "fusion is doomed" lists, many of the problems center around "the word 'hot' is a woefully inadequate description".


The key is that it is not a breakthrough at all. Breakthroughs have consequences, and this will have none. Soon they will claim "Q>5", "Q>10", etc. all equally inconsequential.


After a few more major breakthroughs we'll be where fission was in 1942 after Fermi made the first man made neutron chain reaction. After that, we can see what a practical electricity producing plant looks like, and see how much people actually care about small amounts of tritium radiation.

At the moment fuel costs in fission are like 5-10% of total costs for a fission fleet. In fusion it could be lower, but that will by no means mean the overall system will be cheaper.

We'll have to see the cost tradeoffs: fusion makes much less radioactive material per kWh than fission (but it still makes some) vs. simplicity. Fission is relatively trivial: just put special rocks in a grid and pump water over them as they pour out their star energy.

Progress is good and exciting, but I don't see any reason to think this will have major implications for energy systems anytime soon. Would be happy to be wrong though.

Disclaimer: I switched from studying fusion energy to advanced fission 16 years ago.


Personally I think fission power's failure is a political and marketing one. I don't agree that the waste disposal issues, or the safety issues, are quite the big deal people make of them. (Not saying there are no unsolved issues, just that the issues that exist are not significantly worse than those present burning fossil fuels, and are better in some dimensions. They're just different, and in some ways very emotionally so.)

I think it might be fine that fusion power may be more expensive in some ways than fission, as long as its reputation is kept clean (figuratively and literally). Market fusion power as the savior of humanity, and get enough people to believe it, and it'll be fine.


> Personally I think fission power's failure is a political and marketing one.

I think it's because of the occasional catastrophic failures that spatter our short history with the technology. Fukushima made headline news around the world, leaked large amounts of caesium-137 into the ocean, caused a 20km evacuation radius, is projected to take a total of 30-40 years to clean up, and people think of it as not that bad of a nuclear incident.

In comparison, burning fossil fuels is a classic tragedy-of-the-commons problem. Way less sensational. You can do the math and say nuclear has a safer track record than coal/oil. You can point to design, engineering, or management faults in historical failures. It doesn't change the fact that nuclear had a very fair chance at being the future and has shown itself to not be trustworthy. If humanity was a little more perfect, maybe we could have pulled it off.


People think of Fukushima as (1) one of the very worst nuclear incidents there has ever been, and also as (2) something that cost very few lives.

Yes, something like 150k people were evacuated because of worries about radiation. What you don't mention is that the total number of people evacuated was 470k. Most of the people who had to leave their homes had to leave not because of anything nuclear but because the enormous tsunami destroyed their homes.

So the Fukushima story is: massive natural disaster that caused enormous destruction and tens of thousands of deaths; a nuclear power plant was in a badly affected area; the damage was expensive to deal with but the total number of resulting deaths was, er, maybe about 1.


I think you're missing key elements to the Fukushima story. It wasn't a surprise natural disaster, it was a predictable eventuality.

1. People tried to ring alarm bells about the building codes (and the reactor specifically) not being able to handle earthquakes of a size Fukushima was likely to experience. They fell on deaf ears.

2. Japanese government admitted guilt for poor oversight and regulation.

3. Three executives were put on trial for negligence. They were found not guilty, but that's not the same as innocent.


Oh, for sure, Fukushima's story is not one where everyone did the right thing. It's one where there was negligence and carelessness, as well as a huge (and, yes, in some sense predictable) natural disaster ... and despite all that, the actual reactor-related consequences were not so very bad.

If the question were, say, "how much should we trust the Japanese government?" then Fukushima is not very encouraging. But if it's "how worried should we be about nuclear power?" it seems pretty encouraging to me. Lots of errors and negligence, huge natural disaster, and even so scarcely any lives lost and most of the harm done would have been the same without the nuclear power plant.


> I think it's because of the occasional catastrophic failures that spatter our short history with the technology.

Airliners have the same problem, yet their spin doctors are much more successful. Everybody keeps believing they're the safest form of travel.


Fukushima itself killed 0 people. Compared to the overall issues caused by the tsunami, it's a footnote, really.

> caused a 20km

Questionable whether that was actually necessary or just an overreaction.


Nuclear fission is probably one of the greatest victims of populist democracy. It seems the more modern "socially-democratic" a country was over the last 30-40yrs the more anti-nuclear it became. The raw output of markets or the tough hand of socialist dictatorship was the biggest positive driving force in the early history of fission energy.

But ultimately it's such an expensive, society-tier level of investment that it's subject to whims and pressures more than almost any other technology that has benefited society in recent history. So likewise it's also most at risk of the downside of populist politics (short-term thinking, highly reactive to noisy local issues, driven by emotional outrage, etc).

I wonder if its prospects are even worse off now thanks to social media.


> Nuclear fission is probably one of the greatest victims of populist democracy.

Oh yes indeed. Nuclear energy is not legal in Italy, so I did some research:

We had nuclear reactors in the 80s, until we held a referendum on nuclear energy, 3 months after Chernobyl. The result: overwhelmingly against, so we dismantled our reactors. Decades later, the Government pushed for a new referendum. When did they choose to hold it? 6 months after the Fukushima disaster... you can guess how the Italian population voted.


> The raw output of markets or the tough hand of socialist dictatorship was the biggest positive driving force in the early history of fission energy.

Is this true?

I always considered fission tech to be used for the following reasons, and none of them are economic. The numbers I've crunched say fission isn't the economic choice, but that varies depending on how much value is placed on 'base load'.

1. Cold war era vanity tech. Nuclear weapons were used to end World War 2, and now they are just another infrastructure project for us.

2. Disguised weapons research. Countries blame each other for this all the time in the nuclear non-proliferation era.

3. Strategic choice to avoid traditional energy imports (France, Japan).


I disagree that it's not the economic choice. I think everybody would agree that for the 50 years before, let's say, the year 2000, it was overwhelmingly the only economic choice for green energy (outside of hydro, which is limited by geography).

After 1990, and especially after 2000, lots of governments around the world started to massively subsidize solar and wind, while often at the same time having policies punishing nuclear in various ways.

Uneconomical solar and wind became economical because of massive government orders and investment. The US often simply set targets for solar and wind that utility providers had to reach. Even nuclear nations like France did so.

So why did wind and solar turn economical? Massive investment around the world in making them so. Had Germany, France, and the rest of the EU simply gone all in on even a Gen3+ reactor design, and ordered 200 of them since 1990, it would also be very economical. The history of nuclear shows that if you build the same plant in large numbers, plants can be built and finished far faster and cheaper.

And that is even before we consider the huge reduction in capital cost if you go from a PWR design to a GenIV design. Just in terms of the scale of the project, there is a huge difference. Sadly, by the time that technology was getting ready for serious commercialization, nuclear was basically seen as legacy, and almost all governments stopped most research and stopped investing in it.

Imagine if nuclear in the 80s had the support wind/solar did in the last decade. If every utility in the US had simply been told 'you need X% nuclear by Y date'. And if in Europe, at the same time as France was building its reactors, Germany, the Nordics, Switzerland, Austria, Italy, and Britain had also built reactors.

During the Kyoto protocol talks, France already had a mostly green grid because of nuclear. But somehow essentially nobody copied this success story, because it simply wasn't politically viable in most places. It took a decade-plus after Kyoto before wind/solar were commercially viable, and really only if you don't consider intermittency a problem and the market doesn't penalize you for it (it usually didn't, because before wind/solar that just wasn't an issue). Yet despite solar and wind not being economical, massive investment in them happened, and eventually they were made economical thanks to economies of scale.

I would claim that if all the investment made in wind/solar since 1992 had instead been made in nuclear, we would produce more green energy now, the cost curve would be driven even lower, and baseload power would be solved, as intermittency is simply not a thing. We would not need to redesign grids, because nuclear plants map nicely onto the current grid if you just replace coal with nuclear.

So, it's all about economies of scale; that is what makes energy production cheap. Putting up huge wind turbines is cheap because there are lots of trained people to do it and the factories can produce large volumes. The largest wind turbines now are by themselves larger than a whole GenIV plant would be, and produce like 95% less energy, and not even consistently.

> I always considered fission tech to be used for the following reasons, and none of them are economic.

You missed that it is green with no CO2. That was not a reason anybody cared about before 2000, but since then it has been part of the rationale in some countries.

I would like to hear why you think fission can't be economical in principle. Maybe you can make the argument that Gen3 reactors can't be economical, but arguing from first principles that fission itself couldn't be economical, given economies of scale, seems a stretch.


I think fission's failure is an inability to plan for complete failure scenarios, where society folds in on itself, suppliers are no longer available, or power plants are actually fought over. So the inability is not to get the tech going, but to plan for how it can be useful in an unravelling world.

Economic downturns already correlate with fission problems, as plants are not properly maintained. We have one blowing up every thirty years at the moment. Our reach exceeds our grasp, and there is no shame in admitting that.


> I think fission's failure is an inability to plan for complete failure scenarios, where society folds in on itself, suppliers are no longer available, or power plants are actually fought over.

Are you referring to the need for electricity now at the Ukraine plants? Newer technologies such as NuScale require no external electricity for their cooling. The reaction only occurs if there is water and when all the water evaporates then the reaction stops.

> Already economic downturns corelate with fission problems, as plants are not properly maintained. We have one blowing up every thirty years atm. Our reach exceeds our grasp, and there is no shame in admitting to that.

Gas turbines in aviation also blew up way more often in the past than they do now. Who says the blowing up of plants is a constant? There have been many improvements in safety. Also, apart from the Three Mile Island accident, there haven't been major nuclear problems in the US in the last 50 or so years. Furthermore, the design flaw that led to the Chernobyl disaster is not possible, by law, in modern reactors. And newer reactors require an extra casing of concrete which would also have contained the Chernobyl disaster. You could even fly an airplane into those newer containment buildings and nothing would happen (to the building, at least).


There has been a major backlog in nuclear equipment investment, so big in fact that some countries who should by rights be net exporters of energy (France) became importers. It's a valid technology in a theoretical world that does all the upkeep measures and does actual credible threat analysis.

I have yet to see such a world.


It's simply about economies of scale.

If you build nuclear at scale, then these problems solve themselves. Just as France solved them in 15 years in the 70s/80s.

Just as these problems were solved for solar/wind by economies of scale.

Solar/wind was not economical; it was basically forced into being economical by creating economies of scale.


France currently has a problem maintaining them at scale?


In 2014 the government they elected basically went all in on switching to renewables, and they actually stopped or delayed a lot of stuff.

That said, they stopped building new reactors; most of their reactors were built in 15 years in the 70s/80s. Since then they have not done as much as they should have, and all those reactors are starting to need more maintenance now.

Because they have not built many new things, they don't have as many people with knowledge as they used to.

But they are managing most of these issues pretty well overall.

I would say France did pretty well having 40 years of green energy.


As comical as it sounds, The Simpsons and Teenage Mutant Ninja Turtles devastated the image of nuclear energy.


They both first aired in 1987, the year after Chernobyl. What would you expect?


The mutagen in TMNT is chemical in nature, is it not?


According to the lore it looks like you're right, but the fact that it glows makes people think of Cherenkov radiation, i.e. radioactive. Perception and reality, ya know?


> Personally I think fission power's failure is a political and marketing one

Probably the fact that it's literally the same thing that killed 140 thousand people in an instant and imposed the spectre of nuclear winter upon us all had its importance.


True, but fusion bombs are also a thing.


Nuclear plants are also very expensive, no?


Yep, and fusion reactors will probably be even more expensive (especially the first ones). Looking at the current prices of renewables, I don't see a market for fusion reactors at all to be honest.

After all, we already have a giant fusion reactor just 12 light-minutes away from us! We just have to harvest that energy. The direction we're already going (mostly market-driven nowadays, actually!) is generation from renewable sources, flexible grids, and storage systems to balance everything out.


This minimizes the main problem with really going full renewables, storage. Fusion is 24/7 output at the same level. Solar and wind are not, which means batteries, and all the problems associated with that.

Fusion could obviate the need for grid-wide storage systems which would be a huge advantage.


Unfortunately 100% renewable advocates just throw LCOE numbers around even though LCOE doesn't account for storage at all.

Levelized Full System Costs of Electricity (LFSCOE) does include storage and suddenly nuclear fission gets a lot more competitive:

https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4028640
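For what it's worth, the LCOE metric being criticized here is just discounted lifetime cost divided by discounted lifetime output. A toy sketch of the calculation, with entirely made-up illustrative numbers (none of these figures come from the IEA or the linked paper):

```python
# Toy LCOE (levelized cost of electricity): discounted lifetime costs
# divided by discounted lifetime energy output. All inputs below are
# made-up illustrative numbers.

def lcoe(capex, annual_opex, annual_mwh, lifetime_years, discount_rate):
    """Return $/MWh. Note: storage costs appear nowhere in this formula,
    which is exactly the LFSCOE critique."""
    disc = sum(1 / (1 + discount_rate) ** y
               for y in range(1, lifetime_years + 1))
    costs = capex + annual_opex * disc
    energy = annual_mwh * disc
    return costs / energy

# Hypothetical "nuclear-ish" plant: high capex, long life.
print(round(lcoe(6e9, 1.2e8, 8.8e6, 60, 0.07), 1))
# Hypothetical "solar-ish" farm: low capex, shorter life.
print(round(lcoe(1e9, 2e7, 2.2e6, 25, 0.07), 1))
```

The point is structural: however you pick the inputs, nothing in this formula charges an intermittent source for the storage needed to firm its output, which is what LFSCOE adds.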


Precisely this. Do they have similar numbers available for Korea?


Couldn't find any yet, but I'm curious as well. APR-1400 business is booming.


Making the box of salty radiation medicine an order of magnitude cheaper and shoving a bit of hydrogen in a salt cavern solves the storage problem (which is already far smaller than you're pretending) entirely.

Fission reactors that don't have incredibly strict and expensive regulation are already pretty unreliable, and they're operating within the bounds of known materials rather than an order of magnitude outside of them.

Even the mythical 100% uptime nuclear reactor still needs just as much storage for arbitrage, because it is so much more expensive.


Not if fusion would cost orders of magnitude more than storage.

"All the problems associated with" what? Modern batteries don't burst into flame. Anyway the overwhelming bulk of storage is not and will not be chemical batteries.


> "All the problems associated with" what?

Economic challenges of quickly building grid-scale battery storage, battery production for the entire globe, NIMBYs, etc.

> Modern batteries don't burst into flame

they literally do

> the overwhelming bulk of storage is not batteries

Well, "overwhelming bulk" is a high bar, and storage is geography-dependent. Germany, for example, can't build as much pumped storage as Australia, and Australia built a large amount of battery storage vs. PSH.


Germany can, has, and very well will be building hydrogen and compressed-air storage.


Well they certainly say they will, not sure about the rest.


Germany has an overabundance of hills. Most places do. But pumped storage is just one of many options

Modern batteries do not burn. Teslas do.


It's a problem with lithium batteries in general, not just Tesla.

https://arstechnica.com/gadgets/2022/12/recycling-firm-fined...


Lithium-ion batteries burn. But the topic was "modern batteries", which at the moment means LiFePO4 (LFP) batteries, not previous-generation chemistries.

Lithium is anyway not favored for use in utility-scale storage, where its light weight offers no compelling value. Up-and-coming chemistries include iron-air (no explosions), calcium-antimony (no explosions), and bromine-zinc (no explosions). Hundreds of other chemistries are available.


> an overabundance of hills.

That's not sufficient for pumped storage at scale, but Germany is mostly focusing on hydrogen for now.


The distance to the sun is closer to 8 light minutes than to 12 light minutes. Or do you use heavier minutes, imperial minutes perchance?
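Easy to verify, assuming the mean Earth-Sun distance of one astronomical unit:

```python
# Sanity check: one astronomical unit expressed in light-minutes.
au_km = 149_597_870.7      # mean Earth-Sun distance (IAU definition)
c_km_s = 299_792.458       # speed of light in km/s
minutes = au_km / c_km_s / 60
print(round(minutes, 1))   # 8.3
```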


It's an unstable isotope with only 40 seconds in it.


There is a lot of background on nuclear costs. From https://constructionphysics.substack.com/p/why-are-nuclear-p... & other sources, I've heard safety standards around radiation were set higher than ever proven necessary, and also ratchet upward so as to wipe out any cost gains per unit output.


Safety standards were set higher and higher because accidents kept happening, resulting in some well-known extremely large scale disasters and numerous minor ones. As with every industry, the rules were written with blood.

When you are building a power plant which has the capability of making a significant portion of your country permanently incompatible with human life, you generally want to be really sure you aren't going to have an oopsie.


I accept this as one point; I do question whether the opposite side of the benefit calculation was made - i.e. did we have a public debate where we correctly weighed the tradeoffs of increased fossil fuel carbon pollution, the economic impacts of lack of energy and failure to advance, balanced against honest cost estimates of disaster profiles? It doesn't seem like it. The debates seem to have been fairly one-sided; this leads me to believe a better outcome would have been weighted more towards pro-nuclear.


Same with wind, "you must pay a fine if you kill a bald eagle", but if you slowly kill all of them with pollution, that's free.

"The fly ash emitted from burning coal for electricity by a power plant carries into the surrounding environment 100 times more radiation than a nuclear power plant producing the same amount of energy."


> Safety standards were set higher and higher because accidents kept happening, resulting in some well-known extremely large scale disasters and numerous minor ones.

I think Chernobyl was the only really big nuclear plant disaster right? And even then, what we've really learned in the long term is that human habitation is more dangerous to wildlife than nuclear radiation (the area around the plant is now a thriving wildlife preserve).


Fukushima, Kyshtym… those are the other bad ones.


FYI, there were zero radiation deaths from Fukushima, although there were ~2200 from the evacuation. Keep in mind this was during the aftermath of a tsunami.

https://en.wikipedia.org/wiki/Fukushima_Daiichi_nuclear_disa...


I think there was a lot of hard work that went into preventing anything worse from happening at Fukushima. At least that is what I remember from reading the IAEA report a couple of years ago. I remember it being a very good and interesting read: https://www-pub.iaea.org/MTCD/Publications/PDF/Pub1710-Repor...


I think one person from the cleanup crew eventually got a cancer linked to the accident, but maybe he survived?


Didn't they dig up the area around it and bury it under itself?


There was simply a false assumption about how dangerous radiation was. Some of this still persists to this day.

A nuclear reactor that emitted the same radiation as a coal plant would not be legal. How does this make any sense at all?

The safety standards were actually set at too high a level too early, especially given how safe nuclear was from the start.

Consider this: how safe were coal plants? Would nuclear instead of coal have saved 100,000s of lives since then? Yes, of course it would have, even with an accident once in a while.

The problem is that there was zero tolerance for nuclear accidents, because of populist nonsense, but if coal plants and their supply chain killed 5 people here, 10 people there, and made thousands sick, nobody cared.

So the reality is that nuclear became uneconomical because nuclear and existing power production (mostly coal, later gas) were not treated the same in terms of their safety requirements.

> When you are building a power plant which has the capability of making a significant portion of your country permanently incompatible with human life

That's not actually what happens. Three Mile Island and Fukushima didn't even remotely come close to what you describe. Even for Chernobyl this is a questionable statement. And Chernobyl was a type of reactor not built in the West, so in the West something that bad simply can't happen with PWRs.


The learning cycle is too long. It takes ~10 years to build a reactor, so if you spot a way to improve on costs or safety it will take another 10 years for the innovation to see the light.

By contrast, in renewables the learning cycle is measured in months, so costs fall exponentially.

That's the real reason of high costs in fission, not red tape or public sentiment.


No western governments have enough money


Fuel is hardly the only advantage; the major issue with fission is the enormous cost of trying to avoid problems or clean up after them. Thus 24/7 security, redundancy on top of redundancy, walls thick enough to stop aircraft, etc. Fission is still by far the most expensive power source even with massive subsidies, and is only even close to economically viable as base load power backed up with peaking power plants.

In theory much of that is excessive but there is a long history of very expensive mistakes with massive cleanup efforts. The US talks about three mile island as the largest nuclear accident ignoring the Stationary Low-Power Reactor Number One that killed 3 people. All that complexity and expense comes from trying to avoid real mistakes that actually happened.


This is simply not true. According to the IEA:

https://www.iea.org/reports/projected-costs-of-generating-el...

the LCOE of nuclear is cheaper than almost all other options we have. Sure, nuclear is very expensive up front, but a nuclear power plant can run for 100 years, while wind and solar have to be completely replaced every 25 years.

You're correct that nuclear has had some very expensive accidents, but the chance of a modern Gen3+ plant that we'd build today causing any accident like that in a Western country is so very close to 0 that it's not even worth discussing.


You see a lot of handwaving with nuclear, such as that "very close to 0" statement, but someone's got to be on the hook.

The rate and cost of failures directly relate to insurance costs. A 1 in 100,000 chance per year to cause a 500 billion dollar accident represents a ~5 million per year insurance cost to offset that risk before considering the risk premium associated with insurance. And that’s on top of the normal risks for large complexes that have little to do with nuclear just high voltage equipment etc. Unsubsidized insurance costs are something like 0.2c/kWh which is quite significant for these projects.
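The expected-loss arithmetic above checks out; a quick sketch (the 1 GW plant and 90% capacity factor are my own illustrative assumptions, not from the comment):

```python
# Back-of-envelope check of the expected-loss figure above.
p_accident = 1 / 100_000        # assumed probability per reactor-year
damage = 500e9                  # assumed accident cost in dollars
expected_loss = p_accident * damage
print(expected_loss)            # ~5 million dollars per year

# The 0.2 c/kWh insurance figure, applied to a hypothetical 1 GW plant
# at 90% capacity factor (illustrative assumptions):
annual_kwh = 1_000_000 * 0.9 * 8760      # kW * capacity factor * hours/year
insurance_per_kwh = 0.002                # $/kWh, i.e. 0.2 cents
print(annual_kwh * insurance_per_kwh)    # ~15.8 million dollars per year
```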

In the end you see a lot of people talking nonsense around nuclear costs using wildly optimistic numbers, but there hasn't been a power plant built and operated in the last 20 years that comes even close to these numbers. Let alone when you start to compare predictions for decommissioning costs with actual decommissioning costs.


> You see a lot of handwaving such as that very close to 0 statement with nuclear but someone’s got to be on the hook.

Yes, and that someone is society. It's what we do with any risk so great that if a company had to carry it, the company would fold and society would still have to carry it.

Hydro power is one prime example. If a dam were to break, the damage downstream would be too high for any power company to pay. Individuals living downstream might have insurance, but no insurance company can handle the cost of a major flood. The only entity able to do so would be the government.

Another example is forest fires caused by poor maintenance of power lines. Such things happen from time to time, and it is not the power company or their insurance that will cover it if half a country is up in literal flames and a few towns are lost. There might be a bit of bad press, a few millions or billions in damages, but the true cost won't land anywhere near the power company.

Fully eliminating the risk of floods and fire from the power grid would be very difficult, and putting the power company on the hook for the full cost would be impractical and counterproductive. Society needs electricity. The best it can do is impose regulations, and in exchange society will pick up some of the risk.


Unfortunately your examples have been litigated in practice already, and reality does not agree with you. See for example [0], which is a nice writeup on the liability for dam failure. As it turns out, there are very few cases in which the operator would not be liable. Similarly, Pacific Power has been sued for wildfires in Colorado, and PG&E even plead guilty to manslaughter in the Paradise fire - and had to file for bankruptcy after being faced with a $30bn liability.

Those companies can and should be held responsible for the damages they cause. You can't just privatize all the profits and leave all the losses to the government! If you want to do something so dangerous nobody is willing or even able to insure you, you should not be allowed to do it.

[0]: https://damsafety.org/sites/default/files/files/Legal%20Liab...


The Oroville Dam in California had a failure in 2017 leading to the evacuation of 188,000 people. Who paid for that? See for example [0], where a very low estimate ends up around 1 billion, with the Federal Emergency Management Agency expecting to pay around 75% of that. Who and what funds that department?

When a company files for bankruptcy, the result is a legal process where the company seeks relief from debt. PG&E caused California's second-biggest wildfire, the Dixie Fire, which destroyed 1,329 structures and burned 963,309 acres; a few years earlier they also caused the Camp Fire, with an estimated cost of $16bn. And yes, they did get sued for that. They are estimated to have caused over 40 wildfires.

In the bankruptcy filing that was accepted by the judge, they might be paying $13.5 billion for all of the wildfires, with half of that being paid as "stock" in the company (for what that is worth). All the remaining costs of the wildfires will be carried by the victims. As of September 30 this year, the total amount PG&E has actually paid is $5.08 billion.

If one of California's nuclear power plants were to explode tomorrow with the effect of 40 wildfires, the result would be identical to PG&E's. They would be sued, they would file for bankruptcy, and then a portion of the true costs would be paid out. That is reality, regardless of what you thought it was.

[0]: https://www.sacbee.com/news/local/article165448747.html


The Oroville Dam was built by California Department of Water Resources primarily for water supply and flood control with electricity generation effectively a useful byproduct.

As a California government agency it’s self insured by the state government, which is a very different situation than a private company building a power plant exclusively to generate power.

As to bankruptcy, insurance is normally required. Wildfires are an odd case because, unlike nuclear, the people who suffer damage are partially responsible for failing to mitigate risks, as eventually fires will happen.


That is just exceptionalism. People view floods and fire as natural events even when they are directly caused by humans. Risk is risk. Insurance and regulations on energy production should be technology neutral. If technology X puts $100 of risk on society per TWh, and a regulation targeting it reduces that to $1 per TWh, then what technology X is doesn't matter. It is a risk that is carried by society, and society has a responsibility to protect itself by balancing the benefit of risk-reducing regulations with potential drawbacks.

Who the primary owners of a power company are matters very little. In many countries, especially in the EU, the government tends to be the majority owner of power companies operating nuclear power plants. It doesn't change the risk factors.

Also, I would never blame victims of flooding or wildfires. People who choose to live downstream of a hydro power dam, or in areas with high risk of wildfires, have just as much power as people who choose to live next to a nuclear power station. If operators of dangerous and critical infrastructure do a bad job, then the blame tree starts with the owners and trickles down to each leaf.


The point you seem to be missing is that Oroville Dam would still have been created even if it didn't have hydroelectric generation. The risk from adding hydroelectric generation to a dam you were going to create anyway is effectively zero.

People have been making dams for quite literally thousands of years before we discovered AC electricity. They are useful structures to ensure water security and reduce damage and deaths from regular flooding. So yes, the Marib Dam for example produces electricity and its failure would pose a risk, but it's in the same location where a dam failed all the way back in 575, and there is evidence of earlier dams in that location going back to 1750 BC.


That is not what is being said. What is being said is that there are two factors to nuclear accidents: 1. The actual costs of containment, cleanup, repair. 2. The arbitrarily imposed costs to satisfy a terrified public.

For power generation, humans just need electricity. This requires large networks of high voltage lines crisscrossing the country. Those lines will start wildfires at some rate X. A utility cannot survive being liable for all damages by that wildfire.

So what you do is have everyone buy insurance while the government sets "best practice" regulations designed to reduce X to a number considered reasonable. Investigations that result in litigation are usually what happens when the company has clearly violated best practice.

The problem with all things nuclear is that our vision of acceptable number and severity of nuclear incidents is that it needs to be negative.


That's still simply not true. The last three plants built in Europe (England, France, and Finland) have been very expensive because they've all been first-of-a-kind builds and not much else has been built. But if you take a look at what's happening elsewhere in the world, KHNP for example has their standardized APR-1400 reactor (https://en.m.wikipedia.org/wiki/APR-1400), which seems to be very affordable.

Poland just decided to build out nuclear to the tune of 40bn EUR, and their first contract is with Westinghouse and their AP1000 reactor, but they also signed a letter of intent with KHNP to build out further. I'm sure they chose Westinghouse for strategic reasons though, and not because of price.

Heck, even Finland with their massively delayed and over-budget Olkiluoto 3 also plans to build out even more nuclear. It's almost like some countries are now realizing that putting your faith in the weather gods for supply safety is not a good idea, and that solar and wind are simply not viable for baseload or the grid in general.

I still think wind and solar have a place for creating synthetic fuels, but let's stop pretending they've been comparable to nuclear for the grid.

edit:

Also, are you saying the IEA has wrong data? If so, would you mind bringing some sources into your argument about people being way too optimistic?


It’s not just the United Kingdom that has had issues with APR-1400.

The United Arab Emirates has had massive issues. Unit 1 began construction in 2012 and was "completed" in 2018, but didn't enter commercial operation until 2021 due to literally hundreds of issues. "In December 2018, it was reported that voids were found in the concrete containment buildings for units 2 & 3. Grease was found to have leaked through the unit 3 containment, which may have been due to a crack in the concrete." https://en.m.wikipedia.org/wiki/Barakah_nuclear_power_plant

South Korea also ran into multiple delays, “Shin Kori-3 was initially scheduled to commence operation by the end of 2013, but the schedules for both Units 3 & 4 were delayed by approximately one year to replace safety-related control cabling, which had failed some tests.”

Poland isn’t a failure at this point, but they don’t have a power plant yet and their cost projections before delays aren’t very rosy.

Objectivity it’s reasonable to blame bad management for issues within a single project or even country, but when several different projects in different countries run into issues that suggest more fundamental problems.


The reality is that these are learning costs.

If Britain decided to build 10 APR-1400s in the next 10 years, they would improve with each one.

France built something like 50 reactors in 15 years with 60s technology. Yes, they had issues early on, but after a while they were completing reactors within 4-5 years with very few issues.

The reality is that from 2000 to 2020 every country in Europe could have reached 100% green energy if they had just started building multiple reactors every year.

Germany could easily have a green grid by now. A nation like Germany could very much have done that, just as France did in the 1980s.


Note that the economics don't work even with Korean nuclear plants: Korean nuclear is cheaper than Korean gas, but more expensive than European gas, because European gas is pipelined, Korean gas is liquefied, and liquefied gas is so much more expensive than pipelined gas.

European nuclear initiatives are mostly about strategic concerns to get out of Russian gas. Economically, even the cheapest nuclear power on Earth can't compete with gas if it is pipelined. (It can compete if it is liquefied.) Or you need to penalize gas to an unreasonable degree for carbon emissions.


European gas is cheap as long as we're willing to hand over control of Europe's energy supply to Putin. Which most countries in Europe no longer are, and maybe never will be again.

Meanwhile the largest known deposits of Uranium can be found in Australia and Canada, making them much safer sources for western countries.

If EU countries allow fracking domestically, this will change, of course. Though the same "green" movement that opposes nuclear is likely to try to block this. Maybe we should look at how much funding these people get from Russia?


In Germany none, because the Greens are against Putin. So instead Putin finances the people that protest against wind power and solar.


Maybe you're right, but there are sources claiming otherwise:

https://www.theguardian.com/environment/2014/jun/19/russia-s...

https://www.newsweek.com/putin-funding-green-groups-discredi...

https://www.thetimes.co.uk/article/german-green-group-brande...

https://www.nationalreview.com/2022/01/putins-green-fifth-co...

Just to pick a few random google results.

As far as I can tell, it's in Russia's interest to encourage any energy source that synergizes with NG (ie wind and solar) and to work against energy sources that are full alternatives (nuclear, coal and large scale storage), while at the same time ignore the downsides of NG.

It would make sense to fund groups aligned with these interests, even those that are generally negative to Russia politically. Such funding would not need to be done directly, but could be done through subsidiaries.


Someone suggesting the organization which made these predictions about solar https://pbs.twimg.com/media/FOoa6xYXIAQKUnv?format=jpg&name=... and is headed by an ex OPEC employee might be making bad cost projections when real prices of real projects in 2022 have a median far lower than their projection from 2020?

Must be a conspiracy theorist.


Solar is fine in some places for some use cases, and yes, it's very cheap to add, but we currently have no viable solution for storing the energy efficiently.


Batteries, pumped hydro, compressed air energy storage, hydrogen, the (large and growing) EV fleet, thermal energy storage


Batteries have had a lot of problems meeting capacity as a storage solution. Pumped hydro is pretty good but highly location dependent; gravity and compressed air, I believe, show a lot of promise. I don't know enough about hydrogen or thermal storage to comment. But we are nowhere near actually solving the energy storage needs required to use solar and wind exclusively. Unless we demonstrate real breakthroughs in production-ready storage, we'll always need a backup. Nuclear, whether fission or fusion, would have been a better route to clean energy, but we basically stopped innovating there decades ago and now we are too far behind.


Gravity storage is an absolute joke. About the cheapest substance you can use is iron ore, because it reduces the size and cost of the frame, and even if you got everything else for free, it would still cost you over $70 per kWh of capacity just for the box of ore itself in a 500m high tower.
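A quick check of that claim: the mass needed to store 1 kWh by lifting material 500 m follows from E = mgh, and the ore cost alone gets you into that range (the ~$100/tonne iron ore price here is an assumed round figure, not from the comment):

```python
# Mass required to store 1 kWh of potential energy at 500 m: m = E / (g * h)
g = 9.81             # m/s^2
height_m = 500.0
kwh_joules = 3.6e6   # 1 kWh in joules

mass_kg = kwh_joules / (g * height_m)
print(f"mass per kWh: {mass_kg:.0f} kg")    # ~734 kg

# Ore cost for that mass, at an assumed ~$100/tonne
ore_usd_per_tonne = 100.0
cost = (mass_kg / 1000) * ore_usd_per_tonne
print(f"ore cost per kWh of capacity: ${cost:.0f}")   # ~$73
```

So roughly three quarters of a tonne of material per kWh, before the tower, frame, hoists, or power electronics, which is where the "$70/kWh for a box of it" figure comes from.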

Renewables with straight gas backup and no other storage are already lower carbon than any other option, and batteries and off river PHES have only just started getting cheap.

The breakthroughs we need to cover the final gap have already been made if you're paying any attention at all.

Stop concern trolling


Pumped hydro is not, in fact, "highly location dependent". It needs a hill, but there are many millions of hills.

Storage does not need any "breakthroughs". It will be built out when there is renewable generating capacity to charge it from. In the meantime, NG plants fill shortfalls.


The best storage solution is to offset normal hydro generation, building up capacity to be released when you have unmet demand. That massively changes the need for storage, because dams are already storing months' worth of energy, so shifting demand within the day is effectively free, barring possibly adding some turbines.

Globally 16% of electricity is produced by traditional hydro annually that can cover the majority of the projected need for storage in a pure wind/solar grid.

Also, by the time we need significant batteries the costs will have fallen even further. If you want to eventually cover 10% of the grids daily demand from batteries using projected costs from 2030 to 2040+ it doesn’t look unreasonable.


And there's also what is slowly starting to emerge: adapting load back down during peak hours (demand response).


The current technology mix is capable of meeting more of electrical demand than nuclear has ever achieved at lower cost with zero storage https://www.nature.com/articles/s41467-021-26355-z

It is also perfectly capable of meeting dispatchable loads like heating, chemical production, and EV charging, and adding them to the grid will bring the ability to meet electricity even higher. Considering the storage and dispatchable low carbon energy that already exists, the remaining part would produce less carbon than would be released by expanding Uranium mining.

There is not enough uranium to meet 50% of world electricity demand using current technology for long enough to wear out a single generation of wind turbines or solar panels.

Your imaginary all-nuclear future is both impossible and worse than the trajectory we are currently on.


> but there hasn’t been a power plant built and operated in the last 20 years that come even close to these numbers

If we are being honest, that also has a lot to do with why nuclear is so expensive.


Sure, I have no issue saying nuclear could in theory cost 40% less with reasonable regulation and a large scale deployment across decades. I just have problems with people saying well it could in theory cost X, therefore it does cost X.


> sure nuclear is very expensive up front, but a nuclear powerplant can run for 100 years while wind and solar had to be completely replaced every 25 years.

Hinkley Point C is currently expected to cost around $31 billion once finished for a measly 3,000 MW.

For that money you could build ~2,300 15MW onshore wind turbines - which would add up to roughly 34,500 MW capacity. So even under the assumptions that

- you have to replace the wind turbines 3x to reach 100 years life span and

- you always have to build more renewables since they don't run at 100% their capacity throughout their lifespan

wind makes more sense economically nowadays.
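The budget split behind that comparison, spelled out with the comment's own figures (the per-turbine price is simply implied by dividing the Hinkley budget, not a quoted market price):

```python
# Hinkley Point C figures from the comment above
hinkley_cost = 31e9        # USD
hinkley_mw = 3000          # net capacity

# Spend the same budget on 15 MW wind turbines instead
n_turbines = 2300
turbine_mw = 15
wind_nameplate_mw = n_turbines * turbine_mw
print(f"wind nameplate: {wind_nameplate_mw:,} MW")      # 34,500 MW

cost_per_turbine = hinkley_cost / n_turbines
cost_per_kw = cost_per_turbine / (turbine_mw * 1000)
print(f"implied budget: ${cost_per_turbine/1e6:.1f}M per turbine, "
      f"${cost_per_kw:.0f}/kW installed")
```

Note this is nameplate capacity only; the capacity-factor and replacement adjustments the comment lists still have to be applied, as the replies below do.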


Hinkley Point C is a first-of-its-kind project. If you want to be economical you should look to KHNP's APR-1400 (https://en.m.wikipedia.org/wiki/APR-1400); they've built several in Korea and the Barakah plant in the UAE, where the cost was $24.4 billion for 5,380 MW.

It's cute that you mention onshore wind, but that will just never happen; it takes up way too much space, and most places have a capacity factor below 20%, turning your 34,500 MW into 6,900 MW as well as giving you erratic output. So for wind to work you either need fossil fuels, power-to-X, or some new magical battery that will make the cost of such a solution insane, because you'd have to completely overhaul your infrastructure.

offshore wind is more realistic, but costs way more than nuclear.

Wind makes sense if you want to build something fast, but it won't bring down your carbon footprint. Or at least it hasn't in Germany or Denmark. The only reduction we've seen is because we burn trash and biomass, which for some messed-up reason is considered green and renewable.


Have a look at the availability factors of those 'cheap' Korean nukes, there's a lot of overlap with the capacity factor of offshore wind even including curtailment. Then the running cost differences are enough to pay for the wind farm in about 15 years.

Then also look at the $20 billion 'service' contract for the Barakah one, which doesn't include any labour or running costs. It suddenly costs about the same as Hinkley C even before overruns.

Once you compare all-in costs, rather than nuclear overnight costs against renewable all-in costs, they're at the same $8-10 per net watt nuclear always is anywhere except China, and China's renewables are cheaper by close to the same ratio.

The penetration rates at a given cost favour renewables right up until your peaker gas plants are causing less emissions than the Uranium mine.


Please stop calling them nukes; it makes you sound like a Greenpeace crazy person who actually thinks that a nuclear power plant has anything to do with the nuclear weapons that "nuke" normally refers to.

Could you please provide some evidence that the capacity factor and supply safety are remotely comparable between the APR-1400 and offshore wind?

What do you think service costs are for offshore/onshore wind and Hinkley Point? Having maintenance and an industry is actually a good thing for the economy.

Where do you get your numbers? You sure make many claims without a shred of evidence. And are you seriously suggesting that we continue using natural gas?


Nuke is a word that applies to a reactor, a bomb, and someone who works on a reactor equally. Stop with the pearl clutching.

https://pris.iaea.org/PRIS/WorldStatistics/ThreeYrsEnergyAva...

Cheap reactors are unreliable reactors.

> What do you think service costs are for offshore/onshore wind and hinkley point? having maintenance and an industry is actually a good thing for the economy.

Stop with the broken window fallacy. If subsidizing jobs is important, open a battery or PV plant with the tax money instead.

> and are you seriously suggesting that we continue using natural gas?

Using gas 2-20% of the time with a mean of around 8% produces fewer emissions than opening new uranium mines and only needs to happen whilst the storage industry matures. Your plan entails burning more gas whilst the reactors, mines, and enrichment are built out over decades, then it also entails burning more gas at the end for outages unless you overprovision and build seasonal storage and long distance transmission.


Colloquially speaking, which your conversation here is, nuke has always meant bombs not reactors. He's not pearl clutching, he's reacting to what sounds like unnecessarily negative terminology.


It unambiguously means nuclear reactor in context and is widely used. The only people who even pretend it doesn't are the ones simultaneously making disingenuous arguments about why renewables are terrible and we immediately need to drop them and wait 50 years for nuclear to save the day.

The mock outrage is tiresome and transparent.


Wikipedia disagrees: https://en.wikipedia.org/wiki/Nuke

As do I, I personally have never heard someone refer to a nuclear fission power plant as a nuke, but I guess I don't hang around with the same people as you...



As i have posted here many times, i live very close to a nuclear power plant.

Everyone in the area simply calls it "the nuke plant".

It is a directional landmark : "yeah, so once you get to the nuke plant turn left...."

"Once you see the nuke plant you know you are getting close"..

Its full name is a mouthful: Pickering Nuclear Generating Station.

Citing wikipedia sometimes backfires.


We call them nukes because they are nukes.

No one is even slightly confused by the usage.


What if Britain had simply built 20 Hinkley Point Cs, starting a new one every year at first and then, after a couple of years, multiple per year?

The same people moving from project to project, onboarding new people. Just as France did in the 1980s.

This would result in a very cheap, completely green grid for the next 100 years.

Wind turbines have to be replaced 3x in that time and you don't have to deal with intermittency at all.

Just as with everything else, without economies of scale it doesn't work.


Remind me what the lifetime EAF of those 80s French nuclear plants is?


If you want to have an argument, then maybe just present your argument.


You said you don't have to deal with intermittency. That's a lie.


Look at real world data. Nuclear power has scheduled down time, something that is totally doable if you have a fleet of reactors.

You certainly don't have anything close to the intermittency of wind and solar. And this is clearly evident in the production graphs.


Real world data says that unless you spend insane amounts on it, and then pretend the reactors that shut down decades early due to issues or destroyed themselves don't exist, or are China, something goes wrong and forces a shutdown or low output about 20% of the time.

In most regions you can get a lower forced-downtime rate at a lower cost with renewables, and then you also get the curtailed energy to feed dispatchable loads. You need the electrolysers anyway for chemical feedstocks, and you need storage to meet variable loads, so it's just a matter of which can be deployed faster.

Additionally you get a very long forced downtime when you burn through your Uranium reserves in under a decade by trying to provide current final energy.


In the US, the average capacity factor for wind turbines is about 33%, so your nameplate 34,500 MW capacity is immediately ~11.5 GW actual. Wind makes sense, but deceptive numbers don’t help your argument.


That is exactly what I am talking about in my second point! You're at 11.5 GW considering the capacity factor; divide that by 3 (first point) and you're still above whatever you'll produce with Hinkley Point C!
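The two adjustments in this exchange, spelled out (using the 33% US capacity factor quoted above and three turbine generations per century; this deliberately ignores nuclear's own sub-100% capacity factor, which would only help wind here):

```python
# Derate nameplate wind capacity by the ~33% US capacity factor
nameplate_mw = 34_500
capacity_factor = 0.33
effective_mw = nameplate_mw * capacity_factor
print(f"effective output: {effective_mw:,.0f} MW")   # ~11,385 MW

# Divide by three 25-year turbine generations to match the 100-year budget
replacements = 3
budget_matched_mw = effective_mw / replacements
print(f"per-generation: {budget_matched_mw:,.0f} MW "
      f"vs Hinkley's 3,000 MW nameplate")
```

Even after both haircuts, the same budget yields more average output than Hinkley's 3,000 MW nameplate, which is the point being made.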


The cost of the land required is non-trivial.


Onshore wind can overlap with pasture land.

Offshore wind pays into the public purse now via the leases and still costs about half what subsidised nuclear does. It's still a very young industry.


Seriously, bring some evidence. Onshore wind is just not going to happen; it has too many problems. And offshore wind is more expensive, less reliable, and takes roughly the same time to build as nuclear. Denmark is currently planning to build a 3 GW energy island that will cost a whopping 40bn dollars and is planned to be finished in 2033. Insane if you ask me.


Onshore wind has been happening for decades. See for example the 1.5GW farm in California, the 1GW farm in New Mexico, the 1GW one in Oklahoma, the 900MW one in Texas, or the 845MW one in Oregon.

Offshore has a rather fast construction time, it turns out. For example, the United Kingdom's Hornsea Wind Farm Project 2 was given planning permission in 2016, and it reached its full capacity of 1.4GW less than six years later. Project 1 at the same wind farm reached 1.2GW in less than five years.

And when it comes to cost, Hornsea Project 3 is to start construction next year - with commercial operation scheduled in 2025 - at $12bn for 2.4GW. Not bad when you compare it to Finland's Olkiluoto Nuclear Power Plant unit 3 costing an estimated $11bn for 1.6GW - which took 22 years from first license application to design output power.
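The cost-per-nameplate-watt comparison implied by those two figures (note this ignores the capacity-factor difference between offshore wind and nuclear, which narrows the gap in nuclear's favour):

```python
# Hornsea Project 3: $12bn for 2.4 GW of offshore wind
hornsea3 = 12e9 / 2.4e9
# Olkiluoto 3: ~$11bn for 1.6 GW of nuclear
olkiluoto3 = 11e9 / 1.6e9

print(f"Hornsea 3:   ${hornsea3:.2f} per nameplate watt")    # $5.00/W
print(f"Olkiluoto 3: ${olkiluoto3:.2f} per nameplate watt")  # ~$6.88/W
```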


6 GW initially, with the expensive part done for up to 10 GW, €28 billion, and 2030.

That is insane. They're building a FOAK project for less than NOAK nuclear reactors like Hinkley C in less time and it will be generating at higher capacity from day one than new nukes manage for their first decade or so of operation. Nice pro wind factoid.

More power, sooner, with low enough O&M that you could build another one with the money you saved just during the time it would take for another EPR to be built and come to full power? Sign me up.


Do you have a source for the 100-year lifetime?

Currently a lot of reactors are hitting the 30-40 year mark, and they are running into significant issues with the aging equipment. We are seeing an increasing number of minor incidents, often caused due to manufacturing defects finally rearing its head, or just plain fatigue.

Meanwhile, solar has a 25-year economic lifespan. At that point you can make more money by replacing them with more efficient panels. However, manufacturers have already started offering 40-year warranties for consumer panels, at which point they have a guaranteed 88% power output. Wind indeed has a lifespan of 25 years, which seems pretty average when compared to literally any other power plant with moving parts.
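For context on that warranty figure, the implied annual degradation rate behind "88% output after 40 years" (assuming compound decay, which is how panel degradation is usually modelled):

```python
# Solve end_fraction = (1 - r)^years for the annual degradation rate r
years = 40
end_fraction = 0.88
annual_rate = 1 - end_fraction ** (1 / years)
print(f"~{annual_rate * 100:.2f}% output loss per year")   # ~0.32%/year
```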

When it comes to accidents, they are indeed extremely unlikely. However, the figure to look at is the potential damages multiplied by the likelihood of the accident. When we look at those two together, they are definitely worth discussing.


> manufacturers have already started offering 40-year warranties

A warranty of that length is only valuable if the manufacturer is a stable business with multiple income streams (say GE) or the warranty is backed by stable insurance (say Lloyd's). Liabilities are supposed to be on the balance sheet, so they are not free to mint.

If there were a long-term issue where consumers needed to claim on the warranty, I would guess most manufacturers would just get liquidated, but the executives and owners will have already cashed out. The same business model gets used for lots of other businesses with long-term warranties; limited liability is very handy.


Well, I guess a 100-year lifetime was kind of pulled out of my ass. What I was trying to communicate is that you can in theory maintain a nuclear power plant to last for 100s of years, but I guess if you'd just let it run without doing anything it would probably run for 30-40 years.

https://www.iaea.org/newscenter/news/iaea-data-animation-nuc...

Solar is fine for those who can afford it, but without subsidies and the ability to sell electricity back to the grid it's a crazy long-term investment in many parts of the world, especially northern Europe where I'm from (for hopefully obvious reasons). So different mileage may apply elsewhere. I guess we'll have to see if those 40 years are for real, and if the companies offering it are even around in 20 years.

Wind needs constant maintenance to reach a 20-year lifespan, but beyond 25 years you'd have to replace the whole thing. So while a nuclear power plant also requires constant maintenance, you don't have to tear down the whole plant after 40 years. Even the German ones that are closing now could easily have their lifetime extended https://www.reuters.com/business/energy/could-germany-keep-i...

>When it comes to accidents, they are indeed extremely unlikely. However, the figure to look at is the potential damages multiplied by the likelyhood of the accident. When we look at those two together, they are definitely worth discussing.

I guess what I'm trying to say is that we as a civilization engage in activities that are way more risky and dangerous than the minuscule risk of a serious accident in a modern gen 3+ nuclear power plant. Of course we should have strict regulation here, but it's just not that dangerous or risky.


Mechanical equipment like pumps in active use doesn't last anywhere close to 50 years and needs to be overhauled or replaced several times over that 50-year lifespan. You can find videos of turbines being replaced, which is incredibly expensive. In the end you don't get a clear "this is the final date you can operate" limit, just increasing costs every year.

The ~fifty-year lifespan is in part based on physical corrosion of the pipes running through concrete; there really isn't a way to replace them all that costs less than simply building a new power plant. But even here, not everything fails on the same day, so there is some wiggle room.


> if you'd just let it run without doing anything it would probably run for 30-40 years.

Let it run? You mean, presumably, the huge amount of testing and preventative and planned maintenance that is scheduled in as part of a reactors expected lifetime, plus anything new discovered along the way. That doesn't come for free.

> In theory maintain a nuclear power plant to last for 100s of years

Sure, given enough effort you can fix anything. But extending a fission plant's lifetime can require massive overhauls, replacing reactor components, replacing materials that have experienced radiation embrittling and activation, etc. Keeping a plant running indefinitely is so complicated and expensive that we haven't managed it so far.

Extension is something we should absolutely consider but it's not a magic fix all. Sometimes it's not worth it to keep an old thing running.


Any claim about 100 years of trouble-free operation of a nuclear reactor is wishful thinking at best.

For example, in France nuclear power reactors were stopped because unexpected cracks appeared in pipes after just 25 years of operation, requiring expensive maintenance: https://oilprice.com/Latest-Energy-News/World-News/France-Cl... That put reactors offline for over a year.

Then Sweden closed one of its reactors because it became unprofitable due to rising maintenance costs: https://apnews.com/article/technology-business-sweden-europe...


> You're correct that nuclear has had some very expensive accidents, but the chance of a modern gen3+ plant that we'd build today causing any accidents like that in a western country is so very close to 0 that it's not even worth discussing.

But that's precisely why nuclear power plants are so expensive to construct. If the generation technology were inherently less risky, it stands to reason the facilities would be cheaper to build.


Fusion will also have to go to enormous efforts to avoid problems -- not because of public safety, but because it's very difficult to repair anything in the reactor if it breaks. This was a lesson of Three Mile Island: a nuclear accident that doesn't kill anyone is still ruinous for a utility, since their large investment is destroyed.


Came here for the F.U.D and you did not disappoint.

> Fission is still by far the most expensive power source even with massive subsidies and is only even close to economically viable as base load power backed up with peaking power plants.

https://www.statista.com/statistics/748580/electricity-cost-...

Seems Solar is the most expensive, and by a large margin?

It looks like nuclear is cheaper vs almost all "renewables"?

There is a nuclear power plant ~10 km from me that set world records:

- On October 7, 1994, Pickering Unit 7 set the world record for continuous runtime at 894 days, a record that stood for 22 years.

Can you provide the number of days that "WIND" or "Solar" have provided continuous power for?

That complexity and expense is because you are building machines which can run for 894 days NON-STOP. (CANDU plants can be refuelled while operating)

Diesel locomotives are expensive, a lot of this is attributed to the engine designed to run at high-output for an extended amount of time.


> the major issue with fission is the enormous costs of trying to avoid problems or cleanup after them. Thus 24/7 security, redundancy on top of redundancy, walls thick enough to stop aircraft etc.

A fusion reactor will also require walls thick enough to stop aircraft. Security will likely be the same too. And there is no fundamental reason why fusion should require any less on any of these fronts.

In fact the actual cost of nuclear is CAPEX, and it comes from the large, high-specification civil engineering project, the steam turbine, and the cooling towers.

There are lots of fission-based reactor designs that have none of these things. So nothing you describe really has much to do with 'fission' itself. Fission plants can also be made so that airborne radiation release is practically impossible.

We simply stopped fundamentally advancing fission reactors in the early 70s and instead of solving problems fundamentally, we added lots of regulation.


The most expensive before or after taking greenhouse emissions into effect?

Seems like power generation still counts on externalities being external.


It's still decades off but as I understand it, this was the hardest nut to crack. They got what, 2.5 megajoules out of 2.1 in?

I might be in the opposite camp as you but this is very much a "where were you when—" moment for me. I'm sure someone will pop in to disappoint me but I think the point is it's no longer a hypothetical exercise.


> They got what, 2.5 megajoules out of 2.1 in?

Of laser energy into a tiny control volume that doesn't consider how much energy went into the laser systems. If you draw the control volume around the building and see that the lasers require vastly more energy than what came out, I think you'll be less excited, right?

We've been getting lots of energy out of fusion since the early 1950s with thermonuclear bombs. We know we can get energy out of a control volume. But whether it is a practical energy source is still the question, imho.


Could you elaborate on that? What do you mean that the lasers could require more energy?

Is it that in a specific volume they got X EM energy coming in from the laser and Y thermal energy coming out, with Y>X BUT the electricity consumption of the lasers is Z>Y>X?

If so that's sort of misleading, like the plethora of claims from ITER. I hoped this was different.


Exactly. Looking at the Wikipedia article [1] suggests that they start out with 422 MJ stored in capacitors, turn this into 4 MJ of IR laser light, convert that into 1.8 MJ of UV laser light, then into x-rays of which 0.15 MJ heat the target, of which finally 0.015 MJ heat the fuel. Depending on what in this chain you consider the input energy, you can get numbers that differ by orders of magnitude: 15 kJ of energy produced could either be a gain of 1 or a gain of about 0.000036, or anything in between. And this is before trying to capture the released energy and converting it into electricity, which will come with another sizable loss.

[1] https://en.wikipedia.org/wiki/National_Ignition_Facility
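The point above can be made concrete in a few lines of Python. The stage energies are the Wikipedia figures quoted in this comment, and the 15 kJ yield is the same illustrative number; the takeaway is just how strongly the reported "gain" depends on where you draw the input boundary:

```python
# Energy chain from the Wikipedia figures above (all values in joules).
stages = {
    "capacitor bank":   422e6,    # stored electrical energy
    "IR laser light":   4e6,
    "UV laser light":   1.8e6,
    "x-rays on target": 0.15e6,
    "fuel heating":     0.015e6,
}

fusion_yield = 15e3  # the illustrative 15 kJ output

# "Gain" relative to each possible choice of input boundary:
for stage, energy_in in stages.items():
    print(f"{stage:18s} gain = {fusion_yield / energy_in:.2e}")
```

The same 15 kJ is a gain of 1.0 against the fuel-heating energy, but only ~3.6e-5 against the capacitor bank.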


From https://www.ft.com/content/4b6f0fab-66ef-4e33-adec-cfc345589...

> The fusion reaction at the US government facility produced about 2.5 megajoules of energy, which was about 120 per cent of the 2.1 megajoules of energy in the lasers, the people with knowledge of the results said, adding that the data was still being analysed.

They probably upgraded the rig since the Wikipedia article was written, so most likely the 2.1 MJ refers to the UV light numbers.


If this assumption is true, they only produced 0.6% of the energy they spent. Another question would then be how relevant this is, i.e. could the UV light be produced much more efficiently than in the experiment? Maybe some constraint forces them to use a very inefficient process? In that case it might be reasonable to use the UV laser power as the reference for the gain.


Sure, and if they upgrade the lasers themselves to current laser tech (as I understand it, the NIF's hardware is around 25 years out of date on that front), then that 0.6% number probably jumps to 20% or so. Which still isn't enough, but is way closer than 0.6%.

Add to that the fact that improvement in laser efficiency is a hot research area (lasers are used commercially in a lot of places, and cost-cutting is always a concern), and this is starting to feel a little more attainable.


I think the more interesting question is "how does this scale with more laser power?"

Even if the lasers are 1% efficient does it matter if 100 GJ of electrical power results in 100 TJ of fusion heat? I'm not saying this is at all how it scales, but it is the logic behind pursuing an ICF power plant. The fuel gets ignited and heats itself.

Also, for fun, 100 TJ is 24 kT TNT equivalent: slightly more than the bombs dropped on Hiroshima and Nagasaki. Trying to capture this energy released instantaneously would be a fun engineering challenge.
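The logic in this comment can be sketched as a toy calculation (the numbers and the 40% thermal-conversion assumption are illustrative, not NIF data):

```python
def engineering_gain(target_gain, laser_efficiency, thermal_efficiency=0.4):
    """Electricity out per unit of electricity fed into the driver:
    driver -> target gain -> heat -> turbine -> electricity."""
    return target_gain * laser_efficiency * thermal_efficiency

# 1% efficient laser but target gain 1000 (the 100 GJ -> 100 TJ example):
print(engineering_gain(1000, 0.01))  # 4.0: net positive despite the awful laser

# Same laser at target gain ~1 (roughly the current result):
print(engineering_gain(1, 0.01))     # 0.004: hopelessly negative
```

Scaled this way, a driver's inefficiency becomes forgivable if the target gain is large enough, which is the rationale for chasing ignition: past that threshold the gain rises sharply.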


Presumably they mean that there are efficiency losses in charging the capacitor banks used to fire the lasers; so that if you consider the system over multiple duty cycles rather than over a single cycle, it's no longer energy-positive. (I.e. if the system were capturing its emitted energy — and that emitted energy needed to be enough to act as a grid power source feeding input power to the capacitors, rather than merely being the equivalent of the direct output power of the lasers per shot — then it wouldn't be enough to sustain the reaction.)

But personally, I don't know whether that's actually important. Power plants usually consume a nontrivial fraction of their own produced power to power themselves, and in fact consume more than 100% of produced power when starting from a full stop — meaning that in initial few-shot conditions, even when feeding back their own produced power into themselves, they still need (huge amounts of) external power input to get going, like a car engine needing a battery + starter motor. Only a rare few kinds of power plant can be used to "black start" a power grid. Most types of generator need to overcome initial higher resistances, e.g. inertia (and thereby back-EMF resistance at the transformer) in getting heavy turbines spinning from a stop.

It wouldn't be at all strange if a practical fusion power plant turned out to be energy-negative over a few-shot run (i.e. required "bootstrapping"), but then became energy positive over a theoretical 24/7 run at whatever its optimal duty cycle is. And a single-shot run becoming net-positive would be a good point to start to consider those more practical calculations, since they'd have been useless to consider until then—a power plant can't possibly be net-positive over any kind of runtime + duty cycle, if its core reaction can't be net-energy-positive when considered in isolation.

Which is, to me, why it probably does make sense for NIF to be excited. They've reached the point where they can stop using a lab-bench model of power efficiency, and start trying to come up with another, more full-scale model of power efficiency to replace it with.


Not /u/acidburnNSA, but what was meant is that no laser is 100% efficient. Not only do they not convert 100% of their electrical input into laser energy, but they also require other support systems, notably cooling. So we need to consider the total energy costs of the building the fusion experiment is conducted in, not just the physically small area where the fusion reaction is happening inside the reactor.

Still, this is an important step in the development of fusion energy reactors.


So the laser energy that went into the reaction in the form of light is less than what came out of the reaction. However, the energy needed to produce that laser energy may be orders of magnitude more depending on the laser: the wall-plug efficiency.

Tabletop rigs can be as efficient as 50%; however, high-power systems such as we see here tend to come with drastically reduced efficiency.


I think fusion-plants have always been "15 years away", and most likely will be so for quite a few years...

Edit: I was wrong, fusion is always 30 years away: https://www.discovermagazine.com/technology/why-nuclear-fusi...


I don't know what people get out of repeating this on every single fusion article. It's not inventive or insightful, and it doesn't further the discussion in the slightest.


Because it is context that is rarely included in the article.


Some people are new to the fusion discussion. They've missed the last 50 yrs of "fusion is 10 yrs away" claims. Over the years, I've learned to temper all discovery excitement. It's the other side of the coin from the XKCD 10000 comic [1].

[1] https://xkcd.com/1053/


Because it's A) true, B) relevant to keep all of the hype in check. The year of Linux on the desktop is always right around the corner too. Yes, they are tropes, but they were not born out of nothing.

Someone has to keep the bloviated PR campaigns checked with reality. Otherwise, some crazy fools might actually start believing that fusion is real and get duped out of their money. If you can't stand a bit of real criticism, then maybe you should sell your scam somewhere else. Otherwise, take it on the chin, retool your message, and come at it honestly.


It's not true. The original quote was 30 years given current funding. They reduced the funding and surprise surprise it didn't get done. It's like when you estimate how long a project will take given a thousand people, and they reduce the number of people on the project to one person and then hold you to the original estimate.


Okay, but then if the funding has decreased, why hasn't the "years away" increased? No, that wouldn't sound good in a press release now, would it. So they keep saying it is just around the corner. It's like religious people saying that the second coming is right around the corner, for over a thousand years now. I know, I know, religious zealots and science (zealots?) are different. Or are they?


Show me a fusion scientist saying fusion is 30 years away. No one in the article is even saying that. It's people in the comments repeating the same thing from the 80s.


What article? It's people speculating on the announcement that another announcement is coming. It just feeds into the hype machine. With this level of hype, watch them come out and show off the Segway!


Okay. Guess I'm done here.


If you want to keep the hype in check, do it with facts like /acidburnNSA did above. Let people debate. You don't even know what will be announced. Repeating the same joke in every single fusion article is tiresome and long past its expiration date.


Why does it have to be funny? It's just a sad statement about the situation. Maybe you're tired of people not being as excited as you, or even willing to hold their breath a second longer on this topic. But here we are at another announcement essentially saying "this shit is hard. with more funding, we could possibly maybe do something in the nearish future". Anything announced in the PRs is just mumbo-jumbo hand-waving; what they are saying isn't really saying anything substantive, other than keeping fusion in the news so it is easier to raise money. This is the main perception of fusion by the masses.

Personally, I just don't see fusion being a viable solution for anything in any of our lifetimes. I will gladly admit how wrong I was if/when someone solves it. I just have a much stronger doubt in sci-fi vs reality, and don't get swooned by the hype machines surrounding fusion.

What is tiring to me is calling the skeptics tiring. But to each their own.


I think one can be simultaneously excited about a big breakthrough like this, but also understand that there's still a ton more to do before we have viable fusion power.

And it's unreasonable and annoying to expect everyone to say "This is amazing, but..." rather than just "This is amazing". Yes, we know, fusion power isn't ready, and we have no idea when (or if) it will be.

I haven't been "holding my breath". I've been watching from afar, checking in occasionally (like when this sort of news comes out), and I genuinely think this particular breakthrough is exciting. I don't need the tiresome -- yes, incredibly, frustratingly tiresome -- legion of naysayers coming in and stating the obvious every single time.


It's also weird to watch people debate passionately but without the passion to actually gain expertise in the thing they are debating. I find it weird that we do this. I'm not immune, we all do it. We should at least be cognizant and try to reduce how heated we get over things we know so little about. It is just weird.


It's not a trope; it's a cliché. There's nothing wrong with poking holes in overinflated hype, but do they have to be so boring and repetitive about it?


It's also just a parrot trick. There's no reason behind why it is 30 years away or even why 30 instead of 15 instead of 20. It is just a line. These numbers are meaningless but touted as a way to add validity to the argument without providing actual evidence for why fusion is such a tough nut to crack. We should dispose of hype, but let's do it from a place of understanding. I hope we're a bit smarter than parrots.


If you keep telling me the same thing with the same lack of results, I could say the same to you as being boring and repetitive. Just because you say 2+2=5 and someone tells you you're wrong every time doesn't mean they are boring and repetitive.


How is this "lack of results"? This particular announcement is a huge result!

Maybe it's not the result you think it should be ("with all they hype over decades, we should have fusion power by now"), but... too bad. It is what it is, and this particular announcement is indeed impressive.


It is not, in fact, a huge result, except insofar as it is convenient for further weapons research. It does not bring civil fusion power even a single day nearer.


Not an engineer in this field, so I may have misread/misunderstood, but I read that 2.5MJ out for 2.1MJ of laser energy in, NOT the total energy needed to make the whole thing work.. So, in a layman’s world, it is not a net gain of power, only a small subset of the system yielding more power than it took in.

Happy to be proven wrong and told that it is more of a breakthrough than I think it is..


No, you are correct.


So they are ignoring the laser efficiency as well as the thermal-to-electric efficiency? If you did the same for a tokamak, stellarator, or Bussard, would you get a similar ratio?


Yup. Fusion research is necessary and funding should be provided. But it is not close to commercial viability as a generator.

So at the moment there is no working design for a plant that produces more electricity than it takes in.


Yup? Absolutely not, nothing close to this has been achieved with other reactors.


Electricity in, heat out, I think. Getting that heat back to electricity will cost some; I expect the losses to eat up more than that 0.4 MJ margin.


The efficiency of a thermal power plant is around 40%, depending on the temperature of the steam it can produce. https://en.wikipedia.org/wiki/Thermal_power_station#Thermal_...
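As a rough sanity check on that ~40% figure, here is the ideal Carnot limit for typical steam-plant temperatures (a sketch; real Rankine cycles fall well below this bound):

```python
def carnot_efficiency(t_hot_c: float, t_cold_c: float) -> float:
    """Ideal heat-engine efficiency between two temperatures given in Celsius."""
    t_hot = t_hot_c + 273.15    # convert to kelvin
    t_cold = t_cold_c + 273.15
    return 1.0 - t_cold / t_hot

# Superheated steam at ~560 C, rejecting heat at ~30 C:
print(f"{carnot_efficiency(560, 30):.0%}")  # 64% ideal; real plants reach ~40%
```

The gap between the ~64% ideal and the ~40% achieved comes from irreversibilities in the boiler, turbine, and condenser.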


Note that in ICF the energy is released essentially instantaneously. There is currently no production-ready approach to capturing this energy; doing so is still an ongoing research topic.


The net energy gain is very slim and has to be converted to electricity to power the lasers – in doing so, there's so much loss, it is again NEGATIVE.

It's always the same…


As it always is with new, unproven things.


or fusion.

There are always these articles: net energy gain finally! and then: no not really.


Reminds me of solar. That took a century to get to where we are today where the net energy output is much greater than the energy needed to manufacture them.

It being hard and it requiring continual progress does not mean that progress does not occur.


How long has humanity been working on fusion? Wasn't Ivy Mike in the early 50's? Glaciers continually progress too, but it's not obvious on human timescales.


Correct. Nuclear fusion research should be funded and realistic goals be set.


…which is exactly what this is?


Since you seem to be an expert in that field, what is your perspective on fission for the short term? Are SMRs really viable?


I'm not super excited about current SMR projects either, sadly. The economies of scale that they explicitly turn away from are very real. The economies of mass production that they rely on can't be achieved unless a lot of people are willing to buy the first N for high cost. But who will buy after the first few boondoggle a bit?

I am excited about standardized large light-water reactors at the moment, like the US/Japanese ABWR or Korean's APR-1400 designs. I wish there was more hype around them rather than SMRs and advanced reactors.

My favorite idea in nuclear for rapid, deep decarbonization is to use a shipyard to mass-produce large floating reactors. This gives you economies of scale and economies of mass production. Amazingly, this was seriously attempted in the 1970s and '80s in Jacksonville, FL on Blount Island, where Offshore Power Systems installed the world's largest gantry crane and got an honest-to-goodness manufacturing license from the Nuclear Regulatory Commission to build 8 of these. [1]

Sadly, my concern above with SMRs happened to OPS and they couldn't break through. Such a good idea though.

[1] https://whatisnuclear.com/offshore-nuclear-plants.html


I'm curious, when you're talking about the SMR projects, does that include the Natrium reactors from TerraPower? I think they're backed by the Gates Foundation? Those seemed pretty interesting to me as a nuclear layman. Also, I don't know a lot about Bill Gates, but he does seem like the kind of guy that if they showed some real success, boondoggle or not, he'd be willing to brute force his way past those issues by throwing money at the problem.


I would say Canada is the furthest along (in the West) for true GenIV reactors.

They are taking an active approach:

https://nuclearsafety.gc.ca/eng/reactors/power-plants/pre-li...

Reactors like the ISMR from Terrestrial Energy and the SSR from Moltex that will operate at 500MW (rather than true 'small' reactors) are far more reasonable for scale.

They look like 'small' reactors but they pack quite a punch in comparison to PWR designs.

Any nation that just seriously commits to a single reactor design like this and plans to build 50 of them will do really well.

But I agree the same could be done with APR-1400 or AP1000.


Wouldn't the possible locations for floating reactors be much more limited than for SMR projects? I would think special financing might get the ball rolling for SMRs: strong, decades-spanning incentives for first movers.


What's your definition of SMR?

Because some countries consider even 500MW reactors SMR if they are GenIV.

SMR has become kind of widely used for lots of different things.


Much of fission's complexity comes from safety/damage management. Even after years of advancements we hear about incidents and radioactive leaks every other decade.

Fusion is a much safer alternative, both in incidents and in fallout.


Is it? Well I guess it is because we don't have a working fusion power reactor yet so the likelihood of an accident is zero. However, if we did have a fusion reactor it would be producing a lot more radioactive waste than a fission reactor.

I definitely wouldn't want to make any broad sweeping statements about something that hasn't been built yet.


I'm a bit at a loss; can you elaborate on how fusion would produce more radioactive waste than fission? We already have a few fusion generator designs, so it's not a sweeping statement about something totally unknown. AFAIK, the by-product of fusing tritium/hydrogen is helium, which is far from radioactive. I imagine some radioactive isotopes might get produced as well, but I can't imagine them being anywhere near as dangerous as spent uranium fuel and contaminated components in fission reactors.

In the case of an accident I would also imagine an explosion from a fusion reactor, but the fallout from it would not be anywhere near as dramatic as a fission reactor leak or explosion.


More in the sense of a larger mass and volume of activated material. Fission waste has more curies of radioisotopes, but they're concentrated in the fuel rods.


> At the moment fuel costs in fission are like 5-10% of total costs for a fission fleet.

Yeah and with a breeder fission reactor we could reduce this to below 1% probably. With a thorium breeder the fuel cost might be essentially 0%. In the vision of Alvin Weinberg you literally just drop some thorium into the fuel salt every once in a while.

But the real issue for nuclear energy is currently capital cost and time not fuel cost. And capital cost can go down massively with GenIV reactors as well.

So I don't see how fusion will be cheaper.

> In fusion it could be lower

But eventually you have to start breeding tritium, so wouldn't that make it more expensive?

> Disclaimer: I switched from studying fusion energy to advanced fission 16 years ago.

Awesome, we desperately need GenIV reactors (even if I dislike that term).


> ...put special rocks in a grid and pump water over them as they pour out their star energy.

I'm pretty sure I saw that in a 'goop' sales pitch.


>After that, we can see what a practical electricity producing plant looks like

I guess we still don't have anything better than boiling water, right?


Right. But slapping boiling water around the burning plasma usually turns into a bit of a Rube Goldberg machine. See LLNL's LIFE design for example [1]. Things like molten salt walls circulating through a steam turbine and all that.

There are other ideas too, but it's hard to beat a Rankine cycle.

[1] https://en.wikipedia.org/wiki/Laser_Inertial_Fusion_Energy


You can't just slap boiling water around the burning plasma in a DT reactor, since you need almost all the neutrons to make more tritium. Water would absorb too many neutrons. The IFE designs use thick showers of liquid lithium or molten FLiBe for this reason.


HILIFE-II Inertial Confinement Fusion Power Plant Design:

https://web.archive.org/web/20150404075829/https://hifweb.lb...


I think we just haven't found the right fusion design.

If we use a reaction that primarily produces beta radiation or other high energy charged particle, sending it through a coil of wire would induce a voltage that we could extract as electric energy.

For that matter, appropriately located coils could be used to extract thermal energy from the plasma directly. The trick there is that we can't get much with the current tokamak and stellarator designs -- the thermal energy is too disordered to use a large coil and the plasma flow is not sufficiently confined to use small coils. There are almost certainly better configurations, but the magnetohydrodynamics simulations are tricky. If we keep at it I'm sure we can find a stable configuration with fewer degrees of freedom.


I'm surprised too. I've looked into this before, and it's absolutely right - just not intuitive to me.

We do have radio-photo-voltaic devices, but they're so inefficient it's laughable. And we have RTG generators, which are only practical in limited situations, and again have a very low efficiency.

So hot water it is!!


Well, there is hydro, wind and photovoltaic. And in the fusion field there are startups working on aneutronic fusion, which can generate power directly from charged particles. LPPFusion is one that seemed promising a few years ago, but unfortunately less so now.


At some point maybe aneutronic fusion will be a thing. That will be more like solar panels.


This is great! Why is this great? It is great because between magnetic confinement and inertial confinement approaches to fusion generation it is the FIRST one to demonstrate energy gain.

If you are a programmer, think of it like your program compiling successfully for the first time. It means that all of the bits between you designing the program, the program being compiled, and the operating system recognizing it as a program, all did what they were supposed to. Of course your program probably doesn't do what you want it to yet, but you have validated a huge chunk of the "pipeline" between what you are trying to do, and doing it with the equipment you have. That is what this is: "hello world" for fusion physicists.

And the reason they are so pumped is that they have literally been told for DECADES that what they proposed to do "wasn't possible" (and by that I mean creating actual fusion through inertial confinement).

Steps 2 - n look a LOT more like engineering steps than "can this even work" steps, okay?


When your program compiles for the first time is usually when the real trouble starts.


It's also when you can start iterating effectively.


Damn right!! Now on the path to making it actually do what it is supposed to do.


For certain values of engineering steps.

Scaling up Qplasma from 1 to ~1000, and scaling up operating time from a microsecond to a megasecond are just two of them.

I have a feeling there is still some science to do.


The efficiency of the lasers is awful though, and they will have to get at least 100x that energy yield for it to be a net power source. A lot of heat winds up in the laser glass and it takes a long time to cool between shots, so you are doing very well to make a few shots a day. A real power plant is going to need more like 10 shots per second.

Heavy-ion fusion has been talked about since the 1970s and it seems much more practical than lasers for energy production because the efficiency of particle accelerators is pretty good (maybe 30% or more), but it takes a very big machine, the size of a full power plant, to do meaningful development. Something like that seems to need about 100 beamlines because otherwise space-charge effects prevent you from getting the needed luminosity. Given that you are going to need to protect the wall of the reactor and the beamlines from the blasts, and also have a lot of liquid lithium flowing around to absorb neutrons and breed tritium, it is hard for me to picture the beam quality being good enough.

There hasn't been much work on it since then. If I had $48 billion to spend I'd think a heavy ion fusion lab would be better than some other things I could buy.
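To see why the comment above asks for ~10 shots per second, here is a rough sketch with illustrative per-shot numbers (100 MJ per shot and a 40% thermal conversion are assumptions for the example, not design figures):

```python
def plant_power_mw(yield_per_shot_mj, shots_per_second, thermal_efficiency=0.4):
    """Average electrical output of a pulsed fusion plant, in megawatts."""
    return yield_per_shot_mj * shots_per_second * thermal_efficiency

# A few shots per day (say 3), even at a generous 100 MJ yield per shot:
print(plant_power_mw(100, 3 / 86400))   # ~0.0014 MW: negligible

# 10 shots per second at the same yield:
print(plant_power_mw(100, 10))          # 400 MW: utility scale
```

The per-shot yield only matters when multiplied by repetition rate, which is why driver cooling time is as much of a bottleneck as gain.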


It's not worthless research (not that you said it was), as it still validates various aspects of fusion energy and some of the engineering around it. And it's always been ahead of magnetic confinement devices because it only has to maintain the conditions for nanoseconds.

But NIF was never, and is not, designed to be a generating reactor, or even a prototype of a testbed. It's a weapons physics facility that happens to do some energy generating research sometimes.

That aside, hitting Q=1 (and be able to use the device again) in any way at all using any equipment is a major milestone that proves humans can get there. From that point, in theory, it's just engineering.


Yeah, either heavy-ion beams or electrically-pumped excimer lasers seem like the path forward for the driver. Higher efficiency, higher repetition rate, possibly more robust. They also need to do away with hohlraums and switch to direct drive, to reduce target cost, ease alignment issues, and increase energy efficiency.

I don't hold out much hope for a practical, economical reactor from inertial confinement, but it's certainly exciting to see them achieve ignition & scientific breakeven, even if it's 10 years behind schedule. The one nice thing about ICF is that the energy gain shoots up dramatically once you cross the ignition threshold. That means they're arguably closer than tokamaks, even though both concepts need ~100x the demonstrated gain to get from where they are now to a workable reactor. (I.e., tokamaks have hit Q~0.3 and need Q~30, vs ICF, which has hit Q~1 and needs Q~100.)
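The "~100x for both" comparison in that parenthetical can be spelled out directly (the Q values are the rough figures quoted in this comment):

```python
# Demonstrated vs. required plasma gain Q, per the rough numbers above.
demonstrated = {"tokamak": 0.3, "ICF": 1.0}
required     = {"tokamak": 30.0, "ICF": 100.0}

for concept in demonstrated:
    factor = required[concept] / demonstrated[concept]
    print(f"{concept}: Q {demonstrated[concept]} -> {required[concept]} "
          f"(needs ~{factor:.0f}x)")
```

Both approaches sit roughly two orders of magnitude short; they just start from different baselines.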


Unfortunately large fusion is unlikely to ever be economic because the cost of solar/battery is coming down so quickly and is already in the 1-2 cents per kilowatt hour for the solar component. And costs will continue to drop.

Small scale fusion on the other hand would have a viable niche application at the poles, in the sea or underground or any other environment that is without sun or space.


We won't know what the cost of solar/battery will be in a sustainable energy economy until someone builds a solar-powered solar panel and battery factory. At the moment, production costs are heavily (as in, entirely) subsidised by fossil fuels (mostly coal).


Cost of current production is an upper bound. As power costs fall, production costs that depend on that will fall, in lockstep.

Production is not subsidized: factories pay full price for their power.


You miss my point. The reason solar is so cheap right now (along with the huge amount of government subsidies) is that the huge amount of energy required to manufacture them is currently done with very cheap coal in China.

>Cost of current production is an upper bound.

Under the current state of the energy economy, maybe. If we had to replace all manufacturing power sources with renewables - absolutely not.


Power from renewables costs less than from coal.


Maybe - with government grants, and coal-powered manufacture of all of the associated generation equipment.

That's not very interesting though - what is interesting, which has been my topic of conversation this entire time, is what the energy economy would look like if it were not still fundamentally rooted in fossil fuels.

Given that coal and other fossil fuels are basically free energy - it does not take much at all to get energy from it (ie, set it on fire), it is not physically possible for PV generation to beat that. Therefore, it follows that renewable power will be more expensive than fossil fuel power. I don't see why this is so hard to acknowledge - we are living in a time of unreasonably cheap power, fuelled by several million years worth of stored solar energy. It can't last.


You make the same mistake fission boosters make. Converting heat to electricity is expensive. Solar and wind skip the expensive step, going straight to electricity. Electric power from solar and wind is already much cheaper than coal, without subsidy, for this reason, and because coal has to be dug up and transported. Coal has a high operating cost. Solar and wind have extremely low operating cost, and also very low capital cost, always falling.

Solar and wind, un-subsidized, are the cheapest power the world has ever seen, and their cost is still falling at exponential rate.


>Converting heat to electricity is expensive.

And? Most of our power usage is not supplied through electricity. Solar panels are never going to heat my house.


Why not? Plenty of places have enough sunlight to do so even in winter. Parts of Alberta have similar sunlight mid winter to PNG mid summer.

Plus storage is a thing. Using a heat pump to dry NaOH or melt sodium acetate, or to heat a large pond, can store low-grade heat economically for months. Ammonia or methanol can do so indefinitely.
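To put a rough number on the pond idea (all figures here are my own assumptions, not the comment's): the stored heat is just E = m · c · ΔT for a district-scale water pit.

```python
# Rough check of how much low-grade heat a seasonal water pit can hold.
# All numbers are illustrative assumptions: a modest district-scale pit,
# charged from ~30C to ~80C over the summer.

pond_volume_m3 = 10_000     # assumed pit volume
water_density = 1000        # kg/m^3
specific_heat = 4186        # J/(kg*K) for water
delta_t = 50                # K of usable temperature swing

energy_j = pond_volume_m3 * water_density * specific_heat * delta_t
energy_mwh = energy_j / 3.6e9
print(f"Stored heat: {energy_mwh:.0f} MWh")  # ~580 MWh of low-grade heat
```

That is months of space heating for a small district from one unremarkable hole in the ground, which is why pit thermal storage is one of the cheaper seasonal options.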

Then there's transmission. HVDC can transport 10 GW per line for thousands of km at costs comparable to local generation.

I'd be very surprised if you could avoid using a solar panel to heat your home in 40 years even if you go out of your way to do so.


We're talking about the cost of power. Putting aside the unbelievable idea that Alberta has as much mid-winter solar energy available as at the equator, using solar to heat my house is more expensive than burning some stuff inside.


https://globalsolaratlas.info/detail?c=47.756755,-110.19981,...

https://globalsolaratlas.info/detail?c=-5.462873,137.384064,...

Bifacial isn't in this model, but it boosts the snowy region by about 20% and the tropical one by about 5%

And what will the stuff available to burn be made from when there are plants producing ethylene or methanol or ammonia in Chile or Saudi Arabia or Mongolia for less than what gas costs to dig up?


So the energy that is cheaper than coal and driving operating coal plants out of business will make the cost of producing it go up when the share increases?

These mental gymnastics routines are olympic level.


The reason we can ignore the huge manufacturing energy inputs required to make solar panels is that their manufacture is powered by cheap domestic coal in China.

>driving operating coal plants out of business

Any specific ones? The only coal plants I've seen get shut down are because of environmental reasons (or age). Some countries, like Germany and China, are re-opening or building new coal plants.

Talking of mental gymnastics - fundamentally, the energy economy boils down to EROI (energy returned on energy invested). It's just wishful thinking that we can replace energy sources that are basically free (coal, oil, gas), with those that have energy payback periods in the mid-double digits of their expected lifespan (solar).


Try a new solar panel rather than one from the 90s. Your Shellenberger tripe about EROI went off when the EROI of solar surpassed that of nuclear and EPBT dropped below 18 months (or 6 months in sunny countries). Then it went even more rancid.

If you're really worried about it, buy a panel from Europe; the polysilicon (90% of the energy) comes from hydro-, wind-, and nuclear-powered countries.

Even if all the money for a solar module went to coal generation at Chinese or Indian prices and nothing else, it would pay back that power in under two years.

If the only activity involved in making PV was to spend the entire system cost on lignite and burn it directly at the mine front, it would *still* produce more energy in its lifetime than putting the coal in a coal plant.
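The lignite claim above can be sanity-checked with rough numbers of my own choosing (the module cost, lignite price, capacity factor, and plant efficiency below are all assumptions, not figures from the comment):

```python
# Back-of-envelope: spend a PV system's entire cost on lignite instead,
# and compare the electricity each route delivers per dollar.

HOURS_PER_YEAR = 8760

# --- PV side (assumed utility-scale figures) ---
pv_cost_per_watt = 1.0      # $/W installed (assumption)
capacity_factor = 0.20      # sunny-ish site (assumption)
lifetime_years = 30
pv_kwh = capacity_factor * lifetime_years * HOURS_PER_YEAR / 1000  # kWh per $1 of PV
pv_mj = pv_kwh * 3.6

# --- Coal side: spend the same $1 on lignite at the mine ---
lignite_price_per_tonne = 20.0   # $ (assumption)
lignite_mj_per_kg = 10.0         # low-grade lignite (assumption)
plant_efficiency = 0.35          # thermal -> electric
coal_kg = 1000 / lignite_price_per_tonne
coal_thermal_mj = coal_kg * lignite_mj_per_kg
coal_electric_mj = coal_thermal_mj * plant_efficiency

print(f"PV electricity per $1:         {pv_mj:.0f} MJ")
print(f"Coal-plant electricity per $1: {coal_electric_mj:.0f} MJ")
```

Even with these deliberately coal-friendly assumptions, the electricity per dollar comes out roughly even, and any real PV supply chain spends far less than 100% of system cost on energy.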

It's absolutely laughable that you think you can keep spouting this ridiculous lie.


>the EROI of solar surpassed that of nuclear

Where do you get your numbers from?

>it would pay back that power in under two years.

That's exactly the problem. This is a significant portion of the lifetime output of the panels.

>it would still produce more energy in its lifetime than putting the coal in a coal plant.

I'm not arguing that solar panels are a net negative, as you seem to be implying. I'm arguing that the energy economics of a world fuelled entirely by solar (and other renewable technologies - solar is about the worst for EROI) would look very different to what we have now.


Crystalline solar panels have a benchmark lifetime of 30 years and are consistently outperforming predicted degradation rates. None have worn out yet, but the best guess is a median 40-year lifetime. A new Panasonic or Jinko mono panel installed in India has an EPBT under 6 months and an EROI around 100.

You're the one making the insane claims. You back them up.


>A new Panasonic or Jinko mono panel installed in India has an EPBT under 6 months and an EROI around 100.

I certainly haven't made any claims as specific as this without any backup!


You claimed solar has too low EROI to be viable.

Prove that new solar in a median location has lower EROI than the median for new gas, using up-to-date info on the whole process and the solar cells you would actually buy for a project started now, such as 155-micron-wafer mono PERC.


>You claimed solar has too low EROI to be viable.

Nope, I said that it's lower than other sources of power, and thus an energy economy based on solar will look very different than what we currently have.

Given that electricity represents a relatively small percentage of our power usage, in the majority of cases (materials manufacture, industry, heating, etc), the EROI of renewables will be worse than fossil fuels.


Prove that thermal energy from shale oil or tar sands is higher EROI than that same solar panel using a resistor or arc furnace then.

Then add heat pumps and PV+heliostat or PV+CSP derived hydrogen compounds to your equation, and realise that adding heat and chemical feedstocks shipped from distant places makes the equation favour renewables even more: you can turn 120 MJ of electricity and 40 MJ of direct sunlight at Chile's 35% capacity factor into 120 MJ of hydrogen, or 100 MJ of ammonia you don't have to refine. With the heat pump you'd get more low-grade heat even if you burnt the fossil fuel for electricity.

Wind + PV is a pure upgrade from an EROI perspective, and electrolysers and CSP are following very close behind.


You forget how large a part of the world has at least two months a year with a nearly 90% drop in available solar energy.

Those places have to use traditional energy sources, or buy energy from neighbors.


I only mentioned solar/battery for brevity, but clearly wind/battery is already substantial in many parts of the world. In addition, HVDC transmission line costs are dropping year by year, and these lines allow solar/wind-generated electricity to be shifted inexpensively across long distances.

For example, such a transmission line is currently being built to send solar energy from Northern Australia to Singapore across about 3000 km of ocean. Another project is generating wind energy near Iceland and sending it to the UK, a distance of about 800 km.


Ok, and thank you for the reminder about the Australia-to-Singapore transmission line.

Unfortunately, most of the territories I mentioned also have low population density (about 1/10 of western Europe's) and low-to-middle incomes, so it is not right to compare them directly with western Europe or Singapore in their capacity to build the same power infrastructure.


There are well under a hundred million people who aren't within easy transmission distance of somewhere with at least a 10% capacity factor for a bifacial system in mid winter, and who don't already have more than enough hydro to go wind/hydro.

If 1% of the world needs to get 30% of their energy from gas while we figure out the hydrogen thing, it's not really a problem.


I'd be very interested to see the breakdown of input energy costs. Most notable is the raw energy cost required to power the lasers and control machinery in the experiment. But then there are other costs, all of which must be amortized over time for any real-world use case to exist. I say this because the journalists in this piece imply that net gain is simply based off of the amount of energy pumped into the experiment while it operated, but the total input energy would clearly be more than that.

On the extreme end, there's the energy cost of building the machine and engineering its components. For the vast majority of these, we can probably all agree that were a fusion power plant to be built, the net gain would fully eclipse these initial inputs fairly quickly. This may sound silly, but remember that the economic context where fusion so often sits is one that centers on renewable energy and sustainability. These costs do have to be accounted for.

On the other end, there's the energy cost of consumables. For example, the deuterium and tritium fuel input into the device, which need to be purified (deuterium from water, possibly tritium from the atmosphere) or otherwise isolated (from what I understand, tritium is a byproduct of fission reactors, which serve as its primary source in scientific applications). It may well be that the energy cost of acquiring these consumables is a fraction, or fractions of a fraction, of the energy cost of running the device, effectively constituting a rounding error. But I think when we're talking about net gain, a clear definition and accounting of the input energy required to run the experiment would be useful to communicate to the public.

I hope we see disclosure of these details with all the expected caveats when the peer-reviewed article goes to print and journalists have another feeding frenzy.


Early reports are not good. For every joule delivered to the chamber, it takes 100 joules of electrical power. Heat to electricity is 50% efficient at best. Reports are that with 2.1 MJ of input, they generated 2.5 MJ of output. Taking inputs and electricity production into account, this means 0.6% is all they are getting out vs. what they put in.

These over-unity reports are meaningless, because every damn one of them only measures Q-plasma, not Q-total.
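For what it's worth, the arithmetic behind that 0.6% figure, using the numbers quoted in the comment above:

```python
# Reproducing the comment's arithmetic (figures as stated there):
# Q-plasma looks > 1, but wall-plug-to-grid ("Q-total") does not.

laser_energy_mj = 2.1            # energy delivered to the target
fusion_yield_mj = 2.5            # reported fusion output
wall_plug_per_laser_joule = 100  # electrical J per delivered laser J
heat_to_electric = 0.5           # generous turbine efficiency

q_plasma = fusion_yield_mj / laser_energy_mj
electrical_input_mj = laser_energy_mj * wall_plug_per_laser_joule
q_total = fusion_yield_mj * heat_to_electric / electrical_input_mj

print(f"Q-plasma: {q_plasma:.2f}")   # ~1.19
print(f"Q-total:  {q_total:.4f}")    # ~0.006, i.e. ~0.6%
```

The headline gain and the wall-plug gain differ by a factor of 200, which is the gap the two Q definitions hide.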


NIF people like to call this the “target gain”, but someone there has been whispering “net gain” to the journalists, in their ongoing campaign of deliberately deceptive hype. But “target gain” isn’t even (pellet output)/(laser output). The denominator is laser energy deposited in the hohlraum; the rest of it doesn’t count. And this deposited laser energy is estimated based on a model of laser deposition—it’s not measured. (At least, this is the way it was the last time I bothered reading a paper from NIF. I got bored with it a while ago.) The modeling codes are classified; no one without a need to know gets to examine them, and they are not well benchmarked. So the actual target gain is likely < 1 in any case.


> The modeling codes are classified

What is the justification for keeping it classified?


NIF's purpose is to validate simulations of thermonuclear weapon detonations. The default is going to be classified; it's making something public that would have to be justified.


Because it's a weapons program that is entirely unrelated to power generation.


I have personally taken a tour of the NIF at Livermore. The guide was an old hand, who constantly remarked about the efforts of NIF towards "stockpile stewardship," ie the maintenance of the US arsenal of nuclear weapons. It seemed like NIF was all about the stockpile stewardship first, and fusion research was a secondary consideration.

The capability of the NIF to get positive energy from the energy that they impart on the Hohlraum itself is neat, but I constantly discount any milestones that Livermore/NIF report, because the inertial confinement approach has such higher barriers to commercialization than tokamak style approaches, that I just consign it to "boondoggle" in my head.

Yeah, the lasers could be 20x more efficient, and yeah, they probably could figure out how to pump 10s of targets into the chamber per second, but the energy extraction is just completely missing from the considerations. The engineering challenges are a whole 'nother level for NIF, a big barrier to usability.


Seems like energy extraction would be similar to other D-T designs: surround the reaction chamber with molten FLiBe or lead-lithium and run some coolant pipes through it.


> surround the reaction chamber with molten FLiBe or lead-lithium

So manufacturing fusion reactors would use a lot of lithium, which is already in short supply. That would be an interesting complication with the demand of lithium for electric vehicle batteries. Maybe the Li supply situation will be eased by then.


The quantity of lithium required is minuscule, but it would require a fair bit of enrichment to eliminate the Li-6. The limiting resource is the inner wall, for which no known material is even close to sufficient other than beryllium, and that would take double the annual world production.



Personally my money is on SPARC as a demo plant and its planned successor ARC as a commercial power plant prototype. Unlike ITER these systems use high field strength superconducting magnets, which directly translates to a much smaller machine for the same energy gain. Because of the smaller machine size, it can be built much faster than ITER. The company building SPARC plans to achieve first fusion around the same time as ITER, and since their machines are smaller, they should be able to move faster. That said ITER will be fantastically useful for proving a lot of science, and I am happy we have so many viable fusion projects in the works.

https://cfs.energy/news-and-media/new-scientific-papers-pred...


ARC's volumetric power density is just 40x worse than a PWR's reactor vessel, vs. 400x worse for ITER. Neither appears to be on a route to an economical power plant.


I’m not sure I follow, but you’re saying the power plants are very large relative to their power output, and this size correlates to cost, thus making them so expensive that they are not economical?

You’re probably right, but I guess what I’m saying is that we’ve never had a fusion power plant that produces net energy gain at any cost. I believe SPARC is on the path to doing so. It will still take a long time to make fusion actually affordable. For what it’s worth I am a huge advocate of wind and solar power. But fusion is neat and I’m excited for us to get to a point where we actually have sustained Q total greater than 1.


Yes, that's right. View it this way: a fission power plant and a DT fusion power plant are pretty much the same, except for the reactor. The fusion reactor is many times the size (and mass) of the fission reactor, made of much more sophisticated materials, with a much more complex design, operating at higher stresses (loads on supports of the magnets, thermal power/area at the wall, neutron flux). So how is it that it's expected the fusion reactor will produce power more cheaply than the fission reactor? Note that fuel is today a small fraction of the cost of power from a fission reactor.


Ah, yeah that is totally valid. My thinking is not exactly that fusion power will be cheaper than fission any time soon, but that the technology has the potential to deliver safer power, and as important as wind and solar are to our transition over the next few decades, I believe that fusion has the potential to deliver much higher levels of power than wind and solar, allowing for new uses for electricity previously considered impractical. I wonder what kind of new manufacturing processes we can come up with if we have enough power to deliver huge amounts of process heat, for example.

I agree with you that in a practical sense fusion power will not be economical in the next 50 years, but then solar power was not economical for most of my life either. I am excited for the technology to get to the point where at the very least it is producing power, as this will stimulate more investment in lowering the costs, and has been such a dream for longer than I have been alive.


Solar and wind potential is enormous. The current world average primary energy demand is 18 TW, but the Earth is constantly being struck by 100,000 TW of sunlight. In no sense is a shortage of sun and wind an argument for fusion.

As for safety, the problem with fission isn't safety, it's cost. Trading off economics to obtain better safety is solving the wrong problem.

If fusion is not to be economical for 50 years, it will be competing against renewables (and storage) that have gone fully down their experience curves. In a world fully powered by PV, on the demonstrated historical experience curve, the LCOE from PV could be below $0.01/kWh (in today's dollars). Fusion will have a very difficult time competing against that.


Tokamaks are science projects and confusing them with electricity generators benefits no-one.


I'm a complete layman when it comes to ICF, but I'm assuming that there is a scaling factor between surface area and volume that would eventually help here? As in, the lasers initiate fusion on the surface of the fuel pellet, which propagates fusion into the interior of the pellet in a chain-reaction / positive-feedback kind of way. Since the laser-facing surface grows as r^2 but the fuel mass grows as r^3, a pellet with 10 times the surface area holds about 32 times the mass. So you'd need 10 times the current input power, but could get roughly 32 times the output power.
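A quick check of that area-volume scaling (idealized sphere, pure geometry; nothing here addresses whether the implosion physics actually cooperates):

```python
# If laser input scales with pellet surface area and yield scales with
# fuel mass (volume), how does the ratio move as the pellet grows?

import math

def scale_pellet(area_factor):
    """Return (radius, volume) scaling factors for a given surface-area factor."""
    radius_factor = math.sqrt(area_factor)  # A is proportional to r^2
    volume_factor = radius_factor ** 3      # V is proportional to r^3
    return radius_factor, volume_factor

r, v = scale_pellet(10)
print(f"10x area -> {r:.2f}x radius, {v:.1f}x volume/mass")
# Volume grows as area^(3/2): 10x the area gives ~31.6x the mass.
```

So the geometric leverage is real but sublinear in area squared, not the full 100x one might naively hope for.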


That won’t work, I don’t think. You can’t make the pellets much larger because laser nonuniformities and hydrodynamic instabilities will kill the implosion; there will be no fusion at all. But that’s not a problem, you see, because in a commercial reactor you’ll have a pellet factory making the required one million targets per day, and they will be injected into the chamber 10 times per second, with practically no down time. And each shot will have gain > 100 to get net energy out.


That’s pretty much what this design would require for continuous operation.

But I cannot tell if this comment is being facetious or rather optimistic. Therefore, I’ll agree!


To everyone whining about the lasers: who cares? It's an engineering problem, and there are already clear solutions and paths forward. Efficiency of the plant isn't what's being discussed in the announcement. Plant efficiency is an entirely separate problem that we already know about.

The exciting thing is that they've shown a fusion reaction in lab conditions which produces more energy than it takes to start it. Yes, the gain is small. It's nearly irrelevant to the amount of energy used to run the reactor, yes.

But it clearly shows that what we're trying to do is possible, and we've identified one mechanism that can initiate these reactions.

It's exciting. This is a great result that shows the science is progressing and beginning to finally show results.


Engineering problems are perfectly capable of dooming a technology. Fission's problems are all engineering problems.


I see I'm a little late to the discussion, but this is very similar to the patented fusion device I've been proposing here for several years. http://www.DDproFusion.com

The main idea behind my reactor is to contain NIF like explosions in magnetic fields. I've been trying to get a test reactor built for a long time, and my plans have been hampered by a theft earlier this year.

A huge difference between the current NIF device and my proposed device is the speed of implosions and the strength of the field. While the NIF device must be re-built before every implosion, my device creates an environment where the implosions form as part of a harmonic oscillation. The ions are allowed to travel their entire individual cyclotron trajectories before they return to the implosion site... my target frequency was 2.4GHz, which is a useful frequency for direct conversion and COTS components.


I thought I would let you know that you misspell "inertial" on your website as "innertial". Normally I don't point out spelling mistakes, but I think it may be more noticeable when pitching a nuclear fusion reactor.

https://en.wikipedia.org/wiki/Inertial_confinement_fusion


Thanks, I always try to do my best, but I can only see through my own eyes.

I had one of the main videos on the site set to private for a few years until someone told me they couldn't see it. I had only tested on my own devices which were all logged in to my Google account so I never saw the problem.

I felt so stupid, but very thankful for the feedback. How many people went to the site and immediately dismissed it because such a central piece was missing? Probably a lot. I'm sure there are scientists who will never take a second look at the idea because of stupid errors like that. So truly, I value the feedback.


Happy to help. Best of luck!


Technically many of the issues with fusion might be solvable.

However, I see no reason whatsoever why fusion would ever be cheaper than a GenIV fission reactor. I guess the advantage fusion has is that states actually invest serious money in fusion, while fission is struggling to get funding.

The reasons I think fission will remain cheaper:

- Capital cost is less. A GenIV fission reactor is pretty low-tech overall; in a non-water-based reactor the containment is mostly just a steel tub. Everything around the reactor (the heat loops, the turbine, and so on) will be mostly the same.

- Fuel cost. Fuel is already a small part of reactor cost, and if we switch to a breeder the fuel cost is basically nothing. For fusion, fuel is a long-term issue: you will likely have to breed new tritium.

- Operating cost. It seems to me that a self-regulating GenIV reactor should be easier to operate overall, with much less complex technology that could break.

- Safety. A GenIV reactor that is passively safe is already incredibly safe. Especially with a molten salt reactor, the radioactive chemicals that would otherwise get blown into the air will just remain in the fuel salt, inside the reactor safety zone. A fusion reactor does actually contain radioactive material that could be dispersed into the air. A fusion reactor might still be safer, but the difference doesn't seem that big.

So really I don't get it: why would fusion ever be cheaper than fission?

That said, I'm not against fusion research. I just wish more money was spent on actually getting GenIV fission reactors into real-world use. That would be a more viable solution to the planet's energy problems.


Very encouraging to see at least some enthusiasm for this. This is the real way forward.

We can't just stop using energy. We can't buy our way forward with "carbon offset" fees. And, most importantly, we can't just redirect all of our environmental conservation efforts to eliminating energy use. Remember when we were going to save the rainforests? Don't forget why we called these "green" initiatives in the first place.


Sure we can. Nothing is given. We are very likely on a path of contraction over the next century. Humans are resilient, we probably won't have a full societal collapse, but we might.

If we could pick a World 3 track to be on, which one would it be? Now, what can we do to try to push ourselves onto that track? That's what gets me up in the morning.

https://youtu.be/kVOTPAxrrP4


You are correct. We need to control our own numbers at a sustainable level.


> We need to control our own numbers at a sustainable level.

That's not at all what GP said IMO


Would anyone knowledgeable about the field update their priors about whether we’ll see commercial fusion in the next 30 years, after seeing these results? If not, is there a big milestone we’re waiting for? Or will fusion advancement be a slow grind with many small improvements over decades?


I'm not an expert but I've been following the field for a while. It's telling that negligible venture capital is pursuing this route to commercial fusion, and the only cheerleading for it comes from DOE lab press releases. That's because the NIF is a thermonuclear bomb simulator developed by a lab tasked with both thermonuclear bomb development and also developing a portfolio of civilian applications for its technologies. Even if the NIF were to break even on the entire power plant package in theory, harvesting energy from fast fusion neutrons is hard enough in magnetic confinement designs without them pulsing like a bomb as they do in ignition designs.

Meanwhile the VC money is quietly piling into tokamak and stellarator magnetic confinement designs, driven by high expectations from real breakthroughs in ReBCO tape manufacturing technology. These superconducting tapes can be manufactured like semiconductors and can develop magnetic fields that were previously impossible, which is a key manufacturability enabler in a design whose path to commercialization is far better de-risked overall. There are still concerns with the durability of equipment needed to capture the neutrons in these designs too, but ReBCO tapes were the real prior changer.


Funding is starting to kick in for private laser fusion attempts. Over the past couple decades, lasers have advanced even more dramatically than superconductors.

https://physicstoday.scitation.org/do/10.1063/pt.6.2.2021102...


Currently, about $3B is invested in fusion per year, while about $6000B is spent on oil subsidies. That's just to show how little we spend on fusion. Any decent increase in spending would really help speed up the process. I think that's something we should all be promoting!


Thank you - exactly what I was curious to learn more about!!


I don't see this as dealing with the considerable obstacles to inertial fusion. In particular: cost of lasers, size of the system with survivable final optics, cost of manufacturing the targets, and targeting of moving targets with sufficient accuracy.


The big milestone is construction materials that can withstand the neutron flux for long enough; it is about two orders of magnitude higher than in fission reactors.

This is the last unknown in the equation. All the others are already known from the achievements of the last few years.

Materials research is one of the primary targets of ITER.

If good enough materials are not found fast enough, we will need to use cleaner (aneutronic) reactions like proton-boron fusion, which need an order of magnitude higher temperature, so a practical device will be a few times larger (because of x-ray losses, proportional to the surface area of the plasma configuration).


ITER will only operate for a few weeks total at full power. It's not intended for materials development. For that, a Fusion Nuclear Science Facility (FNSF) would be needed.


You don't understand. ITER will RESEARCH how existing materials hold up in a real fusion reactor, and gather the parameters of a real fusion reactor, so other science facilities will have benchmarks.


ITER is fundamentally unable to replicate the conditions that materials will be subjected to in an actual commercial fusion reactor. It cannot achieve the cumulative neutron dose that a real reactor will experience. It will not be able to answer the questions that need to be answered to prove out the materials for first walls or blankets, and it will not be able to establish reliability metrics for these structures.

For this reason, there has long been a call for a FNSF. This facility is likely to be needed to establish designs for components that would go into the putative successor to ITER (DEMO).


> ITER is fundamentally unable to replicate the conditions that materials will be subjected to in an actual commercial fusion reactor

Are you joking? Or do you just not know the physics?

What REALLY distinguishes ITER (or DEMO) from a real commercial reactor?


No, I'm not joking.

ITER fails in at least two ways. First, the intensity of neutron radiation at the first wall is far too low for a viable commercial reactor. It cannot simulate the heat load a commercially viable breeding module would encounter. Second, ITER cannot operate for more than a few weeks, so it cannot simulate the integrated radiation load a commercial first wall would have to be able to withstand. It also cannot operate with enough blanket modules, for long enough, to move the designs down experience curves for reliability growth to occur so they are sufficiently robust for a commercial reactor (this is a huge looming problem, as they will be very difficult to repair.)

Abdou at UCLA has been beating the drum for a FNSF to actually address these issues. He's been beating this drum for DECADES.


> the intensity of neutron radiation at the first wall is far too low for a viable commercial reactor

Source? Proof? Sorry, to me this looks like just your opinion.

> Second, ITER cannot operate for more than a few weeks

This is just not important at all right now. This is what I meant when I said you don't understand the physics.

- NOWHERE on Earth is it possible to recreate the exact radiation environment of Jupiter's orbit for YEARS, yet we need to test radiation-hardened computing environments for space probes.

What is actually done? After the first probes measured the parameters of the environment, test benches were built on Earth, consisting of a few throttleable sources that approximate the spectrum near Jupiter but can deliver a year's dose in a few hours, and can easily be switched off to allow manipulation of the tested samples.

So now I even know people who have handled exposed chips and run real-world software on them, and the real computers on Jupiter/Mars missions work much longer than the missions require (BTW, the first samples tested on Earth were not reliable).


Good write-up to temper expectations at https://twitter.com/wilson_ricks/status/1602088153577246721

My TLDR (from a layman):

  * The output is greater than the energy *in the lasers*, but the lasers only deliver ~1% of the electrical energy drawn to power them. Need a 100x improvement to break even.
  * Converting the generated heat into electricity would cut the output in half. We need a further 2x improvement here, so it's ~200x to break even end-to-end.
  * The scientific equipment requires immense & expensive maintenance.
  * Plus the $3B facility around the equipment, that theoretically could deliver just 2.5 MW.
So we might be as close as 10-20 years away, as always!
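Multiplying the TLDR's factors out (figures from the thread; the "target gain" uses the reported 2.1 MJ in / 2.5 MJ out):

```python
# How far from electrical break-even is the shot, given the stated gaps?

laser_wall_plug_eff = 0.01   # lasers deliver ~1% of the electricity they draw
heat_to_electric = 0.5       # optimistic steam-cycle conversion
target_gain = 2.5 / 2.1      # fusion yield / laser energy on target

# Target gain needed so electricity out equals electricity in:
required_gain = 1 / (laser_wall_plug_eff * heat_to_electric)
shortfall = required_gain / target_gain

print(f"Required target gain: {required_gain:.0f}x")  # 200x
print(f"Current shortfall:    ~{shortfall:.0f}x")     # ~168x
```

Note this counts only the laser and turbine; target fabrication, cryogenics, and facility overhead would push the real shortfall higher.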


No, not as always. The laser confinement mechanism works, it has been shown; lasers that are more than 20 times as efficient as these NIF lasers are now available, so the improvement needed to scale and "commercialize it," whatever that really means, looks more like 10x than 200x. In the world of fusion, that counts as really good progress. For one thing, perhaps a lot of the research money will move to lasers now.


I mean, you nail it on the head. It's not "congrats on limitless free energy" but more "looks like we might still get value in the future if we keep pouring money into this." Positive indicators at milestones are good. Onward.


> looks like we might still get value in the future if we keep pouring money into this.

Maybe this could also open up more avenues for money.


> so the improvement needed to scale and "commercialize it," whatever that really means looks more like 10x than 200x. In the world of fusion, that counts as really good progress.

Yes, it's good progress, but an order of magnitude is not nothing. Squeezing another order of magnitude of efficiency out of the lasers will be very difficult. It took 30 years or so to go from 1% efficiency to 20%, and the law of diminishing returns applies.


Why does the law of diminishing returns apply? A lot of things aren’t diminishing returns.


Literally everything has diminishing returns because nothing is infinite.

Edit: to clarify, lasers will have some maximum efficiency that is less than 100% and approaching that maximum is subject to diminishing returns.


That is trivially true at the extremes of energy input. If you input an infinite amount of energy you will not get an infinite amount out.

But that’s not what we’re talking about. This is a physical process which is known to be exothermic for the energy ranges we care about.

As another example, raising the temperature of a flammable material 1 degree above room temperature will probably not light it. Ditto with 2 degrees. But eventually, if you raise the temperature high enough, you'll get more energy out than you put in. That's the type of process we're talking about now.


Diminishing returns usually applies if you assume there are no major breakthroughs. Can we assume that there won't be any major breakthroughs in this field?


Diminishing returns describes a trend. A breakthrough describes a single data point that bucks the trend. I'm not sure these are mutually exclusive, as after any breakthrough the diminishing returns trend is reestablished.

I wouldn't bet on no breakthroughs happening in laser efficiency, but more efficient lasers doesn't look like it will be enough to get to net energy given other inefficiencies in the system.


It's still probably about 100x, given efficiency losses all around, even on the highest-efficiency lasers.


ICF works for its purpose - research into thermo-nuclear weapons (fusion bombs).

It has nothing to do with energy generation though, and never has.


https://en.wikipedia.org/wiki/Inertial_confinement_fusion#As...

That's utterly incorrect:

"Fast ignition and similar approaches changed the situation. In this approach gains of 100 are predicted in the first experimental device, HiPER. Given a gain of about 100 and a laser efficiency of about 1%, HiPER produces about the same amount of fusion energy as electrical energy was needed to create it (and thus will require more gain to produce electricity after considering losses). It also appears that an order of magnitude improvement in laser efficiency may be possible through the use of newer designs that replace flash lamps with laser diodes that are tuned to produce most of their energy in a frequency range that is strongly absorbed. Initial experimental devices offer efficiencies of about 10%, and it is suggested that 20% is possible."


This is irrelevant, because each shot also requires a precision-engineered piece of metal called a hohlraum to be destroyed.

With current technology, running an ICF plant would cost literally hundreds of millions of dollars per hour in hohlraums, since a single one costs millions, and you need to shoot several times per minute to produce energy.

That's why ICF is not even close to being a plausible electricity generation technology, so it is only being researched by nuclear weapons research labs like LLNL.


Hohlraums are not expensive because of base materials, but because today we generally produce them as one-offs and the process is incredibly man-hour intensive. The DOE "roundtable" on the announcement today addressed this.

For an actual look at the challenges of ICF, I'd say look here: https://www-pub.iaea.org/MTCD/Publications/PDF/TE_1704_web.p...

And also consider that it might be used in combination with MCF, for example: https://medium.com/fusion-energy-league/the-fundamental-para...


They are one-offs, but even if they were to be mass-produced, they require extraordinary precision. I very much doubt claims that one could be built for the few dollars that each shot is worth in terms of generated electricity.

The reports you quote actually mention the target costs very clearly. The IAEA one talks about needing 500,000 targets per day, and sets a goal of $0.30 per target. At the time it was written, it says that a target cost $1,000, which is probably before NIF found out just how much more stringent the requirements for the shape of the target were (since the numbers I saw last time NIF achieved ignition were closer to hundreds of thousands of dollars per target, though maybe I am misremembering).

It's also worth noting that that report was expecting NIF to achieve the current milestone within 3-6 years, and it actually took 13. So I feel their numbers can well be considered optimistic.


I thought Fast Ignition had been abandoned because it was found it didn't work.

HiPER is also dead, I think.


The NIF is using old laser technology. Current tech can get above 20% efficiency. Sure, that still means more improvement is needed, but 200x is probably an overstatement by an order of magnitude.

> So we might be as close as 10-20 years away, as always!

I don't really get the cynicism here. This is a huge milestone that's been passed. Maybe with this, we actually will be 10-20 years away. Or maybe it's more like 30-40, who knows. But this experiment shows that net-positive energy is actually possible to do with our current understanding and technology; before this, I believe much of the skepticism was based on a belief that it may not actually be possible to get more energy out than put in, at least not without technology that's significantly out of reach.


Anyone have insight into how this new development differs from this article from back in 2014 about the NIF, entitled: "Fusion Leaps Forward: Surpasses Major Break-Even Goal"

https://www.livescience.com/43318-fusion-energy-reaches-mile...


Back then they were comparing to the energy actually absorbed by the fusion fuel. This is indirect drive, the laser hits a metal container first and only some of the energy gets to the fuel pellet.

This time, they're comparing to the total energy in the laser beams.

They're ignoring the inefficiency of the laser devices, but that kinda makes sense because they're using really old, inefficient lasers and much better ones are available now.


> This time, they're comparing to the total energy in the laser beams.

How do you know? Nothing has been published yet; it’s science through press release. In the past, published papers from NIF have been a real wake-up call after absorbing the misleading hype (the papers are more honest than the folks talking to the reporters).


Fair point, I'm just going by the article:

> The fusion reaction at the US government facility produced about 2.5 megajoules of energy, which was about 120 per cent of the 2.1 megajoules of energy in the lasers

I guess we'll see how things develop. But from a quick google, 2.1 megajoules is about what the lasers deliver, unless they've significantly increased their power recently.


Right. Livermore has been working on this since the 1970s, with increasingly powerful lasers. Now, they claim "theoretical breakeven" - slightly more energy came out of the reaction than went into the reaction. But 100x less than went into the lasers, let alone the whole facility. Nor is energy being recovered.

This was never expected to be a power plant technology. It's a research tool, for studying fusion.

"Technical breakeven" is when the plant generates enough energy to run itself. This is at least 100x below that.

"Commercial breakeven" is when it makes money.

How's that Lockheed-Martin fusion thing coming along?[1]

[1] https://lockheedmartin.com/en-us/products/compact-fusion.htm...


I'm getting really sick of the "always 20 years away haha" jab.

Look, it's really simple:

1. This is a very hard and expensive problem.

2. Progress IS being made.

It's not clever or cute to diminish progress on this problem.


I wasn't diminishing the achievement but clarifying its place in the context of how far we are from commercialization. The director of LLNL who announced the breakthrough discovery said she expects we are 3-4 decades away from commercializing it.


Then you could have just said that instead of adding the “as always”.


Helion's tech seems interesting in that they extract the electricity directly, which avoids the costly conversion via steam/turbines etc.


And right now they're building their seventh reactor, for a net electricity attempt in 2024.


It's the only one with a ghost of a chance. Still only a ghost, and the 3He supply problem looms.


They'd make their own 3He by also doing DD fusion. It's properly a DD+D3He concept.


I would like to see a demonstration that the synthesis method would exceed consumption. It should be possible with current tech, if at all.


Well, they can operate on just DD, so they can start from no 3He and make 3He.

This video of a presentation by Helion's Kirtley at Princeton has a slide where the reactivity vs. energy loss is shown for a DD system at beta=1. That system will make 3He directly, and also make tritium by two modes (directly from DD, and by capture of neutrons on 6Li in a blanket.) The net result would be production of 1.5 3He nuclei per DD fusion, on average. It takes a while for some of those 3He to be produced though, as the tritium has to decay (halflife of 12 years.)

https://mediacentral.princeton.edu/media/JPP08December2022_D...
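If I'm reading that slide right, the bookkeeping behind the 1.5 figure goes something like this (the branch ratios and capture fraction are my rough reading, not Helion's published numbers):

```python
# Hypothetical bookkeeping for 3He production per DD fusion (my reading
# of the Helion slide, not official numbers).
# DD fusion has two branches of roughly equal probability:
#   D + D -> 3He + n   (~50%)  -> one 3He directly
#   D + D -> T + p     (~50%)  -> one T, which beta-decays to 3He (~12 y halflife)
# Neutrons from the first branch can be captured on 6Li in a blanket:
#   n + 6Li -> T + 4He          -> another T, eventually another 3He

branch_3he = 0.5   # direct 3He per DD fusion
branch_t   = 0.5   # tritium per DD fusion (decays to 3He)
n_capture  = 0.5   # tritium bred per DD fusion via n + 6Li
                   # (assumes roughly every neutron is captured)

he3_per_dd = branch_3he + branch_t + n_capture
print(he3_per_dd)  # 1.5 3He nuclei per DD fusion, on average (after T decay)
```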


Even if the lasers were currently 100% efficient, the Q still needs to be increased by 2 to 3 orders of magnitude. That's because they're making less than a penny's worth of energy here, and the system cannot be economically feasible with that little energy per expendable target.
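A rough sketch of the "penny's worth" estimate; the wholesale price and thermal conversion efficiency here are my assumptions, not from the comment:

```python
# Back-of-envelope value of one shot's fusion output. The price and
# conversion efficiency are assumed, not taken from the article.
fusion_yield_mj = 2.5    # MJ of fusion energy per shot (thermal)
thermal_to_elec = 0.4    # assumed steam-cycle efficiency
price_per_kwh   = 0.04   # assumed wholesale $/kWh

kwh = fusion_yield_mj * thermal_to_elec / 3.6   # 1 kWh = 3.6 MJ
value_cents = kwh * price_per_kwh * 100
print(f"{value_cents:.1f} cents per shot")      # ~1.1 cents
```

So each shot is worth on the order of a cent of electricity, against an expendable target costing orders of magnitude more.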


The thing that would be surprising is if they discovered something new to do; but this seems like more refinement of what they already know how to do.

Continual refinement may finally get us where we need to be, but it's going to take a long time.


temper


What if my expectations are tamper-proof, can you still temper them? Thanks, edited ;)


Probably 5-10 years if this turns out to be the key unlocking it. If it is, the floodgates will open for funding, public and private, and we'll see a race to build the first reactor. Similar to how the first COVID vaccine was predicted to take 2-3 years and it took 8 months instead because it was a priority.


It took only 8 months because coronaviruses have been around for decades. The COVID-19 strain was new, so the vaccines had to be adjusted to the new strain rather than created from the ground up.


Not so; the mRNA technology used to develop and deliver the vaccine has been in progress for decades. The hardest parts were done before SARS-CoV-2 ever existed, but it's wrong to claim that "the vaccines" needed to be tweaked - they never existed.


For people confused about this, there were prior commercial attempts at coronavirus vaccines, with mixed success. They were not RNA vaccines. The COVID-19 vaccines built on that research (regarding what proteins to target, in particular), but the COVID vaccines that were rolled out were completely novel technology.


To be honest, looking at those numbers, that doesn't look 10-20 years away. We'd need Moore's law style improvement in efficiency, and then to productionize it. So we're really saying 20 years at best for the technology, and then let's look at how quickly we can build nuclear power plants today... uh oh. In the UK, for example, it has taken 12 years to even agree to build a new nuclear plant on a site that already has nuclear plants!


Well... if Nuclear Fusion becomes actually possible in a cost-effective manner, so much for the need to roll out solar and wind-based electricity, which looks very much like a 1st-generation modern green energy technology in retrospect.

I'm not complaining. If we do crack the code on Nuclear Fusion, if I was the government, my next step would be to figure out how to build so many reactors that electricity costs go to basically zero. If you can charge your electric car for pennies, even the most diehard gas-car fans won't be able to resist. Offering a better product attracts far more users than, say, trying to shame people for CO2 usage (more flies with honey instead of vinegar).


> even the most diehard gas-car fans won't be able to resist

They just won't have a choice; if we can provide a real alternative, we can just forbid gas car altogether. Just like we banned CFC to save the ozone when better alternatives were developed.

The main issue is that our electricity grids and production facilities aren't ready yet to sustain a mass shift to electric, so we need to ease in the transition. But the moment they are, there is no reason to delay any further.


> They just won't have a choice; if we can provide a real alternative, we can just forbid gas car altogether. Just like we banned CFC to save the ozone when better alternatives were developed.

Banning gas cars outright, I think, would be a political miscalculation. There is broad mistrust of anything the government does right now in the US (not wholly undeserved), and it is likely to continue getting stronger, so not tainting it with a political ban would be a better solution in my view. Otherwise you risk polarization and failure, because not everyone buys climate change, or banning something because X is determined to be better now. It also would breed widespread resentment from people who aren't ready to switch (because, let me tell you, outside of cities, "reduces climate change" is something nobody cares about as a selling point). Just let electric vehicles naturally become better at everything and let gas cars slowly die naturally. The "invisible hand" will take care of the rest - just like it did with the horse and buggy.


You don't even have to ban it outright; you just ban making new ones (though even the CFC ban wasn't 1000% complete; there's been evidence that some companies were 'faking finding old supplies').

People who "really want to" will keep old ones working and most people will slowly start using the new ones.

After all you can still get a horse-drawn carriage if you want to, and you can drive a Model T, but few people bother.


Even with such a breakthrough, cost-effective fusion would still probably be 50 years away. Why would you assume it to be super cheap right out of the gate?


“The Lawrence Livermore National Laboratory experiment shows that scientists can get more energy out than put in by the laser itself. This is great progress indeed, but still more is needed: first we need to get much more out than is put in, so as to account for losses in generating the laser light etc (although the technology for creating efficient lasers has also leapt forward in recent years). Secondly, the Lawrence Livermore National Laboratory could in principle produce this sort of result about once a day – a fusion power plant would need to do it ten times per second. However, the important takeaway point is that the basic science is now clearly well understood, and this should spur further investment. It is encouraging to see that the private sector is starting to wake up to the possibilities, although still long term, of these important emerging technologies.”

emphasis, etc


Yes, _but_ the problem of generating laser light efficiently has and is being solved for elsewhere. Which is why the NIF didn't focus on, or update their lasers. This is a major problem for semiconductor lithography for example, and receives literally tens of billions in investment every year and one which has lasers that are already 20x more efficient than the ones used by the NIF.

The real question in the experiments here at NIF was about whether inertial confinement fusion would work. This is very promising progress.

Also NIF spends a good portion of its time on weapons research, not fusion power so it's only been a recent focus.


The loss just on the lasers is 100x (i.e. delivered energy is 1% of the input energy). Add in a combined-cycle efficiency of only 50%, and you're looking at needing a 200x improvement to have commercially relevant "net gain".
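For what it's worth, the 200x figure falls straight out of those two numbers (a sketch, using the efficiencies as stated above):

```python
# The 200x figure, spelled out, using the efficiency values as stated.
laser_eff   = 0.01   # ~1% of wall-plug energy reaches the target
thermal_eff = 0.50   # assumed combined-cycle conversion of fusion heat

required_gain = 1 / (laser_eff * thermal_eff)
print(required_gain)   # 200.0 -> target gain for wall-plug break-even
```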


Yes but NIF's lasers date back to the 1990s, and laser technology has improved a lot since then. NIF-class lasers with over 20% efficiency are available now.

https://physicstoday.scitation.org/do/10.1063/pt.6.2.2021102...

Same article mentions that some petawatt lasers can fire more than once per second now.


>Add in a combined cycle effeciency of only 50%

Some reactor designs let you harvest electricity directly from charged ions: https://en.wikipedia.org/wiki/Direct_energy_conversion


Not only that, but the capsules that are used for the experiment are expensive and difficult to produce. And you'd have to be continuously blasting new ones for each burst of energy you want to generate.

Taking those costs into account, being able to use this method to generate power seems really non-optimal.



Last time, they got something like 80% return on the laser energy input, now it's over 100% apparently. And, they had trouble repeating that last record, so people were questioning how meaningful it was if it couldn't be repeated. Now they've been able to repeat it & improve on it.


It’s a real bummer to me that, because of past false hopes, hype around fusion has faded so much that this sort of thing barely registers on HN anymore.


I think that people are waiting to see the real announcement, not a scoop with limited details. Let's see what Granholm announces tomorrow. It's tough to be excited about scoops with limited information and without knowing how robust the accomplishment is.


It's gratifying to me that the hype around fusion has been increasingly replaced by a cold and realistic assessment of its actual level of realism.


Some perspective from cnet:

> But, as with all science, it's good to be cautious and not overhype results yet to be fully analyzed. We have been here before, after all. In 2013, reports swirled the NIF had achieved this exact feat. It wasn't the case.

https://www.cnet.com/science/climate/a-fusion-energy-breakth...

AFAICT, the only thing that's been publicly confirmed is that announcement will be made tomorrow.


First flight: 1903. Moon landing: 1969.

It took 63 years of progress in flight technology, not counting earlier experiments and R&D time.

The first fusion experiment was in 1933, and fusion seems a lot more complex to a layman (me) than spaceflight.

Excited for what's to come


We're still on track for fusion power in 2050. SimCity 2000 nailed it. https://sonatano1.wordpress.com/2014/08/20/retrospective-sim....


This is an interesting video covering several alternative fusion power initiatives being pursued currently: https://www.youtube.com/watch?v=yNP8by6V3RA

The common thread is that they tend to aim directly for an electrical output rather than simply generating energy, and don't necessarily plan to have a self-sustaining reaction.


This would be incredible... very excited for the details in the announcement coming Tuesday.


Even if, for workable viability, Q (Q_? currently 1.2?) must reach values on the order of 50 to 100 once real-world losses and efficiencies are considered, it's absolutely great news!

[https://en.wikipedia.org/wiki/Fusion_energy_gain_factor#Engi...]


How does something like this produce power? With tokamaks etc., it seems like they draw out some of the heat (somehow), but how does this work with a pellet that has to be hit by a laser? I'm confused about what the working fluid is, if you will. Is there some kind of plasma chamber that the laser has to go through, that heat is then extracted from?


The energy output is 80% neutron radiation. Surround the reaction chamber with a mix of molten lead (for neutron multiplication) and lithium (for tritium breeding) and run some cooling pipes through it.


Came to HN to post this!! Potentially 2.5 megajoules of output from 2.1 of input.


...where 2.1 "input" is generated from >400 input.


Here are the numbers from the current live discussion :

https://www.youtube.com/c/EnergyGov/live

300 MJ in at the wall, 2 MJ produced at the lasers (using 1980s laser tech), 3 MJ out from the reaction
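Spelled out, those numbers give two very different gain figures (a quick sketch using the 300/2/3 values as quoted):

```python
# Gain figures implied by the numbers quoted from the press conference.
wall_energy_mj  = 300   # drawn from the grid to fire the (1980s) lasers
laser_energy_mj = 2     # delivered to the target by the beams
fusion_out_mj   = 3     # produced by the fusion reaction

q_plasma    = fusion_out_mj / laser_energy_mj   # "scientific" gain
q_wall_plug = fusion_out_mj / wall_energy_mj    # whole-facility gain

print(q_plasma)     # 1.5  -> ignition: more out than the beams put in
print(q_wall_plug)  # 0.01 -> still only ~1% of wall-plug energy
```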


As they said in the press conference, the lasers weren’t designed with efficiency in mind, because it is not the goal of the experiment.


Usual caveat about all fusion "got more energy out than we put in" stories: https://backreaction.blogspot.com/2021/10/how-close-is-nucle...

From a quick skimming it seems only one of the experts quoted here even mentions that (Tony Roulstone).

(Update: i wrote this comment in response to another story and the comment got moved here, so it lost a bit of context https://news.ycombinator.com/item?id=33958678&ref=upstract.c... - the press release indeed does mention this caveat, but many news stories missed it)


And they mention this right in the press release. Quote:

“The Lawrence Livermore National Laboratory experiment shows that scientists can get more energy out than put in by the laser itself. This is great progress indeed, but still more is needed: first we need to get much more out than is put in, so as to account for losses in generating the laser light etc (although the technology for creating efficient lasers has also leapt forward in recent years). Secondly, the Lawrence Livermore National Laboratory could in principle produce this sort of result about once a day – a fusion power plant would need to do it ten times per second. However, the important takeaway point is that the basic science is now clearly well understood, and this should spur further investment. It is encouraging to see that the private sector is starting to wake up to the possibilities, although still long term, of these important emerging technologies.”

While this spins it in an optimistic way, the challenges to make this work are significant. The laser is quite inefficient, so the gain must be much, much larger before you have net energy gain. And scaling it up to implode a capsule tens of times a second rather than a few times a day means firing on the order of 100,000 times more frequently than today. Thus this is a long way from commercial production.


The NIF uses lasers produced in the '90s because their core mission isn't to make lasers better. We already have lasers which are 20x more efficient, and hitting a pellet 10 times per second is a trivial task; those lasers can fire at 1 kHz or better. The EUV light sources for semiconductor lithography do this tens of thousands of times a second.

The goal of the research being done at NIF is to understand inertial confinement fusion. "Solving" these other problems isn't as important, other folks are solving these all day long for commercial industries already.


"That’s because they had to use 500 MJ of energy into the lasers to deliver 1.8 MJ to the target – so even though they got 2.5 MJ out, it’s still far less than the energy they needed for the lasers in the first place. In other words, the energy output (largely heat energy) was still only 0.5% of the input."


Partly that's because they use laser tech from the 1990s, with less than 1% efficiency. Now we have NIF-class lasers with over 20% efficiency.

https://physicstoday.scitation.org/do/10.1063/pt.6.2.2021102...


They'd still be getting only ~1/4 of power input with a 20% efficiency laser.
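A quick sanity check on that ~1/4 figure, assuming a hypothetical 20% efficient laser delivering the same 2.1 MJ to the target:

```python
# Rough check of the "~1/4" figure with a modern 20%-efficient laser
# (the 20% figure is the assumption from the comments above).
delivered_mj  = 2.1    # energy delivered to the target
laser_eff     = 0.20   # assumed modern laser wall-plug efficiency
fusion_out_mj = 2.5    # reported fusion yield

wall_mj = delivered_mj / laser_eff   # 10.5 MJ drawn from the wall
ratio   = fusion_out_mj / wall_mj
print(round(ratio, 2))               # ~0.24, i.e. about 1/4
```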


Yes, but the overall "we get more power out of the building than we put in" isn't the goal. They are trying to drive the Q factor of the reaction itself up. If they get that to > 5x what the laser strike delivers (a very real possibility), it's likely that trying to make a building with a net positive Q makes sense.

That building would use modern lasers, modern supercapacitors, etc. to significantly change the "other" parts of the equation.


This is like making a program work on your laptop, but realizing that scaling up to production with 1000 laptops would be too costly, so you use actual servers. But you are in no rush to code directly on a server machine.


Converting the heat energy to electricity loses an additional 50-70%.


I don't see that scaling anytime soon, still more than two orders of magnitude away. But never say never.


If they replaced the lasers in this building from the '90s with a modern light source, it would immediately gain two orders of magnitude. Research like this needs to focus on solving and experimenting with one problem (in this case, the physics of inertial confinement fusion). They are not _trying_ to build something which gets "net power out of the building", so don't assume your net-in/net-out ratios are representative of what a plant targeting that would achieve.

It's quite easy to see that replacing the lasers, the capacitors, etc. with more modern technology would have an immediate effect. But it doesn't matter until doing the reaction at all makes sense. That's what they are focusing on.


The exact numbers depend on the form of fusion in question, but fusion does have some several places where it has quite substantial x^n growth possibilities, where "n" is definitely greater than one and can be greater than two at times, sometimes even substantially so. This means that there is some real, concrete hope for improvement in a way that, say, solar could never improve more than 4-5x where it is now because the absolutely best it could ever hope for is 100% efficiency. At the core, this is because as you get the plasma hotter and more confined, the rate of fusion goes up very quickly, much much beyond linear increases.
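As an illustration only (the exponent here is a toy assumption, not a measured value), this is what super-linear scaling buys you:

```python
# Illustrative only: why fusion output can scale super-linearly with
# input. Fusion power density goes roughly as n^2 * <sigma*v>(T), and
# D-T reactivity rises steeply with temperature in the few-keV range.
# The exponent below is a toy assumption, not a measured value.
def fusion_power(density, temperature_kev, k=1.0, exponent=3.5):
    # toy model: reactivity ~ T^exponent in the steep part of the curve
    return k * density**2 * temperature_kev**exponent

p1 = fusion_power(1.0, 5.0)
p2 = fusion_power(1.0, 10.0)   # double the temperature...
print(p2 / p1)                 # ...and get ~11x the power in this toy model
```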


This is laser based fusion, which is super cool, but it might be a stretch to expect 200x more efficient lasers. Still maybe there's other things you could do, like make a bigger fusion reaction. Hydrogen bombs do it, so maybe.


The lasers they use today are 20x less efficient than state of the art. The capacitors are also massively less efficient. So they only "need" to drive the Q factor of the reaction up by ~5x to be positioned to build something with a net energy gain.

Because of the physics of fusion (or ICF) returns on power are non linear. It's very much possible research here results in a path to a "net gain facility".


I wasn't aware actually. I was under the mistaken impression that their progress has only been possible because they were using state of the art lasers. This could actually be possible then, and far sooner than magnetic containment fusion.


So... Q-plasma is above 1 for the first time, which is a huge deal.

Q-total is still below 1, but some of that can be improved through already-known laser efficiency advancements, and also by pushing Q-plasma higher.

I think pushing Q-plasma above 1 is the big gate though, isn't it? I mean, partly psychologically. Showing that it's actually technically possible.


It's not really a big deal, since laser fusion needs Q of 500 to 1000 to actually make sense.


> all fusion "got more energy out than we put in"

I'm curious - given that this is the first time we have ever done this (even with the constrained definition as discussed in this article), how there can be a '"usual caveat" about all of the "got more energy out than we put in" stories'?

AFAICT, this is the first such "story" to have ever happened artificially in history.


Because people like to shit on fusion, sometimes understandably so, after decades of over-promising and under-delivering. It's annoying and tiring, but there we have it.

Yes, it's true that in this case we didn't actually "get more energy out than we put in" when considering the full closed system, but the point of this research was to see if they could get more energy out of the reaction itself than was put into it by the lasers themselves. Presumably the next step is to see how far they can push this, still without bothering to think about the energy needed to power the lasers themselves, because, again, that is not the purpose of this research. There are other people working on making lasers more efficient, and the overall project will benefit from that research (and so will the NIF, if they decide it's worthwhile to upgrade their 90s-era lasers to something modern).

I think a lot of people here are having knee-jerk reactions and didn't read the article where they very clearly explain the caveat and what the researchers actually did.


To be fair to the original commentator, their comment was moved from a different article where it was not so clearly explained.


Because all the interest of these stories is in fusion as a source of energy, and there's a long history of declaring we're near to break-even by leaving the most energy intensive part of the apparatus out of the equation.

With no disrespect to the researchers in this experiment, it's not like we're surprised that fusion works or that a pellet can generate more power than is put in.


The caveat still applied when experiments reported energy gains below 1.0.


Could someone break down the costs of realistic fusion for me like I am 12 please?

For example, for fission, my 12 year old understanding is: Stack uranium plates until the reaction is self-sustaining, boil water to spin turbine, if reaction gets too fast, cover it with lead / cool it with water. Circulated water is slightly radioactive. Main costs are keeping reaction container / need power to circulate water cooling, disposing of spent fuel is a problem. Power output is 100s or 1000s times more effective than coal / oil once running. In addition to meltdown risk, public opinion is concerned about radioactive cooling water near their community.

What's the same tldr for fusion? (and feel free to correct my tldr)


Tiny H-bomb except pure fusion, and instead of a fission bomb as the trigger, you have huge lasers. You’d produce energy the same way, with heat being captured by some sort of spherical shield around the tiny bomb (which could also be breeding some of the fusion fuel out of lithium) and used to produce steam to run turbines.

This is the first time that the laser’s photon energy was exceeded by the energy produced by fusion. But this machine isn’t optimized as a power plant, just to demonstrate fusion (mostly to improve modeling of H-bombs, actually). The shots take hours to do, the tiny bombs are currently expensive to make, the chamber for the tiny bombs isn’t designed to capture heat, breed fuel, or even withstand damage from higher yield fusion. Another machine would be needed to demonstrate like 10-100 tiny bombs per second, and the efficiency (and repetition rate) of the lasers would need to be higher and the energy gain also needs to be much higher (but if they got “ignition” where the fusion heat helps sustain the reaction, this may be doable). And need to find a way to make these tiny bombs cheaper.


Realistic fusion (with the best understood technology): build powerful magnets around a donut-shaped chamber, which allows containing a plasma composed of deuterium and tritium (both hydrogen isotopes), which is then heated by external sources. Reach very high temperatures such that fusion reactions occur frequently. Some of this energy stays inside the plasma, and some of it escapes in the form of neutrons. Capture these energetic neutrons in a blanket around the chamber, creating fuel (tritium) and heating water pipes that then drive a normal steam turbine. Tritium is radioactive (but has a fairly short half-life, about 12 years; just wait a couple of decades), and the chamber may be slightly radioactive after decades of neutron bombardment. There are no problems of long-term radioactive waste, and the reactor can't run away in a chain reaction, so no Fukushima or Chernobyl.

I need to explain what Q is in the context of fusion. Basically, you heat the plasma with some energy (Energy In), and the fusion reactions produces some energy (Energy out). Q is basically the ratio (Energy out)/(Energy In). When Q is bigger than 1, we call it break-even. However, (Energy In) is not the actual cost of energy you need to run the whole facility, it is only the Energy that reaches the plasma. The same goes for (Energy out): this energy cannot be captured 100% efficiently. Some of it will heat the plasma itself, some of it will escape but the conversion back to electricity is not 100% efficient.

So in a sense, Q > 1, aka break-even, does not mean commercial fusion; it is only a kind of psychological barrier to cross (and this is what the NIF announced; still a major breakthrough). We need at least (Total Electrical Energy Out)/(Total Electrical Energy In) > 1 to achieve commercial fusion. But physicists consider the rest engineering problems, not physics problems. And great news: there is no theoretical limit on how big Q can be; for example, the sun has a Q of infinity, as there is no required energy input. Current estimates put the Q needed for commercial fusion at 30-40 at least (again: there is no physical limit to achieving that, only engineering difficulties).
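To make the distinction concrete, here's a sketch with illustrative efficiency numbers (my assumptions for the example, not measured values):

```python
# Sketch of the difference between plasma Q and whole-plant gain,
# using illustrative efficiency numbers (assumptions for this example).
def plant_gain(q_plasma, heating_eff, capture_eff):
    """Electrical energy out per electrical energy in.

    heating_eff: fraction of wall-plug power that reaches the plasma
    capture_eff: fraction of fusion output converted back to electricity
    """
    return q_plasma * heating_eff * capture_eff

# Even a respectable plasma Q can leave the plant barely above break-even:
print(round(plant_gain(q_plasma=10, heating_eff=0.3, capture_eff=0.4), 2))  # 1.2
# which is why estimates for commercial fusion put Q at 30-40 or more:
print(round(plant_gain(q_plasma=35, heating_eff=0.3, capture_eff=0.4), 2))  # 4.2
```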

Main costs are: difficult to define, because we haven't commercialized a reactor yet. I would say, for now, everything around it is expensive (magnets, the blanket, the fuel (tritium)). However, once we have sufficiently understood the optimal parameters on how to produce net gain energy, there is no reason why the design of the reactor can't then be simplified to be mass-produced.

Note: the technology used by the NIF is very different from what I described for a realistic fusion device: what I described is called magnetic confinement, and what the NIF did is called inertial confinement.


Thank you, "Current estimates put Q at least 30-40 to achieve commercial fusion (again: there is no physical limit to achieve that, only engineering difficulties)" is exactly what I was looking for.


One thing to consider: Even if you prefer solar, you still need to initially make those solar panels and that is an energy intensive process.

I think we're still probably 20 years away from commercialization of this, but I still think this is a very big deal.


> you still need to initially make those solar panels

Can't you use energy produced from existing solar panels to create more of them?


Yes, Solar Breeder Factories, I believe they're called.


I hope I'm wrong, but this seems like a lot of other "firsts". I'm guessing the total (and I mean -total-, lasers typically aren't that efficient) energy put into this will be much greater than the output.


At least according to the TFA, it seems that the breakthrough is that they got 2.5 mJ out vs. the 2.1 mJ that was used to power the laser.


MJ, not mJ. 2.5mJ is roughly the energy of a single keyboard keypress. 2.5MJ is over half a kilo of TNT.

Fun fact that Wolfram alpha just informed me of: a phone uses between 10 and 20 MJ a year: multiple kilos of TNT. 4000mAh * 3.7V * 365: yep, it's about right.
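A quick sanity check of both figures (assuming TNT at the standard 4.184 MJ/kg and one full phone charge per day):

```python
# "2.5 MJ is over half a kilo of TNT"
TNT_MJ_PER_KG = 4.184                  # standard TNT energy equivalent
tnt_kg = 2.5 / TNT_MJ_PER_KG
print(round(tnt_kg, 2))                # ~0.6 kg of TNT

# Phone fun fact: 4000 mAh at 3.7 V, one full charge per day
charge_mj = 4.0 * 3.7 * 3600 / 1e6     # Ah * V = Wh, then Wh -> MJ
yearly_mj = charge_mj * 365
print(round(yearly_mj, 1))             # ~19.4 MJ/year, within the 10-20 MJ range
```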


Oh, oops that's a mistake, thanks for catching that.

Also, interesting fun fact indeed.


What I found impressive was that the failure, or rather the reason it couldn't keep going, wasn't due to the fusion process; it was due to overheated magnets.

Generating 59 MJ (~11 MW) over five seconds was impressive too, although I didn't see the amount of energy that was input.

I'm also curious how they are going to heat water for the steam generators. Water can't be heated to 150 million degrees C; something has to moderate it down to X degrees. That seems incredibly wasteful.
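Average power for that shot is just energy over time; a quick check of the quoted ~11 MW:

```python
energy_mj = 59                 # energy produced over the shot
seconds = 5                    # shot duration
avg_mw = energy_mj / seconds   # 1 MJ/s == 1 MW
print(avg_mw)                  # 11.8 MW average, consistent with the ~11 MW quoted
```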


With the D-He3 process, most of the energy is released as neutrons (about two orders of magnitude more than in fission reactors) and as bremsstrahlung X-rays.

Each will be converted to heat in the walls and in the shielding blanket, which are cooled by water.


No, D-3He produces rather little energy in neutrons, especially if the D ions can be kept less energetic than the 3He ions to suppress the DD reaction. The D+3He reaction itself produces 4He and a proton.

NIF uses DT, not D3He, btw.


> The D+3He reaction itself produces 4He and a proton

Yes, I made a mistake. Thank you.

But this doesn't make much of a difference compared to the neutrons from the D+T reaction; charged particles are still not very convenient for extracting energy from their motion.

Much better would be a pure Boron+Carbon reaction, from which the energy would come out just as photons; for example, it could be used to feed a laser/maser and then be converted to electricity by photovoltaics or high-frequency power diodes.


Any opinions on the book mentioned: Star Builders by Arthur Turrel

https://aeturrell.com/


They'll get greater efficiencies at scale with a hydrogen-helium target around 1.989 × 10^30 kg. [Nice science joke for those who get it.]


While I appreciate all the effort in nuclear fusion and do think we should continue to invest a little of each year's global R&D budget, it seems these reactors (e.g. ITER and this one) still require tritium, which is rather hard to come by efficiently.

Which means normal nuclear reactors will be needed to make it, limiting the economic viability of the dependent fusion reactor for a long, long time.


> tritium which is rather hard to come by efficiently

I'm not by any means well informed on the matter, but isn't the lunar surface covered in tritium deposits?

It might make sense to mine the moon sooner than later. Once we have the necessary equipment and resources there, the delta-v for getting the mined product to Earth isn't nearly as substantial.

Building lunar mining tech is likely to unlock all sorts of advances for the human race.


You're mixing up tritium (hydrogen-3) and helium-3.


Normal nuclear reactors are a good thing too, and they alone are enough to solve all of humanity’s energy problems (though we should pursue fusion power too, of course). See Integral Fast Reactor.


Only if you use the notoriously dangerous breeder reactors – otherwise there isn't enough fuel.


Breeder reactors are not "notoriously dangerous", they are just a little too expensive to justify their construction when the uranium is cheap (like it is now). Also, there are proliferation risks. However, these are not engineering problems nor scientific problems, breeder reactors are production-ready and safe.


I've never really gotten the "proliferation risk" in the context of US power production (or China, Russia, or even France, for that matter). We're talking about existing nuclear powers, they already have the capacity to make nuclear weapons. If they wanted more they would make more, for the simple reason that having nuclear weapons is table stakes for being a serious player in geopolitics.


Fast reactors present the possibility of prompt fast criticality in a serious accident. This could be worse than Chernobyl.


Tritium will be bred in the reactor that uses it. Exactly how is a problem which will be solved further down the development path but there’s little question about the viability of that.


I believe the tritium issue is addressed through the inclusion of lithium in the reactor's inner blanket [1]. Something about the neutron interaction with the lithium results in some non-trivial production of tritium which is then freed into the reactor. tl;dr - they've thought of that.

[1] https://en.wikipedia.org/wiki/Breeding_blanket


My understanding is that this is proposed, but has not yet been tested. In fact, one of the goals of ITER is to test various breeder blanket designs.


Isn't this the case with nearly every aspect of "proposed" fusion reactors? Just because it's proposed or "not yet tested on a commercial fusion reactor" does not necessarily mean that the mechanism is not well understood.


I think if it were so well understood, ITER wouldn't be testing over 100 different breeder blanket designs. I've seen breeder blanket design described as one of the biggest challenges with fusion today.


I would expect that it is more a matter of selecting the best/optimized design rather than demonstrating the fundamental viability of tritium breeding.


2.1 megajoules of energy in lasers to make 2.5 megajoules of heat energy.

If you turned that heat energy into electricity (our ultimate goal here) you'd have:

(2.5 megajoules produced × 50% conversion efficiency to electricity) − 2.1 megajoules input = negative 0.85 megajoules generated

This is still cool of course, but we're still way off from making this anywhere near feasible.
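The arithmetic above, as a sketch (the 50% thermal-to-electric conversion is the commenter's assumption, and this still ignores the wall-plug energy needed to fire the lasers, which is far larger than 2.1 MJ):

```python
laser_in_mj = 2.1       # laser energy delivered to the target
fusion_out_mj = 2.5     # fusion energy released
conversion = 0.5        # assumed heat -> electricity efficiency
net_mj = fusion_out_mj * conversion - laser_in_mj
print(round(net_mj, 2)) # -0.85 MJ: a net loss if run as a power plant
```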


This is science lab, not a power plant. The point is to create and prove new technologies.

They could easily buy a newer, more efficient laser for example. That would increase the overall efficiency, but would ultimately be a waste of money. It wouldn't change the science at all, and the point is the science.


Welcome to armchair fusion engineering discussions of HN. Remember to wear your hard hat safety helmet.


Asking as a layman, are there any hybrid solutions between inertial and magnetic, or are they mutually exclusive? I'm imagining using magnetic field for macro-control and laser for micro-adjustment. Kind of like SOC designs that have separate cores optimized for different workloads.


NIF recently started experimenting with adding a magnetic field: https://lasers.llnl.gov/news/magnetized-targets-boost-nif-im...

I don't think they used that for this recent event, so if it works out that's potentially a significant improvement.


If this were announced on a science website, I would be more inclined to believe it. But because it's on the FT, it smells like when Steorn announced their "free energy breakthrough" in the Economist - i.e. yet another announcement made not on a science website but in a publication for investors.


There is a difference between a breakthrough and a milestone. This is a milestone.


I agree!

- Breakthrough: a sudden, dramatic, and important discovery or development

- Milestone: a significant point in development

This is clearly neither 'sudden' nor 'dramatic' and should NOT be celebrated or acknowledged as a 'breakthrough'. This level of journalistic malpractice is worth noting, and I have contacted the author on Twitter to voice my disapproval.


just like this time in 2013, https://www.science.org/content/article/fusion-breakthrough-..., and this one in 2021 https://www.sciencealert.com/for-the-first-time-a-fusion-rea...

Which definition of breakeven are they using this time? https://www.youtube.com/watch?v=JurplDfPi3U&t


Anyone remembers Lockheed Martin container-sized fusion reactors announced couple (edit: 8) years ago?

https://news.ycombinator.com/item?id=8458339


They discovered that it actually had a power density 100x lower than what they had said, if it could even work at all. Last I heard the group there was disbanded in 2019.


Mandatory video by Sabine Hossenfelder:

https://www.youtube.com/watch?v=LJ4W1g-6JiY

So they probably are talking again about Q_plasma, not Q_total.


And a mandatory response that highlights how her ignorance only hurts general society's understanding of the shape of the problem.

https://youtu.be/KtqC8W0_Ups


The response does not disagree with Hossenfelder. It just points out that Q_plasma is a useful metric to track progress for fusion research and science projects. However, for building a useful fusion power plant, only Q_total is relevant in the end. This has been misrepresented very often, and Hossenfelder's criticism is absolutely justified.


Even if fusion ends up producing more power than consuming in the real world, it still has to compete on cost. People too enthusiastic about fusion tend to ignore that it might not actually be a cost effective source of power.

Solar panels are cheap and batteries are easier to build and there are lots of ways of making them.


Unfortunately, energy storage is still an unsolved problem. Research on batteries may get us there soon, but today they aren't feasible. It's very much worth putting effort into both approaches. IMO the best outcome is a wide variety of clean energy sources and storage solutions, so the best solution can be chosen for a given geographical/political/etc situation.


Solar and batteries are already cheaper than fossil fuels in most markets. Nuclear isn’t competing with renewables, it’s competing against batteries and almost free renewables that charge them.

Nuclear is still possibly a great fit for niche locales where renewables aren’t feasible at all. Not a nuclear hater by any means (we need every innovation we can get), just show your math.

https://www.science.org/doi/10.1126/science.365.6449.108


Are solar + batteries feasible to heat every house in Minnesota with electricity when it's below -20F (-30C) for a week, we have <9 hours of daylight per day, and failing power literally means death? I genuinely don't know. Like I said, having a variety of solutions is the best outcome so we can choose the right one & have backups.

> just show your math.

I admit I can't. It's mostly gut-feeling from various science news sources I keep up with (e.g. Ars Technica; Skeptic's Guide to the Universe).


Only if you limit yourself to using solar generated within Minnesota's state borders.

Solar, wind, HVDC transmission lines, and short-term battery storage get us most of the way there, and all of it is in the process of being built out now. Medium-term storage is still up in the air (flow batteries? compressed air?). Long-term storage looks like hydrogen or natural gas with carbon capture. All these things seem more achievable than fusion in the next few decades.


> if you limit yourself to using solar generated with Minnesota's state borders

I live in a cold state. The idea of relying on out-of-state power, regulated and controlled by people with zero accountability to you, for life-and-death energy is a tough sell.


Bad news then. You most assuredly rely on natural gas from Texas traveling through a long underground pipeline to heat your homes and businesses. Relying on solar electricity from Texas or Arizona traveling through a long wire isn't going to change the status quo much.


> most assuredly rely on natural gas from Texas traveling through a long underground pipeline to heat your homes and businesses

Last I checked, we mine our own coal, pump our own oil and put up our own wind farms [1]. Minnesota, for what it’s worth, runs on renewables, coal and nukes [2]. The fifth of natural gas it does use comes from Canada, the Dakotas and Iowa.

These cold-state energy security concerns are a big part of the political puzzle that gets missed in the national discourse.

[1] https://www.wsgs.wyo.gov/products/wsgs-2012-electricalgenera...

[2] https://www.eia.gov/state/analysis.php?sid=MN


In northern states almost all residential energy use is heating. The amount of electricity used is minimal, so even modest amounts of electricity generation can meet the need. Wyoming is the only northern state that has natural gas in notable amounts; all other states import a lot of their energy (especially heating) needs.

If most states stopped importing energy they would have to go back to wood and coal-fired stoves. That would be a huge quality of life reduction in terms of convenience and home air quality.


> almost all residential energy use is heating. The amount of electricity used is minimal

Resistive heating.

> most states stopped importing energy they would have to go back to wood and coal-fired stoves

Most states don’t have high-baseload, low-latency life-or-death energy requirements. Those that do have the options I outlined above.


Heat pumps should be paired with rooftop solar and batteries whenever possible for resiliency. I admit the use of natural gas will decline in my lifetime, but probably won’t be fully deprecated.


The state you live in has one of the highest potentials for wind power in the country, easily backed by transmission, batteries, and as a last resort, natural gas.

High level, the energy transition isn't simply a fossil->renewables story, but also a centralization->highly decentralized story.


Totally agree, though I don’t know how wind performs in extended and deep subzero / heavy snow conditions. Hydropower is the traditional baseload for the Midwest, but it’s tough to square the destruction to natural beauty that entails in comparison with a remote nuclear set-up.

EDIT: It seems not too badly [1].

[1] https://empoweringmichigan.com/how-do-wind-turbines-work-in-...


I grew up in Minnesota, but left before there was much wind farming in the SW part of the state. (I've been told, SW Minnesota is one of the best places for wind farming.) Wind farming does work well in SW Minnesota.

However, I also remember a news story about some used wind turbines relocated from California that had trouble due to inadequate heaters to keep the lubricant from getting too viscous.


What does the geothermal story look like? I expect it's expensive to first set up, but after that, maybe it's cost-effective and reliable? Asking because I genuinely don't know, but haven't seen it mentioned in this subthread.



Note that there are two common uses of "geothermal". One is geothermal power generation, but there's also an unfortunate use of the term to describe ground-loop heat pumps and similar technologies. Ground-loop heat exchangers are a godsend for heat pump efficiency in the deep of the Minnesota winter, but that's very different from a source of heat that's practically exploitable for electricity generation.

From the context, I think your link is relevant to the GP's question.

However, if you search for "geothermal Minnesota", you'll get hits primarily related to ground-loop heat pumps.

Note that in the Minneapolis area, the ground will freeze down about 3 feet in winter, so you need to bury your ground loop deeper than that. The frost line is even deeper up in the Duluth area. (Also, you need to use an air compressor to purge the vast majority of water out of any in-ground sprinkler systems before the ground freezes.)


To keep warm, I'm estimating 2,628 kWh per month for a home with a family of 3. In our magical Minnesota where everyone lives in houses with 3 people and only electric heat pumps, we'd have 1,900,000 households. This means we'd need 4,993,200,000 kWh in the coldest month (4.993 TWh).

500,000 kW of panels would produce ~33 GWh in the worst month (January). So we'd need 151 times that many to have a good chance of doing this with purely solar. That'd mean 75,500,000 kW of solar panels. Assuming we could install these for $1.50/W, that'd cost $113,250,000,000, and there's still a chance that we'd freeze people to death.

To mitigate that risk, we'd want to add ~500 GWh of batteries (just guessing as to needed capacity here). At a price of ~$150/kWh, we'd be looking at ~$75,000,000,000 in energy storage prices.

Feel free to check my math, as I did that pretty quickly. The figures are absurdly high due to scaling for worst-case scenarios. Summer months would correlate with lower demand and more than double the supply.

Sensibly speaking, no one would try to do this. It's like building an off-grid home: you can get 90% of the way there and add a generator, or you can spend 10x more to be truly off-grid. Almost everyone chooses the former. Maybe even 80%. Solar is great and very cost effective, but the returns diminish the deeper one goes.
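Checking the estimate above in code (every input below is the comment's assumed figure, not measured data):

```python
# Back-of-envelope: all-solar heating for a hypothetical all-heat-pump Minnesota.
kwh_per_home_month = 2_628                 # assumed usage per 3-person home
homes = 1_900_000
demand_kwh = kwh_per_home_month * homes    # coldest-month demand

jan_gwh_per_block = 33                     # assumed January output of a 500,000 kW block
blocks = demand_kwh / (jan_gwh_per_block * 1e6)   # 1 GWh = 1e6 kWh
panel_kw = blocks * 500_000

panel_cost = panel_kw * 1_000 * 1.50       # $1.50 per installed watt
battery_cost = 500e6 * 150                 # 500 GWh of storage at $150/kWh

print(round(demand_kwh / 1e9, 3))          # ~4.993 TWh
print(round(blocks))                       # ~151 blocks of 500,000 kW
print(round(panel_cost / 1e9))             # ~$113B in panels
print(round(battery_cost / 1e9))           # ~$75B in batteries
```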


Nice. I just looked up last February's bill for my ~1700sqft detached SFH in Saint Paul. It was apparently 6.8 therms/day (12 deg F average temp for the month). That maths out to about 5916 kWh for the coldest month (6.8 therms * 29 kwh/therm * 30 days), or a little more than double your estimate. March was 5.9 therms/day and Jan was 5.4 therms/day. So I think your costs are on the conservative side of things... or possibly my home is very inefficient :)

E: Ah, it occurs to me that you're using electric heat pumps, which are probably much more efficient than my NG boiler.
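For reference, the therm conversion used above (1 therm is ~29.3 kWh; the comment rounds to 29):

```python
therms_per_day = 6.8
kwh_per_therm = 29          # rounded; 1 therm is ~29.3 kWh
monthly_kwh = therms_per_day * kwh_per_therm * 30
print(round(monthly_kwh))   # 5916 kWh for the coldest month
```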


Yes, I pulled the estimate for really efficient heat pumps. To convert to all electric heat like that estimate, we'd have to replace a lot of gas heat with electric. Might as well go for the most efficient thing.

Compared to the nearly $200B in infra investment that I was estimating, that looks easy, lol.


I realize this isn't relevant for a discussion about future investment, but the current "value" of the whole energy infrastructure for a state is probably in the hundreds of billions of dollars, right? It's been built out over decades, of course, so the actual costs per year are much lower.


That's definitely possible. Going based upon the output of something like Catawba, that looks like ~3 nuclear plants. I bet that could be done for less than 100B, though I'd just be guessing. I also don't know anything about operations costs for that.

Also, I estimated solar at $1.5/watt. That's probably at least 50% too high.


Why do we need to cover the worst case with 100% renewables?

The goal is to reduce emissions so it would be great even if we can just stop burning coal in the summer.


I think because it's the learned defensive reaction. What ends up happening is that you have someone who really hates fossil fuels and is more than willing to back policies that require a quality-of-life drop or a massive cost shift onto individuals to achieve 100% renewables. So whenever it comes up, anything positive you say about renewables has to come with the explicit caveat that it's not yet a 1-1 replacement.

It's one of those issues the overwhelming majority of people are on the same page about what we should do but at the ends you have "my livelihood depends on coal" on one end and "my life is insulated against the downsides of full-renewables so I'm privileged enough to have out of touch opinions" on the other and that's who shows up in comment sections.


We don't need to do that. But the media focuses on things like that and turns everything into some sort of weird argument that renewables are literally going to freeze gramma to death. It's overwhelmingly about emotion.

It's the same as what we see with EVs, tbh. Oh noes, what if you get caught in a snowstorm!? Imagine if 80% of the cars were EVs and they got stuck and there were... no chargers! Picture yourself freezing to death because of "those people".

Real world performance and goals are not correlated well with media hyperbole.


This site has changed a lot in the past year. Its been strange to watch.


Eventually we have to get to zero net carbon emissions. But the worst case is just to create carbon-based fuels from CO2 extracted from the atmosphere and use them in places/for uses which cannot be covered by renewable electricity directly (the far north, airplanes, ...)


We can look at how solar/wind/storage compete with putative fusion. Fusion is a baseload source, so let's see how they would do to provide "synthetic baseload".

https://model.energy/

Selecting the state of Minnesota, 2011 weather data, and 2030 cost assumptions, this would be about 70 Euro/MWh. The cost optimized solution would involve 222 hours of hydrogen storage, 5 hours of battery storage, 4.2x peak power of solar and 2.4x peak power of wind.


In central California with ideal conditions, one day’s worth of storage roughly doubles the price of a solar system that is correctly sized for net zero production in November (assuming a wood stove is supplementing a heat pump).

I don’t think storage will be feasible in places like Minnesota. The following makes far more economic sense:

- Double solar / wind production by buying 2x more panels vs. “normal” states.

- Go all electric (heat pump / induction) for appliances and vehicles.

- Buy 8-24h worth of house batteries.

- Use a fossil fuel generator to top off batteries during outages (this more than doubles the generator’s end to end efficiency)

- Sell excess electricity to the grid, where it is used for subsidized carbon capture.

This should be completely resilient against storms and power outages, and extremely carbon negative. It would cost about 2x as much as best case renewables.


I think UMN did a study with 4 hour storage plus solar on the grid a few years back.

https://energytransition.umn.edu/modernizing-minnesotas-grid...


Thanks, this was informative. It wasn't clear to me, but I think the study does not account for switching heating from burning NG in the dwelling to electricity. I don't have numbers, but I'm pretty sure that's going to introduce an enormous load on the system, and is my main source of skepticism for wind/solar/storage as a solution for all electricity generation in places like Minnesota.


I honestly wonder if large scale population of the northern areas is feasible without carbon fuels. Historically chopped wood was used to heat northern homes and camps, later coal and oil and I guess now to some extent electricity, but as you say, renewable energy doesn't apply there. If places like Minnesota are a net negative for green/renewable energy, their costs may be much higher to offset generation in more favorable climates.


Cold can be mitigated a lot by enhanced R-value insulation in a single application. Northern states have higher levels of insulation.

https://www.energystar.gov/campaign/seal_insulate/identify_p...

I don't really see a hot/cold stratification in this chart-

https://www.statista.com/chart/12098/the-us-states-with-the-...

And even then, the difference in costs seems quite small. Alaska is $332 and Georgia is $310.


Not all carbon fuels are net carbon emitters - biofuel pulls down carbon from the atmosphere when it's created, so it's considered carbon neutral.

I think it's highly likely we'll be burning a lot of algae fuel in the coming decades in situations where the energy density of carbon fuels is necessary.


The birds fly south for the winter. Then again, the birds dont have to worry about who owns the land wherever they eventually land.


Minnesota can use wind, which is also cheaper.


Minnesota has anticyclones, which are periods lasting over a week with almost no wind.


There are plenty of other renewables usable than solar. Wind power would be the obvious one, as wind is often a great complement to solar anyway. Then there are long-distance transmission lines, water power, energy from biomass. Finally, if everything else fails, create hydrocarbons from CO2 in sunny places, ship those "eFuels" to Minnesota.


I wonder if we should seriously consider moving people away from such cold climates and towards warmer ones. Air conditioning is cheaper and coincidentally happens at about the same time as maximum solar power.


This might work with post-Surak Vulcans, but it's not gonna fly here on Earth with humans :)


We are doing that actually, but the other way: rather than moving the people, we are moving the climate.


"Places where sun availability makes solar inefficient" is still a niche so massive that "niche" seems like a bad descriptor.


Keep in mind HVDC. 3,300 km north of the Sahara desert, and you are relatively close to the Arctic circle. North of that is still a "niche," but now we're talking about a million people living hugely spread out.

Most of those people living in Russia, Norway, and Sweden with easy access to an abundance of hydro, to the level that energy flows north to south in the Scandinavian countries.

https://en.wikipedia.org/wiki/High-voltage_direct_current


Well maybe once they finish blowing up the ones they built in Donbass and half-built near Saratov we can revisit it.


> Solar and batteries are already cheaper than fossil fuels in most markets.

This is false. This has only ever been shown to be true in extremely narrow edge cases where the batteries only needed to last overnight in extremely sunny locations.

For solar+batteries to be cheaper they need to be large enough to power through weeks/months of cloudy/snowy/leafy/rainy weather in places that are at least near higher latitude locations.


Construction still hasn't begun on that project, 3 years later. When is the last time LA had a large construction project come in under budget?

I'll believe it when the batteries are actually installed and the bill is paid.

Also, the solar farm is planned for 800 MWh of storage. In 2021, LA used over 65 TWh of electricity[1]. That's over 7 GWh per hour. So this storage would run the city for a few minutes. Not exactly a replacement for base-load generation.

[1] https://ecdms.energy.ca.gov/elecbycounty.aspx
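The "few minutes" claim checks out, using the commenter's 65 TWh/year figure and the planned 800 MWh of storage:

```python
annual_twh = 65                               # LA's 2021 electricity use, per the comment
avg_gwh_per_hour = annual_twh * 1000 / 8760   # average citywide load (8760 hours/year)
storage_gwh = 0.8                             # planned 800 MWh of storage
minutes = storage_gwh / avg_gwh_per_hour * 60
print(round(avg_gwh_per_hour, 1))             # ~7.4 GWh per hour
print(round(minutes, 1))                      # ~6.5 minutes of average load
```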


See you in ten years at the earliest when any nuclear generator you break ground on today generates its first kWh of power (assuming it isn’t wildly late or over budget, as every one built since the 70s has been).


I'm not saying fusion is necessarily the answer. I'm just tired of hearing "solar plus storage is the cheapest option" when the sources always rely on projected costs and a pathetically small amount of storage.

We need a major breakthrough in storage tech to make grid-scale storage a reality. Li-ion batteries are never going to cut it. Who knows whether grid scale storage will come along faster than fusion.


We don't need major breakthroughs, we just need to watch technologies proceed down their experience curves.


But at least we’ve built them and we know we can provide the necessary capacity.


That’s great! Because there are [ONLY] 8500 coal power plants producing 20% of the CO2 emissions globally.

Removing 20% of emissions will make a huge difference.

ETA on this should be around 2030?

What I don’t get is since solar is cheaper, why are we building so many coal power plants?

https://www.newscientist.com/article/2317274-china-is-buildi...


Solar generates electricity during the day. It would have to be overprovisioned and paired with storage in order to handle dark hours. There are some battery banks out there (Tesla), but I don't think they're very common.

Coal handles baseline load. We should be using nuclear for baseline instead.


CATL is working on sodium batteries as a lithium replacement shipping in 2023

Form Energy is working on iron air batteries as a new class of multi-day energy storage, launching its first test installation in 2023

The US passed a tax credit for energy storage, to encourage building more pumped storage capacity

Congress is working on transmission line permitting reform

There are some good reasons to be optimistic in the near term


> Unfortunately, energy storage is still an unsolved problem.

Mechanical, lithium-based, flow, heat, compressed air, and pumped hydro are all types of storage able to hold quite large amounts of energy today or in the near future. Certainly cheaper than fusion has any hope to be within 20 years.


Solar panels are cheap and batteries are easier to build because they're already taking advantage of economies of scale and aren't in the R&D phase still.

The viability of fusion has been centered for a long time around getting more power out than you put in and once that marker is met it's viewed as the last giant hurdle in the way. There's still plenty more R&D that needs to be done before it can easily / readily scale though.

It's where nuclear was in the 60s, basically. Even if it only ever gets to be comparable to nuclear in cost, but with none of the hazardous byproducts, it will come out ahead. When you consider the environmental factors involved in battery production, it is pretty clear that fusion at least has the potential to be the cleanest source of energy. Whether it ultimately gets there is another question.


> It's where nuclear was in the 60s basically.

Plants built in the 70s are still operating. It is nowhere near a decade away.


Fair, my statement had an implied "if they cleared this hurdle" attached but I probably should have made it explicit.

I do think it'll be a decade or so to go from net gain -> commercial fusion reactors coming online.


Pessimists were saying solar panels and batteries were too expensive too, not so long ago. If we discover fusion power to be viable in our lifetime, it will be a breathtaking accomplishment to witness. It's a fork in the timeline with repercussions that will reverberate for millennia, across trillions of human lives.


They laughed at Galileo, but they also laughed at Bozo the Clown.

Most skepticism is ratified by subsequent events.

DT fusion doesn't appear to have much to recommend it, since it still requires a thermal cycle like fission or coal, and that keeps its cost high. From an engineering point of view it involves large monolithic plants with very complex and stressed equipment. This seems the opposite of good engineering.


My impression is that the research efforts have been focused on "can we do it?" Then, if the answer is yes, they'll focus on "how do we do it efficiently?" Where efficiency can mean anything from capital efficient, to resource efficient, to energy conversion efficiency. Limiting one's focus on the next blocker in the critical path and not increasing scope beyond it sounds like perfectly good engineering to me.


It seems like terrible myopic project management to me. You want to avoid first steps that you know are very likely going to lead to dead ends down the line.

We're constantly being told to take the long term view. Are we only to do that when it's favorable to the technological optimist's case or budget?


This.

If you have to build a steam turbine to convert the energy from your fusion reactor into electricity, it's never going to compete with solar and wind power in most of the world.

Doesn't mean that there won't be applications (if you can make all those lasers compact enough, submarines, ships, and ultimately spacecraft come to mind), but grid electricity is doubtful.


Beautifully stated. I teared up. This and watching us settle on the moon and Mars would be incredible. And achieving more breakthroughs in AI and medicine and everything else. I am an optimist and really excited by everything on the horizon.


Yeah, the cost of capsules for NIF is something like 4 orders of magnitude higher than it needs to be for commercialization, though admittedly it's not like they've industrialized the process yet.

The other thing is that if LLNL is still using their own definition of Q, it's not necessarily the case that they've demonstrated net-energy breakeven; they like to compare direct energy delivery to energy release, so when calculating Q they basically pretend there aren't any energy losses from actually running the huge laser facility itself. As a result, LLNL assumes that laser technology will improve to the point that real-life Q can catch up with their "scientific Q" metric. (IIRC I think "Project LIFE" was supposed to develop some of those technologies, but it never worked out, possibly since NIF is so far behind their promised schedule.)


Not everything is expressed in cost; externalities like the looks and intrusiveness of something do matter.

1 fusion plant has fewer NIMBYs to deal with than wind-on-land, for example.

But yes, could be that still it's too expensive by the time it becomes available. By then I hope we can make a fusion plant so small it fits on a space ship and power an Epstein drive :-)


Fusion will have plenty of NIMBY once the tritium leaks start. It doesn't take leaking a very large fraction of a DT plant's tritium to reach levels that already cause fission power plants PR problems.


> Solar panels are cheap and batteries are easier to build and there are lots of ways of making them.

Right now they are, but they often rely on materials from politically unstable regions (particularly Africa), or potential political rivals (China). Also, many solar panels require polysilicon from China, which is almost certainly produced with forced labor.

https://www.csis.org/analysis/dark-spot-solar-energy-industr...

https://foreignpolicy.com/2021/04/12/clean-energy-china-xinj...

https://www.theguardian.com/environment/2022/nov/29/evidence...

And it's not just a China problem.

"On batteries, there were major issues with the mining of between 15% and 30% of the world’s cobalt in the Democratic Republic of the Congo. Amnesty International found that children, some as young as seven, were working in artisanal cobalt mines, often for less than $2 a day. Mining conditions were reportedly hazardous, and workers often did not have adequate protective equipment and were exposed to toxic dust that contributed to hard metal lung disease."

The US is trying to crack down but Europe is lagging behind on it. However, if the report's claim (which I see no reason to doubt) that China has 82% of the global polysilicon market is true, with most of their polysilicon production being in the Xinjiang region, calling solar panels (or batteries) "cheap" is fairly distasteful considering their sources.


Mechanical, flow, heat, compressed air, and pumped hydro are all forms of batteries. All capable of storing MWh to GWh of energy. It is not all lithium and cobalt.


Once again, I am reminding HackerNews that the technology to build a battery capable of storing enough renewable electrical energy for the (world|nation) for even half a day *does not exist* at any reasonable cost.

And if you want to store multiple days for a northerly nation with very cold winters, frequent high pressure anticyclones (so, no wind) that can last about a week, and you want to switch everyone to zero carbon heating, then the technology doubly doesn't exist.

And the only retort to the above will be mumbling "yeah, but exponential improvement in batteries plus didn't someone say something about hydrogen?", which is essentially wishful thinking. Meanwhile, you can build a zero carbon grid out of nuclear fission plants, and we've known how to do so since the 60s.


Sure it exists: it's called compressed air. Even better with CO2.

Close to me is the oldest one, built in 1972 and still operational today: https://de.wikipedia.org/wiki/Kraftwerk_Huntorf


> Once again, I am reminding HackerNews that the technology to build a battery capable of storing enough renewable electrical energy for the (world|nation) for even half a day does not exist at any reasonable cost.

But it is almost certainly closer to existence than fusion.


Almost certainly not. The US alone generates 4,095 billion kWh yearly, so for half a day you would need to store 5,600,000,000 kWh. A Tesla Megapack can store 3,916 kWh fully loaded, which means you would need about 1,430,000 Megapacks to power the US for half a day.

With Tesla only capable of producing roughly 40,000,000 kWh of Megapacks annually, it would take 140 years to produce all the batteries. If Tesla created 100 times the factory capacity they have now (and could the supply of raw materials even withstand the smallest fraction of that?), it would take 14 years, for batteries that have a warranty of 15 years. These are lithium-ion batteries, which are the most space-efficient, unless you don't mind clearing hundreds of square miles for this project.

Did I mention it costs about $1 million per Megapack right now, so this project would cost $1.4 TRILLION, assuming lithium, cobalt, supplies, and labor all cost the same as they do now despite demand increasing 100x, and ignoring all engineering and factory-scaling costs, which could multiply the total several times over. All to power the US for just half a day. Now consider how to add Europe, Asia, Africa, South America, the rest of North America...
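For what it's worth, the arithmetic above does follow from its own premises. A quick sketch (all inputs are the rough figures quoted above, not authoritative data):

```python
# Sanity-checking the Megapack math with the figures quoted above.
# All inputs are rough, as-quoted numbers, not authoritative data.
us_annual_kwh = 4_095e9            # US yearly generation, kWh (as quoted)
half_day_kwh = us_annual_kwh / 365 / 2
megapack_kwh = 3_916               # quoted capacity per Megapack, kWh
packs_needed = half_day_kwh / megapack_kwh
tesla_annual_kwh = 40_000_000      # quoted annual Megapack output, kWh
years_to_build = half_day_kwh / tesla_annual_kwh
cost_trillions = packs_needed * 1_000_000 / 1e12  # at ~$1M per pack

print(f"{half_day_kwh:.2e} kWh -> {packs_needed:,.0f} packs, "
      f"{years_to_build:.0f} years, ${cost_trillions:.1f}T")
```

The quoted "5.6 billion kWh", "1,430,000 Megapacks", "140 years", and "$1.4 trillion" all fall out of these inputs; whether the inputs themselves hold up is a separate question.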

We're not close, and it's basically completely unfeasible. Fusion will be closer in 100 years than such a project.


Almost certainly yes.

Consider pumped thermal energy storage. Use a thermal cycle to generate hot and cold (say, by compressing a gas, probably argon, extracting the heat, then reexpanding, and then storing the resulting "cold"), then reversing that cycle to generate power.

This scales embarrassingly well. It can be made entirely from cheap materials available in essentially infinite supply. No component operates at a temperature above the creep limit of ordinary steel. Round trip efficiency could reasonably be 75%. This requires no technological breakthroughs -- it's 19th century technology.
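To make the 75% round-trip figure concrete, here is a toy sketch (the per-leg efficiencies are invented for illustration, not taken from any real design): losses accrue on both the charge (heat pump) leg and the discharge (heat engine) leg, so the round trip is roughly their product.

```python
# Toy pumped-thermal-storage round trip. Each leg gets one lumped
# efficiency; real systems split this across compressor, turbine,
# and heat-exchanger losses. Figures are illustrative assumptions.
charge_eff = 0.87     # assumed: fraction of input work stored as usable exergy
discharge_eff = 0.86  # assumed: fraction of stored exergy recovered as work

round_trip = charge_eff * discharge_eff
print(f"round-trip efficiency ~ {round_trip:.0%}")  # prints "round-trip efficiency ~ 75%"
```

The point of the sketch: hitting ~75% round trip only requires each leg to be in the high-80s percent efficient, which is why the parent argues no breakthroughs are needed.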


I listed 5 different types of batteries other than the ones Tesla makes. A number of them are much further along than the fundamental-science stage fusion is at. Tesla primarily makes batteries for cars; grid storage is actually way more flexible in the type of battery that can be used. You are missing the forest for the trees.


There are quite a few battery chemistries now that don't involve any cobalt. And the ones that do can get their cobalt from more reputable sources than the mining industries founded by colonial powers in Africa. (often backed by big oil and mining corporations). Cobalt and other minerals were mined long before lithium ion batteries came along, for example.

For all the crocodile tears about children mining cobalt, it's easy to forget how other industries can be just as bad or much worse. Of course, critics of batteries are laser-focused on criticizing batteries and nothing else whatsoever.

I mean, do you want to talk about oil? Or coal? Or copper? Or uranium? Nasty industries, each of them. Especially oil. Lots of environmental destruction, poor working conditions, the occasional bit of genocide or sponsored corruption, wars, etc. The mining and oil/gas industries are just nasty. Especially when everybody accepts it as normal and looks the other way.


Most things don’t start off cost effective, they become so due to investment, demand, industrialisation, competition, etc.

Maybe fusion will stay a small part of the energy mix for decades even after the first commercial plants are built but be part of what eventually enables us to use orders of magnitude more energy than we do now…


I agree. There is no breakthrough on the horizon that is going to make a fusion plant have the complexity closer to a natural gas plant than a nuclear fission plant. Therefore the costs will remain high.

It could still be a useful technology, especially in space. I could see a moon or mars base powered by fusion.


Gas has low capital cost but relatively high fuel cost, especially outside the US. For most fusion designs (possibly excluding NIF), the fuel cost is insignificant.

Also of course we might want to consider the carbon emissions of gas plants.


That is what I mean. Some people are imagining the capital costs of a natural gas plant with the fuel and environmental costs being almost nothing. There is absolutely nothing to suggest that a fusion plant would cost anything less than a fission plant at this point.


Solar and wind are bad and unsustainable due to mining of rare earth minerals and photovoltaic cells degrading and becoming a landfill liability.

Their cost-effectiveness is also a myth, an artifact of nuclear's death by bureaucracy.

Nuclear, however, is currently the true energy source to use: technologically much simpler (than fusion) to execute, with decades of experience making it the safest out there. It is the zero-carbon, environmentally friendly energy source.


[flagged]


What were the estimated damage and deaths caused by Fukushima incident? When was the reactor site built, when was the reactor designed?

A few extra questions you may also be interested in: lithium and cobalt mining, the costs of nanolithography for high-efficiency photovoltaic cells. All that with tax breaks and heavy government incentives, versus an insane regulatory burden on the nuclear industry. Also the nuclear scare in education that makes the public treat opinions like yours as even remotely realistic.


Extra questions you might be interested in:

Where is most of the uranium mined that is used in European reactors? What environmental damage is done by reprocessing uranium? What are the costs of decommissioning reactors, and who will pay when they run 10x over what the operators put aside? How much subsidy goes into nuclear? How do you prevent proliferation in rogue nations that use nuclear, for example Iran?


Why did people back then think it was safe, and then it exploded? Were they wrong in their assessment back then? Why were they wrong? Are you sure your assessment is correct today? Why is it better than their assessment back then? Are you sure you are not making the same mistakes they made back then? Oh look, I can ask questions too, because I am a sealion.


Genuine question: I seem to recall there being some very similar news about how 'ignition' had been achieved not too long ago. Am I imagining things or is this a genuine new development?


The story here isn't ignition. It's that they got out more energy than they put in, which is of course necessary to use fusion as an energy source. We've been able to produce fusion for a while, but net positive energy hasn't happened before.


To correct myself: 'ignition' is getting more energy output than input (if I understand correctly), and that is a first. Sorry for the confusion.


The net energy gain is very slim and has to be converted to electricity to power the lasers – in doing so, there's so much loss, it is again NEGATIVE.

It's always the same…


This isn’t the same; this hasn’t been done before.

New things are hard. Nothing truly worthwhile is easy.


Presenting the progress of fusion in such a way to give the impression that commercialisation is right around the corner has been done before.


Nothing like commercialization happens without an insane amount of work. It’s easy to criticize, hard to actually help.


Yes, because it's been hard to make fusion viable for 50 years now, my guy...

Am I talking to a ChatGPT instance or what is happening here. Let's find out :D

\\\vig-128 ?{/subject unlink;;;


By now everyone knows that Fusion, if we succeed, is going to provide us with abundant clean energy.

But Fusion is not just another way to power your lightbulbs, fusion is a completely new type of energy.

With fusion we can in principle reach 10% of the speed of light which would be revolutionary for space travel.

But even wilder: because it's technically a sun, we would over time be able to create basic materials like gold, neon, sodium, magnesium, silicon, nickel, copper, zinc, gallium, and germanium.

It would also mean abundant energy to create synthetic materials that could even replace use of fossil fuels in our materials.


It's a different environment than the sun, and other isotopes are fused. The plasma is a lot less dense, with a lot less pressure but much higher temperatures. The current technology will not generate other elements. And gold etc. are not created in the sun via fusion; they are generated by a different process involving stellar catastrophes.


It's a different environment which is in principle possible to recreate. First step is to get basic fusion working.


Suppose we have a world with working inertial confinement and stellarator fusion. Are there applications where one does better than the other?


I live near Princeton, NJ. Approximately 4+ years ago I bumped into a friend one evening at a local restaurant / bar. As it turned out, her date was a top guy at the Princeton Plasma Physics Lab.

Long to short, Gates assured me (paraphrasing), "We're close. It's doable. All we need is more funding."

I hope he's right.

p.s. I know PPPL might not be directly involved in this announcement. I was sharing context on the topic.

https://www.pppl.gov/


I'd take all of that with a grain of salt. First he was probably trying to impress the girl, and second, every scientist says their work is possible, they just "need more funding". If they didn't think it was possible, they wouldn't be working on it.


Exactly what was the meaning of "it" there? Reaching breakeven? Or reaching practicality? These are not the same problem!


NIF is still doing fusion research? I thought they pivoted to materials research in support of stockpile stewardship years ago.


They are still doing plenty of shots for the national ignition campaign and figuring out the target manufacturing process. The official purpose of NIF has just been shifted to support security research.


TFA is devoid of details with which one might even categorize the breakthrough, let alone judge it.


I guess I don't really get it. Nobody doubts that you can get a tremendous output of energy from a fusion bomb with modest inputs. This thing they've ignited is a tiny fusion weapon without a fission blanket and with a huge, inconvenient optical primary. I mean I'm all for science but I don't see the road from this to civilian fusion power as people generally understand the term.


There isn't any, not for ICF. These labs are part of the DOE's thermonuclear weapon research programs, not energy research.

It is possible though that they could also use this for some fundamental research into how fusion works as a process.


This is like a version 1.0 steam engine. Miniaturization and optimization can come next.


No, this is like research into TNT being presented as a potential way of creating a power plant by capturing the energy of the explosion. The real purpose is producing better explosives.

This is not some bizarre idea - Lawrence Livermore is officially a part of the DoE's research into maintaining and improving thermonuclear weapons. That there are some vaguely imaginable applications in energy generation is at the very best a bonus.

Remember that each shot of the lasers also destroys 10 million dollars or so of the highly precision engineered "housing" for the fuel pellet (called a hohlraum).

The lasers don't directly hit the pellet - they hit the metal walls of this hohlraum, causing it to grow so hot that it emits x-rays. Its shape is perfectly aligned so that those x-rays hit the two sides of the pellet at exactly the same time, causing two "ripples" to compress it so much that they force the atoms to fuse in the middle and produce a chain reaction that has to consume the entire amount of fuel before the force of the implosion dissipates, at which time all of the matter violently explodes. The brunt of that explosion (and the neutron bombardment from the fusion process) is taken up by the hohlraum, which is irredeemably destroyed and can only be, at best, melted down as raw material for the next hohlraum.

Edit: tldr, this is exactly as useful for energy generation as an internal combustion engine whose pistons are destroyed every time the fuel ignites.


I’m not following exactly—are the lasers destroyed after each shot? The fuel being destroyed of course makes sense..


No, the hohlraum is, which costs millions of dollars by itself. And it costs so much because it is essentially the piece that handles the synchronization of hitting the fuel pellet perfectly symmetrically. It's also built from solid gold right now, though depleted uranium may also work; either way, the raw material is only a fraction of the cost - the perfection required in achieving its exact shape is the problem.


The analogy doesn't really work. The utility of a steam engine was obvious to antiquity, but they did not have the materials technology to build it. They did not need basic science to do steam power. The first practical steam engine predates the understanding by chemists of combustion. It was invented when phlogiston was still the going theory.

NIF on the other hand is already a miracle of materials science. An absolute triumph. But you can't enumerate the list of unsolved problems that, if eventually solved, lead to inertial confinement fusion as a civilian energy source. On the other side you can make that list for magnetic confinement. There is a clear path from magnetic confinement research to commercialization, with a known set of major problems.


People in antiquity did build a functioning steam-powered engine, but dismissed it as a curiosity.

https://en.wikipedia.org/wiki/Aeolipile


This fascinating article goes into more detail on the reasons why: https://acoup.blog/2022/08/26/collections-why-no-roman-indus...

They correctly dismissed it as a curiosity because it was far too inefficient to do anything useful with the amounts of fuel they would have had available. They couldn't have made a more efficient one because they didn't have any idea how to construct reasonably uniform pressure-bearing cylinders.

Real innovation didn't happen until much later on, at British coal mines because 1. there was lots of fuel because it's already at a coal mine, 2. there was a useful task for the work in pumping water out of the mine, and 3. materials technology had advanced enough to make it possible to construct an engine that did a useful amount of work from a manageable amount of fuel.


The problem is you can say that about any wildly inefficient new technology, but it doesn't always pan out.


I just hope it isn't another (there's been more than one) NASA-level "announcement" on astrobiology that's going to rewrite the "book". These sorts of headline grabs do nothing to help in the end. This is feeling like another one of those, and I'm hoping to be proven wrong - as who wouldn't love a Mr. Fusion in their future.


Can anyone recommend a good podcast episode that dives deeper into the implications of this?


This is the opposite of a breakthrough.

It's a marginal change.

To a system that doesn't fulfil the requirements of the rest of the article (requires tritium and the world's most powerful laser).

And it might not have even actually happened, the measurements are still being assessed.

People are so desperate for an easy, technical answer. But that doesn't mean there is one.


Very disappointed with the disappointment comments in this thread. People without relevant experience commenting about others without relevant experience commenting on the topic at hand.

Please reserve commenting for the experts who are specifically familiar with NIF.

Oh no now I shouldn't have commented /error


I keep seeing lots of talks of nuclear energy being the next greatest form of energy, but ever since Chernobyl, it seems like people are afraid even though Chernobyl was a one-off incident that wasn't regulated well.


When you consider the power that big oil and gas have worldwide, and all they've already done to sabotage adoption of clean energy, it just seems improbable to me that one day tech will arrive that provides unlimited clean energy without some kind of big ugly fight. Big. Like I can see these guys doing everything from run of the mill regulatory capture to kill it all the way up to supporting right wing (or communist) conspiracy theories or movements to destabilize democracies (all things that have been done in the past). I seriously wouldn't put anything past them. Maybe I'm being too paranoid but I have a hard time believing in any future that involves yanking away trillions of $$ in power from a small group of unscrupulous people.


This is the reason I've been saying that we will have fusion within a decade of when markets start to price in the decline of fossil fuels due to renewables and other factors. It's not an impossible problem; it just needs more research funding/focus.


Still not nearly enough money invested in energy research.


This is not energy research, it's weapons research. Inertial confinement fusion is only interesting because it replicates some of the conditions inside a fusion bomb - there is no plausible way to use it to generate electricity with anything approaching cost efficiency.


Once you have a viable start, the money will explode into the sector, Manhattan Project style.


If this is the laser inertial fusion for the National Ignition Facility, the purpose is not to generate energy. It is to study fusion in the laboratory in order to maintain the nuclear weapon stockpile.


Is this cold fusion? The article contrasts this experiment with a plasma tokamak in the UK. I suppose the lasers require a lot of energy, but it doesn't sound like there is a plasma, is there?


Definitely not cold fusion. Using powerful lasers to heat up and pressurize the target.


The power of the sun… in the palm of my hand…


666 comments, I won’t fall for it.


This is very promising. Hopefully this can be one of the primary tools used to remove our dependency on dirty fuel sources


Donate the technology to the world after X number of years.


It’s insane how much cynicism I’m seeing here. I know people who are nuclear scientists at LLNL - if they’re excited about this then it’s a big deal. The experiment actually created more energy than expected and damaged the sensors.

This website is seriously infested with reflexive contrarians and it's not healthy.


> This website is seriously infested with reflexive contrarians and it's not healthy.

I don't know. Looking closely at the article reveals that the researchers achieved 1.2x energy gain over the laser light delivered, from lasers that are about 1% efficient. Even granting that the SOTA for such lasers is closer to 20% efficiency, that would be about a quarter of wall-plug break-even. And that's energy, not electricity: with the best current methods, about 60% efficiency is the most we can hope for in converting that heat to electricity, so in practical terms they achieved roughly 15% of break-even.
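To make the accounting explicit, here's a rough sketch. The efficiency figures are ballpark assumptions for illustration, not official NIF numbers; "scientific Q" compares fusion yield to laser light on target, while wall-plug break-even must also account for laser and generation efficiency.

```python
# Rough wall-plug accounting for the reported shot.
# All efficiency figures are ballpark assumptions for illustration.
q_scientific = 1.2      # reported: fusion yield / laser energy on target
laser_eff_nif = 0.01    # NIF's flashlamp-pumped glass lasers, roughly
laser_eff_modern = 0.20 # optimistic modern diode-pumped laser efficiency
thermal_to_elec = 0.60  # assumed best-case heat-to-electricity conversion

q_wallplug_nif = q_scientific * laser_eff_nif        # ~0.012 as built
q_wallplug_modern = q_scientific * laser_eff_modern  # ~0.24 with modern lasers
q_electric = q_wallplug_modern * thermal_to_elec     # ~0.14 as electricity

print(q_wallplug_nif, q_wallplug_modern, q_electric)
```

Under these assumptions the wall-plug gain with today's facility is around 1%, and even with optimistic laser and conversion efficiencies the electrical gain stays well under 1.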

Is that good progress? I'd say so, for sure. Is this a breakthrough? I don't know, especially since the article itself says the data is still being analysed and the actual results aren't published yet. 95% of the article is just fluff about the potential and quoting 3rd parties who celebrate a result that hasn't even been officially confirmed yet.

So, no I don't think it's cynicism, I don't think it's contrarianism, and I do think it's VERY healthy to approach sensationalist headlines with a level-headed and down to Earth attitude instead.


> But that's energy, not electricity

As far as fusion viability is concerned, net energy (over what's put in) is enough. The whole electricity argument is moving the goal post, because there are plenty of other sources that primarily produce heat.

Now, regarding the efficiency of the lasers themselves: sure, they are inefficient, but from a pure nuclear fusion point of view, net energy gain is a significant milestone in itself. Lasers can get incrementally more efficient; there was no incentive to make them super efficient so far, and there are no known fundamental problems with making them efficient.


Lasers can get over 50% efficient (although these are specialized types).

It’s silly to blame a facility not designed for power production for using inefficient lasers.

This is an important and necessary step to getting resources to go further. Imagine how dumb it would’ve been to build a fusion power plant before we could even do 1.2x energy gain. A complete waste of resources.


> the whole electricity is moving the goal post because there are plenty of other sources that primarily produce heat.

There are no industrial processes that make use of plasma in the tens of megakelvins. It's also not moving the goal post at all, since generating electricity is the literal goal of nuclear fusion. If it's just heat you're after, we solved that problem over 70 years ago. There are hundreds if not thousands of thermonuclear fusion devices readily available, literally at the push of a button. But for some odd reason we try hard not to use them and focus on electricity instead...


The question is whether this is a breakthrough and a significant milestone or not. It seems to me like your comment suggests that we have hit the "significant milestone" marker only when we have an actual electricity-generating fusion reactor, which I think diminishes the actual breakthrough that a positive net energy gain represents (if correct). It was long sought after, it has now been reached.


Exactly! This is a very good question that requires some context, preferably from within the field. What does it actually mean?

Sadly, however, the article doesn't seem interested in answering that question and providing the necessary context. Instead it quotes authors of books, who seem ecstatic about the possibilities.

You'd be correct in calling me a cynic when I say that I've heard the "too cheap to meter" slogan from back in the 50s, when nuclear fission was the future.

But I try hard not to be that guy and genuinely want the same question answered - is this an actual breakthrough and a significant milestone in the big picture? Up to this point it's been hit-and-miss, and many so-called "breakthroughs" turned out to be small steps in the right direction, but not exactly quantum leaps.


EUV light sources generate plasma that is in the megakelvin range today for silicon lithography. It's obviously not the same and still cooler, but the assertion that we don't make use of highly energetic plasma is off.

Secondly, your attempt at being pithy about nuclear bombs is a complete loss. We previously only knew how to achieve an inertial confinement based fusion reaction with a positive Q factor by first setting off a fission bomb, and this was only done for the neutron generation to increase the amount of fissionable material exploded (which is why they are called fission-fusion-fission bombs).

We can now generate fusion energy in a way that is obviously confine-able. That's a major step, and it's not THAT hard to imagine mechanisms for turning a hot droplet into energy. For example, the hohlraum itself in an indirect system will obviously be heated by the reaction and could be used to generate steam. That engineering makes no sense, though, if you can't get a high Q factor out of the ignition itself; hence the focus.

This four-sentence post is a perfect example of OP's point. No insight, no thought process, just a pithy negative reply.


> EUV light source generate plasma that is in the megakelvin range today for silicon lithography.

Only an order of magnitude off, but yeah, physicists and spherical cows and all that.

> Secondly, your attempt at being pithy about nuclear bombs is a complete loss.

A little sense of humour is lost on so many bitter souls these days, it's kind of sad. Lighten up, mate!

> Engineering that makes no sense though if you can't get a high Q factor out of the ignition itself, hence the focus.

You do realise that the fusion reaction we're talking about lasted around 100 trillionths of a second in a minuscule area, while other practical designs are aiming for continuous operation in the half-hour range to examine the practical engineering challenges of particular reactor configurations?

A high Q-factor may be completely useless if the underlying concept doesn't work for actual power generation, and one might be easier to achieve than the other (i.e. getting a continuously working reactor first and tweaking it to improve Qp). The question therefore becomes: what's the actual value of the result? The article doesn't even touch on that, while even some C-grade online publications provided context like that.


> but from just nuclear fusion pov net energy gain is a significant milestone in itself.

Didn't we get to that milestone when they detonated a fusion-based bomb back in 1952?

https://en.wikipedia.org/wiki/Ivy_Mike

Anyway, I'm not sure it's a significant milestone. It's just a number along a scale. If you were to tell me they've achieved a _sustained_ reaction which yields more energy than goes into it, for a period of, say, a day or so - then you could claim a significant milestone has been reached.

And even with that, some people argue that given how there's basically no sustainable source of tritium for large-scale electricity generation, the whole exercise is pointless unless the process uses other combinations of elements.


> ...I do think it's VERY healthy to approach sensationalist headlines with a level-headed and down to Earth attitude instead.

My experience on HN is there is a bias for critical thinking. If it's traditional nuclear power or climate change, the bias is for it. If it's new battery tech or fusion power the bias is against.

Does it only feel "very healthy" to be critical because you are being critical of the idea?

I have been called "contrarian" to my face. I understand the deep-seated need to be "absolutely certain", but maybe there _is_ something going on here other than that?


> Does it only feel "very healthy" to be critical because you are being critical of the idea?

Who's critical of the idea? I literally said it's good progress. What's not good, however, is exaggeration, sensationalism that puts potential views and hype before substance, and raising expectations for something that's still essentially just basic research.

This has nothing to do with bias of any kind. It's just poor journalism, bad form, and misrepresentation of genuinely great work. I simply expect better from a publication like the FT. If that's the level of reporting we get from what I thought to be a somewhat reputable source, why even bother taking any publication seriously anymore? It's not criticising the researchers or downplaying their work.

It's a critique of the media preventing the public from getting a realistic picture. I'd like to be educated and kept up-to-date, not misled and hyped up.


Many fusion news articles have this problem, but I would argue that this time, how the FT categorized this is appropriate. This is literally the first time scientific break-even (not engineering break-even) has been achieved by any controlled experiment, including MCF. How is that not a breakthrough?


> How is that not a breakthrough?

I will try to be really positive here. The researchers managed to achieve ignition in a spot less than the width of a human hair for 100 trillionths of a second.

The resulting fusion may have gotten scientific break-even (again - no officially published results yet). This is great progress in terms of basic research, for sure.

On the other hand, we have experiments like Wendelstein 7-X, an experimental reactor that can already hold a stable plasma for seconds and is planned to go up to 30 minutes of continuous operation early next year (construction is already finished).

The researchers state that they want to test, whether continuous operation is possible, how the plasma can be handled, how the materials and magnetic fields can be optimised and whether their approach is practical.

So on the one hand we have a theoretical result that may or may not be a scientific break-even and is hailed as a major breakthrough that will open the door for commercial fusion reactors, and that lasts for trillionths of a second within a minuscule area. No continuous plasma, no work on practical reactor design, just good old-fashioned basic research at its best.

On the other hand we have working, practical fusion reactor experiments that are already able to hold a stable plasma for seconds and are tackling the engineering challenges of actually producing electricity. Some are designed for engineering break-even and Qp > 1 (e.g. ITER) and not ready yet, while others "simply" examine the practicality of a particular design (e.g. Wendelstein 7-X) and actually worked and continue to improve by orders of magnitude (in terms of operation time), pushing continuous operation time up to 30 minutes.

Now that I gave some context, how much of a breakthrough are we talking about? I don't know. All these experiments are important, of course, and are required for the end goal of achieving economically viable stable power generation using nuclear fusion. I'd just like to wait for an official publication and a proper assessment by other experts in the field.


Sure, we can wait, but there have already been publications about the August 2021 shot, in which they determined that it met the ignition criteria. It's pretty clear that they were on the verge of achieving scientific break-even.

Maybe we are just arguing about semantics and what constitutes a breakthrough, but in my mind, the hardest challenge of fusion has been getting scientific gain over 1. There are still OTHER hard problems like continuous operation and capturing energy, but ultimately, getting scientific gain over 1 is/was the most challenging. You can say it isn't, but the fact is, none of the MCF concepts have achieved a scientific gain over ~0.64 and have not improved since the 1990s (JET). Look, if the 7-X or ITER or JET achieves a similar scientific gain, they will get similarly applauded.

I'm not saying that fusion will become an economically viable power source now. It is just that NIF de-risked the hardest challenge of fusion from a pure physics standpoint: more energy out than in.


I like to remind people now and again that humankind's very first attempt at inertial confinement fusion was wildly, ridiculously, terrifyingly successful: https://en.wikipedia.org/wiki/Ivy_Mike.

These may seem like tiny steps forward, but once the genie is out of the bottle it's going to be nuts.

Also this laser tech is ancient, once there's a major economic driver behind it I expect they'll rapidly advance.


How many times have you or literally anyone you know achieved a state of the art breakthrough in the production of energy from a nuclear fusion reactor? Is this just another Monday to you?


For most people, yes it's just another Monday. Same way the observation of the Higgs Boson was just another day. Maybe worth an hour or two of curious investigation, but of no immediate consequence. Question is, are we watching the first Wright Brothers' flight, or are we watching one of the marginal glider improvements in the 19th century that would eventually contribute to the first Wright Brothers' flight 40 years later?


For the vast majority of people, the Wright Brothers' flight was also "just another Monday [or whatever day it happened to be]".

It's a bad criterion for judging something noteworthy.


The potential for powered flight was clear to those who observed and didn't think they were lying. The Wright Brothers sold their improved designs to the US Army in 1908, just six years after their initial flight. 15 years after said initial flight you have massive formations of planes dogfighting over France, conducting reconnaissance and dropping bombs.

15 years after whatever this breakthrough is will anything have changed outside of the lab? That remains to be seen, but I'll believe it when we see the data.


In the case of the Wright Brothers, most people didn't even believe it. One factor at work was that they'd been primed by years of Respected Scientists saying that it was physically impossible.

The newspapers didn't cover it until random people started asking why the feats they'd seen with their own eyes weren't being written about.

https://www.wright-brothers.org/History_Wing/Aviations_Attic...


I doubt that 'wright-brothers.org' is a reliable, impartial source on that. The idea of heavier-than-air flight had been around; it wasn't conceived by them. Lilienthal was an earlier well-documented pioneer, there are pictures of his gliders in mid-flight. Just to name one. So without doing further research, the idea that actual scientists at the time thought it was impossible seems highly unlikely to me. As early as the 18th century Europeans experimented with fixed-wing "flying machines". The Wright Brothers probably weren't even the first to achieve powered flight; there are multiple contenders.

It's a bit like with Elon Musk (re-)inventing the electric car. Those popular names that went down in history are usually not the original inventors. These people were first of all successful entrepreneurs that understood business and ultimately won the patent war. But in the early 20th century, the idea of flight was already firmly established, why would any "respected scientist" have doubted what could already be observed in action around the Western world?


That's just what I could find quickly on Google, accessible by all.

A few years back I read David McCullough's biography of the Wright Brothers which told the same story: https://www.amazon.com/Wright-Brothers-David-McCullough/dp/1...

This site cites a couple of the pre-Wright naysayers: https://bigthink.com/pessimists-archive/air-space-flight-imp...


Like the ARPANET? That was even less of a notable Monday to the general populace, because I doubt any significant part of the public had a great interest in some universities being able to connect their at least fridge-size, expensive computers together similarly to how they were already able to do before, but now using "packets".


ARPANET was established in 1969, the internet as we know it started gaining traction in the 90s. So you're saying this "breakthrough" means we're 30 years away from the nuclear fusion equivalent of the dot-com boom? If that's true then I'm incredibly hyped, show me the data


There is no need to put "breakthrough" in quotes. For the first time since we started working on fusion in the 50s, we have achieved a positive gain of energy. Engineering-wise, it is still a long way until this can be put to its intended use, and who knows what roadblocks scientists and engineers will hit on the way, but anyone who knows what they are talking about is suitably excited about this milestone.


How many times have you read yet another article on the most recent fantastic fusion breakthrough?

It certainly sounds like just another Monday to me.


I think it's just the difference in expectations between scientists and laypeople. "Major fusion breakthrough" to a scientist could mean one step out of 200, over 3 decades, towards functional fusion power. Scientists understand the long arc of progress. But these labs need to market to the public as well who invariably end up expecting a SimCity Fusion Power Plant within 18 months.


Many on HN have the same response to many things in every domain, not just research.


For being a tech entrepreneur forum people here are strangely very anti science and technology. The top voted responses to every new product announcement are essentially "why do we need this? Pen and paper work just fine".


Rather, very anti science and technology hype. Many visitors of this website measure experience by decades, and have seen many waves of hype resulting in not much progress in unyielding areas, from self-driving cars and silver-bullet methodologies to, well, commercial fusion.

When demonstrable, measurable progress is achieved, visitors of this site get very excited and positive, from things like the Rust language all the way to solar power and reusable rockets.

A breakthrough is a qualitative change, not (merely) quantitative. 95% to 96% of reaction energy output is a nice but quantitative advance. 99% to 101% is a qualitative breakthrough: suddenly, it's a surplus, actual generation.

We are still far away from the latter, alas.


This is the very opposite to silver bullet approaches to fusion, though. This is a methodical, military-industrial-complex style development that was decades in the making.

I think it’s just the Zeitgeist. Social media has trained us that a certain reasoning style is rewarded, quick takes that don’t dig into the first principles and instead serve as shibboleths that you’re not one of THOSE types of unintellectual pseudo tech bros who bought NFTs or whatever.


I have plenty of experience and I don't agree that it's about hype. IMHO it's about the standard human reactionary response to something new and challenging [0], and people trying to sound smarter than something or someone by criticizing. It is disappointing.

[0] The saying is true IME: First they laugh at it (ridicule it), then they say it's not in the Bible (conflicts with the norm), then they say they believed it all along.


Not really convincing since Rust is 99% hype from people who misunderstand C++ and are just happy to join an "inclusive" cult


That's because for 90% of new products, the old stuff performs better, uses fewer resources, and has been debugged in ways the new stuff has not.

All the new stuff, however, has marketing and looks shiny.


If you are always a naysayer you will be right 90% of the time and can feel smug and pat yourself on the back for it (so, like everyone here). However, progress comes from people willing to take risks and make wild bets for the small chance that they are in the 10%.


And they will succeed around 1% of the time.

In fusion research, pessimism is realism. Especially in laser powered fusion.

This experiment is producing 2.5 MJ of output for 500 MJ of input.

Roughly once a day.

After decades of basic research.

It's a scientific breakthrough in the sense that the rocks are now being banged together hard enough to make sparks. And a little more is known about rock banging than was known ten years ago.

But it's clearly not going to be producing power on a commercial scale any time soon.
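To make the gap concrete, here's a back-of-envelope sketch using the figures quoted above (2.5 MJ fusion yield, ~500 MJ drawn from the grid per shot) plus the ~2.05 MJ of laser energy delivered to the target, which is my assumption based on NIF's commonly reported laser specs; all numbers are approximate:

```python
# Back-of-envelope gain figures. The 2.5 MJ / 500 MJ numbers are the
# comment's; the ~2.05 MJ on-target laser energy is an assumption
# from NIF's commonly reported specs. All approximate.
fusion_yield_mj = 2.5
laser_energy_mj = 2.05   # energy actually delivered to the target (assumption)
wall_plug_mj = 500.0     # energy drawn to charge the laser's capacitor banks

scientific_gain = fusion_yield_mj / laser_energy_mj  # the headline "Q > 1" number
wall_plug_gain = fusion_yield_mj / wall_plug_mj      # what a power plant cares about

print(f"scientific gain: {scientific_gain:.2f}")  # > 1, hence "ignition"
print(f"wall-plug gain:  {wall_plug_gain:.3f}")   # ~0.005, hence not a power plant
```

The point of the two ratios: "ignition" is measured against the laser light hitting the target, while commercial viability is measured against the electricity drawn from the wall, and those differ by more than two orders of magnitude here.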


Aside from the particular percentages, I don't exactly disagree with your observation about the odds of being right or wrong in the respective cases, but I think the skepticism being due to a lazy pursuit of this smugness and self-congratulation you describe is almost entirely flawed as an explanation of motivations.

Instead, I'd like to suggest that, in addition to having developed a severe hype allergy through cumulative exposure, a lot of us are burnt with respect to that so-called progress. There's been a fair bit of outright corrupted delivery on the promise of new technology, not least IT, and many people here are savvy enough to see the costs of wrongheaded changes.

'Move fast or not, we don't care much, but back off breaking things we liked and leaving the rest of us picking up the shards.'


The problem is this might be true, but it will not always be true. The horse was probably better than the first cars for a while, but progress changed that.


Old computers perform better and use fewer resources? Have you used, say, a PC in the 90s? The fans were loud, the power usage constant and high, and the performance lacking by so many orders of magnitude that it can just be emulated in software today.

Cars, TVs, phones, take your pick.


Do you realize what that represents?


In my experience, the more someone has deep understanding in tech, the more they are critical of it. Especially true in my field of infosec.


I think as people gain experience, they can start substituting experience and cynicism for actual first principles thinking and curiosity.


> the more someone has deep understanding in tech, the more they are critical of it.

I think the criticism comes mostly from people with just a little knowledge, trying to sound and feel like they know more. Just a few talking points or principles enable you to criticize, but not seriously analyze (much less create).


Everyone is a cynic simply because cynicism is easy. Any infosec professional can go "computers are insecure, never use them". And the advice will be correct, but ultimately useless. The ones worth anything won't just point out problems but also find solutions.


Yes, it's called counter-signaling.


Most people in tech are cynical about tech because they intimately know the vision is waaay further out than reality, they know the breed and sometimes the names of the squirrels running in the wheels making it work, and have gone through more figurative duct tape and baling wire than most developing nations.


There's a bias for empiricism. People don't believe stuff until there's 50 years worth of data and 1000 politically correct and carbon neutral peer-reviewed papers on it. It must be approved by ethical panels and experts on TV. Only then it becomes science™.


Products had better have an answer to that question, and that answer tends to be very informative.


Yeah, it's the same here about most new technology, like AI and improvements in solar and battery technology. One thing I have noticed is that some of the most vocal people formed their opinions years ago and now are not aware of, or ignore, all the changes/improvements that have occurred since.


That's not completely true, but has a lot of truth in it.

Everyone formed their opinion about e.g. blockchain a long time ago.

But they do admit that e.g. GPT-3 is pretty advanced, though it has its own flaws.


On fusion energy and battery technology I see plenty of cynicism, but given the history of wildly over-stated "advances" in both fields I think people are justified in leaning towards pessimism.


The reality of course falls short of the most optimistic projections, but e.g. for batteries: look around! Wealthy countries at least are now full of little gadgets that couldn't have existed even a few years ago due to the battery demands. A walk down any street in NYC you'll see probably 5-8 different personal transportation systems that are pretty close to sci-fi.


I agree on batteries, but something like half a dozen Manhattan projects worth has been spent on fusion research so far and we don’t seem anywhere remotely close to achieving even a fraction of its promise. I remember reading about the future of fusion energy in my teens, and now my daughters are coming out of their teens. And I had kids late. I’m sorry but technical break even just doesn’t cut it for me at this point.


As a scientist, I'll never classify the 200 intermediate steps as "major breakthroughs". Perhaps 1 or 2 of them. Fusion looks difficult, so let's say 5 of them. All the others are just improvements, minor improvements, side quests, dead ends, easy tasks for an undergraduate thesis, ...

If you read "breakthrough", you can be almost sure that it's an exaggeration from the press or the marketing team of the university (and in some rare cases, from the research team).


Engineers tend to have a problem solving demeanor towards novelty, which is excellent for finding the problem with things.

Showing a room full of problem solvers an unfinished problem that lacks critical supporting evidence will no doubt elicit a general response in the skeptical-to-cynical range.

I would respectfully argue that this is a healthy and normal response given the audience, and should be an expected bias on HN.

This is a “show me the evidence don’t tell me about the possibilities” crowd.

I for one am deeply excited if the data proves out, but my bias is "wait and see." This could be a massive leap towards proof it will work.


This could pass as satire of a Hacker News comment.


> This website is seriously infested with reflexive contrarians

No it's not.


Yes it is.


"No, it isn't." "Yes, it is." "You just contradicted me." "No, I didn't." "Yes, you did." "No, no, no." "You did just then." "That's ludicrous." "Oh, this is futile." "No, it isn't." "I came in here for a good argument." "No, you didn't. You came in here for an argument." "Well, argument isn't the same as contradiction." "Can be."


"An argument is a connected series of statements intended to form a proposition. Contradiction is merely the automatic gainsaying of anything the other person says"

"No it isn't!"



I’ve had enough of this.


Oh I'm sorry, this is abuse! Yes, you want arguments, next door.


You just made my day...


Comment of the year.


This made me chuckle.


I agree with the spirit of your comment, and I am extremely excited by these results. However, I think the history of fusion has shown us that the cynics have had a much better track record than the fusion optimists, haha.

My very uninformed opinion (nuclear physicist by training, but not specialized in fusion, lasers, or plasma physics) is that we’re still 20 years (haha) away from fusion energy making its way into the power grid. And that is assuming this result (or other things, like the relative instability of global energy markets lately) causes an increase in funding for the field so that they can solve all the pesky engineering issues related to efficiency, reactor lifespan, reliability, cycling speed, etc.


Also, this assumes we get 20 years and that the science budget will not be eaten by endless emerging crises and wars.


To be fair, fusion technology is a strategic imperative. The first nation to master it will quickly enjoy de facto energy independence. Given that many of the crises will likely be energy-centric, we may see more investment in the space rather than less, especially if visible progress is being made.


I think I’d switch that from “quickly” to “eventually”, or “have a head start to” - we could get grid independence “relatively” quickly if the government subsidized it (I highly doubt first Gen fusion competes with natural gas or solar cost-wise), but a large amount of energy is used in transportation, home heating, etc.. Until those become fully electrified you’re still stuck in the fossil fuel economy.


True, I meant "quickly" on a relative scale. One advantage the 1st gen fusions would have is immunity to the supply shocks of fossil fuels and the intermittency of solar/wind. Plus we have workable electric vehicles and every home that has fossil-fuel powered heat by definition has a connection to the electric grid.

It wouldn't happen overnight, but I can think of few things that would kickstart the electrification of everything better than functional fusion power plants.


There is no more reason to believe that about fusion than about fission. Fielding practical fusion would cost more than fission, and fission is not competitive today. (Some people are spending others' money trying to make fission competitive.) Unless somebody comes up with a copious supply of cheap tritium, it can have no substantial effect here, though it might be useful for outer solar system exploration.

But building out solar, wind, and storage will very predictably achieve energy independence, for radically less expense. No breakthroughs needed, but gratefully applied where found.

Can fusion power generation be made to work cheaply? Each day the question becomes less relevant.


Fusion could end up being a useless waste of effort, not a strategic imperative.


People have been burned time and time again by scientists overhyping things in the last 10 years. Combine that with the replication failures across nearly every scientific field, then look at the extreme amount of business fraud in the last 10 years at places like Theranos and FTX.

Hackernews is not infested with reflexive contrarians.

Hackernews has healthy amounts of skepticism and doubt. Extraordinary claims require extraordinary evidence.


When people use the word "reflexive," they're talking about things like conflating business hype designed to attract publicity and VC capital with a press release for a major scientific paper from NIF. I don't think it's unreasonable for HN to hold itself to an understanding of these things. If you actually want to critically examine evidence then you must necessarily read the paper before posting.


Bodied, and deservedly so.

Bringing FTX into a discussion of nuclear fusion to justify skepticism is parody-worthy.


It's an example of the latest in many high-profile scams. If you think it's parody, that's fine, but you cannot deny the public mood is souring on big promises without big results.

FTX is simply the latest in a series of media-empowered, EXTREMELY high-profile scams. You can easily put a thousand other high-profile companies or claims in there.


"Reflexive contrarians" in the context here was being used as a pejorative rhetorical trick to broadly dismiss valid doubts people have about this research.


What specific issue do you have with the methodology and why? What specific scientific criticism do you claim is being silenced? Merely feeling strongly about this work contributes nothing to the discussion.


Note that there's multiple bits of hype compounding on each other. The scientists hype it up a bit, the University PR guys do it a lot more and the popular press goes nuts.

The scientists are like 10% to blame here.


Look, if a scientist at LLNL is excited about it, then there's a conflict of interest here. The fact of the matter is that there is a high likelihood that inertial confinement is a dead end, because as far as I can tell there is no realistic plan to harvest the produced energy, which at least some of the other designs have. The bar is literally higher in other branches of fusion research (and they too are getting taken to task for reporting plasma Q values instead of estimated plausible total yields). Until someone at least starts building a model of how to collect this energy, high levels of skepticism are warranted.


Agree with the first sentence. I worked at a couple of national labs, and the number one priority is to keep the lab open by justifying the flagship project. NIF has a long history of disappointments, so it's nice to see some success, but it still isn't clear building this thing was justified. The main rationale during the planning stages was "stockpile stewardship", which loosely translates into "making jobs for nuclear weapons scientists even though we aren't building any."


What I encourage people to do, and what I was encouraged to do by a professor, is to find the value in things. Yes, the thing, any thing, has great flaws, risks, is an imperfect match, etc. That goes without saying, and is in some respects pointless to say - we can stay in place without going through the effort of researching something. It's the value in things, and finding that value, that moves us forward.


I try to live my life this way. People think I’m an optimist, but really I think the world is mostly BS and I try to acknowledge the good things. It works for me.


> It’s insane how much cynicism I’m seeing here. [..] This website is seriously infested with reflexive contrarians and it’s not healthy.

The problem is that fusion "breakthroughs" have been hyped by the press for many decades now. After a few such articles gets people excited and then reality crushes the hype, people learn to dismiss every new story as yet another inconsequential thing blown out of proportion.

I'm commenting about the coverage of fusion in general, not about this particular thing. If it is actually a big deal, great!


It isn't just hyped; popular reporting on fusion power hasn't been very accurate. It doesn't help fusion power that things like this are trumpeted as a breakthrough when the reality is that NIF was never a viable way to generate power in the first place.


I am skeptical, not cynical.

When I read what USDOE announces, I hope to be less skeptical.

The basis of my skepticism rests on having written a term paper titled 'Nuclear Fusion, Infinite Energy for the Future' in 1982, and after the semester sharing my 'it's only 20 years away' enthusiasm with my father, a PhD scientist working for the DoD. Hence it's forty years since I first heard 'fusion is always 20 years away.'

Of course I don’t know any LLNL scientists but don’t question their or your sincerity or motivation.

The difference between those incentives and the incentives of financially oriented news reporting doesn't make me less skeptical. Their mandate is to present potentially market-moving ideas before the market can move.

And because I lived through Pons-Fleischmann. Which is to say I have forty years of experience with reports…I mean, I see excitement for tokamaks and I wrote about them in 1982.


I don't believe it's contrarianism. Sure, it's an interesting science experiment, but it has no viable way to generate power. The lasers need to be more efficient by a factor of 100x in the best-case scenario (it depends on the specifics of how they calculate net gain). Then you probably need to increase that by another factor of 2-5x, even assuming you have a way to convert that thermal energy to electricity.

No one has any idea how that would ever be viable; other fusion alternatives at least have a way to accomplish thermal transfer from the reactor. Then you somehow have to figure out how to build a financially viable power plant. Oh, by the way, the lasers need to fire 1000x more for that. No one has any idea how that would work either.

There is a reason no one but a national lab with massive financial resources and an interest in fusion reactions has done this before; it's interesting, but it doesn't produce any kind of viable power source.

Edit: The NIF was proposed and designed as a means to ensure the viability of the nuclear stockpile. It and its French equivalent were never understood as somehow prototyping a fusion power plant, for the reasons laid out above. The press reporting here is just not accurate.
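Multiplying out the shortfalls listed above makes the scale of the problem explicit. A minimal sketch, treating each factor as an independent multiplier (the 100x, 2-5x, and 1000x figures are this comment's rough estimates, not published requirements):

```python
# Rough multipliers between the current experiment and a power plant,
# per the estimates in the comment above (not published requirements).
laser_efficiency_gap = 100   # "more efficient by a factor of 100x"
thermal_conversion_gap = 3   # mid-range of "another factor of 2-5x"
shot_rate_gap = 1000         # "the lasers need to fire 1000x more"

# The two energy factors compound; the shot rate is a separate axis.
energy_gain_shortfall = laser_efficiency_gap * thermal_conversion_gap

print(f"energy gain shortfall: ~{energy_gain_shortfall}x")  # ~300x
print(f"repetition-rate shortfall: ~{shot_rate_gap}x")
```

So even granting every estimate the benefit of the doubt, the combined energy gap is a few hundredfold on top of a thousandfold repetition-rate gap, which is why "interesting science experiment" and "power plant prototype" are such different claims.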


To an external non-technical observer, this is about as exciting as me hitting a clean compile in the scale of things. It really makes me happy but no one else cares until the product arrives.

I'm excited for both for reference.


I'm not sure when it happened, but this place has become a lot less inquisitive and a lot more dark in recent times. Possibly its correlated with growth, but it feels like something else.


Think it's correlated with growth. I've seen significant post-pandemic degradation on all major social media platforms I use (mostly here & Twitter), along with large increases in volume.


It isn't cynicism. It's a reality check:

1. Energy output != power generation. At the end of every fusion reactor is boiling water to turn a turbine to generate electricity. There's a limit on efficiency and we still aren't there yet;

2. Much like all of nuclear power (fission included) we brush over capital costs and focus on operating costs because that tells a much better story.

3. We still have energy loss from escaping neutrons;

4. We still have container damage to contend with due to neutron embrittlement.

Even the article claims (and this is optimistic) that commercial fusion power generation is "decades away".

Much like FTL travel, we get suckered into unwarranted optimism because we want it to be true, particularly with the fuel abundance and (no) waste issues. We also fall into the naive trap of thinking that if stars can do it, it must work. But what contains stellar nuclear fusion is gravity.

I'd argue there's still way too much optimism. Pointing out these issues doesn't make you a contrarian. It makes you a realist.


> This website is seriously infested with reflexive contrarians and it’s not healthy.

I think that's true. But I also think there is a lot in the way of breathless PR around science topics both from university press offices and lower-end science news outlets. Especially around fusion, which has been 20 years away for a lifetime. So I get why people are going to be particularly skeptical.


Reflexive contrarianism is far healthier than blind credulity. Skepticism should be the default state, especially for claims of amazing scientific breakthroughs.


The older I get the more I think it's just counter signaling that one is smarter than whoever did this work, which is almost certainly not the case.


I think it's more often just counter signaling that one is better informed or more realistic than whoever wrote the press release or media article, which may very well be the case.


Yeah I can't help but imagine these same folks would be more than happy to produce a list of common logical fallacies.


[flagged]


That's a pretty arrogant take based on zero actual evidence. When "in my experience" is "randoms on the internet who I've never met and occasionally argue with online", I don't think we can draw many conclusions.


I know for a fact that many people who worked on this are also on HN. You should know that scientists see posters on this site mostly as representatives of the software engineering world. Seeing this kind of sneering attitude so frequently on display here is pretty embarrassing and casts the whole profession in a poor light.


The average HN reader who is probably a generic software engineer in their 20s/30s will know more about nuclear fusion than scientists at LLNL?


Listen man, remembering the names of all these JavaScript frameworks is hard!


They think they do, but don't...I think that's the point that's trying to be made here


The average HN reader strikes me as someone who can talk big, spout buzzwords, and play skeptic, but make them actually solve something and they crumble instantly.

Maybe the average HN reader is smarter than the average reddit reader (very slightly if at all), but they're not more useful than someone who actually did work and shared it publicly.


>The average HN reader strikes me as someone who can talk big, spout buzzwords, and play skeptic, but make them actually solve something and they crumble instantly.

I'm sure many have experienced the phenomenon where they read some HN comments that sound authoritative and grant them that level of credibility. And then they get into a discussion on a topic they may literally be an expert in, and it's made glaringly obvious that the person they're in a discussion with has only a superficial understanding, yet takes the same authoritative tone.


Surely this is sarcasm?


In my experience the average HN reader is slightly above the general population average.


Neither is particularly healthy. Either staying level-headed and analytical or simply admitting ignorance would be healthier. Skeptical/Gullible are two ends of the same crutch for when we are unable or unwilling to do either.


I think skepticism is healthy, rational, and intellectually economical, especially when we're talking about popular media stories. The skeptic isn't harmed by dismissing grandiose headlines about scientific breakthroughs that are selling a false narrative 99.9% of the time (yes, real science is happening, but the media's narrative about the impact of research is pretty much always false). In the cases where someone is a little too dismissive, they might end up looking like an idiot one day, but layperson skepticism has no bearing on the validity of the claim; no amount of skepticism can overcome the reality on the ground, and if it's real it doesn't matter what anyone believes.


I agree, but what I'm seeing here down in the comments isn't merely skepticism, but outright dismissal.


> This website is seriously infested with reflexive contrarians and it’s not healthy.

I can't imagine what it is like to be in their heads. Even for things I am skeptical about, I still want them to be true if they are truly transformative. My worst case scenario is being cautious, but never, ever, negative.


Lotsa reasons.

(1) We are used to the same "news" story being cycled again and again. I think a year ago we heard about a previous breakthrough in ignition. When I hear a story like this my first instinct is that the old story has been recycled and I'm not sure that there is any actual news.

A few months back it was announced that scientists had discovered a black hole that was nearest to the earth and it still gets posted to HN which makes me wonder if they discovered a closer one.

(2) For a while there have been two parallel tracks: one of very slow development efforts at LLNL and ITER which might yield a power source in 50 years, and another of firms from Lockheed Martin to scrappy startups who are promising to build a "Mr Fusion" tomorrow. There are still memories of the Pons & Fleischmann affair from the 1980s and a strange subculture of LENR activists who claim they will sell you a fusion power source today. One could easily assume "fusion is the new blockchain" in this climate.

(3) Fusion research has proceeded with no direct line to a practical power source for a long time; the sharpest critique you hear is "the point of the NIF is to do subthreshold tests of nuclear weapons, not develop a power source"

(4) Fusion is really hard. Even if 1-3 aren't enough to make you dismiss the whole thing, they might have to get the energy output up 100 times and increase the shot rate 500,000 times to build a real power source. People will point out that ignition is a big threshold and it might not be so hard to increase the energy output from here on out, but we have a long way to go.


I'm just glad these scientists are working on something other than nuclear bombs.


The entire point of LLNL is to study nuclear bombs.


The original point of LLNL was to develop nuclear bombs. There is such a thing as "mission creep", and also the challenge of maintaining the ability to develop bombs in the future if we need to.


Studying nuclear bombs is still the point. The press releases about fusion "energy" are just for appearances' sake. The methods they employ are useless for energy applications. They're just H-bomb simulations.


> This website is seriously infested with reflexive contrarians and it’s a not healthy.

What could be done about that aside from expecting people to just... be better? I think the shape of these forums induces those kinds of comments, even if the community and moderators make a real effort to uphold higher standards. And I think if I encountered the same people in a different kind of forum then I might have a higher quality conversation. Heck, my own comments would probably be a lot more constructive!

Real world example of what I'm thinking: I have a neighbor over one fence who has very different political views to mine. We have perfectly civil conversations in which we're both actually really engaged and trying to understand each others' perspectives and experiences, and not just keeping the peace by avoiding difficult topics. It feels like effort we put into the conversation is rewarded.

I can't shake the idea that there might be "one weird trick" (okay, maybe a handful used together) that could make it more rewarding to put more effort into online conversations on forums like Hacker News or Reddit. One I've wanted to try for a while is to recreate something along the lines of Slashdot's moderation system, but with room for a meta-conversation to take place in "moderation space" (in which all community members could participate) and for there to be opportunities for people to refine their comments in response to feedback — and for doing so to be the norm.

Maybe it's not that simple. That's okay, too. But I've seen different moderation strategies around the web produce very different results, so it seems to me that there should be plenty of room for experimentation, and a lot to learn from doing so.


I'd say most of the problem here is that viewpoints are doled out as simple pithy statements. Half of the comments on this thread are one-sentence statements saying the facility has 200x to go before it's truly net positive.

You get more content out of a 30-second discussion with your neighbor than that. Those comments are genuinely worthless; they don't talk about things like:

1) What are the parts of an inertial confinement fusion based system which are difficult and which are missing today and would need serious investment

2) What is the likelihood that the power output observed here could double or more with other scale factors?

3) What are the net system costs once a plant is built? Is the fuel cheap or expensive?

Etc. It's fine to be contrarian, but most of the contrariness on most internet forums is of the most basic, shallow kind that is defeated in a moment by any serious thinking.

The short answer to being better? Posts with more in-depth content. I seriously think HN should consider banning the pithy one- or two-sentence posts ("they still would only get 1/4 the power") you find all over the place.
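For what it's worth, the gap those one-liners gesture at is easy to sanity-check in a few lines. The 2.05 MJ delivered and 3.15 MJ yield are the reported figures; the ~300 MJ of wall-plug energy to fire the lasers is a widely repeated rough estimate, not an official number:

```python
# Back-of-envelope gain figures for the Dec 2022 NIF shot.
# Laser energy on target and fusion yield are from the announcement;
# the wall-plug energy is a commonly cited rough estimate.
laser_energy_mj = 2.05    # MJ delivered to the target
fusion_yield_mj = 3.15    # MJ of fusion energy released
wall_plug_mj = 300.0      # MJ drawn from the grid to fire the lasers (estimate)

q_target = fusion_yield_mj / laser_energy_mj    # "scientific" gain, ~1.54
q_wall_plug = fusion_yield_mj / wall_plug_mj    # engineering gain, ~0.0105

print(f"Target gain:    {q_target:.2f}")
print(f"Wall-plug gain: {q_wall_plug:.4f}")
print(f"Shortfall vs. break-even at the wall: {wall_plug_mj / fusion_yield_mj:.0f}x")
```

So "net gain" and "two orders of magnitude short" are both true, depending on where you draw the system boundary. That's exactly the kind of distinction a one-liner skips.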


Your conversation with your neighbor has no meta-conversation going on.

Online discussions "between two people" merely mimic a conversation so the audience (of potentially thousands+ of people) can learn and be swayed.

Online conversations are inherently broadcast, so the stakes are too high to acquiesce or make concessions for whoever's willing to actually take the bait and engage on "important" topics.


That's a good point. I guess I hadn't really thought about how performative discussions on public forums are. Maybe embracing that more deliberately somehow could produce a worthwhile difference?


Being able to bring the audience "in" on the broadcast nature of these (presumably authentic) conversations on X contentious topic would be an intriguing problem to solve. With AI coming more mainstream, an AI analysis of conversations might be where you could shine, including calling out astroturfing (like spam is detected today).


Part of it is that we read about "breakthroughs" in diverse fields only to see nothing come of them. Past experience creates valid doubt. Also, as exciting as this might be, we are nowhere near a practical application.

Overall, I'm glad there are still points of excitement and we haven't come to a halt.


Seeing headlines like this every year or so will do that to you


It’s ok to be cynical about things that are massively overhyped. This development is an important milestone, but it is nowhere near what it is being reported as.


I agree that the discussion generated from this article is not what we want on HN, but I don't think it is fair to criticize the comments as reflexive contrarianism when they are simply and correctly pointing out that the claims being made in the article are misleading at best. And these aren't nit-picky details about side issues - they are the core headline claims, which aren't further clarified or nuanced in the article text, so guidelines to not "pick the most provocative thing ... to complain about" aren't applicable IMO. Without posts correcting the article, many readers would come away with a wildly false understanding of what actually occurred, which isn't what we want either.

I think the best way to increase the quality of discussion for research results is to avoid posting misleading and hype driven coverage, so the discussion can then focus on the actual research results and their implications, rather than on the poor coverage.


We're hackers, engineers. We poke around for problems before there are problems and we pry open the black box to make sure it's not just filled with Bullshit. If you want to unquestioningly lap up everything that's offered to you, then I've got some ocean-front property in Afghanistan I'd like to sell you.


That’s definitely how many people here see themselves, but excessive scepticism can also be a problem, something which is overlooked by this crowd.

If you don’t push and help the many small steps that come before the big leap, many big leaps will never become feasible.


Over-hyping the small steps as big leaps is the problem. If the scientifically literate people here are sick of it, imagine what the voting public thinks.


“First time in history fusion releases more energy than is put in” is a big leap. The fact that it’s still a technology in its infancy and decades away from actual use doesn’t make it any less impressive.


So the two options are to believe everything unquestioningly or be suspicious/cynical about all new announcements?


I think the cynicism is linked to the cycles of bubbles.

When it was all on the upside, inflating the bubble, there was a fair amount of hero worship here for Zuck and others. People were talking about self-driving cars being leased by the minute and changing the world, all with a straight face. Google paid an engineer over $110 million because he was going to lead the effort to build a fully autonomous self-driving car... As an industry, we've sort of failed on that one.

AI/ML was going to lead to mass layoffs as we "automated" everything, and there were companies just pouring money into anything related to it to avoid being left behind. I think I heard at a conference over the summer that 90+% of all ML/AI projects fail to make it to production; that's brutal - like, half I could see, but 9 of 10?!? Even if you're getting paid tons of money to do that stuff, wouldn't you want to actually achieve some success?

Social media has sort of failed us too: the real media got involved and sort of took it away, and then the Russians and Chinese have been using it to tamper with our elections and our ability to practice democracy. The internet is "decentralized", but just try to do anything without Google or Facebook or Amazon or the others. Or remember how the gig economy was going to change it all, or how everyone was going to learn to cook gourmet meals from Blue Apron and all the carbon used to move boxes of ingredients around was never going to be a big deal...

Since everyone seems to be convinced a recession is going to happen, it's going to take one to get things righted and start the next bubble cycle.

It's always based in hype. Every handful of years the geeks and nerds think they're going to take over the world again, maybe we'll do it next time.

In the meantime, any and every breakthrough with fusion is awesome. I'm a geek/nerd so don't believe my hype, but when we crack the fusion nut, we will change the world.


> This website is seriously infested with reflexive contrarians and it’s not healthy.

The initial flood of comments is always like that, because they are low-effort dismissals. The first 5 comments on every story could probably be auto-flagged.

The better stuff usually rises to the top eventually.


I think there are a couple different types of cynicism and one might be more justified than the other.

The first one I see is along the lines of "This was only net energy gain in the plasma and not overall so it shouldn't be called a breakthrough". The net energy gain in the plasma is still a huge step and rightfully called a breakthrough.

The second one is along the lines of "These are just initial results and the article says the data is still under review". This one I totally get. Replication of scientific results and accounting for all sources of error is a real big deal. The NIF had an experiment last year where they were able to achieve an ignition reaction but were unable to replicate it.


My experience is that if scientists are excited about it, then it's probably not a big deal to non-scientists. It may be a small piece of a big deal in a few decades.

Don't get me wrong I respect all the effort it takes to do something truly new, inventing technologies that previously didn't exist with the height of what we can produce today, and every step forward is a triumph. But is tomorrow's announcement going to lead to a step-change in anyone's life before my infant daughter goes to college? I doubt it, and I have work to do. I'm happy to be proven wrong though!


"LLNL - if they’re excited about this then it’s a big deal"

Just because they're excited doesn't mean it's a big deal, nor any guarantee that it works or ends up being practical.

I've heard 'exciting' news about fusion many, many times over my entire life and essentially all of them have come to nought. Or after the excitement settles down over said development we still find that fusion ends up being that magic number of 40 years into the future.

I've even worked in the nuclear game but I don't expect to see my home powered by fusion energy in my lifetime, unfortunately.


It’s been clear like that for a while. Crypto threads are infested with nonsense, ignoring anything that’s even distantly related and ignoring any breakthroughs. Any new tech is poo pooed immediately.


We’re cynical because 100% of the announcements from the LLNL are essentially government propaganda.

People laugh at Russian TV, but to a lesser extent all western governments do the same kind of thing.

The LLNL has one job: research nuclear physics for maintaining the bomb stockpile without actually blowing any up in live-fire tests.

Anything else they say is just there to make the public feel good about the billions being spent on weapons research.

There is zero — repeat — ZERO chance that the fusion approach used by LLNL will ever be used in any sort of energy production.

That’s not what their setup is for.


My father does radionuclide metrology, and every time there is a breakthrough at all, be it JET or something else, there's total rejection that it was a big deal or "real", or that it's "not a large enough net gain reaction" to matter. It's wild.

I think scientists are humans after all, and they (much like people who rejected bitcoin when it was at $3 and now have to justify their pre-held beliefs) have to justify why they didn't think it was possible or "real", even in the face of multiple fusion breakthroughs.


> This website is seriously infested with reflexive contrarians

hey, at least all of them are highly educated and extremely correct about things. read about it on their blogs. (sarcasm enabled for this reply)


downvoters of this comment, be sure to leave a link to your blog, too


"Announcing a breakthrough" without replicated results is exactly what made cold fusion a taboo subject in the first place.

We are not 'reflexive contrarians' for going "I don't believe it until a lot of separate research groups show the same results". The whole point of the scientific method is to not believe somebody just because you personally know them or they are "respected". Their work has to be replicated for Science to take it seriously.


I’m a physicist and it’s absurd how much career concerns push us to overhype even the most incremental research effort. I’m not surprised the public are sceptical.


> This website is seriously infested with reflexive contrarians

no it's not

( /s but also many of us have seen enough 'fusion breakthrough' and 'battery tech breakthrough' and 'medical breakthrough' and 'AI breakthrough' announcements that it's difficult to give anything much credence without at least a production prototype showing real world performance.)


This isn't practical and it isn't exciting. Shoving unexciting breakthroughs in everyone's face is part of the problem with science funding today. It's tough for actual breakthroughs to get traction in the news cycle because of all these underwhelming duds.


People get cynical when they’ve been duped a hundred times on a hundred different things. I’d blame media more than users.

But yeah, certainly seems like this time is different. Really hope we start seeing more breakthroughs after this as it’s great news.


HN comments are not thinking, doing anything, or building something. It is a place where you gain attention and karma from something other than constructive acts. People post constructive things, then commenters vie for attention. If you look at threads, the top "comment" on them is often about a completely different topic. And then they mostly go downhill.

Hacker News is a good source for interesting posts and ideas. The comments are mostly worthwhile for watching how a social machine produces very weird stuff. It is not the people who are contrarians; it's a function of the machine.

Zeynep Tufekci talked about how twitter affords outrage and the Arab Spring, but did not afford a way to do anything constructive with that outrage. (Twitter and Tear Gas, available as a pdf). HN commenting system affords .... what you see here.


> The experiment actually created more energy than expected and damaged the sensors.

Who else in their mind's eye sees smoke and sparks in the experimental facility and control room, and scientists and engineers whooping with joy ;)


Hacker News is absolutely totally broken with cynicism. It has been getting worse for years now.


"Exciting" for individuals within the field often does not translate to "exciting" for everyone else. It's quite reasonable to think there's a good chance this is not the beginning of a "practical for energy generation" fusion revolution.

It is very interesting, but in the same way that advances in particle physics are interesting.


An example of the context in which I want to tamper excitement comes from a post by a journalist writing for the FT, an outlet that is (relative to its peers) usually quite matter-of-fact: https://twitter.com/thomas_m_wilson/status/16020118886526320...

> SCOOP: Net energy gain in a fusion reaction has been a holy grail in science for decades. Now I’m told US scientists have done it. A massive breakthrough with revolutionary potential for clean power. US Energy Secretary to hold a press conference Tuesday: https://www.ft.com/content/4b6f0fab-66ef-4e33-adec-cfc345589...

Instead of particle physics, perhaps a better comparison would be to quantum computing "breakthroughs" that come out from time to time. Within the field I'm sure there are breakthroughs that inch us closer to something useful (useful in the way it is described in these articles, solving currently unsolvable problems, etc.), sure, but we are so far away from something useful that these inches are ultimately quite underwhelming to the general public (people like me).

By all means, I will occasionally read and enjoy great science reporting on these topics, but I have been conditioned to massively downplay the general significance of such news, and I think it's quite well justified, and not mere cynicism (cast as a negative).


"edit": I meant temper, of course


> “Initial diagnostic data suggests another successful experiment at the National Ignition Facility. However, the exact yield is still being determined and we can’t confirm that it is over the threshold at this time,” it said. “That analysis is in process, so publishing the information . . . before that process is complete would be inaccurate.”

From the article!!!


People are cynical because the world is already feeling the effects of climate change, the technology exists today to move the grid to zero emissions, and because the work required to do that is a quotidian, slow-and-steady slog, it gets ignored in terms of both funding and mindshare in favor of things like nuclear fusion experiments.


Are you claiming a key reason that low- or zero-emissions technology hasn't been implemented is that scientists are wasting time on nuclear fusion experiments? Not the entire political spectrum that believes global warming isn't real, is overstated, or is some sort of conspiracy?


Maybe we paid attention when our parents told us the tale of the Boy Who Cried Wolf.


People have just become desensitized to clickbait. It's mostly the media's fault: they always use titles like "cancer cure discovered" to get more views and thus more money. The viewers see a thousand articles like this and keep getting disappointed, to the point that a real cancer cure could be discovered and no one would believe it. tldr: cry wolf


It doesn't matter if we find an infinite energy source. It will just shuffle the powers around. Nothing will really change. Humans will shift their fight to something else and inequality will still be the source of most of our problems.


That's a brilliant phrase, a "reflexive contrarian". They just go the opposite of whatever is said. I've been thinking about this behavior of late; great way to characterize it.


I do not think I can meaningfully increase my levels of credulity (nor my skepticism). I strive to communicate my thoughts accurately. Given those two points, how is it not healthy?


It's mostly British folks in those comments as well. If you are looking at real estate in Britain, now you know what you are dealing with.


You’re just a reflexive (reflexive contrarian) contrarian


It's not reflexive contrarianism as such; it's that the science press has historically been so, so bad that cynicism is the only healthy response. Think of the last 5 things you've seen in the science press: which ones were overemphasized? Which ones were exaggerated to the point that they didn't reflect anything meaningful about the actual result? And thinking back on the press releases over the years, what percentage of what you've read ended up having an actual effect on the world? Add to that the fact that this is about fusion breakthroughs, something that has been fraught with complete disinformation from the science press since the late 1940s. Of course people here are going to be cynical about it.


Good news! FTL travel when?


Right after the reusable fusion rockets.


do I just wait until this gets walked back or...?


When can I pick up a Mr Fusion at Home Depot?


Is overcoming fission's political problems harder than fusions technical problems?


The 457th "breakthrough" in fusion this year...


I would say there have been a handful of important milestones this year, but this I would consider a breakthrough. Most of the other stuff is overhyped for sure.


good, keep em coming.


10k more and we might actually make some progress. Just 20 more years!


From the article.

"Although many scientists believe fusion power stations are still decades away, the technology’s potential is hard to ignore. Fusion reactions emit no carbon, produce no long-lived radioactive waste and a small cup of the hydrogen fuel could theoretically power a house for hundreds of years."

Not sure if you were expecting things to progress faster, but if it "only" takes 20 years, that would be insanely fast and world-changing.
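The "small cup of hydrogen fuel" claim in the quoted paragraph actually survives a back-of-envelope check. A rough sketch, where the fuel density and household consumption figures are round-number assumptions (the 17.6 MeV released per D-T reaction is standard):

```python
# Rough check of the "cup of fuel powers a house for centuries" claim.
# Inputs marked "assumed" are round numbers, not measured values.
MEV_TO_J = 1.602e-13                  # joules per MeV
E_PER_REACTION = 17.6 * MEV_TO_J      # D + T -> He-4 + n releases ~17.6 MeV
PAIR_MASS_KG = 5.03 * 1.66e-27        # one D plus one T nucleus, ~5 atomic mass units

energy_per_kg = E_PER_REACTION / PAIR_MASS_KG   # ~3.4e14 J/kg of D-T fuel

cup_kg = 0.25 * 0.2                   # assumed: 250 mL at ~0.2 g/cm^3 (liquid D-T)
house_j_per_year = 10_000 * 3.6e6     # assumed: ~10,000 kWh/yr household

years = cup_kg * energy_per_kg / house_j_per_year
print(f"~{years:.0f} years")          # a few hundred years, ignoring all losses
```

A few hundred years, before accounting for any conversion losses, so "hundreds of years" is the right order of magnitude.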


> produce no long-lived radioactive waste

It's important to note that while this is technically true, it's mostly irrelevant. Sure, there's no material that will remain radioactive for the next 10k years, but instead you get much more highly radioactive material that will emit high doses for a "short" hundred years or so.


It's worth noting that the last 2 generations of fission plants were guaranteed to produce no waste, to be cheap, efficient, reliable etc. The unpalatable truth here is that we have no idea what fusion power will look like until we have built a few. The quoted section made me laugh as it's easy to be zero carbon when you don't actually exist... :)


Can you elaborate more about the guarantees about no waste?


Sorry, I was actually making a joke: fusion power has been described as "a decade away for the last 50 years" which I think sums it up pretty well...

https://www.engineering.com/story/why-is-fusion-power-is-alw...

The potential is hard to ignore, but that doesn't mean the potential will ever be achieved. This (like crypto currency) is the realm of vapourware I am afraid. Always just around the corner. :(


>vaporware

Have you never heard of ITER? It's set to power on in 2025.

https://en.wikipedia.org/wiki/ITER


That's sort of my point: we've had big projects that would totally, definitely work this time every few years since the 90s. Will ITER work? Maybe. Would it be the first to fail (or even the 10th) if it doesn't? No. Per your own link there are literally hundreds of other "reactors".

It's the same as crypto or emissions reductions.


What value are you contributing to this conversation?


I remember hearing about ITER back in school, a long time ago, and being told that they were just about to finally assemble the thing now.

That's pretty much the definition of vaporware, but maybe it will actually go the route of Duke Nukem :)


Construction began in 2007 and it's moving along according to schedule.

https://en.wikipedia.org/wiki/ITER#Timeline_and_status


It is being assembled right now, it is just taking a bloody long time to do so.


What is "zero-carbon" in this context? No graphite control rods?


I think it's there for people who may not be familiar with what fusion energy is, so they can understand that it's a potential climate change solution.


It means no fossil fuels required to sustain the reaction, and no carbon emissions resulting from it.


It’s unnecessary greenwashing hyperbole. Of course there will still be carbon emissions from the production of the reactor parts and the sourcing of fuel ingredients. The potential benefits of working fusion are far greater than carbon worries, and the media sells it short with narrow-minded labeling.


Just because you understand the impact of fusion on carbon emissions, does not mean

>It’s unnecessary greenwashing hyperbole

OP's question provides evidence that not all people understand the carbon benefits of this technology.


A fusion reactor does not have control rods. It has a magnetic containment field around a plasma which is something like 10x hotter than the sun. If you put a control rod in there it would instantly vaporize.


At the risk of being pedantic, if this is the LLNL NIF, then it's ICF, not MCF, though putting a graphite rod at the heart of a laser-driven thermonuclear event probably looks about the same either way.


There are no control rods in fusion reactors.


Solves no problem.

Fusion plants have exorbitant feedstock price volatility and are only marginally smaller than a fission plant, despite square footage not being the scope of the world's energy problems today.


This is no different than the hundreds of "fusion breakthroughs" we've been reading about over the past 20+ years. Progress is good, sure, but we're tired of celebrating small incremental gains.

A leap forward or two might be worth celebrating along the way, sure, but we're at least 3 orders of magnitude away from actually generating net power here.


I don't yet understand why this is better than Fission. Surely Fission provides us with unlimited carbon free energy (given enough fissionable material).

What will Fusion give us that Fission can't already? Is it safer perhaps?


I think it's just like thorium molten salt reactors. It's a new awesomer kind of nuclear energy that doesn't have any of the baggage of fission!

Certainly, fusion does have the big advantage that it makes far fewer Curies of radioactive material per kWh as it operates. That has been the main driver of nuclear fission safety and waste issues.

On the other hand, there are good arguments suggesting that conventional fission has been reasonably good at containing and controlling the radiation, such that it's among the safest and cleanest forms of energy known already. But the PR issue is a hard one, and people don't think like actuaries.


I think the main difference is safety. Simplifying / IANAPhysicist, but you can't get a runaway chain reaction with fusion, and the reaction tends to just burn itself out if you shut it off.

That being said, fission is already pretty darn safe. But the public perception of it is not good.


Issues of potential output and safety considerations aside:

> Surely Fission provides us with unlimited carbon free energy (given enough fissionable material).

The crux of the problem is that there is a limited supply of fissionable material. If we manage to survive as a species, our energy demand will continue to grow, and one day we would hit a hard cap, limiting what humanity, as a species, is able to do.

As a very very rough estimate, if we burnt through all the fissionable material that we have available on earth, it would be about enough energy to launch the mass of Mt. Everest into orbit. Long term (as in, many generations from now) we will need more energy than that.
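For anyone curious how such an estimate might go, here's one way to set it up. Every input below is an assumption chosen for round numbers, and the reserve figure in particular can move the answer by orders of magnitude (extracting uranium from seawater alone would multiply it ~1000x):

```python
# Order-of-magnitude sketch of "all fissionable material -> mountain to orbit".
# All inputs are assumptions, not authoritative figures.
ORBIT_ENERGY_J_PER_KG = 3.3e7   # ~33 MJ/kg to reach low Earth orbit
                                # (kinetic energy at ~7.8 km/s plus altitude)
FISSION_J_PER_KG = 8.0e13       # ~80 TJ/kg for complete fission of uranium
uranium_kg = 8.0e9              # assumed: ~8 million tonnes of conventional
                                # uranium resources, fully bred and fissioned

total_energy_j = uranium_kg * FISSION_J_PER_KG
liftable_mass_kg = total_energy_j / ORBIT_ENERGY_J_PER_KG

print(f"Total fission energy: {total_energy_j:.1e} J")
print(f"Mass that energy could put in orbit: {liftable_mass_kg:.1e} kg")
```

With these inputs you get something in the 10^16 kg range, which is mountain-scale (published estimates of Everest's mass are themselves only order-of-magnitude figures), so the parent's picture holds up give or take a couple of zeros. The real point stands either way: it's a finite number, and fusion fuel effectively isn't.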


> I don't yet understand why this is better than Fission

Realistically, today, it's only better because of decades of lobbying and fear-mongering propaganda around fission. There is no reason why nuclear energy couldn't be the vast majority producer of all electricity in the world while massively lowering environmental damage and loss of human life.

Long term, fusion might be better because it can produce a lot more energy and be safer. I feel like the safety improvement is negligible however compared to modern fission reactors that are properly maintained and governed.


Yes there is a reason: the high cost, the unsolved waste issue, and the inherent strategic danger such a centralization of power production would pose. You'd only have to hit a few large power plants to take out the electricity of an entire country. When done right, attackers can even create more destruction and chaos by initiating a meltdown.

Since you brought up lobbying, it's fascinating to me how many nuclear power fans the industry has created who are not informed by data and facts but are utterly convinced that nuclear power is the solution to all our energy issues.


The cost is not high if you consider the cost of pollution caused by other methods. And nuclear waste is absolutely a solved problem and is entirely unproblematic. These are thoroughly debunked talking points.

> attackers can even create more destruction and chaos by initiating a meltdown

This is not a feasible thing to do with modern reactor designs, and the same danger is present for infrastructure like dams or even just a big building.

> not informed by data and facts

Said by the person who's repeating misinformation


It's not feasible but also it's the same danger as with dams? Which is it? In reality, both nuclear power plants and dams have been attacked many times over the past decades. More decentralized power production has other disadvantages but clearly it can't be targeted as easily. The attacker would take out a single wind turbine or someone's roof PV. The effects would be negligible on a national level.

The cost btw is even higher than most people think, considering that energy companies aren't paying for most of it but taxpayers do - some not even born yet. But even without factoring in those future costs, as you suggested we do, nuclear power in its current form is among the most expensive forms of power production. Again, look at the data, not energy company propaganda:

https://en.wikipedia.org/wiki/Levelized_cost_of_electricity


More than 8 million people die every year from breathing polluted air containing particles from fossil fuel emissions. What would you put the cost of that at? Until we factor that in as a cost of carbon-emitting power generation, we cannot make a fair comparison of the cost of nuclear.


The pessimist in me says that some building(s) are going to burn down, one or more persons will be found with two bullet wounds as "obvious suicide", and that any and all supporting documents will be "lost".

Because we simply cannot live in a world where we are independent from the current power structure. They won't allow it.

Hopefully I'm wrong, I'd love to see progress in energy production that is actually sustainable.


That isn't how They would do it were They to set Their face against fusion.

They'd just make people afraid: Focus on how it's nuclear, how it's dangerous, and get the people to want it banned. A few laws in some key jurisdictions and it's over, and the scientists can rant as much as they want, it'll never be a problem again.

Of course, all of that partakes of the comforting illusion there even is a Them in the world.


This is like the "great-man" theory applied to scientific research. Even if this were to happen, I don't think it would matter much in the long term. The scientific community seems to independently come up with the same or similar solutions to the problems being concentrated on.


We'll see the same thing we see around fission. Lobbying and fear mongering. Politicians lining their pockets in exchange for delaying and blocking and refusing to fund fusion.



