jokteur's comments

I've also seen people arguing that we should never, ever build nuclear fission again because it is "bad".

What I want to say is that bad-faith arguments will always exist, and calling this a breakthrough won't significantly increase their number: those people would just find another excuse instead of the fusion one.


I see people reacting left and right saying "yeah, but the lasers used 300MJ" like a gotcha.

At the press conference, they were pretty clear about what they achieved. They stated multiple times that the figure was the laser energy in, not the wall-plug energy. They also said that the lasers weren't designed to be efficient in the first place, because the goal is to maximize scientific output. And they were pretty clear that many, many steps remain until we have fusion energy.

It is a significant milestone, and people are trying to downplay it by stating that fusion will never be feasible anyway, and that this is why we shouldn't be excited.


So, the actual achievement comes down to this:

> “The fusion reaction is heating the fusion reaction, which is making more fusions happen,” says Steven Cowley, director of the Princeton Plasma Physics Laboratory. “It’s like the fire has been lit. This is the first controlled fusion ignition that we’ve ever seen, and that’s spectacular.”

Well, for me it's debatable whether this is such a huge achievement, because the distinction between controlled and uncontrolled fusion is a bit academic in this case. In uncontrolled fusion (https://en.wikipedia.org/wiki/Thermonuclear_weapon), a nuclear fission primary stage generates enough energy to start the fusion reaction in the secondary stage. In the NIF, lasers (which have to be aligned to trillionths of a meter and damage their own guiding optics every time they fire) start the fusion reaction in a smaller pellet (which costs hundreds of thousands of dollars). So the only real difference between the two cases is that this H-bomb is smaller, much more expensive per amount of energy released, and explodes inside a chamber. And it's obvious to pretty much everyone who is paying attention that transforming this setup into a working, cost-effective fusion power plant is a very tall order...


I'm one of those people. But it's mainly various reporters I'm angry at, for misreporting and hyping up the story. I saw none of what you wrote in the news.

And well, this really doesn't seem like a very viable path to a power plant, though I'm all for more fusion research.


I wouldn’t be angry at the reporters. They don’t come up with this stuff by themselves. LLNL has a staff dedicated to public relations¹. The insertion of misleading phrases and ideas into news articles is a longstanding, deliberate practice.

[1] https://www.llnl.gov/news/media-contacts


Also, people fail to take into account that even with reduced consumption, we would still need to improve the electrical grid. We have to electrify transport, heating, agriculture, industrial processes, ... Even if we all stopped buying iPhones and computers tomorrow, if we are serious about decarbonisation, we still need more electrical energy.


Yes, try to imagine a London - Tokyo non-stop flight in 1902. It would have been pretty crazy to imagine, considering that the longest distance the Wright brothers had flown was 180m.


As they said in the press conference, the lasers weren’t designed with efficiency in mind, because it is not the goal of the experiment.


Realistic fusion (with the best-understood technology): build powerful magnets around a donut-shaped chamber, which contain a plasma composed of deuterium and tritium (both hydrogen isotopes) that is then heated by external sources. Reach very high temperatures such that fusion reactions occur frequently. Some of this energy stays inside the plasma, and some of it escapes in the form of neutrons. Capture these energetic neutrons in a blanket around the chamber, creating fuel (tritium) and heating water pipes that then drive a normal steam turbine. Tritium is radioactive (but has a very short half-life; just wait a couple of decades), and the chamber may be slightly radioactive after decades of neutron bombardment. There are no problems of long-term radioactive waste, and the reactor can't sustain a chain reaction, so no Fukushima or Chernobyl.

I need to explain what Q is in the context of fusion. Basically, you heat the plasma with some energy (Energy In), and the fusion reactions produce some energy (Energy Out). Q is the ratio (Energy Out)/(Energy In). When Q reaches 1, we call it break-even. However, (Energy In) is not the total energy cost of running the whole facility; it is only the energy that reaches the plasma. The same goes for (Energy Out): this energy cannot be captured 100% efficiently. Some of it heats the plasma itself, some of it escapes, and the conversion back to electricity is not 100% efficient either.

So in a sense, Q > 1, aka break-even, does not mean commercial fusion; it is only a kind of psychological barrier to achieve (this is what the NIF announced; still a major breakthrough). For commercial fusion we need at least (Total Electrical Energy Out)/(Total Electrical Energy In) > 1. But physicists consider the rest engineering problems, not physics problems. And great news: there is no theoretical limit on how big Q can be. For example, the sun has a Q of infinity, as it requires no energy input. Current estimates put Q at least 30-40 to achieve commercial fusion (again: there is no physical limit to achieve that, only engineering difficulties).
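To make the distinction concrete, here is a minimal sketch of how plasma gain Q relates to overall wall-plug gain. The efficiency numbers below are hypothetical placeholders for illustration, not figures from any real reactor:

```python
# Sketch: relating plasma gain Q to overall wall-plug gain.
# All efficiency values here are hypothetical, chosen only for illustration.

def wall_plug_gain(q, heating_efficiency, conversion_efficiency):
    """Overall gain = (electricity out) / (electricity in).

    q: fusion energy out / heating energy actually delivered to the plasma
    heating_efficiency: fraction of input electricity that reaches the plasma
    conversion_efficiency: fraction of fusion energy turned back into electricity
    """
    return q * heating_efficiency * conversion_efficiency

# With an assumed 40% heating efficiency and 40% thermal-to-electric
# conversion, Q = 1 ("break-even") still loses most of the input energy:
print(wall_plug_gain(1, 0.4, 0.4))   # 0.16: far below 1

# A Q around 30-40 is needed before the plant produces net electricity:
print(wall_plug_gain(35, 0.4, 0.4))  # 5.6
```

With those (assumed) efficiencies, the plasma has to put out roughly six times more energy than it receives just for the plant as a whole to break even, which is why Q = 1 is a milestone rather than a finish line.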

Main costs: difficult to define, because no reactor has been commercialized yet. I would say that, for now, everything around it is expensive (the magnets, the blanket, the fuel (tritium)). However, once we have sufficiently understood the optimal parameters for producing net energy gain, there is no reason why the design of the reactor can't be simplified for mass production.

Note: the technology used by the NIF is very different from what I described for a realistic fusion device: what I described is called magnetic confinement, and what the NIF did is called inertial confinement.


Thank you, "Current estimates put Q at least 30-40 to achieve commercial fusion (again: there is no physical limit to achieve that, only engineering difficulties)" is exactly what I was looking for.


Even for certain applications, 50ms is quite a lot. When using MIDI devices and playing piano through the computer, I certainly notice anything above 20ms.

We may be slow to react to sudden events, but we are very good at noticing the lag for predictive events (keyboard typing, piano playing, ...).


Yes, a single 60-Hz-frame delay (~16.7 ms) can certainly be noticeable.


Martin Molin of Wintergatan fame recently posted some experiments about latency on his YouTube channel [0]. From watching those videos it becomes quite obvious that your statement is correct.

[0] This is the first one about the recent experiments: https://www.youtube.com/watch?v=VyIk0IqC7SQ


Yeah, this rule of thumb was for low-fidelity things like hand radio buttons.

> We may be slow to react to sudden events, but we are very good at noticing the lag for predictive events (keyboard typing, piano playing, ...).

It does make sense in my layperson's understanding that the chain of eye -> brain -> hand for reactive events would be slower than brain -> eye + hand for proactive events.


People cycle in the city of Lausanne (Switzerland), which has a 500m height difference between its lowest and highest points.

Of course nobody climbs those 500m every day, but it's common to see parents bring their children to school by electric bike, perhaps climbing 100m of elevation.


You can also make the road narrower and put obstacles on it to force drivers to reduce their speed: it is very effective, and very cheap to install with little technology.

Here is a typical example of a narrow street with obstacles in Switzerland: https://www.google.com/maps/@46.8111357,7.1507033,3a,75y,167....


Correct, but that takes active effort on a road by road basis, which becomes a political problem on a road by road basis.

In a world where all cars are automated, establishing speed limits is presumably something you do once and auto manufacturers will be forced to comply.

Similarly, issues with cycling safety and pedestrian safety are instantly mitigated without complex study groups to examine why this or that particular intersection has seen three cycling fatalities.

All of this is assuming the creation of currently-mythical technology, of course.


I know that most HPC clusters have a limit on the number of files per user; this pushes scientific applications to generate big HDF5 files.
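As a sketch of that pattern (the file and dataset names below are made up, and this assumes the common `h5py` library), many small per-run arrays can be packed into a single HDF5 file so the filesystem sees one file instead of thousands:

```python
import h5py
import numpy as np

# Hypothetical sketch: consolidating many small result arrays into one HDF5
# file, so the per-user file-count quota is consumed by a single file.
with h5py.File("results.h5", "w") as f:
    for run_id in range(1000):
        # Stand-in for one simulation run's output.
        data = np.arange(100, dtype=np.float64) * run_id
        f.create_dataset(f"run_{run_id:04d}", data=data)

# Any single dataset can later be read back without touching the other 999:
with h5py.File("results.h5", "r") as f:
    first = f["run_0042"][:]
    print(first.shape)  # (100,)
```

The datasets remain individually addressable, so analysis code can slice out one run at a time instead of reopening a thousand tiny files.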

