I had that same criticism in my head while reading the first half of the article that mainly discusses the energy use of the project. That criticism was dispelled in the second half where fundamental problems with fusion reactors are discussed.
This article will certainly not cause me to dismiss fusion as a potential energy source outright, but it makes it very clear that even with ITER up and running, we are nowhere near being able to use fusion for energy generation. [1]
To pick up your analogy: it took only one or two decades to get from the Wright brothers to commercial aircraft. In that sense, ITER looks more akin to one of Da Vinci's flying machines.
[1] Which makes me wonder about the buzz around Lockheed Martin's Skunk Works fusion project.
All the size objections at least are just due to ITER being an obsolete design. MIT's ARC would use the same plasma physics and would have the same power output, but in a 10X smaller package by using modern superconductors.
There are also lots of alternative designs (including LM's); they have higher scientific risk because the plasma physics isn't understood as well, but may be more practical if they work out.
This is a good point, thank you for bringing it up.
I watched a talk by one of the MIT people on this, and their case is tremendously compelling. This is particularly true when you compare it to the pitfalls of the ITER project.
At some point, your megaproject is just too flawed and too out of date to make sense on its own. ITER's consumption of Tritium may do more to damage the future of fusion power than what it offers in terms of understanding of the plasma.
Sure, the newer designs are risky, but how is ITER not risky? Any decent risk analysis needs to look at things in terms of alternatives. I think ITER will give important knowledge in terms of plasma physics; I just cannot begin to fathom how that is worth the tens of billions and two decades of waiting.
The tragedy of megaprojects comes in the form of opportunity cost.
ITER is about far more than just plasma physics. Tritium breeding from lithium is critical for any D-T reactor design, and it requires taking the vessel apart remotely and processing the blanket because of tritium's half-life. That's just one of the many things involved, and why you want a 'safe' design.
That's an excellent point, and this topic was specifically dealt with in the article.
> Just 2 percent of the neutrons will be intercepted by test modules for investigating tritium production in lithium, but 98 percent of the neutron streams will simply smash into the reactor walls or into devices in port openings.
It kind of sounds like they're not sure if breeding will work in this kind of reactor design. If not, I guess D-T fusion is just off the table as an option generally? If that's true, this technology track simply doesn't work for the purpose of energy production.
Maybe I'm being too unfair? Maybe these test modules are exactly what's needed; maybe a larger area of modules would have cost too much. That just seems like a strange argument. I don't feel like I can make heads or tails of this. It just feels suspicious.
You can't really use liquids around plasma. You can try to have liquid lithium inside a solid container, but wall thickness directly relates to wall lifetime AND you want the thinnest walls possible to maximize tritium production.
Worldwide Tritium production is on the order of 1 lb per year; that's not going to cut it when you want to scale up.
Exactly. "Replaced once a year" is easy to write but very hard to actually accomplish.
Just like ITER, it's extremely radioactive during that process, so you need to handle all of that remotely, and your reactor design must be capable of both opening and closing regularly.
Sure, it does not seem as sexy as superconductors, plasma, etc., but it's a surprisingly hard problem.
PS: My point about not scaling up Tritium production just means reactor designs need to be net-positive Tritium producers, which is why the walls need to be thin and swapped out regularly.
One thing that makes it relatively easy to open the reactor is that the superconducting tapes can have joints without significant resistance, instead of being a continuous wrapping. They've tested the joints to make sure of this.
The first point about the energy used to actually build ITER really does the article a disservice IMO. I mean, you could make the same point about the LHC or basically any kind of big scientific project; it feels shortsighted and beside the point.
The rest of the article is more insightful, although if you followed discussions around ITER and tokamak technologies there's nothing really new in there. We're trying to put the sun in a box but we don't know how to build the box.
If I may? If you want to do footnotes on HN you can use the dagger symbol, like so†. Dagger is Unicode U+2020 which you can read about here[1]. Asterisk is out of bounds because it is typographically significant in Markdown and dagger is the next in the order[2].
It's just that conventionally numbers enclosed by [] are used for non-inline hyperlinks (denoting cited references) in order to not clutter up the text with incomprehensible and long URLs. I think this is from text email culture but it could go back even further, I am not sure about that.
† I am a footnote indexed by a dagger. On Linux at least, you can enter Unicode by using the key combo Ctrl-Shift-u and keying in the appropriate number. Dagger is easy to remember because it is 2020.
I like daggers as much as the next guy (do you know the code for double dagger, for when you need a second one?). However, plenty of publications mix footnote citations and footnote comments in the same style without any confusion or aesthetic issues. Is there a good reason (other than coolness) to use dagger?
I was operating under the impression that a single asterisk would muck up the Markdown parsing on HN. Turns out I'm half wrong it seems. Some work. With the italicised tests below, the first has a prepended asterisk, the second has an appended one.
* Test
Test*
Test
Test
Test *
I think I just started using daggers because U+2020 was super easy to remember and to avoid Markdown issues and the habit has stuck.
You don't by any chance use Vim, do you?
I use the Unicode plugin[0] by Christian Brabandt in Vim to quickly search for Unicode characters.
You can use vim-plug[1][2] by Junegunn Choi, for instance, to install it and keep it updated or just install it manually yourself.
What's far more interesting has little to do with ITER. Who is paying for these hit pieces?
His last piece was linked, and it had a long diatribe about the issues with tritium breeding. It starts with "the most comprehensive analyses indicate that there can be up to a 15 percent surplus in regenerating tritium," adds plenty of "cannot make up" etc., and ends with "approximately 10 percent of the injected tritium was never recovered."
1.15 * 0.9 > 1
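A quick sanity check in code, treating the 15 percent surplus and the 10 percent loss as compounding multiplicatively (the most straightforward reading of those two quotes):

```python
# The author's own numbers: up to 15% tritium breeding surplus,
# ~10% of injected tritium never recovered.
breeding_surplus = 1.15   # tritium bred per unit of tritium burned
recovery = 0.90           # fraction of injected tritium recovered

net_ratio = breeding_surplus * recovery
print(round(net_ratio, 3))  # 1.035 -> the cycle is still net tritium-positive
assert net_ratio > 1
```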
Looking at them both they are very careful not to directly lie while being as misleading as possible.
This puzzles me immensely, since nuclear fusion is far safer than nuclear fission. So if you're an organization opposed to the dangers of nuclear, wouldn't you, pragmatically, want a less-harmful alternative to exist?
> ITER is not a power station it is a research project.
That it is not, and that there is no such thing yet, is itself testimony to the difficulties of fusion. I wish it were otherwise, but these are the facts.
On the other hand, the author is being misleading when he considers ITER as if it were a power plant, except where he can show that its problems are inescapable problems of any plausible fusion-power scheme.
If / when it becomes possible to build fusion power plants, these problems should be weighed against the alternatives, such as the undeniable (though often denied) problems of climate change.
I notice that the author became convinced of the futility of fusion research only after his income became in no way dependent on society's hopes and expectations for it.
> I notice that the author became convinced of the futility of fusion research only after his income became in no way dependent on society's hopes and expectations for it.
I notice that lots of people can give highly detailed accountings of their ex-spouse's failings post-divorce that they would not have given pre-divorce. Perhaps it's because their income is no longer dependent on their ex, or perhaps it's because they're no longer under the influence of infatuation...
A stupendous amount of our energy "debate" is emotional and fantasy driven, not deeply related to pragmatic low carbon energy production at meaningful scales. I speculate this is related to messaging and disinformation campaigns from entrenched interests, but maybe youthful idealisation and generalized ignorance is a larger contributor...
Yep. They are doing fundamental research. Maybe it will have practical applications. Maybe not. Sometimes you should just give scientists a bag of money and let them do their thing.
ITER is a research project. As with all research, its main purpose is the discovery of "incidental knowledge", i.e. things you did not set out to research but that are important to know for future projects in that direction. ITER has between 1000 and 5000 highly qualified workers and scientists on site. The experience these people gain during the project can hardly be expressed in USD.
Furthermore, many companies gain experience in manufacturing parts for ITER - experience that might be useful for other projects in the future too.
Lastly, many different players collaborate in ITER so the whole of things is also a management experiment.
This article is a "critique of fusion as an energy source", not a critique of ITER as a research project. Part of its complaint is about the marketing behind ITER, including the "misguided motto" and misleading information about the ITER power balance.
It brings up some of the same benefits you did. "[ITER's] most favorable legacy is that, like the International Space Station, it will have set an impressive example of decades-long international cooperation among nations both friendly and semi-hostile." and "A second invaluable role of ITER will be its definitive influence on energy-supply planning."
Really? Billions and billions of Euros, hundreds of thousand of metric tons of concrete, hundreds of MW of power, thousands of gallons of fresh water-per-minute, tends of thousands of nuclear waste, to run a "management experiment"?? This has to be the most ludicrous, expensive and ill-designed case study in the history of mankind.
Also, it seems the word "experiment" is inadequate. An experiment requires
1/ a theory and an expectation of what should happen
2/ a rigorous, controlled and replicable setup, where you test the theory and compare the actual result to the expected one
Having many people from different nations working together does not qualify as an "experiment"; maybe it's an experience, or a happening (like a concert). It's the Woodstock of science; it must certainly be fun for all involved, but it's unlikely humanity will make much progress because of it.
This is a disingenuous quote -- the parent said 'so the whole of things is also a management experiment'.
You then go on to attack that decontextualised fragment by saying:
> Really? Billions and billions of Euros, hundreds of thousand of metric tons of concrete, hundreds of MW of power, thousands of gallons of fresh water-per-minute, tends of thousands of nuclear waste, to run a "management experiment"?? This has to be the most ludicrous, expensive and ill-designed case study in the history of mankind.
And I'd suggest:
metric tonnes
hundreds of MW
thousands of imperial measurements of water that are re-usable
tends <sic> of thousands of nuclear waste ... means what, precisely?
refer you again to the 'also' runs a management experiment
This clearly is not the most ludicrous, expensive, and ill-designed case study in the history of mankind, no matter how the coal industry wishes to describe it.
I fail to see how the "coal industry" reference applies to me however, or indeed, critics of this mammoth project that makes absolutely no sense.
I'm French and have been listening to politicians telling us the billions of Euros spent, and tons (why tonnes?) of concrete in an otherwise nice region would transform the production of energy, for most of my adult life.
As I've always suspected, that's complete BS, as the article makes clear. Why people who are not politicians would want to defend ITER, I don't know.
It should be noted that all non-tokamak approaches to nuclear fusion offer cheaper costs than ITER. Tokamaks like ITER are an outlier when it comes to projected/actual costs. I'm going to avoid listing my personal favourite fusion project yet again on HN, but what I will say is ITER is not the only game in town, and there's lots of promising research being done into non-tokamak designs.
I agree, but will mention that MIT's plan is a tokamak. But it's ten times smaller, because they use new, commercially-available high temperature superconducting tapes that weren't around when ITER was designed. This lets them use a much stronger magnetic field. (MIT doesn't have funding for it but the private company Tokamak Energy is doing something similar.)
MIT's design is also easy to take apart, has a 3-d printed reactor core that gets replaced once a year, and surrounds that with FLiBe, the same molten salt used in some fission designs, for cooling and tritium breeding.
It's also all a design, and you'd be a fool to believe any upfront cost-projections are going to be hit.
ITER was not as expensive as it is back when it was designed either, but building things which have never been built before is full of risk.
In a sane world, we'd be funding both and probably more, seeing as how fusion power is basically one of the single most important technologies we could perfect in the modern age.
True, but we have built working tokamaks like JET, which is about the same size as MIT's design, so it's not like the numbers come out of thin air either.
When you start comparing the $20B cost for ITER versus "big" projects like the human genome, which took $1B over a similar time scale, ITER really does seem like a boondoggle. It's sucking up massive amounts of research dollars and attention on scientific/engineering goals that don't necessarily even move in the right direction! Suppose they succeed; then what? What has been learned that can spur the next $20B? Will the next tokamak take an order of magnitude less money to produce, like the next genome did the day after the first was released?
Big projects always use ridiculous and deceptive hype, but "unlimited energy" takes that to a new level, IMHO.
I'm certainly not reducing the advancement to dollar values. In contrast, the expenditure necessarily must be evaluated in money and time, and it's an age old problem: who gets the research allocations?
The advancements are often unknown, but even if the genome project hadn't had the side effect of spinning off a massive amount of technology, the end product would be a significant advancement that would have accelerated all manner of biological research. This was biology's biggest project ever, and it is massively dwarfed by the boondoggles of physics.
In contrast, what can we learn from ITER? The opportunity for learning seems awfully minimal, and if it does complete its goal, as the article points out, it's not terribly desirable. And if the end goal isn't desirable, where are the side effects of tech development? There appear to be none. However, many contractors get rich.
Potential for advancements can be difficult to judge, and the biggest advancements are almost never expected. But when judging must be done about potential, it's typically best done by those close to the field rather than by political means. ITER seems to be more of a political beast [1] than a scientific one. A political beast that has now robbed the field of fusion of $20B that could have gone to other research projects. (There's significant intra-science politics of course, which greatly steers the course of research as much as scientific goals. However, I'd much rather go with those political winds than the politics of nations when deciding research.)
> In contrast, the expenditure necessarily must be evaluated in money and time, and it's an age old problem: who gets the research allocations?
Sure, but when does the evaluation occur (after one year, after two years, after ten years, or after a new or accidental technology has been demonstrated to save humanity from certain destruction)?
> In contrast [to genome projects], what can we learn from ITER?
Experimental physics experiments have always suffered this comparison. I can't address your question, sorry.
Your suggestion that it may all be about the end goal (as if there's always and only a single end goal for research) is perhaps intentionally misguided - as you then hint at in your subsequent paragraph.
ITER does not do fundamental research like the LHC, they focus on very specific technical problems in the field of fusion power - as if it's already a given that their particular approach to power generation is desirable and workable.
The author's concern is that even if the problems are solved, tokamak fusion power is still a prohibitively expensive and unsustainable idea; it's only natural to question why those research funds aren't devoted to other areas that at least make an attempt to obtain directly useful results. The tritium economy issue is particularly vexing: any commercial D-T reactor will need outside, fission-produced tritium.
I don't mean specifically to compare to biology, but even within physics, is ITER a good place to put research money? Though I'm not a physicist, it seems that even many other fusion projects would be better bets.
The human genome project cost well over 3 billion dollars and was mostly a pointless waste of funds. Nowadays we could do the same thing for under 1/100th the cost, but the focus was on making a useless 'map' rather than on improving sequencing or finding out what DNA was actually doing.
That's the most preposterous assessment I've heard of the genome project. The reason it costs so little now is because of the tech development that happened to make the genome come together.
Not sure how you are going to sequence to find out what things do without having that map. And I say that as someone who is entirely interested in what things do and is not so interested in the map itself (check my profile).
First off, DNA does not just sit around on its own. You get computation from all the things wrapping around DNA, making the specific sequence less important than generally assumed.
Anyway, location is almost meaningless. It's the actual sequence that's important, and the genome project did not provide a full sequence of an actual human genome. So again, useless on its own, with a completely arbitrary end date for 'success' that again did not actually mark the accomplishment of anything.
Worse, to be useful as a map you need to use much faster sequencing techniques to verify in what way some specific population is different from the average sequence. And then correlate that population with some difference to try and find out what that does. But, once you can do that you can do the entire sequence more cheaply.
At best it provided funding for biotech, at worst it was 100% political in both motivation and execution.
PS: ITER is a more than 30-year-old design at this point; we have several designs that should be vastly cheaper but are simply less verified. What ITER tests is not about cheap; it's about verifying what's going on when you get more power from fusion than you put in via external heating. That's critical for any form of sustained magnetic confinement. It also lets you test things like Tritium breeding, which is again required for D-T fusion at scale, as we simply don't have enough Tritium to use it as fuel at scale but we have plenty of Lithium and Deuterium. It also tests remote handling of the containment vessel, which is again required for any useful design, etc. And to do all that and more, you want to take the lowest-risk path, not just high risk / high reward.
So I see you did not bother to check my profile before responding, so I'm not sure if it's worth even typing this out. You also hold preposterous positions that are so far outside mainstream views that you must clearly be a complete outsider, which would be 100% OK if you were bothering to support your positions at all. However, in case it is worthwhile...
I mostly study RNA, protein signaling, and protein-DNA interactions. So you're preaching to the choir on the necessity of moving beyond the simple map.
All of today's techniques for interrogating what you want:
* ChIP-seq for:
  * Chromatin marks
  * TF binding
  * CTCF binding in particular
  * Enhancer identification
* Hi-C to find topologically associated domains and laminar domains
* Methyl-seq
* RNA-seq (yes, even RNA-seq)
* Genome-wide association studies
* Linkage disequilibrium
ALL of these use the genome map as the basic building block. Saying you don't need the genome map for this is like saying you don't need a compiler or interpreter or even an assembler to write code. Actually, it's more preposterous than that: one could feasibly write machine code in hex by hand, but none of these basic building blocks of our modern understanding of the cell would be possible without the genome as a basic map for study. A reference genome is the first step of study for any organism, because it is the foundation upon which everything else is built.
That's not to say we don't need in vitro study of individual proteins, loci, etc. We do. But we need much much more, and the genome map serves as a basis for nearly all study of individual loci these days.
>Worse, to be useful as a map you need to use much faster sequencing techniques to verify in what way some specific population is different from the average sequence. And then correlate that population with some difference to try and find out what that does. But, once you can do that you can do the entire sequence more cheaply.
This is wrong, a map is even more helpful if you can only do low throughput sequencing. In 2001-2003, before the advent of massive sequencing capacity, the genome was super useful. SNP chips, like the ones that 23andMe use to this very day, provide weak but cheap ways to profile common population variation, and the genome provides the way to start to link significant variants to protein coding genes, or enhancers or whatever other 3D features we discover in the future of genome research.
>At best it provided funding for biotech, at worst it was 100% political in both motivation and execution.
Yeah, tell that to all the biologists who use genome browsers or do high-throughput resequencing projects all the time. The claim is absolutely indefensible. A huge amount of biotech also spun off of a small $3B government investment, just like Silicon Valley spun off of lucrative defense contracts, and the government investment spurred industries that changed the world and what we can do in it.
edit: you may have also skipped over my initial objection as "improving sequencing" would have been a great use of those funds.
I come at it from the plant side which did plenty of useful and actual DNA manipulation without a "map". Vs. the human side which was almost entirely focused on drugs.
So, saying it's necessary is absolute bullshit. Useful, sure; necessary, no.
Further, if they had waited 5 years to start, they could have finished at the same time or even faster for less money. Its execution was an extreme waste of resources without any need to 'hurry'.
Remember 1990 was 28 fucking years ago it was a very different time.
PS: That said, 3 billion is a literal drop in the bucket. And yeah, the output was not worthless in absolute terms; my objection is simply one of timing. It would have been like trying to build self-driving cars in 1950: the tech was simply not ready.
I think there are quite a few credible alt-fusion projects out there. I have no particular animus towards ITER but I am disappointed that its multi-decade scope, megabudget, and sunk cost, are starving research allocation into other approaches. Some of which seem to have clearer paths to energy production than magnetic confinement fusion.
My favorite dark horse is the inertial electrostatic confinement Polywell design, which is currently being funded (crumbs in comparison to ITER) by the Navy, who would of course love a submarine sized fusion reactor. https://en.wikipedia.org/wiki/Polywell
Cool thing with the Polywell - if the physics actually pan out, the full size one should be able to fuse hydrogen and boron, which won't produce neutron radiation, just alpha particles that can be directly converted to electricity.
Who cares? How much were the solar cells on the market recently? 22 cents a watt? Any chance fusion could beat that price, even taken into account solar's dismal load factors?
But you need access to sunlight for solar panels to work. You can't have that constantly due to the rotation of the earth. Or if you move further away from the sun. Or into the constant shadow of a planet or moon, to shield your scientific equipment from sun radiation. Or even underwater/underground on earth. Etc.
So there will always be a use case for fusion energy, if it works.
>You can't have [access to sunlight] constantly due to the rotation of the earth.
This is the main counterexample to me, since it applies to commercial power generation on Earth. Even underwater/underground, in probably 99+% of cases, a simple cable is easier than a self-contained fusion reactor, and for the remainder of cases (submarines, aircraft carriers, space colonies, etc) it has to compete with mature self-contained fission reactors on cost/safety/mass/volume/longevity/acoustics.
With that in mind, couldn't anovikov's question be trivially rewritten to
"Any chance fusion could beat that price, even taken into account solar's dismal load factors and the Li-ion batteries it needs to deliver constant power?"
Remember too that energy storage has been following a slow Moore's Law (halving in price every decade), and looking ahead in the R&D pipeline it shows no sign of slowing.
Solar is the lazy person's (hacker's) fusion. Who needs to put the Sun in a box when we have the Sun itself? Seriously, why waste money on the box? (again I refer to commercial power here, not space or military) Understatement of the century here, but gravitational confinement has been rather extensively demonstrated.
With this fundamental laziness advantage, I expect mature commercial solar to economically beat mature commercial fusion in the long-term.
Heck, fusion doesn't even have a scaling advantage over solar. They're both limited by the Earth's heat rejection capability. If you built enough fusion power to exceed the power of the sunlight that hits Earth, oops you just fried the surface...
TL;DR for grid electricity, solar+batteries will inevitably "win" over fusion.
True, there may be use cases for it. But with the collapse in renewables prices in the last 10 years, and a very likely further 2x price reduction for solar (and a slight one for wind), this is no longer a make-or-break thing for humanity. More research on, and commercialization of, advanced batteries would further reduce the importance of fusion. I am getting to think that fusion has nearly run out of time. 10 years down the road, the potential market for it may become too small to justify the immense expenses, and by that point, 'mainstream' projects are not expected to have borne fruit yet.
It's easy to make the leap you're making, but while intermittent renewable costs have indeed been falling, they're doing it in an environment with cheap, dispatchable natural-gas backup. Once you get to around 50% intermittent renewable penetration, the storage costs get crazy. In a four-state area in the Pacific Northwest, from December 5 to December 15, all 4 billion watts of wind generation sat dormant because of a wind lull. Batteries for that would cost $90 billion and take up a football field 100 stories tall.
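As a back-of-envelope check on that $90 billion figure (a sketch; the ~$94/kWh cell cost is my assumption, back-solved from the claim, not a number from the comment):

```python
wind_capacity_w = 4e9        # the 4 GW of wind that sat dormant
lull_hours = 10 * 24         # Dec 5 to Dec 15 wind lull

energy_kwh = wind_capacity_w * lull_hours / 1000   # 960 million kWh = 960 GWh
cost_per_kwh = 94            # USD/kWh -- hypothetical raw battery cost

total_billion_usd = energy_kwh * cost_per_kwh / 1e9
print(round(total_billion_usd, 1))   # ~90.2 (billion USD)
```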
> Batteries for that would cost $90 billion and take up a football field 100 stories tall.
Cost is certainly a reasonable argument to make. But not the "football field" argument for how much area it would take. Given how small a number that is, I just can't see how that is concerning at all. And besides, nobody is going to place $90 billion of batteries into a single structure.
A football field is 5351.2 square meters.[1] Instead of a single 100 story tall structure, how about 100 separate single story structures? That's 535120 square meters.
Let's consider a site in the Pacific Northwest that has lots of excess land available, the Hanford Site[2] (for some reason, nobody wants to use it for shopping malls or residential subdivisions). That site is 1,518 square kilometers.
You could place those 100 football fields of batteries onto 0.035% of the land area of the Hanford site. There are probably still functional high voltage transmission lines there, and if not, then building new lines to get that battery power to BPA's nearby grid wouldn't be difficult or expensive.
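That 0.035% figure checks out; as a throwaway script (all numbers are from the comment above):

```python
football_field_m2 = 5351.2            # one American football field
buildings = 100                       # 100 single-story structures
footprint_m2 = football_field_m2 * buildings

hanford_m2 = 1518 * 1e6               # Hanford Site: 1,518 km^2

fraction = footprint_m2 / hanford_m2
print(f"{fraction:.3%}")              # 0.035%
```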
Anyway, I'm just being silly. It doesn't make sense to put those batteries into either a single $90 billion dollar 100 story building, or onto cheap land at the Hanford site. But having 20 or even 100 separate battery sites scattered throughout the area is certainly feasible.
Your overall point is valid, even though batteries will eventually become much cheaper. It's currently difficult to store large amounts of intermittently generated power.
You're definitely right that no one would put them in one building. I chose that analogy because it's easy to visualize, whereas lots of smaller buildings are not. My cost estimate included raw lithium only and assumed the land and building were free, to be conservative. Also, this is just for 4 GWe, a tiny fraction of the PNW's total energy capacity.
Problem is: even if they're cheap, you have to build massive amounts of them... So fusion energy (or anything nuclear) is still required, I think. But well, I'm no expert, just repeating what I've heard (check out Jean-Marc Jancovici, a French guy).
Abundant and very cheap energy also enables energy-inefficient (so far) and crazy projects. For example, putting coils under the road to power electric vehicles.
ITER will only be able to run at full power for a few weeks before its materials reach their radiation damage limits, so trying to use its heat output would not make any sense. And unless they can get disruptions well under control on non-tritium shots, it will never be allowed to do any DT runs.
ITER is a dead end. It would require multiple miracles to get anything remotely competitive out of it, and that's just not going to happen.
First, realize what's holding back nuclear fission. It's not nuclear waste, or safety, or Greenpeace, or the cost of uranium. It's the capital cost of the powerplants, and to a lesser extent the non-fuel operating cost of the plants.
Fusion would make these two main problems worse, not better.
The fusion power density of ITER's plasma (note: just the plasma, not including the total volume of the reactor) is 0.6 MW/m^3. The power density of the core of a pressurized water reactor, on the other hand, is 100 MW/m^3. And the machinery around a PWR core is much simpler than all the magnets and such around a fusion core.
(The ARC reactor has somewhat higher power density, but still sucks compared to a PWR.)
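To make the gap concrete, a rough sketch using the densities quoted above (the 500 MW target is just ITER's nominal fusion output, used here for scale):

```python
iter_density = 0.6     # MW/m^3, ITER plasma fusion power density
pwr_density = 100.0    # MW/m^3, pressurized-water-reactor core

target_mw = 500        # nominal thermal output for comparison
print(round(target_mw / iter_density))    # ~833 m^3 of plasma needed
print(round(target_mw / pwr_density))     # 5 m^3 of fission core
print(round(pwr_density / iter_density))  # ~167x denser
```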
That fusion power density sucks has been known for decades. It follows from basic physical considerations (square-cube law) and is largely independent of the details of the reactor design. See Lawrence Lidsky's famous 1983 article in MIT's Technology Review, "The Trouble With Fusion".
The inevitable result of this is that fusion power will be more expensive than fission power. The capital cost will be much higher.
ITER's high cost has been explained as a consequence of international coordination, but actually the cost was lowballed from the start. If they had taken the cost estimates of previous reactors and scaled them on the cost/size plot, ITER should have been estimated at 3-4x what they said it would cost.
Now, operating cost...
Fusion reactors are extremely complex, much more so than fission reactors. This means they will break more often. And all the parts inboard of a DT (or DD, or D3He) reactor's biological shield (in ARC, this includes the magnets) will be too hot for hands-on maintenance. So all the repairs will have to be done by robots. Good luck with that!
In a reactor the size of ITER, all the armor on the first wall will have to be water cooled. Each component slab of this armor cannot fail. A single leak in any of them will render the reactor incapable of sustaining a plasma. I understand that when conventional reliability estimation techniques are applied to these things, one concludes the reactor will be able to operate only a few percent of the time. This isn't even good enough for a research reactor, never mind a commercial reactor.
And what will fusion face if, by some miracle, a "working" reactor is finally made? In the last 40 years photovoltaic modules have declined in cost by about a factor of 200. What will happen in the next 40 years? ITER, if all its fusion output were converted to electrical power at 40% efficiency (and none needed to be fed back), would cost > $100/W. PV modules are now about a factor of 300 less than that. Even if we divide the PV number by four, ITER is so far out of the running it's absurd.
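For anyone who wants to check that $/W comparison, here's the arithmetic. The assumptions are mine and loudly hedged: an ITER lifetime cost of ~$22B (estimates vary widely), the 500 MW design fusion output converted at 40% with no recirculating power, and PV modules at ~$0.35/W:

```python
# Rough $/W comparison under the stated assumptions.
iter_cost_usd = 22e9                   # assumed total ITER cost; estimates vary
electric_output_w = 500e6 * 0.40       # 200 MWe, generously ignoring recirculation
iter_cost_per_w = iter_cost_usd / electric_output_w
print(f"ITER: ~${iter_cost_per_w:.0f}/W")

pv_cost_per_w = 0.35                   # assumed PV module price, $/W
print(f"PV modules: ~{iter_cost_per_w / pv_cost_per_w:.0f}x cheaper per watt")
```

Even cutting ITER's cost in half or doubling the PV price doesn't change the shape of the conclusion.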
To finish, I'll add that to truly replace fossil fuels, electrical power has to come in cheap enough that resistive heat displaces natural gas. This requires electricity around $0.01/kWh. There appears to be no chance ITER, ARC, or any of the other fusion schemes could approach this, even with hugely optimistic assumptions.
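Where does the ~$0.01/kWh figure come from? For resistive heat to displace burning gas directly, electricity has to match gas on a per-kWh-thermal basis. A quick sketch, assuming wholesale natural gas at ~$3 per MMBtu (a ballpark figure; prices move around a lot):

```python
# Electricity price needed for resistive heat to compete with gas heat.
gas_price_per_mmbtu = 3.0    # USD, assumed wholesale price
kwh_per_mmbtu = 293.07       # 1 MMBtu = 293.07 kWh (thermal)
gas_heat_cost = gas_price_per_mmbtu / kwh_per_mmbtu
print(f"Gas heat: ~${gas_heat_cost:.3f}/kWh thermal")  # ~$0.010/kWh
```

That's the bar fusion electricity would have to clear to push gas out of heating, and it's an order of magnitude below typical grid prices today.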
There are several aneutronic fusion projects. The biggest is Tri Alpha, with over $500 million invested. They proved stable plasma in 2015 at 10 million degrees, and just completed a new reactor they plan to take to 100 million degrees; according to their model, the plasma should get more stable at higher temperatures. If it turns out that way, pumping up the heating to boron fusion temperatures is relatively simple, and they have a straightforward path to a practical reactor in 2025 or so. (Source: recent articles plus a presentation I saw from one of their people in 2016)
There's a group that thinks a sufficiently powerful petawatt picosecond laser could ignite fusion in a block of boron fuel. This will be testable with an off-the-shelf laser before long; these lasers are improving by a factor of ten every three years.
Y Combinator has an investment in Helion, which is working on a hybrid D-D/D-He3 reactor (the He3 comes from the D-D reactions); they say only 6% of the energy would be released as neutron radiation.
Plus there's LPP, a tiny dark-horse project that will finally get a decent test of their idea this year.
Like you, I'm hoping for an unforeseen breakthrough, but it seems very unlikely anytime soon. Just containing such an energetic plasma for any length of time past ignition is going to require its own set of breakthroughs in materials and in the manipulation of magnetic fields.
The temperature would be very high but the amount of energy wouldn't necessarily be remarkable. And for the pulsed designs (laser and LPP), containment time isn't a concern.
Regarding point 2, the only drawback I'm aware of with D-T fusion is that it still produces nuclear waste (whereas aneutronic fusion produces zero or close to zero nuclear waste). However, the waste produced by D-T fusion is said to be:
* Smaller in volume than the waste produced by current approaches to nuclear fission.
* Shorter-lived than spent fission fuel, meaning it doesn't take as long before it's safe to take out of storage.
Aside from nuclear waste, are there other drawbacks to D-T fusion that people should be aware of?
Concerns about tritium leaks; the need for tritium breeding, which limits how fast you can build new reactors; hard neutron radiation that could theoretically be used to breed fissiles (though only if your tritium breeding ratio is high enough to keep the reactor fueled despite not using all your neutrons for that); and the need for a steam turbine, which means you won't achieve the extremely low costs that may be achievable with aneutronic fusion.
Which doesn't mean D-T is hopeless, just not as good as aneutronic. I've seen fusion scientists say the waste from D-T reactors would only need to be contained for several decades.
Actually, it's been pure deuterium in the tests so far, but the design explicitly targets boron fusion. If it were adapted to D-T, it would indeed need a steam turbine. The great thing about pB11 is that most of the energy output is fast-moving charged particles, which isn't true of D-T.
Sputtering of the metal in the reactor due to fast neutrons, leading to embrittlement; large amounts of neutron-activated waste; and an unreasonable maintenance schedule. Plus the energy lost to said neutrons, and all of what DennisP said before me.
I'm not an expert and all of this happened almost 20 years ago, but I do remember avidly reading about ITER around 2001-2003, when the project was more or less agreed between all parties.
Wikipedia[0] has a short paragraph on the history and I'd say that any project involving so many different countries and governments (and bureaucracies!) is a red-tape hell.
Just choosing the location was a long and difficult process[1]. Canada proposed a location but later pulled out, and Japan applied a lot of pressure to get it built there. Even within Europe it caused a political "war" between France and Spain, which proposed competing sites. The EU eventually decided to back France, and that site was finally agreed upon as the final location.
From what I remember, many people complained about the decision. Japan felt left out and was offered 20% of the research personnel. The infrastructure in France had to be built from the ground up before actual work on the plant could even start; the area didn't even have roads big enough to bring in all the materials, which in turn delayed the work and increased the costs (something that wouldn't have happened in Spain or Japan; no idea about Canada).
In short it was originally Cold War politics to get some international cooperation going around energy. Over time it morphed into a pissing/$$$ contest between nations involved.
The contrarian opinion is to increase budget 100X, construct Helium-3 mining colonies on the lunar surface, and deploy rocket foundries for missions to Titan and beyond ;)
I went to a talk by one of the materials scientists at JET a few years ago, and he stressed that on the materials side, the biggest challenge is the neutron bombardment of the steel reactor vessel. He said current calculations suggest you'd need to replace the entire, highly irradiated reactor vessel every 3-5 years or so because of structural defects and helium bubbles forming in the cracks.
In other news Wright brothers first plane unsuitable for transatlantic business travel. Colossus struggles to run Call of Duty.