There's nothing quite like trying to get fusion to work for a while to make you appreciate how awesome fission technology is.
If we applied even half the cleverness needed for fusion to making better fission technology, we'd probably be way better off.
...and then there's solar. Why even bother with producing the energy, just capture it with a very thin solid state device! Just need to automate the planting of solar panels in the desert, and we could produce all of our electricity from the Sun. Using just a fifth to a tenth of the land (and much crappier land that nothing much can grow on) that we use for /ethanol/ production alone.
(Yes, storage is tough, but is getting cheaper, and we can just plant more solar panels so there's enough power even during cloudy days... Although this is mostly a thought exercise. The best plan for deep decarbonization by far is to operate with a mix of clean power sources optimized for high capacity factor, including at least our current nuclear fleet... They help provide a reliable baseline which drastically reduces the amount of storage and over-installation required. That last 20% of power produced by nuclear is worth its weight in gold and should be protected at least until all fossil power production is ended.)
How exactly do you propose to get that electricity from the desert to say Finland?
Furthermore, you need to plaster a vast area in the desert with solar plants, which brings along all sorts of concentrated security risks. The alternative is to have many smaller plants dispersed, which results in a big loss of efficiency.
I'm all for clean energy, but you seem to be handwaving away pretty huge issues as a "thought exercise". Our inability to store energy is not just some side concern that will go away any time soon. There is a reason carbon fuels are so prevalent: their energy density is far above what we can store in batteries. Without scalable, high-density energy storage, unpredictable sources like solar or wind are much less useful, because often there is no sun or wind at high demand, and conversely there can be overproduction during low demand. Germany already causes serious headaches for the European grid when it floods the network with electricity on sunny/windy days; they actually have to pay other countries to take the extra capacity.
Solar, wind, tidal etc... energy are all awesome and clean (if we disregard the land that they take up), but there are many difficult issues that remain to be solved before they can become a core part of the energy production.
> Furthermore, you need to plaster a vast area in the desert with solar plants, which brings along all sorts of concentrated security risks
This sentence seems to be at odds with itself. How does needing a large area concentrate security risks? If anything it seems to disperse them.
Unlike a coal or nuclear plant, you can't just walk up to a critical part of a solar farm and blow it up. Any part that you damage is relatively isolated from the parts you don't damage. Thus, it takes considerably more effort to take the entire plant offline.
Sorry I worded that sentence poorly. Let's assume two scenarios:
a) We create a 200 square mile solar field in the Sahara, from which cables run to Europe to supply electricity. To take this plant out, one just cuts the cables. There, you just plunged Europe into darkness.
b) We create 20 separate 10-square-mile solar fields, dispersed over a very large area. Well, this is an engineering and construction project on a scale that dwarfs even the previous one. It is truly vast. Never mind all the political issues again.
I'm surprised to see this kind of simplistic reductionist thinking on HN.
"Just cut the cables" isn't trivial. Have you ever seen a high voltage tower? Not only are they not accessible without professional gear, it's also going to require even more expensive gear to cut multiple inch thick conductors.
In all likelihood the power to Europe will be transmitted via multiple undersea cables, the effort of disrupting which would be unavailable to all but state agents, in which case the security issue is irrelevant, as you are likely already in a state of war.
> the effort of disrupting which would be unavailable to all but state agents
It's a good thing then that there is a long history of all state agents in northern Africa maintaining friendly and stable relations with European governments.
(Yes, said Europeans also carry a great deal of responsibility for this situation, but that doesn't make the situation any more tractable).
Partnerships with whom? Several different warring tribes that want to gain control over the other or achieve independence? Morocco or Jordan maybe, the rest of them most likely not.
I'm not 100% positive that this is the right video, but I think it is. As part of the "overthrow a nation" plan, he talks specifically about destroying critical infrastructure, like electricity.
The solution for taking out high voltage powerlines? A pair of Skilsaws hanging under a quadcopter. Drop down onto it and... something bad's going to happen!
Absolutely. The infrastructure is very fragile, yet enormous on the scale of countries.
Since we're talking about hypotheticals, I could say these things are also true:
1. There are counter-drones deployed and patrolling air space above the towers.
2. There are hundreds of independent power lines with 2N redundancy, so cutting 5-10 high voltage lines would do nothing.
3. The undersea cable landing points are protected by state agents and have security too strong for amateurs to compromise.
4. Europe will have 24-48 hours worth of storage even if ALL the lines are cut from the Solar Generation plants.
5. Europe will have backup power generation capacity in the form of locally stored and operated fossil fuel plants.
____
In other words, there's absolutely no evidence that the future hypothetical solar installation is going to be more vulnerable than the infrastructure that already exists today, and surprisingly Europe doesn't plunge into darkness every week.
I don't think the main problem here is random terrorists cutting power to Europe by blowing up a power line tower. The real problem is that nation states that control the Sahara haven't proven to be stable (Syria, Egypt), or don't always have great relationships with Europe (Algeria in the 90s).
As to points 4 and 5, although storage and some backup capacity is reasonably feasible, this would mean that Europe could very well be forced to go to war in short order to get their full power supply back if a rogue organization gained power in North Africa.
Bury the cables. That won't protect you from everything, but it makes them a bit harder to compromise. You can even run "cages" around them that alert a nearby response team to any tampering. Of course, this adds significant cost, but then you have to weigh that against the total economic cost of a successful attack.
Damage that takes a few hours to fix is not going to destabilize a country. If you can scale it to keep up 24/7 disruption you already have an army and far more useful approaches like destroying ports.
The feet of the towers are easily accessible, so there's nothing stopping people from just using a metal file. It wouldn't be safe, but you'd get the job done.
Or just plow a truck into their feet. You only really need to bend one of the feet and the wind will do the rest.
> I'm surprised to see this kind of simplistic reductionist thinking on HN.
The thinking is in fact correct. The parent should have used a better, more serious, and all but inevitable scenario: war. Bomb the concentrated infrastructure and take down all of Europe's power supply. It wouldn't take very high precision, and it wouldn't take a vast number of hits: a few clumsy strikes, nearly impossible to stop, would suffice.
The present system, which tends to be far more distributed than not (to take down all the power in Europe today, you need to hit a lot of spread out targets), is vastly superior to extreme concentration.
Why cut it? Explosives would be much easier (think IEDs), and could be improvised from simple sources like gasoline and petroleum jelly. No need for a tool to "cut" a power line, regardless of it being above or below ground.
I also think the location constraint may be addressed by later solar technology. I.e., with our current tech the sun really needs to be out, but as far as I remember there's still a good amount of energy available on cloudy days (assuming it's not pitch black). As solar technology progresses, so too will the range of locations where it can be installed.
> it's also going to require even more expensive gear to cut multiple inch thick conductors.
I agree with most of what you've said, but cutting a few inch thick conductors doesn't seem all that hard. I think a single stick of dynamite would do it pretty easily. Two for good measure. This would still be an act of war of course.
Then "evil state" bombs cables, or power plant. The point is that the cost of defending such a crucial part of our infrastructure would not be trivial, and it's still hosted on foreign soil, so... politics.
Renewable power is going to be much more distributed than our current power generation. Further, you want redundancy in power transmission so you can do maintenance anyway, and many countries already depend on foreign sources for a significant chunk of their daily energy demand for economic reasons. Any actor that can destroy thousands of cables can cut the power systems we have now.
Further, outside of cities, rooftop solar can generally power the building under it, making the grid a lot more resilient to cut cables.
Isn't this a problem with any centralised electrical power source? In fact I've always thought that terrorists could cause far more disruption by taking out a couple of transmission line towers than blowing people up. And it would only take a decent sized spanner in many cases.
The only solution to this (and to the massive resistive losses over power lines) is localising power production. And that's where solar, wind, small-scale hydro and local battery storage look very effective.
Seems a lot easier to fix a cable cut than a sabotaged nuclear power plant. I mean if you are the point where a nation-state is detonating fuel-air explosives over solar farms then there's probably no power plant design that is going to survive.
Not that easy. Furthermore, it's the same for any large power plant of any type. This is a very poor argument against solar. It's also a stretch to imagine that Europe would rely on a single power plant for its sustenance.
High latitudes (above 50 degrees) have a big problem with solar power. Luckily, almost the only people who live that far from the equator are Northern Europeans, who tend to have access to hydro and nuclear. (this is why it's so stupid for Germany to shut down their nuclear power... they're replacing it with coal, brown coal, and gas!)
The vast, VAST majority of the world's population lives FAR closer to the equator.
Again, you seem to have ignored the part of my post where I mentioned installing enough solar to produce power even during cloudy days. Utility solar is already super cheap. If the cost keeps falling, it'll be plenty cheap to install 3-5x as many panels as you nominally need, so that there's enough power even on cloudy days. Yes, solar CAN get cheap enough for that to be a good idea.
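For a rough sense of where a 3-5x figure might come from, here's a one-line sketch (assumed number: panels producing ~25% of rated output under heavy cloud cover; real figures vary widely with cloud type and season):

```python
# Back-of-envelope version of the overprovisioning argument.
# The 25% cloudy-day output is an assumed illustrative figure.
cloudy_output_fraction = 0.25
overbuild_factor = 1 / cloudy_output_fraction  # capacity multiple needed
print(overbuild_factor)  # 4.0 -- inside the 3-5x range mentioned above
```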
I haven't ignored the part where you mentioned overprovisioning. Where do you propose to put the excess power during sunny days? Stable baseline sources like nuclear cannot just be shut down or spun up on a moment's notice to accommodate a vast amount of solar energy entering or leaving the grid. At the same time, we lack the capability to store electricity at sufficient scale, so we need the stable baseline to cover a high enough percentage of our production.
The fact of the matter is, as long as we don't have a scalable storage solution, power sources where we can't regulate the power output cannot take up base load and can at best be a small overlay that can pick up load if the beneficial weather happens to coincide with higher load requirements.
This really is not a big technical problem. You simply don't output the power. This is trivial with any kind of decent power-optimizing inverter setup commonly available today. Solar PV isn't spinning a turbine that needs to find a load. If you disconnect a solar panel, it doesn't damage itself, it just sits there.
But I'm sure we'd eventually find uses for this extra energy. Like producing ammonia for agriculture.
No one I know is opposed to renewable energy, but advocates really do everybody a disservice when they try to argue that an intermittent power source without storage is a reasonable replacement for base load power. As Bill Gates said in an interview "…They have this statement that the cost of solar photovoltaic is the same as hydrocarbon’s. And that’s one of those misleadingly meaningless statements. What they mean is that at noon in Arizona, the cost of that kilowatt-hour is the same as a hydrocarbon kilowatt-hour. But it doesn’t come at night, it doesn’t come after the sun hasn’t shone, so the fact that in that one moment you reach parity, so what? The reading public, when they see things like that, they underestimate how hard this thing is. So false solutions like divestment or “Oh, it’s easy to do” hurt our ability to fix the problems. Distinguishing a real solution from a false solution is actually very complicated."
The term "baseload power" is not a goal; it's a problem with some types of power not being flexible. In that context, solar is closer to baseload power than to peaking power plants. Yes, it currently matches demand better than baseload power does, but as you ramp it up, the supply/demand mismatch will simply shift around the day.
Turning on some peaking power plants at night vs the day is not a significant issue. Further, the solution is excess capacity not long term storage.
You are handwaving away huge issues here. As Bill Gates said in the same interview I quoted from before:
>"...It’s kind of ironic: Germany, by installing so much rooftop solar, has it that both their coal plants and their rooftop solar are available in the summer, and the price of power during the day actually goes negative—they pay people to take it. Then at night the only source is the coal, and because the energy companies have to recover their capital costs, they either raise the price because they’re not getting any return for the day, or they slowly go bankrupt.
There are many people working on storage—batteries are a form of storage, and there’s a few others, like compressed air, hot metals. But it’s not at all clear that we will get grid-scale economic storage. We’re more than a factor of 10 away from the economics to get that."
Having fossil fuel plants that only operate when needed, but get paid more per kWh than they do now running all the time (since as well as fuel costs they have up-front construction costs to cover), is not even close to being a problem. It is in fact part of the solution. What is he doing complaining about this?
Negative prices are also not the big drama he makes them out to be. Sad that he's got a reputation as some kind of genius and he's scared of negative numbers as if it's some kind of witchcraft to have too much of something and use pricing and markets to decide which providers should scale back.
Translating for those scared of negative prices: someone wants to keep providing power for some reason (maybe their setup doesn't tolerate rapid changes in output, which would cost them money in repairs), so they'll pay you $X to shut down your system instead. Do the calculation and figure out whether shutting down your system is cheaper than that; if so, take the money. Market forces at work! The inflexible systems make less money, and the flexible systems gain it.
Or, someone with flexibility in when they need electricity says "I'll agree to heat my water tanks at the point that helps the grid most, how much will you pay me for doing that? The amount we'll pay you is the negative price, except not negative anymore".
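The curtailment decision described above can be sketched as a toy calculation (all numbers hypothetical):

```python
# Toy version of the negative-price decision: with a negative price,
# a producer compares the payment for curtailing against its own cost
# of shutting down and restarting. All figures are hypothetical.
def should_curtail(price_eur_per_mwh: float, shutdown_cost_eur_per_mwh: float) -> bool:
    """Curtail when the payment for stopping exceeds the cost of stopping."""
    payment_for_curtailing = -price_eur_per_mwh  # negative price = payment
    return payment_for_curtailing > shutdown_cost_eur_per_mwh

print(should_curtail(-30.0, 10.0))  # True: paid 30/MWh, stopping costs 10/MWh
print(should_curtail(-5.0, 10.0))   # False: cheaper to keep producing
```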
Coal and nuclear plants are huge steam engines. They cool down when you turn them off, which causes thermal stress, and you then need to heat them up again before they can generate power. Thus Germany's problem is that they have coal power plants in the first place.
Gas turbines also suffer thermal stress, but they are designed for it, and they lose vastly less output when you turn them off. With enough of them you can have zero storage and zero problems, because storage is just one solution to the production/demand mismatch, not the only one.
Now yes, peaking power costs more but energy costing more at night is rational behavior and without subsidies you will minimize overall costs.
On the plus side, burning natural gas emits less CO2 than coal, emits no mercury or other particulates, and has no waste problem like coal's. Unfortunately for those who care about climate change (and if you don't, you should), there are inevitable methane releases from fracking and from the distribution of natural gas, and those are now known to be much worse than previously thought:
>...Back in August, a NOAA-led study measured a stunning 6% to 12% methane leakage over one of the country’s largest gas fields — which would gut the climate benefits of switching from coal to gas. We’ve known for a long time that methane is a far more potent greenhouse gas than carbon dioxide (CO2), which is released when any hydrocarbon, like natural gas, is burned. But the IPCC’s latest report, released Monday (big PDF here), reports that methane is 34 times stronger a heat-trapping gas than CO2 over a 100-year time scale, so its global-warming potential (GWP) is 34. That is a nearly 40% increase from the IPCC’s previous estimate of 25.
...The IPCC reports that, over a 20-year time frame, methane has a global warming potential of 86 compared to CO2, up from its previous estimate of 72. Given that we are approaching real, irreversible tipping points in the climate system, climate studies should, at the very least, include analyses that use this 20-year time horizon. Finally, it bears repeating that natural gas from even the best fracked wells is still a climate-destroying fossil fuel. If we are to avoid catastrophic warming, our natural gas consumption has to peak sometime in the next 10 to 15 years, according to studies by both the Center for American Progress and the Union of Concerned Scientists.
As we use more and more natural gas, we can expect more and more methane disasters like the leak from Aliso Canyon in CA which was the largest leak in US history. This released over 100,000 tons of methane into the atmosphere and required 11,000 residents to be evacuated.
100,000 tons of methane, with a half-life in the atmosphere of less than 10 years. The math is complex, and 1 ton of methane becomes more than one ton of CO2, but the actual long-term impact is fairly minimal vs the savings from not using coal. Remember that's ~100,000 tons / ~10,000,000,000 tons = 1/100,000; so yes, methane is bad, but in this context it's 0.001%, which is still a rounding error even if methane is 100 times that bad per ton.
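That back-of-envelope claim can be checked directly with the thread's own figures (assumptions: ~100,000 t of CH4 leaked, ~10 billion t of CO2 as the reference total, and the IPCC GWP values of 86 over 20 years and 34 over 100 years quoted earlier):

```python
# Back-of-envelope check of the "rounding error" claim, using the
# thread's own figures. The reference CO2 total is the rough number
# from the comment above, not a precise inventory.
leak_ch4_t = 100_000
reference_co2_t = 10_000_000_000
gwp_20y, gwp_100y = 86, 34  # IPCC AR5 values quoted in the thread

mass_fraction = leak_ch4_t / reference_co2_t   # 1/100,000 = 0.001%
co2e_fraction_20y = mass_fraction * gwp_20y    # ~0.086% as 20-year CO2e
co2e_fraction_100y = mass_fraction * gwp_100y  # ~0.034% as 100-year CO2e

print(f"{mass_fraction:.3%} by mass, {co2e_fraction_20y:.3%} as 20-year CO2e")
```

Even weighted by the 20-year GWP, the single leak stays well under a tenth of a percent of the reference total, which is the "rounding error" point being made.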
Anyway, I agree it's not without costs. But, if we got to the point where peaking power from gas makes up ~10% of total electricity supply and the rest of that is renewable power then we will have made massive progress. In other words don't let perfection stand in the way of progress.
>100,000 tons of methane with a half life of less than 10 years. The math is complex and 1 ton of methane becomes more than one ton of CO2,
You are downplaying the impact of methane. As I quoted in the last post, the IPCC estimate is that over 100 years a given amount of methane is 34 times more potent as a heat-trapping gas compared to CO2. Over a 20 year period (which is probably more relevant considering where we are in climate change) methane is 86 times more potent.
>...but the actual long term impact is fairly minimal vs the savings from not using Coal.
That's the problem - enough methane is being released by our new reliance on natural gas that there is concern we may be having just as much of an impact on climate change as if we had gone and burned coal.
>...Remember that's ~100,000 tons / ~10,000,000,000 tons = 1/100,000 so yea methane is bad, but in this context it's 0.001% so it's still a rounding error even if it's 100 times that bad.
The problem is that EVERY natural gas well, storage facility and pipeline leaks methane. This was just the largest one. As I quoted in the last post, "...Back in August, a NOAA-led study measured a stunning 6% to 12% methane leakage over one of the country’s largest gas fields — which would gut the climate benefits of switching from coal to gas."
>... But, if we got to the point where peaking power from gas makes up ~10% of total electricity supply and the rest of that is renewable power then we will have made massive progress.
Yea no kidding - 90% power from intermittent sources would be pretty remarkable. The problem is that 90% power from intermittent sources won't be possible unless there are major advances in grid storage technologies.
In terms of having to use natural gas: nuclear power plants in France and Germany operate in load-following mode, so there is no reason this can't be done elsewhere. Minimizing our use of natural gas, when you consider its climate-changing negative externalities, would be a good thing.
That's just lying with numbers. 86 / 5 = 17.2, so the average from year 20 to year 100 is 34 - 17.2 = 16.8x. And again, the majority of that 16.8 falls over years 20-40.

Further, because of this decay, increases in release rate don't stack at 86x year over year. You get a magnified increase in the first year and a smaller net increase every year after that, until you're down to just the carbon content of the methane.
Under 10% total gen from peaking power is viable with over production of wind and solar and zero grid storage. It's not going to be 10% 24/7 but 0 most of the day, regular generation at part of the day, and rare spikes up to 50%. You can drop that under 5% with moderate grid storage of around 1h average demand, but you start needing vastly more storage the lower you want to take that number.
No, those numbers are from the IPCC which represents the mainstream consensus. Take it up with them if you think you are right and they are wrong. Using the 20 years number is reasonable considering how far along we are in climate change. As the IPCC report states: "There is no scientific argument for selecting 100 years compared with other choices (Fuglestvedt et al., 2003; Shine, 2009). The choice of time horizon is a value judgement since it depends on the relative weight assigned to effects at different times."
>...Under 10% total gen from peaking power is viable with over production of wind and solar and zero grid storage.
Yea I hear advocates make such claims all the time. Unfortunately I never hear the same thing from people who actually run a major utility. We might get there someday, but for now I have to agree with Gates: "...What they mean is that at noon in Arizona, the cost of that kilowatt-hour is the same as a hydrocarbon kilowatt-hour. But it doesn’t come at night, it doesn’t come after the sun hasn’t shone, so the fact that in that one moment you reach parity, so what? The reading public, when they see things like that, they underestimate how hard this thing is. So false solutions like divestment or “Oh, it’s easy to do” hurt our ability to fix the problems. Distinguishing a real solution from a false solution is actually very complicated."
Ed: Make that 5/4 * 16.8x, or 21x. However, methane again becomes more than its own weight in CO2, because the two O's in CO2 make up a larger fraction of the molecule's mass than the four H's in CH4 do.
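For reference, the mass conversion mentioned here works out directly from the molar masses (CH4 = 16 g/mol, CO2 = 44 g/mol):

```python
# Oxidizing 1 ton of CH4 eventually yields 44/16 = 2.75 tons of CO2,
# since each CH4 molecule becomes one CO2 molecule.
M_CH4 = 12 + 4 * 1   # g/mol: one carbon, four hydrogens
M_CO2 = 12 + 2 * 16  # g/mol: one carbon, two oxygens
tons_co2_per_ton_ch4 = M_CO2 / M_CH4
print(tons_co2_per_ton_ch4)  # 2.75
```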
Nobody ever mentions varying the price of electricity to reduce demand when the sun isn't shining. We need to get past the assumption that electricity prices should be fixed.
This won't solve the baseload problem, but it will reduce it.
Electricity prices have never been fixed at scale. It's really only small-scale consumers that get a fixed (but often higher) rate. See Enron, for example. Changing when energy is cheap does not break the model.
So yea, residential customers get a fixed rate but that's because there is no point in demand based pricing. They simply charge you enough that it makes little difference what time of day you use slightly more power. In the future the same is likely to be true.
You can use the excess to create fuel, such as cracking H out of H2O, and then use that to reduce the nighttime problem. I don't think it'll be hard to find a use for excess power.
That's a horrible solution. Water is even less portable than energy, even the driest desert can't afford to import water. You're trading a highly liquid but hard to transport commodity for a highly illiquid and hard to transport commodity.
Scenario 1: Your solar power is used near the site of the power installation. In this case, there must be people living near the site in order to use all that power. Those people require freshwater. Therefore, there will be a means to transport water to the site.
Scenario 2: Your solar power is used elsewhere. In this case, there must be a means to transmit power to the people who will use it. Those people are likely to live near seawater. The closest such people are also likely to have a limited supply of freshwater.
You are already either transporting water to the solar site, or transporting power away from it to somewhere else. So the idle capacity of the solar plant will be usable for desalination and/or wastewater reclamation. And I think aqueducts, particularly seawater aqueducts, are cheaper than you realize. The majority of energy required is in increasing the elevation of the water, and that can be done with surplus power, too.
We already have a solution to all our energy needs. Funny to see people come up with outlandish suggestions. I am not sure what I am missing, because logic points in only one direction.
I don't think there is a viable solution for buffering solar or wind (or nuclear) with renewable sources, unless you happen to have access to a lot of hydro power.
Dynamic remote cut-off of heaters, chargers, and other non-critical consumers is one suggestion, but the equipment cost is probably at least 100 Eur per consumer.
If the energy is free, you could just do electrolysis with huge carbon electrodes and immediately ignite the gases, bubbling the exhaust through your freshwater reservoir. You can knock together a setup like that in your own backyard from junkyard parts.
How you do it depends on whether you prioritize capital costs or operating costs.
You can do things with the excess power that otherwise would be un-economical to do during a cloudy day. Things like smelting aluminum, desalinating water, etc take massive amounts of power.
They already do this without any gov intervention. The local steel recycling plant only operates when power is cheap. While they do need some advance notice to make sure they have enough people to run, they probably can get a good enough idea from a weather forecast (power here is mostly hydro with a dollop of wind and fossil).
Some processes are better fits than others. If you have 2-5x extra capacity that is sold cheap on sunny days, someone will figure out how to use it.
Do we have any numbers on what it means that they run only when the power is cheap?
Does that mean they can operate in 4-hour windows? Or do they have to run the furnace for a minimum period? A large quantity of steel takes a large amount of energy to melt.
It could also be a plant for shredding scrap, where I guess it wouldn't be a time issue so much (and if it doesn't have the material inflows to run full time, timing cheap power would make sense).
Finland's net imports of electricity in 2009 were 15%, about the same as today. They have a lot of hydro power and little reason to stop using it. Countries are OK with risking 15% of their generation on external sources because you can get by on 85% supply with minimal disruption, and you can ramp up supply in the short term.
Solar in the Sahara is an example, but countries already spend a few percent of GDP on safety; being less efficient but more independent is generally an accepted trade-off. Further, Sahara solar > desalination > farming is probably a better use of that space than transmitting the power away.
> How exactly do you propose to get that electricity from the desert to say Finland?
High voltage DC transmission works fine for this problem. All of the engineering challenges have been credibly worked out and priced (https://en.wikipedia.org/wiki/Desertec). The main problems with this approach are political, in that it requires massive cooperation between many countries over very long time scales (decades).
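As a rough sense of scale (assumed numbers: ~5,000 km of line from the Sahara to Finland and ~3.5% loss per 1,000 km, a commonly cited ballpark for modern HVDC, not a Desertec figure), the end-to-end losses would be real but not prohibitive:

```python
# Ballpark HVDC transmission-loss estimate. Both the distance and the
# per-1,000-km loss rate are assumed illustrative figures.
distance_km = 5_000
loss_per_1000_km = 0.035
delivered_fraction = (1 - loss_per_1000_km) ** (distance_km / 1_000)
print(f"{delivered_fraction:.1%} of generated power delivered")  # ~83.7%
```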
Why is there a security risk with putting solar in the desert? It seems like a relatively easy to secure location, and solar cells are hardly high value-density targets for theft or vandalism.
So you think northern African nations are going to let European nations station troops there and take over a big chunk of their land? That doesn't seem realistic, and seems rather imperialistic actually.
Letting the northern African nations provide the security is just plain folly, and also makes Europe completely dependent on these economically backwards and frequently hostile nations.
Why wouldn't an African nation want to charge rent to some European country for vast amounts of land that are otherwise literally worthless[1]? It's not like they're strangers to private company security forces that are basically just extrajudicial mercenaries. History says any company that wants to do this just needs to apply some bribes to the appropriate officials.
[1] In the sense that nobody is willing to buy it so it has no price.
It won't be uniformed European troops. It will be Blackwater type mercenaries. They'll have an implicit license to kill and/or abuse any locals. That's how you do security for European companies in Africa.
You really think you should put the entire welfare and existence of the European continent in the favor of some hired Blackwater mercenaries? Are you f'in serious?
The danger is political risk, not necessarily theft or vandalism. Any dependency on territory not under your own societal control can be used to hold you hostage and cannot be depended upon.
Forget the solar farm; the easiest approach is to cut the power lines (explosives?).
Europe imports around 30% of its oil and gas from Russia. The pipelines between the east and west were originally setup during Soviet times.
Given how stable that has been it seems unlikely that a small African country, who will be getting a big chunk of cash from Europe, would do anything stupid.
They haven't been perfectly stable, though. We've seen since 2005 a few cases of oil and gas prices surging upwards as part of the Russia-Ukraine dispute (in 2005, 80% of all gas from Russia to the EU went through Ukraine).
Europe is currently highly dependent on Russia's natural gas for their electricity; I don't see this as worse. Indeed, better to diversify your power imports.
(Better still to generate locally, but not every country has the land available to do so).
Not sure why you're being downvoted. The Chinese are actively researching in the area of ultra high voltage DC transmission so that power can be distributed long distances from remote renewable resources. Even if the problem were as simple as a direct link from the Sahara to Finland, China has already built UHVDC transmission lines half that length. UHVDC technology is only improving with time and it's an exciting field to watch.
And the vast area is still very small compared to agriculture. And it can be done in chunks. The electrical grid is concentrated by its very nature, but we still manage just fine.
(Solar is also one of the few truly distributed power sources, but I expect most power in the future will come from vast solar farms due to the ease of automation.)
True, and this is an issue today as well, which is why I think that proposing a solution that compounds this issue even further should not be taken lightly.
Edit: put differently, why should we adopt a new solution that is even more exposed to the already known risks?
A large solar field makes for a pretty diffuse target. You might be able to target the connection to the grid, but that's no worse than it is today (and would be easily repaired).
And I don't think our current electrical grid has a major problem that even a slight attention to infrastructure couldn't fix.
Ok so you are saying that the new solution isn't any worse in terms of security than the current one, so it's fine?
In any case, you cannot compare this to oil pipelines directly because we can stockpile oil. Even if all the pipelines are taken out, many countries stockpile oil that can bridge them over until an alternative solution is found. If a considerable part of the electricity comes from some remote site in a desert and the connection is taken out, that'll immediately plunge the country into darkness.
I like your optimism about all the problems being trivial, but with your mindset every problem we currently face is trivial.
At this point, why don't you just propose that we go and harvest Helium-3 from Jupiter? I mean, we already sent some probes that far, it's just some engineering issues, right?
There are a whole host of political, engineering and social issues that need to be sorted before something like you propose can be done. It can work for individual countries in certain regions, but it is not in a state where it can fix the energy problems of the world or liberate us of the carbon dependence.
"Ok so you are saying that the new solution isn't any worse in terms of security than the current one, so it's fine?"
Yes. You're blowing up the security argument when really it isn't the security of it that is the problem -- our current infrastructure isn't more secure, and we have had only minimal problems with it thus far. The problem is the imminent and looming destruction of various thousands of ecosystems, and the continuing habitability of the planet.
It could be done. We got to the moon in about twenty years, which was much more of a technological leap than this will be. With the concentrated effort and enough propaganda, we could make the switch from coal/gas/oil. We are very rapidly running out of runway.
I'm the classic US-centric American. Northern Europe can solve its own problems, they have all the tools (nuclear, hydro, wind, geothermal). The rest of the world (i.e. the vast majority of people) is closer to the equator and doesn't need a cable running to the Sahara.
About a decade ago, a book looked at whether Great Britain could become entirely self-sufficient in renewable energy (https://www.withouthotair.com/ looks to be the URL).
It concluded that there is literally not enough renewable energy potential in Great Britain (e.g., cover every square millimeter of space with solar panels) to do so at then-current energy consumption rates. It also pointed out problems with approaches like "giant solar fields in the Sahara" (the requisite solar fields are, well, roughly the size of Germany in terms of required area). Nuclear is technically not a renewable energy resource, but it does have requisite power generation capabilities--and very staunch anti-nuclear activism that makes expanding nuclear difficult.
"It concluded that there is literally not enough renewable energy potential in Great Britain (e.g., cover every square millimeter of space with solar panels) to do so at then-current energy consumption rates."
That is false, though. Just a little math shows why.
The UK is 242,500 km². Nameplate solar capacity on that would be on the order of 150MW/km², so 36 Terawatts. UK average electrical consumption is 32 /Giga/watts. Assume solar produces 1/4 of that 36TW on average when it's sunny during equinox, so 9TW. Even on a pretty cloudy day on the winter solstice in London, you're still going to produce at least 1/10th to 1/20th of that much, so worst case of worst cases (cloudy day on solstice) you're looking at 450GW average if you cover the entire UK in solar panels. That's still over ten times the UK's average electrical consumption.
(I used UK instead of Great Britain, but that doesn't make much difference.)
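That arithmetic can be checked in a few lines (same inputs as the comment above, nothing measured):

```python
# Back-of-envelope check of the UK solar estimate.
area_km2 = 242_500            # UK land area
nameplate_mw_per_km2 = 150    # assumed panel density
avg_demand_gw = 32            # UK average electrical consumption

nameplate_tw = area_km2 * nameplate_mw_per_km2 / 1e6   # MW -> TW
sunny_avg_tw = nameplate_tw / 4                        # ~1/4 of nameplate around equinox
worst_case_gw = sunny_avg_tw * 1000 / 20               # 1/20th on a cloudy winter solstice

print(f"nameplate: {nameplate_tw:.1f} TW")             # ~36.4 TW
print(f"cloudy-solstice floor: {worst_case_gw:.0f} GW")
print(f"margin over average demand: {worst_case_gw / avg_demand_gw:.1f}x")
```

Even the pessimistic floor comes out north of 450 GW, roughly 14x average UK demand.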
I mean, the overall point that you wouldn't want to power the UK on solar alone is totally and completely valid (and I fully support the Hinkley Point C nuclear power plant, by the way), but the idea that it's literally not possible is BS.
Even if you include TOTAL energy instead of electrical energy (in the US, electricity is responsible for 40% of energy use, so I'll use that as a guesstimate), and assume 100MW/km^2 instead of 150 (even though I already accounted for that in the cloudy winter day part), you're still producing about 3.7 times as much energy on that cloudy winter day than the total UK energy use. And that's before we use higher efficiency solar panels or deep sea hydrogen seasonal storage (the latter of which is one of the very few sensible seasonal energy storage methods).
It's an absurd claim to say it's impossible, and there's almost nothing more annoying than a source claiming to be "without the hot air" actually being full of plenty of hot air on absolutist claims like this. You don't correct BS by using equal and opposite BS.
(Most energy advocates I know do this... both renewable advocates and nuclear advocates, exchanging BS about the opposite technology. It's a clean energy circular firing squad, with the climate in the middle.)
Energy consumption here does not mean just current electricity usage. It also includes things like fueling cars and planes. Also, the 150MW/km² is apparently roughly the summer expected solar insolation; year-round average insolation comes out to ~100MW/km² per the book's numbers.
Highly distributed rotating masses would handle a large part of the storage, turning it into more of a power smoothing problem.
Secondly, I presume that at some point we'll be covering every building (rooftop and window) with some sort of solar film that converts sunlight to electricity.
Why don't we have those today in large-scale production? Probably for the same reason that the US builds more roads instead of mass transit - it's a bias to build what we've always built, not a real technological limitation.
There's definitely a link between where it's economical to place a panel and panel efficiency. As the technology matures it will make sense to place it more places. I think at some point, ultimately, it will be in the paint of cars/buildings.
Exactly. Just like Moore's law drove the price of computing power to put it practically everywhere, there is a cost curve for solar that will do the same.
Solar PV is a great distributed power source to install on roofs, and maybe that's it. You can combine it with partial storage for household consumption, but even that is a burden.
I read an article that the Nordics actually get as much sun as southern Europe, if you count the whole year. It's as good a place for solar as anywhere else.
The summers are sunny all day and night. The problem becomes how to store the energy until the dark winter months.
(There are solar panels that don't need heat, just light.)
The seasonal variation of solar power is a big problem if you want to rely mostly on solar power, though. Seasonal storage is insanely expensive with any of the usual methods. I mean, let's say to get daily storage, you need 8 hours of batteries. To get 4 or 5 weeks of storage, you need more like 800 hours of batteries. It's literally 100x the cost.
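The 100x figure is just the ratio of storage hours, but putting rough dollars on it makes the point vivid. A minimal sketch, where the $/kWh figure is an illustrative assumption and cost is taken to scale with energy capacity:

```python
def storage_capex_usd(hours, avg_load_gw=1, cost_per_kwh=150):
    """Battery capex for storing `hours` of average load,
    assuming cost scales with energy capacity (an assumption)."""
    return hours * avg_load_gw * 1e6 * cost_per_kwh   # GWh -> kWh -> USD

daily = storage_capex_usd(8)       # ~$1.2 billion per GW of average load
seasonal = storage_capex_usd(800)  # ~$120 billion per GW of average load
print(f"seasonal is {seasonal / daily:.0f}x the daily capex")
```

At any plausible battery price, multiplying the required hours by 100 multiplies the bill by 100, which is why batteries alone don't solve the seasonal problem.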
So yeah, if you want to run on natural gas for half the year, then solar in the nordics isn't bad.
But luckily the nordics have lots of hydro and don't need any gas or even solar.
But on the equator, the seasonal variation is small. The Global South will have a big energy cost advantage as solar costs keep dropping.
Spoken like someone who has never visited a desert.
As a Southern California native, I am intimately familiar with the Mojave, Anza Borrego, and Sonora deserts. I will tell you that these deserts are far from "crappy land that nothing can grow on" as you put it.
The North American deserts are areas of huge ecological and geological diversity. They are also some of the most beautiful and awe-inspiring landscapes that exist on this planet.
We do utilize a lot of area in the Western Mojave for solar and wind power. For instance, there is a HUGE field of wind turbines in the San Gorgonio pass near Palm Springs at the intersection of the 10 and the 62.
That said, I am in no way in favor of scarring any more of our deserts by covering "a fifth to a tenth" of them in solar panels. However, I am not against concentrated solar facilities several square miles in area as long as they remain contained.
I have thought about how we can "disperse" a solar panel network into a system of smaller panels and batteries, and the best idea I can come up with is requiring all commercial buildings to cover their roofs in solar panels and maintain on-site battery banks, and requiring all homes and apartment buildings to have at least one solar panel per household. These ideas come with their own trade-offs, though.
1% of all deserts is only correct if you are talking about today's electricity use only. Even moderate growth - and actually solving energy, not just the electricity subset - drives these percentages much higher. To get US-centric again, NREL published an optimistic 2016 report suggesting that the maximum share of electricity rooftop solar could ever supply is 40%. That's 15% of today's energy use in an electricity-heavy country.
>1% of all deserts is only correct if you are talking about only today's electricity use. Even moderate growth and actually solving energy, not just the electricity subset drives these percentages much higher.
U.S. electricity usage has been flat for the past 10 years[0] despite a cumulative GDP growth of 34% over the same period. I'm not saying the same trend will continue to hold, but you are overestimating future electricity usage growth.
> That said, I am no way in favor of scarring any more of our deserts by covering "a fifth to a tenth" of them in solar panels.
Tough luck. We live in a democracy. When we run out of oil, the number of people who want electricity are going to outnumber the number of people who want to look at dirt.
Thanks for writing this. I am not sure where this solar lobby suddenly sprang from. If people are so eco-conscious that they support solar, I am not sure how they are OK with huge parcels of land being covered with panels. We are not even talking about pollution from storage technologies.
Lots of people are trying to apply that cleverness to fission. The problem is that governments make life much harder for fission projects than for fusion projects. There's some justification for that; fission done well is great, but fission done badly can cause much more serious accidents than fusion done badly, and proliferation is much less of a concern with fusion.
Even so, the regulation is more of a problem than it needs to be. The NRC currently requires a near-complete design up front, costing several hundred million dollars, before they even hint at whether they'll approve your project. If they turn you down, you're done. If they approve, you've still just got paper; it's only then that you get to start experiments with nuclear material. If they just had a more incremental process it'd be a big help.
Meanwhile, people build deuterium fusion reactors in their basements and nobody bats an eye.
Yes, I was being US-centric. I know most about the energy situation in the US, and I happen to live here. :)
Thinking deeper into the future, floating solar farms could also be a thing. The oceans are huge, and most nations are not land-locked. They'd have their own engineering challenges, for sure, but those challenges look like patty-cake compared to fusion power.
I have to think there are major ecological concerns with blocking sunlight over large areas. They may not be negative impacts, but surely it isn't a no-impact situation.
There's a disease in our energy discussions where if something has ANY effect or can be measured at all, it is considered a big no-no. We lack a sense of proportion. Radiation is SUPER easy to measure even tiny amounts, for instance, since you can look for the signature of a certain decay tree, so people hear that you can measure radiation from Fukushima and lose their minds. Likewise, people make a big deal about solar using up a lot of land, but then forget that agriculture uses up on the order of 100 TIMES the land that solar would take up if we relied entirely on solar for electricity. The sense of scale is entirely lacking.
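The agriculture comparison survives a rough sanity check. All figures here are US-centric placeholder assumptions, chosen only to establish the order of magnitude:

```python
# Order-of-magnitude check of "agriculture uses ~100x the land solar would".
us_avg_electric_gw = 450      # rough average US electric power draw
solar_avg_mw_per_km2 = 30     # ~150 MW/km2 nameplate at ~20% capacity factor

solar_km2 = us_avg_electric_gw * 1000 / solar_avg_mw_per_km2  # land for all-solar electricity
us_cropland_km2 = 1.5e6       # rough US cropland area

print(f"solar area needed: {solar_km2:,.0f} km2")
print(f"cropland/solar ratio: {us_cropland_km2 / solar_km2:.0f}x")
```

Roughly 15,000 km² of panels versus ~1.5 million km² of cropland: about two orders of magnitude, consistent with the claim.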
Agree and it extends to risks. Better to kill 100K people per year with coal, than possibly maybe hundreds might die (but almost certainly zero actually will) in a nuclear accident. Your diet and your car are far more likely to kill you than global warming.
An example of this which always gets me with nuclear power is the waste problem. The fact that the waste remains dangerous for tens of thousands of years is always held up as a huge problem, completely ignoring the vast quantities of extremely poisonous chemical waste produced by other industries which lasts forever.
"This waste is dangerous." "OK, guess we'd better be careful with it."
"This waste is dangerous for the next 10,000 years." "Holy shit! We need to build a high-tech underground fortress to contain it!"
> "This waste is dangerous for the next 10,000 years." "Holy shit! We need to build a high-tech underground fortress to contain it!"
Time is a major risk factor. Look where we were 10,000 years ago. There is a real chance that 10,000 years in the future no one will easily understand the danger of radiation anymore, so we have to prepare for that possibility.
You seem to have missed the other half of the point I was making, which was that there's lots of non-nuclear hazardous waste which lasts forever and we don't get nearly as upset about those.
Energy storage is easier than fission or fusion. We've just been working on the other two problems for ten times as long. Grid energy storage became financially viable maybe five years ago and there are already startups in compressed air and sodium-ion batteries (which can be potentially much larger/cheaper than lithium-ion at the cost of being less space-efficient). Nuclear energy has been financially viable forever and it took an existential crisis for the richest nation in the world to devote all of its resources to it for years to make it happen.
Energy storage for a few hours is easier than fission. Seasonal energy storage, on the other hand, is much more difficult.
This is why I think a mix of clean power sources (including some nuclear) is going to be optimal.
I am of the firm belief that there are MULTIPLE feasible ways to solve the clean energy problem that we, as a society, could probably do if we put as much effort into it as we did a world war. Examples:
<>nuclear France (ultimately using breeders of uranium or thorium)
<>hydro/geothermal Iceland
<>over-built intermittent renewables buffered with grid storage and perhaps either a little hydrogen seasonal storage or a mega-grid connecting north to south
even exotic solutions like
<>inertial or magnetic confinement fusion
<>space-based solar power (which is easier than fusion)
But the optimal (both in time and money) solution is to use a mix of clean energy sources (solar, wind, hydro, geothermal, nuclear) and some storage and some demand-response (smart meters are a pretty obvious solution, here) and some increased geographical grid interconnection (UHVDC power lines).
Really the only reasonable worldwide solar plan I've ever seen is solar power satellites. There has been a lot of work done there; they beam the energy back to the planet as microwaves, feeding it into the existing grid where it is needed. But so far the cost of getting all of that material into orbit to build them is something of a deal breaker.
>At this level, our 3.6 km diameter collecting area would generate about 40 GWh of energy in a day, at an assumed reception/conversion efficiency of 70%. By comparison, a flat array of 15%-efficient PV panels occupying the same area in the Mojave Desert would generate about a fourth as much energy averaged over the year. So these beaming hotspots are not terribly more concentrated than what the sunlight provides already. Again, I find myself scratching my head as to why we should go to so much trouble.
Well, you sure can't point the power delivery beam at downtown LA either.
And the footprint of the rectenna array is enormous-- Tom assumes a 100GHz transmit antenna 30m wide, which means the downlink antenna has to be 3.6km wide! And this is the minor axis-- the solar array has to be in geosynchronous orbit, which means any receiving antenna not on the equator has to be an ellipse.
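That 3.6 km figure follows directly from diffraction: the minimum ground spot is roughly wavelength times distance over transmitter diameter. A quick check using the comment's assumed numbers:

```python
# Diffraction-limited beam spot on the ground: spot ~ wavelength * distance / aperture.
c = 3.0e8                 # speed of light, m/s
f = 100e9                 # assumed transmit frequency, Hz
d_geo = 35_786e3          # geosynchronous altitude, m
tx_diameter = 30.0        # assumed transmit antenna diameter, m

wavelength = c / f                       # 3 mm
spot = wavelength * d_geo / tx_diameter  # ground spot diameter, m
print(f"minimum rectenna width: {spot / 1e3:.1f} km")
```

So the kilometers-wide rectenna isn't a design choice that better engineering can shrink much; it's physics, unless you fly a much larger transmitter or a much higher frequency.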
Read the article that this conversation is attached to for a list of very good reasons. Getting fusion to work is most certainly possible, but incredibly hard. Engineering for salt water is way easier than engineering for neutron activation, achieving high neutron economy to breed the tritium, managing tritium, managing a high temperature reactor next to a very cold magnetic containment system, a high vacuum system, oh and the very basic problem that has yet to be solved (achieving even just breakeven fusion in a non-H-bomb) etc, etc. It really does look trivial to just properly seal stuff against saltwater.
We'll probably need fusion to help reach the stars, so I think it's definitely something that should be pursued, but making it an economic power source for Earth seems very unlikely to me.
We already have things that float in the ocean for long periods of time (ships, oil rigs) and which withstand ocean storms, so it seems like a solved problem compared to fusion.
Truthfully in the medium term next generation modular fission reactors seem like the way to go, as a complement to intermittent and variable solar/wind power.
If the LPPFusion approach eventually works out, it will be a giant win compared to tokamak or implosion fusion reactors.
By the time you mount those panels on a floating structure the lifetime energy yield may well drop to zero or (most probably) negative. Building a ship has a huge energy/emissions cost as the raw materials have to be mined, refined, transported, formed and assembled.
I'd recommend this site for a good overview on where our energy really goes and what might be achievable with renewables: https://www.withouthotair.com/
Bump to Robotbeat. Fission reactor designs, with many clever advances in safety, have never been tested because of the lack of support from the U.S. DoE. Pilot-scale reactors are expensive, but would really help to advance our working knowledge of these new technologies. Overall, nuclear fission has gotten a bad rap.
As for the cost of storage, that was worked out 20-30 years ago via the Yucca Mountain facility. If one of the local managers hadn't shifted to an 'organic-based storage medium' that was not pre-approved for that application, the fire wouldn't have happened and everything would be fine.
Thorconpower is a case in point. They tried very hard to get their experimental reactor off the ground in the US but their pilot project seems to have moved to Indonesia instead now. That's a shame - Thorium and molten salt reactor experience and technological mastery would be extremely useful for solar system space colonization efforts past the orbit of Earth (insolation levels fall dramatically and a lot of basic industrial chemical engineering off planet will encounter abundant raw materials but will also need easy access to enormous quantities of energy).
Fission is so worth pursuing, and its reputation so bad, that it will probably need an Elon Musk type persona to tackle it.
But that is kinda how everything goes with nuclear. You can't plan for the "if everything goes right" scenario. Shit happens. Humans are very prone to errors. But nuclear proponents always want to pretend that isn't the case and explain away very real things that have happened as being anomalous- when in reality, errors and mistakes are the way the universe works.
Much of the (valid) criticism in this article relates to the deuterium-tritium fuel cycle. This is the easiest reaction to accomplish on Earth, so most experimental reactors are designed with this fuel in mind, and we're certainly having a hard enough time making even this work.
However, I've always considered D-T fusion an intermediate step on the path to aneutronic fusion, such as Helium-3 or proton-Boron reactions. These avoid most of the radiation issues, as well as the tritium-breeding problem (although Helium-3 sourcing presents its own challenge). Since the fusion products are electrically charged, the reactor could possibly also generate electricity directly, without a steam turbine and the associated energy loss. Unfortunately, it requires temperatures that are an order of magnitude higher than D-T (well beyond a billion kelvin), so we'll need to learn to walk before we can run.
The nice thing about fusion neutrons is you get to control the isotopes, you have no control over fission waste isotopes.
Some fission isotopes are really icky to deal with, as everyone has heard...
On the other hand if you don't like dealing with cobalt-60 waste at your fusion plant, simply stop using cobalt alloys in your reactor vessel.
It turns out to be "not that big of a deal" to design a fusion plant where neutron activation isn't important. The quotes are because nothing is easy in fusion but as a problem its pretty low on the list.
This is true. Back at Fiat Lux when we designed our D-D reactor, we intended it to sit inside a pool of water and borax. Since we didn't need to regenerate tritium, just absorbing the neutrons with boron was the cheapest solution. As far as I know, borax is the cheapest effective neutron shielding known. We would have liked to build our vacuum chamber purely out of Al (since Al-28 has a two-minute half-life), but we went with steel for cost reasons.
Unfortunately, we never made enough neutrons to activate anything worthwhile. Nevertheless, it is certainly possible to work around neutrons through design decisions.
Sure, activation in itself isn't necessarily a big problem, but I believe that embrittlement of the blanket and other plasma-facing surfaces due to the high neutron fluxes is one of the major engineering challenges for ITER and similar tokamak designs.
Billion Kelvin operating temperatures make me super skeptical about the practicality of that solution in my lifetime. It's just so many orders of magnitude beyond where our materials science is.
It's very high, but not as bad as it sounds. There is less than 1 gram of fuel in the reactor at any given time, so it's extremely diffuse, and mostly contained by magnetic fields. Much of the energy can also be radiated away by facilitating ion-electron recombination before the plasma reaches any materials.
The current design for ITER, operating at 150 million Kelvin, focuses the exhaust plasma onto a dedicated divertor surface with an estimated peak heat flux of 20 MW/m2. Experiments show that tungsten handles this fairly well, although with some cracks appearing after many cycles, but this is an active area of research.
I think the four objections here can really be summarized as two:
1) There are energy/fuel losses involved in operating a fusion reactor which aren't present in other types of power source.
This doesn't seem fatal to me; either we get the losses down low enough that this system is cost-competitive, or we don't. There's no way to know in advance where this tech will end up, and it doesn't seem like a reason to stop investing in R&D now.
2) Fusion as currently designed produces lots of neutrons, so there is the same sort of waste and proliferation concern as a normal (or fast breeder) fission reactor.
I think this is actually the interesting one; the international community will not permit this tech to spread beyond the current nuclear powers if there's a strong proliferation risk. I hadn't realized that the current designs of fusion plants were basically breeder reactors, and that's very significant; look at how the theoretically-appealing fast breeder fission reactors have been hamstrung for a preview of this fight.
Of course, if fusion gets to an integer factor cheaper than the next-best source, then it will be hard to keep a lid on this tech, but development is currently relying on many billions of dollars of research funding from the very countries that could be turning away from it out of political/security concerns.
All his major claims seem correct to me, but my experience in fusion-related research was on the theoretical side. Is there a written summary of what is supposed to be incorrect?
E.g. the nuclear waste problem: with the right shielding you'll get radioactive waste that is much easier to handle, usually decaying enough within a 60-year time span, as opposed to the millennia for fission products. So yeah, it is a problem, but a manageable one.
Or the parasitic power consumption problem. Yes, fusion reactors require a lot of energy to power the magnetic field and the cooling for the magnets, but the whole point of fusion research is to get an order of magnitude out of the energy put in, i.e. a gain factor Q > 1. Right now we don't have a fusion reactor that manages Q > 1, but ITER has the potential to operate around Q = 5, and before long some smart kid will figure out a design that gives you Q > 10.
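A toy power-balance model shows why raising the plasma gain matters so much for the parasitic load. The heating power and both efficiencies below are illustrative assumptions, not ITER figures:

```python
def net_electric(p_heating_mw, q, thermal_eff=0.4, heating_eff=0.7):
    """Net electric output (MW) for a plant with plasma gain q.

    Assumes fusion + injected heating power are recovered through a
    thermal cycle, and the plasma heating systems draw wall-plug power.
    """
    p_fusion = q * p_heating_mw                     # plasma gain
    p_gross = (p_fusion + p_heating_mw) * thermal_eff
    p_recirc = p_heating_mw / heating_eff           # parasitic heating draw
    return p_gross - p_recirc

for q in (1, 5, 10):
    print(f"Q={q}: net {net_electric(50, q):.1f} MW")
```

Under these assumptions, scientific breakeven (Q = 1) still loses power once you account for conversion and recirculation; the plant only becomes a meaningful net producer somewhere past Q = 5, which is why engineers care about Q = 10 and beyond.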
The tritium breeding problem. There are indeed some smart solutions like a FLiBe salt blanket fill. Which also is great at absorbing the neutron "waste".
The nuclear proliferation problem. This is more of a theoretical problem, as there are way easier pathways to a bomb. E.g. the thorium fuel cycle to breed a U-233 based bomb is doable even with tech from the 1960s.
LPPFusion is attempting to harness hydrogen-boron fusion, which doesn't produce neutrons, only gamma radiation and helium nuclei (alpha particles). Both the gamma radiation and the alpha particles can be directly converted into electricity. There are many potential benefits of this approach, but a primary one is that with no neutrons, there is no nuclear waste (or potentially plutonium) produced.
I hope LPPFusion can secure substantially more funding, it's doing more worthwhile research than the majority currently being done.
It's amazing how the human race is 'leaping ahead' nowadays. Historically we've taken evolutionary steps, e.g. from the steam engine to the combustion engine: a well-established paradigm was mastered before the next. Now, while we have people exploring 'easier' forms of fusion, we have people exploring the harder forms too.
Although we don't live in the most exciting times (space-faring and everything after will be the most exciting), we are at the crux of all future human endeavors. Exciting times.
True, but there are many engineering tradeoffs. Any of the listed reactions involving deuterium produce some neutrons, so those aren't desirable (plus deuterium and tritium are rare).
Of the other reactions, neither Li-6 or N-15 are readily available, plus lithium is highly chemically reactive. The proton-boron reaction requires the lowest input energy (temperature) of the desirable aneutronic reactions, so it's the hot ticket...so to speak.
Tritium is rare, and has to be bred from lithium, but deuterium isn't the least bit rare. There's enough deuterium in your morning shower to provide all your energy needs for a year, and enough in the oceans to last until the sun goes out.
(Plenty of boron too, though it's not so absurdly abundant as deuterium.)
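The shower claim checks out roughly, using the natural deuterium abundance and the yield of the primary D-D reaction alone. The shower volume and per-capita energy figures are loose assumptions for illustration:

```python
# Rough check of "a shower's worth of deuterium = a year of energy".
NA = 6.022e23            # Avogadro's number
MEV = 1.602e-13          # joules per MeV

shower_kg = 75.0                          # ~75 L of water (assumption)
h_mass_frac = 2 * 1.008 / 18.015          # hydrogen mass fraction of water
d_atom_frac = 1.56e-4                     # natural D/H abundance
d_kg = shower_kg * h_mass_frac * d_atom_frac * 2  # D is ~2x the mass of H-1

d_atoms = d_kg * 1000 / 2.014 * NA        # moles of D -> atoms
energy_j = (d_atoms / 2) * 3.65 * MEV     # ~3.65 MeV per D-D reaction (avg of branches)
print(f"{d_kg * 1000:.1f} g of D -> {energy_j / 1e9:.0f} GJ")
```

That's a couple hundred gigajoules from under three grams of deuterium, versus roughly 75-300 GJ of per-capita annual primary energy use depending on the country, so "a year of energy per shower" is the right order of magnitude (and burning the T and He-3 byproducts would roughly double it).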
I agree about focusing on the proton-boron reaction. I meant to refer to other fusion technologies that can sustain the reaction. As well as dense plasma focus (the Lawrenceville group that you linked), there is Polywell, Tri Alpha, the Z-machine, and some interesting laser-accelerated proton beam experiments.
(Thinking about the far future, here.)
Inertial fusion using deuterium and Helium-3 would solve a lot of these problems. Of course, He3 is rare here on Earth.
I don't think mining the Moon for He3 makes much sense. It's just too rare in the lunar soil.
Instead, my favorite concept is mining it from Uranus (whose gravity at Earth-like pressures is actually slightly less than 1g...). There's vast amounts available at useful concentrations (in addition to deuterium). You'd need a reusable two-stage nuclear thermal rocket to get it back to orbit, but luckily there's lots of hydrogen for propellant in the atmosphere (unlike Earth where that hydrogen must be chemically split from water or methane) and we've built and tested nuclear thermal rockets before (with more recent NTR designs achieving sufficient performance for such a vehicle to close). I'm not a fan of NTR for Earth launch (a lot of cost, plus it actually requires a lot more energy since all the propellant is hydrogen, instead of a mix of methane and mostly oxygen), but it would be enabling for Uranus launch.
Another option is to simply breed He3 with pure deuterium fusion; the output of the D-D reaction is He3 half the time, and otherwise tritium, which decays into He3 with a 12-year half-life. D-D produces neutrons but at normal fission energies, not the really high energy of D-T neutrons.
Fusion startup Helion, which is funded by YCombinator, is attempting a hybrid D-D/D-He3 reactor, saying the combined reaction would produce only 6% of its energy in the form of neutron radiation.
6% is about the best we can do with a closed cycle. Extra-terrestrial sources could push those numbers even lower by running up to a pure D-He3 reaction.
Yep. I think 6% is pretty good...it's enough so you don't need a heat cycle to extract electricity, and dealing with the neutron damage is easier than going to Uranus. But working fusion reactors probably also means working fusion rockets, and if we find ourselves going to Uranus anyway, all the better.
If we get really good at fusion we could also use boron fusion. It uses the most common isotope of boron, which is plentiful on Earth, plus regular hydrogen, and the main reaction is aneutronic. There'd be some minor side reactions but it'd be under 1% of energy as neutron radiation, probably better than you'd get with deuterium in the mix.
Absolutely. Net output per $ of opex (+ amortization of capex, decommission costs, etc.) is the only relevant metric. Parasitic power loss is categorically not an issue.
This is a mistake that engineers frequently make when they are too narrowly focused. You see it all the time in rocket engineering. The rocket equation dictates that performance drops off rapidly as the mass of the vehicle becomes heavier, or the efficiency of the engines worsens. So rocket engineers are obsessed with saving weight and increasing engine performance.
However: performance, in absolute terms, doesn't actually matter. What matters is that you get your stuff in orbit, whether that's done efficiently or not.
Up to a point, you of course do need to worry about vehicle weight and engine performance -- with too much of the former or too little of the latter, you won't be able to launch any payload. This is analogous to the "break-even point" for fusion power.
Beyond that point, however, there's a tradeoff to make: if you need to launch a larger payload, you can either improve the weight or the engines -- or you can just throw more propellant at the problem.
Many engineers scoff at the latter approach, because it is utterly inelegant (and doesn't require as many engineers to accomplish). But rocket propellant is cheap. Really cheap. A cost-driven analysis that compares improving efficiency vs. throwing more propellant at the problem will often favour the latter.
The reason SpaceX succeeded in reducing launch costs -- where NASA's engineers failed to do so for 50 years -- is because they were willing to do this analysis and go to market with a lower-performance rocket. They preferred to spend an extra $100k on kerosene than an extra $10M milling engine parts out of unobtainium. Having done so, they then iteratively figured out how to make a cheap rocket high-performance -- their rockets are now very high-performance -- which turns out to be much easier than figuring out how to make a high-performance rocket cheap.
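The "throw more propellant at it" arithmetic comes straight out of the rocket equation. A minimal Python sketch, where the dry mass, delta-v, and Isp figures are round illustrative numbers and not any real vehicle:

```python
import math

def propellant_mass(dry_mass_kg, delta_v_ms, isp_s, g0=9.81):
    """Tsiolkovsky rocket equation, solved for the propellant needed
    to give dry_mass_kg a total velocity change of delta_v_ms."""
    mass_ratio = math.exp(delta_v_ms / (isp_s * g0))
    return dry_mass_kg * (mass_ratio - 1)

# Round illustrative numbers: 25 t dry mass, ~9.4 km/s to reach orbit.
# Compare a modest kerolox engine with a noticeably better one.
for isp_s in (300, 340):
    tonnes = propellant_mass(25_000, 9_400, isp_s) / 1000
    print(f"Isp {isp_s} s: ~{tonnes:.0f} t of propellant")
```

The lower-performance engine needs on the order of 200 t more propellant, which at bulk kerolox prices is pocket change next to the engineering cost of squeezing out those extra seconds of Isp.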
Anyhow, this article gives me a strong whiff of that kind of engineer's bias, where the perfect is the enemy of the good.
I'm not entirely sure that what you listed is the reason SpaceX has been making such progress; NASA seems perfectly willing and capable of making cutting edge engines (or rather, their contractors do). Whether or not their organization has the risk-taking and agility to change approaches like SpaceX does, that's another matter.
However, I don't think that in rocketry "throwing propellant at the problem" is ever a complete solution, as propellant has weight. Efficiency is indeed incredibly important, because adding more fuel means lifting more weight, which means you need a bigger engine, which needs more fuel to run, and so on in a vicious cycle. One of the solutions, of course, is to burn more of it at once (i.e. multiple engines), but there are limits to what's feasible (speed and atmospheric considerations). In other words, it's not a linear matter of throwing more propellant at it. Costs can go up quickly.
Coming around to the issue at hand, I can absolutely see a high minimum threshold for a fusion power plant: the plant needs to be large enough to generate a sizable return (excess power) on its initial outlay, but there also needs to be enough demand (high consumption) for the power generated. You can build a massive plant in the Midwest, but if the demands on it are quite low, then the economics don't work out.
That means that the economically feasible scenarios for a fusion plant at this time are probably quite limited. I'm sure that will improve greatly over time. But the author's point seems to be that right now, even assuming the technological issues are solved, it probably isn't as cost-effective as some make it out to be.
Large parasitic losses are indeed an engineering challenge.
In theory, you are correct. However, you are making the assumption that efficiency is a static number. In reality, parts age, get dusty, go out of alignment, etc. Let's say you are building something that makes 10 units of power, but 8 of those units are needed to keep it running (when working as designed). Say that the part that uses the 8 units of power is nominally 90% efficient, but after you've run it for a few months it drops to 80% efficiency. Now you need 9 units of power to keep running, cutting your net output in half. This is not just a problem in terms of dropping output; it is also an issue from the standpoint of understanding your machine. It's hard to engineer something when small changes have large effects. Your output just dropped 50% - is something broken, or is the fan just dusty again? How long do those electrodes really last anyways? Did someone use the wrong cleaning fluid? Maybe the pressure gauge is just off by 1% again.
All those things add up, just like coding a big ball of mud can make debugging difficult.
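The 10/8/9 example above can be written down directly. A tiny sketch, where the 7.2-unit underlying demand is just the number that makes an 8-unit draw at 90% efficiency:

```python
def net_output(gross=10.0, useful_demand=7.2, subsystem_eff=0.9):
    # The subsystem must deliver useful_demand units of work; at
    # efficiency subsystem_eff it draws useful_demand / subsystem_eff
    # from the plant's gross output.
    return gross - useful_demand / subsystem_eff

print(f"net at 90% efficient subsystem: {net_output(subsystem_eff=0.90):.1f}")
print(f"net at 80% efficient subsystem: {net_output(subsystem_eff=0.80):.1f}")
```

An efficiency slip of about ten percentage points halves the net output, which is exactly why large parasitic loads make a machine hard to reason about.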
It's a challenge, yes, but there's nothing special about it. Getting that machine from 3 units of output to 10 took a lot of work, and nothing stops them from getting to 25 units of output with a similar amount of work. At which point it's safe from marginal losses in output.
Disclaimer: I know nearly nothing about this stuff.
Almost none of the problems listed there seem to be deal breakers. Most of them seem to be a matter of small improvements that probably aren't the primary focus of research right now.
For example, I'd imagine researchers have a fairly steady supply of tritium; they don't need to focus hard on recovering all of it, just enough to save on costs, since it's pretty expensive. Why bother trying to recover 100% of the tritium for a reaction that isn't even energy-positive yet? Trying to make it more efficient when you can't even make it work seems to be putting the cart before the horse.
And that seems to be the case with the fuel, too (at least how described in the article). Once we get the process working with the easy fuels it seems that the next target for research would be using fuels that might be better but harder to work with.
>Corrosion in the heat exchange system, or a breach in the reactor vacuum ducts could result in the release of radioactive tritium into the atmosphere or local water resources. Tritium exchanges with hydrogen to produce tritiated water, which is biologically hazardous.
> Most fission reactors contain trivial amounts of tritium (less than 1 gram) compared with the kilograms in putative fusion reactors. But the release of even tiny amounts of radioactive tritium from fission reactors into groundwater causes public consternation.
He seems to negate his own point by saying that the tritium output is less than a thousandth of what it is in fission. It might not be perfect but that sure sounds like an improvement to me. Even better once they start recovering larger amounts of it (he said himself that they need to recover at least 99% of it).
Also, from my (limited) knowledge of tritium it isn't that dangerous. It is in watches, weapon sights, emergency signs, and many other glow in the dark things. I've researched it before due to having several tritium containing products and it seems that in small amounts it simply vents into the atmosphere and becomes a non issue.
Anyway please correct me if I'm wrong on anything.
There are many good points here, but I'm dismayed at the complete lack of discussion (heck, no mention even!) of stellarators. It seems to me that computing power has made them quite viable (as shown by the German Wendelstein Stellarator), while the Tokamak design is basically an extremely old Russian design that has never performed to expectations.
So if I were stuck with his assumption that fusion = tokamak, then yes, he'd be completely right. But if stellarators achieve their promise, then he's missing the answer completely (and given the Germans' success with the Wendelstein, I have strong confidence that tokamaks will be shuttered within a decade, abandoned for the better technology of the stellarator).
This is one article that can be greatly improved by adding "fucking" before every instance of "sun". I would also appreciate a number of citations, since it's hard for me to take things at face value after this modern fake news panic thing
>since it's hard for me to take things at face value after this modern fake news panic thing
A part of me is glad to see more people becoming increasingly skeptical. A part of me is sad to see that people think that the problem is only a modern one.
I predict that SpaceX and SolarCity will eventually roll out vast solar cell arrays in space, because the sun puts out an unthinkable amount of energy that misses earth entirely. ie The sun is already a spectacular fusion reactor. Use it.
This is an interesting article from Dr. Jassby who has both patents and papers on Fusion going back to 1977 at least.
And there are at least two things that stand out for me: the first is that he goes out of his way to craft a narrative that is negative, and the second is that he doesn't mention the half dozen or so fusion efforts that are ongoing besides the NIF and ITER projects.
The interesting thing about the narrative is that it takes 'positive' things and puts them in a negative context. For example, it takes power to run a fusion reactor, and while the reactor can generate that power, it lowers the net output. At the same time, if there is a problem and the reactor turns off, there is no 'cool down' problem like there is in a fission reactor: no steaming pile of intermediate fission products decaying and generating way too much heat long after you turned the off switch to off. You can portray that as "gee, it has to waste a huge chunk of energy just keeping itself running," but that seems trivial given that things like cars have the same issue. Part of the engine power goes into running the fuel pump, or the alternator.
This sort of thing recurs in the article and is perhaps most egregious in characterizing the neutron flux as a proliferation hazard. While it is absolutely true you could design a fusion reactor whose purpose was to create fissile material for bombs, that design would look nothing like a power reactor, nor could you easily (maybe even possibly) 'convert' a design that had been built and deployed to generate power into one that could create fissile material. So what is the goal of combining the 'facts' that high fluxes of low-energy neutrons can weaponize U238, and that fusion power generation can generate high neutron fluxes, without also explaining that a fusion reactor designed to make power couldn't possibly be co-opted to make bombs? Why leave the inconvenient fact off unless you're attempting to mislead the reader? And what is the point of misleading them?
And that really makes me wonder, what exactly is Dr. Jassby trying to say here?
I could completely understand an article that says "Hey, while fusion has some desirable properties, it is going to be expensive, and from where I sit it is going to be more expensive than the power it generates is worth." That is a perfectly reasonable discussion to have and one that should be had with people looking at building fusion devices.
It also makes me wonder why he doesn't mention some of the other groups who are designing and building much simpler and perhaps more efficient systems (in terms of net energy production). Does he not keep up with the field or does he omit them because they don't add to the generally negative tone? All in all this article left me with more questions than insights.
> While it is absolutely true you could design a fusion reactor whose purpose was to create fissile material for bombs, that design would look nothing like a power reactor
This directly contradicts the article:
"The open or clandestine production of plutonium 239 is possible in a fusion reactor simply by placing natural or depleted uranium oxide at any location where neutrons of any energy are flying about. The ocean of slowing-down neutrons that results from scattering of the streaming fusion neutrons on the reaction vessel permeates every nook and cranny of the reactor interior, including appendages to the reaction vessel."
I challenge you to find anywhere in the ITER facility or NIF facility where you can insert, leave, or remove any amount of natural or depleted uranium oxide at all, much less where it could be exposed to neutron flux. There is a wealth of information on the iter site you can read to understand just how difficult it would be.
"In experiments to date the energy input required to produce the temperatures and pressures that enable significant fusion reactions in hydrogen isotopes has far exceeded the fusion energy generated."
Wrong. NIF and others have achieved Q>1, albeit in single events.
No, they haven't, not even in single events, although their press releases claim that they have. Their actual papers make it clear that the claim of >1 energy gain is a result of somewhat arbitrary definitions. There has not been a single shot on the NIF where more energy was supplied by the fusion reaction than was used to fire the lasers - it's not even close.
One seems to be talking about energy released by while the other is talking about energy captured from.
I think GP was criticizing the news for making it sound like the first hasn't happened when only the second hasn't happened. (At least, in my understanding.)
The BBC article linked above describes a very specific scenario which matches neither of your described options.
"The amount of energy released through the fusion reaction exceeded the amount of energy being absorbed by the fuel"
This doesn't describe the relationship between the amount of energy injected by the lasers and the energy emitted by the fusion reaction. Presumably a lot of the energy injected by the lasers is wasted, some of it is absorbed by the fuel and it is this absorbed amount that was exceeded by the energy released.
There is even a clearer statement in the article:
"This is a step short of the lab's stated goal of "ignition", where nuclear fusion generates as much energy as the lasers supply."
It seems to me that the following ordering of energy levels exists with present technology:
Energy supplied to the lasers > energy injected into the system by the lasers > energy released by the fuel > energy absorbed by the fuel > energy captured to generate electricity from
'This is a step short of the lab's stated goal of "ignition", where nuclear fusion generates as much energy as the lasers supply.'
Which is just what I was pointing out. And it's a large step from honest break-even, where you count the energy input into the laser system (it's not 100% efficient). And a huge step from practical break-even, where you count the energy required to run the plant.
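The several break-even definitions being argued about here can be made concrete. Hypothetical round numbers chosen only to be in the right ballpark for a laser facility, not actual NIF figures:

```python
# Hypothetical round numbers for illustration -- not actual NIF data.
wall_plug_energy = 400e6    # J drawn from the grid to charge the lasers
laser_energy     = 1.8e6    # J delivered on target by the beams
absorbed_by_fuel = 0.012e6  # J actually coupled into the fuel
fusion_yield     = 0.017e6  # J released by fusion reactions

fuel_gain       = fusion_yield / absorbed_by_fuel  # the "exceeded" claim
scientific_gain = fusion_yield / laser_energy      # "ignition" needs >= 1
wall_plug_gain  = fusion_yield / wall_plug_energy  # practical break-even

print(f"fuel gain:       {fuel_gain:.2f}")
print(f"scientific gain: {scientific_gain:.4f}")
print(f"wall-plug gain:  {wall_plug_gain:.2e}")
```

Same shot, three wildly different Q values, depending entirely on where you draw the accounting boundary.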
I was attempting to build on your contributions, but I decided to do it at the same level as your comment because it made more sense to address SomeStupidPoint directly.
Attempt at constructive criticism: I think your answers were a little too concise for some members of this audience and I decided to help out by expanding on your points a little.
I've factored your comment about input energy into the lasers in my reply above.
I think "sustainably" is implied, and the author's next sentence makes that a bit clearer.
> But through the use of promising fusion technologies such as magnetic confinement and laser-based inertial confinement, humanity is moving much closer to getting around that problem and achieving that breakthrough moment when the amount of energy coming out of a fusion reactor will sustainably exceed the amount going in, producing net energy.
I've been hearing about and hoping for fusion power since probably the 1980s. The more I think about it the more I think it's probably a pipe dream.
The idea appears so simple and yet so elusive: combine hydrogen into helium (and there are variations that use helium, particularly He-3, as a fuel), which releases energy. Hydrogen is essentially limitless (He, particularly He-3, less so). Stars do it all the time.
The two big problems are of course:
1. Containment. How do you contain something that's heated up to millions of degrees? Magnetic containment has been the obvious candidate, since hydrogen heated sufficiently loses its electrons and becomes an H+ ion, but this so far seems far easier said than done; and
2. Neutrons. Fusion using deuterium, tritium, and He-3 releases varying amounts of neutrons. These can't be magnetically contained (obviously) and they're destructive. We can achieve fusion, but neutrons end up pretty quickly destroying the containment vessel.
Stars solve the first problem quite easily: with gravity. This isn't something that's useful to us. A star exists in a balance of two opposing forces: gravity from its sheer mass trying to crush it vs. the outward pressure of the fusion processes it's undergoing.
The differences in potential size are massive. A star like our own Sun will eventually become a red giant extending out to about the orbit of the Earth. That's 150 _million_ km. Likewise, a star larger than ours can end up compressed to something 10 miles or less across. This just illustrates the differentials in the forces involved.
At this point I'm honestly not convinced that this is an economically solvable problem. It's not sufficient to output more energy than you input either.
Let's say a fusion reactor costs $1B to produce, has maintenance costs of $20m/year and has a life of 50 years. That's (at least) $2B you need to recover in capital costs from your "free" energy.
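Those capital numbers translate directly into a floor on the electricity price. A back-of-envelope sketch, where the 500 MW net output and 80% capacity factor are my own added assumptions, and discounting/financing costs are ignored (which flatters the result):

```python
def floor_price_per_kwh(capex, annual_opex, life_years,
                        power_mw, capacity_factor):
    # Undiscounted lifetime cost divided by undiscounted lifetime energy.
    total_cost = capex + annual_opex * life_years
    lifetime_kwh = power_mw * 1_000 * capacity_factor * 8_760 * life_years
    return total_cost / lifetime_kwh

# $1B capex, $20M/yr opex, 50-year life, per the comment above.
price = floor_price_per_kwh(1e9, 20e6, 50, power_mw=500, capacity_factor=0.8)
print(f"~${price:.3f}/kWh just to recover costs")
```

Roughly a cent per kWh under these generous assumptions; the catch, of course, is whether the capex can actually be held anywhere near $1B.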
I honestly think burning a fuel of some sort will be here for a long, long time. I don't want to discount the rising importance of wind and solar. I do believe that with the ever decreasing costs these will become even more important, even critical, in the future. They simply won't completely replace some kind of combusted fuel.
Now what that fuel economy looks like is an unknown. There is of course a limit to what we can dig up out of the ground, and to continue using it or something akin to it we need to solve the carbon problem. So what we really need is:
1. Some method of sequestering atmospheric carbon at scale; and
2. Some method of storing excess renewable energy.
There are multiple solutions to (2), the most common of which today seems to be using batteries. Current battery technology has come a long way but it has a lot of disadvantages, not the least of which is its current reliance on something that itself is limited (lithium). Plus obtaining all the materials for batteries seems to be highly environmentally destructive.
It seems to me that, without some major innovation, batteries will only ever be a niche technology for the likes of a few Teslas and airplanes. Now I'm not saying that innovation won't happen. I'm simply saying current battery tech isn't there yet.
Another potential solution for (2) is to use energy to construct a fuel that can be transported and combusted eg [1].
Synthetic hydrocarbons have a lot of advantages: convenience, relatively simple tech, and high energy density (important for weight). Large-scale carbon sequestration remains a huge sticking point, however. Plus actual fossil fuels remain cheaper than synthetics, but this will eventually change, either because fossil fuels become much rarer or because renewable energy costs get sufficiently low.
One of the reasons it's still failed to eventuate is that the engineering and scientific challenges are significant and funding for fusion endeavours is already quite limited and has been declining. The benefits from fusion are insanely good (I have read that in an energy-producing reactor, you would only need ~250kg of deuterium to power the US for a year), so it would be foolish to relegate it to 'pipe dream' status just because of the current difficulties. It just needs continued effort and investment.
The 2 main issues are addressed in another couple of comments [0], [1]
Burning fuel is easy, of course it's going to stick around, but for maximum efficiency and power generation capabilities, you can't really go past fusion.
> At this point I'm honestly not convinced that this is an economically solvable problem.
The energy industry as it stands is belligerent and seemingly resistant to new technologies that aren't fossil-fuel based. How long would this remain the case if actual concerted (government, scientific and industrial) effort were dedicated to fusion, rather than pissing it up the wall trying to hold on to fossil-fuel-based approaches for mass-scale power generation?
> It's not sufficient to output more energy than you input either.
The science says otherwise; if that were the case, we wouldn't even be bothering to get power out of it.
>I don't want to discount the rising importance of wind and solar. I do believe that with the ever decreasing costs these will become even more important, even critical, in the future. They simply won't completely replace some kind of combusted fuel.
I completely disagree. There's no shortage of solar power hitting the earth at any given moment. The problem with these renewable sources is that they're non-constant. The solution is simple: storage. Obviously, that's easier said than done, but it is something that's being worked on. Once really good energy storage is in place, then the variable nature of solar/wind isn't too much of a problem any more. It's not here yet (we do have some battery storage, and for some time now we've had stuff like pumped hydroelectric), but it's coming.
"Tritium has a half life of 12.3 years which means it will be dangerous for at least 120 years, since the hazardous life for a radionuclide is ten to twenty times longer than its half-life"¹
If you come in contact with it during that time you will get cancer.
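The "ten to twenty half-lives" rule of thumb in that quote is just exponential decay; each 10 half-lives cuts the remaining activity by roughly a factor of 1,000:

```python
HALF_LIFE_Y = 12.3  # tritium half-life in years

def fraction_remaining(years):
    # Radioactive decay: activity halves every HALF_LIFE_Y years.
    return 0.5 ** (years / HALF_LIFE_Y)

for years in (12.3, 123.0, 246.0):
    print(f"after {years:5.1f} y: {fraction_remaining(years):.1e} of the "
          f"original tritium remains")
```

After ~123 years roughly a thousandth of the original activity remains, which is where the "120 years" figure comes from.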
If tritium is released in its atomic form, it is only a hyper-local threat, both geographically and temporally. I wasn't able to confirm this with a quick Google, but I imagine tritium will just want to go up, up, up as fast as it can, where it will never bother anybody again. (It's heavier than hydrogen but still lighter per-atom than helium, which has the same behavior). It doesn't have the problem fission fuels have where they like to seep into ground water, and be both heavy metals and radioactive.
Plus, there won't necessarily be a lot of it in a plant. Even if you're accustomed to the surprisingly large energy density of fission fuels, you're still not used to the even larger energy density of fusion fuels. We're not talking moving tritium around by the ton... we're talking by the kilogram, or tens of kilograms. Compared to the difficulty of building the fusion plant in the first place, handling a few pounds of tritium safely isn't challenging at all.
Tritium binds with oxygen and forms tritiated water. It then contaminates the food chain and can cause cancer. This is a current concern for populations living around nuclear power plants.
When a reporter calls their credibility into question in the second sentence, there's a problem.
"they would produce vast amounts of energy with little radioactive waste, forming little or no plutonium byproducts that could be used for nuclear weapons."
Fusion products are helium, neutrons, and neutrinos. Stars eventually fuse products up to iron.
The 'reporter' "was a principal research physicist at the Princeton Plasma Physics Lab until 1999. For 25 years he worked in areas of plasma physics and neutron production related to fusion energy research and development. He holds a PhD in astrophysical sciences from Princeton University."
Do you think it's perhaps possible that the author knows what he's talking about? If you'd continued reading after that second sentence you might have come across:
> In fact, these neutron streams lead directly to four regrettable problems with nuclear energy: radiation damage to structures; radioactive waste; the need for biological shielding; and the potential for the production of weapons-grade plutonium 239—thus adding to the threat of nuclear weapons proliferation, not lessening it, as fusion proponents would have it.
I don't understand why the OP considers sneaky Pu239 production a proliferation problem, but doesn't consider unaccounted-for tritium in the fuel cycle and the general availability of Li6 to breed more tritium a proliferation problem. Anyone?
Tritium is useful only if you can manufacture a fission weapon in the first place. The vast majority of anti-proliferation efforts are aimed at preventing fission weapons capability. Preventing a nuclear weapons state from graduating from fission weapons to boosted-fission or full two stage thermonuclear is a much lower priority.
You can build a bomb with plutonium but no tritium, so if you want to prevent countries from going nuclear that's what you need to stop.
Also, the plutonium production is the bottleneck. E.g. this article about the Pakistan nuclear weapons program[0] estimated that the 50 MW nuclear reactor in Kushab could be used to produce 10kg of plutonium and 100g of tritium per year. The plutonium would be enough for about 2 bombs, while the tritium would be enough for 20.