Hacker News
A skeptic's take on beaming power to Earth from space (ieee.org)
88 points by Brajeshwar on May 10, 2024 | 161 comments


The detail that has stuck in my mind since the last time I read about this is:

Thermodynamically speaking, if you transmit electricity from outside the Earth onto the Earth – even if you do it perfectly efficiently – you are, by definition, heating the planet.

Based on that I concluded that it is superior to generate electricity with inputs that are already hitting the Earth… But I’d be very interested to learn more about this.


The same thing happens when we create heat on Earth with nuclear energy: more than 50% of the thermal energy goes directly into heating the Earth as waste heat, and the electrical part eventually becomes >99% waste heat as well (some tiny fraction of the energy is probably permanently converted into chemical bonds, etc.)

However, this is completely dwarfed by the dynamics of flux in and out of the earth. I forget the exact order of magnitude but I think it's around 100TW, and of course about that amount needs to be rejected to space. The key dynamics that provide us balance are of course the exact and precise quantities of greenhouse gases in the atmosphere, albedo, etc.

And of course, coal and natural gas are releasing stored heat energy too, but their contributions to changes in the atmosphere far exceed the contribution from direct heat energy on the surface of the earth.


The flux is roughly (but more than) 100 petawatts.

Human energy utilization is on the order of 100 terawatts.


Human primary energy use is more like 20 TW, not 100 TW.

I understand the increase in thermal forcing from added greenhouse gases is currently > 400 TW.
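A quick back-of-envelope check of the magnitudes in this subthread (rough public estimates, not authoritative figures):

```python
# Rough orders of magnitude for Earth's energy budget (approximate values).
SOLAR_FLUX_W = 173e15    # ~173 PW of sunlight intercepted at the top of the atmosphere
HUMAN_PRIMARY_W = 20e12  # ~20 TW of human primary energy use
GHG_FORCING_W = 400e12   # ~400 TW of extra thermal forcing from added greenhouse gases

# Human energy use is roughly four orders of magnitude below incoming sunlight.
ratio_human = HUMAN_PRIMARY_W / SOLAR_FLUX_W
print(f"Human use / solar flux: {ratio_human:.1e}")

# Direct waste heat is dwarfed ~20x by the greenhouse forcing itself.
print(f"GHG forcing / human use: {GHG_FORCING_W / HUMAN_PRIMARY_W:.0f}x")
```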


Really? So we're only 3 orders of magnitude off? Seems like we can definitely get to appreciable levels of i/o in the near future.


Total solar energy hitting earth = ~45 Petawatts

Total power used by humans = 16 Terawatts.

https://www.nasa.gov/wp-content/uploads/2015/03/135642main_b...

What we get from the Sun in one hour is how much we use for the whole year.

All our energy needs can be covered if we can tap into just 0.00001% of energy received. About 100km2 worth of solar panels.

If every residential and commercial building were covered with a solar roof, we’d have all our energy needs covered during the daytime.

It’s free energy if we know how to use it. Plants do, and they make up >99.9% of biomass on this planet.

The problem with CO2 is that it traps the heat. All useful work is done when high-energy packets from the Sun in the ultraviolet and visible range get dissipated into heat (infrared).

If we don’t solve the greenhouse gas problem, we can’t use more energy, since we’d be out of equilibrium.


Plus solar panels are actually MORE reflective than forests, so if we cut down forests to place the solar panels we'd be directly helping out against climate change, by raising the albedo of the surface.

(forests are only a little less of a "heat island" than an asphalt surface is, and that makes sense, doesn't it? Forests actively try to maximize what they capture from the sun. That trees capture more is why trees exist in the first place)


If people use 16 TW and solar influx is 45 PW, then humans use about 1/3000, or about 0.03% of solar influx. Not a lot, but a lot more than 0.00001%.
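For completeness, the arithmetic behind that correction, using the 45 PW and 16 TW figures from upthread:

```python
solar_influx_w = 45e15  # ~45 PW reaching Earth (figure quoted upthread)
human_use_w = 16e12     # ~16 TW total human power use

fraction = human_use_w / solar_influx_w
print(f"fraction = {fraction:.2e}, about 1/{1 / fraction:.0f}, i.e. ~{fraction * 100:.3f}%")
```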


This is false: if your SBSP conversion efficiency on the ground is better than that of solar panels on the ground, then you will add less energy to Earth by collecting the power in space and transmitting it to the ground than by replacing the reflective albedo of the desert with a very nonreflective material like solar panels.


Hmm. If your PV has efficiency > the albedo of the ground it covers, it will be better than SBSP, even if the rectenna is 100% efficient (assuming the emissivity of the PV is at least as good as that of the desert in the far IR). This assumes the rectenna has the same optical properties as the ground it covers.

One could improve the PV by making it highly reflective in the near IR at wavelengths below the bandgap of the cells, while still being highly emissive at the longer wavelengths where it will radiate heat.


You could put the solar panels between the Sun and Earth so they only capture photons that would have otherwise hit Earth, if that really matters. This would even be nice, because if we wanted to cool the planet down we could just redirect some rays.

But I think using an appreciable amount of the power provided to Earth by the sun is still sci-fi stuff anyway, so we probably can’t make an appreciable dent either way.

Sagan put us at around Kardashev 0.7; it's a log scale shifted by a constant. Apparently we're under 0.2% of what hits the Earth, I guess.

https://en.m.wikipedia.org/wiki/Kardashev_scale
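Sagan's interpolation formula (from the linked article) makes the ~0.7 figure easy to check; the 16 TW input is the estimate quoted upthread:

```python
import math

def kardashev(power_w: float) -> float:
    """Sagan's continuous Kardashev rating: K = (log10(P) - 6) / 10, P in watts."""
    return (math.log10(power_w) - 6) / 10

print(f"K ~ {kardashev(16e12):.2f}")  # ~0.72 for ~16 TW of human power use
print(f"K ~ {kardashev(1e16):.2f}")   # a Type I civilization sits at 1e16 W
```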


If you set the thing up in the L1 Lagrange point it will have the same day/night problem that traditional solar has. You would need at least two, but probably realistically 3 different antenna farms to collect the power and the satellite would need to be able to track the beam across the surface of the Earth. Most of the proposals I've seen have the array in geosynchronous orbit so it can easily remain pointed at a fixed spot on the Earth.


You could also beam power from L1 to geosynchronous orbit and then to a single ground station. It really depends on how efficiently you can beam power, but hitting Type 1 on the Kardashev scale hits limitations based on radiative cooling into space vs incoming energy.

However, it makes sense to have hundreds of ground stations simply to minimize transmission losses on the ground. And presumably utilizing ground stations 24/7 is vastly less critical than maximizing the return on space based infrastructure.


At these power levels the power beaming array starts to look a lot like a weapon.

One with a beam that can target distinct areas.

Easy way to fry large amounts of your adversary's telecom infrastructure.

It will not put you back to the Stone Age, but 1970-1980 is not out of the question. For wireless communications, that is.


One could put SBSP at the Earth-Sun L2 point, and just use ground based solar during the day. Weirdly, this would mean in winter at high latitudes one would mostly be getting solar power at night!

At the distance of the L2 point laser power beaming would probably be necessary to keep the transmitter and receiver sufficiently small.


This is like the very last possible issue with this plan. It's like worrying about a space elevator unbalancing the earth's rotation.

You need to get a better handle on orders of magnitude.


Exponential growth sneaks up on you. They probably thought the same thing in 1895 about the negligible magnitude of human greenhouse gas emissions.


There's no exponential growth, that's thermodynamically impossible. Heating the Earth with the microwaves would increase the Earth's radiation into space commensurately. It doesn't get stored forever. This would result in a new equilibrium almost immediately, with an unmeasurably greater temperature.

The Earth is always almost exactly in a radiative balance with space, except on pretty short timescales. If it weren't, we'd quickly cook. The radiation the Earth receives from the Sun fluctuates orders of magnitude more over solar cycles, but it's debated whether that even has a meaningful effect on global temperature.
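A minimal sketch of why the resulting temperature change is unmeasurable, linearizing the Stefan-Boltzmann law around an assumed 255 K effective radiating temperature:

```python
import math

SIGMA = 5.67e-8                        # Stefan-Boltzmann constant, W/m^2/K^4
EARTH_AREA = 4 * math.pi * 6.371e6**2  # radiating area, ~5.1e14 m^2
T_EFF = 255.0                          # effective radiating temperature, K

def equilibrium_delta_t(extra_power_w: float) -> float:
    """Warming needed to re-radiate an extra heat input: dT ~ dF / (4*sigma*T^3)."""
    extra_flux = extra_power_w / EARTH_AREA
    return extra_flux / (4 * SIGMA * T_EFF**3)

# Beaming down all ~20 TW of today's human energy use from space:
print(f"dT ~ {equilibrium_delta_t(20e12) * 1000:.0f} mK")
```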


(I'm not referring to exponential growth in temperature over time due to constant input power, but rather to exponential growth in input power over time because it's required for economic growth).


> exponential growth in input power over time because it's required for economic growth

It isn't. The amount of energy (and raw materials generally) required to produce a given amount of economic output is not constant. It gets smaller as technology advances. That offsets the effect of increased economic output. Indeed, as more and more economic output becomes information instead of physical objects, the average amount of energy required per unit of economic output will shrink even more.


Not only is this not guaranteed, it is not possible to be sustained in the long term, either physically or economically. Physically, there are bounds on the energy inputs required for any process, including information processing. Economically, if economic output increases exponentially while energy input is constant, this creates a contradiction, as it becomes exponentially easier over time for one individual to monopolize the entire energy supply.

The notion that economic growth can continue without growth of energy is a short-term illusion created by the transition to an information economy, outsourcing of manufacturing, and perhaps a lack of appreciation for the ongoing growth in energy consumption even within countries like the United States as the economy shifted away from physical goods, let alone the growth in energy consumption in countries like China that ramped up physical manufacturing to make this possible.


> the ongoing growth in energy consumption

Per capita energy consumption has not been growing in developed countries like the US. It has been decreasing for at least a couple of decades. Total energy consumption has been increasing because of population increase, but that is expected to level off around the middle of this century.


Because it shifted to China. China produces stuff for US and Europe. Add that energy to the per capita consumption.


> it becomes exponentially easier over time for one individual to monopolize the entire energy supply

No, it doesn't, because everyone else is also increasing their economic output. Assuming stable population, one individual's share of the energy supply remains constant.


The refrigerator I have today is larger, cooler, and uses less energy than the one I'd have bought 30 years ago.

My phone uses less power than the one I had 5 years ago, as does my PC.

In fact, every industrial manufacturing process today uses less energy than it did in the past.


And yet, the industrial manufacturing sector uses more! As does the computing sector.

This is because energy consumption reduction due to efficiency improvements goes -- to be very generous -- as 1/t, and exponential growth goes as e^t, and e^t/t is still exponential for large t.
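A toy model of that claim (illustrative numbers only): let output grow as e^(r*t) while energy per unit of output shrinks, generously, as 1/t; net consumption still explodes.

```python
import math

def net_energy(t: float, growth_rate: float = 0.03) -> float:
    """Toy model: economic output e^(r*t) times energy intensity 1/t."""
    return math.exp(growth_rate * t) / t

# Efficiency wins for a while, then the exponential takes over.
for t in (10, 100, 1000):
    print(t, net_energy(t))
```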


I know what you're getting at, but for the example of global warming, scientists were publishing concerns as early as 1896:

https://www.tandfonline.com/doi/abs/10.1080/1478644960862084...


That's exactly my point!


That's a pretty bad example of exponential growth.

https://en.wikipedia.org/wiki/Carbon_dioxide_in_Earth%27s_at...


What makes it a bad example?


It's not exponential?


That's a graph of CO2 concentration in the atmosphere, not human emissions. It won't look exponential until the magnitude of human emissions exceeds the magnitude of the natural carbon cycle by several times. Give it a decade or three.


This is orders of magnitude away from being anything like a concern.


Some criticisms of our economic system propose that if growth continues unabated, without any limits, then after a couple hundred years the exponential gets to the point where all the waste heat can no longer be dissipated from the Earth, and it all breaks down.


If you’re postulating unbounded growth, seems pretty arbitrary to not postulate engineering solutions to this problem (eg shift the energy intensive industry off-Earth).


Not all that energy will end up as heat. And the surface it may cover won't be significant compared with the Earth's. And if it is, it may shadow enough surface to actually cool down the Earth.

In any case, it won't be worse than releasing, in around a century, a good amount of solar energy that was captured in chemical form over millions of years.


67% of energy is wasted as heat: https://flowcharts.llnl.gov/sites/flowcharts/files/2023-10/U...

This will change when we switch to renewables.


What?

Renewables are still subject to the laws of thermodynamics and other laws of physics.

https://en.wikipedia.org/wiki/Heat_engine#Efficiency

Typical power stations (whether fueled by renewables or not) have efficiencies considerably less than 50% (the examples in this article give 40% efficiency for coal, 48% for nuclear, and 33% for geothermal).

Photovoltaics (not being Carnot-cycle heat engines) aren't subject to these specific limitations, but they have their own problems. Typical efficiencies for current mass-production models are around 20%, while "lab curiosity"-level cells still haven't broken 50%.

Note that this is just the generating side. There are also similar waste heat losses on the consumption side (for example, charging and discharging batteries is anything but 100% efficient, as anyone who's actually tried to use a modern laptop on his lap can attest).


  >Photovoltaics... are around 20%
The difference is, that 80% doesn't show up on anyone's "books." For coal, you actually need to go dig that 60% out of the ground and burn it, and it still emits a bunch of CO2.

For a fuller articulation of the point HN user thelastgallon is making, see this link:

https://www.sustainabilitybynumbers.com/p/electrification-en...


> The difference is, that 80% doesn't show up on anyone's "books."

Well, no. You still have to pay the amortized cost of the installation and real estate, the salaries of the employees, and many other things, all of which would be less if the efficiency were higher than 20%. For example, if the photovoltaics were 40% efficient (the same as coal), you'd need only half the real estate, half the semiconductor-grade silicon, and likely half the cost of many maintenance activities, none of which are free.

> https://www.sustainabilitybynumbers.com/p/electrification-en...

This site doesn't even mention solar or the efficiency thereof.


I didn't say they were free, I said that the energy doesn't show up on anyone's (energy) books. You know which books I mean: they're the ones where companies & countries report their energy reserves, the amount mined/burned/imported/exported, new discoveries this year, etc etc.

The reason this matters is because there's a lazy temptation to run the Electrification Calculation merely by looking up the amount of primary fossil energy burned annually, then assuming 100% of this must be replaced by solar/wind/whatever. However this simplistic calculation will over-estimate the amount of renewable energy needed by a factor of roughly 3x.

  >This site doesn't even mention solar or the efficiency thereof.
I fear you only want to 'win the debate' (vs reading for comprehension), but...

The entire thesis of the article is how mass renewable electrification enjoys large system efficiency gains over fossil fuels. It's pretty evident how solar is a critical enabler of mass renewable electrification.


My reading comprehension is just fine. You were arguing that coal had to be "dug out of the ground", unlike photovoltaic cells which are apparently dropped off on your doorstep for free by the Silicon Fairy or something.


There's the reading comprehension. Again, nobody is saying solar panels are free, and I'm not sure where you got the idea.

I'm saying that when states and corporations do their energy reporting, there's no need to report the non-absorbed ('waste') energy from PV. Sunlight striking the ground (and whether it's utilized in a way we appreciate, versus 'just' powering the weather and the water cycle) is not something we include in those numbers.

Heck, maybe we should make a home for your PV 'waste' energy, a new energy statistic that does account for all sunlight striking the Earth. So if you cut down vegetation to make a parking lot, it makes your country's energy numbers get worse. Neat! Maybe that would be useful as an additional metric, but it's far from what we're trying to measure with our existing energy reporting policies. Our existing policies emphasize the (much larger) problems of greenhouse gas emissions and local pollution impacts.

Anyway I think the point has been adequately made, cheers.


I think the more important facet is the output stage. Almost all energy consumed gets converted to heat, with only a small portion doing usable work. Unless there's a massive improvement in the electrical foundations of compute, we will be producing large amounts of heat no matter where the energy is sourced from.


unless... the compute happens in space too? Given how dirt cheap solar has become, how cheap shipping stuff to space is becoming, and how few clouds and nights there are in space to make solar production intermittent, it might be economically feasible in the not-so-distant future. (No, I haven't done any math on this. If it checks out, feel free to steal the idea.)


Ugh solar wind and cosmic rays. You'd need to use very inefficient CPUs with enormous features instead of the latest small node.


Your comment got me wondering if it's possible to stay in earth's shadow continuously without constant fuel expenditure, but apparently that's not possible: https://space.stackexchange.com/questions/55271/are-any-eart...


On the other hand, there's a lot of space... in space.


The issue with large components (we're talking microns instead of 20nm) is launch weight (coming down) and power (also coming down). Large components also mean larger silicon dies, which are much more expensive, and/or fewer components per die, which means the CPUs end up on different chips and need interconnect, which increases latency and interference. Not impossible, just a load of min-max-ing to do.


You would make the stuff in space, too. Give it a gentle shove off the factory loading dock (factory is on an asteroid) and a couple of years later it shows up in earth orbit, if you get your orbital calculations right…


Getting stuff from the asteroid belt to earth orbit is about as hard as the other way round. Definitely more than a gentle push


Not literally a gentle push, but very little rocket action is needed. The gravity well of an asteroid is tiny. The rest can be done with the correct slingshot maneuvers, the problem is calculating it. I am sure I have read something or other from NASA about it.


It's not the asteroid gravity that's the issue, it's the solar gravity field, you still have to perform an orbital transfer from the asteroid orbit to Earth orbit unless you want to leave the computer there and do batch jobs with significant latency.


Also true!!


Heat dissipation in space is hard.


And the Earth is in space, so if we get to the power consumption level where Earth governments need to care about the direct planetary heating effect of the energy source, it's still a win to do the hard thing (dissipation) somewhere else, like the Moon or something.


minuscule in comparison to greenhouse gas emissions, which can make the entire planet capture and hold more energy, not just a satellite.


You also affect how much energy (heat) you radiate from Earth's surface into space by choosing the right colors and materials for the buildings or land around you.


The Earth is in radiative balance with space on all but very short timescales. The extra heat, even if it were enough to matter, dissipates quickly.


We're not talking about just a one-time pulse of heat, it would be a constant stream of energy. To maintain the same radiant balance the Earth's surface would need to reach a higher equilibrium temperature.

We currently have an energy budget that's 0.1% of insolation (and compounding growth at 2-5% per year), so if SBSP actually scales to its market opportunity then the effect could certainly be large enough to matter.


All nuclear power is also extra heat for the earth.


And much more, per unit of electricity consumed.


Ultimately the problem with these plans is that solar panels are cheap, but launching them into space is not. There is no point in the near future where launch costs become so cheap that it doesn't make sense to do this instead just building ten times as many panels for the same cost and installing them around the world. The math may change if we ever deploy a space elevator or do asteroid capture and orbital mining/manufacturing, but those are all in the distant future.


If SpaceX succeeds with Starship, projected launch costs to LEO could be as little as $10/kg. Even off by an order of magnitude, $100s/kg might be enough to make it viable. There are plenty of population centers where land is scarce or there are bureaucratic hurdles due to stubborn land owners.


A solar panel weighs like 10kg and costs about $100. Then you also need equipment to produce the microwaves, fuel and an engine to keep the thing in place...
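To put those numbers side by side (all hypothetical cost tiers, using the ~10 kg / ~$100 panel figures above):

```python
panel_mass_kg = 10.0    # rough figure for one panel, per the comment above
panel_cost_usd = 100.0

# Hypothetical launch cost tiers, $/kg to LEO.
for launch_usd_per_kg in (10, 100, 1000):
    launch_cost = panel_mass_kg * launch_usd_per_kg
    print(f"${launch_usd_per_kg}/kg -> launch ${launch_cost:.0f}, "
          f"{launch_cost / panel_cost_usd:.0f}x the panel's own cost")
```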


So hypothetically, if it were roughly 2-3x the cost to deploy into space, there may be situations where it would be viable: no land usage on the ground, no maintenance, power could be beamed to various parts of the globe as it's needed, etc. Ground-based solar probably still wins on watts/$, but the better comparison might be versus small modular nuclear reactors. I think costs are close to the smaller reactors, but larger reactors are still more cost-efficient. It would be like an SMR that didn't require regulatory approvals and could cover a larger area as needed.


No maintenance? Really? One ‘aiming’ motor goes bad, and this thing could easily fry a long strip of the earth when it loses its lock on its ground station. I don’t think I want such an Elon Musk type of promise as “no maintenance” in this situation. Everything needs maintenance, and while space does different things to machines than air does, they still wear.


This video goes into some details about one proposed design: https://www.youtube.com/watch?v=YX1bcNqhhi8. The intensity of the beam would not be a concern. There is no "aiming" motor in this design; it's using a phased array.

Ground solar panels need relatively frequent cleaning due to dust, weather, and plants/animals in some areas. In space the biggest maintenance costs are gone. With the $10-$100/kg launch costs refueling would not be that expensive. Micrometeorites may be an issue, but if the rate of micrometeorite impact is low enough it may not be a concern. The 60-year costs of ground solar panels are actually worse than a large nuclear power plant due to the maintenance, so it is a significant part.
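One way to sanity-check the intensity claim, with illustrative numbers (the linked video's actual figures may differ): spreading a gigawatt over a kilometers-wide rectenna keeps the average power density well below sunlight.

```python
import math

beam_power_w = 1e9          # 1 GW delivered to the ground (assumed)
rectenna_diameter_m = 5000  # ~5 km receiving array (assumed)

area_m2 = math.pi * (rectenna_diameter_m / 2) ** 2
intensity = beam_power_w / area_m2
print(f"~{intensity:.0f} W/m^2, vs ~1000 W/m^2 for noon sunlight")
```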

Space solar may not be effective initially, but I think it could have its place similar to SMRs. Power could be generated 24/7. The ability to allocate power as needed across a whole hemisphere would allow us to optimize power grids without megatons of power cables. These could also beam power to moon bases or even mars bases eventually.

Also bringing up Musk was unnecessary. The space industry is bigger than one person and many of "his" ideas predate his involvement with the industry and have teams of many scientists and engineers doing the actual work. It would be a shame to throw out ideas that could facilitate human progress simply because we didn't like one personality.


I didn't mean EM himself was relevant here, I just think "no maintenance" for a device launched into the space seemed like the sort of promise he makes (specifically about how he has insisted over the years of describing his cars as 'full self driving').

TBH though I'm very much not qualified to debate about this type of technology, so I acknowledge that perhaps this could have no moving parts and if so maybe maintenance is less certain than I thought.


Conversely, solar panels are cheap, but environmental impact statements, permitting, and installation labour are not.

It's not immediately obvious that a mass-produced solar satellite could not scale better than thousands of individual legal jurisdictions and ecosystems.


Those things are true in the US. You don’t need to go all the way to space to find a place where solar panels can be deployed cheaply.


But you do need to factor in geography. Solar cells in space can be “always on and operating at peak efficiency” directly over the places that need the power, significantly reducing the need to build long distance transmission lines.


For me, the definitive critique of space-based solar power was written in 2012, by Tom Murphy. https://dothemath.ucsd.edu/2012/03/space-based-solar-power/

I haven't followed developments in solar cells or power transmission closely, but I get the sense that there's only been minor, incremental, improvements. The math still doesn't work.


I've always wondered (especially after a Dr Hossenfelder video about it). Even if we figured out all of the tech to get such a solar cell satellite into orbit for a reasonable cost, they still need a giant ground station to accept the power. How much power would that ground station generate if it was simply a bunch of solar cells instead of microwave receiver? A space cell might get power 24/7, but if 75% is lost in the conversion, how is that better than a ground based cell that gets power for only 6 hours a day?


Putting cells over the area, you get intermittent power. Putting antennas over the area, you get continuous power. And that's the Achilles' heel of solar -- you can't have things shutting down every time a cloud comes over.

Furthermore, microwave antennas are mesh, not solid. You won't have full sun under it, but neither will it be dark.

That being said, there's a fundamental issue here that without huge improvements in launch costs it's simply not viable unless made out of lunar materials.

And note that it doesn't have to be in a synchronous orbit so long as you permit some movement of the antennas. Put say 25 stations out there and 24 ground stations--they keep hopping to the next station as the Earth rotates underneath, the 25th station is offline because it's in shadow.


The real magic is not putting all the cells in one place. It's distributing them over the land mass.

A single solar power station can be taken out by a cloudy day.

A million little solar power stations spread across a continent average together into an even power source that provides power for longer than daylight hours.


There's a practical limit on how far you can ship power. A while ago I tried to model what it would take to maintain continuous power worldwide, looking just at day/night. Nope, couldn't be done even if your cells were free. The wires alone became impossible -- I was looking at the best wires to date, and that still translated into a number of nines (I forget how many) on the loss percentage -- and some mechanical bottlenecks where you simply didn't have enough land to run the wires.

The higher you run the voltage the more corona loss, the higher you run the current the more resistance loss. And there's a limit to how close you can put the wires to each other before they interfere. The band of land required for the massive power bus is gargantuan.
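The resistive half of that tradeoff is easy to sketch. This is a deliberately crude DC model with assumed line parameters (corona loss, converter losses, and right-of-way limits all ignored):

```python
def line_loss_fraction(power_w: float, voltage_v: float, length_m: float,
                       resistivity_ohm_m: float = 2.65e-8,    # aluminum conductor
                       cross_section_m2: float = 0.005) -> float:  # assumed bundle
    """Fraction of transmitted power lost to I^2*R heating in a simple DC line."""
    current_a = power_w / voltage_v
    resistance_ohm = resistivity_ohm_m * length_m / cross_section_m2
    return current_a**2 * resistance_ohm / power_w

# 5 GW over 10,000 km at 800 kV: resistance alone eats a huge share.
print(f"{line_loss_fraction(5e9, 800e3, 1e7):.0%} lost")
```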


Secondary usage as an impromptu death ray is the only way this pencils out for me. Even if you could cut the receiving area to 1/10 the equivalent solar panel area, the economics of launching a huge space array seem really difficult versus a fully land-based system.


Microwaves aren't absorbed uniformly by materials. You can have a potentially deadly amount of energy, but if you can't actually interact with it, then it's harmless.

The proposal for rectenna arrays was grids over farmland, which could be used for cattle grazing without ill effects to the cattle -- simply not enough EM can be absorbed by them for it to matter.


This idea made more sense before batteries started to get good. But solar + batteries can get you through the night now.

A panel in space can capture maybe 3x as much energy as one on the ground over a 24 hour cycle. But there are losses in transmission and huge costs to get the thing into space.
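Where a figure like 3x comes from, sketched with assumed capacity factors; after beaming losses the delivered advantage is smaller still:

```python
ground_capacity_factor = 0.25  # night, clouds, atmosphere, sun angle (rough)
space_capacity_factor = 0.99   # near-continuous sunlight in a high orbit
beaming_efficiency = 0.50      # optimistic end-to-end transmission (assumed)

capture_ratio = space_capacity_factor / ground_capacity_factor
delivered_ratio = capture_ratio * beaming_efficiency
print(f"captured: ~{capture_ratio:.1f}x, delivered after losses: ~{delivered_ratio:.1f}x")
```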


> But solar + batteries can get you through the night now.

They can't get you through a month of Dunkelflaute in Germany (that's a 1-in-100 years event), when the normal renewable energy generation is less than 10% of the nameplate capacity.


The German energy grid is not an island but it is part of the largest synchronous grid of the world: https://en.m.wikipedia.org/wiki/Synchronous_grid_of_Continen...

Which is furthermore tightly coupled with other grids such as the UK/Ireland and Nordic countries.


Not to worry, your nuclear base load will carry you through the tough times.


Just to dispel that myth once and for all: Germany's nuclear power never exceeded ~28% of total electricity production. It would never have played a significant role in such a scenario.

Oh, and let's not forget what happens when one too many power plants have to undergo maintenance while a few others have to shut down temporarily due to a heat wave... There is no such thing as a perfect one-size-fits-all solution to stable and sustainable energy. Oh, and btw, "baseload" is a term from the 70s -- there's not much energy-intensive heavy industry left in Europe to keep that term meaningful and relevant these days.


Yes, Germany needed _more_ nuclear.

> Oh, and let's not forget what happens when one too many power plants have to undergo maintenance while a few others have to shut down temporarily due to a heat wave...

Let's actually forget it. The largest nuclear power plant in the US is in a freaking _desert_ and is cooled by evaporating treated wastewater. Nuclear power plants can work just fine during the heatwaves, the plants just need to be designed for that.


> The largest nuclear power plant in the US is in a freaking _desert_ ... the plants just need to be designed for that.

That's two key issues here that you just carelessly tossed aside. For one, central Europe doesn't have deserts, or any large uninhabited regions for that matter. The US has a population density of 33.6 people/km², compared to 236/km² in Germany. All nuclear power plants in Europe are therefore located near rivers and cooled accordingly.

Secondly, building nuclear power plants takes a shitload of money and time. Case in point:

* Hinkley Point C UK - significantly delayed, to date 50% cost overrun; only continued after the UK government gave long term guarantees, including fixed minimum electricity prices

* Olkiluoto Nuclear Power Plant Unit 3 Finland - 13 years delayed, 45% cost overrun

* Flamanville Nuclear Power Plant Unit 3 France - 12 years behind schedule, a staggering 5x cost overrun

* Plant Vogtle Unit 3-4 USA - massive delays and 2.4x cost overrun, Westinghouse filed for Chapter 11 due to losses from its nuclear business during construction

* etc.

So no, Germany didn't need _more_ nuclear.


Over time solar only ever delivers about 10% of its nameplate capacity. Over a year my rooftop panels in Australia get about 12.5% of their nominal rating -- i.e. about half of the daytime they'll see.

The German grid mix very obviously crushes back down to being fossil fuels every night.


Yes, my bad. I meant 10% of the normal average generation.

Here is an example: https://energy-charts.info/charts/power/chart.htm?l=en&c=DE&... - for about a week the renewable generation crashed in Jan 2019.

And it can happen for almost a _month_ of sustained low performance.


average capacity factor for utility-scale solar pv in california is 29%. no state in the usa is as low as 10%. why is your rooftop like maine?


Sydney, Australia. Temperate climates have a lot of grey sky, and we've had a lot more recently.


california is temperate but they put the solar farms in the desert


They're not meant to, that's what we have hydrogen for.


"Have"?


We "have" hydrogen to buffer us through winter in the same way we "have" batteries to buffer us through the night. The technology is ready, we're just waiting on sufficient renewable supply so that it starts making sense to build storage (instead of investing the money into, e.g. more transmission lines, or making loads more flexible). That point is most likely still ten to twenty years in the future. If we're lucky storage will get cheaper during that time, but even if it doesn't it wouldn't be a catastrophe.


"Ready"?

No, it's not. The long-term hydrogen storage demonstrator is not even completed yet. There is essentially no electrolyzer capacity, and long-distance hydrogen pipelines are even scarcer.

Sorry, but for now, hydrogen is nowhere close to reality in Germany. That's also why Germany is _subsidizing_ 10GW of new natural gas generation. After signing a 15-year LNG contract with Qatar.


Linde has been storing hydrogen underground for a long time now [1], but you can store and transport it everywhere you can store and transport methane if you're willing to lose a few percent per month. It's simply another cost factor. And it's not surprising that we don't have a lot of electrolysis capacity given the economics I pointed out above.

[1] https://www.linde.com/clean-energy/our-h2-technology/hydroge...


The European high-pressure demonstrator is still ongoing: https://hypster-project.eu/about-the-project/

BTW, hydrogen has a 100-year GWP of 12, so leaking 2-5% (the current figures) is not acceptable long-term.
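For scale, a rough sketch of what that leakage implies per kilogram of hydrogen actually delivered (GWP and leak rate taken from the figures above, everything else illustrative):

```python
# Rough CO2-equivalent penalty of hydrogen leakage.
gwp_h2 = 12          # kg CO2e per kg H2, 100-year horizon (figure above)
leak_rate = 0.03     # 3%, middle of the 2-5% range quoted above

kg_delivered = 1.0
kg_leaked = kg_delivered * leak_rate / (1 - leak_rate)  # leaked per kg delivered
co2e = kg_leaked * gwp_h2
print(f"{co2e:.2f} kg CO2e per kg H2 delivered")   # ~0.37
```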


> 3x

This feels low to me (not an expert at all). There are so many advantages in space: much longer sun exposure, no atmosphere or weather to deal with, etc.


> antennas so big that we cannot even simulate their behavior.

I really would love another sentence or two on this. I can't immediately think why that would be, e.g. don't Maxwell's equations apply at very large scales? Any ideas?


My two cents, though I'm no expert: I'd bet it's what the computational EM community (and other fields) calls a "multiscale" problem. EM solvers - that is, simulators that numerically solve Maxwell over some geometry-and-source boundary conditions - find E- and H-fields at certain "mesh points". In other words, they discretize 3-space into a grid and calculate solutions to Maxwell at those points.

In general, you'll want your mesh to have subwavelength distance between points, and perhaps even less in regions with complicated geometry or parts of your geometry you're particularly interested in. In the microwave regime, this means mesh points will typically have tens of centimeters or less between them. However, given that the receiving antennas in satellite-based solar power are orders of magnitude larger than that, trying to simulate such a large structure and still keep your mesh points relatively dense is just asking for the curse of dimensionality to bite you.

In other words, it's certainly possible with enough compute time, but we have better things to do with our GPU cores, especially since the whole point of antenna simulation is to assist with design by allowing you to run a bunch of simulations to tune your design without having to fabricate a bunch of DUTs. Again, I'm not really an expert, but my understanding is that this kind of multiscale problem is a hot research topic right now, not only in computational EM but in many other areas of physics simulations, especially those governed by nasty PDEs (e.g. fluid dynamics) or those which involve complicated structures at multiple scales (e.g. VLSI design).
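To put numbers on the mesh-density problem, here's a back-of-envelope cell count. The frequency, aperture size, mesh spacing, and slab thickness are all assumed values, not taken from any actual design:

```python
# Rough estimate of the FDTD-style mesh size needed to simulate a
# km-scale aperture at microwave frequencies. All inputs are assumptions.
c = 3e8                       # speed of light, m/s
f = 2.45e9                    # a commonly proposed beaming frequency, Hz
wavelength = c / f            # ~0.12 m
dx = wavelength / 10          # typical subwavelength mesh spacing, m

aperture = 1000.0             # hypothetical 1 km x 1 km antenna, m
slab_thickness = 10.0         # simulate a 10 m thick region around it, m

cells_per_side = aperture / dx
total_cells = cells_per_side ** 2 * (slab_thickness / dx)
print(f"{total_cells:.2e} mesh cells")   # trillions of cells
```

Even this crude count lands in the trillions of cells, before accounting for time-stepping or any of the multiscale detail near the feed structures.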


They're talking about simulating big floppy structures in space.


"Each megasat could then convert gigawatts of power into a microwave beam aimed precisely at a big field of receiving antennas on Earth. These rectennas would then convert the signal to usable DC electricity."

What are the consequences if, for some reason, the aim becomes not-so-precise?


Vague recollection from studying this years ago...

The "receiver" is more-or-less a field "covered" by a spiderweb of bent coat-hanger wire. That doesn't block the sun, and making it huge (low power/sq. m) is quite cheap.

Since you don't want clouds/rain/fog to block the microwaves, the frequencies you use are ones which water does not absorb well. So if the beam hits a person... they probably won't even notice it.


But electronics probably would. I have fried a micro-controller with a leaky industrial microwave waveguide before.


Yes and no. On the (figurative) day when microwave power transmission becomes normal, designing & building electronics so they are relatively immune (or can leech power if needed) will also become normal. Because stuff that isn't immune will go wonky in enough circumstances (driving near a ground receiver, etc.) that consumers will notice, and get pissed.


I guess SimCity 2000 guaranteed this is the first thing everybody thinks about.

It seems that practicality and efficiency concerns limit the beam at between 10% to 30% of the solar intensity. Every single design falls in that range.
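For a sense of scale, assuming the middle of that range (all numbers here are illustrative, including the conversion efficiency):

```python
# How much rectenna area does a gigawatt take if the beam is held to
# ~23% of one sun? All inputs are rough assumptions.
solar_intensity = 1000.0      # W/m^2, rough sea-level full sun
beam_fraction = 0.23          # middle of the 10-30% range quoted above
beam_intensity = solar_intensity * beam_fraction   # 230 W/m^2

target_power = 1e9            # 1 GW delivered
rectenna_eff = 0.85           # optimistic RF-to-DC efficiency (assumed)
area_m2 = target_power / (beam_intensity * rectenna_eff)
print(f"{area_m2 / 1e6:.1f} km^2")   # ~5 km^2
```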



Which sort of leads itself to military usage...

And Wikipedia is only talking about humans, not about electronics, which are much more sensitive to microwaves. Want to fry the enemy's electronics? Just focus that "civilian" microwave transmitter with a few GW power tightly on where you want it to be.

I am really surprised nobody talks about this whenever space-based power transmission is mentioned.


If you trace the money, every researcher working on space based power is getting money from DoD.


The beam is controlled by a signal beamed from the rectenna, which controls the phase of the emitters at the satellite. If this fails, the emitters go out of phase and no beam is formed.
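A toy numerical illustration of that failsafe (a sketch, not any real design): with N elements phase-locked to the pilot, field amplitudes add coherently and the peak intensity scales as N^2; with the pilot lost and phases random, it collapses to roughly N.

```python
# Toy model of a retrodirective phased array losing its pilot signal.
import cmath
import random

random.seed(0)
N = 10_000  # number of emitter elements (arbitrary)

# Phase-locked: every element arrives in phase at the rectenna.
locked = abs(sum(cmath.exp(0j) for _ in range(N))) ** 2          # = N^2

# Pilot lost: each element ends up with a random phase.
unlocked = abs(sum(cmath.exp(1j * random.uniform(0, 2 * cmath.pi))
                   for _ in range(N))) ** 2                      # ~ N on average

print(f"locked peak ~ {locked:.0f}, unlocked ~ {unlocked:.0f}")
```

So losing lock doesn't just mis-aim the beam; it defocuses it by a factor of roughly N.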


1) That's a happy path which might or might not work during a malfunction. Imagine a terrorist deliberately disabling the original control signal and then using their own control signal to steer the beam onto a city.

2) The beam can be used as a weapon just by pointing it where you want and lighting it up, so deployment would sooner or later be heavily restricted or outright banned, the same way nuclear weapons in space are banned.


Futurama predicted it: https://youtu.be/0qksm5cRtcU

Unfortunately, most of these impossible proposals lack any form of passive failsafe. Plus, the insurance liability question may be an insurmountable unknown.


Most of these proposals are downright ridiculous: they're couched in environmentalism, but they act like nature will be totally fine with a giant microwave laser or an artificial second moon.


It seems like it could be safer/easier to charge batteries in space and ship them back down... which probably speaks to the feasibility of ideas like this.


Goldeneye


And what if the beam is deliberately aimed at an enemy city?


Only one measly paragraph on the obvious improvement, eliminating the microwaves. Instead, launch independent steerable mirror satellites of the most economically efficient size, and point them at the highest bidder.

No heavy transmitters or PV cells. No new ground-based infrastructure that has to be built before you can do anything useful.


The atmosphere is opaque to sunlight very often... Because of clouds.



If god wanted power to be beamed to Earth from space, she would have put a giant reactor in the sky.


...the Sun

(also wouldn't the appropriate word be "goddess"? Interestingly in ancient German tradition the sun is female and the moon is male)


That was not intended as a riddle, so I have not prepared stickers, unfortunately.

Personifications of the sun and moon in ancient German lore had better make them respectively female and male, because otherwise the assignment would clash with the grammar, which has them as "die Sonne" and "der Mond".


We don't have the tech to send astronauts to work on geostationary or geosynchronous satellites. It's a high Earth orbit (~22,000 miles), and all our tech is for low Earth orbit (Shuttle, ISS, ~400-800 km), where the ship and station are protected by Earth's magnetic field and the propellant requirements are much lower. To the best of my knowledge we have never repaired a geosynchronous or geostationary satellite.


Casey Handmer did a couple of excellent articles on this topic:

- Space-based solar power is not a thing: https://caseyhandmer.wordpress.com/2019/08/20/space-based-so...

- No really, space based solar power is not a useful idea, literature review edition: https://caseyhandmer.wordpress.com/2019/09/20/no-really-spac...

You can get into the weeds of the detailed costings, safety, etc, but I think the clearest argument is this:

> The problem with beaming power using microwaves is that the monetizable value per Watt is incredibly low, because essentially unmetered electricity comes out of the walls of every building. The trick is to increase the value per Watt, by increasing the value and decreasing the power. The value is increased by modulating the microwaves with high speed data, and the power can be reduced by a factor of a million or so without hurting this method. Indeed, customers pay only for the data, and not for the transmitted electrical power, which is pathetically low at the receiver. Communications satellites remain the killer app for the commercial space industry

And then you look at the fact that almost every satellite communications company has gone bankrupt at some point. SpaceX with Starlink is a notable exception, but OneWeb, which superficially looks pretty similar, has already gone bankrupt once. If communications, which is many orders of magnitude more valuable than power, is not enough to stave off bankruptcy, then there is no possibility of beamed power being economical without a commensurate improvement in the efficiency of space launch. And that's just a necessary condition, not a sufficient condition for it to be a sensible business.
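Handmer's value-per-watt point can be made concrete with rough, assumed prices (the $0.10/kWh electricity price and the ~$1M/yr lease on a ~100 W transponder are illustrative figures, not sourced numbers):

```python
# Back-of-envelope value-per-watt comparison. All prices are assumptions.
hours_per_year = 8760

# Beamed power: sells as ordinary electricity.
electricity_price = 0.10                 # $/kWh, assumed retail-ish price
power_revenue_per_watt = electricity_price / 1000 * hours_per_year
# ~ $0.88 per watt-year

# Comms: a transponder radiating ~100 W of RF, leasing for ~$1M/yr (assumed).
transponder_rf_watts = 100
transponder_lease = 1_000_000            # $/yr
comms_revenue_per_watt = transponder_lease / transponder_rf_watts
# ~ $10,000 per watt-year

print(f"data is worth ~{comms_revenue_per_watt / power_revenue_per_watt:.0f}x "
      f"more per transmitted watt")
```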


In general, Earth only retains less than around 3% of the sun's energy hitting our atmosphere. Shifting this property even by a fraction of a percent causes cascading shifts in global climate and ecology.

"There is always a well-known solution to every human problem - neat, plausible and wrong" (Mencken, 1920)

Threat assessment: what if a given technology is wielded by an insane king? ...because all things are eventually... =3

https://www.youtube.com/watch?v=lITBGjNEp08


The main issue with beaming solar power from orbit to surface is that this can be a weapon at a push of a button. There is no way you can make the owner of that orbit-based power station to promise they'll never redirect the beam, under any circumstances, cross their heart and hope to die. The only way to not have that threat is to not have the power station up there.

And that's how it's going to be. No matter how technically feasible the idea will ever be, it will never materialize, because the weaponization potential cannot be eliminated.


Every square inch of the Earth is within range of a weapon of mass destruction already. What's another?

(Speaking realistically, it would not be a weapon. These things take acres of antennas to collect power from.)


> These things take acres of antennas to collect power from.

Ok. Now imagine you redirect the beam to a random substation somewhere. What's the likelihood some of the transformers will fry? Or you point the beam at some cellular phone network towers?

Russia is at war right now, and they have lots of antennas pointing up, both for anti-air defense and for electronic warfare. All those would be immediately vulnerable to a concentrated power beam coming from the sky. How likely are they to say, "sure, no big deal, as long as you promise to only use that stuff for power generation, we are totally, absolutely fine with it" ?


The beam isn't necessarily concentrated enough to be damaging, so it isn't obvious that such a system would be a weapon.

In other words, the weaponization potential can be eliminated, and doesn't exist in lots of proposed designs.


>... because the weaponization potential cannot be eliminated.

So you're saying it'd make for an excellent cover story?

Besides, I've always been annoyed that Goldeneye wasn't real. Not so much Under Siege II.


If one just concentrates solar energy that would have hit other parts of the Earth, I don’t see why there would be a net heating.

That said, large-scale concentration could still be extremely disruptive on a regional scale.



There is another problem to solve, which is cooling the Earth. What does it take to shield the Earth from Sun so less solar power reaches the Earth? Is there such technology?


"Tin foil". Arranged in the right shape, and spun to maintain positioning at L1. About a million square kilometers give or take should do it, plus ancillary things like positioning thrusters etc.

Call it five to ten million metric tons, if we're using ultrathin kapton with sputtered aluminum. Launch costs alone would be in the upper billions to low trillions, combined with construction and maintenance, probably 1-2 trillion as a lowball estimate.

N.B. this only reduces incoming power by about 0.1%, roughly 1 W/m^2!
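A quick sanity check on the launch-cost range, using the mass estimate above (the $/kg figures are assumed future launch prices, not quotes):

```python
# Launch cost for a 5-10 million tonne sunshade. All $/kg figures assumed.
mass_kg_low, mass_kg_high = 5e9, 1e10    # 5-10 million metric tons
cost_per_kg_low, cost_per_kg_high = 100, 300  # $/kg, optimistic vs moderate

low = mass_kg_low * cost_per_kg_low      # $5e11 -> upper hundreds of billions
high = mass_kg_high * cost_per_kg_high   # $3e12 -> low trillions
print(f"${low / 1e9:.0f}B - ${high / 1e12:.1f}T in launch alone")
```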


Today's solar storm might be a good way to prove if we could ever rely on that, even if it were possible.


"antennas so big that we cannot even simulate their behavior. " - does anyone know what that means?


Sounds like something that isn't true


I would think bigger antennas are easier to simulate.


How about a conductive space elevator? You could physically wire solar panels to the end.


You could, but the minimum length of a space elevator is 35,786 km and the maximum distance between two points on the surface of the Earth is about 20,000 km, so you might as well just build a surface power grid: your winter solstice midnight is someone else's midday summer solstice, and no dunkelflaute is worldwide.

Much easier on the ground, too, because you don't need to invent even one single new tech — not even better superconductors because even aluminium will work if you make the "wire" thick enough (scare quotes because, by coincidence, the circumference of the earth, 40,000 km, almost perfectly matches the conductivity of aluminium, 3.8e7 S/m, and you get a 1 Ω line from a 1 square meter cross section, which is a pretty thick "wire").
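That figure checks out from R = L / (σA):

```python
# Resistance of an Earth-circumference aluminium conductor,
# using the numbers given above.
sigma_al = 3.8e7       # S/m, conductivity of aluminium
length = 4.0e7         # m, roughly Earth's circumference
area = 1.0             # m^2 cross section

resistance = length / (sigma_al * area)
print(f"{resistance:.2f} ohm")   # ~1.05 ohm
```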

Plus, once it gets down from a space elevator, you then have to distribute it around the ground anyway.


It might only be viable if the majority of materials come from off-Earth, like the moon.


Would Room temperature superconductors change anything here?


Presumably it'd make it much more practical to store energy on Earth, killing the idea once and for all.


I can't see how, why would you expect them to?


Laser power beaming for aircraft seems more practical to me.


Given how hot jet engines run, the capability to power aircraft that way in general seems identical to fielding at least several thousand anti-materiel laser weapons that are necessarily designed to fire almost continuously and which automatically get worldwide coverage.

I'm not saying this will never happen (10% odds Musk tries it), but I don't see that being possible without a single world government, as whoever controls such a system will definitely never fear ICBMs or hostile aircraft.


You can't hit what you can't see.

Put an ablator on the nose of the ICBM. As it approaches burn-through the warhead salvage-fuses. Your defenders have EMP issues, but let's assume they can shield against that. The problem is now the sky is full of electronic ghosts from the first detonation. Pretty soon another warhead comes along through that ghost-filled sky. The defenders have a much harder time locating it accurately enough to fire. And, once again, when it's going to burn through it salvage-fuses. The sky gets worse.

You can stop any given missile. Stopping a whole string of them is quite another matter.

(Same as an aircraft pilot beating an incoming missile. Do it right and you can use the missile's speed against it, forcing it to make a turn it can't. But now you're out of position and can't do it against a second missile coming in some seconds behind.)


This was one of the ideas pursued in the Reagan-era Strategic Defense Initiative, aka Star Wars. They weren't able to develop a system that would work against ICBMs. Powering an aircraft is both easier and harder; the aircraft presumably isn't making evasive maneuvers and trying to stop you from powering it, but the economic and safety constraints are harder.


IIRC, one of the problems with SDI was that at the time the only way to make lasers powerful enough to destroy an ICBM was to pump the lasing medium with a nuke, making them one-shot devices.

Sensing may still be a problem today, especially as stealth is also improving, but detection in general is much easier than it was in the Reagan era.


That doesn't sound like an airline I would care to fly!

During takeoff a 747 consumes power at a rate of about 90MW. Having something outside the plane, whether in orbit or ground-based, pumping that much power into the plane while I'm in it, sounds quite alarming. Not to mention issues with aiming, power loss, etc.

To power a plane with renewable sources, it seems most practical to generate power on the ground and use that to produce synthetic fuel.


Your laser is in the high tens of megawatts, perhaps even low hundreds. That's quite capable of shooting pretty much anything out of the sky, so you had better keep the beam perfectly on target. And what's the target, anyway? How do you convert that kind of power into propulsion with very high efficiency? (If your efficiency isn't good enough, your plane melts.)


Choose a wavelength that doesn't penetrate the troposphere. Also, multiple beams focused on the aircraft from different angles will not be focused on the ground.

As for conversion: probably some kind of PV, actively cooled.


High-power lasers don't work over long distances in the atmosphere due to blooming.

https://en.wikipedia.org/wiki/Thermal_blooming


The aircraft would be beam powered at altitude, where the air is thin, and where the vehicle is moving quickly, so any given parcel of air is out of the beam quickly.


Dude where's my Ion Cannon?


Gee if only the Sun were somehow capable of beaming energy to the Earth.

/s


Not to mention distributing it across the world after beaming it...


Don't be a commie giving away the essentials of life for free. Monetize! /s


deregulate nuclear


> As the recently retired head of space power systems at ESA

Ah, he’s European.


SBSP is a cover for new military capability. Increase the array size by 2.5x and you get a true death ray: https://wikipedia.org/wiki/Space-based_solar_power#Safety



