The biggest roadblock to widespread adoption of solar as a baseload power source is the storage problem. Photovoltaics stop producing power when the sun goes down; that's not just inconvenient, it's unworkable given the way power is used today. Until we can economically shift the output curve of solar power plants to match demand rather than supply, solar will fill no more than a niche. Today the only way to do this is to work in concert with hydropower, but that is a very limited solution.
Also, it's not strictly necessary to solve the problem on a large scale; even at a small, consumer-grade scale it could be helpful. If every house had a battery pack or supercap bank or what-have-you, and it allowed for smoothing out power demand or perhaps enabled charging electric vehicles overnight, then it could have a huge impact on energy usage patterns. Even with the PV -> battery -> load conversion losses it would still be a substantial net win.
That's a problem that's much farther down the road, say when solar penetration goes above 10% or 20%.
In the meantime you could just throttle hydro plants and natural gas plants to account for usage variations as well as wind and solar production variations.
It's a problem preventing solar penetration beyond a niche market. It also, as you point out, prevents solar from actually displacing existing power plant capacity. That means solar power comes at an extra cost, since it doesn't obviate the need to build even a single non-solar power plant. Those are big problems, and it's not as though we're magically going to start building lots and lots of solar capacity without solving them. The sooner they're tackled, the sooner those technologies get on the amortization/improvement train and their costs fall.
Solar and wind are just sideshows in power production today; if you want them to be otherwise, the smart move is to invest in storage technologies.
I agree with your sentiment, but solar prices have plummeted and clever new financing arrangements from Solar City et al. have made them much more accessible, in the US at least.
Bad things are cheap to benefit from only when one is allowed to dump the real costs on others. Real costs were allowed to be dumped on others only because there weren't any real alternative solutions to the problems at hand. Things change.
For example, beyond a certain size, our cities would be unlivable without waste regulations. So the freedom to throw their waste out of the front door was taken away for people living in cities.
> The biggest roadblock to widespread adoption of solar as a baseload power source is the storage problem.
While solar as a baseload source is being pursued (with molten salt as a "heat battery") - already installed [1], this is not what "solar power" is generally referring to.
The essential solar vision is to decentralize power production, and literally go "power to the people" with residential and commercial panel installations that supplement the need for grid power.
While batteries at this level would be preferred (allowing installations to go entirely "grid optional" or "off grid"), simply augmenting grid power would likely greatly affect base load requirements for the centralized power plants (by reducing peaks).
Methylammonium ions can be had from methylamine, which is dirt-cheap (made from methanol and ammonia, usually).
Halogens are fluorine, chlorine, bromine, iodine. All common and cheap.
Plumbates are lead and oxygen. Again, common and cheap.
So, no, there shouldn't be any supply problem.
Disposal might need some watching due to the lead, but I doubt this would involve anywhere near the volume that's used in (e.g.) lead-acid car batteries.
Chlorine is available in quantity, but fluorine, bromine, and iodine are not. Bromine and iodine are actually quite rare - there's more uranium than either of those.
Lead is also somewhat rare.
> All common and cheap.
Exactly what I'm talking about. In the small quantities we use currently - common and cheap. In the quantities needed to make energy? Not common at all.
> So, no, there shouldn't be any supply problem.
Actually, if that's what it's made of there will be huge supply problems.
Based on all this I suspect this is the last we will hear of this technology. It works in the lab, but is not practical at scale.
Although maybe those elements are just used at ppm levels, and the bulk of this is silicon, then this could work.
> but I doubt this would involve anywhere near the volume that's used in (e.g.) lead-acid car batteries.
To provide energy at country level scales? It would use WAY more. I once calculated that there is not enough lead on the planet to make enough batteries to store enough energy for overnight use. (In the context of batteries to buffer the diurnal nature of solar energy.)
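For the curious, here's a rough back-of-envelope version of that kind of calculation in Python. All the inputs are round figures assumed for illustration (the 15 TW world power figure used elsewhere in this thread, a typical lead-acid specific energy, a rough lead fraction of pack mass, and a ballpark reserves estimate), not authoritative numbers:

```python
# Rough sketch: could lead-acid batteries buffer the world's power overnight?
# All figures are assumed round numbers for illustration only.
WORLD_POWER = 15e12        # W, total human power consumption (round figure)
HOURS_OF_STORAGE = 12      # buffer one night
PACK_ENERGY_DENSITY = 35   # Wh/kg, typical lead-acid pack
LEAD_FRACTION = 0.6        # rough share of pack mass that is lead
LEAD_RESERVES = 9e10       # kg, ballpark world lead reserves (~90 Mt)

energy_needed = WORLD_POWER * HOURS_OF_STORAGE    # Wh, ~1.8e14
pack_mass = energy_needed / PACK_ENERGY_DENSITY   # kg, ~5e12
lead_mass = pack_mass * LEAD_FRACTION             # kg, ~3e12

print(f"lead needed: {lead_mass:.1e} kg")
print(f"multiple of world reserves: {lead_mass / LEAD_RESERVES:.0f}x")
# -> roughly 30x known reserves, i.e. "not enough lead on the planet"
```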
From the article linked by danmaz74, the most commonly studied chemical is CH3NH3PbI3, so lead and iodine are a significant portion. But it's a thin layer on a substrate like titanium dioxide or zinc oxide. They don't say exactly how thin, except to say it's less than several micrometers including the substrate. I'd really like to see numbers on how much of each element would be needed per gigawatt, and compare to what's available.
Anyone who doubts that your battery calculation was correct should read "A Nation-Sized Battery" by Tom Murphy, a physics professor at the University of California, San Diego. His other blog entries do a fine job with similar calculations for other energy technologies.
Assume the film is entirely lead (likely the limiting component -- as others have noted there is no shortage of bromine, chlorine, and iodine in the ocean) and 10 micrometers thick.
Lead has a density of 11,340 kg/m^3.
A ten micrometer thickness means that each m^2 of material will use about 0.11 kg of lead.
While the earth receives something like 1,000 watts/m^2 in full sun, other factors (e.g., night, clouds, different sun angles) mean that the mean is closer to 250 watts/m^2.
This stuff is expected to reach 20% efficiency, so each m^2 of collector will produce a mean output of 50 or so watts.
Human civilization as a whole produces about 15 terawatts of power, so replacing it would require about 300 billion square meters of collector.
At 0.11 kg/m^2, that would require about 33 billion kg of lead.
World production of lead is about 8 billion kg per year.
So, unless I messed up the arithmetic (please correct if so) total replacement of all existing power sources would require only about 4 years of lead output. Given that that will never happen (and even if it did, the time scale would be much longer than 4 years), it is safe to say that the availability of lead is not a limiting factor for this technology.
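If anyone wants to check it, the whole chain fits in a few lines of Python, using exactly the round figures assumed above:

```python
# Back-of-envelope check of the lead-film estimate above.
LEAD_DENSITY = 11_340     # kg/m^3
FILM_THICKNESS = 10e-6    # m (assume the film is pure lead, 10 um thick)
MEAN_INSOLATION = 250     # W/m^2 (averaged over night, clouds, sun angle)
EFFICIENCY = 0.20         # expected cell efficiency
WORLD_POWER = 15e12       # W, total human power consumption
LEAD_PRODUCTION = 8e9     # kg/yr, world lead output (incl. recycling)

lead_per_m2 = LEAD_DENSITY * FILM_THICKNESS       # ~0.11 kg/m^2
watts_per_m2 = MEAN_INSOLATION * EFFICIENCY       # ~50 W/m^2
area_needed = WORLD_POWER / watts_per_m2          # ~3e11 m^2
lead_needed = area_needed * lead_per_m2           # ~3.3e10 kg
years_of_output = lead_needed / LEAD_PRODUCTION   # ~4 yr

print(f"lead per m^2:   {lead_per_m2:.3f} kg")
print(f"collector area: {area_needed:.2e} m^2")
print(f"lead needed:    {lead_needed:.2e} kg")
print(f"years of world lead output: {years_of_output:.1f}")
```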
Your arithmetic is correct except for lead production figures and optimism on iodine. You are including recycling, so real production is about half of what you wrote.
The trouble is that at current rates of consumption the world will run out in 42 years (according to wikipedia). So this project would use 20% of all the remaining lead in the world. (Well not really, the film is not 100% lead, but it's still quite a lot.)
I call that "not enough". Although it might be worth it anyway. But what do you do when energy use goes up?
And despite people saying there is bromine and iodine in the ocean, there is no practical way to get it out of the ocean in quantity. There is everything in the ocean - in huge quantities! For example 1/10 as much gold as has ever been mined by humans exists in the ocean - but no one can get it out in quantity.
World production of iodine is about 1/200 of lead production and we already see that we barely have enough lead.
And world reserves of iodine are about 1/100 those of lead. Considering we'd need a significant percentage of all the lead in the world, there is no way there is enough iodine.
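A quick stoichiometric check supports this. If the film is CH3NH3PbI3 rather than pure lead (an assumption, and the opposite extreme of the estimate upthread), each lead atom comes with three iodine atoms, so the iodine mass is nearly twice the lead mass. Pairing that with the 33 billion kg lead figure upthread, and a rough ~30,000 tonnes/yr world iodine production figure, gives:

```python
# Rough stoichiometric check: CH3NH3PbI3 carries 3 iodine atoms per lead atom.
# All inputs are round figures assumed for illustration.
M_PB, M_I = 207.2, 126.9    # atomic masses, g/mol
lead_needed = 3.3e10        # kg, from the film estimate upthread
iodine_needed = lead_needed * (3 * M_I / M_PB)   # kg, ~1.84x the lead mass
IODINE_PRODUCTION = 3e7     # kg/yr, ~30,000 t/yr world output (rough)

print(f"iodine needed: {iodine_needed:.1e} kg")
print(f"years of world iodine output: {iodine_needed / IODINE_PRODUCTION:,.0f}")
# -> ~6e10 kg, on the order of 2,000 years of current production
```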
But we are ignoring the substrate. It won't be so thin, so we'll need a lot of it. I wonder how much of it is for mechanical strength (i.e. replaceable) and how much is essential.
Maybe this could work - using a thin film is very promising, but I'm skeptical this could be scaled.
I still don't understand where you're going with this. Just because it's not capable of entirely providing our current power consumption doesn't mean it's a bad idea for deployment. After all, our current majority power source is a finite resource that's going to run out, and similar resourcing issues apply to uranium.
You're probably going to recommend Thorium, but the barriers to commercialisation there are more serious.
Personally I suspect we'll end up with 30% solar, 30% wind, 40% other (tidal, nuclear, geothermal, biomass etc).
> I still don't understand where you're going with this.
This is a useful technology only because it is inexpensive. If it were more expensive it would not have value.
Because it relies on rare elements, it cannot be used at scale: as soon as you do, the price goes up, it's no longer inexpensive, and it's no longer useful.
This feedback loop has killed every solar technology I've read about except silicon.
It sort of depends on how much material is needed per unit area of panel. Bromine is apparently abundant in the oceans and annual production is ~half a million tonnes (wikipedia!).
Iodine is very much the same. While there's not a huge amount in the Earth's crust, relative to some elements, it's relatively abundant in seawater and relatively easy to extract – for anyone who really needs it.
This is like saying we'll never run out of coal because carbon is so prevalent: it's not the right form. Silicon purification is energy-intensive, a major factor in the cost of silicon cells. Perovskite and other thin-film technologies potentially have the advantage of it being easier to handle liquids than thin solid silicon wafers; just deposit the final result of the process on glass.
Perovskite is also common in "the planet": 93% of the mantle is silicate perovskite, according to Wikipedia. I'm not sure the geological abundance (accessible or not) is the determining factor in the cost of solar cells made with these materials.
Edit: actually, it seems from the more detailed descriptions that the perovskites useful in solar cells have a totally different chemical formula from the common geological ones, so your point about abundance of silicon needs to be measured against whatever elements these (presumably synthetic) perovskites contain.
You're not one of those RoHS deniers are you? I hate to break it to you, but EVERYTHING made in the last ten years is made with lead-free solder and ta-da: it all works perfectly well.
Over the past few years, pretty much everything critical has been built with standard SnPb solder under various RoHS exemptions. Most of those exemptions have recently expired or are expiring soon. It is far too soon to pat yourself on the back over the apparent absence of adverse consequences.
Agreed, but I'd add that the transition period coincided with the switch to water based flux processes, and the headlong rush to China. All of those things involved their own sets of birthing pains.
The issue is whether solar panels containing lead are going to be the same as solder containing lead. I'm going to go with "no, not exactly" as solder is an OTC consumable that gets thrown around during DIY, fragments dropped on the floor where pets and toddlers can chew on them, it gets heated and inhaled, etc.
Solar panels are not quite the same. No doubt lead inside the sealed unit will have implications for safe disposal; but then lots of things require special disposal, e.g. batteries.
It's only $44, shipped! It's enough lead to measurably lower the IQ of ten million infants.
The idea that out-of-control EU regulators prevent anyone, individual or industry, from acquiring and using elemental lead products is just another libertarian bugaboo that routinely pops up here on HN.
It's a useful statistic, particularly as it places a minimum bound on the land area requirements. In fact most renewables (solar, wind, biomass, wave, tidal power) can be translated into land-use requirements: so many kW/m^2 or MW/hectare. Useful conversion: 1 kW/m^2 is about the peak insolation at Earth's surface in full sun, which translates to 10 MW/hectare, or 1 GW/km^2.
Knowing that 1% of the incident sunlight on Britain would provide the nation's energy requirements, and that solar cell efficiency is 20%, means that Britain could be energy self-sufficient with 5% of its land area covered with PV cells. The land area of the UK is about 229,000 km^2, so roughly 11,500 km^2 of cells. Knowing the material requirements per m^2 for perovskite solar cells would give us a first-order bounds check on how viable the technology might be for this application.
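A quick sanity check on those numbers, treating the land area and the 1% figure as assumed round values:

```python
# Sanity check on the Britain estimate (assumed round figures).
SUNLIGHT_FRACTION = 0.01    # 1% of incident sunlight covers UK energy use
CELL_EFFICIENCY = 0.20
UK_LAND_AREA = 229_000      # km^2 (roughly)

land_fraction = SUNLIGHT_FRACTION / CELL_EFFICIENCY   # 5% of land area
pv_area = UK_LAND_AREA * land_fraction                # ~11,500 km^2

print(f"land fraction: {land_fraction:.0%}")
print(f"PV area needed: {pv_area:,.0f} km^2")
# Handy conversions: 1 kW/m^2 == 10 MW/hectare == 1 GW/km^2
```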
The challenges for solar are not just cost (of which installation and infrastructure costs are a significant component), but substrate abundance and, unmentioned by this article, storage.
This is why I never understood the argument "but solar is so expensive right now", when it was still very early days for solar investment. If you start pouring the billions of dollars nuclear gets into solar panel research, and bring it to a high enough scale, you eventually start getting not just a real alternative, but potentially a much better alternative to any other energy source.
Once we "fix" the cost of solar panels, we then need to figure out how to store solar energy cheaply and easily; then it can be a source of energy that isn't limited to daytime and sunny seasons.