Yes, using obsolete data from when tech was bleeding edge in 1999 to imply things about cost going forward is misleading.
On the other hand using improvement over the same timeline, drawing a line on a graph and saying "look how X Y is gonna be in Z years" is the same exact type of stupid but pointed in a different direction.
In 1991 lithium was a highly immature technology and it would take about a decade to make it into fragile electronics. It took another decade to make it into power tools. Now it's viable in high-end commuter vehicles. If it were easy to predict the future a decade out with any reliability, we wouldn't be having this discussion.
These are not the same kind of stupid. One makes the assumption that costs will always be the same, and the other makes the assumption that cost decreases are linear, or predictable. The former is much stupider.
They are exactly the same kind of error in that both assume stability over time. One assumes prices are stable; the other assumes the rate of change is stable.
To be honest this whole debate is a bit academic. In one situation someone is saying "this thing costs a lot of money and so is a toy for the rich". At the time of the statement it's probably true! You might get into an error by saying "it will always be like that because it's expensive now".
But the issue is that, _even at the time of the statement_, the price has been decreasing over time! It's falsifiable at the time of the statement! You don't need to see the future to dismantle that argument.
Inversely, costs have gone down over time for a long time. You could make the inference that there's a floor, of course, and it's reasonable to do so! But it's hard to disprove the claim that prices will keep on going down.
The former is just on its face wrong based on the current facts, the latter is a judgement call about the future. Totally different beasts, and driven from different things.
Both are assumptions about the future. Neither is falsifiable at the moment of speaking. What would be falsifiable is a statement about the historic rate of change.
I agree that it's generally more likely that a 15-year trend will continue than change. If we're talking about a year, that is. But 5 years? 15 years? 100 years? 1000 years? At some point, the general assumption changes.
But without trend data, I also agree that assuming price stability is a better general assumption than assuming a major price drop. Historically, very few things keep getting cheaper. It requires a) a large society willing to keep making R&D investments, and b) a technological domain with a lot of possible ways to keep lowering costs.
And what I mostly agree with is the proverb, "It is difficult to make predictions, especially about the future."
They'll never be free. They'll never be cheaper than the raw materials that go into them, or the copper to wire them together and into the car.
At some point, there's going to be a price floor that the research, materials supply and competition simply won't break through.
Guessing when that is going to happen is more luck than anything. I do not see it continuing to get exponentially cheaper for long, though.
LiFePO4 has taken most of the precious metals out of the equation, and the demand for electric cars will continue to compete against the growth in demand for battery storage for renewables. For an analogy, lumber prices have shot through the roof over the past few years where I live due to construction booms. Nothing about the technology has changed, and supply hasn't fluctuated greatly. These same pressures are going to be pushing against lithium batteries getting exponentially cheaper over the next few years. I don't doubt that they will find room to bring prices down, but there is a floor out there somewhere close by.
I wouldn’t anticipate suddenly hitting a price floor, but instead the rate of reduction tapering off, which we’re not seeing yet in a clear way. There’s still a long way to go before we hit the limits from resource costs. And the current rate doesn’t need to continue for long before we start hitting price parity. In some cases we’re already there.
There's a floor but it's not just price of raw materials. It includes performance of same raw materials just used better. So a battery today with X amount of raw materials puts out Y power. A battery in 10 years with the same amount of X raw materials puts out Y^4 power. At least according to E=MC^2 it's a long way before we reach the floor.
Well, the theoretical floor is much further away if one goes beyond lithium. There are various prototypes that oxidize aluminum, potentially reaching energy densities greater than gasoline. Now, those are currently only reversible in the sense of using an aluminum smelter to restore the metal from its oxidized form. Still, we don't know that a reversible process is impossible in a compact device, and the supply of aluminum is vast.
The paper says "We estimate that between 1992 and 2016, real price per energy capacity declined 13% per year". Where do you get your data for the last 5 years?
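For a sense of scale, here's what that rate compounds to over the study period (a minimal sketch; the 13%/year figure is the paper's, the rest is just arithmetic):

    # Compound a 13%/year real price decline over the 1992-2016 study period.
    rate = 0.13                       # annual decline, from the quoted paper
    years = 2016 - 1992               # 24 years
    remaining = (1 - rate) ** years   # fraction of the 1992 price left by 2016
    print(f"Price remaining after {years} years: {remaining:.1%}")
    # -> roughly 3.5% of the 1992 price, i.e. a ~96% overall decline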
Regardless, 1 year isn't the correct duration for a bet. People own a car for ~6 years on average, and the average vehicle lifespan is something like 12 years.
But depending on terms, I might take a year-over-year bet for battery prices. Demand is high and the pandemic has caused significant supply chain problems. They could well have gone up this year. And indeed, a quick look at news reports suggests key components, including lithium and cobalt, are surging in price. Fine examples of why assuming a historical average has future meaning can get you into trouble.
One kWh of batteries generally requires around 200g of lithium. Both by weight and by price it's a crucial but small component, so unless it suffers a 10x hike, its price isn't relevant (rough numbers sketched below).
Meanwhile LiFePO4 batteries, which are currently the most popular chemistry(at least in China), contain no cobalt whatsoever.
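To make the "small but crucial" lithium point concrete, here's a minimal sketch; the 200 g/kWh figure is from the comment above, while the lithium and pack prices are purely illustrative placeholders, not quoted market data:

    # Rough share of lithium in the cost of a battery pack (illustrative numbers only).
    lithium_per_kwh_kg = 0.2        # ~200 g of lithium per kWh (from the comment above)
    lithium_price_per_kg = 70.0     # hypothetical USD/kg, adjust to taste
    pack_price_per_kwh = 130.0      # hypothetical pack price, USD/kWh

    lithium_cost_per_kwh = lithium_per_kwh_kg * lithium_price_per_kg
    share = lithium_cost_per_kwh / pack_price_per_kwh
    print(f"Lithium: ${lithium_cost_per_kwh:.0f}/kWh, roughly {share:.0%} of pack cost")
    # Even a large swing in the lithium price moves the pack price only modestly.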
Other analysts differ on whether these things will impact the retail price. But you make my point for me: this is an extremely complex problem, and making any simple assumption about future price is a mistake.
If you're coming up to the end of a logistic ("S") curve, then assuming linear growth (or worse, a fixed rate of increase each year, i.e. exponential growth) is a much worse assumption than assuming zero change, if you extrapolate too far.
A first order approximation is always less accurate than a zeroth order approximation for bounded functions, as the first order approximation will have unbounded error whereas the zeroth order approximation will not -- unless you are in the degenerate case of the first order itself being exactly zero. The first order approximation is infinitely worse. Most (all?) things in the world are bounded. Hence if you must choose between just a first and zero-th order approximation, the zero-th order is the way to go for long run predictions. Cue the XKCD comic about the expected number of weddings.
On the other hand, if you are not interested in making long run predictions but only short run predictions, then first order approximations will tend to be more accurate in a small region around the base, but that region might be quite small.
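As a toy illustration of that trade-off, here's a minimal sketch comparing a constant ("zeroth order") guess with a linear ("first order") guess on a logistic curve; all parameters are invented for illustration:

    import math

    def logistic(t, ceiling=100.0, midpoint=10.0, rate=0.5):
        # An S-curve that saturates at `ceiling`.
        return ceiling / (1.0 + math.exp(-rate * (t - midpoint)))

    t0 = 8.0                                  # where we stand when extrapolating
    f0 = logistic(t0)
    slope = (logistic(t0 + 0.1) - f0) / 0.1   # local first-order slope

    for horizon in (1, 5, 20, 100):
        truth = logistic(t0 + horizon)
        zeroth = f0                           # "nothing will change"
        first = f0 + slope * horizon          # "the current trend continues"
        print(horizon, round(truth, 1), round(zeroth, 1), round(first, 1))
    # Short horizons favour the linear guess; long horizons favour the constant one,
    # because the linear guess blows past the ceiling while the truth saturates.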
It's about mature and immature technologies. 20 years ago, photovoltaic cells had efficiency ranges in the single digits, but rising. When efficiency rises from 5% to 10%, it makes sense to assume that it keeps rising. But electric heaters had efficiency ranges around 95%, even when my parents were kids. Now they have even better efficiency - 99.8% in my newest apartment - which is only 5% better than 50 years ago - because there is a physical limit.
Technology follows an S curve. First it increases slowly, then faster, then more slowly again. It's silly to assume mature technologies will keep getting better at the same rate and silly to assume immature technologies won't get better. Without specifying the technology, one assumption isn't really sillier than the other - physical limits are unintuitive.
Any electric heater, even those decades ago, was 100% efficient. They turn electricity watts into heat watts. Where is the energy loss? Light? That also becomes heat. Air movement? Also heat. Loss due to heat allong the cord to the heater? Thats heat too. Unless they are emitting large numbers of neutrinos, all electric heaters are simply resistors that perfectly turn electricity into heat.
Put nearly any electrical device in a box, anything from a television to a cement mixer, and it will raise the temperature of the air in that box by exactly the same amount as the watts it draws from the power source. A 500w television puts out exactly as much heat as a 500w heater.
You can't have a wind turbine which extracts 100% of the kinetic energy from wind, because after it there would be a wall of unmoving air, and the incoming air wouldn't be able to get through the turbine.
Analogously, if electricity is carried with the flow of charge - electrons - around a circuit[1], when you extract 100% of the energy as heat, the electrons stop moving and build up in the heater. So you can take the rest of the wiring away because it's doing nothing and save 50% of your costs. Then, a buildup of charge makes a voltage, and a voltage potential difference can drive a current. Therefore you can get 100% of the power out as heat, save half your money on wiring, and use the growing potential difference to power something else. Electricity makes no sense whatsoever.
OK, so that's troll-physics nonsense, but does extracting all the "energy" stop the electrons moving? If not, why not, what energy isn't being extracted? If so, why doesn't that stop current flowing - isn't "free electrons" part of what makes something a conductor of electricity?
[1] though the energy is carried in the e/m field around the surface, somehow
On the other hand, a heat pump bumps the efficiency of your electric heater to like 300%, because it works around the physical constraint. My understanding is that air-source heat pumps are continuing to noticeably improve decade-over-decade.
The commenter was pointing out the nuance between the two; it's obviously about confidence in an assertion. You just re-reduced it to what was already obvious?
Investing in government bonds vs investing in penny stocks: both are the same thing, as they are both an excess of confidence in predicting the future.
Except when you consider where each technology fits within its own S-curve of adoption (X axis over time, Y axis is % of the technology adopted by the market).
When factoring in the shape of the exponential decreases in costs, and that penetration of most of these technologies is at or before the inflection point (between 5%-15% market penetration), it is more likely that the cost declines will ACCELERATE moving forward rather than slow down.
Why has it felt that laptops and PCs haven’t progressed as much in the 2010s as in the 1990s or 2000s? Because in 1995, there was not a computer on every desk in every home. But now not only is the market saturated with laptops and PCs, people are walking around with mini internet-connected “supercomputers” everywhere they go.
> Except when you consider where each technology fits within its own S-curve of adoption (X axis over time, Y axis is % of the technology adopted by the market).
Unfortunately, even a very small amount of noise in the data makes it basically impossible to know where you are in an S-curve.
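To illustrate, here's a minimal sketch: it fits a logistic to the noisy early portion of a synthetic adoption curve and shows how much the fitted ceiling swings with the noise seed (all parameters invented for illustration):

    import numpy as np
    from scipy.optimize import curve_fit

    def logistic(t, ceiling, midpoint, rate):
        return ceiling / (1.0 + np.exp(-rate * (t - midpoint)))

    true_params = (100.0, 10.0, 0.5)      # ceiling, midpoint, growth rate
    t = np.linspace(0, 6, 30)             # we only observe the early ramp

    for seed in range(5):
        rng = np.random.default_rng(seed)
        y = logistic(t, *true_params) + rng.normal(0.0, 1.0, t.size)  # a little noise
        try:
            popt, _ = curve_fit(logistic, t, y, p0=(50.0, 5.0, 1.0), maxfev=10000)
            print(f"seed {seed}: fitted ceiling = {popt[0]:.0f} (true value: 100)")
        except RuntimeError:
            print(f"seed {seed}: fit did not converge")
    # With data only from the early ramp, the inferred ceiling is all over the place.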
Much safer to make predictions based on the far more limited good news that PV+battery is already cheaper than coal for electricity or ICE for cars.
Hmm… question for anyone who knows: with current tech, how much would it cost to develop a significant PV-powered electrolysis-and-Sabatier-process plant in any of the big coastal deserts, for exporting methane?
> how much would it cost to develop significant PV-powered electrolysis-and-Sabatier-process plant in any of the big coastal deserts, for exporting methane?
What are you thinking about as the carbon source? If coal, then this has been commercially viable for decades. In North Dakota there is a 1.5 gigawatt installation running since 1984. That one uses electricity from coal power IIUC, but today PV is cheaper than coal for electricity.
If you are talking about CO2 from direct air capture, the optimistic cost estimates of your CO2 feedstock are around $600/tonne. 1 tonne of CO2 gives ~364 kg of methane at 100% reaction yield, due to the molar mass ratio of CH4 to CO2 (16:44).
So per tonne of methane produced, the CO2 cost alone is above $1,600. For comparison a tonne of natural gas in the US today costs between $500 and $1000 for the end user.
This means that CO2 capture from air needs to become several times cheaper than today before this scheme works out, and that's before paying for the electrolytic hydrogen.
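The arithmetic, as a minimal sketch (the $600/tonne DAC estimate is from the comment above; the rest is just molar masses):

    # Sabatier reaction: CO2 + 4 H2 -> CH4 + 2 H2O
    M_CO2, M_CH4 = 44.0, 16.0                    # g/mol
    co2_cost_per_tonne = 600.0                   # optimistic direct-air-capture estimate

    ch4_per_tonne_co2 = 1000.0 * M_CH4 / M_CO2   # kg of CH4 per tonne of CO2 at 100% yield
    co2_cost_per_tonne_ch4 = co2_cost_per_tonne * 1000.0 / ch4_per_tonne_co2

    print(f"{ch4_per_tonne_co2:.0f} kg CH4 per tonne of CO2")
    print(f"CO2 feedstock cost: ${co2_cost_per_tonne_ch4:.0f} per tonne of CH4")
    # ~364 kg of CH4 per tonne of CO2, so ~$1,650 of CO2 per tonne of methane,
    # versus roughly $500-1,000 per tonne of natural gas for the end user.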
I would say hydrogen electrolysis and then liquefaction for large scale distribution/export is way more realistic. This is what the EU seems to be going for together with Northern Africa.
To add to that: I'm not convinced that exporting even renewably-sourced methane is particularly renewable - methane has fugitive emissions when piped etc that are far worse pound-for-pound than CO2.
> Unfortunately, even a very small amount of noise in the data makes it basically impossible to know where you are in an S-curve.
While true, my point is that when you combine the fact that we are pre-inflection point with the fact that the economics now stand on their own (renewables, electric vehicle TCO, and various energy storage applications are already cheapest, competitive, or very close to it), it is not unreasonable, when mapping out the 5-15 year future, to bet on an acceleration of cost declines over a deceleration. Particularly because the actual driver of unit cost declines (Wright's Law / Moore's Law) is the doubling of cumulative units manufactured and put through the system, and all the factories being ramped up and planned point in the positive direction, in my view.
Regarding your PV-powered electrolysis-Sabatier (electrofuel) methane, I think there are two important considerations. First, for methane (or other e-fuels like hydrogen or longer-chain hydrocarbons) to be made economically, the capital cost of the equipment needs to be utilized as close to 100% of the time as possible. We already know that PV excess will be centered around the daytime peak (5-7 hours per day), meaning there would also need to be plenty of excess wind to balance this out to get anywhere near 100% utilization of the excess energy. Second, until electricity grids get sufficiently saturated with renewables, most e-fuel applications will remain uncompetitive, particularly since energy storage applications (possibly run off an e-fuel) are likely to become economical before there is any opportunity to export excess e-fuels. That's more at the early end of the S-curve, as far as I can tell.
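To see why utilization matters so much here, a minimal sketch of how the capital charge per unit of e-fuel scales with equipment utilization; the cost figures are invented placeholders, the inverse proportionality is the point:

    # Capital charge per MWh of e-fuel as a function of equipment utilization.
    annualized_capex = 1_000_000.0         # hypothetical $/year for the electrolyser + Sabatier plant
    rated_output_mwh_per_year = 50_000.0   # hypothetical output at 100% utilization

    for utilization in (1.00, 0.50, 0.25):   # ~24 h/day vs ~12 h vs ~6 h of cheap power
        output = rated_output_mwh_per_year * utilization
        print(f"{utilization:.0%} utilization -> ${annualized_capex / output:,.0f}/MWh capital charge")
    # Halving utilization doubles the capital charge per unit of fuel, which is why
    # daytime-only PV surplus is a tough fit for e-fuel plants.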
Build enough PV to generate 24 hours worth of power for the reactor in daylight hours, and store the excess in batteries to power the reactor overnight.
The main argument for hydrogen AFAICT is that it can be exported overseas or stored for inter-seasonal use.
Batteries can't do that, as 1) when compared to literal rocket fuel, they're impractically heavy to put on a cargo ship for bulk transport, and 2) batteries trickle-discharge so after a month or two the battery will be flat.
In fairness, pure hydrogen is also leaky and hard to work with. That’s why I was asking about the economics of turning it into methane… and yet, one of the other replies I got pointed out that methane is also a bit leaky, so we might want to reform it (or whatever the word for “opposite of cracking” is) all the way up to a room temperature liquid.
For example, there isn't agreement on where hydrogen as an automotive fuel sits on the S-curve. Or whether it has a future at all. Same with other alternative technologies. The S-curve is only a hindsight device.
It actually is pretty easy to predict the future, in a limited sense, a decade out: the cost of new technologies generally follows an exponentially declining “learning” curve. This has been extremely well studied in the case of the aircraft and semiconductor industries, and is the subject of hundreds/thousands of articles. Of course, some conditions must be met—hence the articles—like plentiful inputs, lack of monopoly power, and government (dis)incentives, but we’d see evidence of these constraining battery tech by now.
The input variable is the cumulative number of units, and of course we can’t be exact about the trajectory of that number, but we can infer from X MWh manufactured -> $/MWh.
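A minimal sketch of that mapping, assuming an illustrative learning rate (the 20%-per-doubling number is a placeholder, not a measured figure for batteries):

    import math

    def wrights_law(cumulative_mwh, base_mwh, base_cost, learning_rate=0.20):
        # Each doubling of cumulative production cuts the unit cost by `learning_rate`.
        b = -math.log2(1.0 - learning_rate)
        return base_cost * (cumulative_mwh / base_mwh) ** (-b)

    base_mwh, base_cost = 1_000.0, 1_000.0   # illustrative starting point, arbitrary units
    for factor in (1, 2, 4, 8, 16, 32):
        c = wrights_law(base_mwh * factor, base_mwh, base_cost)
        print(f"{factor:>3}x cumulative production -> unit cost {c:,.0f}")
    # The input is cumulative units manufactured, not calendar time; the calendar-time
    # trajectory depends on how fast production actually ramps.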
Sure. But there’s been questions about the viability of the next node for at least twenty years. If you played it safe and stuck with the current node, you’d have been wrong & at a process disadvantage 14 times out of 14.
No improvement trend goes on forever. But why is this the moment lithium ion hits the wall? It’s like trying to call the end of a bull market.
2nm is just a commercial name. The real features are much bigger, see, for example, https://en.m.wikipedia.org/wiki/10_nm_process. So we are quite far from 10 atoms per transistor.
The biggest problem is lithography. ASML managed, unexpectedly for many in the industry including Intel, to solve the technological problems with extreme ultraviolet sources, but shrinking transistors significantly further requires soft X-rays, and the prospects for that in mass lithography are much more uncertain now than EUV's were ten years ago.
And Jim Keller is not a fab / process guy. And if you actually listen to what he said, he is predicting plenty of room to improve, not that it will improve every 24 months, which arguably has already stopped happening since ~2017.
It has stopped, though. Single-threaded performance, either in absolute value, or per dollar has not been following Moore's law for more than a decade.
And outside of data centers, single-thread performance is still king.
Performance per watt has improved, but that's not a metric the typical end user cares much about.
Single-thread performance is not king though. Software nowadays is extensively multi-threaded. If you're still writing single-threaded code for a desktop program, people are going to look at you like you have antennae growing out of your head.
And performance per watt absolutely is important. People care a lot about battery life of their laptops and phones and that's strongly dependent on performance per watt.
We'll get to 256-core server chips and that will be the end of the line. You are forgetting that the advertised "nm" number is about what size a theoretical planar transistor would need to be to be equivalent to the advertised process.
Since that number is purely theoretical, we can construct a theoretical scenario in which its theoretical nature would become absolutely obvious. Take a 7nm process node transistor and stack it 100 times. Such a process would be called a 0.07nm process.
I think it boils down to we will hit a wall, but we don't know exactly when we'll hit it. (and how hard, chances are that higher-hanging-fruit refinements will make the transition to stagnancy so gradual that we may not notice at all)
A jump from "all past predictions failed" to "and so will all future predictions" seems rather bold to me. In the end it's like a somewhat upended variation of the "x decades to practical fusion" thing where we all hope that the old joke that x might be a natural constant is eventually proven wrong.
Well, the main direction of refinements at this point seem to be around composable/heterogeneous computing where we basically have a lot of hardware optimized for specific workloads and throw the complexity at the software people. i.e. now deal with GPUs, DPUs, FPGAs, xCPUs, etc. instead of (largely) just a standardized set of CPU instructions.
Except we don't. Where's my single-core 100GHz processor?
Improvements still happen, but not always in the same way. If you implemented an application in 2003 assuming we'd have such a processor, you'd be very disappointed. Counting on exponential improvements to continue is a risky bet.
Scaling may take us to some strange places, but it's worth noting that an Apple M1 chip has >100x the transistors of a 2004-era Pentium 4, and achieves ~500x the FLOPs at a fraction of the power draw.
Yes this isn't single-threaded performance, but I think we should keep in mind that exponential improvement in price/performance over many decades is possible, if never certain.