> because building more power lines is relatively easy on the scale of climate tech we need to kick all carbon emissions.
Then why have the rates changed so much recently? More importantly, if EVs are going to be the thing, then home solar should be the way they get the majority of their power. Why even build the lines? Isn't that just a subsidy?
> Let's brainstorm how to decarbonize fertilizer, or concrete.
I don't think you can. I think you should worry more about how concrete and fertilizer get _distributed_. This is essentially the same dynamic as the home solar problem above.
> on the scale of the entire country they're fairly reliable and predictable.
That's due to the way the grid itself is structured, not how any one power source performs. No source of power is particularly reliable, and unexpected maintenance intervals always occur. Point here being: if you try to switch from a grid that's based on a mix of sources to a grid that isn't, you're probably going to end up with a surprising result or two during that misguided process.
>> Let's brainstorm how to decarbonize fertilizer, or concrete.
> I don't think you can. I think you should worry more about how concrete and fertilizer get _distributed_. This is essentially the same dynamic as the home solar problem above.
Isn't the primary source of CO2 from fertilizer production a byproduct of producing hydrogen gas via steam methane reforming?
We can make hydrogen without starting from methane (namely: via electrolysis), but it's not economical in comparison, at this time. (Or clearly able to scale to quite the same degree, for that matter.) But I reject the claim that it's not possible (or, for that matter, that we don't know how to do it). The issue is that the negative externalities from CO2 emissions are not priced in such a way as to render existing processes less cost-effective than carbon-free alternatives.
That said, I share some of your skepticism regarding how much we can conceivably decarbonize concrete production.
Aren't those cost factors based upon the type of load curves we currently see? Isn't there some reason to suspect that the efficiency rating will drop if we experience much greater offsets between time of generation and time of demand with the types of peaks that EV charging might bring? Wouldn't it be nice to have all this without having to engage with the daunting prospect that is the "smart grid?"
There's a common misconception that treats electricity like a fluid or a supply chain: if you generate more of it locally, it reduces the load on the grid and you need to spend proportionally less on distribution. This is false: the actual electrons in AC move a small fraction of a millimeter, electrical potential travels at effectively the speed of light, grid connectivity is all-or-nothing, and if you need the grid at all, you need the full expense of building and maintaining that segment of grid. You probably need some form of grid if only to even out load spikes (running your clothes dryer often takes 20x the power of all the lights and electronics in your house, but not everybody runs their clothes dryer at once) and manage seasonal variations (solar output, particularly in northern latitudes, can be 5x higher in summer than winter, which is not a problem when you're powering southerly residents' air conditioning but is when your cold house doesn't need A/C).
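The "20x" dryer claim above is easy to sanity-check. A minimal sketch, with both wattage figures being illustrative assumptions (not data from the thread), showing why peak draw rather than average use sizes a grid connection:

```python
# Why peak draw, not average use, sizes the household grid connection.
# Both numbers are illustrative assumptions, not measurements.
baseline_w = 250      # lights + electronics running continuously (assumed)
dryer_w = 5_000       # typical electric clothes dryer element (assumed)

# The connection must carry the peak, even though it occurs briefly.
print(f"Dryer vs. baseline: {dryer_w / baseline_w:.0f}x")  # prints "Dryer vs. baseline: 20x"
```

With those assumed numbers the ratio lands on the 20x figure in the comment; real households vary, but the shape of the argument is the same.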
The load curve over time only matters to the extent that you can entirely remove remote consumption. You can use batteries to smooth out night and day. You can reduce the use of batteries by sponsoring V2H EVs and workplace charging, so that you charge your EV when solar is abundant in the day, and then drive it home to power the rest of your house. But this does nothing for summer vs. winter, it does nothing for wanting to run a clothes dryer or space heater (many of which actually exceed the max power draw of a whole-home battery), it does nothing for wanting to charge your EV up to full before a long road trip.
I am in favor of microgrids, but this is more a statement that we should rationalize our distribution infrastructure than that we should get rid of the grid entirely. When power plants were large centralized industrial buildings that needed a steady supply of fossil fuels delivered by road or rail, it made sense to just build a few of them and then have a huge grid that distributed the electricity everywhere. When you can put solar on every rooftop, it might make more sense to have the smaller remote communities all invest in rooftop or community solar, wire them up in a microgrid of ~1000 homes, put in a big utility-scale battery, but otherwise disconnect them from the main grid so that power lines don't go through tinder-dry forests. And then the big cities draw from big utility-scale solar and wind farms in the desert, connected by conventional power lines along major transportation arteries. But there's still some grid there; it's just a smaller, cheaper grid where you make the connections that are easy to maintain and distribute generation to the remote communities that can run their own self-sufficient grid.
> Aren't those cost factors based upon the type of load curves we currently see?
No, that would affect the price the electricity would fetch, not the cost to buy panels and put them on the grid. Home rooftop panels are so much more expensive because they lack economies of scale.
And it still costs well below what my utility charges me. If the real cost of a thing is supposed to weigh into my incentives, I need to be able to buy it for that price.
The 30% federal tax credit is not a good deal for taxpayers. You probably also benefit from net metering and the utility probably doesn't recover infrastructure costs from you due to your reduced usage. That's a bad deal for ratepayers.
Home solar/battery would provide a level of independence for each home and would lessen the load on the grid for air conditioning alone (almost 20% of grid utilization).
At 3x the price it's not a good deal for taxpayers or ratepayers. If homeowners want to do it for independence, that's fine, but the cost should be on them. I'm fine with giving them a credit for reduced GHG emissions.
Sure, let them pay for it. Perhaps a low-interest loan to help incentivize it, but theoretically it pays off in 6-9 years, and then there's free power and less grid load, and the loan is paid back.
Even if you have rooftop solar, you still need a grid capable of supplying 100% of the power, because there are cloudy days, and long sequences of cloudy days.
Yes, but EVs have batteries and people don't drive them to depletion every single day. I should have been clearer: I didn't mean the whole house, I meant just the EVs specifically, for now. It would completely alleviate their impact on the grid as a power consumer.
Even then, there are huge efficiencies of scale favoring industrial solar over rooftop.
The cost per kWh is at least 10x lower, and getting better.
This is more than enough to counter the distribution costs.
The same is true for industrial storage.
All told, the only upside to rooftop is avoiding grid operators, which will just raise their prices to counteract any savings on the part of homeowners. Everyone is still stuck with them unless they switch to municipal operators.
The 800,000 American homes that added solar to their roofs last year cover 100% of the electricity used by every EV that's ever been sold in the US. They cover the electricity used by the EVs purchased last year several times over. At this rate, you can do nothing and residential solar will already add much more capacity to the grid than EVs are taking from it.
Potentially, from 2035 only electric vehicles would be for sale on new car lots. Most gas cars already on the road will still be there for 10-20 years after that.
By then, Edison Electric Institute (a trade-group for utility companies) predicts 70-80 million EVs on the road in the US.
By 2030, 15% of US homes are forecast to have solar on the roof, which would continue covering 100% of the electric use of the nation's electric vehicles.
The average residential solar installation generates enough energy to cover a 14,000-mile-per-year vehicle's charging 3.5-4.5 times over. Each house with solar panels generates enough energy for its cars and some of the neighbors' cars that don't have solar.
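The "covers charging several times over" claim above can be checked with rough arithmetic. All inputs below are illustrative assumptions except the 14,000 miles/year figure, which comes from the comment; actual system sizes, regional yields, and EV efficiencies vary widely:

```python
# Sanity check: does an average rooftop system cover an EV's annual charging
# several times over? All inputs are assumptions except miles_per_year.
system_kw = 10.0            # installed capacity, kW (assumed)
yield_kwh_per_kw = 1_400    # annual yield per installed kW; region-dependent (assumed)
miles_per_year = 14_000     # figure from the comment above
ev_mi_per_kwh = 4.0         # EV efficiency, miles per kWh (assumed)

solar_kwh = system_kw * yield_kwh_per_kw    # 14,000 kWh/yr generated
ev_kwh = miles_per_year / ev_mi_per_kwh     # 3,500 kWh/yr consumed by the EV

print(f"Coverage ratio: {solar_kwh / ev_kwh:.1f}x")  # prints "Coverage ratio: 4.0x"
```

Under these assumptions the ratio falls inside the 3.5-4.5x range claimed; a smaller system, a cloudier region, or a less efficient EV pulls it lower.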
I put solar on my roof two years ago. It's the average system size, taking up 2/3rds of the south-facing side of my roof. It cost 1/3rd the price of my car to get installed, it completely covers my fuel use for two cars, and it covers 100% of my home electric and heating bill 9 months out of the year.
EV electric use isn't a problem utilities need to solve so much as a solution to a lot of utilities' problems. 70 million EVs are many gigawatt-hours of battery storage that will be connected to the grid bidirectionally in the not-distant future. They can store renewable energy during the day and feed it back to the grid at night, they can power houses and businesses during peak load events so peaker plants don't need to be spun up, and lots of other things that will make the grid more resilient and cheaper to operate without significant capital expense to the utilities.