Hacker News

"Let's say tomorrow some grad student gets fusion going at a very low price. The best way to use this to power cars would be to use it to create a fuel with a high energy density."

That is incorrect. If the fusion source is compact, and can produce a lot of instantaneous electrical power, and is quick to throttle up and down, then a "Back to the Future"-style Mr. Fusion would be the best way to power cars.

Energy density isn't the only factor. There's also the question of infrastructure. Electrical distribution has few moving parts, while extracting CO2 from the atmosphere to make fuel, and then distributing that fuel, involves many more mechanical parts. A fusion plant which could produce 50 kW, weighed 2 tons, and could be installed in the back yard of a home would mean that a house could be off the grid and still have power left over to charge the car, while using intermediate hydrocarbon storage would mean trips to fill the car, or deliveries of heating oil or cooking gas.

So while I completely agree that airplanes will not be powered by batteries, I don't think that energy density is the only factor to consider in the economics equation.




> That is incorrect. If the fusion source is compact, and can produce a lot of instantaneous electrical power, and is quick to throttle up and down, then a "Back to the Future"-style Mr. Fusion would be the best way to power cars.

There's also safety to consider.


On top of that, reliability. A large plant with high mechanical complexity can justify dedicated maintenance workers to manage that complexity and the effects of part wear, and so is likely to achieve the same or higher reliability than a backyard unit, in TCO terms anyway.


So I think you both agree with me that it's not necessarily the case that a hypothetical fusion power source or free energy source is best used to produce hydrocarbons for distribution and downstream use.

Regarding the comment of sophacles, power distribution is part of the economic factor. It may be that the central plant is much more reliable than a backyard plant, but the power grid - subject to thunderstorms, ice, tree falls, backhoes, curious squirrels, and so on - makes the overall power supply system less reliable than a backyard fusion plant.


No. I will agree that a distributed power system may provide better overall reliability, but only if this condition holds:

There is still a grid. If my power source goes out, I want backup to come from other nearby sources. The current delivery system restores power on the order of minutes or hours for over 80% of outages, and within a couple of weeks for over 99% of the rest. If my backyard plant breaks, I need restoration numbers that match that. (Additionally, I need plant repair bills to be lower than however much money having the backyard plant would save me. TCO considerations again.)

Further, these two assumptions are built into your "better" assessment:

* It is cheaper to have a power plant in my back yard than buying it from the grid.

* The backyard source can be made safe.

Combining these two assumptions is a big deal. If both are true, I will agree that it is a good option (with the caveat listed above). However, there is a HUGE amount of R&D to get there, including a massive set of efficient production runs for the parts needed to build all these systems. The economics of this point to it being unlikely that everyone has a backyard fusion plant.

It is far more likely that we'll see big fusion plants in greater numbers scattered around the power grid to provide higher reliability in cases of line loss etc. Further, with energy now much, much cheaper to produce, you'll likely start seeing more reliable distribution channels for electrical power. Overhead lines could reasonably be replaced with underground ones, which are less efficient but more reliable, since they are less likely to be damaged in weather events. You'll also probably see a reduction in star-topology distribution - more redundancy in distribution paths, at the cost of some efficiency, because the complex equipment will be cheaper to manufacture (since energy will no longer factor into its cost).


The original premise was already unrealistic. I made it even more unrealistic. If there is a "Mr. Fusion" device which can produce 1.21 GW, on-demand, safely, and is small and light enough to fit in your car, then there's no need for a grid. You would just have several of those devices in your house.

My hypothetical was to show that there could be cases where it does not make sense to use a Mr. Fusion type device to produce hydrocarbon fuel which is then used as the energy source. Everything I said takes place in the original fantasy world. Under the original premise -- "some grad student gets fusion going at a very low price" -- it must be using some principle we haven't yet thought of. And with that premise in place, almost anything goes.

Once I put realism into place, the original hypothetical is not sustainable. The long-term solutions for real life are decreased energy use, fission, hypothetical fusion, and renewables. None of the last three can exist without a grid, at least for most people. The only way to be without a grid is greatly reduced power use, a less concentrated population, and a switch to local renewable resources. That isn't going to happen.


Several strings of solar panels + lithium ion batteries might very well be cheaper than paying your local electric monopoly for transmission line capacity in 20 years. And if the non-redundant parts (the inverter, perhaps) fail, it might not be all that different from your water heater failing today.

(Although if we had that technology in cheap enough form, some of the major loads in your house may switch to DC to avoid conversion losses to AC, since solar panels and batteries are both inherently DC technologies, and that might make the inverter less important.)


There will still be a grid. The scenario you mention will be useful for some, but assuming $80/month for electricity over a paid-for grid vs. $10,000 for installation of cheap solar panels + batteries gives a pay-off time of about 10 years. (Currently solar water heaters cost about $5,000, so the best case is about 5 years.)
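The pay-off arithmetic above can be sketched directly (using the $80/month and $10,000 figures from this thread; maintenance, financing, and battery replacement are ignored):

```python
# Rough payback sketch. Figures are the thread's assumptions, not real quotes.
install_cost = 10_000   # dollars: assumed cost of cheap solar panels + batteries
monthly_bill = 80       # dollars/month: assumed grid electricity bill avoided

payback_months = install_cost / monthly_bill
payback_years = payback_months / 12
print(f"payback: {payback_years:.1f} years")  # roughly 10 years
```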

I don't think most will be willing to take that capital investment.

It would be interesting to see how distributed solar compares to grid-based distribution in the face of large disasters like a hurricane or ice storm. Especially if the power lines were underground. I assume that those with damaged panels would quickly look for replacements, causing an instant demand and price spike. While the large electricity companies would have stockpiled reserves and have agreements already in place to handle the short-term demand. I don't know how this would affect the overall long-term costs.


IIRC, solar prices have been dropping at a rate of 7% per year. If that keeps up, and if the grid price remains constant, solar will eventually become cheaper than grid.
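As a sketch of that compounding, assuming (hypothetically) that solar currently costs twice the grid price and the 7%/year decline holds:

```python
import math

# Hypothetical starting point: solar costs 2x grid price today.
# The 7%/year decline rate is from the comment above; grid price held constant.
ratio = 2.0      # assumed solar/grid cost ratio today
decline = 0.07   # fractional price drop per year

# Solve ratio * (1 - decline)**t = 1 for t:
years = math.log(ratio) / -math.log(1 - decline)
print(f"crossover in about {years:.0f} years")
```

A different starting ratio shifts the crossover date, but the compounding means even a large gap closes within a decade or two.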

In 1992, would you have told me today's smartphones would be impossible?

If you only lose a few solar panels in a storm, and most of the panels on your roof survive, you may just use a bit less electricity for a while.


I don't think you necessarily need Li-ion for that - in my experience lead acid works well enough in the domestic case, is a lot cheaper and probably safer (although hydrogen venting is perhaps an issue).

I built a little mixed DC (lighting) + AC (1kW inverter) system out in the garden (far enough from the flat that running a cable would be a nightmare of planning permissions & digging trenches).

Since it's for intermittent use, the panel is tiny (50W Kyocera) compared to the battery (110 Ah 'leisure') & inverter (1kW true sine), and it works just fine even here at 53 deg latitude. It would scale up for the entire house quite well, were it not for the fact that it's a four-in-a-block with a communal roof, and half the loft space is unusable due to a loft conversion.
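The sizing can be sanity-checked with rough numbers (the 12 V nominal battery voltage and the winter sun-hours figure are my assumptions, not from the comment):

```python
# Back-of-envelope check on the panel/battery sizing described above.
panel_w = 50        # W: the Kyocera panel from the comment
battery_ah = 110    # Ah: the 'leisure' battery from the comment
battery_v = 12      # V: assumed nominal lead-acid voltage

battery_kwh = battery_ah * battery_v / 1000   # energy stored when full

# Assume ~2 equivalent full-sun hours/day in a northern winter (rough guess):
daily_harvest_kwh = panel_w * 2 / 1000
days_to_fill = battery_kwh / daily_harvest_kwh
print(f"{battery_kwh:.2f} kWh battery, ~{days_to_fill:.0f} winter days to refill from empty")
```

Which is why this works for intermittent loads but would need a much bigger panel for a whole house.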


There are some interesting questions about economies of scale, etc, though.

If we figure the average American has a lead-acid car battery and maybe a laptop battery pack, we currently have more lead-acid batteries than lithium-ion, certainly by weight, but probably also by total watt-hours.

Tesla and Nissan might end up inverting that ratio in a market where they merely have to get the battery to beat the cost of gas; and if we get to the point where every American has an 85 kWh battery pack in their car, that's multiple days of average US electrical consumption (I believe average per-capita electrical consumption is about 1.5 kW, and average per-capita total energy consumption around 6 kW, in the US).
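The "multiple days" claim checks out against those figures (both numbers are the comment's own estimates):

```python
# How long would an EV pack cover average per-capita electrical draw?
pack_kwh = 85                  # kWh: Tesla-style battery pack from the comment
per_capita_electric_kw = 1.5   # kW: average US electrical draw, per the comment

hours = pack_kwh / per_capita_electric_kw
print(f"{hours:.0f} hours, i.e. about {hours / 24:.1f} days of average electrical use")
```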

Meanwhile, there's no path to pushing up volumes of lead acid battery production significantly. Maybe there will be a few old lead acid car batteries getting recycled after Tesla conquers the world, but if that's all we're relying on, those recycled batteries won't power very much in the grand scheme of things.


Nickel-Iron batteries have a life measured in decades and are very robust against deep discharge. They can also be refurbished. Their formulation has been around since Edison. In fact, I think he invented them. Their big disadvantage: they're heavy. Perfect for home use.

EDIT: Edison did not invent NiFe batteries, but he developed and championed them.



