
Would it be possible to use a heat pump to raise the temperature into a more efficient range? Or would all of the efficiency gained be lost in operating the heat pump?



I have wondered about this as well... If your heat pump uses, say, 1 kW, and heat pumps usually deliver 4x-5x the input as heat, that's about 5 kW of heat out. If you're able to extract 50% of that as work using, say, a turbine, you'd get 2.5 kW out, or maybe 1.5 kW net after subtracting the 1 kW used to run the pump. So maybe not great, but maybe possible?
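A quick back-of-envelope sketch of those numbers (the 1 kW input, the COP of 5, and the 50% extraction figure are all assumptions from the comment above, not measured values):

    # Back-of-envelope version of the numbers above (all assumed, not measured).
    work_in = 1.0          # kW of electricity driving the heat pump
    cop = 5.0              # assumed coefficient of performance (heat out / work in)
    extraction_eff = 0.5   # assumed fraction of that heat a turbine could recover

    heat_out = work_in * cop                       # 5.0 kW of heat delivered
    work_recovered = heat_out * extraction_eff     # 2.5 kW back from the turbine
    net = work_recovered - work_in                 # 1.5 kW of "free" energy -- which should look suspicious

    print(heat_out, work_recovered, net)

The replies below explain why the 50% figure can't actually coexist with a COP of 5: the same small temperature lift that makes the heat pump look good is what makes the turbine bad.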


Given perfect heat pumps and heat engines in that scenario, you would get exactly as much energy out as you put in. Theoretically perfect heat engines are called "reversible" for that reason: they give you the best possible exchange between heat and work. Said another way: a heat pump can give you 4-5x as much heat out as the work you put in precisely for the same reason you can't get much work back out of that heat: the temperature increase is small. If you heated something to a temperature where you could extract work more efficiently, it would take correspondingly more energy to get it there.
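A minimal sketch of that argument, assuming ideal (reversible/Carnot) machines working between the same cold and hot reservoirs; the temperatures here are just illustrative:

    # Ideal (reversible) heat pump + heat engine between the same two reservoirs.
    # Temperatures are illustrative assumptions, in kelvin.
    t_cold = 283.0   # e.g. outdoor air around 10 C
    t_hot = 333.0    # e.g. storage at around 60 C

    # Carnot limits:
    cop_heat_pump = t_hot / (t_hot - t_cold)     # heat delivered per unit work, ~6.7
    eta_heat_engine = (t_hot - t_cold) / t_hot   # work extracted per unit heat, ~0.15

    # Round trip: 1 unit of work -> cop units of heat -> cop * eta units of work back.
    round_trip = cop_heat_pump * eta_heat_engine
    print(round_trip)   # 1.0 (up to rounding): you break even at best, never gain

Raising t_hot improves the engine efficiency but lowers the heat pump's COP by exactly the same factor, so the product stays pinned at 1 for ideal machines.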


Yes, you can use a heat pump, but running it takes work (think compressors and the like). So even with a perfectly efficient heat pump the limit still holds, because some of the energy you recover from the new, hotter temperature goes right back into running the pump.



