
This raises the question of just why the Mac is doing software encoding. I think the hardware it's running on should have two compatible hardware encoders, one on the CPU and one on the GPU. Is the software being used incapable of hardware encoding? Does it default to software encoding because of its higher quality per bit? Was it configured to use software encoding (whether ignorantly or deliberately)?



Video encoding is generally done on CPUs because they can run more complicated video encoding algorithms with multiple passes. This generally results in smaller video files at the same quality. As you increase the compute intensity of the video encoder you get diminishing returns: a 30% lower bitrate might need 10x as much CPU time. That tweet says more about the type of encoder and the chosen encoder settings than about the hardware.
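
For a concrete (and entirely hypothetical) illustration of that tradeoff, here is a rough sketch comparing a slow software encode against a hardware encode by calling ffmpeg from Python. It assumes an ffmpeg build with libx264 and Apple VideoToolbox support; the file names, CRF, and bitrate are made up, not anything from the tweet:

  import subprocess

  SRC = "input.mov"  # hypothetical source clip

  # Software encode on the CPU: libx264 with a slow preset and a quality
  # target (CRF). Slower presets spend far more CPU time for modestly
  # smaller files at the same quality.
  subprocess.run(
      ["ffmpeg", "-y", "-i", SRC,
       "-c:v", "libx264", "-preset", "veryslow", "-crf", "20",
       "sw_encode.mp4"],
      check=True)

  # Hardware encode via Apple's VideoToolbox: much faster and lighter on
  # the CPU, but typically needs a higher bitrate for comparable quality.
  subprocess.run(
      ["ffmpeg", "-y", "-i", SRC,
       "-c:v", "h264_videotoolbox", "-b:v", "6000k",
       "hw_encode.mp4"],
      check=True)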

Imagine going on a hike up an exponential slope like 2^x. You go up to 2^4 and come back down, and you repeat this three times, so you have walked 12 km uphill in total for 3*2^4 = 48 units of climbing. Then there is an athlete who goes up to 2^8 once. He says he has hiked 8 km, and you laugh at him because of how sweaty he is despite having walked a shorter distance than you. In reality, 3*2^4 (48) is nowhere near 2^8 (256). The athlete put in far more effort than you.
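
To spell out the arithmetic in that analogy (same numbers as above, nothing more):

  # Three ascents to height 2^4 versus one ascent to height 2^8.
  casual = 3 * 2**4   # 48 units of climbing, spread over 12 km walked
  athlete = 2**8      # 256 units of climbing, over only 8 km walked
  print(casual, athlete, athlete / casual)  # 48 256 5.33... -> ~5x the effort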



