To expand on this: floating point works for 95% of programs; it's the remaining 5% that needs the exact results an arbitrary-precision library offers. Most programs come nowhere near exhausting the precision of a typical floating-point type. In a GUI, for example, you can't easily tell the difference between 100% and 99.9999999% by eyeballing it.
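To make the 5% case concrete, here's a minimal Python sketch; the stdlib `decimal` module stands in for an arbitrary-precision library (its precision is configurable, 28 significant digits by default):

```python
from decimal import Decimal

# 0.1 has no exact binary representation, so each float addition
# carries a tiny rounding error that accumulates.
total = sum(0.1 for _ in range(10))
print(total)         # 0.9999999999999999
print(total == 1.0)  # False

# Decimal does the same sum in base 10, where 0.1 is exact.
exact = sum(Decimal("0.1") for _ in range(10))
print(exact)                  # 1.0
print(exact == Decimal("1"))  # True
```

That drift is invisible in a progress bar, but it's exactly what bites in the 5%: money, ledgers, anything that has to balance to the last digit.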