A bit is a measure of information-theoretic entropy. Specifically, one bit is defined as the uncertainty in the outcome of a single fair coin flip. A less-than-fair coin has less than one bit of entropy; a coin that always lands heads up has zero bits; n fair coins have n bits of entropy; and so on.

https://en.m.wikipedia.org/wiki/Information_theory

https://en.m.wikipedia.org/wiki/Entropy_(information_theory)
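
For what it's worth, the definition is easy to check numerically. Below is a minimal Python sketch of the binary entropy function H(p) = -p*log2(p) - (1-p)*log2(1-p); the name coin_entropy is mine, just for illustration:

    import math

    def coin_entropy(p: float) -> float:
        """Shannon entropy, in bits, of a coin that lands heads with probability p."""
        if p in (0.0, 1.0):
            return 0.0  # a certain outcome carries no uncertainty
        return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

    print(coin_entropy(0.5))      # 1.0    -> one fair flip is exactly one bit
    print(coin_entropy(0.9))      # ~0.469 -> a biased coin carries less than one bit
    print(coin_entropy(1.0))      # 0.0    -> always-heads means zero bits
    print(5 * coin_entropy(0.5))  # 5.0    -> n independent fair coins give n bits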



That is a bit in information theory. It has nothing to do with the computer/digital engineering term being discussed here.


This comment, I feel sure, would repulse Shannon in the deepest way. A (digital, stored) bit abstractly seeks to encode the quantity information theory describes and make it useful through computation.

Your comment must be sarcasm or satire, surely.


I do not know or care what Mr. Shannon would think. What I do know is that the base you choose for the logarithm in the entropy equation has nothing to do with the number of bits you assign to a word on a digital architecture :)
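
To see the two notions side by side: the storage width of a symbol is fixed by the architecture, while its Shannon entropy depends on the source distribution. A rough Python sketch, assuming empirical byte frequencies as the distribution; the helper entropy_bits_per_symbol is my own name, not anything from the thread:

    import math
    import os
    from collections import Counter

    def entropy_bits_per_symbol(data: bytes) -> float:
        """Empirical Shannon entropy, in bits per byte, of a byte string."""
        counts = Counter(data)
        n = len(data)
        return -sum((c / n) * math.log2(c / n) for c in counts.values())

    # Every symbol below occupies exactly 8 storage bits, whatever its content.
    skewed = b"a" * 999 + b"b"               # a highly predictable source
    print(entropy_bits_per_symbol(skewed))   # ~0.011 bits/byte despite 8 bits of storage

    uniform = os.urandom(1000)               # roughly uniform over 256 byte values
    print(entropy_bits_per_symbol(uniform))  # close to 8 bits/byte: storage and entropy nearly coincide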



