
Does this have any bearing on AI or machine learning? According to Marcus Hutter, the best AI is the one that can compress the most (hence the Hutter Prize for compression). If compression is the same problem as data transmission, then this might be an optimal AI algorithm as well.



This isn't about compressing data; it's rather the opposite: expanding the data so that even if a certain portion of it is affected by interference, the original data can still be reconstructed without errors.

To take a really simple code, let's say you have this data:

10110

Then you append a checksum (actually a parity bit here) to the data: 1+0+1+1+0 = 1 (mod 2)

101101

Now let's say there's interference and a bit flips:

100101

The receiver calculates the checksum and sees that the sum is 1+0+0+1+0 = 0 (mod 2), which does not fit, since the sum is supposed to be 1. Thus the receiver knows that an error happened and can request a retransmission. More advanced codes like those discussed here allow arbitrary levels of error correction, instead of only the single-error detection the parity bit provides; but stronger error correction comes with overhead, and the innovation here is apparently a technique for reducing that cost in wireless networks.
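
In code, the whole parity-bit scheme fits in a few lines. This is just a sketch of the example above (Python, my own illustration, not the code from the article):

    # Parity-bit sketch: sender appends the sum mod 2 of the data bits;
    # receiver checks that the total sum of all bits is even.

    def add_parity(bits):
        # Append the XOR (sum mod 2) of all data bits.
        return bits + [sum(bits) % 2]

    def check_parity(codeword):
        # Valid iff the total number of 1s (data + parity) is even.
        return sum(codeword) % 2 == 0

    data = [1, 0, 1, 1, 0]
    sent = add_parity(data)          # [1, 0, 1, 1, 0, 1]  i.e. 101101
    print(check_parity(sent))        # True  -> no error detected

    corrupted = sent.copy()
    corrupted[2] ^= 1                # interference flips one bit: 100101
    print(check_parity(corrupted))   # False -> request retransmission

Note that a single parity bit only detects an odd number of flipped bits; it can't tell which bit flipped, which is where the more elaborate error-correcting codes come in.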


> This isn't about compressing data; it's rather the opposite: expanding the data so that even if a certain portion of it is affected by interference, the original data can still be reconstructed without errors.

The two problems (data compression and noisy-channel coding) are tied together quite neatly by information theory though: http://www.inference.phy.cam.ac.uk/mackay/itila/book.html
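
To make the connection concrete, here's a small sketch (my own illustration, not taken from the linked book): the same binary entropy function H(p) gives both the compression limit of a biased bit source and, via C = 1 - H(p), the capacity of a binary symmetric channel that flips each bit with probability p.

    import math

    def binary_entropy(p):
        # H(p) = -p*log2(p) - (1-p)*log2(1-p), in bits per symbol.
        if p in (0.0, 1.0):
            return 0.0
        return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

    p = 0.1
    # Source coding: a bit source with P(1) = 0.1 can be compressed to
    # about H(p) ~ 0.469 bits per source bit, and no further.
    print(f"compression limit: {binary_entropy(p):.3f} bits/bit")

    # Channel coding: a binary symmetric channel that flips each bit with
    # probability 0.1 can carry at most C = 1 - H(p) ~ 0.531 bits per
    # channel use with vanishing error, given a good enough code.
    print(f"BSC capacity:      {1 - binary_entropy(p):.3f} bits/use")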



