He discusses, amongst other things, the work done on computing this uncomputable number, and why that isn't a contradiction. I think one of the people involved is in the audience.
you're right that the isolated assertion is not at all sensible, but the wider context is that you are considering all possible algorithms and the sequences they generate, and looking for some way of compressing as many of those as possible. you should read one of Chaitin's books for a better idea (they're generally "accessible", but at some point you just give up trying to make complete sense of it all...)
in other words, you are choosing your compression scheme for one particular algorithm (the one that generates Omega - you could go further and say the code is 1 for Omega and 0 for everything else). this is a common problem in arguments about compressibility (if we're only trying to compress Beethoven's Ninth then i can do that in one bit too) - you have to be talking in an "average" or statistical sense for things to be useful.
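The one-bit trick above can be made concrete with a deliberately degenerate codec (a sketch of my own, not from the thread; `TARGET` is a stand-in for any fixed sequence such as a recording of Beethoven's Ninth):

```python
# A degenerate "compressor" built around one fixed target string.
# The target compresses to a single symbol, because the codebook
# itself smuggles in all the information; everything else actually
# gets longer (it gains an escape byte).

TARGET = b"stand-in for the entire fixed sequence the codec was built around"

def compress(data: bytes) -> bytes:
    if data == TARGET:
        return b"\x01"           # the one special string: a single byte
    return b"\x00" + data        # everything else: original plus escape byte

def decompress(blob: bytes) -> bytes:
    if blob == b"\x01":
        return TARGET
    return blob[1:]

assert compress(TARGET) == b"\x01"
assert decompress(compress(TARGET)) == TARGET
assert decompress(compress(b"anything else")) == b"anything else"
```

Averaged over all inputs this scheme saves nothing, which is exactly why compressibility claims only mean something in a statistical sense.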
having said that, the exact nature of what they're saying is still not clear to me (i.e. it's not clear exactly how the details of the argument work out, or how this eventually ties back to http://en.wikipedia.org/wiki/Shannons_source_coding_theorem).
i think another replier here may also be correct that Chaitin, in particular, includes the length of the decompression code. unfortunately i can't delete my answer for some reason (too old i guess).
No, I think that your answer made the most sense. The length of the decompression code is a constant, whereas Omega has infinitely many digits, so the constant doesn't matter in the limit.
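The constant-overhead point can be made precise with a standard fact about prefix Kolmogorov complexity (a sketch in standard notation, not taken from the thread):

```latex
% K denotes prefix Kolmogorov complexity, \Omega_{1:n} the first n
% bits of Chaitin's constant. Chaitin's incompressibility theorem:
% there is a constant c such that
K(\Omega_{1:n}) \ge n - c \quad \text{for all } n.
% If some scheme compressed \Omega_{1:n} to m bits using a fixed
% decompressor of length d, that pair is a description of \Omega_{1:n}, so
m + d + O(1) \ \ge\ K(\Omega_{1:n}) \ \ge\ n - c,
% hence m \ge n - (c + d + O(1)): the fixed decompressor length d is
% swamped as n grows, and the compression ratio tends to 1.
```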
Yeah, but compression is all about communicating information. If I don't have the information that I want to communicate, it's weird to claim that it's incompressible. Are the lost works of Shakespeare incompressible because we can't know what they contained? That's a meaningless statement to me.
Besides, some of the bits are known, and thus, they are clearly compressible.
The Halting Problem defines a theoretical limit of computation, but has anyone actually run into that limit in a pragmatic, everyday coding situation? In practice I've never seen it have any bearing. Perhaps the problems I'm working on are too mundane.
http://en.wikipedia.org/wiki/Microcanonical_ensemble