OK. When talking about language, I find it's always good to be explicit about which level you're talking about, especially with a term as overloaded as "information". I'm not sure how to connect this finding to semantics.
If the text can be reproduced with one bit per letter, then the semantic information content is necessarily at most N bits, where N is the length of the text in letters. Presumably it will normally be much less, since synonyms and equivalent word orderings don't change the meaning, but this gives a solid upper bound.
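The same bounding argument can be demonstrated with any off-the-shelf lossless compressor: since the original text can be exactly reconstructed from the compressed bytes, whatever semantic content the text carries is bounded by the compressed size in bits. A minimal sketch (the sample text and the use of zlib are my own choices for illustration; a general-purpose compressor won't reach 1 bit per letter on short inputs, but the bound logic is the same):

```python
import zlib

# A deliberately repetitive sample text; any string works.
text = "The quick brown fox jumps over the lazy dog. " * 20

n_letters = len(text)

# Lossless compression: the original text is exactly recoverable,
# so the compressed size in bits upper-bounds its information content.
compressed_bits = len(zlib.compress(text.encode("utf-8"), level=9)) * 8

print(f"{n_letters} letters")
print(f"{compressed_bits} compressed bits "
      f"(~{compressed_bits / n_letters:.2f} bits/letter)")
```

For this repetitive input the compressed size comes out well under the raw 8 bits per letter, and any such figure is a valid (if loose) upper bound on the semantic content, for exactly the reason given above.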