# Information Content
I've joked that I rank people by Kolmogorov complexity. Someone who teaches me new things and challenges me in conversation makes a better (or at least more interesting) friend than someone whose every word I can easily predict. But of course, if my statement were literally true, a friend who constantly spewed incomprehensible gibberish would be better than either.
And yet, I could write a program to generate incomprehensible gibberish in a couple of lines of Perl (we'd just have to be careful not to mix up the program with its output). A person talking gibberish would probably be doing *less thinking* than someone making careful, well-reasoned responses to what I said - even if the latter is more predictable in some narrow sense.
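It hardly takes more lines in Python than it would in Perl - a minimal sketch (the function name `gibberish` is my own):

```python
import random
import string

def gibberish(n=200):
    # Maximally unpredictable, hence high-entropy - and entirely meaningless.
    return "".join(random.choice(string.printable) for _ in range(n))

print(gibberish())
```

The output is close to incompressible, which is exactly why ranking people by raw Kolmogorov complexity fails.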
Cultural context has a similar effect. Someone writing German will convey more information to me than someone writing nonsense, and someone writing English more still. But an alien writing English can tell me far less than someone with whom I'm embedded in a common culture. When, in *The Melancholy of Haruhi Suzumiya*, Koizumi tells us his class "ended up doing [the Tom Stoppard version](http://tvtropes.org/pmwiki/pmwiki.php/Theatre/RosencrantzAndGuildensternAreDead)", we learn far more than seven words have any right to convey: not only that Koizumi is easygoing and literate, but also that he appreciates the absurdity inherent in his position as a fictional character (something Tanigawa already pays plenty of attention to in the main work). It's like the "google-format compression" joke: you can compress a large video file down to 32 bytes by just storing the CRC32, and decompress by entering it into google.
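The joke rewards unpacking, because it makes the same point about context. A sketch (the function name `compress_google_style` is mine):

```python
import zlib

def compress_google_style(data: bytes) -> int:
    # "Compress" arbitrarily large data down to its 32-bit CRC32 checksum.
    return zlib.crc32(data)

video = b"frame" * 1_000_000          # stand-in for a large video file
checksum = compress_google_style(video)
assert checksum < 2**32               # the whole "archive" fits in 32 bits
# "Decompression" means looking the checksum up in a shared external index
# (Google, in the joke). The real information lives in that shared context,
# not in the 32 bits themselves - which is why this isn't compression at all.
```

The pigeonhole principle guarantees collisions, of course; the joke works precisely because the 32 bits only *select* from a context both parties already share.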
So: imagine an alien race provided us with a value for some [busy beaver function](http://en.wikipedia.org/wiki/Busy_beaver) - say S(500) for some reasonably practical Turing machine (if that's not a contradiction in terms) - along with a (short) proof that it was valid. Even in the compact new notation that would doubtless be invented for the task, writing the number down would probably take several hundred bits.
But these few hundred bits would contain a staggering amount of information. At a stroke we'd have proofs of many of the most important open questions in mathematics, as soon as we'd cranked our way through the relevant calculations. Yet from a complexity theory point of view, we'd regard this message as containing no more or less information than any other arbitrary bitstring of the same length. This seems odd, to say the least - odd enough to make me wonder whether we need a better definition than Shannon entropy, one that takes context into account.
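The mechanism is worth spelling out: encode an open question as a machine that searches for a counterexample and halts if it finds one, then run it for S(500) steps - if it hasn't halted by then, it never will. A toy Python sketch of the same trick at n = 2, where the known value is S(2) = 6 steps:

```python
def run(tm, max_steps):
    # Simulate a Turing machine on a blank tape of 0s.
    # Return the step count at which it halts, or None if it exceeds max_steps.
    tape, pos, state = {}, 0, "A"
    for step in range(1, max_steps + 1):
        write, move, nxt = tm[(state, tape.get(pos, 0))]
        tape[pos] = write
        pos += 1 if move == "R" else -1
        if nxt == "HALT":
            return step
        state = nxt
    return None

# The 2-state, 2-symbol busy beaver champion, which attains S(2) = 6.
champion = {
    ("A", 0): (1, "R", "B"),
    ("A", 1): (1, "L", "B"),
    ("B", 0): (1, "L", "A"),
    ("B", 1): (1, "R", "HALT"),
}

S2 = 6
assert run(champion, S2) == 6   # halts exactly at the bound

# Knowing S(2) turns the halting problem for 2-state machines into a
# finite computation: anything still running after 6 steps never halts.
spinner = {("A", 0): (0, "R", "A"), ("A", 1): (0, "R", "A"),
           ("B", 0): (0, "R", "B"), ("B", 1): (0, "R", "B")}
assert run(spinner, S2) is None  # provably non-halting
```

At n = 500 the same logic applies, except that "cranking through the calculation" means running a machine for S(500) steps - a number so large that the decision procedure, while finite, is wildly beyond physical possibility.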
One last thought: if, as I incline to believe, the rules of the universe are fundamentally very simple, then the information content of the whole universe is very small; perhaps, as Rees had it, "Just Six Numbers". Actually, I'm less worried about this one; to say anything interesting about the universe, you need to refer to a specific location within it (including which particular branch of the enormous tree of quantum-mechanical probabilities your particular 4D slice lives on), and that location is a lot of information, so this is perhaps just a restatement of Borges' *Library of Babel* non-paradox.
In researching this post I came across [http://www.hpcoders.com.au/nothing.html](http://www.hpcoders.com.au/nothing.html), which seems to explore the consequences of this. I'll be reading that next, once I've finished the excellent (and highly embedded into a small specific subculture) [Imperfect Metamorphosis](http://www.fanfiction.net/s/5829008/1/Imperfect-Metamorphosis).