Markov Coincidence
I was chatting with one of my friends recently. He was telling me about a Markov chain program (I think 5th order) that he had just finished. This gave me an interesting idea: suppose you were to measure the connectedness of the English language as a Markov process. As you increase the order of the chain, what does that tell you about the information structure of the language? Would it be useful to display the results as a histogram series, charting the probability of a word, in relation to other words, as a function of chain depth?
Coincidentally, Jeff Atwood then wrote a post about Markov processes. His examples, which work at the letter level, show a remarkable qualitative change as the order of the chain is increased.
And just what happens if you train a Markov chain on Lewis Carroll?