Simon also worked on a Markov Chains project this weekend, trying to introduce a slider into his text-generating code.
Daniel Shiffman writes: “A Markov Chain can be described as a sequence of random ‘states’ where each new state is conditional only on the previous state. (…) An n-gram is a contiguous sequence of N elements. (…) Using an N-gram model, we can use a markov chain to generate text where each new word or character is dependent on the previous word (or character) or sequence of words (or characters).”
In other words, Markov Chains can be used to generate text automatically based on a source text. For some reason, Simon’s slider only worked at the minimum setting (“least sense”, generating words from 2-letter combinations). When turned to the maximum, it kept returning the same one and a half words over and over. Simon gave up at that point:
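For readers curious about the mechanics, here is a minimal sketch (not Simon’s actual code) of a character-level N-gram Markov chain generator, where the order N plays the role of the slider value. It also hints at why a high slider setting can get stuck repeating the source: with a large N, most N-character states occur only once in the text, so each state has exactly one possible continuation.

```javascript
// Build a character-level N-gram model: map each N-character "state"
// to the list of characters observed immediately after it.
function buildModel(text, order) {
  const model = {};
  for (let i = 0; i <= text.length - order - 1; i++) {
    const gram = text.substring(i, i + order); // current state: N characters
    const next = text.charAt(i + order);       // character that follows it
    if (!model[gram]) model[gram] = [];
    model[gram].push(next);
  }
  return model;
}

// Generate text by repeatedly sampling a continuation of the
// last N characters produced so far.
function generate(text, order, length) {
  const model = buildModel(text, order);
  let current = text.substring(0, order); // seed with the start of the source
  let result = current;
  for (let i = 0; i < length; i++) {
    const options = model[current];
    if (!options) break; // dead end: this state never appeared in the source
    const next = options[Math.floor(Math.random() * options.length)];
    result += next;
    current = result.substring(result.length - order);
  }
  return result;
}
```

With a low order (say 2), many states have several plausible continuations, so the output is varied but close to gibberish; with a high order, the chain mostly has one choice per state and tends to replay the source text verbatim, which on a short source can look like the same fragment repeating.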
The rest of the code comes from Daniel Shiffman’s Markov Chains Coding Challenge. This coding challenge is part of the Programming A to Z course, one of the courses that Daniel Shiffman has taught at the Interactive Telecommunications Program at NYU’s Tisch School of the Arts.
This is the chapter on Markov Chains: http://shiffman.net/a2z/markov/