@lndf No link, sorry; I just made them myself. The idea with Markov chains is that you can get "the next thing in a sequence" from probabilities based on the previous thing. In this case, the "sequence" is letters (including " "), and the probabilities are represented by the portions of the spinner. So, say the first letter is "o": you get out the spinner for "o", and you spin it. Maybe it lands on "n", so then you get the "n" one and spin it. Repeat for nonsense (but plausible) words.
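If it helps, here's the same idea as a quick code sketch (the little word list and names are just placeholders I made up): counting letter-to-letter transitions in some words gives you each letter's "spinner", with slice sizes proportional to the counts, and " " doubles as the start/end-of-word marker.

```python
import random
from collections import defaultdict

# Placeholder training words; any word list would do.
CORPUS = ["on", "one", "none", "note", "tone", "stone", "onset", "not"]

def build_spinners(words):
    """Count letter -> next-letter transitions. Each letter's counts
    act as its 'spinner': bigger count = bigger slice."""
    spinners = defaultdict(lambda: defaultdict(int))
    for word in words:
        padded = " " + word + " "  # " " marks word start and end
        for a, b in zip(padded, padded[1:]):
            spinners[a][b] += 1
    return spinners

def spin_word(spinners, rng=random):
    """Start at the word boundary ' ' and keep 'spinning' the current
    letter's spinner until we land back on ' ' (end of word)."""
    letter, out = " ", []
    while True:
        nexts = list(spinners[letter].keys())
        weights = list(spinners[letter].values())
        letter = rng.choices(nexts, weights=weights)[0]
        if letter == " ":
            return "".join(out)
        out.append(letter)

spinners = build_spinners(CORPUS)
print(spin_word(spinners))  # e.g. "nonet" or other plausible nonsense
```

Every two-letter pair in the output showed up somewhere in the training words, which is exactly why the nonsense still looks pronounceable.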