ready to teach the students about Markov chain text generation
@lea I don't know much about markov chain but it sure looks like a fun way to learn about it, do you have a link to share about those little letter clocks ?
@lndf No link, sorry; I just made them myself. The idea with Markov chains is that you can get “the next thing in a sequence” from probabilities based on the previous thing. In this case, the “sequence” is letters (including “ ”), and the probabilities are represented by the portions of a spinner. So, say the first letter is “o”: you get out the spinner for “o”, and you spin it. Maybe it lands on “n”, so then you get the “n” one and spin it. Repeat for nonsense (but plausible) words.
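The spinner idea translates directly to code. Here's a minimal Python sketch (not anyone's actual teaching materials, just an illustration): each letter gets a "spinner" whose slice sizes are follower counts, and `random.choices` with weights plays the role of a spin. The tiny sample string is a stand-in for a real corpus.

```python
import random
from collections import Counter, defaultdict

def build_spinners(text):
    """One 'spinner' per letter: slice sizes = how often each
    next letter follows it in the sample text."""
    spinners = defaultdict(Counter)
    for cur, nxt in zip(text, text[1:]):
        spinners[cur][nxt] += 1
    return spinners

def babble(spinners, start, n=20):
    """Generate n letters by repeatedly 'spinning' the spinner
    for the most recent letter."""
    out = [start]
    for _ in range(n):
        counts = spinners[out[-1]]
        if not counts:  # letter only appeared at the very end; no spinner to spin
            break
        # random.choices picks proportionally to the counts -- one spin
        out.append(random.choices(list(counts), weights=counts.values())[0])
    return "".join(out)

sample = "once upon a midnight dreary, while I pondered, weak and weary"
print(babble(build_spinners(sample), "o", 15))
```

Each run gives different plausible nonsense, exactly as with the physical spinners.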
@lndf The “probabilities” are always based on some already existing sample text (often called “the corpus”)— you figure them out by going through and counting how often each sequence happens there. In this case, the sample text is “The Raven,” by Edgar Allan Poe. For their homework, the students will do this with words instead of letters (and just in code, no physical spinners), and they’ll choose their own input text. I’m hoping the spinners help them understand the algorithm. 🙂
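The word-level homework version can be sketched in a few lines of Python too (a hedged illustration, not the actual assignment): counting is just recording which word follows which in the corpus, and keeping followers in a list means duplicates naturally carry the probabilities, so a uniform `random.choice` over the list is one spin.

```python
import random
from collections import defaultdict

def build_chain(text):
    """Record, for every word in the corpus, the list of words that
    follow it. Repeats in the list encode the probabilities."""
    words = text.split()
    chain = defaultdict(list)
    for cur, nxt in zip(words, words[1:]):
        chain[cur].append(nxt)
    return chain

def generate(chain, start, length=10):
    """Walk the chain: each random.choice is one spin of the
    current word's spinner."""
    word = start
    out = [word]
    for _ in range(length - 1):
        followers = chain.get(word)
        if not followers:  # word only appeared at the end of the corpus
            break
        word = random.choice(followers)
        out.append(word)
    return " ".join(out)

corpus = "and the raven never flitting still is sitting still is sitting"
print(generate(build_chain(corpus), "still"))
```

Students would swap in their own input text for `corpus` and pick a starting word from it.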
@lea It's already fairly clear from the description you've made! Hopefully with the actual physical spinners it'll be even easier to grasp
thanks for the explanation!