SUMMARY
For my Statistics and Stochastic Processes course at Northeastern, we applied the Metropolis-Hastings algorithm to construct a Markov chain whose steady-state distribution is the Zipf distribution. The Zipf distribution models the frequency of words in natural language: the most common word appears roughly twice as often as the second most common, three times as often as the third, and so on.
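As a quick illustration of the target distribution, the Zipf probabilities for the project's parameters (a = 1, N = 8) are just the reciprocals 1/k normalized by the harmonic number; this short sketch (not from the write-up itself) computes them:

```python
# Zipf distribution with a = 1 over N = 8 states: p(k) is proportional to 1/k
N = 8
weights = [1 / k for k in range(1, N + 1)]
Z = sum(weights)            # normalizing constant (the harmonic number H_8)
p = [w / Z for w in weights]

# The defining ratios: state 1 is twice as likely as state 2,
# three times as likely as state 3, and so on.
print(p[0] / p[1])  # 2.0
print(p[0] / p[2])  # 3.0
```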
The goal was to start with a simple proposal transition matrix T that only allows movement between adjacent states, and modify it into a new transition matrix S whose stationary distribution is the Zipf distribution with parameters a=1 and N=8.
To do this, we first computed the Zipf probabilities for each of the eight states, then calculated acceptance probabilities for each possible transition using the Metropolis-Hastings acceptance rule. These acceptance probabilities act as a correction factor, deliberately slowing down transitions that would move the chain away from the target distribution, and allowing transitions that move toward it to happen freely. Applying these corrections to the original proposal matrix T produced a new transition matrix S whose steady-state probability vector exactly matches the Zipf distribution.
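The construction described above can be sketched in a few lines. The specific boundary handling (a proposed move off either end is treated as a rejected self-loop) and the symmetric nearest-neighbor proposal are assumptions on my part; the write-up's exact T may differ. With a symmetric proposal, the Metropolis-Hastings acceptance rule reduces to min(1, p_j / p_i), and the resulting S leaves the Zipf vector invariant:

```python
import numpy as np

N = 8
k = np.arange(1, N + 1)
p = (1.0 / k) / np.sum(1.0 / k)  # Zipf target with a = 1, N = 8

# Proposal matrix T: from state i, propose each adjacent state with prob 1/2.
# (Assumed boundary rule: a proposal off the ends counts as staying put.)
T = np.zeros((N, N))
for i in range(N):
    if i > 0:
        T[i, i - 1] = 0.5
    if i < N - 1:
        T[i, i + 1] = 0.5

# Metropolis-Hastings correction: accept a proposed move i -> j with
# probability min(1, p_j / p_i); rejected probability mass stays at i.
S = np.zeros((N, N))
for i in range(N):
    for j in range(N):
        if i != j and T[i, j] > 0:
            S[i, j] = T[i, j] * min(1.0, p[j] / p[i])
    S[i, i] = 1.0 - S[i].sum()

# Detailed balance p_i * S_ij = p_j * S_ji holds by construction,
# so p is the stationary distribution of S:
print(np.allclose(p @ S, p))  # True
```

Numerically checking `p @ S == p` confirms that the corrected chain's steady-state vector matches the Zipf distribution, mirroring the result in the write-up.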
"MetHastZipf" will lead you to a formal write-up with calculations: