Markov models describe dynamic systems that move between states according to fixed transition probabilities. If we additionally assume that the process can move between any two states and that the process does not produce a cycle, then a Markov model attains a unique statistical equilibrium. In that equilibrium, people or entities continue to move between states, but the probability distribution across states does not change. It follows that as a process nears equilibrium, the changes in the probabilities diminish: represented as a graph, the slope of the curve flattens. Recall our earlier discussion of California's population growth when we covered linear models. California's population growth has slowed because as the population of California has grown, the number of people leaving California has increased. That result holds even if the proportion of Californians leaving does not change.

When applying Markov models to explain phenomena or predict trends, a modeler's choice of states proves critical, because that choice determines the transition probabilities between the states. A Markov model of drug addiction could assume two states: being a user or being clean. A more elaborate model might distinguish users by frequency of use. Regardless of the choice of states, if the four assumptions hold (and in this instance, the key test would be whether transition probabilities remain fixed), then the system will produce a unique statistical equilibrium.

A unique equilibrium implies that any one-time change in the state of a system has at most a temporary effect. Reducing drug use in equilibrium would require changing transition probabilities. By the same logic, a one-day event to spur interest in education may lack meaningful impact, and volunteers coming into a community to clean up a park may produce few long-term benefits. Any one-time influx of money, regardless of its size, will dissipate in its effect unless it changes transition probabilities.
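The convergence to a unique equilibrium can be sketched with a toy two-state model of drug use. The transition probabilities below are illustrative assumptions, not empirical estimates:

```python
# Hypothetical two-state Markov model: each person is either a "user" or "clean".
# The transition probabilities are made-up numbers for illustration only.
P_QUIT = 0.2     # probability a user becomes clean in a given period
P_RELAPSE = 0.1  # probability a clean person becomes a user in a given period

def step(p_user):
    """Advance the share of the population in the 'user' state by one period."""
    return p_user * (1 - P_QUIT) + (1 - p_user) * P_RELAPSE

def run(p_user, periods=100):
    """Iterate the process from an initial share of users."""
    for _ in range(periods):
        p_user = step(p_user)
    return p_user

# Very different starting points converge to the same equilibrium share,
# the p that balances the flows: p * 0.2 = (1 - p) * 0.1, i.e. p = 1/3.
print(round(run(0.9), 4))  # 0.3333
print(round(run(0.1), 4))  # 0.3333
```

Whatever fraction of the population starts out as users, the process settles at the same distribution, which is why a one-time change to the state cannot move the long-run outcome.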
In 2010, Mark Zuckerberg donated $100 million to the Newark, New Jersey, public schools, an amount that was matched by other donors. That one-shot donation, which amounted to approximately $6,000 per student, has produced few measurable effects on test scores. Markov models thus guide action by distinguishing between policies that change transition probabilities, which can have long-term effects, and policies that change the state, which can have only short-term effects. If transition probabilities cannot be changed, then we must reset the state on a regular schedule to change outcomes.
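A minimal sketch can contrast the two kinds of intervention. The two-state drug-use model and its transition probabilities below are assumptions chosen for illustration: one intervention resets the state (pushes the share of users down once), the other changes a transition probability (halves the relapse rate).

```python
# Illustrative two-state model (user / clean) with assumed transition
# probabilities. Compares a one-time reset of the state against a policy
# that changes a transition probability.

def run(p_user, p_quit, p_relapse, periods):
    """Evolve the share of the population in the 'user' state."""
    for _ in range(periods):
        p_user = p_user * (1 - p_quit) + (1 - p_user) * p_relapse
    return p_user

# Baseline: p_quit = 0.2, p_relapse = 0.1 gives an equilibrium of 1/3 users.
baseline = run(1/3, 0.2, 0.1, 50)

# State intervention: push users down to 10%, then let the old transition
# probabilities run. The system drifts back toward the old equilibrium.
after_reset = run(0.10, 0.2, 0.1, 50)

# Probability intervention: cut the relapse probability in half.
# The equilibrium itself moves: p * 0.2 = (1 - p) * 0.05, i.e. p = 0.2.
after_policy = run(1/3, 0.2, 0.05, 50)

print(round(baseline, 3), round(after_reset, 3), round(after_policy, 3))
# 0.333 0.333 0.2
```

The reset is erased by the unchanged dynamics, while the change in the relapse probability permanently lowers the equilibrium share of users, which mirrors the point about one-shot donations versus structural change.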
Understanding Markov models can change how we think about problems. A Markov model is simply a way of assigning transition probabilities between states: if I am in location A, what is the probability that I move to some other location? When we think about solving problems such as alcoholism in terms of transition probabilities, increasing the probability that someone moves to sobriety and decreasing the probability that they relapse, we reframe how we approach the issue.
Seen this way, one-off investments or efforts may not make a lasting impact; we should instead favor sustained initiatives that increase the probability of people moving in the direction we want.