In our previous chapters, we discussed stochastic modeling and Markov models in the stock market. We’ll begin our exploration with the simplest and most intuitive of these: the Markov chain. A detailed examination of other Markov models will follow in subsequent discussions.
To understand how a Markov Chain operates, let’s consider the NIFTY stock index as an example. The NIFTY index, like any stock index, can exhibit different behaviors on any given day.
The set of all possible states a system can occupy is referred to as its state space.
Let’s consider the dynamics of the NIFTY index, which could exhibit three distinct states on the following day: Upside (closing higher, in the green), Downside (closing lower, in the red), or Consolidation (closing roughly unchanged).
Using Markov chain theory, we can anticipate the state of the NIFTY index tomorrow based on its state today. The theory provides a mathematical framework for modeling the transition probabilities between different states over time.
For instance, if NIFTY has been ending in the green for a series of consecutive days, Markov chain theory can help quantify the probability of NIFTY ending in the green, ending in the red, or remaining unchanged tomorrow. This is done by calculating the transition probabilities from historical data and today’s state.
Furthermore, these transition probabilities can be represented in a transition matrix, which provides a systematic way to model the likely transitions between states. This matrix is a fundamental component of the Markov Chain theory, enabling investors and analysts to forecast future states of the market, thus aiding in informed decision-making.
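To make the estimation step concrete, here is a minimal Python sketch (Python is also what we will use in the next discussion). The closing prices, the state labels, and the ±0.1% band used to mark an “unchanged” day are hypothetical choices made for illustration, not part of the NIFTY example itself.

```python
import numpy as np
import pandas as pd

# Hypothetical daily closing prices of the NIFTY index (illustration only).
close = pd.Series([22000, 22150, 22100, 22100, 22310, 22290, 22305, 22500])

# Label each day's state from its percentage change over the previous close.
# The +/-0.1% band for "Consolidation" is an arbitrary illustrative threshold.
returns = close.pct_change().dropna()
states = pd.cut(returns, bins=[-np.inf, -0.001, 0.001, np.inf],
                labels=["Downside", "Consolidation", "Upside"])

# Count observed transitions (today's state -> tomorrow's state), then
# normalise each row so it becomes a conditional probability distribution.
counts = pd.crosstab(states.iloc[:-1].values, states.iloc[1:].values)
transition_matrix = counts.div(counts.sum(axis=1), axis=0)
print(transition_matrix)
```

With a long enough price history, this row-normalised table plays the role of the transition matrix discussed below.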
Given that our system includes distinct states, exhibits randomness, and adheres to the Markov property—that the future state is conditional only on the current state—it’s appropriate to model our system as a Markov chain. Furthermore, our modeling will be conducted in discrete time intervals.
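Formally, if \( X_t \) denotes the state of the NIFTY index on day \( t \), the Markov property reads:
$$
P(X_{t+1} = j \mid X_t = i, X_{t-1} = i_{t-1}, \ldots, X_0 = i_0) = P(X_{t+1} = j \mid X_t = i)
$$
In words: once today’s state is known, the earlier history adds no further information about tomorrow.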
For example: tomorrow, there is a 60% chance that NIFTY’s state will be Upside (by state, we mean the condition in which NIFTY will close), given that today its state is Downside. We use the term state because that is the convention.
Let’s say there is a 20% chance that tomorrow NIFTY’s state will be Upside again, given that today its state is also Upside. In the diagram, such a transition is represented with a self-pointing arrow (a loop from the state back to itself).
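Written in the same notation, these two hypothetical statements become:
$$
P(X_{t+1} = \text{Upside} \mid X_t = \text{Downside}) = 0.6, \qquad
P(X_{t+1} = \text{Upside} \mid X_t = \text{Upside}) = 0.2
$$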
We can depict this scenario with weighted arrows in a diagram: for the first example, the arrow starts from the current state (Downside) and points towards the future state (Upside), and each arrow carries its transition probability as a weight. This is also called a transition state diagram, and since we have three states, we have a three-state Markov chain. Each arrow represents a transition from one state to another; in the diagram here, you can see all the possible transitions. Taken together, the states and transitions make up our Markov chain.
States: There are three primary states depicted in the diagram: Upside, Consolidation, and Downside.
Transitions & Probabilities:
Transitions between states are indicated by arrows, and the probabilities of each transition are displayed as numbers adjacent to the arrows. Here’s a breakdown:
From Upside: 20% to Upside (the self-pointing arrow), 20% to Consolidation, and 60% to Downside.
From Consolidation: 30% to Upside, 50% to Consolidation, and 20% to Downside.
From Downside: 60% to Upside, 20% to Consolidation, and 20% to Downside.
Here, you can observe all potential hypothetical transitions between the different states of the NIFTY index.
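If you would like to reproduce such a diagram programmatically, here is a minimal sketch using the networkx and matplotlib libraries (an assumption on our part; the diagram does not have to be produced this way). The probabilities are the hypothetical values discussed above.

```python
import matplotlib.pyplot as plt
import networkx as nx

# Hypothetical transition probabilities of the three-state diagram.
transitions = {
    ("Upside", "Upside"): 0.2, ("Upside", "Consolidation"): 0.2, ("Upside", "Downside"): 0.6,
    ("Consolidation", "Upside"): 0.3, ("Consolidation", "Consolidation"): 0.5, ("Consolidation", "Downside"): 0.2,
    ("Downside", "Upside"): 0.6, ("Downside", "Consolidation"): 0.2, ("Downside", "Downside"): 0.2,
}

# Build a directed graph whose weighted edges are the transitions.
G = nx.DiGraph()
for (src, dst), prob in transitions.items():
    G.add_edge(src, dst, weight=prob)

# Draw the states, the arrows, and the probability attached to each arrow.
pos = nx.circular_layout(G)
nx.draw(G, pos, with_labels=True, node_size=2500, node_color="lightblue",
        arrows=True, connectionstyle="arc3,rad=0.15")
nx.draw_networkx_edge_labels(G, pos,
                             edge_labels=nx.get_edge_attributes(G, "weight"),
                             label_pos=0.3)
plt.show()
```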
Let us encode these probabilities into a transition matrix \( P \), where the entry \( P_{ij} \) is the probability of moving from state \( i \) today to state \( j \) tomorrow, with rows and columns ordered as Upside, Consolidation, Downside (so each row sums to 1):
$$
P = \begin{bmatrix}
P_{11} & P_{12} & P_{13} \\
P_{21} & P_{22} & P_{23} \\
P_{31} & P_{32} & P_{33} \\
\end{bmatrix}
=
\begin{bmatrix}
0.2 & 0.2 & 0.6 \\
0.3 & 0.5 & 0.2 \\
0.6 & 0.2 & 0.2 \\
\end{bmatrix}
$$
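As a quick sanity check, the same hypothetical matrix can be written as a NumPy array and verified to be row-stochastic, i.e. each row sums to 1:

```python
import numpy as np

# Hypothetical one-day transition matrix,
# rows and columns ordered as Upside, Consolidation, Downside.
P = np.array([
    [0.2, 0.2, 0.6],   # from Upside
    [0.3, 0.5, 0.2],   # from Consolidation
    [0.6, 0.2, 0.2],   # from Downside
])

# Every row is a conditional distribution over tomorrow's state.
assert np.allclose(P.sum(axis=1), 1.0)
```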
Our Markov chain is now fully described by a state transition diagram and a transition matrix denoted by \( P \). The transition matrix \( P \) is fundamental to our model because it predicts future states: its powers give the transition probabilities for multiple time steps ahead,
$$
P^{(n)} = \underbrace{P \times P \times \cdots \times P}_{n\ \text{times}} = P^{n}.
$$
This recursive multiplication of \( P \) outlines the potential future outcomes of the Markov model.
If we aim to predict the market condition two days forward, we are essentially calculating the probability distribution of the market states two steps ahead.
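As a minimal sketch of this two-step calculation (again with the hypothetical matrix above), we can square \( P \) with NumPy and read off the distribution for a chosen starting state:

```python
import numpy as np

# The same hypothetical one-day transition matrix as before
# (rows/columns ordered Upside, Consolidation, Downside).
P = np.array([[0.2, 0.2, 0.6],
              [0.3, 0.5, 0.2],
              [0.6, 0.2, 0.2]])

# Two-step transition probabilities: P squared.
P2 = np.linalg.matrix_power(P, 2)

# Suppose today NIFTY closed in Downside: a one-hot distribution over the states.
today = np.array([0.0, 0.0, 1.0])

# Probability distribution over Upside / Consolidation / Downside two days ahead.
print(today @ P2)
```

Row \( i \) of \( P^2 \) is the two-day-ahead distribution when today’s state is \( i \).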
Now that we know the basics of the Markov chain, let’s explore it in a more Pythonic way in our next discussion.