Application of Markov Chains in Stock Market


In our last chapter, we discussed stochastic modeling in the stock market.

A Markov model is a stochastic model used to model pseudo-randomly changing systems. There are four types of Markov models. Let's start by naively describing how the simplest of them, the Markov chain, works. We shall revisit the concept of Markov models in detail later.

The States of a Markov Chain

Let's start with an example of NIFTY. Tomorrow, NIFTY can be in one of three states – 

  • It can end in green (close higher). 
  • It can end in red (close lower). 
  • It can end flat (close at the same level). 

Note – On any given day, NIFTY will be in one of these three states. Now, as per the theory of the Markov chain, we assume that tomorrow's state of NIFTY depends only on today's state.

So, what happens today is dependent on yesterday's state, and so on! 

[Figure: Markov chain in the stock market]

In other words – there is a way to predict what the state of NIFTY will be tomorrow if you know its state today.

Example of a Markov Chain in Stock Market

For example – 

Tomorrow, there is a 60% chance that NIFTY's state will be Upside, given that today its state is Downside. (By state in this context, we mean the direction in which NIFTY will close.) We are using the term state because that is the convention.

Let's represent this in the diagram with weighted arrows. Each arrow originates from the current state and points to the future state.

Another – 

Let's say there is a 20% chance that tomorrow NIFTY's state will be Upside again if today its state is Upside.

You can see it is represented with a self-pointing arrow (a self-loop).

[Figure: Markov chain with all transitions between NIFTY states]

Each arrow is called a transition from one state to another. In the diagram here, you can see all the possible transitions. This diagram is called a Markov chain.
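
To make the diagram concrete, here is a minimal Python sketch of the chain as a transition table. Only Down → Up = 0.6 and Up → Up = 0.2 come from the examples above; the remaining probabilities are illustrative assumptions, chosen so that each row sums to 1 and so that Neutral → Up matches the 50% answer in the worked example further below.

```python
# A minimal sketch of the NIFTY Markov chain as a transition table.
# Only Down -> Up = 0.6 and Up -> Up = 0.2 come from the text above;
# the other values are illustrative assumptions (Neutral -> Up = 0.5
# matches the worked example later in this article).
transitions = {
    "Up":      {"Up": 0.2, "Down": 0.5, "Neutral": 0.3},
    "Down":    {"Up": 0.6, "Down": 0.3, "Neutral": 0.1},
    "Neutral": {"Up": 0.5, "Down": 0.3, "Neutral": 0.2},
}

# Each outer key is today's state; each inner key is tomorrow's state.
print(transitions["Down"]["Up"])  # 0.6 -> chance of Up given Down today
```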

Pseudo-randomness – 

Markov models are used to explain random processes that depend on their current state. So, they characterize processes that are neither completely random nor fully independent. That's why the term pseudo-random is used in the definition.
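
To see what "random, but state-dependent" means in practice, here is a short sketch that samples tomorrow's state conditioned on today's, reusing the assumed transitions table above:

```python
import random

# Tomorrow's state is random, but its distribution depends on
# today's state -- it is not an independent coin flip.
def next_state(today: str) -> str:
    outcomes = list(transitions[today])
    weights = [transitions[today][s] for s in outcomes]
    return random.choices(outcomes, weights=weights)[0]

# Simulate a short, purely hypothetical run of trading days.
state = "Neutral"
for day in range(5):
    state = next_state(state)
    print(day + 1, state)
```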

Note on Probability Theory – 

The sum of the weights of the outgoing arrows from any state is 1. This has to be true because they represent probabilities, and for probabilities to make sense, they must add up to 1. Now, there are some special cases of Markov chains, but we are not extending our discussion there for now.
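
This rule is easy to check in code. Continuing with the assumed transitions table, the following sketch asserts that every state's outgoing probabilities sum to 1:

```python
# Sanity check: the outgoing probabilities from every state sum to 1.
for state, row in transitions.items():
    total = sum(row.values())
    assert abs(total - 1.0) < 1e-9, f"Row for {state} sums to {total}"
print("All rows sum to 1.")
```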

The Markov Property – Memorylessness

The Markov property means that the evolution of the Markov process in the future depends only on the present state and does not depend on past history.

The Markov process does not remember the past if the present state is given. Hence, the Markov process is called a process with the memoryless property.

Example –

Suppose NIFTY was up on day x1, down on day x2, down on day x3, and neutral on day x4. How should we determine the probability that the state on day x5 will be up? (x1, x2, x3, x4 are assumed to be four consecutive trading days. And, by neutral, we mean a consolidation phase.)

Answer –

As per the Markov property, to get the state of day x5, we only need to know the state of day x4. If you look at the diagram of the Markov chain above, the answer is 50%.
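
In code, the memoryless property means we index the transition table with the latest state only; the rest of the history is ignored. A sketch, again using the assumed table above (where Neutral → Up was set to 0.5 to match this answer):

```python
# x1..x4: Up, Down, Down, Neutral -- only x4 matters for x5.
history = ["Up", "Down", "Down", "Neutral"]

# P(x5 = Up | x4 = Neutral); x1..x3 play no role in the lookup.
p_up_tomorrow = transitions[history[-1]]["Up"]
print(p_up_tomorrow)  # 0.5
```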

Now that we know the basics of the Markov chain, let's explore what our Markov chain can do in a more Pythonic way in our next discussion.
