Stochastic Modeling

Markov Chain

Markov Chains in Stock Market

The Random Walk of Stock Market

Getting Equilibrium Matrix in Markov Chain

Wiener Process

Simulating Geometric Brownian Motion

Essentials of Markovian Models

Welcome to our detailed exploration of some of the most fundamental concepts in financial mathematics and stochastic processes. In this comprehensive guide, we walk through the definitions of Random Walks, Geometric Random Walks, Brownian Motion, Geometric Brownian Motion (GBM), Markov Chains, and Markov Processes.

Let’s simplify the terms “model,” “chain,” and “process” in the context of Markov theories:

**Markov Chain**

**Type**: Discrete-time stochastic process with a discrete state space.

**State Space**: Discrete; it describes a system that can be in one of a finite or countably infinite number of states.

**Characteristic**: The system transitions from one state to another, where the probability of each state depends only on the current state (Markov property).

**Usage**: Used in various fields for modeling sequences of events or states where the future state depends only on the current state, such as in queueing theory, economics, and genetics.

**Markov Property**: By definition, a Markov chain has the Markov property.
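The transition behavior described above can be sketched numerically. The following is a minimal example, assuming NumPy and an illustrative two-state "weather" chain (the transition probabilities are made up for demonstration, not taken from any dataset). Raising the transition matrix to a high power gives the equilibrium (stationary) distribution mentioned in the table of contents: every row converges to the same long-run probability vector.

```python
import numpy as np

# Illustrative two-state chain (states: Sunny, Rainy).
# Row i holds the probabilities of moving from state i to each state.
P = np.array([[0.9, 0.1],   # from Sunny: stay 0.9, switch 0.1
              [0.5, 0.5]])  # from Rainy: switch 0.5, stay 0.5

# Repeated application of P converges to the stationary distribution,
# which depends only on P, not on the starting state.
equilibrium = np.linalg.matrix_power(P, 50)[0]
print(equilibrium)  # long-run probabilities of Sunny and Rainy
```

For this matrix the stationary distribution can also be found by solving pi = pi P by hand, which gives pi = (5/6, 1/6); the matrix-power approach is just a quick numerical shortcut.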

**Markov Process**

**Type**: General term that can refer to both discrete-time and continuous-time stochastic processes.

**State Space**: Can be discrete or continuous, depending on the specific process.

**Characteristic**: The process has the Markov property if future states depend only on the current state and not on past states.

**Usage**: A Markov Process is a broad category that includes many different types of stochastic processes, including both Geometric Random Walk and GBM when they exhibit the Markov property.

**Markov Property**: By definition, a Markov process must have the Markov property.

So, a Markov Chain is a type of Markov Process.

A Markov Chain is always discrete in both time and state space, while a Markov Process can be either discrete or continuous.

- A **Markov Model** is the mathematical representation of a Markov Process.
- It uses equations and probabilities to describe the process.
- It’s the tool you use to study the properties of a Markov Process or to make predictions based on it.
- This can include both discrete-time models (Markov Chain Models) and continuous-time models.

A Markov Model is how we represent and study any Markov Process, including Markov Chains.

The term “model” is often used to refer to the theoretical, mathematical framework, while “chain” and “process” refer to the type of stochastic process being modeled.

**Random Walk**

**Type**: Discrete-time stochastic process.

**State Space**: Can be discrete or continuous; often used to describe positions on a lattice or line.

**Characteristic**: Consists of a succession of random steps. For example, in a simple random walk, at each step, the process moves either up or down (or left/right) by a fixed amount.

**Usage**: Commonly used to model various phenomena, from particle movements in physics to stock price movements in finance.

**Markov Property**: Yes, a random walk is a Markov process in the sense that the next position depends only on the current position, not the history of how it got there.
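A simple symmetric random walk is easy to simulate. The sketch below assumes NumPy and a fixed seed for reproducibility; at each step the walker moves up or down by one, and the next position depends only on the current one.

```python
import numpy as np

rng = np.random.default_rng(42)

# Simple symmetric random walk: each step is +1 or -1 with equal
# probability, and the position is the running sum of the steps.
steps = rng.choice([-1, 1], size=1000)
walk = np.cumsum(steps)
print(walk[-1])  # final position after 1000 steps
```

Because the state space here is the integers (discrete) and time advances in discrete steps, this particular walk is also a Markov Chain.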

**Geometric Random Walk**

**Type**: Discrete-time stochastic process.

**State Space**: Continuous; the stock prices can take any positive value.

**Characteristic**: The logarithm of the stock price follows a random walk, meaning the log returns are independent and identically distributed.

**Usage**: Often used to model stock prices in a simplified, discrete-time setting.

**Markov Property**: Yes, because the future price depends only on the current price and a proportional rate of return, which is random.
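The "log price follows a random walk" idea translates directly into code. This is a minimal sketch, assuming NumPy; the drift, volatility, and starting price are illustrative choices, not calibrated values. Each period's log return is an independent normal draw, and the price is the starting price times the exponential of the accumulated log returns.

```python
import numpy as np

rng = np.random.default_rng(0)

# Geometric random walk: i.i.d. normal log returns (illustrative
# daily drift mu and volatility sigma), compounded from price s0.
mu, sigma, s0, n = 0.0005, 0.01, 100.0, 252
log_returns = rng.normal(mu, sigma, size=n)
prices = s0 * np.exp(np.cumsum(log_returns))
print(prices[-1])  # simulated price after one trading year
```

Note that the prices stay strictly positive by construction, which is one reason this model is preferred over an additive random walk for stocks.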

**Brownian Motion (Wiener Process)**

**Type**: Continuous-time stochastic process.

**State Space**: Continuous; it represents the continuous movement of particles.

**Characteristic**: It models the random movement of particles suspended in a fluid, a process that is continuous in both time and space. In finance, it is often used as a model for asset prices in the form of Geometric Brownian Motion.

**Usage**: Essential in various fields, from physics (particle dynamics) to finance (modeling stock prices and as a foundation for the Black-Scholes model in option pricing).

**Markov Property**: Brownian motion is a Markov process because the future state depends only on the current state and not on the path taken to get there.
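In practice, a continuous-time Wiener process is approximated on a discrete grid. The sketch below, assuming NumPy, uses the defining property that increments over a time step dt are independent normal draws with mean 0 and variance dt.

```python
import numpy as np

rng = np.random.default_rng(1)

# Discretized standard Brownian motion on [0, T]: independent
# N(0, dt) increments, accumulated into a path starting at 0.
n, T = 1000, 1.0
dt = T / n
increments = rng.normal(0.0, np.sqrt(dt), size=n)
W = np.concatenate([[0.0], np.cumsum(increments)])
print(W[-1])  # value of the path at time T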

**Geometric Brownian Motion (GBM)**

**Type**: Continuous-time stochastic process.

**State Space**: Continuous.

**Characteristic**: Geometric Brownian Motion serves as the continuous-time counterpart to the discrete Geometric Random Walk. It is utilized for modeling stock prices, reflecting continuous paths with the inclusion of drift (representing average return) and volatility (representing the variability of returns).

**Usage**: Commonly used in financial modeling and for derivative pricing, like in the Black-Scholes model.

**Markov Property**: Yes, GBM has the Markov property because future values depend only on the current state, not the path taken to get there.
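GBM can be simulated exactly using its closed-form solution, S_t = S_0 exp((mu - sigma^2/2) t + sigma W_t). The sketch below assumes NumPy; the drift mu, volatility sigma, and starting price are illustrative parameters only.

```python
import numpy as np

rng = np.random.default_rng(7)

# Geometric Brownian Motion via its exact solution
#   S_t = S_0 * exp((mu - sigma^2 / 2) * t + sigma * W_t)
# with illustrative annual drift mu and volatility sigma.
mu, sigma, s0 = 0.05, 0.2, 100.0
n, T = 252, 1.0
dt = T / n
dW = rng.normal(0.0, np.sqrt(dt), size=n)   # Brownian increments
t = np.linspace(dt, T, n)                   # time grid after step 1
S = s0 * np.exp((mu - 0.5 * sigma**2) * t + sigma * np.cumsum(dW))
print(S[-1])  # simulated price at time T
```

Using the exact solution avoids the discretization error of an Euler scheme and keeps the simulated prices strictly positive, matching the continuous state space described above.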

Here’s a deeper look into how the theory is structured and its implications:

**Random Walk, Geometric Random Walk, Brownian Motion, and Geometric Brownian Motion (GBM) are all Markov Processes, but which of them are Markov Chains?**

Of the processes we’ve listed (Random Walk, Geometric Random Walk, Brownian Motion, and Geometric Brownian Motion), which ones qualify as Markov Chains is context-dependent.

Here’s a breakdown:

**Random Walk**:

If the random walk is in discrete time and has a discrete state space, it can be considered a Markov Chain.

*This is because the next state depends only on the current state (the Markov property) and both time and state space are discrete.*

**Geometric Random Walk**:

Like the standard random walk, a geometric random walk can be considered a Markov Chain if it is in discrete time and has a discrete state space.

*However, geometric random walks often have a continuous state space (since they model percentage changes in prices, which can vary continuously), making them less likely to be classified as Markov Chains in practical applications.*

**Brownian Motion** and **Geometric Brownian Motion (GBM)**:

These are continuous-time processes with continuous state spaces. *Therefore, they are not Markov Chains. Instead, they are examples of Markov Processes in continuous time. The distinction here is the continuous nature of these processes, both in terms of how time progresses and how the state changes.*

In summary, whether a process is a Markov Chain depends on the nature of its time steps and state space. Discrete time and discrete state space are key characteristics of a Markov Chain. Processes with continuous time or state space, like Brownian Motion and GBM, are Markov Processes but not Markov Chains.
