Category:Markov Processes
This category contains results about Markov Processes.
Definitions specific to this category can be found in Definitions/Markov Processes.
Let $X$ be a random variable which varies over time.
Let $X_t$ denote the value of $X$ at time $t$.
Then $X$ is a Markov process if and only if the future behaviour of $X$ depends only on the value of $X$ at the present time, and not on its past history: for any times $t_1 < t_2 < \cdots < t_n < t$ and any value $x$:
$\Pr \left({X_t \le x \mid X_{t_1}, X_{t_2}, \ldots, X_{t_n}}\right) = \Pr \left({X_t \le x \mid X_{t_n}}\right)$
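The Markov property can be illustrated with a minimal simulation sketch. The two-state chain below, with states `"A"` and `"B"` and made-up transition probabilities, is purely an assumed example, not part of the definition above; the point is that `step` receives only the current state, so the next value cannot depend on the path taken to reach it.

```python
import random

# Hypothetical two-state Markov chain; the transition
# probabilities here are illustrative assumptions.
TRANSITIONS = {
    "A": [("A", 0.7), ("B", 0.3)],
    "B": [("A", 0.4), ("B", 0.6)],
}

def step(state, rng=random):
    """Sample the next state.

    The next state depends only on the current state
    (the Markov property), not on the earlier history.
    """
    r = rng.random()
    cumulative = 0.0
    for nxt, p in TRANSITIONS[state]:
        cumulative += p
        if r < cumulative:
            return nxt
    return TRANSITIONS[state][-1][0]

def simulate(start, n, rng=random):
    """Generate a sample path X_0, X_1, ..., X_n from `start`."""
    path = [start]
    for _ in range(n):
        path.append(step(path[-1], rng))
    return path
```

A usage example: `simulate("A", 100)` returns a list of 101 states, each drawn using only the state immediately before it.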