# Definition:Lag

## Definition

Let $T$ be a time series.

A **lag** is a constant time interval between two timestamps of $T$.

Thus, for each observation $z_t$ of $T$, a pair of observations separated by a given lag $k$ can be formed:

$\tuple {z_t, z_{t + k} }$
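As an illustrative sketch (not part of the formal definition), the lagged pairs $\tuple {z_t, z_{t + k} }$ can be enumerated programmatically; the series values, the helper name `lagged_pairs`, and the chosen lag are hypothetical:

```python
def lagged_pairs(series, k):
    """Return all pairs (z_t, z_{t+k}) of observations separated by lag k."""
    # Only indices t with t + k inside the series give a valid pair.
    return [(series[t], series[t + k]) for t in range(len(series) - k)]

# Hypothetical time series observations:
z = [3.1, 2.7, 3.4, 3.0, 2.9, 3.2]

pairs = lagged_pairs(z, 2)  # pairs with lag k = 2
# pairs: [(3.1, 3.4), (2.7, 3.0), (3.4, 2.9), (3.0, 3.2)]
```

Such pairs are the raw material for the autocovariance and autocorrelation coefficients treated in the source cited below.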

## Sources

Part $\text {I}$: Stochastic Models and their Forecasting:
$2$: Autocorrelation Function and Spectrum of Stationary Processes:
$2.1$ Autocorrelation Properties of Stationary Models:
$2.1.2$ Stationary Stochastic Processes: Autocovariance and autocorrelation coefficients