Definition:Multiple Regression/Linear


Definition

Let $X$ and $Y$ be random variables.

Let $X_1, X_2, \ldots, X_p$ also be random variables, where $p \in \N_{>1}$.


Let the regression of $Y$ on $X_1, X_2, \ldots, X_p$ be expressible in the form:

$\expect {Y \mid x_1, x_2, \ldots, x_p} = \beta_0 + \beta_1 x_1 + \beta_2 x_2 + \ldots + \beta_p x_p$

Then $\expect {Y \mid x_1, x_2, \ldots, x_p}$ is known as multiple linear regression.
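
For instance, with $p = 2$ the above specializes to:

$\expect {Y \mid x_1, x_2} = \beta_0 + \beta_1 x_1 + \beta_2 x_2$

where $\beta_0$, $\beta_1$ and $\beta_2$ are the unknown regression parameters.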


General

Let $\mathbf x$ be a vector of $p$ explanatory variables from $X_1, X_2, \ldots, X_p$.

Let $\bstheta$ be a vector of $q$ unknown parameters.

Let the regression of $Y$ on $X_1, X_2, \ldots, X_p$ be expressible in the form:

$\expect {Y \mid \mathbf x} = \map f {\mathbf x, \bstheta}$

Then $\expect {Y \mid \mathbf x}$ is known as general multiple regression.
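
As an illustrative example (not part of the definition), taking $q = 3$ and:

$\map f {\mathbf x, \bstheta} = \theta_1 + \theta_2 e^{\theta_3 x_1}$

gives a regression of this general form which is not linear in the parameters $\theta_1, \theta_2, \theta_3$, and so is not a multiple linear regression.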


Examples

Polynomial

A multiple linear regression model can be used for a relationship which is not itself linear in a single explanatory variable $x$.

For example, if $x_i := x^i$ for $i = 1, 2, \ldots, p$, the regression function is in fact a polynomial of degree $p$ in $x$.
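
As a worked illustration of this substitution, the linear form above becomes:

$\expect {Y \mid x} = \beta_0 + \beta_1 x + \beta_2 x^2 + \ldots + \beta_p x^p$

which is linear in the parameters $\beta_0, \beta_1, \ldots, \beta_p$ even though it is not linear in $x$.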


Also known as

Linear regression is also known as straight-line regression.


Also see

  • Results about multiple linear regression can be found here.

