Condition for Method of Least Squares to match Maximum Likelihood Estimation
Theorem
Let $X_n$ and $Y_n$ be sets of $n$ paired observations of random variables $X$ and $Y$.
For the paired observations $\tuple {x_i, y_i}$:
- let $x_i$ be error-free
- let the errors in $y_i$ be independent and identically normally distributed with expectation $0$.
Then the method of least squares yields a result equivalent to the maximum likelihood estimator.
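The equivalence can be checked numerically. The sketch below (assumed illustrative data, a simple linear model $y_i = b x_i + \varepsilon_i$ with known error variance) compares, over a grid of candidate slopes, the one minimizing the sum of squared residuals with the one maximizing the Gaussian log-likelihood; the two criteria select the same slope because the log-likelihood is, up to constants, a negative multiple of the sum of squares.

```python
import numpy as np

# Illustrative data (assumption, not from the theorem statement):
# y_i = 2.5 * x_i + e_i with e_i i.i.d. N(0, 1), x_i error-free.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 50)
y = 2.5 * x + rng.normal(0.0, 1.0, size=x.size)

# Grid of candidate slopes for the model y = b * x.
slopes = np.linspace(2.0, 3.0, 1001)
resid = y[None, :] - slopes[:, None] * x[None, :]

# Least squares criterion: sum of squared residuals per candidate slope.
sse = (resid ** 2).sum(axis=1)

# Gaussian log-likelihood with sigma^2 fixed at 1, dropping additive
# constants: -(1/2) * sum of squared residuals.
loglik = -0.5 * sse

b_ls = slopes[np.argmin(sse)]     # least squares estimate
b_ml = slopes[np.argmax(loglik)]  # maximum likelihood estimate
print(b_ls, b_ml)
```

Both selections land on the same grid point, since maximizing $-\tfrac 1 2 S$ is the same as minimizing $S$.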
Proof
This theorem requires a proof. You can help $\mathsf{Pr} \infty \mathsf{fWiki}$ by crafting such a proof.
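A sketch of the standard argument, assuming a model of the form $y_i = f \tuple {x_i; \beta} + \varepsilon_i$ (the model function $f$ and parameter $\beta$ are notational assumptions, not fixed by the theorem statement):

```latex
% Errors are i.i.d. normal: \varepsilon_i \sim N(0, \sigma^2).
% The likelihood of the observations y_1, \ldots, y_n is:
L(\beta, \sigma^2)
  = \prod_{i = 1}^n \frac{1}{\sqrt{2 \pi \sigma^2}}
    \exp \left( -\frac{(y_i - f(x_i; \beta))^2}{2 \sigma^2} \right)
% Taking logarithms:
\ln L(\beta, \sigma^2)
  = -\frac{n}{2} \ln (2 \pi \sigma^2)
    - \frac{1}{2 \sigma^2} \sum_{i = 1}^n (y_i - f(x_i; \beta))^2
% For fixed \sigma^2 > 0, the first term is constant in \beta, so
% maximizing \ln L over \beta is equivalent to minimizing
% \sum_i (y_i - f(x_i; \beta))^2, the least squares criterion.
```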
Sources
- 1998: David Nelson: The Penguin Dictionary of Mathematics (2nd ed.) ... (previous) ... (next): least squares: 2.
- 2008: David Nelson: The Penguin Dictionary of Mathematics (4th ed.) ... (previous) ... (next): least squares: 2.