# Gauss-Markov Theorem and Multiple Regression

Meta

date: 2016-09-01

author: OctoMiao

summary: Gauss-Markov Theorem

## Gauss-Markov

1. The least squares estimates of the parameters have the smallest variance among all linear unbiased estimates.
2. An unbiased estimate is not always the best choice: accepting a little bias can reduce the variance.
3. Ridge regression is one such biased alternative.

### Proof of 1

1. Model is $\mathbf{y} = \mathbf{X}\boldsymbol\beta + \boldsymbol\epsilon$, with $E(\boldsymbol\epsilon) = 0$ and $\operatorname{Var}(\boldsymbol\epsilon) = \sigma^2 \mathbf{I}$.
2. Least squares estimate of $a^T\beta$: $a^T\hat\beta = a^T(\mathbf{X}^T\mathbf{X})^{-1}\mathbf{X}^T\mathbf{y}$.
3. This is unbiased: $E(a^T\hat\beta) = a^T\beta$.
4. Gauss-Markov theorem: If we have any other linear estimator $c^T\mathbf{y}$ with $E(c^T\mathbf{y}) = a^T\beta$, then $\operatorname{Var}(a^T\hat\beta) \leq \operatorname{Var}(c^T\mathbf{y})$.
5. To prove it we first write down the general form of a linear estimator. Question: is $c^T\mathbf{y}$ the general form of a linear estimator?
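The theorem above can be checked numerically. The sketch below (my own illustration, not from the original notes) compares the least squares estimate of a single slope $\beta$ against another linear unbiased estimator, the average of the ratios $y_i/x_i$; both are unbiased, but least squares has the smaller variance, as Gauss-Markov guarantees:

```python
import numpy as np

rng = np.random.default_rng(0)
n, beta, sigma = 20, 2.0, 1.0
x = rng.uniform(1.0, 3.0, size=n)  # fixed design, bounded away from 0

ols_est, ratio_est = [], []
for _ in range(5000):
    y = beta * x + rng.normal(0.0, sigma, size=n)
    ols_est.append(x @ y / (x @ x))   # least squares estimate of beta
    ratio_est.append(np.mean(y / x))  # another linear unbiased estimator

ols_est, ratio_est = np.array(ols_est), np.array(ratio_est)
# Both sampling means are close to the true beta (unbiasedness),
# but least squares has the smaller sampling variance:
print(ols_est.var() < ratio_est.var())  # True
```

Any weights $c$ with $E(c^T\mathbf{y}) = \beta$ give an unbiased estimator; the ratio average is just one convenient choice of such $c$.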

Useful functions for the proof:

1. Variance: $\operatorname{Var}(X) = E(X^2) - [E(X)]^2$.
2. On Wikipedia:
   1. MSE: $\operatorname{MSE}(\tilde\theta) = E(\tilde\theta - \theta)^2$.
   2. MSE decomposes as $\operatorname{MSE}(\tilde\theta) = \operatorname{Var}(\tilde\theta) + [E(\tilde\theta) - \theta]^2$; the second term is the squared bias.
3. Least squares is good, but we can sometimes trade some bias for a smaller variance.
4. Choices are variable subset selection and ridge regression.
   1. For a prediction $\tilde y_0 = x_0^T\tilde\beta$ at a new input $x_0$, the expected prediction error differs from the MSE of the estimator only by a constant: $E(Y_0 - \tilde y_0)^2 = \sigma^2 + \operatorname{MSE}(\tilde y_0)$ (Eq. 3.22).
   2. Question: why is the expected prediction error always larger than the MSE? Because it exceeds the MSE by exactly the irreducible noise $\sigma^2 \geq 0$.
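Point 3 above, that a biased estimator can beat least squares on MSE, can be demonstrated with ridge regression. The simulation below (my own sketch; the design, ridge penalty $\lambda = 5$, and coefficient values are illustrative choices, not from the notes) uses strongly correlated columns, where the least squares variance blows up and shrinkage pays off:

```python
import numpy as np

rng = np.random.default_rng(1)
n, p, lam, sigma = 30, 5, 5.0, 2.0
beta = np.full(p, 0.5)

# Correlated fixed design -> ill-conditioned X^T X, high OLS variance.
base = rng.normal(size=(n, 1))
X = base + 0.3 * rng.normal(size=(n, p))
XtX = X.T @ X
I = np.eye(p)

mse_ols = mse_ridge = 0.0
reps = 2000
for _ in range(reps):
    y = X @ beta + rng.normal(0.0, sigma, size=n)
    b_ols = np.linalg.solve(XtX, X.T @ y)             # unbiased, high variance
    b_ridge = np.linalg.solve(XtX + lam * I, X.T @ y)  # biased, lower variance
    mse_ols += np.sum((b_ols - beta) ** 2) / reps
    mse_ridge += np.sum((b_ridge - beta) ** 2) / reps

print(mse_ridge < mse_ols)  # True: the biased estimator wins on MSE here
```

The ridge estimate shrinks the coefficients toward zero, so it incurs some bias; with near-collinear columns the variance reduction dwarfs that bias, giving a smaller total MSE.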

## Multiple Regression

1. Model: $f(X) = \beta_0 + \sum_{j=1}^p X_j \beta_j$.
2. When the inputs are orthogonal, the features have no effect on each other's parameter estimates: each multiple regression coefficient equals the corresponding univariate least squares estimate.
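Point 2 can be verified directly. A sketch (my own, with an orthogonal design built via QR and arbitrary illustrative coefficients): fit all features jointly, then fit each feature alone, and check that the coefficients coincide.

```python
import numpy as np

rng = np.random.default_rng(2)
n, p = 40, 3

# Build an orthogonal design: the columns of Q are orthonormal.
Q, _ = np.linalg.qr(rng.normal(size=(n, p)))
X = Q
y = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(0.0, 0.1, size=n)

multi = np.linalg.solve(X.T @ X, X.T @ y)            # joint least squares fit
uni = np.array([xj @ y / (xj @ xj) for xj in X.T])   # one regression per feature

print(np.allclose(multi, uni))  # True: coefficients coincide
```

With orthogonal columns $\mathbf{X}^T\mathbf{X}$ is diagonal, so the joint normal equations decouple into $p$ independent univariate problems; with correlated columns this no longer holds.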

© 2018, Lei Ma. Created with Sphinx.