Meta

date: 2016-09-01

author: OctoMiao

summary: Gauss-Markov Theorem

- "**Least squares** estimates of the parameters have the smallest variance among all linear unbiased estimates."
- **Unbiased estimation** is not always good; e.g., ridge regression deliberately introduces bias.

- Model: $\mathbf{y} = \mathbf{X} \beta + \epsilon$, with $\mathrm{E}(\epsilon) = 0$ and $\mathrm{Var}(\epsilon) = \sigma^2 \mathbf{I}$.
- Least squares estimate of $\theta = a^T \beta$: $\hat\theta = a^T \hat\beta = a^T (\mathbf{X}^T \mathbf{X})^{-1} \mathbf{X}^T \mathbf{y}$.
- This is unbiased: $\mathrm{E}(a^T \hat\beta) = a^T \beta$.
- Gauss-Markov theorem: if we have any other linear estimator $\tilde\theta = \mathbf{c}^T \mathbf{y}$ that is also unbiased, $\mathrm{E}(\mathbf{c}^T \mathbf{y}) = a^T \beta$, then $\mathrm{Var}(a^T \hat\beta) \leq \mathrm{Var}(\mathbf{c}^T \mathbf{y})$.
- To prove it we first write down the general form of a linear estimator.
  **Question:** is $\tilde\theta = \mathbf{c}^T \mathbf{y}$ the general form of a linear estimator?
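The unbiasedness and minimum-variance claims can be checked numerically. Below is a minimal NumPy sketch (my own illustration, not from the original notes): it simulates many data sets, computes $a^T \hat\beta$, and compares its variance against another unbiased linear estimator, here a hypothetical one that fits on only half of the data.

```python
import numpy as np

rng = np.random.default_rng(0)

n, p = 50, 3
X = rng.normal(size=(n, p))          # fixed design matrix
beta = np.array([1.0, -2.0, 0.5])    # true parameters
a = np.array([1.0, 1.0, 1.0])        # linear combination of interest

estimates, estimates_other = [], []
for _ in range(2000):
    y = X @ beta + rng.normal(size=n)                 # y = X beta + eps
    beta_hat = np.linalg.lstsq(X, y, rcond=None)[0]   # full-data OLS
    estimates.append(a @ beta_hat)
    # another linear, unbiased estimator: OLS on the first half only
    beta_half = np.linalg.lstsq(X[:25], y[:25], rcond=None)[0]
    estimates_other.append(a @ beta_half)

print(np.mean(estimates), a @ beta)   # both close to a^T beta = -0.5
print(np.var(estimates), np.var(estimates_other))
```

Both estimators average to $a^T \beta$ (unbiased), but the full-data least squares estimate has the smaller variance, as the theorem predicts.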

Useful functions for the proof:

- Variance: $\mathrm{Var}(\tilde\theta) = \mathrm{E}\!\left[(\tilde\theta - \mathrm{E}\tilde\theta)^2\right]$.
  - See the Wikipedia article on variance.
- MSE: $\mathrm{MSE}(\tilde\theta) = \mathrm{E}(\tilde\theta - \theta)^2 = \mathrm{Var}(\tilde\theta) + \left[\mathrm{E}(\tilde\theta) - \theta\right]^2$.
  - In the MSE, the first term is the variance and the second term is the squared bias.
- Least squares is good, but sometimes we can trade some bias for a smaller variance.
  - Possible choices: variable subset selection and ridge regression.
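The bias-variance trade can be made concrete with a small simulation (my own sketch; the design, penalty $\lambda = 5$, and sample sizes are arbitrary choices): with nearly collinear inputs, ordinary least squares is unbiased but has huge variance, while ridge regression accepts some bias and ends up with a smaller total MSE.

```python
import numpy as np

rng = np.random.default_rng(1)

n, p, lam = 30, 2, 5.0
z = rng.normal(size=n)
X = np.column_stack([z, z + 0.1 * rng.normal(size=n)])  # highly correlated columns
beta = np.array([1.0, 1.0])
I = np.eye(p)

ols_err, ridge_err = [], []
for _ in range(2000):
    y = X @ beta + rng.normal(size=n)
    b_ols = np.linalg.solve(X.T @ X, X.T @ y)             # unbiased, high variance
    b_ridge = np.linalg.solve(X.T @ X + lam * I, X.T @ y)  # biased, low variance
    ols_err.append(np.sum((b_ols - beta) ** 2))
    ridge_err.append(np.sum((b_ridge - beta) ** 2))

print(np.mean(ols_err), np.mean(ridge_err))  # ridge MSE is smaller here
```

The comparison only goes this way when the variance reduction outweighs the squared bias; with well-conditioned inputs OLS can win.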

- Suppose a new observation $Y_0 = f(x_0) + \epsilon_0$ differs from the model value by noise $\epsilon_0$ with variance $\sigma^2$. The expected prediction error using an estimate $\tilde f(x_0)$ is the MSE shifted by a constant (Eq. 3.22): $\mathrm{E}(Y_0 - \tilde f(x_0))^2 = \sigma^2 + \mathrm{MSE}(\tilde f(x_0))$.
- So the prediction error is always larger than the MSE, by exactly the irreducible noise $\sigma^2$; minimizing expected prediction error is the same as minimizing the MSE.
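Eq. 3.22 is easy to verify by simulation. The sketch below (my own check, with a hypothetical biased low-variance estimator of a scalar $\theta$) confirms that the mean squared prediction error equals $\sigma^2$ plus the estimator's MSE.

```python
import numpy as np

rng = np.random.default_rng(2)

theta, sigma = 2.0, 1.0
n_rep = 200_000

# a deliberately biased, low-variance estimator of theta (hypothetical):
# bias 0.3, standard deviation 0.1
theta_tilde = theta + 0.3 + 0.1 * rng.normal(size=n_rep)

mse = np.mean((theta_tilde - theta) ** 2)    # ~ bias^2 + variance = 0.09 + 0.01
y0 = theta + sigma * rng.normal(size=n_rep)  # new observations Y0 = theta + eps0
pred_err = np.mean((y0 - theta_tilde) ** 2)

print(mse, pred_err)  # pred_err is close to sigma^2 + mse
```

The identity holds because $\epsilon_0$ is independent of the estimator, so the cross term vanishes when expanding the square.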

- Model with multiple outputs: $\mathbf{Y} = \mathbf{X} \mathbf{B} + \mathbf{E}$.
- The least squares estimate $\hat{\mathbf{B}} = (\mathbf{X}^T \mathbf{X})^{-1} \mathbf{X}^T \mathbf{Y}$ fits each output independently: the estimates for different outputs do not mix.
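A quick NumPy check of this decoupling (my own sketch, with arbitrary sizes): fitting all outputs at once gives exactly the same coefficients as fitting each output column separately.

```python
import numpy as np

rng = np.random.default_rng(3)

n, p, K = 40, 3, 2
X = rng.normal(size=(n, p))
Y = rng.normal(size=(n, K))   # K output columns

# all outputs at once: B_hat = (X^T X)^{-1} X^T Y
B_joint = np.linalg.lstsq(X, Y, rcond=None)[0]

# one output at a time
B_cols = np.column_stack(
    [np.linalg.lstsq(X, Y[:, k], rcond=None)[0] for k in range(K)]
)

same = np.allclose(B_joint, B_cols)
print(same)
```

This would no longer hold if the errors across outputs were correlated and one used a weighted (generalized) least squares criterion instead.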

© 2018, Lei Ma