From the previous post, recall the least squares estimators:
1) Estimate of \beta_{1} : b_{1}=\frac{\sum(X_{i}-\bar{X})(Y_{i}-\bar{Y})} {\sum(X_{i}-\bar{X})^2} = \frac{\sum(X_{i}-\bar{X})Y_{i}}{\sum (X_{i}-\bar{X})^2}=\frac{\sum X_{i}Y_{i}-n\bar{X}\bar{Y}}{\sum X_{i}^2-n\bar{X}^2}
2) Estimate of \beta_{0}: b_{0}=\bar{Y}-b_{1}\bar{X}
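As a quick numerical sanity check, the two formulas above can be computed directly and compared against NumPy's built-in least squares fit. The data below are made up purely for illustration:

```python
import numpy as np

# Hypothetical example data (not from the post)
X = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
Y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

Xbar, Ybar = X.mean(), Y.mean()

# b1 = sum((Xi - Xbar)(Yi - Ybar)) / sum((Xi - Xbar)^2)
b1 = np.sum((X - Xbar) * (Y - Ybar)) / np.sum((X - Xbar) ** 2)
# b0 = Ybar - b1 * Xbar
b0 = Ybar - b1 * Xbar

# np.polyfit with deg=1 fits the same least squares line
slope, intercept = np.polyfit(X, Y, 1)

print(b1, b0)            # slope and intercept from the formulas
print(slope, intercept)  # should agree with the values above
```

Both computations give the same fitted line, which is a useful check that the closed-form formulas were transcribed correctly.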
They are unbiased!! Why? \Rightarrow We need to show that E[b_{1}]=\beta_{1} and E[b_{0}]=\beta_{0}
Proof
First of all, there are three facts about the weights k_{i} that we're going to use.
Write S_{XX}=\sum(X_{i}-\bar{X})^2 and k_{i}= \frac{X_{i}-\bar{X}}{S_{XX}}. Then \sum k_{i}= 0, \ \sum k_{i}X_{i}=1, \ \sum k_{i}^2=\frac{1}{S_{XX}}
(1) \sum k_{i}=0 \Rightarrow \sum k_{i}= \frac{\sum(X_{i}-\bar{X})}{S_{XX}}= \frac{\sum X_{i}-n\bar{X}}{S_{XX}}= \frac{n\bar{X}-n\bar{X}}{S_{XX}}=0
(2) \sum k_{i}X_{i}=1 \Rightarrow \sum k_{i}X_{i}= \frac{\sum(X_{i}-\bar{X})X_{i}}{S_{XX}}=\frac{\sum(X_{i}-\bar{X})(X_{i}-\bar{X})}{S_{XX}}=\frac{\sum(X_{i}-\bar{X})^2}{S_{XX}}= \frac{S_{XX}}{S_{XX}}=1, where we may replace X_{i} by X_{i}-\bar{X} because \sum(X_{i}-\bar{X})\bar{X}=\bar{X}\sum(X_{i}-\bar{X})=0 by (1)
(3) \sum k_{i}^2= \frac{1}{S_{XX}} \Rightarrow \sum k_{i}^2= \sum\left( \frac{X_{i}-\bar{X}}{S_{XX}}\right)^2=\frac{1}{S_{XX}^2}\sum (X_{i}-\bar{X})^2=\frac{S_{XX}}{S_{XX}^2}=\frac{1}{S_{XX}}
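These three identities are easy to verify numerically; the X values below are arbitrary made-up numbers:

```python
import numpy as np

# Hypothetical fixed design (any X values work)
X = np.array([1.0, 2.0, 4.0, 7.0, 11.0])
Sxx = np.sum((X - X.mean()) ** 2)

# The weights k_i = (X_i - Xbar) / S_XX
k = (X - X.mean()) / Sxx

print(np.sum(k))       # identity (1): sum of k_i is 0
print(np.sum(k * X))   # identity (2): sum of k_i * X_i is 1
print(np.sum(k ** 2))  # identity (3): sum of k_i^2 is 1 / S_XX
```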
Now, by using these results, we can finally show that the least squares estimators are unbiased!!
Proof
1) E[b_{1}]= \beta_{1}
\Rightarrow From the second form of b_{1} above, b_{1}=\sum k_{i}Y_{i}. So E[b_{1}]=E[\sum k_{i}Y_{i}]= \sum k_{i}E[Y_{i}], as each k_{i} is a constant! And since Y_{i}=\beta_{0}+\beta_{1}X_{i}+\varepsilon_{i} with E[\varepsilon_{i}]=0, we have E[Y_{i}]=\beta_{0}+\beta_{1}X_{i}.
Therefore E[b_{1}]=\sum k_{i}(\beta_{0}+\beta_{1}X_{i})=\beta_{0} \sum k_{i}+ \beta_{1} \sum k_{i}X_{i}=\beta_{0}\cdot 0+\beta_{1}\cdot 1=\beta_{1}, by (1) and (2)
2) E[b_{0}]=\beta_{0}
\Rightarrow E[b_{0}]=E[\bar{Y}-b_{1}\bar{X}]=E[\bar{Y}]-\bar{X}E[b_{1}]=(\beta_{0}+\beta_{1}\bar{X})-\bar{X}\beta_{1}=\beta_{0}, since E[\bar{Y}]=\frac{1}{n}\sum E[Y_{i}]=\beta_{0}+\beta_{1}\bar{X} and E[b_{1}]=\beta_{1} from part 1)
Notice that Y_{i} is a random variable, therefore \bar{Y} and \sum k_{i}Y_{i} are also random!!
However, \bar{X} is NOT random: the X_{i} are treated as fixed constants!
*E(aY)=a E(Y), where a is a constant!
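A small Monte Carlo simulation makes the unbiasedness concrete: holding the X_{i} fixed and drawing fresh errors \varepsilon_{i} in each replication, the averages of b_{1} and b_{0} settle near the true \beta_{1} and \beta_{0}. The parameter values and design below are made up for this sketch:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed true parameters for the simulation
beta0, beta1 = 2.0, 0.5
X = np.linspace(0, 10, 20)  # fixed design: same X in every replication
Sxx = np.sum((X - X.mean()) ** 2)

b0s, b1s = [], []
for _ in range(20_000):
    # Generate Y_i = beta0 + beta1 * X_i + eps_i with new errors each time
    Y = beta0 + beta1 * X + rng.normal(0.0, 1.0, size=X.size)
    b1 = np.sum((X - X.mean()) * Y) / Sxx
    b0 = Y.mean() - b1 * X.mean()
    b1s.append(b1)
    b0s.append(b0)

print(np.mean(b1s))  # close to beta1 = 0.5
print(np.mean(b0s))  # close to beta0 = 2.0
```

The individual estimates vary from replication to replication (they are random variables), but their long-run averages match the true parameters, which is exactly what E[b_{1}]=\beta_{1} and E[b_{0}]=\beta_{0} mean.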