1. Consider the following model:
$$y_i = \beta_1 x_{i1} + \beta_2 x_{i2} + \varepsilon_i, \qquad i = 1, 2, \ldots, n$$
Derive the least squares (OLS) estimators $\hat{\beta}_1$ and $\hat{\beta}_2$. What happens to the estimators when $x_{i2} = k x_{i1}$ for all $i$, where $k$ is a fixed constant?
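The second part of the question can be previewed numerically. The sketch below (illustrative data and names, not part of the original problem) solves the normal equations $(X'X)\hat{\beta} = X'y$ for the two-regressor model, then shows that when $x_{i2} = k x_{i1}$ the matrix $X'X$ is singular, so no unique OLS solution exists:

```python
import numpy as np

# Illustrative sketch: fit the no-intercept model y = b1*x1 + b2*x2
# by solving the normal equations (X'X) b = X'y, then show that X'X
# loses rank when x2 = k*x1 (perfect collinearity).
rng = np.random.default_rng(0)
n = 50
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 2.0 * x1 - 1.0 * x2 + 0.1 * rng.normal(size=n)

X = np.column_stack([x1, x2])
b_hat = np.linalg.solve(X.T @ X, X.T @ y)   # OLS via the normal equations
print(b_hat)                                # close to the true (2, -1)

# Impose x2 = k*x1: X'X is 2x2 but has rank 1, i.e. it is singular,
# and the normal equations no longer pin down b1 and b2 separately.
k = 3.0
Xc = np.column_stack([x1, k * x1])
print(np.linalg.matrix_rank(Xc.T @ Xc))     # 1
```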
2. Consider the following two-variable linear regression model:
$$y_i = \alpha + \beta x_i + \varepsilon_i, \qquad i = 1, 2, \ldots, n \tag{1}$$
where the $x_i$'s are fixed in repeated random sampling. Consider its OLS estimator $\hat{\alpha}$. We derived in one of the classes that it could be expressed as a linear function of the $y_i$'s:
$$\hat{\alpha} = \sum_{i=1}^{n} v_i y_i, \tag{2}$$
where
$$v_i = \frac{1}{n} - \frac{\bar{x}\,(x_i - \bar{x})}{\sum_{j=1}^{n} (x_j - \bar{x})^2}. \tag{3}$$
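A quick numerical check (illustrative data; the weight formula is the standard OLS-intercept decomposition stated above) that the usual intercept estimate $\hat{\alpha} = \bar{y} - \hat{\beta}\bar{x}$ coincides with the linear combination $\sum_i v_i y_i$:

```python
import numpy as np

# Check that the textbook OLS intercept equals sum_i v_i * y_i with
# v_i = 1/n - xbar*(x_i - xbar)/sum_j (x_j - xbar)^2.
rng = np.random.default_rng(1)
n = 30
x = rng.normal(size=n)
y = 1.5 + 0.7 * x + rng.normal(size=n)

xbar = x.mean()
Sxx = np.sum((x - xbar) ** 2)
beta_hat = np.sum((x - xbar) * (y - y.mean())) / Sxx
alpha_hat = y.mean() - beta_hat * xbar       # OLS intercept, usual form

v = 1.0 / n - xbar * (x - xbar) / Sxx        # the weights v_i
print(alpha_hat, np.sum(v * y))              # the two agree
```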
Define an arbitrary linear estimator of $\alpha$:
$$\alpha^* = \sum_{i=1}^{n} a_i y_i, \tag{4}$$
where the $a_i$'s are non-stochastic weights. We also saw in class that, if $\alpha^*$ is an unbiased estimator, the variance of $\alpha^*$ is:
$$\operatorname{Var}(\alpha^*) = \sigma^2 \sum_{i=1}^{n} a_i^2. \tag{5}$$
Demonstrate that minimizing (5) subject to the conditions for unbiasedness gives the OLS weights of (3) for the $a_i$. [Hint: use constrained optimization.]
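The hinted optimization can be previewed numerically. For a quadratic objective with linear equality constraints, the Lagrangian first-order conditions reduce to the minimum-norm solution $a = C'(CC')^{-1}d$ of the constraint system $Ca = d$; the sketch below (illustrative data, not part of the problem) uses that closed form and compares the minimizer with the OLS weights $v_i$:

```python
import numpy as np

# Minimize sigma^2 * sum(a_i^2) subject to the unbiasedness conditions
# sum(a_i) = 1 and sum(a_i * x_i) = 0.  The positive constant sigma^2
# does not affect the minimizer, so this is a minimum-norm problem
# C a = d with solution a = C'(CC')^{-1} d.
rng = np.random.default_rng(2)
n = 12
x = rng.normal(size=n)

C = np.vstack([np.ones(n), x])               # constraint matrix
d = np.array([1.0, 0.0])                     # sum(a) = 1, sum(a*x) = 0
a = np.asarray(C.T @ np.linalg.solve(C @ C.T, d))  # constrained minimizer

xbar = x.mean()
v = 1.0 / n - xbar * (x - xbar) / np.sum((x - xbar) ** 2)
print(np.max(np.abs(a - v)))                 # ~0: minimizer equals the OLS weights
```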