Linear regression model

Aug 18, 2021 | Assignment Help

1. Suppose that your linear regression model includes a constant term, so that in the linear regression model

   y = Xβ + ε ,   (1)

the matrix of explanatory variables X can be partitioned as X = [i X1]. The OLS estimator of β can thus be partitioned accordingly as b' = [b0 b1'], where b0 is the OLS estimator of the constant term and b1 is the OLS estimator of the slope coefficients.

(a) Find the inverse of the matrix X'X. (Hint: apply result (A-74).)

(b) Use partitioned regression to derive formulas for b1 and b0. (Note: Question 5 of Problem Set 1 asks you to do this without using partitioned regression.)

(c) Derive Var(b1 | X) using (B-87). How is your answer related to your answer to part (a)?

(d) What is Var(b0 | X)? (You should be able to answer this question without doing any further derivations, using your answers to parts (a)–(c).)

2. Suppose that instead of estimating the full regression model including the constant term, you have instead estimated a model in deviations from means; i.e., you have regressed M0y on M0X1. The estimating equation in this case is

   M0y = M0X1β1 + M0ε ,   (2)

Call the OLS estimator of β1 in this equation b̃1.

(a) Derive b̃1. How does it compare to b1 in question 1?

(b) Let the residuals vector for equation (2) be ẽ. Show that ẽ is identical to e, the vector of OLS residuals for equation (1).
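If you want to sanity-check the claims in these questions numerically, the following is a minimal sketch in Python/NumPy, not a model answer or derivation. It simulates a dataset, fits the full regression of y on [i X1], then regresses the demeaned y on the demeaned X1, and checks that the slope estimates and residuals coincide (question 2) and that b0 = ȳ − x̄'b1 (question 1(b)). The sample size, coefficients, and variable names are illustrative assumptions.

# Sketch: numerical check of the partitioned-regression / deviations-from-means results.
# Assumes simulated data; names (n, k, rng, etc.) are chosen only for illustration.
import numpy as np

rng = np.random.default_rng(0)
n, k = 200, 3                             # sample size and number of slope coefficients

X1 = rng.normal(size=(n, k))              # non-constant regressors
beta = np.array([1.0, 2.0, -0.5, 0.25])   # [constant, slopes] used to simulate y
i = np.ones((n, 1))                       # column of ones for the constant term
X = np.hstack([i, X1])                    # X = [i  X1]
y = X @ beta + rng.normal(size=n)         # y = Xβ + ε

# Full regression: b = (X'X)^{-1} X'y, residuals e = y - Xb
b = np.linalg.solve(X.T @ X, X.T @ y)
e = y - X @ b
b0, b1 = b[0], b[1:]

# Deviations from means: M0 applied to a vector subtracts its sample mean
M0y = y - y.mean()
M0X1 = X1 - X1.mean(axis=0)

# Demeaned regression: b1_tilde = (X1'M0X1)^{-1} X1'M0y, residuals e_tilde
b1_tilde = np.linalg.solve(M0X1.T @ M0X1, M0X1.T @ M0y)
e_tilde = M0y - M0X1 @ b1_tilde

print(np.allclose(b1, b1_tilde))                          # True: same slope estimates
print(np.allclose(e, e_tilde))                            # True: same residual vector
print(np.isclose(b0, y.mean() - X1.mean(axis=0) @ b1))    # True: b0 = ȳ - x̄'b1

Running the script prints True three times, which is exactly the pattern the two questions ask you to establish algebraically: the demeaned regression reproduces the slope estimates and residuals of the full regression, and the intercept is recovered from the sample means.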

This is a sample question

Need help with a similar assignment?

Place an order at Study Pirate

Attach all custom instructions.

Make payment. (The total price is based on the number of pages, academic level, and deadline.)

We’ll assign the paper to one of our writers and send it back once complete.