Regression - Deriving the Ordinary Least Squares Estimates

3 important questions on Regression - Deriving the Ordinary Least Squares Estimates

How can the ordinary least squares estimates be derived?

(1) We would like to estimate beta0 and beta1 using a random sample from the population, where y(i) = beta0 + beta1 * x(i) + u(i)
(2) The zero conditional mean assumption, E[u|x] = 0, implies E[u] = 0 and cov(x, u) = E[xu] = 0
Substituting u = y - beta0 - beta1 * x gives the two population moment conditions
E[y - beta0 - beta1 * x] = 0 and E[x(y - beta0 - beta1 * x)] = 0
(3) The ordinary least squares (OLS) estimates ^beta0 and ^beta1 are the values that satisfy the sample analogues of these two moment conditions, with sample averages replacing expectations
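Solving the two sample moment conditions jointly yields the familiar closed-form OLS estimates. A minimal sketch in plain Python, using a small hypothetical sample (the x and y values below are illustrative, not from the source):

```python
# Hypothetical sample of paired observations (x(i), y(i))
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 8.1, 9.8]

n = len(x)
x_bar = sum(x) / n  # sample mean of x
y_bar = sum(y) / n  # sample mean of y

# The sample moment conditions
#   (1/n) * sum(y(i) - b0 - b1*x(i)) = 0
#   (1/n) * sum(x(i) * (y(i) - b0 - b1*x(i))) = 0
# solve to give the closed-form OLS estimates:
beta1_hat = (
    sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))
    / sum((xi - x_bar) ** 2 for xi in x)
)
beta0_hat = y_bar - beta1_hat * x_bar
```

Note that beta1_hat is the sample covariance of x and y divided by the sample variance of x, mirroring the population condition cov(x, u) = 0.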

What do the OLS estimates minimise?

They minimise the sum of squared residuals, SSR = sum of û(i)^2. The sample moment conditions are the first order conditions of this minimisation problem. Here û(i) = y(i) - ^y(i) is the residual for observation i, and ^y(i) is the fitted value for observation i.
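The connection can be checked numerically: at the OLS estimates the residuals satisfy both first order conditions, and moving away from those estimates increases the SSR. A small sketch, reusing the same hypothetical sample and estimates as above (all values illustrative):

```python
# Hypothetical sample and its OLS estimates
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 8.1, 9.8]
b0, b1 = 0.14, 1.96  # OLS estimates for this sample

def ssr(b0, b1):
    """Sum of squared residuals at candidate coefficients (b0, b1)."""
    return sum((yi - b0 - b1 * xi) ** 2 for xi, yi in zip(x, y))

y_hat = [b0 + b1 * xi for xi in x]                 # fitted values ^y(i)
u_hat = [yi - yh for yi, yh in zip(y, y_hat)]      # residuals û(i)

# First order conditions: residuals sum to (numerically) zero
# and are (numerically) uncorrelated with x
foc1 = sum(u_hat)
foc2 = sum(xi * ui for xi, ui in zip(x, u_hat))

# Perturbing either coefficient away from the OLS values raises the SSR
worse_b0 = ssr(b0 + 0.1, b1)
worse_b1 = ssr(b0, b1 + 0.1)
```

Both foc1 and foc2 come out at zero up to floating-point error, and worse_b0 and worse_b1 both exceed ssr(b0, b1), consistent with the OLS estimates being the minimiser.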

What is the formula of the OLS regression line / sample regression function?

^y = ^beta0 + ^beta1 * x, where ^beta0 and ^beta1 are the estimated coefficients and ^y is the fitted value. The hats indicate that the line is estimated from a sample rather than taken from the population.
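Once estimated, the sample regression function gives a fitted value for any x, including x values outside the original sample. A minimal sketch, again using the hypothetical estimates from the earlier sample:

```python
# Hypothetical OLS estimates from the earlier sample
beta0_hat, beta1_hat = 0.14, 1.96

def predict(x):
    """Sample regression function: ^y = ^beta0 + ^beta1 * x."""
    return beta0_hat + beta1_hat * x

y_new = predict(6.0)  # fitted value at a new point x = 6
```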
