Understanding the properties of least squares estimators is crucial for interpreting regression results. It allows students to assess the reliability and accuracy of predictive models for outcomes like GPA, job offers, or salaries.
The general multiple linear regression model in matrix form is:

Y = Xβ + ε

where Y is the n × 1 response vector, X is the n × p design matrix (with a leading column of 1s for the intercept), β is the p × 1 coefficient vector, and ε is the n × 1 error vector with E(ε) = 0 and Var(ε) = σ²I.

The least squares estimator minimizes the sum of squared residuals:

Q(β) = (Y − Xβ)ᵀ(Y − Xβ)

and the minimizer is β̂ = (XᵀX)⁻¹XᵀY, provided XᵀX is invertible.
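A minimal NumPy sketch of this computation follows; the design matrix X, the response Y, and all numeric values are made-up placeholders chosen only to illustrate the formula, not data from these notes.

```python
import numpy as np

# Hypothetical data: n = 5 observations, an intercept plus two predictors.
# All numbers are invented purely for illustration.
X = np.array([
    [1.0, 2.0, 3.0],
    [1.0, 1.0, 5.0],
    [1.0, 4.0, 2.0],
    [1.0, 3.0, 4.0],
    [1.0, 5.0, 1.0],
])
Y = np.array([2.8, 3.1, 2.5, 3.4, 2.2])

# beta_hat = (X'X)^(-1) X'Y, computed by solving the normal equations
# (numerically preferable to forming the inverse explicitly).
beta_hat = np.linalg.solve(X.T @ X, X.T @ Y)
print(beta_hat)
```

In practice, a routine such as `np.linalg.lstsq` (or a regression library) is often preferred because it handles poorly conditioned design matrices more robustly.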
The error variance is estimated by:

s² = SSE / (n − p)

where SSE = ∑(Yᵢ − Ŷᵢ)² is the sum of squared residuals, n is the number of observations, and p is the number of estimated coefficients (including the intercept). This estimator s² is also called the Mean Squared Error (MSE).
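A short sketch of this step, reusing the same made-up data as above:

```python
import numpy as np

# Same invented data as in the previous sketch (n = 5, intercept plus two predictors).
X = np.array([[1.0, 2.0, 3.0], [1.0, 1.0, 5.0], [1.0, 4.0, 2.0],
              [1.0, 3.0, 4.0], [1.0, 5.0, 1.0]])
Y = np.array([2.8, 3.1, 2.5, 3.4, 2.2])
n, p = X.shape                          # n = 5 observations, p = 3 coefficients

beta_hat = np.linalg.solve(X.T @ X, X.T @ Y)
Y_hat = X @ beta_hat                    # fitted values
sse = np.sum((Y - Y_hat) ** 2)          # SSE = sum of squared residuals
mse = sse / (n - p)                     # s^2 = SSE / (n - p), the MSE
print(sse, mse)
```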
The covariance matrix for the estimated coefficients is:

Var(β̂) = σ²(XᵀX)⁻¹

which is estimated by s²(XᵀX)⁻¹. Its diagonal entries are the variances of the individual coefficient estimates (their square roots are the standard errors), and its off-diagonal entries are the covariances between pairs of estimates, so it captures both the variability of each coefficient and the correlation between coefficient estimates.
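The sketch below estimates this matrix with the same invented data; the values are placeholders, and the standard errors are simply the square roots of the diagonal.

```python
import numpy as np

# Same made-up data again; here the goal is the estimated covariance matrix.
X = np.array([[1.0, 2.0, 3.0], [1.0, 1.0, 5.0], [1.0, 4.0, 2.0],
              [1.0, 3.0, 4.0], [1.0, 5.0, 1.0]])
Y = np.array([2.8, 3.1, 2.5, 3.4, 2.2])
n, p = X.shape

beta_hat = np.linalg.solve(X.T @ X, X.T @ Y)
mse = np.sum((Y - X @ beta_hat) ** 2) / (n - p)

cov_beta = mse * np.linalg.inv(X.T @ X)   # estimated Var(beta_hat) = s^2 (X'X)^{-1}
std_errors = np.sqrt(np.diag(cov_beta))   # standard errors of each coefficient
print(cov_beta)
print(std_errors)
```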
Suppose we want to model a student's GPA based on several predictor variables X₁, X₂, …

The model is:

GPAᵢ = β₀ + β₁Xᵢ₁ + β₂Xᵢ₂ + … + εᵢ,  i = 1, …, n

Given n = 5 observations, the least squares objective is:

Q(β) = ∑ᵢ₌₁⁵ (GPAᵢ − β₀ − β₁Xᵢ₁ − β₂Xᵢ₂ − …)²

Apply the formula:

β̂ = (XᵀX)⁻¹XᵀY

where Y is the 5 × 1 vector of observed GPAs and X is the 5 × p design matrix whose first column is all 1s and whose remaining columns hold the predictor values.

Steps involved (a code sketch of these steps follows the list):

1. Construct the response vector Y and the design matrix X.
2. Compute XᵀX and XᵀY.
3. Solve (XᵀX)β̂ = XᵀY for the coefficient estimates β̂.
4. Compute the fitted values Ŷ = Xβ̂, the residuals, SSE, and MSE = SSE/(n − p).
5. Estimate the coefficient covariance matrix as s²(XᵀX)⁻¹, using the MSE for s².
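The following sketch walks through these steps end to end. Because the notes do not list the predictors or the data, two illustrative predictors (weekly study hours and a prior exam average) and all numeric values are assumed here purely for demonstration.

```python
import numpy as np

# Hypothetical GPA example with n = 5 students. The predictors are assumed for
# illustration: weekly study hours (x1) and prior exam average on a 4-point
# scale (x2). All numbers are made up.
x1 = np.array([10.0, 15.0, 8.0, 20.0, 12.0])
x2 = np.array([3.2, 2.9, 2.5, 3.6, 3.4])
gpa = np.array([3.1, 3.5, 2.9, 3.9, 3.2])

# Step 1: response vector Y and design matrix X (leading column of 1s).
X = np.column_stack([np.ones_like(x1), x1, x2])
Y = gpa
n, p = X.shape

# Step 2: form X'X and X'Y.
XtX = X.T @ X
XtY = X.T @ Y

# Step 3: solve the normal equations (X'X) beta_hat = X'Y.
beta_hat = np.linalg.solve(XtX, XtY)

# Step 4: fitted values, residuals, SSE, and MSE = SSE / (n - p).
Y_hat = X @ beta_hat
sse = np.sum((Y - Y_hat) ** 2)
mse = sse / (n - p)

# Step 5: estimated covariance matrix s^2 (X'X)^{-1} and standard errors.
cov_beta = mse * np.linalg.inv(XtX)
std_errors = np.sqrt(np.diag(cov_beta))

print("beta_hat:       ", beta_hat)
print("MSE:            ", mse)
print("standard errors:", std_errors)
```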