Unbiased K-L estimator for the linear regression model

Aladeitan, B, Lukman, AF, Davids, E et al. (2021). Unbiased K-L estimator for the linear regression model. Volume 10. DOI: 10.12688/f1000research.54990.1

cited authors

  • Aladeitan, B; Lukman, AF; Davids, E; Oranye, EH; Kibria, GBM

abstract

  • Background: In the linear regression model, the performance of the ordinary least squares (OLS) estimator drops when multicollinearity is present. According to the Gauss-Markov theorem, the estimator remains unbiased under multicollinearity, but the variances of its regression estimates become inflated. Estimators such as the ridge regression estimator and the K-L estimator were adopted as substitutes for the OLS estimator to overcome the problem of multicollinearity in the linear regression model. However, these estimators are biased, though they possess a smaller mean squared error than the OLS estimator. Methods: In this study, we developed a new unbiased estimator based on the K-L estimator and compared its performance with some existing estimators theoretically, through simulation, and using real-life data. Results: Theoretically, the new estimator, though unbiased, also possesses minimum variance when compared with the other estimators. Results from the simulation and the real-life study showed that the new estimator produced the smallest mean squared error (MSE) and the smallest mean squared prediction error (MSPE). This further strengthened the findings of the theoretical comparison using both the MSE and the MSPE as criteria. Conclusions: A simulation study and a real-life application focused on modelling the high heating values from proximate analysis were conducted to support the theoretical findings. This new method of estimation is recommended for parameter estimation with and without multicollinearity in a linear regression model.
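The estimators contrasted in the abstract can be illustrated numerically. The sketch below (NumPy) compares OLS, ridge, and the K-L estimator of Kibria and Lukman on deliberately collinear data; the K-L form used, beta_KL = (X'X + kI)^{-1}(X'X - kI) beta_OLS, is the published Kibria-Lukman (2020) estimator, not the paper's new unbiased variant, and the choice of the biasing parameter k is an illustrative assumption rather than the authors' rule:

```python
import numpy as np

# Sketch (not the paper's exact procedure): compare OLS, ridge, and the
# Kibria-Lukman (K-L) estimator on collinear data. The K-L form
#   beta_KL = (X'X + kI)^{-1} (X'X - kI) beta_OLS
# follows Kibria & Lukman (2020); the biasing parameter k below is a
# simple illustrative choice, not the paper's estimator of k.

rng = np.random.default_rng(0)
n, p = 100, 3
beta_true = np.array([1.0, 2.0, -1.5])

# Build strongly correlated predictors to induce multicollinearity.
z = rng.standard_normal(n)
X = np.column_stack([z + 0.05 * rng.standard_normal(n) for _ in range(p)])
y = X @ beta_true + rng.standard_normal(n)

XtX = X.T @ X
I = np.eye(p)
beta_ols = np.linalg.solve(XtX, X.T @ y)

# Illustrative (assumed) biasing parameter: sigma^2 / max(beta_ols^2),
# a common ridge-style rule of thumb.
resid = y - X @ beta_ols
sigma2 = resid @ resid / (n - p)
k = sigma2 / np.max(beta_ols**2)

beta_ridge = np.linalg.solve(XtX + k * I, X.T @ y)
beta_kl = np.linalg.solve(XtX + k * I, (XtX - k * I) @ beta_ols)

for name, b in [("OLS", beta_ols), ("ridge", beta_ridge), ("K-L", beta_kl)]:
    print(f"{name:>5}: estimate {np.round(b, 2)}")
```

With near-duplicate columns, the individual OLS coefficients are unstable (inflated variance), while the shrinkage estimators trade a small bias for lower variance, which is the MSE comparison the abstract describes.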

publication date

  • January 1, 2021

Digital Object Identifier (DOI)

  • 10.12688/f1000research.54990.1

volume

  • 10