Bayesian lasso principal component regression with an application
DOI: https://doi.org/10.29304/jqcsm.2023.15.41357

Keywords: LASSO, Principal component regression, MCMC, Prior distribution

Abstract
The topic of variable selection in constructing statistical models has received wide attention in many applications, including medical, economic, engineering, and other fields, because it helps researchers save time and effort by concentrating on the crucial variables and eliminating unimportant ones from the models under study. The well-known Least Absolute Shrinkage and Selection Operator (LASSO) reduces the number of predictors by retaining those with high explanatory power and shrinking the coefficients of the rest toward zero. This study combines two methods of variable shrinkage, removing variables with little effect while simultaneously reducing the variance and bias of the estimated parameters. Two approaches were used to validate the results of the study. The first uses simulation, where data are generated under predetermined assumptions and models; the second uses real data.
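The article itself provides no code; as an illustration of the approach the abstract describes, the following is a minimal Python sketch that first projects the predictors onto their leading principal components and then runs the Bayesian lasso Gibbs sampler of Park and Casella (2008) on the component scores. The function name, the fixed penalty lam, the number of components, and the chain settings are illustrative assumptions, not the authors' implementation.

import numpy as np

def bayesian_lasso_pcr(X, y, n_components=4, lam=1.0,
                       n_iter=5000, burn=1000, seed=0):
    # Sketch only: Bayesian lasso Gibbs sampler (Park and Casella, 2008)
    # applied to principal component scores. All settings are illustrative.
    rng = np.random.default_rng(seed)
    n, k = X.shape[0], n_components

    # Principal component step: centre X, keep the top k right singular
    # vectors, and form the n x k score matrix Z.
    Xc = X - X.mean(axis=0)
    Vt = np.linalg.svd(Xc, full_matrices=False)[2]
    Z = Xc @ Vt[:k].T
    yc = y - y.mean()
    ZtZ, Zty = Z.T @ Z, Z.T @ yc

    beta, sigma2, tau2 = np.zeros(k), float(np.var(yc)), np.ones(k)
    draws = []
    for it in range(n_iter):
        # beta | rest ~ N(A^{-1} Z'y, sigma2 * A^{-1}), A = Z'Z + D_tau^{-1}
        A_inv = np.linalg.inv(ZtZ + np.diag(1.0 / tau2))
        beta = rng.multivariate_normal(A_inv @ Zty, sigma2 * A_inv)

        # sigma2 | rest ~ Inverse-Gamma(shape, rate), drawn as rate / Gamma(shape)
        resid = yc - Z @ beta
        shape = (n - 1) / 2 + k / 2
        rate = resid @ resid / 2 + beta @ (beta / tau2) / 2
        sigma2 = rate / rng.gamma(shape)

        # 1/tau_j^2 | rest ~ Inverse-Gaussian(sqrt(lam^2 sigma2 / beta_j^2), lam^2)
        tau2 = 1.0 / rng.wald(np.sqrt(lam**2 * sigma2 / beta**2), lam**2)

        if it >= burn:
            draws.append(beta.copy())
    return np.array(draws)

A short usage example on synthetic data, in the spirit of the simulation study the abstract mentions:

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 10))
y = X[:, 0] - 2.0 * X[:, 1] + rng.normal(scale=0.5, size=100)
print(bayesian_lasso_pcr(X, y).mean(axis=0))  # posterior means of the PC coefficients

Working on the orthogonal component scores rather than the raw predictors keeps the Gibbs updates well conditioned, which is the usual motivation for pairing principal component regression with a shrinkage prior under multicollinearity.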
License
Copyright (c) 2024 Mohammed H. AL-Sharoot , Wael abd Al Hussein Kazem, Shatha Awwad Al-Fatlawi
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.