Dimension reduction analysis of ischemic stroke by using MAVE and SMAVE methods
DOI: https://doi.org/10.29304/jqcsm.2024.16.41801

Keywords: dimension reduction, local linear, MAVE, SMAVE, ischemic stroke

Abstract
With the rapid development the world has witnessed in all scientific fields, data analysis has become more complex due to the large volume of collected data that needs to be analyzed. Consequently, data analysts need powerful tools to handle such data, which is why dimension reduction methods were developed. In this paper, we focus on minimum average variance estimation (MAVE) and its sparse variant (SMAVE) as powerful dimension reduction tools for analyzing ischemic stroke in a sample of patients under a set of covariates. The aim of this paper is to determine the most important covariates that affect the size of the ischemic stroke, taken as the response variable. The results indicate that the SMAVE method is the best method for predicting future ischemic stroke volume.
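To make the MAVE idea concrete, the following is a minimal, illustrative Python sketch of the alternating algorithm of Xia et al. (2002): it alternates between fitting local linear coefficients at each observation and updating the projection matrix B by weighted least squares. It uses simplifying assumptions not taken from the paper under review: a fixed Gaussian kernel computed on the full (standardized) predictor space rather than refined weights on the projected predictors, a rough rule-of-thumb bandwidth, and a small ridge term for numerical stability; it does not implement the sparse (SMAVE) penalty.

```python
import numpy as np

def mave(X, y, d=1, n_iter=10, h=None, seed=0):
    """Minimal sketch of MAVE (Xia et al., 2002).

    Alternates between (1) local linear fits (a_j, b_j) at each anchor
    point and (2) a weighted least-squares update of the p x d
    projection matrix B. Returns B estimated on standardized X.
    """
    n, p = X.shape
    rng = np.random.default_rng(seed)
    if h is None:
        h = 1.5 * n ** (-1.0 / (p + 4))  # rough bandwidth guess (assumption)
    Xs = (X - X.mean(0)) / X.std(0)
    # Simplified: Gaussian kernel weights on distances in the full
    # predictor space (refined MAVE would use projected distances).
    D2 = ((Xs[:, None, :] - Xs[None, :, :]) ** 2).sum(-1)
    W = np.exp(-D2 / (2 * h ** 2))
    W /= W.sum(1, keepdims=True)
    B = np.linalg.qr(rng.standard_normal((p, d)))[0]  # random orthonormal start
    for _ in range(n_iter):
        # Step 1: local linear coefficients (a_j, b_j) in the projected space.
        A = np.empty(n)
        Bloc = np.empty((n, d))
        Z = Xs @ B
        for j in range(n):
            U = np.hstack([np.ones((n, 1)), Z - Z[j]])
            G = U.T * W[j]
            coef = np.linalg.solve(G @ U + 1e-8 * np.eye(d + 1), G @ y)
            A[j], Bloc[j] = coef[0], coef[1:]
        # Step 2: given (a_j, b_j), solve for B. Using
        # b_j' B' (x_i - x_j) = vec(B)' vec((x_i - x_j) b_j'),
        # this is a weighted least-squares problem in vec(B).
        rows, targets, wts = [], [], []
        for j in range(n):
            diff = Xs - Xs[j]
            rows.append(diff[:, :, None] * Bloc[j][None, None, :])
            targets.append(y - A[j])
            wts.append(W[j])
        R = np.concatenate(rows).reshape(n * n, p * d)
        t = np.concatenate(targets)
        w = np.concatenate(wts)
        G = R.T * w
        vecB = np.linalg.solve(G @ R + 1e-8 * np.eye(p * d), G @ t)
        B = np.linalg.qr(vecB.reshape(p, d))[0]  # re-orthonormalize
    return B
```

On simulated data with a single true direction, e.g. y = z + z^2 with z = b'x, the first column of the returned B should align closely with b; this is a sketch under the stated assumptions, not the implementation used in the paper.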
References
Alkenani, A. and Rahman, E. (2021). Regularized MAVE through the elastic net with correlated predictors. Journal of Physics: Conference Series 1897, 012018.
Alkenani, A. and Yu, K. (2013). Sparse MAVE with oracle penalties. Advances and Applications in Statistics 34, 85-105.
Buja, A., Hastie, T. and Tibshirani, R. (1989). Linear smoothers and additive models. Ann. Statist. 17, 453-555.
Cook, R. D. and Li, B. (2002). Dimension reduction for the conditional mean in regression. Ann. Statist. 30, 455-474.
Cook, R. D., and Ni, L. (2005). Sufficient dimension reduction via inverse regression: A minimum discrepancy approach. Journal of the American Statistical Association 100, 470, 410-428.
Friedman, J. and Stuetzle, W. (1981). Projection pursuit regression. J. Amer. Statist. Assoc. 76, 817-823.
Hoerl, A. E., and Kennard, R. W. (1970). Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12, 1, 55-67.
Jolliffe, I. T. (2002). Principal Component Analysis. Springer, New York.
Li, K. C. (1991). Sliced inverse regression for dimension reduction (with discussion). J. Amer. Statist. Assoc. 86, 316-342.
Li, L. (2007). Sparse sufficient dimension reduction. Biometrika 94, 603-613.
Li, B. and Wang, S. (2007). On directional regression for dimension reduction. J. Amer. Statist. Assoc. 102, 997-1008.
Rahman, E. and Alkenani, A. (2020). Sparse minimum average variance estimation via the adaptive elastic net when the predictors correlated. Journal of Physics: Conference Series 1591, 012041.
Sheng, W. and Yin, X. (2016). Sufficient dimension reduction via distance covariance. Journal of Computational and Graphical Statistics 25, 1, 91-104.
Tibshirani, R. (1996). Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society: Series B (Methodological) 58, 1, 267-288.
Wang, Q. and Yin, X. (2008). A nonlinear multi-dimensional variable selection method for high dimensional data: Sparse MAVE. Computational Statistics and Data Analysis 52, 4512-4520.
Xia, Y. C., Tong, H., Li, W. K. and Zhu, L. X. (2002). An adaptive estimation of dimension reduction space. J. R. Statist. Soc. B 64, 363-410.
Zou, H. (2006). The adaptive lasso and its oracle properties. Journal of the American statistical association 101, 476, 1418-1429.
License
Copyright (c) 2024 Mohammed H. AL-Sharoot, Sanaa J. Tuama
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.