Application of Neural Networks in Financial Data Analysis for Enhanced Corporate Performance Evaluation
DOI: https://doi.org/10.29304/jqcsm.2025.17.22183
Keywords: Neural Networks, Financial Analysis, Corporate Performance, Predictive Modeling
Abstract
This study explores the application of neural networks to financial data analysis and corporate performance evaluation, addressing the challenges of nonlinearity and data noise. It evaluates multiple architectures (feedforward, recurrent, and convolutional), supported by real-world Python implementations and performance metrics such as accuracy and AUC. Key limitations, including limited interpretability and overfitting, are discussed, together with proposed remedies and directions for future research.
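As an illustration of the kind of pipeline the abstract describes, the following is a minimal, self-contained sketch and not the authors' implementation: it trains a small feedforward network on synthetic stand-in features (placeholders for financial ratios) with a binary performance label, then reports accuracy and AUC. All names, the data, and the model settings are assumptions made for demonstration only.

# Illustrative sketch only; synthetic data stands in for real financial ratios
# (e.g., liquidity, leverage, profitability). Not the study's actual code.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score, roc_auc_score

rng = np.random.default_rng(42)
X = rng.normal(size=(1000, 8))        # 8 hypothetical financial-ratio features per firm
y = (X[:, 0] - 0.5 * X[:, 3] + rng.normal(scale=0.5, size=1000) > 0).astype(int)  # performance label

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
scaler = StandardScaler().fit(X_train)        # scale inputs; neural networks are sensitive to feature scale
X_train, X_test = scaler.transform(X_train), scaler.transform(X_test)

# Small feedforward network; early stopping as a simple guard against overfitting
model = MLPClassifier(hidden_layer_sizes=(32, 16), early_stopping=True,
                      max_iter=500, random_state=0)
model.fit(X_train, y_train)

proba = model.predict_proba(X_test)[:, 1]
print("Accuracy:", accuracy_score(y_test, model.predict(X_test)))
print("AUC:     ", roc_auc_score(y_test, proba))

A recurrent or convolutional variant, as evaluated in the study, would follow the same pattern but consume sequences of financial observations rather than a single feature vector per firm.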
License
Copyright (c) 2025 Fatimah Ghazi Suwaidan, Taif Jawad Kadhim, Ali Atta Kshash, Ghaith Hakim Malik

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.