Gender Classification from Human Face Images Using Deep Learning Based on MobileNetV2 Architecture
DOI: https://doi.org/10.29304/jqcsm.2025.17.11970
Keywords: Deep learning, Transfer learning, CNNs, MobileNetV2, Classification
Abstract
A person’s face conveys many identifying characteristics, including age, gender, and race. Among these characteristics, gender prediction has drawn considerable attention due to its many applications and use cases. Gender recognition is the task of determining an individual’s gender from a face and assigning it to the appropriate category, “male” or “female”. The process typically involves multiple steps, such as face detection and feature extraction to capture the distinctive features of the face. Efficient feature extraction for gender classification is possible through a number of approaches, including deep learning-based convolutional neural networks (CNNs), which have shown excellent performance in learning hierarchical representations directly from raw pixel data. Previous attempts at gender recognition have relied on various static physical features, such as fingernails, body shape, hand shape, eyebrows, and the face. In this study, we propose a deep learning approach to gender classification from human face images, using a model based on the MobileNetV2 architecture. The model was pre-trained and fine-tuned on the dataset using transfer learning techniques. During training, the model was optimized with the Adam optimizer, a focal loss function, and a learning rate scheduler. The dataset is the Biggest gender/face recognition dataset from Kaggle, consisting of face images labeled as man or woman. Experimental results show strong performance, with an F1-score of 96%, indicating that the model achieves an excellent balance between precision and recall.
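To make the training recipe in the abstract concrete, the sketch below wires MobileNetV2 transfer learning to a binary gender head with the Adam optimizer, a focal loss (our reading of the abstract's "focus loss"), and a learning-rate scheduler. It is a minimal illustration, assuming TensorFlow/Keras 2.x (BinaryFocalCrossentropy requires TF >= 2.9); the input resolution, learning rate, dropout rate, and scheduler settings are illustrative assumptions, not values reported by the paper.

# Minimal sketch of the described setup; hyperparameters are assumptions.
import tensorflow as tf

IMG_SIZE = (224, 224)  # assumed MobileNetV2 input resolution

# 1. Load MobileNetV2 pre-trained on ImageNet, without its classifier head.
base = tf.keras.applications.MobileNetV2(
    input_shape=IMG_SIZE + (3,), include_top=False, weights="imagenet")
base.trainable = False  # freeze the backbone for the transfer-learning phase

# 2. Add a lightweight binary head for the man/woman decision.
model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])

# 3. Adam optimizer + focal loss, as the abstract describes.
model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=1e-4),
    loss=tf.keras.losses.BinaryFocalCrossentropy(gamma=2.0),
    metrics=[tf.keras.metrics.Precision(), tf.keras.metrics.Recall()],
)

# 4. Learning-rate scheduler: halve the LR when validation loss plateaus.
lr_schedule = tf.keras.callbacks.ReduceLROnPlateau(
    monitor="val_loss", factor=0.5, patience=2)

# train_ds / val_ds would be tf.data.Dataset objects of (image, label) pairs,
# e.g. built with tf.keras.utils.image_dataset_from_directory(...).
# model.fit(train_ds, validation_data=val_ds, epochs=10, callbacks=[lr_schedule])

After this frozen-backbone phase, the usual fine-tuning step is to unfreeze the top layers of the backbone and continue training at a lower learning rate, which matches the abstract's "pre-trained and fine-tuned" description. Note also that the reported F1-score summarizes precision P and recall R as F1 = 2PR / (P + R), so a value of 96% requires both to be high simultaneously.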
References
1. P. Terhörst, D. Fährmann, N. Damer, and F. Kirchbuchner (2021). On soft-biometric information stored in biometric face embeddings, IEEE Trans. Biom. Behav. Identity Sci., vol. 3, no. 4, pp. 519–534. DOI: 10.1109/TBIOM.2021.3093920.
2. A. Jain and V. Kanhangad (2017). Gender classification in smartphones using gait information, Expert Syst. Appl. DOI: 10.1016/j.eswa.2017.10.017.
3. I. Siddiqi, C. Djeddi, and A. Raza (2014). Automatic analysis of handwriting for gender classification. DOI: 10.1007/s10044-014-0371-0.
4. O. F. Osman and M. H. Yap (2018). Computational intelligence in automatic face age estimation: A survey, IEEE Transactions on Emerging Topics in Computational Intelligence, 3(3):271–285.
5. BBC News (2017). Japanese smokers to face age test. http://news.bbc.co.uk/2/hi/asia-pacific/7395910.stm.
6. G. Guo (2012). Human age estimation and sex classification. In: Video Analytics for Business Intelligence, Springer, pp. 101–131.
7. E. Mäkinen and R. Raisamo (2008). An experimental comparison of gender classification methods, Pattern Recognit. Lett., 29(10):1544–1556.
8. S. Baluja and H. A. Rowley (2007). Boosting sex identification performance, Int. J. Comput. Vis., 71(1):111–119.
9. K. Yasaka and O. Abe (2018). Deep learning and artificial intelligence in radiology: Current applications and future directions, PLoS Med., vol. 15, no. 11, pp. 2–5. DOI: 10.1371/journal.pmed.1002707.
10. Y. LeCun, Y. Bengio, and G. Hinton (2015). Deep learning, Nature, vol. 521, no. 7553, pp. 436–444. DOI: 10.1038/nature14539.
11. L. Deng and D. Yu (2013). Deep Learning: Methods and Applications.
12. L. Arnold, S. Rebecchi, and S. Chevallier (2011). An introduction to deep learning. In: European Symposium on Artificial Neural Networks (ESANN).
13. F. Zhuang, Z. Qi, K. Duan, D. Xi, Y. Zhu, H. Zhu, H. Xiong, and Q. He (2020). A comprehensive survey on transfer learning, Proceedings of the IEEE.
14. M. E. Yildirim, O. F. Ince, Y. B. Salman, J. K. Song, J. S. Park, and B. W. Yoon (2016). Gender recognition using HOG with maximized inter-class difference. In: VISIGRAPP (3: VISAPP), pp. 108–111.
15. T. Hassner, S. Harel, E. Paz, and R. Enbar (2015). Effective face frontalization in unconstrained images. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pp. 4295–4304.
16. L. A. Alexandre (2010). Gender recognition: A multiscale decision fusion approach. Pattern Recognition Letters, 31(11):1422–1427.
17. Y. Zhou, H. Ni, F. Ren, and X. Kang (2019). Face and gender recognition system based on convolutional neural networks, IEEE.
18. K. Jain, M. Chawla, A. Gadhwal, R. Jain, and P. Nagrath (2020). Age and gender prediction using convolutional neural network, conference paper.
19. T. Di Mascio, P. Fantozzi, L. Laura, and V. Rughetti (2021). Age and gender (face) recognition: A brief survey, conference paper, September.
20. G. S. Saggu, K. Gupta, and P. S. Mann (2021). Efficient classification for age and gender of unconstrained face images, conference paper, 14 December.
21. S. Kumar and N. Nain (2022). Review: Single attribute and multi attribute facial gender and age estimation.
22. N. Sharma, R. Sharma, and N. Jindal (2022). Face-based age and gender estimation using improved convolutional neural network approach.
23. A. Muhammad, D. Pratiwi, and A. Salim (2023). Penerapan metode convolutional neural networks pada pengenalan gender manusia berdasarkan foto tampak depan [Application of the convolutional neural networks method to human gender recognition from frontal photographs], Jurnal Komtika.
24. İ. Akgül (2024). Deep convolutional neural networks for age and gender estimation using an imbalanced dataset of human face images.
25. Biggest gender/face recognition dataset, Kaggle. https://www.kaggle.com/datasets/maciejgronczynski/biggest-genderface-recognition-dataset/data
26. M. Sandler, A. Howard, M. Zhu, A. Zhmoginov, and L.-C. Chen (2018). MobileNetV2: Inverted residuals and linear bottlenecks. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), pp. 4510–4520.
License
Copyright (c) 2025 Nisreen Ryadh Hamza

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.