International Journal For Multidisciplinary Research

E-ISSN: 2582-2160     Impact Factor: 9.24

A Widely Indexed Open Access Peer Reviewed Multidisciplinary Bi-monthly Scholarly International Journal

Face Mask Image Classification Using Fine-Tuning and The Effect of FGSM And PGD Attacks.

Author(s) Kouassi Joshua Caleb Micah, Lou Qiong
Country China
Abstract This research uses a fine-tuning method for the automatic detection and classification of face mask and no-face-mask images. We trained deep convolutional neural network models on a face mask dataset obtained from the Mendeley data repository to determine whether individuals adhere to face mask policies intended to reduce the spread of SARS-CoV-2. The architectures used in this study are the VGG19, MobileNetV3, and InceptionV3 models, which are well established for image classification. Each was trained with the fine-tuning approach and their outputs were compared. After training on the face mask dataset, the fine-tuned InceptionV3 model clearly outperformed the other models, achieving an accuracy of 99.21% and correctly predicting 99.63% of the test dataset. We assessed the models using performance metrics such as precision, recall, F1 score, and accuracy before testing their robustness against two common adversarial attack techniques: the fast gradient sign method (FGSM) and projected gradient descent (PGD), both of which generate adversarial images from the gradients of the model. The classification report shows that the FGSM attack reduced the model's accuracy by 43%, while the PGD attack reduced it by 19%. Finally, we demonstrated how the proposed robust method improved the model's defense against adversarial attacks. The findings emphasize the critical need to raise awareness of adversarial attacks on SARS-CoV-2 monitoring systems and to advocate proactive steps to protect healthcare systems from comparable threats prior to practical deployment. (Illustrative code sketches of the fine-tuning setup and the FGSM/PGD attacks appear after the article metadata below.)
Keywords Convolutional Neural Network, Face Mask, Fine-Tuning, FGSM attack, PGD attack
Field Computer > Artificial Intelligence / Simulation / Virtual Reality
Published In Volume 5, Issue 5, September-October 2023
Published On 2023-10-02
Cite This Face Mask Image Classification Using Fine-Tuning and The Effect of FGSM And PGD Attacks. - Kouassi Joshua Caleb Micah, Lou Qiong - IJFMR Volume 5, Issue 5, September-October 2023. DOI 10.36948/ijfmr.2023.v05i05.7029
DOI https://doi.org/10.36948/ijfmr.2023.v05i05.7029
Short DOI https://doi.org/gstc8d
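
The abstract describes fine-tuning pretrained VGG19, MobileNetV3, and InceptionV3 models on the face mask dataset but does not give implementation details. The following is a minimal sketch of a two-stage fine-tuning setup for InceptionV3, assuming a TensorFlow/Keras pipeline; the class count, image size, learning rates, and the train_ds/val_ds data loaders are illustrative placeholders, not the authors' settings.

```python
# Minimal fine-tuning sketch (assumed TensorFlow/Keras setup; NUM_CLASSES and
# the data pipeline are illustrative placeholders, not the paper's configuration).
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 2  # assumption: "with mask" vs. "without mask"

# Load InceptionV3 pretrained on ImageNet, without its classification head.
base = tf.keras.applications.InceptionV3(
    include_top=False, weights="imagenet",
    input_shape=(299, 299, 3), pooling="avg",
)
base.trainable = False  # first stage: train only the new classification head

model = models.Sequential([
    base,
    layers.Dense(256, activation="relu"),
    layers.Dropout(0.5),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer=tf.keras.optimizers.Adam(1e-3),
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(train_ds, validation_data=val_ds, epochs=5)

# Second stage: unfreeze the base network and continue training with a much
# smaller learning rate so the pretrained features are only gently adjusted.
base.trainable = True
model.compile(optimizer=tf.keras.optimizers.Adam(1e-5),
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(train_ds, validation_data=val_ds, epochs=5)
```

Freezing the backbone first lets the randomly initialized head stabilize before the pretrained weights are updated, which is the usual rationale for two-stage fine-tuning.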
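The FGSM and PGD attacks mentioned in the abstract perturb an input image along the sign of the loss gradient with respect to that input; PGD repeats the step several times and projects the result back into an epsilon-ball around the original image. The sketch below shows both under the same assumed TensorFlow/Keras setup; eps, alpha, and the step count are illustrative values rather than the paper's settings.

```python
# Sketch of FGSM and PGD adversarial example generation (assumed TensorFlow/Keras;
# inputs are assumed to be scaled to [0, 1]).
import tensorflow as tf

loss_fn = tf.keras.losses.SparseCategoricalCrossentropy()

def fgsm(model, x, y, eps=0.01):
    """Single-step FGSM: move x along the sign of the input gradient."""
    x = tf.convert_to_tensor(x)
    with tf.GradientTape() as tape:
        tape.watch(x)
        loss = loss_fn(y, model(x))
    grad = tape.gradient(loss, x)
    x_adv = x + eps * tf.sign(grad)
    return tf.clip_by_value(x_adv, 0.0, 1.0)

def pgd(model, x, y, eps=0.01, alpha=0.002, steps=10):
    """Iterative PGD: repeated signed-gradient steps projected into the eps-ball."""
    x = tf.convert_to_tensor(x)
    x_adv = x
    for _ in range(steps):
        with tf.GradientTape() as tape:
            tape.watch(x_adv)
            loss = loss_fn(y, model(x_adv))
        grad = tape.gradient(loss, x_adv)
        x_adv = x_adv + alpha * tf.sign(grad)
        x_adv = tf.clip_by_value(x_adv, x - eps, x + eps)  # project onto eps-ball
        x_adv = tf.clip_by_value(x_adv, 0.0, 1.0)          # keep pixels valid
    return x_adv
```

FGSM is effectively the single-step special case of PGD, which is why PGD is generally the stronger attack and a common benchmark for the kind of robustness evaluation the paper reports.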