Paper Title: EL-Alert: An Explainable Lightweight AST Model for Military Situational Awareness and Surveillance
Conference Name: ICTC 2024: The 15th International Conference on ICT Convergence
Abstract: Situational awareness and monitoring are essential to military operations, requiring the rapid and precise identification of diverse audio events. Although convolutional neural networks (CNNs) have been widely employed for these tasks, they often lack the transparency and efficiency needed for deployment in resource-limited environments. This study introduces EL-Alert, an explainable lightweight audio spectrogram transformer (AST) model designed specifically for military situational awareness and surveillance. Our model leverages the transformer architecture to analyze spectrogram representations of audio signals, providing a robust and efficient solution for real-time sound event classification. EL-Alert also incorporates explainable AI techniques, such as gradient-weighted class activation mapping (Grad-CAM) and integrated gradients (IG), enabling transparent decision-making and enhancing trust in automated systems. Experiments on a comprehensive military audio dataset demonstrate that EL-Alert outperforms traditional CNN-based methods in both classification performance and interpretability. The results show that EL-Alert achieves high accuracy in identifying military-relevant audio events while maintaining low computational requirements, making it suitable for deployment in the field. For reproducibility and further research, the code for EL-Alert is publicly available on GitHub at https://github.com/Judith989/EL-Alert/tree/main.
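The sketch below is a minimal, hypothetical illustration (not the authors' EL-Alert implementation) of the explainability workflow the abstract describes: applying integrated gradients from the Captum library to a small transformer-style spectrogram classifier, so that a prediction can be attributed back to individual time-frequency bins. The TinySpectrogramTransformer class, its dimensions, and the zero "silence" baseline are assumptions for illustration only; Grad-CAM would be applied analogously to an intermediate activation map.

```python
# Illustrative sketch: integrated gradients (IG) on a transformer-based
# spectrogram classifier. The model below is a hypothetical stand-in for a
# lightweight AST; it is NOT the EL-Alert architecture from the paper.
import torch
import torch.nn as nn
from captum.attr import IntegratedGradients


class TinySpectrogramTransformer(nn.Module):
    """Hypothetical lightweight transformer over mel-spectrogram patches."""

    def __init__(self, n_mels=64, n_frames=128, patch=16, d_model=128,
                 n_heads=4, n_layers=2, n_classes=10):
        super().__init__()
        n_patches = (n_mels // patch) * (n_frames // patch)
        # Patch embedding: split the spectrogram into non-overlapping patches.
        self.proj = nn.Conv2d(1, d_model, kernel_size=patch, stride=patch)
        self.cls = nn.Parameter(torch.zeros(1, 1, d_model))
        self.pos = nn.Parameter(torch.zeros(1, n_patches + 1, d_model))
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, n_classes)

    def forward(self, spec):                        # spec: (B, 1, n_mels, n_frames)
        x = self.proj(spec).flatten(2).transpose(1, 2)   # (B, n_patches, d_model)
        x = torch.cat([self.cls.expand(x.size(0), -1, -1), x], dim=1) + self.pos
        x = self.encoder(x)
        return self.head(x[:, 0])                   # classify from the [CLS] token


model = TinySpectrogramTransformer().eval()
spec = torch.randn(1, 1, 64, 128)                   # stand-in log-mel spectrogram
pred_class = model(spec).argmax(dim=-1).item()

# Attribute the predicted logit to spectrogram bins, interpolating from an
# all-zero baseline (assumed here to represent silence).
ig = IntegratedGradients(model)
attributions = ig.attribute(spec, baselines=torch.zeros_like(spec),
                            target=pred_class, n_steps=32)
print(attributions.shape)                           # (1, 1, 64, 128): per-bin relevance
```

The resulting attribution map has the same shape as the input spectrogram, so it can be overlaid on the spectrogram to visualize which time-frequency regions drove the classification, which is the kind of transparency the abstract attributes to EL-Alert.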