Innovative methods of automotive crash detection through audio recognition using neural networks algorithms


Date

2024


Publisher

Institute of Special Communication and Information Protection of National Technical University of Ukraine “Igor Sikorsky Kyiv Polytechnic Institute”

Abstract

The automatic eCall system has been mandatory in the European Union since 2018: every new passenger vehicle placed on the European market after that date must be equipped with a digital emergency response service that automatically notifies emergency services of an accident through the Automatic Crash Notification (ACN) system. Since the response of emergency services (police, ambulance, etc.) to such calls is extremely expensive, there is a pressing need to improve the accuracy of these reports by verifying that an accident has actually occurred. Today, most car manufacturers detect an emergency by analyzing data from built-in accelerometer sensors. As a result, sudden braking that successfully avoids an accident is quite often misidentified as an emergency and triggers a false call to emergency services. Some manufacturers equip their high-end vehicles with automatic collision notification, which mainly monitors airbag deployment to detect a severe collision and calls for assistance via embedded cellular radios. To reduce costs, some third-party solutions offer under-the-hood boxes, windscreen-mounted boxes, and/or OBD-II dongles with an embedded acceleration sensor, a third-party SIM card, and a proprietary bump-detection algorithm. Nevertheless, relying on acceleration data alone may lead to false predictions: speed bumps, potholes, and poor road surfaces trigger false positives, whereas rear-end collisions while standing still may be classified as normal acceleration. Acceleration data is also unsuitable for identifying side impacts. In many cases emergency braking avoids a collision, yet produces acceleration data very similar to that observed in an actual accident, leading to the false conclusion that a crash occurred.
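The acceleration-based detectors described above typically reduce to a magnitude threshold on the accelerometer stream. A minimal sketch of such a pre-filter (the 4 g threshold and function name are illustrative assumptions, not any vendor's algorithm), showing why a braking event and a crash spike are separated only by magnitude:

```python
import numpy as np

# Hypothetical trigger threshold in g; real systems tune this per vehicle.
TRIGGER_G = 4.0

def crash_candidate(ax, ay, az, threshold_g=TRIGGER_G):
    """Return True if any sample's acceleration magnitude exceeds the
    threshold -- a cheap pre-filter, not a crash decision by itself."""
    mag = np.sqrt(np.asarray(ax) ** 2 + np.asarray(ay) ** 2 + np.asarray(az) ** 2)
    return bool(np.max(mag) > threshold_g)

# Hard braking (~1.3 g total magnitude) stays below the trigger:
print(crash_candidate([0.1], [0.9], [1.0]))   # False
# A short high-g spike crosses it, but so would a severe pothole:
print(crash_candidate([0.2], [6.5], [1.0]))   # True
```

The examples illustrate the abstract's point: a single magnitude threshold cannot distinguish a genuine impact from road-surface artifacts, which is why additional verification of the event is needed.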
As a result, the average accuracy of current car crash detection algorithms does not exceed 85%, which is acceptable yet leaves considerable room for improvement, since each additional percentage point of accuracy would yield substantial cost savings. That is why increasing the accuracy of collision detection remains an urgent task. In this article, we describe an innovative approach to recognizing car accidents based on convolutional neural networks that classify soundtracks recorded inside the car during road accidents, on the assumption that every crash produces a sound. The soundtrack can be recorded with built-in microphones or with the driver's smartphone, hands-free car kits, or dash cameras, which would drastically reduce the cost of the hardware required for this task. Moreover, modern smartphones are equipped with accelerometers, which can serve as a trigger for starting the neural-network analysis of the soundtrack, thereby saving the smartphone's computing resources. Crash detection accuracy can be improved further by using multiple sound sources: modern automobiles may carry various devices capable of recording audio inside the cabin, namely the built-in microphone of the hands-free speaking system, the mobile phones of the driver and/or passengers, dash-cam recorders, smart rear-view mirrors, etc.
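The audio-classification approach outlined above starts by converting the recorded soundtrack into a time-frequency representation that a convolutional network can treat as an image. A minimal log-spectrogram extractor in plain NumPy (the frame and hop sizes are illustrative assumptions; the article's actual feature parameters may differ):

```python
import numpy as np

def log_spectrogram(signal, frame_len=512, hop=256, eps=1e-10):
    """Frame the audio, apply a Hann window, take the FFT power,
    and return its log -- the 2-D 'image' a CNN classifier consumes."""
    window = np.hanning(frame_len)
    n_frames = 1 + (len(signal) - frame_len) // hop
    frames = np.stack([signal[i * hop : i * hop + frame_len] * window
                       for i in range(n_frames)])
    power = np.abs(np.fft.rfft(frames, axis=1)) ** 2
    return np.log(power + eps)             # shape: (n_frames, frame_len // 2 + 1)

# One second of synthetic audio at 16 kHz stands in for an in-cabin recording.
sr = 16_000
t = np.arange(sr) / sr
audio = np.sin(2 * np.pi * 440 * t)        # placeholder tone, not real crash audio
spec = log_spectrogram(audio)
print(spec.shape)                           # (61, 257)
```

A convolutional network would then classify such spectrogram patches as "crash" vs. "no crash"; running this computation only when the accelerometer trigger fires, as the abstract suggests, keeps the smartphone's CPU and battery load low.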

Keywords

artificial intelligence, convolutional neural networks, audio signal processing

Bibliographic description

Mogylevych, D. Innovative methods of automotive crash detection through audio recognition using neural networks algorithms / Dmytro Mogylevych, Roman Khmil // Information Technology and Security. – 2024. – Vol. 12, Iss. 2 (23). – Pp. 243-256. – Bibliogr.: 25 ref.