ABSTRACT: Distracted driving, particularly phone use while driving, substantially increases crash risk. Existing studies have attempted to detect various kinds of driver distraction, but real-world distraction is often comprehensive, combining cognitive, visual, auditory, and physical distraction simultaneously. Moreover, previous studies isolated drivers' facial video data from vehicle dynamics data, which loses context and may not be realistic. This study detects driver distraction status based on vehicle dynamics data extracted from the Shanghai Naturalistic Driving Study (SH-NDS) in China. Speed, longitudinal acceleration, lateral acceleration, lane offset, and steering wheel rate were extracted from the vehicle Controller Area Network (CAN) data. Distraction status was classified as focused or distracted, with the distracted class including phone use. A Long Short-Term Memory (LSTM) model with a bidirectional layer and an attention mechanism was built to detect driver distraction status. The developed model achieved approximately 88.2% accuracy on the testing dataset, while support vector machine (SVM), k-nearest neighbor (KNN), and adaptive boosting (AdaBoost) models achieved overall accuracies of 83.4%, 81.5%, and 86.8%, respectively. These results show that speed, longitudinal acceleration, lateral acceleration, and lane offset, together with their standard deviations and prediction errors, along with the steering wheel rate prediction error, were significant and effective in detecting driver distraction. The developed LSTM model could potentially be applied in an advanced driver assistance system to reduce crashes caused by distracted driving.
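The abstract describes features built from CAN signals: per-channel values, their standard deviations, and prediction errors. The sketch below is a hypothetical illustration (not the authors' code) of computing such windowed features from a single dynamics channel; the window length, the naive last-value predictor used for the prediction error, and the synthetic speed trace are all assumptions for demonstration only.

```python
import numpy as np

def window_features(signal, window=50):
    """Per-window mean, standard deviation, and one-step prediction
    error for one vehicle dynamics channel (e.g., speed).
    The prediction error here is the mean absolute residual of a
    naive last-value predictor, an assumption for illustration."""
    n = len(signal) // window
    feats = []
    for i in range(n):
        seg = signal[i * window:(i + 1) * window]
        # naive one-step prediction: x_hat[t] = x[t-1], so the
        # residual is simply the first difference of the segment
        pred_err = np.mean(np.abs(np.diff(seg)))
        feats.append([seg.mean(), seg.std(), pred_err])
    return np.array(feats)

# Synthetic stand-in for a CAN speed trace (m/s) with sensor noise
rng = np.random.default_rng(0)
speed = 15.0 + rng.normal(0.0, 0.5, size=500)
X = window_features(speed, window=50)
print(X.shape)  # (10, 3): 10 windows x (mean, std, prediction error)
```

Feature rows like these, stacked across all five channels, could then be fed to the sequence model or to the baseline classifiers (SVM, KNN, AdaBoost) the study compares against.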
Xuesong Wang, Rongjiao Xu, Siyang Zhang. Driver Distraction Detection Based on Vehicle Dynamics Using Naturalistic Driving Data. Transportation Research Board 100th Annual Meeting, Washington, D.C., USA, January 25-29, 2021.