Analysing facial expressions to detect driver distraction
Researchers from the Signal Processing Laboratory 5 (LTS5) at EPFL have developed a system that detects the cognitive distraction of car drivers by analysing their facial expressions.
Driving requires the constant coordination of many body systems and the driver's full attention. Cognitive distraction (a subsidiary mental load) is an important factor that reduces attention and responsiveness, and may therefore lead to human error and accidents.
In the paper, the research group led by Prof. Jean-Philippe Thiran presents a study of the facial expressions associated with this kind of mental diversion. First, a multi-camera database of 46 people was recorded while they drove a simulator under two conditions: a baseline and an induced cognitive load created by a secondary task.
The researchers then present an automatic system that differentiates between the two conditions, using features extracted from facial Action Unit (AU) values and their cross-correlations in order to exploit recurring synchronisation and causality patterns. Both the recording setup and the detection system are suitable for integration in a vehicle and for real-world applications, e.g. an early warning system.
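To make the feature idea more concrete, the sketch below shows one plausible way to turn per-frame AU intensity traces into pairwise cross-correlation features that capture synchronisation and lag between AUs. It is a minimal illustration with hypothetical array shapes and an arbitrarily chosen `max_lag`, not the authors' actual feature-extraction pipeline.

```python
import numpy as np

def cross_correlation_features(au_series, max_lag=10):
    """Feature vector from pairwise cross-correlations of AU time series.

    au_series: array of shape (n_frames, n_aus), one AU intensity trace per column.
    Returns, for each ordered pair of AUs, the largest absolute normalised
    cross-correlation over lags 0..max_lag, which reflects how strongly (and
    with what delay) one AU tends to follow another.
    """
    n_frames, n_aus = au_series.shape
    # Centre and scale each AU trace so correlations are comparable across AUs
    centred = au_series - au_series.mean(axis=0)
    stds = centred.std(axis=0)
    stds[stds == 0] = 1.0
    normed = centred / stds

    features = []
    for i in range(n_aus):
        for j in range(n_aus):
            if i == j:
                continue
            best = 0.0
            for lag in range(max_lag + 1):
                # Correlate AU i at time t with AU j at time t + lag
                a = normed[: n_frames - lag, i]
                b = normed[lag:, j]
                corr = float(np.dot(a, b) / len(a))
                best = max(best, abs(corr))
            features.append(best)
    return np.asarray(features)

# Toy usage: 300 frames of 5 synthetic AU intensity traces
rng = np.random.default_rng(0)
dummy_aus = rng.normal(size=(300, 5))
print(cross_correlation_features(dummy_aus).shape)  # (20,) = 5 * 4 ordered pairs
```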
They show that when the system is trained individually for each subject it achieves a mean accuracy and F-score of around 95%, while subject-independent tests reach around 68% accuracy and 66% F-score when person-specific normalisation is used to handle subject dependency. Based on these results, the researchers discuss the universality of the facial expressions of such states and possible real-world uses of the system.
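As an illustration of what person-specific normalisation can mean in a subject-independent setting, the following Python sketch z-scores each subject's feature vectors using that subject's own statistics before training a single classifier across subjects. The data shapes, the held-out subject, and the logistic-regression classifier are all invented for the example and are not taken from the paper.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, f1_score

def per_subject_zscore(features, subject_ids):
    """Z-score each subject's feature vectors with that subject's own statistics."""
    normed = np.empty_like(features, dtype=float)
    for sid in np.unique(subject_ids):
        mask = subject_ids == sid
        mu = features[mask].mean(axis=0)
        sigma = features[mask].std(axis=0)
        sigma[sigma == 0] = 1.0
        normed[mask] = (features[mask] - mu) / sigma
    return normed

# Toy subject-independent split with synthetic data
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 20))          # e.g. cross-correlation features
y = rng.integers(0, 2, size=200)        # 0 = baseline, 1 = cognitive load
subjects = rng.integers(0, 10, size=200)

Xn = per_subject_zscore(X, subjects)
test_mask = subjects == 0               # hold out one subject entirely
clf = LogisticRegression().fit(Xn[~test_mask], y[~test_mask])
pred = clf.predict(Xn[test_mask])
print(accuracy_score(y[test_mask], pred), f1_score(y[test_mask], pred))
```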
The paper, authored by Dr Anil Yüce, Dr Hua Gao, Mr Gabriel Cuendet and Prof. Jean-Philippe Thiran of the LTS5, was recently published in IEEE Transactions on Affective Computing and is also available to read here.