Driverless cars worse at detecting children and darker-skinned pedestrians, say scientists
Researchers from King’s College London have revealed major age and race biases in autonomous vehicles’ detection systems, in what could prompt a major re-think of the future of driverless cars.
In collaboration with colleagues, Dr Jie Zhang from the Department of Informatics at King's College London assessed eight artificial intelligence (AI) powered pedestrian detection systems used in autonomous vehicle research.
By running more than 8,000 images through these systems, they found that detection accuracy for adults was almost 20% higher than for children, and just over 7.5% higher for light-skinned pedestrians than for their darker-skinned counterparts.
A major cause of this discrepancy is that the main collections of pedestrian images used to train these AI systems, that is, the software that tells a driverless car whether it is approaching a pedestrian, feature more people with light skin than dark skin. The result of this uneven data source is a lack of fairness in the AI systems trained on it.
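The kind of group-wise accuracy gap the study reports can be illustrated with a simple tally over labelled detector output. The sketch below is a hypothetical illustration, not the researchers' evaluation code; the group labels and detection results are made up for demonstration.

```python
from collections import defaultdict

def detection_rate_by_group(results):
    """Compute the fraction of pedestrians detected in each group.

    results: iterable of (group_label, was_detected) pairs, e.g. the
    per-pedestrian outcome of running a detector over a labelled test set.
    """
    hits = defaultdict(int)
    totals = defaultdict(int)
    for group, detected in results:
        totals[group] += 1
        hits[group] += int(detected)
    # Per-group detection rate: detected / total for each group.
    return {g: hits[g] / totals[g] for g in totals}

# Toy labelled results (hypothetical numbers, for illustration only).
results = [
    ("adult", True), ("adult", True), ("adult", True), ("adult", False),
    ("child", True), ("child", False), ("child", False), ("child", False),
]
rates = detection_rate_by_group(results)
gap = rates["adult"] - rates["child"]  # a positive gap means adults are detected more often
```

Disaggregating a single headline accuracy figure into per-group rates like this is what exposes the disparity: a detector can score well on average while performing much worse on an under-represented group.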
As Dr Jie Zhang explains: “Fairness when it comes to AI is when an AI system treats privileged and under-privileged groups the same, which is not what is happening when it comes to autonomous vehicles. Car manufacturers don’t release the details of the software they use for pedestrian detection, but as they are usually built upon the same open-source systems we used in our research, we can be quite sure that they are running into the same issues of bias.
“While the impact of unfair AI systems has already been well documented, from AI recruitment software favouring male applicants, to facial recognition software being less accurate for black women than white men, the danger that self-driving cars can pose is acute. Before, minority individuals may have been denied vital services; now they might face severe injury.”
The researchers also found that the bias against dark-skinned pedestrians increases significantly in low-contrast and low-brightness conditions, raising particular concerns for night-time driving.
The researchers now hope that manufacturers will be more transparent when it comes to how their commercial pedestrian detection AI models are trained, as well as how they perform, before they hit the streets.
Dr Zhang says: “Automotive manufacturers and the government need to come together to build regulation that ensures that the safety of these systems can be measured objectively, especially when it comes to fairness.
“Current provision for fairness in these systems is limited, which can have a major impact not only on future systems, but directly on pedestrian safety. As AI becomes more and more integrated into our daily lives, from the types of cars we ride, to the way we interact with law enforcement, this issue of fairness will only grow in importance.”