
Self-driving cars struggle to see at night or in fog – but imitating the human brain can make them safe

Picture this: you’re driving on a mountain road, when you suddenly hit a thick patch of fog. You respond instinctively. Your vision sharpens, and you narrow your eyes to make out the shape of any oncoming cars.

Human beings handle these quick changes very well, but for a self-driving car – at least one with a current artificial intelligence (AI) system behind the wheel – the same situation could easily end in disaster.


Today’s AI vision systems are extremely accurate when visibility is good. On a clear, sunny day a self-driving car can recognise pedestrians, road signs and other vehicles with precision. However, these systems are extremely vulnerable to environmental changes. If it rains, or gets dark or foggy, standard AI systems become blind, incapable of detecting obstacles that a human driver would spot with ease.

Our research at the University of Valencia proposes a possible solution: instead of exposing AI models to millions of images of every possible road condition, we decided to imitate biology. But biologically speaking, why can humans see so well under such a wide range of conditions?


The brain’s ‘volume control’

In our brains, neurons do not work alone. They use a truly fascinating form of adaptation that neuroscientists call divisive normalisation.

To understand this (without getting into mathematics) we can picture it as an automated “volume control” system, with neurons working in a team. Let’s say one neuron is looking at a very dark area of the field of vision, such as a black car at night. The neighbouring neurons turn up the “volume” of this weak signal, amplifying the small details to make them more visible.

If we look at a bright light, the same thing happens in reverse. The brain turns down the volume to prevent us from being dazzled.

This mechanism is what allows us to adapt and see clearly in a very wide range of conditions. But in the search for speed and accuracy, modern AI systems have neglected this biological inspiration.
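The "volume control" idea above can be sketched in a few lines of code. The following is a minimal NumPy illustration of divisive normalisation – not the researchers' actual implementation – in which each pixel is divided by the average activity of its local neighbourhood, so weak signals in dark regions are boosted and strong signals in bright regions are damped. The function name, kernel size and the stabilising constant `sigma` are all illustrative choices.

```python
import numpy as np

def divisive_normalization(image, kernel_size=5, sigma=0.1):
    """Divide each pixel by the mean activity of its local neighbourhood.

    image: 2D array of non-negative intensities in [0, 1].
    sigma: small constant that keeps the division stable when a
           neighbourhood is almost completely dark.
    """
    h, w = image.shape
    pad = kernel_size // 2
    padded = np.pad(image, pad, mode="edge")

    # Local mean via a simple (unoptimised) sliding window.
    local_mean = np.empty((h, w), dtype=float)
    for i in range(h):
        for j in range(w):
            local_mean[i, j] = padded[i:i + kernel_size,
                                      j:j + kernel_size].mean()

    # Dark neighbourhoods -> small denominator -> signal turned "up";
    # bright neighbourhoods -> large denominator -> signal turned "down".
    return image / (sigma + local_mean)
```

Applied to a dark patch and a bright patch, this compresses the contrast between them: the 19-to-1 intensity gap between, say, values of 0.95 and 0.05 shrinks to roughly 3-to-1 after normalisation, which is the dynamic-range adaptation the brain's neurons achieve.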

Read more:
AI systems and humans ‘see’ the world differently – and that’s why AI images look so garish

AI in the driving simulator

In our study, we processed images using some of the most widely used AI models, adding layers to simulate the brain’s “volume control” mechanism. In basic terms, we forced their neurons to communicate with one another and adapt to their environment, just as our own brains do.
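As a rough illustration of what such an added layer might do (the study's actual architecture is not described here, so the details below are assumptions), one can normalise each feature map produced by a network layer by the pooled activity of all feature maps at the same spatial location – forcing "neurons" to adapt to what their neighbours are seeing:

```python
import numpy as np

def divisive_norm_layer(features, sigma=0.1):
    """Normalise each channel by the pooled activity across channels.

    features: array of shape (channels, height, width) holding
              non-negative activations (e.g. after a ReLU).
    """
    # Pooled activity of the "neighbouring neurons" at each location.
    pooled = features.mean(axis=0, keepdims=True)
    # Each channel's response is scaled relative to its neighbours,
    # mimicking the brain's divisive normalisation.
    return features / (sigma + pooled)
```

Inserted after a convolutional layer, a layer like this makes each unit's response depend on the surrounding activity rather than on raw intensity alone, which is the sense in which the neurons "communicate with one another and adapt to their environment".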

We wanted to see if imitating biology would make cars safer. To do this, we submitted both standard AI models and our brain-inspired modification to a series of tests. Using databases from real driving in European cities, night driving images from Switzerland, and several different virtual driving simulators, we were able to compare responses to different levels of fog,

 » …
