What Autonomous Cars Can't See Can Hurt You

By John R. Quain
In a year when the autonomous vehicle business has hit a big reset button, a consensus is growing among designers that self-driving cars just aren’t perceptive enough to be sufficiently safe. So engineers are now considering more sophisticated sensors — technology that can see through rain and even look underground.

“There are still deficiencies in the sensor suite,” said Bobby Hambrick, chief executive of AutonomouStuff, which supplies autonomous technology to auto researchers and developers.

He and others note the core problem: cars can’t begin to figure out what’s around them — separating toddlers from traffic cones, for example — if they can’t see those objects in the first place. In the parlance of autonomous vehicle engineering, perception has to be accurate enough to enable classification.

Until now, the standard model for autonomous cars has used some combination of four kinds of sensors — video cameras, radar, ultrasonic sensors and lidar. It’s the approach used by all the leading developers in driverless tech, from Aptiv to Waymo. However, as dozens of companies run trials in states like California, deployment dates have drifted, and it has become increasingly obvious that the standard model may not be enough.
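To make the perception-versus-classification distinction concrete, here is a minimal sketch in Python of how readings from those four sensor types might be cross-checked. Everything in it is illustrative: the Detection fields, the clustering radius, and the confidence threshold are hypothetical stand-ins, not drawn from any production system, and real cars use far more sophisticated association and tracking.

    from dataclasses import dataclass

    @dataclass
    class Detection:
        """One object detection from one sensor, in a shared vehicle frame."""
        sensor: str        # "camera", "radar", "ultrasonic" or "lidar"
        x: float           # meters ahead of the vehicle
        y: float           # meters left (+) or right (-) of center
        confidence: float  # 0.0 to 1.0: how sure the sensor is something is there

    # Hypothetical minimum combined confidence before the car may act on a label.
    CLASSIFY_THRESHOLD = 0.8

    def fuse(detections: list[Detection], radius: float = 1.0) -> list[list[Detection]]:
        """Greedily group detections that fall within `radius` meters of each
        other, so evidence from different sensors can be combined."""
        clusters: list[list[Detection]] = []
        for det in detections:
            for cluster in clusters:
                ref = cluster[0]
                if (det.x - ref.x) ** 2 + (det.y - ref.y) ** 2 <= radius ** 2:
                    cluster.append(det)
                    break
            else:
                clusters.append([det])
        return clusters

    def perceive(detections: list[Detection]) -> list[str]:
        """Return one decision per fused object: classify it, or flag it as
        unknown. This is the article's point in miniature: when combined
        confidence is too low, the car knows *something* is there but cannot
        safely say *what* it is."""
        decisions = []
        for cluster in fuse(detections):
            # Combine evidence: probability that at least one sensor is right.
            miss = 1.0
            for det in cluster:
                miss *= 1.0 - det.confidence
            combined = 1.0 - miss
            sensors = sorted({d.sensor for d in cluster})
            if combined >= CLASSIFY_THRESHOLD:
                decisions.append(f"classify object seen by {sensors} "
                                 f"(confidence {combined:.2f})")
            else:
                decisions.append(f"unknown obstacle seen by {sensors} "
                                 f"(confidence {combined:.2f}) -- slow down")
        return decisions

    if __name__ == "__main__":
        frame = [
            # A small object in rain: the camera is degraded, radar barely sees it.
            Detection("camera", 20.0, 0.5, 0.4),
            Detection("radar", 20.3, 0.4, 0.5),
            # A clearly visible object farther out.
            Detection("lidar", 50.0, -2.0, 0.95),
        ]
        for line in perceive(frame):
            print(line)

In this toy run, the camera and radar together reach only 0.70 confidence on the rain-obscured object, below the threshold, so it stays an unclassified obstacle — the gap that better sensors are meant to close.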
