It seems that different companies are backing different technologies necessary for a reliable, safe, fully self-driving vehicle.
Some, like Tesla, are leaning towards computer vision and assuming that AI will be able to navigate roads using data from cameras. Others, like MobilEye, favour a data-heavy approach using lidar, radar, and camera sensors.
Ultimately, both still need access to a detailed map of every road on earth a car may ever drive on.
In terms of sensors, there is a trade-off: with too few, a driving AI might miss obstacles and fail to satisfy safety standards, but with too many, the added cost may render the vehicles unaffordable.
Around 80% of the advanced driver assistance systems (ADAS) on the road today are built by MobilEye, which sells ADAS chips to more than 30 carmakers. It predicts that the self-driving car software it is developing will be approved by regulators and sellable to carmakers by 2025, and it aims to precision-map street features such as pedestrian crossings and traffic lights worldwide.
It could be argued that a self-driving car and a driver assistance system are very similar, as both require sensors, computer chips, and the agency to act on the information they process. The difference is that a car with ADAS still has a person driving who is ultimately responsible, while a self-driving car does not.
Though driverless cars can sense conditions, people, vehicles and other hazards, and can accelerate, turn and brake, they are not yet fully reliable. The fine margins and nuances of driving become harder to handle the closer cars get to full autonomy. In addition, understanding a multitude of regional variations, different signage and local laws is highly challenging for the AI systems currently available.
MobilEye wants autonomous vehicles to use pre-prepared high-resolution maps so that they don’t have to rely solely on sensors to process information and interpret its meaning in real time.
The map will be built and maintained by cars fitted with MobilEye driver assistance systems, whose onboard sensors gather information and upload that data to a cloud computing network. Around 5 million cars are expected to be sending data by 2022.
Since data collection began in 2018, the maps have grown to cover all of Europe and Japan and much of the US.
Ultimately, it’s likely that a balance between AI, data, and mapping will be key to success in the marketplace.