Transportation

Mobileye Picks Luminar Lidar For Initial Robotaxis While Developing In-House Lidar


It’s no secret that Mobileye, the Israel-based pioneer in camera-based advanced driver assist systems (ADAS), has been actively developing its own Level 4 automated driving systems. Like Tesla, Mobileye has always taken a camera-centric approach to the problem, but unlike the electric vehicle manufacturer, Mobileye doesn’t believe cameras alone are sufficient to solve it with the required degree of safety. Thus Mobileye has also been using lidar and radar, and it has just announced that Luminar will be the lidar supplier for its initial programs.

Mobileye is taking a unique approach to the problem of trying to ensure its automated driving system is safe. Several years ago Mobileye developed Responsibility-Sensitive Safety (RSS), a deterministic mathematical model designed to ensure that the system never causes a crash or makes one worse. Nvidia subsequently developed its own similar approach called Safety Force Field.

The difference with Mobileye goes further. Like Mobileye, most companies developing automated driving systems use at least three sensing modalities, but they take the radar, lidar and camera data and fuse them into a single three-dimensional model of the environment around the vehicle. This should theoretically create a more robust model, but it is more complex and significantly more difficult to validate.

Mobileye actually creates two independent environment models from the sensor data. One model is built exclusively from the camera data. At CES 2020, Mobileye CEO Amnon Shashua showed off this model, which the company refers to as vidar (vision distance and ranging), which uses multiple cameras to estimate distances to objects and build a 3D model. Cameras have the advantage of being better for object classification than other sensors, but can struggle in inclement weather or low-light conditions.
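Mobileye hasn’t published the details of how vidar recovers range, but the basic idea of getting depth from multiple cameras can be illustrated with classic stereo triangulation. The sketch below is a hedged, textbook example, not Mobileye’s implementation; the focal length, baseline and disparity values are illustrative only.

```python
# Minimal sketch of camera-only range estimation via stereo triangulation.
# This is NOT Mobileye's vidar implementation; it only illustrates the basic
# geometry that lets a multi-camera rig recover depth without lidar.

def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth of a point given its pixel disparity between two rectified cameras.

    focal_px     -- camera focal length in pixels
    baseline_m   -- distance between the two camera centers in meters
    disparity_px -- horizontal pixel shift of the same point between the images
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a point in front of the rig")
    return focal_px * baseline_m / disparity_px

# Example: a 1000 px focal length, 30 cm baseline and 12 px disparity
# places the object at 1000 * 0.3 / 12 = 25 m ahead.
print(stereo_depth(1000.0, 0.3, 12.0))  # 25.0
```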

Mobileye also fuses the radar and lidar data to create a second, independent environment model. However, only the camera model is actually used for path planning and control.

“The comfort of the ride can be executed and it doesn’t matter how it is executed but the backbone of our system is a camera-based system,” said Erez Dagan, executive vice president for product and strategy at Mobileye. “It has all of the richness of the semantics that’s required to drive a very comfortable ride.”

Mobileye runs the radar-lidar environment model only against the RSS model but doesn’t use that data directly for control. Similarly, it also runs the camera model against RSS, and the results of each of these checks are compared to the output of the path planning model to ensure that no commands are issued that would cause a crash. The argument is that these independent models achieve what Mobileye calls true redundancy, and each of these less complex models can also be more easily verified and validated with far less data.
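In rough pseudocode, the cross-check works like a veto: the plan comes from the camera model, but a command only goes out if both independent environment models pass the RSS-style check. The sketch below is a hypothetical illustration of that logic; none of the names or thresholds are Mobileye APIs or parameters.

```python
# Hedged sketch of the "true redundancy" cross-check described above.
# All class and function names are hypothetical illustrations, not Mobileye code.

from dataclasses import dataclass

@dataclass
class Maneuver:
    accel_mps2: float   # requested longitudinal acceleration
    steer_rad: float    # requested steering angle

def rss_check(env_model: dict, maneuver: Maneuver) -> bool:
    """Return True if the maneuver keeps a safe longitudinal gap in this model.
    Placeholder for a real RSS-style deterministic check."""
    return env_model["lead_gap_m"] >= env_model["min_safe_gap_m"] or maneuver.accel_mps2 < 0

def issue_command(camera_model: dict, radar_lidar_model: dict, planned: Maneuver) -> Maneuver:
    """Plan from the camera model, but only issue the command if BOTH
    independent environment models pass the safety check."""
    if rss_check(camera_model, planned) and rss_check(radar_lidar_model, planned):
        return planned
    # Fall back to a conservative braking command if either model objects.
    return Maneuver(accel_mps2=-3.0, steer_rad=0.0)
```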

According to Dagan, each of these models has a mean time between failures (MTBF) for perception errors of about 10,000 hours. Running them independently in parallel increases the MTBF for the whole system to 100 million hours (10,000 x 10,000), which should be safe enough to operate a Level 4 robotaxi.
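The arithmetic behind that claim assumes the two perception systems fail independently, so a system-level perception failure requires both models to be wrong at the same time. A quick back-of-envelope check of the numbers Dagan cites:

```python
# Back-of-envelope check of the MTBF claim above, under the assumption that
# the camera model and the radar-lidar model fail independently.

camera_mtbf_h = 10_000        # hours between perception failures, camera model
radar_lidar_mtbf_h = 10_000   # hours between perception failures, radar-lidar model

# Per-hour failure probabilities under the independence assumption.
p_camera = 1 / camera_mtbf_h
p_radar_lidar = 1 / radar_lidar_mtbf_h

# Both must fail in the same hour for the combined perception system to fail.
combined_mtbf_h = 1 / (p_camera * p_radar_lidar)
print(f"{combined_mtbf_h:,.0f} hours")  # 100,000,000 hours
```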

Since it debuted, Mobileye has designed its own family of systems-on-a-chip (SoCs), branded EyeQ, on which it runs its machine vision software. While Mobileye plans to use Luminar lidar sensors for its first applications, it is working with hardware design teams at its parent Intel to design both high-definition imaging radar and frequency modulated continuous wave (FMCW) lidar. The new lidar sensors are expected to be ready by 2023, while no date has been given for the radar.

FMCW lidar sends out a continuous laser beam and measures speed instantaneously via Doppler shift. Luminar and most other lidar vendors use pulsed lidar, which only measures the distance for each return; speed is calculated by measuring the difference between subsequent pulses. Imaging radar differs from current automotive radars by generating hundreds or thousands of virtual channels, with each return producing a position and speed value, yielding something that looks similar to a lidar point cloud. Together, these can create a more robust view of the environment for better comparisons to the camera environment model.
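The contrast between the two measurement principles can be shown with the standard textbook formulas: pulsed time-of-flight gives range per return, while FMCW recovers radial velocity directly from the Doppler shift of the continuous beam. The sketch below uses those generic formulas with illustrative numbers; it is not any vendor’s implementation.

```python
# Textbook contrast between pulsed time-of-flight ranging and FMCW Doppler
# velocity measurement; illustrative numbers only.

C = 299_792_458.0  # speed of light, m/s

def pulsed_range_m(round_trip_time_s: float) -> float:
    """Pulsed lidar: distance from the time a pulse takes to go out and back."""
    return C * round_trip_time_s / 2

def fmcw_radial_velocity_mps(doppler_shift_hz: float, wavelength_m: float) -> float:
    """FMCW lidar: radial velocity straight from the Doppler shift of the
    continuous beam (positive shift = target closing)."""
    return doppler_shift_hz * wavelength_m / 2

# A return arriving 200 ns after transmission corresponds to roughly 30 m of range...
print(pulsed_range_m(200e-9))                      # ~30.0 m
# ...while a 1.29 MHz Doppler shift at a 1550 nm wavelength is about 1 m/s of closing speed.
print(fmcw_radial_velocity_mps(1.29e6, 1550e-9))   # ~1.0 m/s
```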

Mobileye has been operating robotaxi pilots in Tel Aviv for some time and is partnering with providers in several other regions including China, South Korea and Dubai to expand and launch commercial services in the next few years.


