Mobileye Expands – More AV Test Sites, Lidar And Imaging Radar


CES may be completely online this year, but that didn’t stop Mobileye CEO Amnon Shashua from making his now-annual appearance to talk about the latest in driver assist and automated driving technologies. This year, Shashua’s focus was on the expansion of Mobileye’s automated driving test program, the company’s approach to safety, and the sensors it is developing in-house with parent company Intel.

Mobileye made its mark as a pioneer in the advanced driver assist systems (ADAS) sector with vision systems for features such as forward collision warning and lane keeping assist. Those technologies and more are now common in most vehicles sold in North America and Europe, and increasingly in China, and Mobileye is the market leader. With its expertise in vision systems, it makes sense that Mobileye would make cameras the centerpiece of its Level 4 (L4) highly automated driving system (ADS).

Safety has always been a key driver of Mobileye’s product strategy, and it remains so today in the development of L4 systems. When Tesla first launched its original Autopilot system on the Model S in 2015, it was powered by Mobileye technology. But following the fatal crash of Joshua Brown, Mobileye concluded that Tesla was using the system in ways it was never intended and severed the supply relationship.

One of the key challenges of moving up to higher levels of automation is validating that the technology is actually safer than human drivers. Shashua’s hour-long presentation explained the math behind the company’s determination that an automated vehicle needs a mean time between failures (MTBF) of about 100 million hours for perception errors in order to be significantly safer than a human driver. To achieve that figure, and to verify it, you need two distinct combinations of sensing and software perceiving the environment around the vehicle.

Most ADS companies use a combination of at least three sensor types (cameras, radar and lidar) and fuse the data into a single environment model. Mobileye uses the same sensors, but in a different way. A set of 11 cameras is used to create a 3D model of the environment that Shashua described at CES 2020 as “Vidar.” Only this data is used as an input to the vehicle’s path planning system. The same data is also fed into Mobileye’s responsibility sensitive safety (RSS) model, which uses mathematical formulas to ensure the system never issues a command that would cause a crash or make one worse.
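
To give a flavor of these checks, the best-known piece of RSS is its closed-form minimum safe following distance, published in Mobileye’s RSS paper. Here is a minimal sketch in Python, with illustrative parameter values rather than Mobileye’s calibrated ones:

```python
# Sketch of the minimum safe longitudinal following distance from the
# published RSS paper (Shalev-Shwartz, Shammah & Shashua, 2017). The
# parameter values below are illustrative assumptions, not Mobileye's
# calibrated numbers.

def rss_min_gap(v_rear, v_front, rho=0.5,
                a_accel_max=3.0, a_brake_min=4.0, a_brake_max=8.0):
    """Minimum gap (m) the rear car must keep so it can always stop in
    time, even if the car ahead brakes as hard as physically possible.

    v_rear, v_front -- speeds in m/s
    rho             -- rear car's response time in s
    a_accel_max     -- worst-case rear-car acceleration during rho (m/s^2)
    a_brake_min     -- braking the rear car is guaranteed to apply (m/s^2)
    a_brake_max     -- hardest braking the front car might apply (m/s^2)
    """
    v_after_rho = v_rear + rho * a_accel_max   # rear speed after response lag
    gap = (v_rear * rho
           + 0.5 * a_accel_max * rho ** 2
           + v_after_rho ** 2 / (2 * a_brake_min)
           - v_front ** 2 / (2 * a_brake_max))
    return max(gap, 0.0)

# Two cars at 25 m/s (90 km/h): the planner must never let the gap
# shrink below this value.
print(f"{rss_min_gap(25.0, 25.0):.1f} m")
```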

But cameras alone only get you to about 10,000 hours MTBF because of their limitations in measuring distance and speed and in seeing through poor weather. So Mobileye also uses radar and lidar, fused together and fed into a second instance of the RSS model. In this way, there are two separate sanity checks on the camera-based control system, and combined they are expected to achieve the 100 million hour benchmark. The approach is both redundant and diverse, making the system far less likely to issue an incorrect command, so Mobileye’s safety target looks achievable.
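
The arithmetic behind that combination is simple independence math: if the two perception channels fail independently, the chance that both misread the same scene at the same moment is the product of their individual failure rates. A minimal sketch, assuming the radar/lidar channel also reaches the 10,000-hour figure quoted for cameras:

```python
# Independence arithmetic behind the 100-million-hour target. Assumption:
# the radar/lidar subsystem matches the ~10,000-hour MTBF quoted for the
# camera subsystem, and the two fail independently.

camera_mtbf = 1e4        # hours between camera perception failures
radar_lidar_mtbf = 1e4   # assumed comparable for the radar/lidar channel

# For rare events, per-hour failure probability is roughly 1 / MTBF
p_camera = 1 / camera_mtbf
p_radar_lidar = 1 / radar_lidar_mtbf

# Both channels must misperceive the same scene for the system to fail
p_both = p_camera * p_radar_lidar
combined_mtbf = 1 / p_both

print(f"{combined_mtbf:.0e} hours")  # 1e+08 -> the 100 million hour mark
```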

However, it’s critical to keep in mind the definition of L4 automated driving. An L4 vehicle can operate completely without human supervision or intervention, but only within a limited operating domain. The operating domain restrictions can be almost anything: location, speed, weather or other criteria. Level 5 is the same, but able to operate anywhere, any time. Aside from Tesla, almost no one, Mobileye included, thinks a camera-only system can achieve L5 in the foreseeable future. Mobileye’s “True Redundancy” approach should be able to do anything Tesla can, but with less likelihood of a catastrophic failure because of the radar/lidar cross-check.
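
In software terms, an operating domain boils down to a set of preconditions the system must verify before it is allowed to drive itself. A hypothetical sketch, with criteria and thresholds that are purely illustrative:

```python
# Hypothetical sketch of a Level 4 operating-domain gate. The criteria
# and thresholds are illustrative only, not Mobileye's actual ODD.

from dataclasses import dataclass

@dataclass
class DrivingContext:
    inside_geofence: bool    # within the mapped service area
    speed_limit_kph: float   # posted limit on the current road
    visibility_m: float      # estimated visibility
    heavy_precipitation: bool

def within_odd(ctx: DrivingContext) -> bool:
    """True only if every operating-domain restriction holds; outside
    the ODD an L4 system must hand back control or reach a safe stop."""
    return (ctx.inside_geofence
            and ctx.speed_limit_kph <= 80
            and ctx.visibility_m >= 150
            and not ctx.heavy_precipitation)
```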

However, the Mobileye L4 system will have operating limitations, something Shashua acknowledges. The decision not to use radar or lidar as direct inputs to the control system will limit the operating domain. Shashua doesn’t rule out using additional sensors at some point in the future, but there are no current plans to do so.

The current Mobileye fleet has been testing in Tel Aviv, Israel for several years and began operations in Munich, Germany last fall. With successful testing in those two locations, Mobileye is now expanding its test operations. In late 2020, Mobileye AVs began running on the roads around Detroit, and in the coming months the company plans to add Paris and Shanghai. Mobileye is also working with the state of New York in hopes of adding New York City sometime in 2021.

A key to this rapid expansion is Mobileye’s Roadbook HD map system. Built on the Road Experience Management (REM) platform, this crowd-sourced map has been growing since 2018. More than 1 million vehicles from six automakers are now equipped with the Mobileye EyeQ4 chipset and contribute road feature data to REM. Approximately 700 million kilometers of roads have been mapped globally, with about 8 million km added daily; by 2024, Mobileye expects to be adding or refreshing 1 billion km daily. This has enabled Mobileye to start running in new cities within a few days, where other companies must spend several weeks building maps of each area before driving it.

The existing Mobileye test fleet is using lidar sensors provided by Luminar, a startup that is working with both Volvo and Toyota. That’s the hardware that Mobileye plans to use when it launches commercial robotaxi services in the next two years. However, that is only a temporary solution. 

Mobileye is working closely with its parent company Intel to develop new radar and lidar sensors in-house that provide better performance at a lower cost. The goal is to have an L4 system that is reliable, safe and cost-effective enough to offer on consumer vehicles by 2025. 

The Intel-Mobileye sensors are dubbed EyeC, and Shashua revealed some additional details this week. Current-generation automotive radar systems typically deliver no more than a dozen or so returns. While they can measure the distance and speed of those targets very accurately, they don’t have sufficient resolution to distinguish the type of target, or whether a stationary object is a vehicle parked on the side of the road, a sign or an overpass. This is why Tesla’s Autopilot, which relies on only a single forward radar in addition to cameras, so often runs into parked vehicles or vehicles crossing an intersection.

Mobileye and Intel are developing an imaging radar sensor that features more than 2,300 virtual channels, enabling it to create an image similar to a lidar point cloud. Since radar can “see” in the dark and through fog, rain or snow, it can be an important supplement to cameras. It’s also completely chip-based with no moving parts, so it is relatively inexpensive and robust to real-world driving conditions.
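
The virtual channel count follows from standard MIMO radar arithmetic: every transmit/receive antenna pair acts as one virtual element. Mobileye hasn’t disclosed its antenna layout, but a square array of the rough size sketched below would land near the quoted figure:

```python
# Standard MIMO radar arithmetic: every transmit/receive antenna pair
# behaves as one virtual element, so virtual channels = Tx * Rx.
# Mobileye hasn't disclosed its antenna layout; 48 x 48 is an
# illustrative assumption that lands near the quoted ">2,300" figure.

tx_antennas = 48
rx_antennas = 48

virtual_channels = tx_antennas * rx_antennas
print(virtual_channels)  # 2304 -- enough angular resolution for a
                         # lidar-like point-cloud image
```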

The lidar being developed is based on technology from Intel’s silicon photonics division. Mobileye has chosen to develop frequency modulated continuous wave (FMCW) lidar. Unlike most current lidars, which pulse the laser and measure the time for each pulse to be reflected in order to measure distance, FMCW uses a continuous laser beam. Distance and speed are both read from the frequency of the reflected light: range from the frequency difference between the outgoing and returning beams, and speed from the Doppler shift. Time-of-flight lidars have to estimate speed from the change in distance between successive pulse reflections; since FMCW measures Doppler directly, it gets instantaneous speed, just like radar. This allows more accurate speed measurements and the ability to filter out things like raindrops and snowflakes.
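
The Doppler shift involved is easy to work out: light reflected from a target moving at relative speed v comes back shifted by 2v divided by the wavelength. A back-of-the-envelope calculation at the 1310 nm wavelength Mobileye selected (discussed below):

```python
# Back-of-the-envelope FMCW Doppler shift: f_d = 2 * v / wavelength.
# Uses the 1310 nm wavelength Mobileye selected; the target speed is an
# arbitrary example.

WAVELENGTH_M = 1310e-9  # 1310 nm laser

def doppler_shift_hz(relative_speed_mps: float) -> float:
    """Frequency shift of light reflected from a target closing at the
    given relative speed (m/s)."""
    return 2 * relative_speed_mps / WAVELENGTH_M

# A vehicle closing at 30 m/s (~108 km/h) shifts the return by ~46 MHz,
# easy to separate from near-stationary clutter like rain and snow.
print(f"{doppler_shift_hz(30.0) / 1e6:.1f} MHz")
```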

Mobileye hasn’t provided details on its beam steering system, but it does claim the design is completely solid state and that Intel will manufacture the sensors using existing production processes. Mobileye has also selected an unusual laser wavelength. Most current lidar lasers operate at around 905 nm; they are relatively low cost, but their output power must be limited to prevent eye damage.

Companies such as Luminar, Aurora and AEye use 1550 nm lasers, which are eye-safe but more expensive. Mobileye has opted for 1310 nm lasers, which are easier and cheaper to make while remaining eye-safe and suffering less water absorption than 1550 nm light.

In addition to selling complete automated driving systems, Mobileye plans to offer its EyeC radar and lidar sensors to other companies that wish to buy them, opening up a new revenue stream.

Mobileye has positioned itself to remain a leader in the ADAS space while expanding its product offerings with new sensors that can supplement ADAS. As highly automated driving systems mature and become more common in the second half of the decade, Mobileye may grow its presence even further.


