Like many, if not most, developers of automated driving systems (ADS), Pittsburgh-based Argo AI has relied on Velodyne lidar sensors as a key component of its sensor suite ever since the company was founded in late 2016. However, when the first production application of the Argo ADS debuts in 2022 with Ford, it won’t have Velodyne’s laser sensor. Those vehicles will instead be equipped with sensors developed from Argo’s 2017 acquisition of Princeton Lightwave.
The decision to switch from Velodyne to its own internally developed sensor was driven mainly by performance requirements that weren’t being met. Much of Argo’s development, through several generations of Ford Fusions and now Escapes, has focused on urban driving in cities including Pittsburgh, Detroit, Miami, Washington, DC, and Austin, Texas. But Argo and its partners, Ford and more recently Volkswagen, wanted to enable operation at highway speeds as well.
Currently, the highest-performance Velodyne rotating sensor is the Alpha Prime, which has a maximum range of 220m for targets with 10% reflectivity and 150m for targets with 5% reflectivity. That was considered insufficient for highway speeds. This is partly due to the use of a 903 nm laser, a wavelength that can reach and damage the retina of the human eye. As a result, power output must be limited.
The Argo lidar is also a rotating design like the Velodyne puck sensors, but it differs in some fundamental ways that allow it to achieve significantly better claimed performance. According to Argo’s head of hardware, Zach Little, the new lidar sensor can detect 10% reflective targets at 400m and 3.5% reflective targets at 250m. Most of the industry has coalesced around range at 10% reflectivity as the specification it advertises. One of the challenges for all ADS lidar and camera sensors is detecting black objects, including vehicles but also road debris like truck tire treads.
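Those paired range figures are roughly consistent with a common first-order lidar link-budget model, in which the returned signal from an extended diffuse target falls off with the square of range, so maximum detection range scales with the square root of target reflectivity. The sketch below (an illustrative model, not either company’s actual methodology) checks the quoted numbers against that scaling:

```python
import math

def scaled_range(r_ref_m, refl_ref, refl_new):
    """Estimate max detection range at a new target reflectivity.

    Assumes returned power falls off as 1/R^2 for an extended
    Lambertian target, so range scales with sqrt(reflectivity).
    This is a first-order approximation, not a full link budget.
    """
    return r_ref_m * math.sqrt(refl_new / refl_ref)

# Velodyne Alpha Prime: 220 m at 10% reflectivity -> predict range at 5%
print(round(scaled_range(220, 0.10, 0.05)))   # ~156 m, close to the quoted 150 m

# Argo lidar: 400 m at 10% reflectivity -> predict range at 3.5%
print(round(scaled_range(400, 0.10, 0.035)))  # ~237 m, close to the quoted 250 m
```

Both predictions land within about 5% of the published figures, suggesting the specifications follow the usual square-root scaling rather than reflecting different detection regimes.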
“The black paint range, you know, we see significantly further than 50m, but at least in all scenarios we see 50m with our lidar to the really low, to the 0.3% reflectivity target, which is a specific automaker’s black paint,” said Little. “That’s the lowest we’ve found, and hence why that metric is very important to us.”
There are multiple factors that contribute to achieving that longer detection range, including longer-wavelength lasers that are in the infrared range beyond what the eye can detect. Most lidar companies that aren’t using lasers in the 900nm range, such as Luminar and AEye, have opted for 1550nm. While Little declined to say specifically what wavelength Argo is using, he did say it is greater than 1400nm. Beyond roughly 1400nm, light is absorbed by the front of the eye before it can reach the retina, so these longer wavelengths allow a higher power level for the laser, thus illuminating a longer distance.
However, the key differentiator for Argo compared to all other automotive lidars is the Geiger-mode avalanche photodiode (GmAPD) array. This is the component that detects the photons reflected back from objects. Most lidars use photosensors that create an analog output signal whose value is proportional to the number of photons detected. Closer objects with higher reflectivity generate more returns and a stronger voltage from the detector. The problem with this approach is that weaker returns yield a lower voltage, and a minimum threshold is needed to separate signal from noise.
The individual pixels on the GmAPD array detect single photons with a spike output that is clearly distinguished from noise. The result is a digital on/off output from each pixel. A statistical sampling technique aggregates the returns across the array, with many pixels picking up returns from each laser pulse. This provides the ability to accurately detect returns and reject interference at longer distances.
Argo isn’t discussing the actual number of laser emitters, but there is a column of lasers that pass through a lens to create a vertical slice of light with each pulse. Everything in that slice is detected at the same time, at much higher resolution, essentially like capturing a photo image. There isn’t actually a beam-steering system as found in most solid-state lidars, just the vertical slice and the mechanism that rotates the whole sensor.
This approach yields a resolution of 70 points per square degree. While that is significantly less than the 1,000+ points per square degree claimed by sensors with dynamic scanning, such as those from AEye or Luminar, those sensors can only achieve that very high resolution over small regions of interest. The Argo lidar captures its resolution over the full vertical field of view on every frame. Thus smaller objects that may be in motion are less likely to be missed as a mirror or MEMS beam-steering system scans back and forth, one row at a time. The dynamic scanning sensors have advantages in certain scenarios, but the Argo sensor has an edge in others.
The resolution and range capabilities allow the Argo sensor to create an image that can also be analyzed by machine-vision algorithms to classify targets. This is an important aspect of creating a safe and robust ADS. While cameras generally provide the best object classification capabilities, they are not as good at measuring distance, and they don’t perform well in low-light conditions or bad weather. Lidar like this Argo sensor performs very well in low light, and radar provides an additional complement in poor weather. The combination allows for cross-checking of results from different algorithms to provide greater certainty about what is being detected.
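A toy voting scheme conveys the cross-checking idea. This is an illustrative sketch of the general concept only, not Argo’s actual fusion logic; the function name, confidence scores, and two-of-three rule are all invented for the example:

```python
def cross_check(camera_conf, lidar_conf, radar_conf, agree_threshold=0.5):
    """Confirm a detection when at least two independent sensors agree.

    Illustrative only: real ADS fusion stacks use far richer models,
    but the underlying idea is corroboration across modalities.
    """
    votes = sum(conf >= agree_threshold
                for conf in (camera_conf, lidar_conf, radar_conf))
    return votes >= 2

# Camera and lidar both see the object: confirmed even if radar misses it.
print(cross_check(0.9, 0.8, 0.2))  # True

# Only the camera fires (e.g. a shadow misclassified): not confirmed.
print(cross_check(0.9, 0.1, 0.2))  # False
```

Requiring agreement between sensors with uncorrelated failure modes, such as a camera fooled by low light and a lidar that is not, is what gives the combined system greater certainty than any single modality.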
The Argo lidar is designed to be automotive grade, with the ability to operate from -20°C to 65°C. Another interesting feature is the choice to have the outer sensor housing rotate. Most current rotating sensors have a stationary housing with the rotating parts contained inside. Little claims that this approach allows the Argo lidar to throw off water and other debris, helping to keep it cleaner.
Some of Argo’s test fleet is already running with the in-house lidar, with most of the rest to be updated by the end of the year. Argo AI CEO Bryan Salesky worked with Chris Urmson, CEO of Aurora, on the Carnegie Mellon team that won the 2007 DARPA Urban Challenge and then later at the Google self-driving car project.