
Nvidia Launches 1,000 TOPS Automated Driving Chip, Volvo To Launch Orin-Powered System In 2022


Most people know Nvidia (NVDA) for the high-end graphics processing units (GPUs) that it designs to power video games. But over the years, it has found ways to make those GPUs useful for a lot more than just rendering textures and three-dimensional objects to shoot at. Nvidia chips are now widely used in everything from data centers to robots to the car in your driveway. During its annual GPU Technology Conference (GTC) keynote, Nvidia co-founder and CEO Jensen Huang highlighted how all of those are coming together in the company's next-generation chip, dubbed Atlan. 

Nvidia silicon is already widely used in vehicles from many brands around the world to power infotainment systems and instrument cluster displays. It can also be found in the majority of the automated driving development programs underway around the world. The notable exceptions are Tesla (TSLA), which originally used Nvidia silicon before shifting to a chip of its own design in 2019, and the automakers using Mobileye. 

Nvidia launched its first development platform for driving assist and automated driving systems (ADS) with the Drive PX in 2015. As developers have continued to hammer away at the problem of getting a computer to safely and reliably drive a vehicle, the processing demands have also increased exponentially. 

Nvidia has in turn developed ever more powerful system-on-a-chip (SoC) designs that blend the capabilities of earlier multi-chip solutions into new processors with better performance and lower power consumption. The earlier Parker and Pascal chips were supplanted by Xavier, which is now appearing in production vehicles. Platforms like Drive Pegasus, which combined two Xavier SoCs and two Volta GPUs, can now be replaced by a single Orin SoC that will arrive in production vehicles in 2022. 

Moving on to Atlan

Forthcoming designs that feature as many as four Orin SoCs, like the one in the Nio ET7, will eventually be replaced by Atlan, which Huang announced in his keynote. Atlan is a single chip that Nvidia claims will deliver 1,000 trillion operations per second (TOPS). For reference, Xavier can hit 30 TOPS and the highest-end version of the Orin family achieves 254 TOPS. The Carnegie Mellon University Chevrolet Tahoe that won the DARPA Urban Challenge in 2007 was powered by 10 blade servers with Intel (INTC) Core Duo processors that had a combined capability of about 1.8 billion operations per second. Atlan will be more than 500,000 times faster. 
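These throughput claims can be sanity-checked with a quick back-of-the-envelope conversion, taking the conventional definition of 1 TOPS as 10^12 operations per second:

```python
# Rough throughput comparison across Nvidia's automotive SoC generations.
# 1 TOPS = 1e12 operations per second (conventional definition).
ATLAN_OPS = 1_000e12   # 1,000 TOPS claimed for Atlan
ORIN_OPS = 254e12      # highest-end Orin
XAVIER_OPS = 30e12     # Xavier

print(f"Atlan vs Orin:   {ATLAN_OPS / ORIN_OPS:.1f}x")
print(f"Atlan vs Xavier: {ATLAN_OPS / XAVIER_OPS:.1f}x")
```

So a single Atlan is roughly four top-end Orins, or more than 30 Xaviers.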

While most ADS developers are using some form of Nvidia hardware, not all are utilizing these SoCs. Some have created their own custom compute platforms using off-the-shelf GPUs and various other chips, including Intel x86 CPUs. 

Nvidia’s SoCs all combine some number of ARM CPU cores with GPU cores, tensor processing units optimized for neural network calculations and a range of other devices all on a single silicon die. The current Orin SoCs have up to 12 ARM Cortex-A78 CPU cores and an integrated GPU based on the same Ampere architecture that powers the latest RTX 30-series video cards. 

Nvidia has yet to reveal all of the details on Atlan, but it will be based on the company's next-generation successor to Ampere, along with new ARM CPU cores and new cores dedicated to deep learning and vision processing acceleration. All of these cores are great at crunching data and helping to generate an understanding of the world around an automated vehicle. 

However, with dozens of cameras, radars, lidars, infrared and ultrasonic sensors feeding streams of data into a chip like Atlan, the chip itself has become more like a compact data center. As a result, Nvidia is also incorporating what it calls a BlueField data processing unit (DPU). The BlueField DPU is a new device that Huang announced at the 2020 virtual GTC. It’s based on interconnect technology that Nvidia got from its 2019 acquisition of Mellanox, a company that specialized in ultra-high speed networking for supercomputer and data center applications. Nvidia is claiming that the Atlan SoC will have 400 Gbps networking capabilities with a secure gateway. 
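To see why data-center-class networking matters on a vehicle, consider the raw bandwidth of the camera suite alone. The frame rate and bit depth below are illustrative assumptions, not published specs; the camera count mirrors a suite like Drive Hyperion's:

```python
# Back-of-the-envelope camera bandwidth (illustrative assumptions only).
MEGAPIXELS = 8e6      # 8MP camera resolution
FPS = 30              # assumed frame rate
BITS_PER_PIXEL = 12   # assumed raw sensor bit depth
NUM_CAMERAS = 12      # e.g., eight 8MP plus four fisheye cameras

per_camera_gbps = MEGAPIXELS * FPS * BITS_PER_PIXEL / 1e9
total_gbps = per_camera_gbps * NUM_CAMERAS
print(f"{per_camera_gbps:.1f} Gbps per camera, {total_gbps:.0f} Gbps total")
```

Uncompressed cameras alone can approach tens of gigabits per second before radar, lidar and ultrasonic streams are added, which is why a 400 Gbps secure gateway starts to make sense.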

In addition to raw bandwidth and data processing capabilities, an automated vehicle also needs power efficiency. Since these vehicles will all be electrified, and in most cases fully electric, reducing the power consumption of the ADS is critical to avoid killing the range. Most development ADS today consume about 1.5 to 3 kW of electrical power for compute and sensing. A production compute platform like the four-Orin system that Nio will use will likely consume somewhere in the range of 250-300 W. Nvidia isn't revealing performance-per-watt or power consumption targets for Atlan yet, but it will no doubt be more efficient than Orin.
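A rough sketch shows why those kilowatt-class development rigs matter for range. The vehicle consumption figure below is an assumption for illustration, not a spec for any particular model:

```python
# Effect of ADS compute load on EV range (illustrative assumptions only).
SPEED_MPH = 60
VEHICLE_WH_PER_MILE = 350  # assumed highway consumption for a full-size EV

def range_penalty(ads_watts):
    """Fraction of range lost to a continuous ADS electrical load."""
    ads_wh_per_mile = ads_watts / SPEED_MPH
    return ads_wh_per_mile / (VEHICLE_WH_PER_MILE + ads_wh_per_mile)

print(f"3 kW development rig: {range_penalty(3000):.1%} range loss")
print(f"300 W Orin platform:  {range_penalty(300):.1%} range loss")
```

Under these assumptions, a 3 kW rig eats over a tenth of the vehicle's highway range, while a 300 W production platform costs only a percent or two.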

Volvo picks Orin

Nvidia is targeting Atlan at 2025 vehicle production programs and test samples are probably at least 1-2 years away. In the meantime, numerous manufacturers have announced plans to use Orin beginning in 2022. Chinese automakers including Nio and SAIC as well as automated truck developers have selected the new chip. Mercedes-Benz is also working with Nvidia on a new centralized compute architecture based on Orin for programs starting in 2024. 

Volvo is now expected to be the first automaker with a global presence to launch an Orin-based system. It will arrive in 2022 starting with the next-generation XC90 and spread to other models that use the same SPA2 vehicle platform. In recent weeks, more details about the automated driving capabilities of the XC90 have been trickling out thanks to announcements from lidar supplier Luminar and Volvo subsidiary Zenseact. 

Luminar and Zenseact are collaborating on the ADS, which they are calling Sentinel, that debuts in the XC90. Volvo is planning to reveal more details on the actual functionality of the system in the coming months, but for now is providing some hints about the compute platform. Volvo has already been working with Nvidia on the development of what it calls the core computer for SPA2 platforms, based on the Xavier SoC. 

Base software management, energy management and driver assist features will run on the Xavier platform. Xavier will likely also serve as the backup computer for the ADS domain computer, which will use Orin. As we move to vehicles that don't require the continuous human supervision that current driver assist systems like GM Super Cruise or Tesla Autopilot demand, fail-operational capability is required. If anything goes wrong with the primary Orin platform, the Xavier computer will likely provide the ability to bring the vehicle to a safe stop. 

Like other companies that are moving to centralized compute platforms, the new Volvos will have reduced wiring complexity and presumably greater reliability and support for over-the-air software updates. 

Volvo isn’t the only company announcing plans to use Orin at GTC 2021. Faraday Future, the EV startup that has been struggling for several years to get past financial issues and get vehicles into production, is now targeting 2022 availability for its first product, the FF91. The FF91 and Faraday’s planned follow-up models, the FF71 and FF81, will also use Orin. 

Still lots of development and testing to do

Before anyone can launch a fully functional ADS, there is plenty more development and validation work to do. Because the environments where an ADS has to function will vary so wildly and are often difficult to repeat, it’s going to be nearly impossible to do all of the necessary testing in the real world. Every ADS development company is making extensive use of simulation including those working with Nvidia. 

Nvidia announced its Drive Constellation simulation platform back in 2018. It was based on the Drive Pegasus (2 Xavier SoCs and 2 Volta GPUs) and a rack full of Xaviers to generate simulated sensor signals. Nvidia used the ability of its GPUs to create photorealistic scenes to generate synthetic data for simulation. Starting from real world data captured by test vehicles, the sensor signals could be manipulated by changing the lighting, weather or adding other road users to the scenario. 

The challenge with any simulation is that it is only as good as its fidelity to the real world. Machine vision systems in particular are susceptible to errors based on very subtle artifacts created in digital environments. The human brain is surprisingly resilient at filtering out anomalies when processing signals from the eyes and other senses. Computers still struggle to match this capability and training a machine learning algorithm on data that doesn’t quite match reality can lead to erroneous conclusions. 

Most current simulation platforms have been built on video game engines like Unreal. These engines are excellent at replicating physics and creating environments real enough for games, but real-world perception is quite different. For its second-generation Drive Sim platform, Nvidia is adopting its Omniverse platform, which was announced in 2020. 

Omniverse is a production environment designed by Nvidia to leverage its latest RTX-series GPUs and their support for real-time ray tracing. Ray tracing is a rendering process that provides vastly more realistic lighting effects. Light reflectivity varies wildly based on the material that photons are bouncing off, including its texture, opacity and color, among other factors. An environment that can harness multiple ray tracing GPUs should look much more like the real world, getting closer to the fidelity required to have confidence that an ADS that works in simulation will also work on the road. 

On-road driving is an aspect of ADS development that occurs before, during and after simulation, and it takes a lot of hardware and software. For several years, Nvidia has been offering a kit it calls Drive Hyperion as an extension of the Drive AGX compute platforms it sells for ADS development. Hyperion adds a full suite of sensors and software for both ADS and interior experience development. 

The new eighth-generation Drive Hyperion system includes a compute platform with two Orin SoCs for automated driving plus another Orin SoC for intelligent cockpit functionality. The sensor suite includes one lidar, eight 8MP cameras, four 3MP fisheye cameras and nine radar sensors. There are also three interior cameras available for driver and occupant monitoring. Hyperion users will also get source access to the Drive AV and Drive IX software stacks. The Hyperion 8 system will be available later in 2021.
