Nvidia officials, who earlier this year unveiled the Drive PX 2 platform for self-driving cars, are opening up about the company’s next-generation autonomous vehicle technology, which will include a new GPU architecture and a computer vision accelerator.
Unveiled this week at Nvidia’s GPU Technology Conference (GTC) in Europe, the Xavier system-on-a-chip (SoC) will be a supercomputer with artificial intelligence (AI) capabilities that company officials said will act as the brain for autonomous vehicles. In introducing Xavier, Nvidia founder and CEO Jen-Hsun Huang called it “the greatest SoC endeavor I have ever known, and we have been building chips for a very long time.”
Xavier, which will begin sampling with automakers, suppliers, startups and research institutions in the fourth quarter of 2017, includes an integrated GPU based on Nvidia’s upcoming Volta architecture and a custom eight-core CPU, according to officials. It will also feature a new computer vision accelerator and deliver 20 trillion operations per second (TOPS) of performance while consuming only 20 watts of power.
Xavier will include 7 billion transistors and be built on a 16-nanometer FinFET process technology.
Nvidia officials said a single Xavier SoC will be able to replace a current Drive PX 2 platform that comprises dual mobile SoCs and two discrete GPUs while consuming significantly less power.
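For readers who want to sanity-check the efficiency claim, the arithmetic below uses only the figures cited above (20 TOPS at 20 watts). It is a back-of-the-envelope illustration, not an Nvidia benchmark, and the Drive PX 2 is left out of the comparison because its power draw is not stated here.

```python
# Back-of-the-envelope check using only the figures Nvidia cited for Xavier.
xavier_tops = 20.0    # trillion operations per second
xavier_watts = 20.0   # stated power consumption

tops_per_watt = xavier_tops / xavier_watts
print(f"Xavier: {tops_per_watt:.1f} TOPS per watt")  # -> 1.0 TOPS per watt
```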
Self-driving cars, along with AI and deep learning, are among the emerging markets that Nvidia officials are targeting as key growth areas. Over the past couple of years the company has rolled out a range of products for the nascent driverless car market, and this year it unveiled the Drive PX 2 platform, which uses Nvidia’s Tegra “Parker” processor. Parker combines the vendor’s custom “Denver 2.0” CPU cores with a Pascal-based GPU to power deep-learning applications that will make cars smart enough to recognize and respond to obstacles.
Earlier this month, Nvidia unveiled a smaller, single-processor version of the Drive PX 2 that also uses AI and machine learning, enabling the car’s computer system to collect, store and process data from multiple cameras and sensors in real time.
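To make that collect-store-process description concrete, here is a minimal, purely illustrative sketch of such a real-time ingestion loop. Every name in it (SensorFeed, process and so on) is hypothetical and does not correspond to Nvidia’s Drive PX 2 software or the DriveWorks APIs.

```python
import time
from collections import deque

class SensorFeed:
    """Hypothetical stand-in for a single camera or radar stream (not an Nvidia API)."""
    def __init__(self, name):
        self.name = name
        self._frame = 0

    def read(self):
        self._frame += 1
        return {"sensor": self.name, "frame": self._frame, "ts": time.time()}

def process(frames):
    """Placeholder for the perception/inference step that would run on the in-car computer."""
    return [f["sensor"] for f in frames]

feeds = [SensorFeed("front_camera"), SensorFeed("radar"), SensorFeed("lidar")]
history = deque(maxlen=100)                    # "store": bounded buffer of recent frames

for _ in range(3):                             # stand-in for a continuous real-time loop
    frames = [feed.read() for feed in feeds]   # "collect"
    history.extend(frames)                     # "store"
    print(process(frames))                     # "process"
```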
Nvidia also offers its DriveWorks suite of software, which includes tools, libraries and modules, to help with the development and testing of autonomous vehicles. The company is demonstrating both the Drive PX 2 and DriveWorks this week at the GTC Europe show.
Also at the event, Huang announced a partnership with mapping and navigation technology company TomTom to develop an AI-based cloud-to-car mapping system for self-driving cars. The companies will combine TomTom’s high-definition maps, which officials said already cover more than 74,500 miles of highways and freeways, with the Drive PX 2 platform to deliver real-time in-vehicle localization and mapping for autonomous vehicles traveling on highways.
In addition, the DriveWorks software development kit (SDK) now includes integrated support for TomTom’s mapping environment.
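As a rough picture of what cloud-to-car localization involves, the sketch below fetches a hypothetical HD-map tile and snaps a coarse GPS fix to the nearest lane-center waypoint. All names and data are invented for illustration and reflect neither the DriveWorks SDK nor TomTom’s actual interfaces.

```python
import math

# Hypothetical in-memory stand-in for HD-map tiles served from the cloud;
# each tile maps a coarse (lat, lon) grid cell to lane-center waypoints.
CLOUD_TILES = {
    (52, 4): [(52.001, 4.001), (52.002, 4.003), (52.003, 4.005)],
}

def fetch_tile(lat, lon):
    """Pretend cloud request: return the HD-map tile covering this position."""
    return CLOUD_TILES.get((int(lat), int(lon)), [])

def localize(gps_fix, tile):
    """Snap a coarse GPS fix to the nearest lane-center waypoint in the tile."""
    if not tile:
        return gps_fix
    return min(tile, key=lambda waypoint: math.dist(waypoint, gps_fix))

gps_fix = (52.0021, 4.0028)      # coarse position from the vehicle's own sensors
tile = fetch_tile(*gps_fix)      # cloud-to-car: pull the relevant map tile
print(localize(gps_fix, tile))   # position refined against the HD map
```

In a real deployment the tile would arrive over the air and localization would fuse camera and sensor observations rather than a bare GPS fix; the sketch only shows the division of labor described above, with the cloud serving an up-to-date map and the vehicle localizing against it.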
“Self-driving cars require a highly accurate HD mapping system that can generate an always up-to-date HD map in the cloud,” Rob Csongor, vice president and general manager of Nvidia’s Automotive unit, said in a statement.
Nvidia isn’t the only chip maker pushing into the self-driving car market. Intel in July announced a program with BMW and Mobileye to develop a technology platform that will enable the automaker to put self-driving cars on the road by 2021. NXP Semiconductors, helped by its acquisition last year of Freescale Semiconductor, is rolling out products for autonomous driving, and Renesas Electronics on Sept. 13 announced a $3.2 billion deal to buy fellow chip maker Intersil in an effort to expand its presence in the automotive market. Qualcomm has rolled out its own portfolio of products for connected and autonomous cars.