The name of the base system (platform): | Artificial intelligence (AI) |
Developers: | Nvidia |
Date of the premiere of the system: | 2016/09/15 |
Last Release Date: | 2021/04/12 |
Industries: | Transport |
Technology: | Robotics |
2021: Next Generation Nvidia Drive
On April 12, 2021, at its annual GTC conference, Nvidia introduced the next generation of the DRIVE platform, called Atlan. The developer calls it a "data center on wheels."
DRIVE Atlan essentially combines the entire computing infrastructure of a smart car on a single chip. The platform can simultaneously run the car's automated driving systems, intelligent on-board devices, and multimedia applications while maintaining a high level of safety.
DRIVE Atlan includes next-generation Arm-architecture computing cores along with accelerators for deep learning and machine vision. The reported performance exceeds 1,000 trillion operations per second.
DRIVE Atlan samples are expected to become available in 2023, with vehicles based on the platform arriving no earlier than 2025.
NVIDIA actively cooperates with automakers. Over the next six years, cars equipped with NVIDIA DRIVE technologies from companies such as Volvo Cars, Mercedes-Benz, NIO, SAIC, TuSimple, Cruise, Zoox, Faraday Future, VinFast and others will reach the roads.
The full list of announcements from GTC 2021 is available here.
2018
Integration of facial recognition
On October 15, 2018, VisionLabs announced the integration of facial recognition inside and outside the car into the Nvidia Drive platform. According to the company, the technology will replace keys and provide smart, secure access to the car, personalization, and monitoring of the driver's condition.
The VisionLabs LUNA platform can run as a plugin on the Nvidia Drive IX platform, powered by the Nvidia Drive AGX supercomputer. Thanks to the facial recognition system, drivers will not need a key to start the car: the camera recognizes the owner from a distance and confirms their identity as they approach, enabling keyless access and driving personalization.
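A short, hypothetical sketch can illustrate the keyless-access scenario described above. The Python snippet below is not the VisionLabs LUNA or Nvidia Drive IX API; it only shows the generic embedding-comparison approach that face verification systems of this kind typically use. The function names, embedding size, and similarity threshold are illustrative assumptions.

```python
# Illustrative sketch only - not VisionLabs LUNA or NVIDIA DRIVE IX code.
# Shows the generic idea behind camera-based keyless access: compare a face
# embedding from the camera with the enrolled owner's embedding and unlock
# the car only if they are similar enough.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face embeddings."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def is_owner(probe: np.ndarray, enrolled: np.ndarray, threshold: float = 0.7) -> bool:
    """Grant keyless access only if the observed face matches the enrolled
    owner above the (illustrative) similarity threshold."""
    return cosine_similarity(probe, enrolled) >= threshold

# Stand-in embeddings; in a real system they would come from a face
# recognition network applied to camera frames.
rng = np.random.default_rng(0)
owner = rng.normal(size=128)
same_person = owner + rng.normal(scale=0.05, size=128)  # slight variation
stranger = rng.normal(size=128)

print(is_owner(same_person, owner))  # True: unlock and load the owner's profile
print(is_owner(stranger, owner))     # False: keep the car locked
```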
Drive IX will allow automakers to use artificial intelligence to create next-generation cars with a fundamentally different approach to controlling them. As of October 2018, artificial intelligence for facial identification is a unique technology: it must work reliably while recognizing faces in a safe and user-friendly way. Ratin Kumar, Senior Software Director and Head of the Drive IX Platform at Nvidia |
Nvidia Drive IX uses the performance of the Nvidia Drive AGX platform, which allows automakers and suppliers to accelerate the production of automated and autonomous vehicles. [1]
Partnership with Continental
On February 5, 2018, Nvidia and Continental announced a partnership to create a complete self-driving car system with artificial intelligence (AI) based on the Nvidia Drive platform. The system is planned for market launch in 2021. The partners intend to create AI computing systems of different levels: from Level 2 with partial automation to Level 5 with full autonomy, at which point the car will no longer have a steering wheel or pedals.
Main article: Continental autonomous driving systems
Nvidia Drive IX Intelligent Experience
As of January 2018, the NVIDIA DRIVE IX Intelligent Experience platform is a set of software tools for creating AI applications such as face recognition to unlock the car, perception of the surroundings to warn the driver of possible danger, gesture recognition for control, speech recognition for voice control, driver gaze tracking, and more.
Nvidia Drive Architecture
The Nvidia Drive architecture allows car manufacturers to create and market cars and trucks that are functionally safe and can be certified to international safety standards such as ISO 26262.
Nvidia Drive is a holistic safety platform that encompasses the following processes, processors, software, algorithms, and simulation systems.
- Process: defines the steps for establishing comprehensive safety practices for designing, managing, and documenting an autonomous driving system.
- Processor design and hardware functionality: diverse processors are used for fault tolerance. NVIDIA's solutions for NVIDIA Xavier™ include CPU and GPU cores, a deep learning accelerator, an ISP for image processing, a PVA for computer vision, and video processors, all designed to the highest quality and safety standards. They support dual lockstep processing and ECC (error-correcting code) protection in memory and buses, and include built-in self-test capabilities. The ASIL-C NVIDIA DRIVE Xavier processor together with an ASIL-D safety microcontroller can provide an ASIL-D system rating.
- Software: includes leading safety technologies from key partners. The NVIDIA DRIVE operating system includes the 64-bit BlackBerry QNX real-time operating system, which holds an ASIL-D safety certificate, as well as the TTTech MotionWise safety application framework, which isolates applications from each other while providing real-time execution. NVIDIA DRIVE fully supports Adaptive AUTOSAR, an open automotive architecture and application framework. The NVIDIA tool suite, including the CUDA® compiler and TensorRT™, follows ISO 26262 tool classification levels to ensure the safety and reliability of the development environment.
- Algorithms: the NVIDIA DRIVE AV autonomous vehicle software stack performs functions such as ego-motion, perception, localization, and route planning. For fault tolerance, all functions rely on redundancy and diversity. For example, redundancy in perception is provided by the diversity of sensors used - lidars, cameras, and radars (see the sketch after this list). Deep learning and computer vision algorithms running on the CPU, CUDA GPU, DLA, and PVA further increase redundancy and diversity. The NVIDIA DRIVE AV stack can serve as a complete backup to the self-driving stack developed by the automaker, giving Level 5 autonomous cars the highest level of functional safety.
- Virtual reality simulation: a self-driving car is a highly complex system built on the most advanced technologies. The key task is to make the system behave as intended, which is captured by the term SOTIF (safety of the intended functionality). The system must behave correctly across a wide range of situations and weather conditions. Road testing is not always controllable, repeatable, exhaustive, or fast, so a realistic virtual environment is needed. NVIDIA created the NVIDIA AutoSIM virtual reality simulator to test the DRIVE platform and recreate rare conditions. Running on NVIDIA DGX™ supercomputers, NVIDIA AutoSIM is used for regression testing and is intended to simulate billions of kilometers of driving.
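The redundancy described in the algorithms item above can be illustrated with a minimal sketch. The Python snippet below is not NVIDIA DRIVE AV code; it is a simplified, hypothetical fusion step that accepts an obstacle only when at least two diverse sensor pipelines report it at roughly the same position. The Detection class, the fuse_redundant function, and the thresholds are illustrative assumptions.

```python
# Illustrative sketch only - not the NVIDIA DRIVE AV stack.
# An obstacle is confirmed only if detections from at least two different
# sensor types (camera, radar, lidar) fall within a small radius of each other.
from dataclasses import dataclass

@dataclass
class Detection:
    sensor: str   # "camera", "radar" or "lidar"
    x: float      # distance ahead of the vehicle, metres
    y: float      # lateral offset, metres

def fuse_redundant(detections, min_sensor_types=2, radius_m=1.5):
    """Cluster nearby detections and keep clusters confirmed by at least
    min_sensor_types distinct sensor types."""
    confirmed, used = [], [False] * len(detections)
    for i, d in enumerate(detections):
        if used[i]:
            continue
        cluster = [d]
        for j in range(i + 1, len(detections)):
            o = detections[j]
            if not used[j] and (d.x - o.x) ** 2 + (d.y - o.y) ** 2 <= radius_m ** 2:
                cluster.append(o)
                used[j] = True
        if len({c.sensor for c in cluster}) >= min_sensor_types:
            confirmed.append(cluster)
    return confirmed

# Camera and lidar agree on an obstacle ~20 m ahead, so it is confirmed;
# a lone radar return at 60 m lacks a second sensor type and is dropped.
obstacles = fuse_redundant([
    Detection("camera", 20.1, 0.2),
    Detection("lidar", 19.8, 0.1),
    Detection("radar", 60.0, 3.0),
])
print(len(obstacles))  # -> 1
```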
2017
Nvidia Drive PX Pegasus
On October 10, 2017, Nvidia introduced an artificial intelligence (AI) computer designed for fully autonomous robotaxis.
The system, codenamed Pegasus, extends the Nvidia Drive PX AI computing platform and makes it possible to build self-driving cars with Level 5 autonomy. According to the company, the Nvidia Drive PX Pegasus platform delivers over 320 trillion operations per second, 10 times the performance of its predecessor, Nvidia Drive PX 2.
Purpose
Nvidia Drive PX Pegasus makes it possible to create vehicles that operate without a driver: fully autonomous, with no steering wheel, pedals, or mirrors, with an interior styled like a home or office, able to arrive on demand and safely deliver passengers to their destinations, providing freedom of movement even for the elderly and people with disabilities.
Specifications
Nvidia Drive PX Pegasus is based on four high-performance processors for AI tasks. The platform combines two of Nvidia's latest SoCs with graphics cores based on the Nvidia Volta architecture and two next-generation discrete GPUs designed to accelerate deep learning and computer vision workloads. Nvidia claims the system will provide the computing capability for fully autonomous cars in a computer the size of a license plate, with significantly lower power consumption and cost.
The Pegasus system carries an ASIL D safety certification and is equipped with automotive inputs/outputs, including CAN (controller area network), FlexRay, 16 dedicated high-speed inputs for cameras, radars, lidars, and ultrasonic sensors, plus 10 Gbit Ethernet connectors. Total memory bandwidth exceeds 1 terabyte per second.
Relevance
As of 2017, of the 225 partners using the Nvidia Drive PX platform, 25 are developing fully autonomous robotaxis based on Nvidia CUDA graphics processors. For now, their trunks resemble small data centers packed with racks of Nvidia server GPUs performing deep learning, computer vision, and parallel computing. The size, power requirements, and cost of such systems prevent these cars from reaching the mass market. The computational requirements of driverless vehicles are 50-100 times higher than those of the most advanced conventional cars with a human at the wheel.
"Creating a fully self-driving car is one of the most important tasks facing our society and one of the most difficult to accomplish," said Jensen Huang, co-founder and president of Nvidia. "Pegasus, with its record AI computing speed and efficiency, will provide the breakthrough the whole industry is waiting for." |
Availability
Pegasus will be available to Nvidia partners in the second half of 2018. Nvidia DriveWorks software and Nvidia Drive PX 2 configurations are already available for developers of autonomous cars and algorithms.
Nvidia Drive PX
As of October 2017, Nvidia Drive PX is one of the solutions in Nvidia's family of AI computing products.
Neural networks trained in data centers on Nvidia DGX-1 AI supercomputers can run on Nvidia Drive PX platforms installed in cars. The unified architecture makes it possible to use the same algorithms, libraries, and Nvidia Drive tools that run in the data center to perform inference in the car. This cloud-connected approach lets cars receive over-the-air updates and continuously gain new capabilities and functions throughout their service life.
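The train-in-the-data-center, infer-in-the-car workflow described above can be sketched in a few lines. The snippet below is not Nvidia's DriveWorks or Drive PX API; it uses PyTorch and ONNX only to illustrate the general pattern of exporting a trained network into a portable format that an in-vehicle inference runtime (such as TensorRT) could then optimize. The tiny network, file name, and tensor names are illustrative assumptions.

```python
# Illustrative sketch only - not DriveWorks or Drive PX code.
# A network trained in the data center is exported to ONNX so the same
# trained weights can be deployed to an in-vehicle inference runtime.
import torch
import torch.nn as nn

# Stand-in for a perception network trained on data-center GPUs.
model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Linear(16, 4),  # e.g. four object classes
)
model.eval()

# Export the trained model to a portable format for in-car deployment.
dummy_frame = torch.randn(1, 3, 224, 224)  # one camera frame, NCHW
torch.onnx.export(
    model, dummy_frame, "perception_net.onnx",
    input_names=["camera_frame"], output_names=["class_scores"],
)
```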
As of October 2017, the Nvidia Drive PX platform scales from a single mobile processor for Level 2+/Level 3 cars to a combination of several mobile processors and discrete GPUs for Level 5 cars. These configurations are built on a single open software architecture, which lets automakers and suppliers move from development to production across a wide range of solutions - from AutoCruise for highway driving to AutoChauffeur for point-to-point trips and Pegasus for fully autonomous vehicles.
Nvidia joined AI Technology Partnership
On June 27, 2017, Nvidia announced a strategic partnership with ZF and HELLA. The goal of the partnership is to introduce artificial intelligence technologies into the NCAP (New Car Assessment Program) car safety certification program. The agreement is not exclusive.
ZF and HELLA will introduce a system for self-driving cars, which includes front cameras, software functionality and radar systems.
The goal of the partnership is to deliver an NCAP safety rating for passenger, commercial, and special vehicles through the artificial intelligence technologies of the NVIDIA DRIVE PX platform. NVIDIA DRIVE PX combines NCAP safety and self-driving capabilities in one platform.
The NVIDIA DRIVE PX platform will allow ZF and HELLA to develop software for scalable systems, starting with driver assistance systems that use camera and radar sensing technologies for automated control.
Creating a self-driving vehicle is one of the most important tasks of modern society and one of the most difficult to accomplish. Our collaboration with ZF and HELLA will bring AI-based self-driving solutions with NCAP safety support to millions of cars worldwide. |
We are building a powerful ecosystem step by step. ZF recently became the first supplier to integrate NVIDIA AI technology for automobiles and commercial vehicles into the ZF ProAI solution. A few days ago, HELLA and ZF joined forces in a non-exclusive partnership, and today NVIDIA has joined us to make roads safer and help advance autonomous driving. Dr. Stefan Sommer, CEO of ZF Friedrichshafen AG |
By combining our experience in developing front camera and radar technologies with NVIDIA's experience in deep learning platforms, we will be able to bring automated driving functions to many segments of the transport market. Dr. Rolf Breidenbach, CEO of HELLA KGaA Hueck & Co. |
2016: Nvidia Drive PX 2 AI
Nvidia Drive PX 2 AI is a computer for tasks related to artificial intelligence (AI).
On September 13, 2016, Nvidia introduced the Nvidia Drive PX 2, a computer for processing data and running workloads based on artificial intelligence (AI) technologies.
The single-processor configuration of the Nvidia Drive PX 2 AI computing platform with the AutoCruise function, which covers automated highway driving and HD map localization, consumes 10 watts and allows vehicles to use deep neural networks to process data from numerous cameras and sensors. The Chinese company Baidu plans to use the platform as the in-car computer for the self-driving cars of its cloud-to-car system.
Drive PX 2 will help automakers and partners accelerate the production of automated and autonomous vehicles. A car with a compact Drive PX 2 computer and the AutoCruise function can understand in real time what is happening around it, determine its location on an HD map, and plan a safe route.
Many automakers want to equip their cars with an AI-enabled computer in a compact, economical form factor. NVIDIA Drive PX 2 addresses this challenge for OEMs and our leading partners and complements Nvidia's existing data center solution for mapping and training. |
The Drive PX platform is used by more than 80 automakers, suppliers, startups, and research institutes working on solutions for autonomous vehicles. The Drive PX 2 architecture scales from a single mobile processor to a bundle of two mobile processors and two discrete GPUs, and up to a system of multiple Drive PX 2 computers. This scalability allows manufacturers to move quickly from development to production across a wide range of self-driving solutions - from AutoCruise for highways to AutoChauffeur for point-to-point travel and, finally, to a fully autonomous car.
Nvidia Drive PX is part of the Nvidia family of AI computing solutions. Specialists who train deep networks in the data center on NVIDIA DGX-1 can run them on Nvidia Drive PX 2 in a car. The car uses the same algorithms, libraries and tools of Nvidia DriveWorks as used in data centers.
The integrated solution is based on a unified Nvidia architecture to solve AI problems. Throughout the life of the car, the solution will receive wireless updates to add technology and capabilities.
Nvidia DRIVE PX 2 is based on a system-on-a-chip (SoC) with a graphics processor built on the Nvidia Pascal architecture. The single NVIDIA Parker SoC configuration can process data from numerous cameras, lidars, radars, and ultrasonic sensors. Automotive I/O is supported, including Ethernet, CAN, and FlexRay.
The Drive PX 2 single-processor computer will be available to manufacturers in the fourth quarter of 2016, the company's press service said.
DriveWorks software and the Drive PX 2 configuration with two SoCs and two discrete GPUs are available to developers working on autonomous vehicle solutions.