Radar scene emulator speeds path to full vehicle autonomy
To speed up testing and, more importantly, to make it more comprehensive, Keysight Technologies has introduced the Radar Scene Emulator.
It enables car makers to lab test complex, real-world driving scenarios.
The road to fully autonomous driving is currently parked at partial automation. The destination is full automation.
There are potholes on the route.
Michael Reser, Business Development Director for Automotive and Energy Solutions at Keysight Technologies, points to two challenges in validating advanced driver assistance systems (ADAS) and autonomous driving (AD): the inability of pure software simulation to properly emulate the real world, and the complexity of ADAS/AD functions in modern vehicles.
This means full-scene emulation in the lab is critical to developing the robust radar sensors and algorithms needed to realise ADAS/AD capabilities.
Keysight’s full-scene emulator combines hundreds of miniature radio frequency (RF) front ends into a scalable emulation screen representing up to 512 objects at distances as close as 1.5 meters.
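To make the numbers concrete, the sketch below computes the round-trip delay and Doppler shift that a target emulator's RF front end would, in principle, have to apply to mimic a point target at a given range and closing speed. This is textbook radar physics under assumed parameters (a 77 GHz automotive carrier), not a description of Keysight's implementation.

```python
# Sketch: the echo parameters a radar target emulator must reproduce
# for a point target. Physics only; not Keysight's actual design.
C = 299_792_458.0  # speed of light, m/s

def echo_parameters(range_m: float, radial_velocity_mps: float,
                    carrier_hz: float = 77e9) -> tuple[float, float]:
    """Return (round-trip delay in s, Doppler shift in Hz) for a point
    target at range_m, closing at radial_velocity_mps, for a radar
    operating at carrier_hz (77 GHz assumed for automotive radar)."""
    delay_s = 2.0 * range_m / C                          # out and back
    doppler_hz = 2.0 * radial_velocity_mps * carrier_hz / C
    return delay_s, doppler_hz

# A target at the emulator's quoted 1.5 m minimum distance, closing at 30 m/s:
delay, doppler = echo_parameters(1.5, 30.0)
print(f"delay = {delay*1e9:.2f} ns, Doppler = {doppler/1e3:.2f} kHz")
# → delay = 10.01 ns, Doppler = 15.41 kHz
```

The 10 ns figure shows why short minimum distances are hard: the emulator's whole receive-process-retransmit chain must fit inside that round trip.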
Automotive companies understand how complex it is to test autonomous driving algorithms, and the safety issues at stake.
Using full scene rendering that emulates near and far targets across a wide continuous field of view (FOV), the emulator enables customers to rapidly test automotive radar sensors integrated in autonomous drive systems with highly complex multi-target scenes.
Keysight’s Radar Scene Emulator employs patented technology that shifts from an approach centered on object detection via target simulation to traffic scene emulation, offering automotive OEMs the following key benefits:
- See the big picture: The Radar Scene Emulator allows radar sensors to see more with a wider, continuous FOV and supports both near and far targets. This eliminates gaps in a radar’s vision and enables improved training of algorithms to detect and differentiate multiple objects in dense, complex scenes. As a result, autonomous vehicle decisions can be made based on the complete picture, not just on what the test equipment sees.
- Test real-world complexity: Testing radar sensors against a limited number of targets provides an incomplete view of driving scenarios and masks the complexity of the real world. Keysight’s Radar Scene Emulator allows OEMs to emulate real-world driving scenes in the lab with variations in traffic density, speed, distance, and total number of targets. Scenes ranging from the common to corner cases can be tested early, while minimising risk.
- Accelerate learning: Keysight’s Radar Scene Emulator provides a deterministic real-world environment for lab testing complex scenes that today can only be tested on the road. Its test approach allows OEMs to significantly accelerate ADAS/AD algorithm learning by testing scenarios earlier, with complex, repeatable, high-density scenes; with objects stationary or in motion; with varying environmental characteristics; and without the inefficiencies of manual or robotic automation.
- Improve scene resolution: The ability to distinguish between obstacles on the road needs to be tested for a smooth and fast transition towards vehicle autonomy (i.e., Level 4 and 5 autonomy as designated by the Society of Automotive Engineers (SAE)). Keysight addresses this technology gap with point clouds (multiple reflections per object), which improve the resolution of each object.
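The point-cloud idea above can be illustrated with a short sketch: instead of presenting a vehicle as one reflection, an extended object is approximated by several scattering points spread over its body, so the sensor under test sees a resolvable extent. The point count, geometry, and radar-cross-section spread below are illustrative assumptions, not Keysight parameters.

```python
# Sketch: approximate an extended object as a cloud of scatter points
# rather than a single point reflection. Values are illustrative only.
import random
from dataclasses import dataclass

@dataclass
class Reflection:
    x: float         # longitudinal offset from the sensor, m
    y: float         # lateral offset, m
    rcs_dbsm: float  # radar cross-section of this scatter point, dBsm

def point_cloud(center_x: float, center_y: float,
                length_m: float, width_m: float,
                n_points: int = 12, seed: int = 0) -> list[Reflection]:
    """Scatter n_points reflections uniformly over an object's footprint.
    A real emulator would use a measured scattering model; this random
    spread is just a placeholder."""
    rng = random.Random(seed)
    return [
        Reflection(
            x=center_x + rng.uniform(-length_m / 2, length_m / 2),
            y=center_y + rng.uniform(-width_m / 2, width_m / 2),
            rcs_dbsm=rng.uniform(-5.0, 10.0),
        )
        for _ in range(n_points)
    ]

# One car-sized object 20 m ahead, rendered as 12 reflections:
cloud = point_cloud(center_x=20.0, center_y=0.0, length_m=4.5, width_m=1.8)
print(len(cloud), "reflections for one vehicle")
```

With multiple returns per object, a sensor's angular and range resolution against extended targets can be exercised in a way a single point target cannot provide.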
"Keysight’s Radar Scene Emulator offers automotive OEMs a breakthrough solution that will bring the road to the lab through full scene rendering," said Thomas Goetzl, vice president and general manager for Keysight's Automotive & Energy Solutions business unit. "The vision of fully autonomous vehicles is rapidly approaching, and we’re thrilled to be accelerating this vision into a reality."
Keysight’s Radar Scene Emulator is part of the company’s Autonomous Drive Emulation (ADE) platform, created through a multi-year collaboration between Keysight, IPG Automotive and Nordsys. The ADE platform exercises ADAS and AD software through the rendering of predefined use cases that apply time-synchronised inputs to the actual sensors and subsystems in a car, such as the global navigation satellite system (GNSS), vehicle to everything (V2X), camera and radar.
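The time-synchronised stimulus idea can be sketched as a scenario player that dispatches per-channel events (GNSS, V2X, camera, radar) in order against one shared scenario clock. This is a minimal illustration of the concept, assuming a hypothetical `ScenarioPlayer` class; it is not the ADE platform's actual API.

```python
# Sketch: dispatch stimuli for several sensor channels against one
# shared timeline. Hypothetical class, not the ADE platform API.
import heapq
from typing import Callable

class ScenarioPlayer:
    def __init__(self) -> None:
        self._events: list = []  # (timestamp_s, seq, channel, payload)
        self._seq = 0            # tie-breaker for equal timestamps

    def schedule(self, t_s: float, channel: str, payload: object) -> None:
        heapq.heappush(self._events, (t_s, self._seq, channel, payload))
        self._seq += 1

    def play(self, emit: Callable[[float, str, object], None]) -> None:
        # Events come out in timestamp order regardless of insertion
        # order, so every channel stays aligned to the same clock.
        while self._events:
            t_s, _, channel, payload = heapq.heappop(self._events)
            emit(t_s, channel, payload)

player = ScenarioPlayer()
player.schedule(0.10, "radar", {"targets": 512})
player.schedule(0.10, "camera", {"frame": 3})
player.schedule(0.05, "gnss", {"lat": 48.137, "lon": 11.575})
log = []
player.play(lambda t, ch, p: log.append((t, ch)))
print(log)  # events ordered by timestamp across all channels
```

Keeping all channels on one clock is what lets a predefined use case be replayed deterministically against the car's real sensors and subsystems.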
As an open platform, ADE enables automotive OEMs and their partners to focus on the development and testing of ADAS/AD systems and algorithms, including sensor fusion and decision-making algorithms. Automotive OEMs can integrate the platform with commercial 3D modeling, hardware-in-the-loop (HIL) systems and existing test and simulation environments.