Range, velocity and azimuth — or horizontal angle — have long been the key dimensions that radar systems use to perceive a vehicle’s surroundings.

But with 4D radar, another dimension has been added: elevation.
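The four measured quantities can be sketched as a simple data structure; converting them to Cartesian coordinates shows why the elevation angle matters. This is a minimal illustration only — the type and function names are assumptions, not any supplier's actual API:

```python
import math
from dataclasses import dataclass

@dataclass
class Detection4D:
    range_m: float        # radial distance to the target (meters)
    azimuth_rad: float    # horizontal angle (radians)
    elevation_rad: float  # vertical angle (radians) -- the added fourth dimension
    velocity_mps: float   # radial (Doppler) velocity (m/s)

def to_cartesian(d: Detection4D) -> tuple:
    """Convert a spherical 4D detection to x (forward), y (left), z (up)."""
    horiz = d.range_m * math.cos(d.elevation_rad)  # ground-plane projection
    x = horiz * math.cos(d.azimuth_rad)
    y = horiz * math.sin(d.azimuth_rad)
    z = d.range_m * math.sin(d.elevation_rad)      # height above the sensor
    return (x, y, z)

# A stationary return 200 m ahead, 5 degrees above the sensor --
# e.g. the edge of an overpass rather than an obstacle on the road.
overpass = Detection4D(range_m=200.0, azimuth_rad=0.0,
                       elevation_rad=math.radians(5.0), velocity_mps=0.0)
x, y, z = to_cartesian(overpass)
# z > 0 tells downstream software the return sits above the road surface.
```

Without the elevation angle, a conventional radar collapses that same return onto the road plane, which is why 3D systems struggle to tell overhead structures from stopped vehicles.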

4D radar is growing in popularity as suppliers and companies developing these sensors have found that increased range, finer resolution and elevation data are critical for more precise object detection.

Traditional radar and camera-based systems, such as for Level 2 and Level 2 Plus automated driving functions, typically have a range of around 200 meters.

Adding elevation expands the vehicle’s view of its surroundings, and 4D systems typically also extend the sensor’s range. Improving resolution provides greater detail of the scene.

Part of the appeal of the technology is not only being able to detect more details about the driving landscape and the pedestrians, objects and obstacles in it, but also having more data about situations typically challenging to sensors, such as overpasses and harsh weather conditions.

Increased range and resolution via 4D radar could make or break a vehicle’s ability to successfully navigate complex driving scenarios at higher levels of automation, said Marc Bolitho, senior vice president, engineering, for ZF’s electronics and advanced driver-assist systems division.

“For automated vehicles, there’s a need to have more capability in sensing to see further out and to see with greater resolution, so you can pick up additional objects in the road, you can distinguish and separate those objects,” Bolitho told Automotive News.

“It’s really the combination of that increased range, the ability to separate objects due to that increased resolution, and then the elevation,” he added. “But it’s not just the elevation.”

ZF’s 4D radar with increased resolution — which the supplier will be providing to SAIC, China’s largest automotive manufacturer, starting next year — has a range of 350 meters. ZF says its system has 16 times more resolution than typical automotive radar and receives about 10 data points from a pedestrian, compared with the usual one or two.

Continental has also developed a higher-resolution system using elevation — its ARS540 4D image radar maps a driving environment up to 300 meters.

Israeli startup Arbe has introduced a 300-meter “ultra-high resolution” 4D radar, which the company announced it will provide to Chinese AV tech company AutoX’s Level 4 robotaxis.

Another Israeli company, Vayyar, has developed a 4D imaging “radar-on-chip” that can be used for advanced driver-assist systems and for in-vehicle monitoring, such as child-presence detection or seat belt reminders.

These 4D systems allow a vehicle to identify many more data points than systems that don’t incorporate elevation, Bolitho said.

“We can see small objects. We can see a tire in the road, for example, or you can see a wooden pallet in the road,” Bolitho said. “You can pick up road boundaries as far as the edges of roads, if there’s some differentiation in the heights of the road edges.”

Separation

Another key point is being able to see the dimensions and orientations of other vehicles, allowing the system to separate and classify vehicles in the road that are close to one another, he said.

“You can really start to separate a car on the road versus an overpass or a tunnel,” Bolitho said. “That’s really important for autonomous vehicles because you don’t want to classify a tunnel as a vehicle and stop. You want to be able to understand that that’s clearly separated from the road surface and that you can drive through that tunnel.”
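The overpass case Bolitho describes comes down to a height check on the elevation measurement. A hedged sketch — the clearance threshold and names are illustrative assumptions, not a production classifier:

```python
import math

# Assumed minimum clearance below which an overhead return would
# actually block the vehicle; real systems tune this per platform.
BRIDGE_MIN_CLEARANCE_M = 4.5

def is_overhead_structure(range_m: float, elevation_rad: float) -> bool:
    """Return True if a stationary radar return sits high enough above
    the sensor to be an overpass or tunnel mouth rather than an
    on-road obstacle the vehicle must brake for."""
    height_m = range_m * math.sin(elevation_rad)
    return height_m >= BRIDGE_MIN_CLEARANCE_M

# A return 150 m ahead at ~2 degrees of elevation is about 5.2 m up,
# so it can be classified as drivable-under rather than a stopped vehicle.
print(is_overhead_structure(150.0, math.radians(2.0)))  # -> True
```

A 3D radar, lacking the elevation angle, cannot make this distinction and must fall back on map data or camera fusion to avoid phantom braking under bridges.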

The systems are attracting more attention from the industry as another piece of the automated-driving puzzle.

The market for 4D radar is expected to hit $6.4 billion in revenue by 2025, according to Guidehouse Insights.

But there are still challenges to address.

Incorporating that much more data about the surrounding environment requires more processing power, Bolitho said.

“Packaging all of this inside of a product that can be packaged on a vehicle to get all of this capability is one challenge,” he said.

“And then, processing all of the data as well, to be able to identify and classify the objects around the vehicle.”
