What can’t be seen can hurt you. That’s the thinking behind a range of fluidic, compressed-air, ultrasonic and aerodynamic cleaning gear, along with hydrophobic coatings, for the sensors that are a vital part of automated driving systems.

A clean windshield and healthy wipers help with driver vision and sensor capability. But only if the sensors, like the driver, are positioned behind the windshield’s spritzed and swept glass. Certain thermal-imaging, night-vision and lidar sensors don’t work if they’re mounted under glass, so they are moving onto rooftops, the rear hatch, bumpers and fenders.

“It doesn’t take much to obscure an optical sensor,” explains Russell Hester, director of product development at dlhBowles, a company that engineers and manufactures automotive cleaning systems.

When optical sensors such as rearview cameras and lidars are even partially obscured by precipitation, road grime or bug spatter, their capability drops, or they stop functioning. Any vehicle using vision-based sensors for park assist, adaptive cruise control, automatic emergency braking or lane keeping needs a clear view to ensure consistent operation.

“Many new vehicles have camera-based systems fused with radar-sensor inputs to apply the brakes to stop you from rear-ending another vehicle,” Hester states. “Right now, these are technically convenience features. Everyone should be able to look both ways before they back out of a parking spot. But if you’re not paying attention, having the vehicle stop for you if there’s cross traffic is a safety-related item. But it’s not mandated or regulated. In the future, customers will come to expect these systems to function in place of their own eyes and decision-making. It boils down to the customer expectation that if I’m spending the money, certainly in a premium vehicle, and it has this feature where it can drive me or help me drive, then I want it available.”

Recently, AAA ran two tests with four vehicles from different automakers, each equipped with automatic emergency braking. Under ideal, dry conditions, no cars ran into the soft test obstacles. But in simulated “moderate to heavy” rainfall, 17 percent crashed at 25 mph. At 35 mph, 33 percent crashed.

In a second test of the same four vehicles’ lane-keeping systems, vehicles crossed into the other lane 37 percent of the time under ideal conditions. But under simulated rain conditions, lane crossing increased to 69 percent.

“OEMs are quickly realizing that these very discerning customers are going to expect that a little bit of dirt and a little bit of condensation and a little bit of rain won’t disable system function,” Hester says. For now, though, they can.

Researchers, particularly at autonomy companies using lidar or cameras, are working on advanced software that detects the level and type of obscuration at individual sensor windows. It then determines the volume of fluid to be delivered from individual nozzles.

It might even ultimately activate ultrasonic vibrations from piezoelectric elements within the sensor to generate scum-loosening sonic shimmies. Dirt is transferred from the window into a thin film of applied fluid, and the window’s high-frequency vibration atomizes that fluid into the atmosphere. This technique could also be used to de-ice, remove condensate and shed raindrops.
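The control logic described above can be sketched in a few lines. This is a minimal, purely illustrative Python example; the obscuration score, contaminant categories, thresholds and fluid doses are all assumptions for the sake of the sketch, not any supplier’s actual algorithm.

```python
from dataclasses import dataclass


@dataclass
class CleaningCommand:
    fluid_ml: float   # washer fluid to dispense from this window's nozzle
    ultrasonic: bool  # trigger piezo vibration to atomize the fluid film


def plan_cleaning(obscuration: float, kind: str) -> CleaningCommand:
    """Map a 0..1 obscuration score and contaminant type to an action.

    obscuration: fraction of the sensor window judged blocked (0 = clear).
    kind: 'rain', 'condensation', 'dirt' or 'bug' (assumed categories).
    """
    if obscuration < 0.05:
        # Window is effectively clear: do nothing, save fluid and energy.
        return CleaningCommand(fluid_ml=0.0, ultrasonic=False)

    # Light moisture can be shed by vibration alone; grime first needs
    # fluid to lift it into a film the vibration can then atomize.
    if kind in ("rain", "condensation"):
        return CleaningCommand(fluid_ml=0.0, ultrasonic=True)

    # Dose fluid in proportion to coverage, capped to conserve the tank.
    fluid = min(2.0, 0.5 + 4.0 * obscuration)
    return CleaningCommand(fluid_ml=round(fluid, 2), ultrasonic=True)
```

A per-nozzle controller like this is what lets the system spray only the window that is dirty, rather than flushing every sensor at once.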

These future smart-scrubbing strategies will provide more effective cleanup, reduce overall energy consumption and conserve the tank of alcohol-based cleaning fluid.

Of course, sensors that don’t get dirty don’t need cleaning. Most individual sensor windows are treated with hydrophobic coatings that shed moisture. Hydrophobic washer fluids are also available that have been tested and accepted by cleaning-system makers.

A further smudge-busting tactic, reviewed at Ford Motor Co., involves aerodynamic surface features that direct airflow to divert bugs and dirt particles. One approach: ducts near the camera lens that funnel incoming air back into the overall airstream, pushing the bulk of the flow around the sensor.

Ford autonomous-vehicle system core supervisor Venky Krishnan described the automaker’s work in a 2019 blog post. He said the company designed the tiara, the structure that sits atop Ford’s AVs and houses various sensors, as a “first line of defense.” Even if an insect manages to get past the air curtain, nozzles can spray fluid to clean the dirty camera lens.

“Just as we must equip self-driving vehicles with the brains to process what’s happening in their environment, we must also equip them with the tools to deal with that environment — no matter what kind of gunk it decides to throw at them,” Krishnan wrote.
