It started with a cup of coffee.
In an attempt to curb drowsy driving, Mercedes-Benz introduced a system that identified clues that a motorist was growing tired, such as how often they drifted from their lane. If the system detected trouble, it illuminated an icon resembling a steaming cup of coffee on the instrument cluster.
Roughly a decade later, driver-monitoring systems are being readied for wider deployment. From those coffee-cup beginnings, the technology has evolved. Today, cabin-facing cameras track a driver's head position and eye gaze. In the future, these systems might assemble profiles of individual drivers and discern their emotions and cognitive state.
“There’s a progression in the technology,” said Tom Herbert, product director at Veoneer, a Swedish technology company that will introduce a next-generation driver-monitoring system this year centered on human-machine interaction. “It’s about, ‘How do we improve the relationship between the car and driver, and what are the right pieces of technology to accommodate that?’ It really becomes more of a fusion.”
As the promise of self-driving vehicles for consumer use remains far off, helping human drivers perform better could yield long-term safety gains. Automakers and safety groups believe effective driver monitoring paves the way for reductions in traffic deaths as systems safeguard against fatigue, impairment by alcohol or drugs, and inattention. Moreover, driver monitoring offers an additional safety layer for a new wave of advanced driver-assist systems that let drivers take their hands off the wheel — but not their eyes off the road.
These systems include Cadillac’s Super Cruise, which launched on the 2018 CT6 sedan and will expand to 22 more General Motors vehicles in the next three years. This month, Ford said it would bring a rival hands-free system called Active Drive Assist to the market, starting with the new Mustang Mach-E. Others developing driver monitoring include Veoneer, Australian supplier Seeing Machines and Israeli startups such as EyeSight Technologies and ADAM CogTec.
Tesla’s Autopilot feature has played a prominent, if inadvertent, role in driver-monitoring innovations. Tesla’s owner’s manual says drivers should always keep their hands on the steering wheel, and the company measures torque on the wheel to ensure human engagement.
But the National Transportation Safety Board found in investigations of multiple fatal crashes during which Autopilot was engaged that solely using steering wheel torque was an “inadequate” method for determining whether drivers were attentive. Those findings have sent some automakers, though not Tesla, in search of driver-monitoring innovations that use cameras to keep an eye on driver involvement.
The market for driver-monitoring features, and the advanced driver-assist systems of which they are part, is set to grow. Global consulting firm SBD Automotive says 8 percent of all new vehicles sold in the U.S. will contain a system with Level 2 automation by 2025, and 100 percent of those will contain a driver-monitoring system. For now, those systems remain in formative stages.
“There are lots of different flavors of driver-monitoring systems, but the ones we see in the industry today are all very, very basic, and honestly, not very effective,” said Alain Dunoyer, head of autonomous car research at SBD Automotive. “They’re more looking at, ‘How long have you been driving?’ rather than ‘How good is your driving?’ But that’s going to change.”
Mario Maiorana has been leading that change. As the chief engineer of Super Cruise, he has overseen development of what GM calls its Driver Attention System, which is responsible for monitoring drivers and communicating with them.
When a driver enables Super Cruise, a solid green light bar appears at the top of the steering wheel to indicate the system is active. Algorithms running on a feed from an inward-facing camera determine whether the driver’s head and eyes are facing the road.
Should a driver’s attention wander for approximately five seconds — it varies depending on operating conditions — the green light will flash, an indication a driver should pay attention. If they fail to do so, a red indicator will flash, and the car will provide haptic and audio warnings. If those are ignored, Super Cruise will ultimately disengage and tell the driver to retake control.
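The escalation sequence described above can be sketched as a simple tiered state machine. This is a purely hypothetical illustration — the state names, five-second interval and step logic are assumptions for clarity, not GM's actual implementation:

```python
# Hypothetical sketch of an attention-escalation ladder, loosely
# modeled on the Super Cruise behavior described in the article.
# Thresholds and state names are illustrative assumptions.

ESCALATION_STEPS = [
    "green_flash",       # visual prompt: pay attention
    "red_flash",         # stronger visual warning
    "haptic_and_audio",  # seat vibration plus audible alert
    "disengage",         # system tells the driver to retake control
]

def escalation_state(inattentive_seconds: float,
                     step_interval: float = 5.0) -> str:
    """Map continuous inattention time to a warning level.

    Each additional `step_interval` seconds of sustained inattention
    moves the system one rung up the escalation ladder, capped at
    the final 'disengage' step.
    """
    if inattentive_seconds < step_interval:
        return "active"  # solid green bar, no warning
    step = int(inattentive_seconds // step_interval) - 1
    return ESCALATION_STEPS[min(step, len(ESCALATION_STEPS) - 1)]
```

The key design point the sketch captures is that warnings escalate gradually rather than jumping straight to disengagement, which matches the balance Maiorana describes between vigilance and annoyance.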
One key in designing the system has been finding a balance between ensuring attentiveness and not making the technology so quick to intervene that drivers stop using it.
“We think we’ve struck a very good balance between detecting driver attention while not being annoying,” Maiorana said. “Another learning is that the Driver Attention System really becomes a training aid. It trains you to pay attention. People using Super Cruise don’t like it when the system gives control back.”
That balance helps establish trust with drivers, who want enhancements that incentivize positive habits more so than buzzers and warnings telling them what they should not or cannot do.
Though deciphering the state of everyday motorists is a relatively new concept for the automotive sector, driver monitoring has been standard practice in certain corners for more than a decade.
Seeing Machines, an Australian company that utilizes computer vision systems to observe driver attention, has been using cameras to watch operators of specialized mining equipment for 15 years.
“We got our start on these very large earth-moving trucks that are three stories tall,” said Nick DiFiore, senior vice president and general manager of the company’s automotive practice. “They run long shifts, and you can imagine the damage that can be done with a mistake.”
So Seeing Machines outfitted vehicles with early operator-monitoring systems, providing a shake of the seat for any driver showing signs of fatigue. Further, the company would send an alert to a designated manager who could ensure the driver took a break.
The company had been working with automakers and gained further traction in the industry in November 2017, when NTSB investigators concluded Tesla’s steering-wheel torque measuring was not a sufficient way to determine driver engagement. The agency recommended all manufacturers developing Level 2 automated systems should “develop applications to more effectively sense the driver’s level of engagement and alert the driver when engagement is lacking.”
“We started getting calls almost immediately after that report came out,” DiFiore said. “It was a turning point.”
He said Seeing Machines now works with six automakers on projects in various stages of development.
Across the industry, those features are growing more sophisticated. Current systems can monitor head position, eye gaze and blink rates. In the future, cars will build profiles of individual drivers so a vehicle can tell when behavior deviates from that driver’s norm. More sophisticated algorithms may track ocular parameters to help automakers understand the cognitive condition of humans behind the wheel — something that may be a better measure of their abilities than head position.
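The profile-based idea in the paragraph above — flagging when a driver deviates from their own baseline — can be illustrated with a toy statistical check. This is entirely hypothetical; production systems would use far richer models than a single-metric z-score:

```python
import statistics

def deviates_from_profile(baseline_blink_rates: list[float],
                          current_rate: float,
                          z_threshold: float = 2.0) -> bool:
    """Flag when a driver's current blink rate (blinks per minute)
    strays from their own historical baseline by more than
    `z_threshold` standard deviations.

    A toy illustration of per-driver profiling, not a real
    driver-monitoring algorithm.
    """
    mean = statistics.mean(baseline_blink_rates)
    stdev = statistics.stdev(baseline_blink_rates)
    if stdev == 0:
        return current_rate != mean
    z_score = abs(current_rate - mean) / stdev
    return z_score > z_threshold
```

The point of comparing against a personal baseline rather than a fixed threshold is exactly the shift Dunoyer describes: measuring “how good is your driving?” for this driver, instead of applying one rule to everyone.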
“Theoretically, you could be looking at the road and still be distracted, so we’re trying to understand that,” Veoneer’s Herbert said. “We see this as being an ever-evolving performance in regard to understanding the driver.”
The evolutions come at a fortuitous time. Wolfe Research, another automotive consulting firm, estimates that advanced assist systems popularly known in the industry as Level 2 Plus will be a $6 billion market by 2023.
For the engineers, the benefit is in seeing driver monitoring play a role in continuing a shift from passive safety systems to active ones that result in significant declines in collisions and traffic deaths. Autonomous vehicles were supposed to usher in that era, but the enabler may instead be a marriage between man and machine.
“It isn’t just driver monitoring,” Herbert said. “But it’s a driver and a car taking in all this environmental information and making new decisions. And I think what you’ll see is a paradigm shift because of all the technology associated with this collaborative side of driving.”