Eye-tracking systems used to detect driver drowsiness or inattention will be ready for production within a few years, but will most likely be packaged as part of a larger sensor net of interior and exterior camera and radar devices. Ford's Model U concept makes heavy use of the camera-based safety devices that will pave the way for eye-tracking.
The auto industry’s high-tech vehicle safety effort is paying a lot of attention to paying attention, since driver drowsiness and inattention are contributing factors in a large percentage of accidents. For example, the National Highway Traffic Safety Administration (NHTSA) estimates that drowsiness plays a key role in about 100,000 crashes annually. Detecting when drivers are not paying attention to the road, and developing ways to keep them alert, could therefore pay enormous dividends. Recent developments promise to put the technology in consumers’ hands within a few years.
Eye-tracking. The key to determining where drivers are looking is tracking eye movements. The most cost-effective way of doing this uses a vision system to take video of the eyes and analyze their movements. The hardware involved is relatively simple: two LED light sources and a small camera mounted somewhere in the instrument panel. The LEDs use an infrared beam instead of visible light, and, along with the camera, are focused on what Huan Yen, manager of Advanced Information and Entertainment Systems at Delphi (Kokomo, IN), calls the “headbox”: the three-dimensional space where most people hold their head while driving.
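The headbox idea can be sketched simply: a fixed volume in the camera's coordinate frame, against which a tracked head position is tested. The dimensions and coordinate conventions below are invented for illustration; a real system would define this volume empirically for the vehicle's seating geometry.

```python
# Illustrative only: the "headbox" as an axis-aligned 3-D volume in the
# camera's coordinate frame. All dimensions here are made up.
HEADBOX = {"x": (-0.15, 0.15), "y": (-0.10, 0.20), "z": (0.45, 0.80)}  # meters

def in_headbox(point):
    """True if a tracked head position (x, y, z) lies inside the headbox."""
    bounds = (HEADBOX["x"], HEADBOX["y"], HEADBOX["z"])
    return all(lo <= p <= hi for p, (lo, hi) in zip(point, bounds))

print(in_headbox((0.0, 0.05, 0.6)))  # True: head in normal driving position
print(in_headbox((0.0, 0.05, 1.2)))  # False: head outside the tracked volume
```

A containment test like this lets the system decide when eye-tracking results are trustworthy and when the head has left the volume the optics are focused on.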
Once the headbox is defined, the next step is to teach the camera how to locate the eyes. This requires sophisticated software algorithms. “The camera is fairly dumb and it needs clues to find the eyes,” says Yen. But not just any clues: if the camera is instructed to look for oval shapes, for example, it could end up tracking a driver’s mouth or nostrils. To accurately locate the eyes, system developers use criteria like “monovision pupil tracking,” which exploits the differences in color and reflectivity between the pupil and iris.
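A toy version of that reflectivity cue can be shown in a few lines. Under on-axis infrared illumination the pupil retroreflects and appears much brighter than the surrounding iris, so a crude detector can threshold the frame and take the centroid of the bright pixels. This is a minimal sketch, not Delphi's algorithm; the frame format and threshold are assumptions for the example.

```python
# Hypothetical sketch: locating a pupil candidate in a grayscale IR frame
# (a list of rows of 0-255 pixel intensities), using the brightness
# contrast between the retroreflecting pupil and the darker iris.
def find_pupil(frame, threshold=200):
    """Return the (row, col) centroid of pixels brighter than threshold,
    or None if no candidate is found."""
    total = sum_r = sum_c = 0
    for r, row in enumerate(frame):
        for c, px in enumerate(row):
            if px >= threshold:  # bright-pupil effect under on-axis IR light
                total += 1
                sum_r += r
                sum_c += c
    if total == 0:
        return None
    return (sum_r / total, sum_c / total)

# Tiny synthetic frame: a 2x2 bright "pupil" at rows 2-3, cols 2-3.
frame = [[30] * 6 for _ in range(6)]
for r in (2, 3):
    for c in (2, 3):
        frame[r][c] = 250

print(find_pupil(frame))  # (2.5, 2.5)
```

Production systems layer far more validation on top (shape, size, temporal consistency) precisely because, as Yen notes, naive cues lock onto the wrong features.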
The system tracks drowsiness or lack of focus using several eye-movement criteria. “Point of gaze” tells whether the driver’s eyes are focused at the proper point on the road ahead, and “gaze variability” measures how quickly the eyes move from one object to another. “When you are alert your gaze tends to jump around between objects like signs, trees and other cars on the road,” says Yen. “But if you are tired you have gaze fixation, and as time goes on your field of view narrows.” Percentage of eyelid closure (known as “PERCLOS”) and blink rate also help determine if the driver is getting sleepy, since people blink more when they are tired and eyelid closure gradually increases.
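The eyelid metrics are straightforward to compute once the tracker reports a per-frame eyelid-openness value. The sketch below shows the idea; the openness scale, thresholds and sample data are invented for illustration, not taken from any production system.

```python
# Illustrative sketch: PERCLOS and blink counting from per-frame
# eyelid-openness samples, where 1.0 = fully open and 0.0 = fully closed.
# The 0.2 "closed" threshold is an assumption for the example.
def perclos(openness, closed_below=0.2):
    """Fraction of samples in which the eyelid is mostly closed."""
    closed = sum(1 for o in openness if o < closed_below)
    return closed / len(openness)

def blink_count(openness, closed_below=0.2):
    """Count open-to-closed transitions (one per blink)."""
    blinks, was_closed = 0, False
    for o in openness:
        is_closed = o < closed_below
        if is_closed and not was_closed:
            blinks += 1
        was_closed = is_closed
    return blinks

samples = [1.0, 0.9, 0.1, 0.05, 0.9, 1.0, 0.1, 0.8, 1.0, 0.9]
print(perclos(samples))      # 0.3
print(blink_count(samples))  # 2
```

In practice both metrics are computed over sliding time windows, so a rising PERCLOS and blink rate can be flagged as a trend rather than a single noisy reading.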
Cheaper hardware, faster software. Advances in eye-tracking research have accelerated in the past two years as the cost of computing power has dropped, allowing software engineers to develop more sophisticated and robust algorithms to precisely track eye movements. Vision-based digital signal processors also have gotten faster and less costly, and as any camcorder owner knows, camera technology continues to improve. Until recently, developers had to rely on expensive CCD (charge-coupled device) imaging solutions to gain the clarity needed for eye-tracking, but research has now shifted entirely to the use of CMOS (complementary metal-oxide semiconductor) imagers. CMOS technology is the basis of most of the world’s integrated circuit production, so the newer cameras can take advantage of both the rapid-fire advances in the semiconductor industry and huge economies of scale. Yen puts it succinctly: “With CMOS you can get cheap high-resolution cameras.”
Roadblocks. But as is always the case with automotive electronics, is “cheap” cheap enough? Developers agree that one of their biggest challenges is coming up with a camera that can be mass produced for a few dollars a unit yet still meet the rigorous temperature, vibration and reliability requirements of the automotive environment. And though tremendous strides have been made in the last couple of years, the technology still requires a good bit of fine tuning. Serge Boverie, who coordinates Siemens VDO Automotive’s eye-tracking R&D project, says that the main inhibitors to system performance are mundane things: fast head movements, eyeglasses and bright sunlight. But there is optimism that these problems can be overcome quickly through a mix of further algorithm refinements and new hardware, such as next-generation CMOS high-dynamic-range sensors that limit the saturation effect of sunlight and allow cameras to “see” the eyes more clearly.
|Wake Up Call|
Realizing that technologies like eye-tracking could have a huge impact on reducing traffic accidents caused by drowsiness and inattention, government entities have been funding research projects to speed development along. One of the most ambitious is the European Union's AWAKE (System for effective Assessment of driver vigilance and Warning According to traffic risK Estimation; a long reach for a credible acronym). The goal of the project is to develop several interconnected modules that monitor both the driver and the traffic situation and issue warnings when potentially dangerous situations arise. Key participants include Siemens, DaimlerChrysler and Fiat.
The future of eye-tracking is also dependent on cost and technology advances in other areas of automotive electronics research. In order to be most effective, eye-tracking must work in concert with the exterior video and radar sensor net currently under development. (For more information, see “Danger Ahead,” AD&P, December 2002.) External sensors that can determine the number and proximity of other vehicles give the eye-tracking system far more capability to determine if the driver is looking where he should and respond to the situation. “Our ultimate goal is to have a more intelligent system that can assess both the driver’s situation and the traffic situation, and change the driver warning threshold accordingly,” explains Yen.
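The threshold adjustment Yen describes can be sketched as a simple fusion of two scores: one for driver inattention from the eye-tracker, one for traffic risk from the exterior sensors. The scoring scheme and all numbers below are invented for illustration.

```python
# Hedged sketch of an adaptive warning threshold: the denser or riskier
# the traffic reported by exterior sensors, the lower the inattention
# level needed to trigger a warning. All scales and constants are assumed.
def should_warn(inattention, traffic_risk, base_threshold=0.8):
    """Decide whether to warn the driver.

    inattention:  0.0 (alert, eyes on road) .. 1.0 (drowsy / eyes off road)
    traffic_risk: 0.0 (open road)           .. 1.0 (imminent hazard nearby)
    """
    # Scale the threshold down as exterior sensors report more risk,
    # so the same level of inattention fires a warning sooner.
    threshold = base_threshold * (1.0 - 0.5 * traffic_risk)
    return inattention >= threshold

print(should_warn(0.5, 0.1))  # False: mild inattention, light traffic
print(should_warn(0.5, 0.9))  # True: same inattention, dense traffic
```

The design point is that the same gaze behavior is tolerable on an empty highway but not in heavy traffic, which is exactly why the interior and exterior sensor nets need to be fused rather than run independently.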
Determining the best way to alert drivers to a potentially dangerous development is an open question. Heads-up displays, audible messages and vibrating seats and steering wheels have all been researched, but no clear winner has emerged. (Though it would seem that a heads-up display projected on the windshield to tell a driver that he is not looking through the windshield is a clear loser.) Not surprisingly, different people prefer different warning mechanisms. And while from the safety standpoint redundant warnings might be preferred, the concomitant increase in system costs would be less than welcome.
But even if the technological hurdles are surmounted and an attractive system cost is achieved, the question remains: Do people want the electronic equivalent of their third-grade teachers sitting in the car hectoring them about not paying attention? Certainly not as a standalone system, but the likelihood of that is fairly remote anyway. The more feasible scenario would make eye-tracking part of a larger safety sensor package designed to give the driver more information about exterior threats, and only secondarily act as a distraction policeman. As to when cars with eye-tracking technology will show up in dealerships, Yen estimates it will happen within two to five years.