Sandia LabNews

Sandia researchers help to make cars smarter



Editor’s Note: The following story is part of a series of articles on Sandia’s Cognitive Science and Technology Program.

Cars already automatically lock doors when they sense motion and turn on warning lights if they detect potential engine problems.

But they are about to get smarter.

Sandia’s augmented cognition research team is designing cars capable of analyzing human behavior.

The car of the future that the team is developing may, for example, deduce from your driving that you’ve become tired, or during critical situations, tell your cell phone to hold an incoming call so you won’t be distracted.

The project started about five years ago with funding from the Defense Advanced Research Projects Agency (DARPA). Four years ago Sandia partnered with a major commercial automobile manufacturer, and three years ago the team conducted experiments on European roadways.

“We utilized data that already existed on the car’s computer to collect a wide range of physical data such as brake pedal force, acceleration, steering wheel angle, and turn signaling,” says Kevin Dixon (6341), principal investigator. “And specialized sensors, including a pressure-sensitive chair and an ultrasonic six-degree-of-freedom head tracking system, measured driver posture.”

Five drivers were fitted with caps connected to electroencephalogram (EEG) electrodes to gauge electrical activity of the brain as they performed driving functions.

The researchers collected several hours of data in unstructured driving conditions and fed it into Sandia software, referred to as “classifiers,” that categorized driving behavior. These classifiers could detect certain driving situations, such as approaching a slow-moving vehicle or changing lanes in preparation to pass another vehicle.
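In rough terms, a classifier of this kind maps short windows of vehicle telemetry to a driving-situation label. The sketch below is illustrative only; the feature names, labels, and the use of a scikit-learn decision tree are assumptions for explanation, not details of Sandia's software.

    # Illustrative sketch, not Sandia's code: label windows of telemetry
    # with a driving situation. Features/labels are placeholder names.
    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    FEATURES = ["brake_pedal_force", "acceleration", "steering_wheel_angle",
                "turn_signal_on", "headway_closing_rate"]
    LABELS = ["normal_cruise", "approaching_slow_vehicle", "preparing_to_pass"]

    def train_classifier(X, y):
        """Fit a simple classifier on telemetry feature windows.
        X: shape (n_windows, len(FEATURES)); y: label indices into LABELS."""
        clf = DecisionTreeClassifier(max_depth=5, random_state=0)
        clf.fit(X, y)
        return clf

    def classify_window(clf, window):
        """Return the predicted driving situation for one feature window."""
        return LABELS[int(clf.predict(window.reshape(1, -1))[0])]

    # Toy usage with synthetic data, just to show the shapes involved.
    rng = np.random.default_rng(0)
    X_train = rng.normal(size=(200, len(FEATURES)))
    y_train = rng.integers(0, len(LABELS), size=200)
    model = train_classifier(X_train, y_train)
    print(classify_window(model, X_train[0]))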

The system gauges the difficulty and stress of the task the driver is attempting. It then tries to modify the tasks and/or environment to lower the stress and improve driving performance.
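One way to picture the mitigation step is as a simple rule that defers non-critical interruptions, such as an incoming phone call, when the estimated task difficulty is high. The scores and threshold below are illustrative assumptions, not values from the Sandia system.

    # Hedged sketch of the mitigation step. Difficulty scores and the
    # threshold are illustrative assumptions only.
    DIFFICULTY = {
        "normal_cruise": 0.2,
        "approaching_slow_vehicle": 0.6,
        "preparing_to_pass": 0.9,
    }

    def should_hold_call(situation: str, threshold: float = 0.7) -> bool:
        """Defer an incoming call when estimated difficulty exceeds the threshold."""
        return DIFFICULTY.get(situation, 0.0) >= threshold

    print(should_hold_call("preparing_to_pass"))  # True: hold the call
    print(should_hold_call("normal_cruise"))      # False: ring through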

Similar experiments were conducted for off-road driving where conditions were much less structured than typical roadways.

“The beauty of this is that we aren’t doing anything new or different to the car,” Kevin says. “All the software that can make the determination of ‘dangerous’ or ‘safe’ driving situations would be placed in the computer that already exists in the car. It’s almost like there is another human in the car.”

More recently, the researchers conducted experiments at Camp Pendleton with Marine Corps personnel driving a modified military vehicle. Once again the driver and the front-seat passenger were fitted with EEG electrodes. The software classifier determined how difficult the driving situation was and which of the two was better positioned to perform a given task. For example, during a difficult driving maneuver, it might be best for the passenger to receive radio transmissions so the driver is not distracted.
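The crew-allocation idea can be sketched as routing a new task to whichever occupant currently has the lower estimated workload. The function and thresholds below are hypothetical illustrations, not the actual Camp Pendleton software.

    # Illustrative sketch of workload-based task routing between driver
    # and passenger. Names and values are assumptions for illustration.
    def route_task(driver_workload: float, passenger_workload: float,
                   task: str = "incoming radio transmission") -> str:
        """Return which occupant should handle the task."""
        if driver_workload > passenger_workload:
            return f"route {task} to the passenger"
        return f"route {task} to the driver"

    # During a difficult maneuver the driver's workload is high, so the
    # transmission goes to the passenger.
    print(route_task(driver_workload=0.85, passenger_workload=0.30))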

“Every year tens of thousands of people die in automobile crashes, many caused by driver distraction,” Kevin says. “If our algorithms can identify dangerous situations before they happen and alert drivers to them, we will help save lives.”
