Given the sheer number of things that go on in cars these days, it is, perhaps, a touch worrying that scientists still aren’t quite sure how the human brain reacts to distraction. It’s not really their fault: Sophisticated camera rigs and eye-glance analysis tools have only shown up in the last decade or so.
The amusements, too, are new. There are the time-honored diversions—yelling at your kid in the back, adjusting the radio, scarfing down a sandwich. And then there’s the veritable amusement park of novel options: following the in-car navigation system, texting mom, ‘gramming your commute.
Nearly 40,000 people died on American roads last year, and experts believe the damage done by distraction has spiked.
If only the divertissements inside your car worked with you, if they knew exactly when you needed to keep two eyes on the road—and didn’t beckon you to do the opposite. Getting them to do that is the goal of researchers with the Massachusetts Institute of Technology’s AgeLab and Touchstone Evaluations, a human factors engineering firm based in Michigan. Funded by major auto and tech players like Denso, Honda, Jaguar Land Rover, Google, and Panasonic, the researchers are working to accurately model how humans act inside cars, and to shape their behavior to keep them safe.
“How can I keep the driver’s awareness of the situation high while they search for something to listen to on their new infotainment system?” says Linda Angell, a former General Motors engineer who heads up Touchstone. “How can I structure this task in a way that their eyes are on the road, and give them frequent enough breaks, and cue them to look at the road once in a while?”
Last week, the team released a paper that seeks to capture human “attentional awareness” in mathematical terms—with an algorithm. One day soon, they hope auto suppliers and designers will use this knowledge to build products that will aid drivers in, you know, not killing themselves and others.
Attentional Awareness FTW
Lawmakers and parents like to talk about “driver distraction,” but it’s not a simple idea. There’s no on or off switch for driver focus. Attention, like so many things, is a spectrum, and it combines many elements.
“Most of the research in the past has been either visual, audible, or haptic—they haven’t been combined all into one,” says Douglas Patton, Denso’s head of engineering.
In 2012, government-sponsored researchers rigged up the vehicles of 2,600 regular drivers in six states with cameras and sensors, then left them alone for more than a year. The result is a large, objective, and detailed database of actual driving behavior, the kind of info that’s very useful if you want to figure out exactly what causes crashes.
The MIT researchers and their colleagues took that database and added another twist. While many scientists looking to crack why a crash happened might look at the five or six seconds before the event, these researchers backed it all the way up, to around 20 seconds beforehand.
“Upstream, further prior to an event, we begin to see failures in attention allocation that are indicative of less awareness in the operating environment in the crash events,” says Bryan Reimer, an engineer who studies driver behavior at MIT. In other words: The problems that cause crashes start well before the crunch.
It all comes down to eye glances. Sure, the more time you spend looking off the road, the higher your chance of crashing. But the time you spend looking at the road matters, too. If your glances at, say, the texts in your lap are longer than the darting ones you make back to the highway in front of you, you gradually lose awareness of where you are in space.
Usually, drivers are pretty good at managing that attentional and situational awareness, judging when it’s appropriate to look down at the radio, for example. But smartphones and in-car infotainment systems present a new issue: The driver isn’t really deciding when to engage with the product. “If the phone goes brrrrring, you feel socially or emotionally compelled to respond to it,” says Reimer. The problem is that the cueing arrives with no regard to when’s a good time.
AttenD
The algorithm the researchers tested in this paper—one called AttenD, which dates back to 2009—turns out to be pretty good at predicting when crashes happen based on what drivers were doing in the 20 or so seconds beforehand. That means that maybe, one day soon, scientists could use this kind of math to build and then test products that are safe to use in the car.
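The intuition behind a glance-based model like this is simple enough to sketch in a few lines of code. What follows is a minimal, hypothetical Python sketch of a buffer-style glance model in the spirit of AttenD—assuming a roughly two-second attention buffer that drains while the eyes are off the road and refills during on-road glances. The constants, function names, and the example glance sequence are illustrative assumptions, not the published model’s parameters.

    # Illustrative sketch of a buffer-style glance model (in the spirit of AttenD).
    # All constants and names here are assumptions, not the published parameters.

    BUFFER_MAX = 2.0   # assumed size of the attention buffer, in seconds
    STEP = 0.1         # simulation time step, in seconds

    def run_buffer(glances):
        """Track the attention buffer over a list of (location, duration) glances.

        location is "road" or "off_road"; duration is in seconds.
        Returns the buffer level sampled at each time step.
        """
        level = BUFFER_MAX
        trace = []
        for location, duration in glances:
            elapsed = 0.0
            while elapsed < duration:
                if location == "road":
                    # refill while the eyes are on the road
                    level = min(BUFFER_MAX, level + STEP)
                else:
                    # drain while the eyes are off the road
                    level = max(0.0, level - STEP)
                trace.append(level)
                elapsed += STEP
        return trace

    # Example: long glances at a phone, short darting glances back to the road.
    glance_sequence = [("road", 2.0), ("off_road", 1.5), ("road", 0.5),
                       ("off_road", 2.0), ("road", 0.5)]
    levels = run_buffer(glance_sequence)
    print(f"final buffer level: {levels[-1]:.1f} s")  # near zero means degraded awareness

The point of the sketch is only the trade-off the researchers describe: short darting glances back to the road can’t always undo long glances away from it, so the buffer—and with it, awareness—slowly runs down.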
New, more human-friendly tech could, say, declutter the car’s instrument panel in situations that require more attentional awareness. Getting ready to make a left turn at a large intersection? Maybe it’ll hold off buzzing about that new text. Driving on the highway in heavy rain? Maybe it won’t let you navigate through a menu to queue up a podcast.
This research could also help regulatory agencies come up with badly needed standards for things like semi-autonomous vehicles, or spur automakers to come up with them on their own. “We’re hoping to come up with some kind of numeric grading system,” says Patton, the Denso engineering chief. A five-star anti-distraction product could one day adapt to the kind of driver behind the wheel (a teen, an older person, someone with a heart condition).
Work like this isn’t quite ready for the big time. “What makes me nervous about models like this is that people start using numbers and I don’t think we know what numbers mean,” says Charlie Klauer, an engineer who studies distracted driving in novice drivers at the Virginia Tech Transportation Institute. She emphasizes that cracking human attention can’t all be on designers—drivers will need to be educated on the dangers of fiddling with stuff behind the wheel, and cops will need to enforce existing anti-texting laws. So, early days.
But this kind of research only becomes more important as vehicles with automated features hit the road in greater numbers. Automakers like Tesla, Mercedes-Benz, Audi, and General Motors already offer, or will soon offer, vehicles with partially automated features that handle highway driving.
Even in these cars, human drivers remain vital. They need to know when they should retake control from the robots. And that means paying attention.
Source: Wired