
Your eyes are the key to distracted driving, not your brain

Relying on peripheral vision is the problem, not the cognitive load of multitasking.

The key to avoiding distracted driving is simple, according to new research from MIT: just keep your eyes on the road and look where you're going. That might sound horribly obvious, given that "look where you're going" is one of the earliest lessons we learn as we become mobile. But this new study reinforcing that lesson was focused on a slightly more complicated question: is the problem with distracted driving one of trying to concentrate on two separate tasks at the same time, or is it a matter of where your eyes are pointing?

When I learned to drive in the early 1990s, distracted driving wasn't really on anyone's mind. But then cellphones became ubiquitous, smartphones followed, and texting drivers became another hazard we have to watch out for on the roads.

It's not like the auto and tech industries aren't aware of the problem. Just about every new car sold today provides a way for a driver to connect a phone for hands-free calling, and Apple CarPlay, Android Auto, and MirrorLink all exist to cast certain apps from a smartphone to a car's infotainment screen. On top of that, new cars are increasingly packed with advanced driver assistance systems (ADAS) that alert the driver to potential collisions or to the car drifting out of its lane. Unfortunately, none of that seems to be making much difference: people still use their cellphones when they drive, even if they know it's bad.

Is it where our eyes look, or what our brains are thinking about?

To test whether the problem is what we're thinking about or where we're looking, the authors of the study, led by MIT postdoc Benjamin Wolfe, designed the following experiment. Volunteers viewed video clips shot from the point of view of a car driving around Boston, shown on a screen that simulated the view they would have had if they were actually driving that car on those roads.

For the first set of tests, each participant was asked to look at different parts of the view (straight ahead, 30° off to either side, or 20° below center) as short clips of driving were played. While keeping their eyes on the required spot on the screen, they were told to indicate, by pressing the space bar on a keyboard, whether the brake lights of a vehicle in the lane ahead were illuminated.

In a second set of tests, the participants were again told to indicate if they saw brake lights in the lane ahead, but there was an added complication. This time, they were told to keep their eyes on a green cross superimposed over different parts of the screen as the driving clips played. In some tests, they were to indicate if one of the arms of the cross turned white by pressing the corresponding arrow key. In other tests, they were to keep track of whether an arm of the cross turned white, but only report it after the next time it happened; the different instructions allowed Wolfe and his colleagues to test both immediate and delayed responses, imposing different levels of cognitive load.

If the main issue with distracted driving is cognitive workload, we'd expect participants to perform best on the first set of tests and worst on the tests with the highest cognitive load (the delayed response to changes in the green cross). However, that's not really what happened. Where participants were told to look was the main factor affecting both the accuracy of brake light detection and reaction times. Participants did best when they were looking at the center of the roadway and worst when they were looking below the center of the screen.

To make sure they really were onto something, the researchers conducted a third set of tests on a new group of participants. The primary task was again to detect brake lights in the lane ahead while looking at specific regions of the screen. But this time there was a different secondary task: identifying whether a visual pattern was rotating clockwise or counterclockwise, either immediately or after the next change of rotation.
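To make the two-factor design concrete, here is a minimal sketch of how trial data from an experiment like this could be tabulated. The paper doesn't come with code, so the field names, values, and summary function below are illustrative assumptions on my part, not the authors' actual analysis pipeline.

    # Illustrative only: hypothetical trial records for a gaze-position /
    # cognitive-load experiment like the one described above.
    from statistics import mean

    trials = [
        # gaze: where the participant was told to look
        # load: the secondary task ("none", "immediate", or "delayed")
        {"gaze": "center",      "load": "none",    "hit": True,  "rt_ms": 520},
        {"gaze": "20deg_below", "load": "none",    "hit": True,  "rt_ms": 960},
        {"gaze": "center",      "load": "delayed", "hit": True,  "rt_ms": 555},
        {"gaze": "20deg_below", "load": "delayed", "hit": False, "rt_ms": None},
    ]

    def summarize(trials, factor):
        """Group trials by one factor; report hit rate and mean RT of hits."""
        groups = {}
        for t in trials:
            groups.setdefault(t[factor], []).append(t)
        for level, ts in groups.items():
            hits = [t for t in ts if t["hit"]]
            hit_rate = len(hits) / len(ts)
            rt = mean(t["rt_ms"] for t in hits) if hits else float("nan")
            print(f"{factor}={level}: hit rate {hit_rate:.0%}, mean RT {rt:.0f} ms")

    summarize(trials, "gaze")  # effect of where the eyes point
    summarize(trials, "load")  # effect of cognitive load

Comparing the two summaries is, in spirit, the comparison the study makes: if cognitive load were the culprit, the "load" factor would dominate; if gaze position were, the "gaze" factor would.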

We're slow to detect things in our peripheral vision

Yet again, the main factor in both brake light detection and response time was where the participants were told to look. The best performance occurred when they were instructed to look at the center of the screen, and the worst when they were looking 20° below it. Meanwhile, the effects of increasing cognitive load on reaction time were modest: looking at a region of the screen other than the center increased reaction times by 458 milliseconds on average, while higher cognitive loads contributed only a 35 ms increase.

"We're not saying that the details of whatever you're doing on your phone can't also be an issue. But in distinguishing between the task and the act of switching gears itself, we've shown that taking your eyes off the road is actually the bigger problem," Wolfe says. "If you're looking down at your phone in the car, you may be aware that there are other cars around. But you most likely won't be able to distinguish between things like whether a car is in your lane or the one next to you."

It's definitely an intriguing set of findings, and "keep your eyes on the road ahead" remains good advice for anyone getting behind the wheel. But people often ignore good advice, which means we'll inevitably be offered technological solutions instead. This research is certainly a strong argument in favor of heads-up displays in cars, although perhaps those need to be kept relatively simple, too; there's evidence that commingled visual attention also increases distraction. When it comes to the placement of infotainment screens, think more Mazda 3, less Model 3: put the display as close to the driver's sightline as possible, even if that means no more touchscreen interfaces. In fact, perhaps we should avoid looking at infotainment screens entirely by using better voice commands in cars, an industry trend that's already picking up steam. Either way, these results are yet more evidence that effective driver monitoring systems should include gaze tracking or facial recognition; a torque sensor on a steering column is simply not good enough.
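For a rough sense of what those reaction-time penalties mean at speed, here is the back-of-the-envelope arithmetic. The 458 ms and 35 ms figures come from the study; the 65 mph cruising speed is my assumption, not a number from the paper.

    # Back-of-the-envelope: extra distance traveled during the added
    # reaction time reported in the study. The 65 mph speed is assumed.
    MPH_TO_MPS = 0.44704           # miles per hour -> meters per second
    speed_mps = 65 * MPH_TO_MPS    # about 29 m/s

    for label, extra_ms in [("eyes off center", 458), ("added cognitive load", 35)]:
        extra_m = speed_mps * extra_ms / 1000
        print(f"{label}: +{extra_ms} ms -> about {extra_m:.1f} m of extra travel")

    # eyes off center: +458 ms -> about 13.3 m of extra travel
    # added cognitive load: +35 ms -> about 1.0 m of extra travel

In other words, under these assumptions, looking away from the road costs you roughly three car lengths of blind travel, while the extra mental load costs about one meter.

Attention, Perception, & Psychophysics, 2019, DOI: 10.3758/s13414-019-01795-4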