A New York Times article, published on Tuesday, indicated that driverless cars may be threatened by humans more than the other way around. During one test, the Google car slowed and its test driver gently applied the brakes as it approached a sidewalk to avoid harming a pedestrian. Though the pedestrian was unharmed, the driverless car was rear-ended by a human-driven car. In an analysis of the accident, Google researchers concluded that it would not have occurred had the driverless car not been assisted by a “safety driver.”
NAVIGATING A HUMAN, IMPERFECT WORLD
If true, what does this mean? Probably just that, as Mark Lelinwalla at Tech Times put it, “[Google’s] vehicles are too safe and perhaps play by the rules too much.”
But Google’s cars are still far from being safe enough to move without assistance. As Jim Kerstetter commented on the New York Times’ Bits blog, “For now, autonomous cars — or their programmers — will have to learn to deal with angry bike messengers, people who drive too close, distracted drivers and the other obstacles of an imperfect and very human world.”
A POSSIBLE SOLUTION
Experts say that driverless cars are currently too passive to safely share the roads with humans, who often drive aggressively.
Speaking to the Times, autonomous vehicle expert Donald Norman of the University of California, San Diego’s Design Lab, explained the dilemma. “The real problem is that the car is too safe,” said Norman. “They have to learn to be aggressive in the right amount, and the right amount depends on the culture.”
In other words, autonomous cars would have to learn to imitate the driving style of the human drivers around them in any given environment.
DRIVERLESS CAR COMPLEXITIES
Engineering driverless cars to safely share the roads with “regular” ones operated by humans has proven to be a complicated task. Three years ago, a study found that cars equipped with a system that detects other cars’ lane changes and warns the driver actually crashed slightly more often than cars without it.
It’s fitting, then, that recent Google tests of driverless car safety focused on “smoothing out” the interactions between people and the car’s programming, according to the Times.