As might be expected, humans are fairly terrible at that, which is a real problem for driverless cars. But one obstacle for researchers is that driverless cars are engineered to always follow the law. As a result, according to a recent report from Bloomberg, “driverless cars are now seeing a crash rate twice as high as cars with humans at the wheel”. The report notes that the accidents are all “minor scrape-ups for now”, that they are always the human driver’s fault (human drivers usually hit the slower-moving computer-driven cars from behind), and that none of them has caused any injuries.

“Most if not all people drive at least slightly above the speed limit,” professor Raj Rajkumar, the co-director of the General Motors-Carnegie Mellon Autonomous Driving Collaborative Research Lab in Pittsburgh, told MTV News. “There’s an unspoken rule that we can drive ten to fifteen miles above the speed limit and not get a speeding ticket. So when driverless cars are driving at the speed limit on the highway, other cars are zipping by.”

“If a driverless vehicle is driving faster than the speed limit, and a cop stops it, who gets the ticket — the non-driver in the driver’s seat, or the engineer?” he asked. “The engineer obviously doesn’t want to become liable for that. So we opt to follow the letter of the law and leave it to society to decide whether we need to adapt our speed limits to account for autonomous cars following the letter of the law.”

“At the end of the day,” Rajkumar said, “the objective of the technology is just to make it as safe as possible for the passenger — much safer than human drivers — and to try to create technology that looks ahead and behind, considers speed limits, and does not crash into anybody or anything.”