Autonomous vehicles may not be as safe as many of us assume. The problem is that they play by the rules and humans don't. According to Forbes:
"The self-driving car is supposed to lead to a world without accidents, but it is achieving the exact opposite right now: According to Bloomberg, autonomous vehicles have racked up a crash rate double that of those with human drivers.
The glitch? They obey the law without exception. That may sound like the right way to program a robot to drive a car, but good luck trying to merge onto a chaotic, jam-packed highway with traffic flying along well above the speed limit.
As accidents have piled up -- all minor scrape-ups -- arguments among programmers at places such as Google and Carnegie Mellon University are heating up: Should they teach the cars how to commit infractions from time to time to stay out of trouble?"