The self-driving car, that cutting-edge creation that’s supposed to lead to a world without accidents, is achieving the exact opposite right now: The vehicles have racked up a crash rate double that of those with human drivers.
They obey the law all the time, as in, without exception. This may sound like the right way to program a robot to drive a car, but good luck trying to merge onto a chaotic, jam-packed highway with traffic flying along well above the speed limit. It tends not to work out well. As the accidents have piled up – all minor scrape-ups for now – the arguments among programmers at places like Google Inc. and Carnegie Mellon University are heating up: Should they teach the cars how to commit infractions from time to time to stay out of trouble?
I expect that self-driving cars will eventually become the norm, but until then, human behavior on the road remains unpredictable and subject to moods. Until every vehicle on the road is self-driving, we will continue to see accidents.
Another factor is humans themselves and how they behave as pedestrians and cyclists. Depending on where traffic and congestion occur, cars may be rerouted to avoid those areas.