Monday, July 22, 2013

How to police self-driving cars

As you know from a previous post, I am a huge advocate of self-driving, or autonomous, cars.  In short, they would be dramatically safer than human-driven cars, and would free up time for their human passengers to do other activities.  After the question of safety, which I am not going to debate here, the next question is always about the law: who is accountable should a self-driving car have an accident?  I think the difficulty people have in answering this question is that the concept is so foreign to us that we can't wrap our brains around it.  Thus, I was inspired to write this post to simplify our thinking and perhaps clear the way for future discussions on this topic.  I'm going to make the case that car manufacturers, or more specifically, the companies producing and installing the software in autonomous vehicles, should be held responsible for any accidents caused by vehicles running that software.  

Before you get all defensive on me, hear me out.  If a brand-new car, driven off the lot, spontaneously explodes and injures the driver and a person in another car, the car manufacturer is at fault.  I think we can agree on that much.  More broadly, a malfunction of an unmodified vehicle that causes harm to someone (or, really, has the potential to harm someone) is a product defect, and thus the responsibility falls on the car manufacturer.  Of course, a car owner could void this protection by tinkering with the vehicle, in which case the owner would be responsible for creating a defect in an otherwise certified-safe car.  If another company did the tinkering that led to the harm, the vehicle owner would hold that company accountable.  All of this is in practice today, and has been for many years.  

Now, let's apply that to autonomous vehicle features, without going fully self-driving.  New (high-end) cars have some of these features already.  Yes, you heard me right: mass-produced cars are already doing some of the thinking for us.  Some drivers are already entrusting their lives to a car that makes its own decisions, and the rest of us are driving on the same roads as those cars.  The primary example I will focus on is the Volvo that stops itself before impact with a large object.  My brother-in-law was so excited when he test drove the car, bragging about how the salesman encouraged him to drive it directly at a brick wall.  Against all intuition, he approached the brick wall at just the speed needed to trigger the car to automatically apply the brakes.  Amazing!  Does this excite you?  Or scare you?  And more importantly, what happens if the car's software has a glitch?  One could imagine a glitch causing the car to randomly apply the brakes on the freeway, and the car behind it plowing into it.  Who is at fault for that accident?  Sure, the trailing driver might get a ticket for following too closely to stop in time, but I would argue that braking hard in the middle of the freeway for no reason could be questioned; and it's certainly not the Volvo owner's fault, right?  The test drive showed that no matter what the driver is doing, those brakes are going to be applied if that's what the car has decided to do.  Thus, responsibility must be taken out of the driver's hands.  

Now let's take a different angle.  Let's suppose that the Volvo driver is suddenly faced with stopped traffic ahead and an out-of-control runaway semi-truck barreling down a hill at him.  The driver, seeing this, attempts to swerve around the stopped traffic, perhaps even aiming to hit the guard rail and plow through it, rather than be squashed between the semi and the car in front of him.  Certainly, if the driver can rationally see the guard rail as a collision preferable to being squashed, he should be able to maneuver the car to do so, and potentially save his life.  If the Volvo's autonomous braking feature then prevents the driver from driving into the guard rail, and he is squashed instead, would there not be a public outcry and lawsuits against Volvo?  

Parking is another area where we're seeing early autonomy.  Some features are just park-assist aids, like guide lines on the backup camera.  But some cars are actually controlling and manipulating the steering wheel.  If this doesn't scare the nay-sayers, I don't know what will.  Yet parking is one of the most accident-prone parts of driving, because we are dealing with tighter tolerances.  Those tighter tolerances are also a good reason to have the computer take over, since computers are more precise and accurate than people are.  Regardless of the rationale, an auto-parking car will eventually do some damage, and I am sure the car manufacturer will be found at fault for an error that a human driver could have prevented.  

I'd be willing to bet that, long before our cars drive themselves completely, we'll see at least one such accident go to court and be decided in the driver's favor.  That court case is going to be crucial in setting the precedent for future autonomous car accidents.  Car companies installing autonomous features have to be ready for this. 

None of this should deter car manufacturers from adding these features to their cars and moving forward with vehicle autonomy.  Smart features that enhance a vehicle's safety are primarily going to be selling points.  There will be late adopters who resist, as is true of all technologies.  But the majority of the population will gradually become comfortable with these features, and they will become selling points, or even requirements, in future purchasing decisions.  Instead of shying away from the technology out of messy legal fears, manufacturers need to embrace the responsibility along with the added value the features can deliver.  That responsibility will serve as motivation to ensure the utmost safety: a glitch-proof, do-no-harm mentality needs to be instilled, while the return on investment remains positive.

Still, there will be accidents.  Some will be tragedies that would not have happened had a human driver been in control.  And the nay-sayers will point to them as the beginning of the end.  But my bet is that we will see a vast improvement in safety overall: fewer fatal accidents, fewer pedestrian accidents, fewer accidents causing injuries.  Fewer intentional "accidents", too, caused by road rage and by suicidal or homicidal incidents that go unrecorded for lack of evidence.  Across the board, our roads will get safer as more people drive cars with more autonomous features.  Autonomous cars are better for society as a whole, and will greatly reduce the inherent risk of getting into a vehicle; let's never forget how fatal driving can be today.  The most important thing, when there is an accident caused by an autonomous feature, is to collect all the data we can about that accident and prevent it from happening again.  Pioneers of all sorts are often made into sacrifices and martyrs, and I wish as much as anyone else that this didn't have to be the case with early adopters of the self-driving car, but I know it is bound to happen.  It will happen, we will settle it in court, and we will move on.  
