Every law student knows that to succeed in a claim for negligence, the Claimant must prove someone was at fault for the incident that led to them suffering loss and damage.
In the context of a road traffic accident, this involves investigating the circumstances of the accident to establish whether the driver(s) involved did, or failed to do, something that fell below the standard expected of the reasonable driver, such that they can be said to have acted negligently and caused the accident.
In other words, liability turns on assessing human error by reference to what the reasonable person on the Clapham omnibus (or more latterly, the Boris bike) would have done.
With technology and automation becoming ever more prevalent in cars, what will the position be in years to come, when human error may play little or no part in the cause of an accident?
This blog post aims to provide a brief overview of autonomous vehicles, the issues that may arise, and the government's current response to ever-increasing automation in our cars.
WHAT IS AN AUTONOMOUS VEHICLE?
Autonomy in cars is with us already. We have cruise control, be it simple or adaptive, ABS and dynamic stability control.
Subject to developments in their operating systems, Tesla cars are apparently capable of full autonomous driving now.
What constitutes autonomous driving is already a matter of degree.
Recognising this, SAE International has already published a system that classifies the different levels of autonomy, both present and future.
That classification, J3016_201609, was last updated and revised in September 2016 and is outlined below.
| Level | Name | In short | Description |
|-------|------|----------|-------------|
| 0 | No Autonomy | You drive. | Acceleration, braking and steering are controlled at all times by a human driver. |
| 1 | Driver Assistance | Hands on the wheel. | At certain times, control of the steering wheel or the pedals can be given to the car, for example adaptive cruise control or park assist. |
| 2 | Partial Autonomy | Hands off the wheel, eyes on the road. | At certain times, control of the steering wheel and the pedals can be given to the car. The driver, however, retains ultimate control. Tesla vehicles have reached this stage already. |
| 3 | Conditional Autonomy | Hands off the wheel, eyes off the road – sometimes. | At certain times, the car can decide when to change lanes and how to respond to incidents on the road, but the human driver remains the fallback in terms of control. |
| 4 | High Autonomy | Hands off the wheel, eyes off the road, mind off the road – sometimes. | At all times, the vehicle can drive itself and can, but does not need to, ask for human assistance if unable to handle an event. |
| 5 | Complete Autonomy | Steering wheel optional. | The vehicle has full control at all times and requires no human assistance. |
No production vehicle has yet reached Level 4 or 5.
Having regard to the SAE classification, it is very easy to see how the most factually straightforward of road traffic accidents could raise some very complex questions as to how the accident was in fact caused.
Who, for example, would the Claimant bring their claim against? The 'driver' of the vehicle may in reality have been no more than a passenger, such was the extent of the control being exercised by the car. They may not have been at fault in any way, having merely relied on what those who made or sold the car told them about its autonomous functions and what they needed to do whilst in it.
Must the Claimant then contemplate a product liability claim against the manufacturer of the car?
If so, what if the manufacturer simply purchased an operating system from a specialist supplier and installed it into the car?
Would the Claimant then have to pursue that specialist supplier, or those actually responsible for writing the operating system? If so, must the Claimant consider obtaining expert evidence as to how the algorithms that control the car were not only devised but also tested?
Making a claim after being involved in an accident with an autonomous vehicle could then become far more complicated (and costly) than it is presently.
The Vehicle Technology and Aviation Bill
Currently, motor insurance is very much centred on the driver: anyone who drives a vehicle must be insured against the risks of doing so.
Recognising that, in the event of an accident involving an autonomous vehicle, the 'driver' may (as in the example above) have been no more than a passenger and as such may not have been at fault at all, the Government has published the Vehicle Technology and Aviation Bill ('VTA').
The intention of the VTA is simple: where there has been an accident with an autonomous vehicle that has led to damage and loss, any claim should be dealt with in the context of motor insurance. The Claimant should not have to contemplate seeking recourse and compensation by way of complex and involved product liability litigation against potentially a number of different parties, each of whom may seek to blame the others.
In terms of liability, the VTA envisages that where an accident is caused by an autonomous vehicle and damage is suffered as a consequence, whether by the insured or another person, the insurer is liable for that damage.
In other words, the VTA keeps it as simple as possible for those who suffer damage as a result of an autonomous vehicle and the Claimant proceeds solely against the insurer of the vehicle.
The VTA recognises how important the operating system of an autonomous vehicle is (or is likely to be): there is specific provision for insurers to exclude or limit their liability where the accident occurred as a 'direct result' of the insured either:
- Modifying the operating system in a way prohibited by the policy; or
- Failing to install an update to the operating system as required by the policy
Where accidents are caused by prohibited modification or failure to update software on the part of the insured, the insurer may seek to recover the amount they have paid from the insured.
To that end, the meaning of 'direct result' may become contentious, insofar as it requires an investigation into how either a modification, or a failure to install an update, can be said to have caused the accident.
Significantly, the VTA makes it clear that once the claim has been dealt with, the insurer is free to pursue recovery of the sum paid from the ‘person responsible’.
This is likely to become the real battleground with insurers pursuing manufacturers and/or software developers insofar as a software failure may have caused the accident.
The rise of autonomous vehicles will undoubtedly change life on the roads. It is encouraging to see legislation being brought forward to deal with the challenges and to ensure the system will be as 'user friendly' as possible for those directly affected when there has been an accident involving an autonomous vehicle.
Readers should note that this is not intended to be definitive legal advice but rather a basic guide to the issues raised by autonomous vehicles.
Gerard McDermott QC Ltd