The government has ambitious plans for autonomous transport but this rapidly evolving technology threatens to drive a coach and horses through our legal framework. Rachel Rothwell reports

THE LOW DOWN

As technology companies and car manufacturers join a global race to get fully automated vehicles on our roads, the UK is ahead of the game in signing off legislation to deal with the inevitable and complex liability issues thrown up by driverless technology. But lawyers acting for injury victims are frustrated that the government’s flagship Automated and Electric Vehicles Act will only govern the futuristic landscape of highly advanced automation. In the meantime, victims injured by the semi-autonomous driving technology that is already out there, or coming soon – and which still needs input from a human driver, making it more likely to lead to accidents – will face a struggle to bring product liability claims against powerful vehicle manufacturers, using existing laws that are simply not fit for purpose. 

Being at the forefront of automated vehicle (AV) technology is a big priority for the UK government. 

Last November, ministers swept back the curtain on a grand plan for autonomous public transport. By 2021, full-sized ‘self-driving’ single-decker buses will run a 14-mile route between Fife and Edinburgh, while self-driving taxis will be running in parts of London, following two separate trials in select London boroughs.

The schemes are being bankrolled by £25m of state funding. UK companies such as Oxford University spin-off Oxbotica and Jaguar Land Rover are battling it out against US rivals including Google, Uber and Tesla to be the first to develop fully autonomous driving technology. 

Last July, despite the Brexit vortex, parliament found time to pass the Automated and Electric Vehicles (AEV) Act, setting out a legal framework for a driverless future – beating the US in getting legislation on to the books. 

But the road towards the development of automated vehicle technology has not been without its hairpin bends and potholes. Last March, a pedestrian pushing a bicycle was killed in Arizona (a state that permits the testing of self-drive technology) by an Uber autonomous car in self-drive mode, with a human monitor sitting inside. In the same month, a Tesla car in its so-called ‘autopilot’ mode (which still requires the full attention of the driver, who should have their hands on the wheel) slammed into a crash barrier in California, killing the driver. Data records showed he had his hands off the wheel for six seconds. Meanwhile in 2016, the sensors on another Tesla in autopilot mode failed to detect an 18-wheel truck and trailer crossing the highway, and the Tesla driver was killed in the resulting collision.

In the UK, a Tesla driver was arrested last April after astonishingly moving over to the passenger seat while his car was in autopilot mode on the M1. He was uninjured, but was banned from driving.

Level pegging

Fully autonomous vehicles have the potential to transform our lives in many ways; not just the way we travel, but also the urban landscape, the way goods are manufactured and transported, and even the level of independence we can maintain into old age. Ultimately, they should revolutionise road safety, with 90% of road traffic accidents currently down to human error (according to the European Commission). 

But a car that can truly drive itself for all or part of a route, with no need for human intervention, is possibly at least a decade away. In the meantime, vehicles will become increasingly autonomous, but they will still need a human being ready at the wheel.

There will always be concerns over whether an algorithm can be programmed to act in an ethical way, and what standard should be applied to the driving of driverless cars 

Lucie Clinch, Stewarts Law

The Society of Automotive Engineers has set out six widely accepted ‘levels’ of driving automation, from zero (no automation) to five, against which technology can be judged. At levels one and two, the AV tech may be providing assistance, such as brake and acceleration support or lane centring, but the human inside is still considered to be driving the vehicle. This technology already exists on our roads.

Level three is more of a grey area, and there is some debate as to whether we have already reached it. The car is considered to be driving itself, but it needs a human driver who can take control when required. Levels four and five will be a big step forward, as no fallback human will be needed: a level four vehicle can drive itself in limited conditions, while a level five car can drive itself anywhere, at any time.

According to some experts, the big problem with the AEV act passed last summer is that it only applies to levels four and above. Nicholas Bevan, a solicitor and expert in motor liability and insurance, explains: ‘Section 1’s wording is capable of embracing levels three to five. But during its passage as a bill, transport minister Baroness Sugg indicated that it would not cover level three vehicles. Level three requires the presence of a fallback driver to be receptive to a system’s alert or some other need to resume control.’ 

For personal injury lawyers, this is a major disappointment. The AEV act is hugely helpful to accident victims because it imposes strict liability on the AV’s insurer to compensate the victim for injuries caused by a vehicle in self-drive mode. If the accident was caused by a failure of the self-drive technology, it is then open to the insurer to sue the manufacturer. But this will apply only when we enter the futuristic world of levels four and above.

In the meantime, AV injury victims will be left struggling to sue manufacturers themselves, using existing product liability laws which are not fit for purpose. And the existing and near-future technology of levels two and three, which requires human drivers to take an active role, is potentially very dangerous.


WHEN MOBILITY, NOT FLASH WHEELS, COUNTS

According to White & Case partner Christian Theissen, three areas where automated vehicles will have a big impact in the near future are car sharing, electric vehicles and autonomous trucks.

He says: ‘The focus of many car manufacturers is slightly shifting away from offering cars to individual car owners, to becoming “mobility providers”, for example through taxi joint ventures, ride-hailing apps and so forth. This is largely triggered by the decreased interest of younger generations in owning cars… where enough alternatives are available, for example owned/shared bikes, public transport and car-sharing pools.’ 

Another trigger is the issue of urban traffic, he says. With growing urban populations, a lack of parking spaces, and pressure to lower harmful emissions, electric vehicles are perfect for short distances if the right charging infrastructure is put in place. So they are bound to become much more popular, which is a natural fit with automation. 

‘At the same time, the margin for car-sharing services is significantly higher if the cars are autonomous: the cars can be constantly floating and thereby generating income, instead of being parked somewhere for hours.’ Theissen also expects to see another shift in investment focus, towards automation of the supply chain from manufacturer through to retailer and end-user. Indeed, just last month Daimler Trucks revealed plans to invest €500m in a global push to bring highly automated ‘level 4’ trucks to the road ‘within a decade’.  

Bevan remarks: ‘An ordinary driver is guided by the Highway Code for braking distances. But these are wholly inappropriate for someone who is supervising an automated vehicle. A supervisor’s first reaction will not be to the hazard when first perceived, as with a normal driver, but to the subsequent realisation that the automation is not reacting as it should. There may also be a further episode of double guessing or of disbelief. And the more reliable the automation has proven to be, the more you are inclined to do a double take before acting.’ 

Indeed, studies in the aviation sector have shown that the more automation is used, the more pilots start to over-rely on it, to the detriment of their piloting skills. 

Then there are the liability issues. Lucie Clinch, senior associate at Stewarts Law, says: ‘The semi-autonomous position, outside the act, will create complex liability disputes. What if the vehicle chose to change lanes at the wrong time, and the person “monitoring” did not prevent it from doing so – is the driver or the AV to blame? What if the AV brakes unpredictably, causing a rear-end shunt and injuring the driver? Would the driver and his insurers deny liability? Might they suggest the blame lies with whoever performed the last service, failing to calibrate sensors or update software?

‘The victim may be required to consider a highly technical Consumer Protection Act claim for product defect, in tandem with a negligence claim including the servicer – incurring costs and the risk of pursuing three or more defendants. These are serious considerations for injured parties, particularly in the period where AVs are on the roads, but the AEV act does not apply when the vehicle in question is not fully autonomous.’

Clinch adds: ‘If it is the driver who is making a claim for injuries, might he or she face contributory negligence arguments, for not preventing the braking or lane change themselves, or for going to an unofficial servicer?’ 

Bevan agrees that the current law is unfit to deal with these issues. He says: ‘Bringing a claim will be a considerable nightmare for a claimant. How do you prove negligence and allocate liability between the parties? How do you prove that the system was in error in any way? The sheer Herculean scale of the challenge of establishing causation and liability in the face of the unlimited resources of a manufacturer, where the claimant will have to litigate within the proportionality rule, will be an exacting and uncertain endeavour.’ 

The solicitor adds that the Consumer Protection Act is ill-equipped to deal with this type of claim, particularly as it contains a ‘state-of-the-art’ defence. That could be used by AV manufacturers to avoid liability on the basis that they could not have known that the cutting-edge AV technology would react in the way it did.

Caught in the act

As part of the government’s enthusiastic focus on AVs, the Law Commission is in the throes of a three-year review of the legal framework for AVs. Its preliminary consultation, which closes on 8 February, examines some of the more difficult liability issues thrown up by the AEV act.

‘The Law Commission has done a good job in the reconnaissance of the unknowns,’ remarks Alex Glassbrook, a barrister at Temple Garden Chambers. ‘The AEV is a foundation stone act; it doesn’t overreach. It advances the correct distance into the debate, building a base and then handing over to the Law Commission’. 

On liability, the commission identifies a number of potential difficulties, including the ‘rather complex’ way contributory negligence is dealt with in the AEV act, and the position relating to causation. 

Julian Chamberlayne, a partner at Stewarts, points out that the act refers to an accident ‘caused’ by an AV. He says: ‘So even though this is presented as a strict liability regime, insurers may try and argue that an accident that was not reasonably avoidable by the AV was not “caused” by it. The legislation could have simply said “includes an accident involving an AV”, but instead used the more blame-loaded terminology “caused”.’ 

The commission suggests that individual causation questions (for example where an AV swerves to avoid an erratic cyclist, but then hits a parked car) should be left to the courts. 

Under the AEV act, insurers will also be able to avoid paying out on a policy if the insured has installed software updates that the policy does not allow, or has failed to install ‘safety critical’ updates. But there is currently no clarity on who is responsible for alerting vehicle owners to the need for such updates – the manufacturer, the servicing garage or the insurer. Meanwhile Christian Theissen, a partner at White & Case, identifies another software-related question: what if a software update significantly changed the features of the AV that the consumer had purchased?

He says: ‘Will the original manufacturer have the right to compel a car owner to accept an update? For instance, if an update makes the software safer – for example because it makes the software act more defensively in traffic – a manufacturer would want it installed to mitigate its liability. But what if the consumer wants to keep the more “aggressive” car he initially chose to buy?’

Clinch adds: ‘There will always be concerns over whether an algorithm can be programmed to act in an ethical way, and what standard should be applied to the driving of driverless cars, especially as they are promoted as being safer drivers than the human equivalent. Will the manufacturers compete with each other on safety, and if so, will purchasers choose models that prioritise occupant safety over the safety of others?’ 

One further aspect of the AEV act particularly perplexes lawyers. Its liability provisions are limited to AVs driving on ‘roads or other public places in Great Britain’ – so not abroad, or on private land. Bevan is scathing: ‘If we join the government in its futuristic vision, with this amazing technology at our disposal, then surely we will want to make proper use of it. Disabled Aunty Matilda will expect a door-to-door delivery of a fully automated Level 5 pod to take her to bingo… You’ll want your Amazon delivery right to your door. So why restrict only this species of product liability to roads and public places?’

For all the commendable focus on how this transformative technology will be dealt with in the long term, the final plea from claimant lawyers is that the needs of people injured by AVs already on our roads, or coming soon, must not be forgotten. As Bevan puts it: ‘The government has decided to map out the legislative framework of the high ground of highly advanced automation, which has not yet been achieved. But it is leaving the pathway to these advances completely unmapped: and as we have seen from recent fatal accidents involving AV technology, a very treacherous route it is likely to be.’ 


Rachel Rothwell is a freelance journalist
