Gazette IT columnist Joanna Goodman considers lawyers’ use of ‘the cloud’ and their ability to harness ‘big data’.
Last year, the Cloud Law European Summit at the Law Society demonstrated that, while cloud computing has become an established part of business and legal IT architecture, and cloud offerings are becoming more sophisticated and user-friendly, regulators, vendors and businesses are still grappling with the same issues they faced a few years ago.
As Dave Ewart of Workshare observed, cloud computing has democratised IT, giving all businesses access to enterprise-grade software on a pay-as-you-go basis, with the option of moving IT investment from capital expenditure (capex) to operational expenditure (opex). Cloud supports business agility as it is flexible and scalable. However, keynote speaker Professor Ian Walden was among those who drew attention to the regulatory minefield around cloud computing and the data-related issues it raises.
These divide into two areas: the relationship (contractual and operational) between cloud providers and their customers, and the regulatory landscape, which includes data protection and privacy issues. Cloud purchasers still need to know where their data is and who has the right to see it, and negotiate service level agreements (SLAs) that allow them to get their data back in a usable form should they wish to migrate it back in-house or (as is more likely) to a different cloud provider.
Professor Walden cited a striking statistic: 75% of CIOs have no idea how much cloud is being used in their companies. This illustrates that, among other things, the cloud is blurring the boundaries of the enterprise and creating security issues. Other presentations covered data security and privacy, the complex regulatory landscape, including the US Patriot Act, and proposed EU and UK regulations.
A recurring theme was the need to establish minimum acceptable requirements, given that information standards as well as data protection and privacy requirements vary significantly by jurisdiction. The fact that there are currently seven EC working groups and 150 draft standards for cloud (including some duplicates) makes the position even more confusing.
In terms of data privacy, ‘people need to know exactly what they are giving up’, said Neil Brown of Vodafone. Bojana Bellamy from Hunton & Williams made the telling observations that organisational accountability needs to go beyond legal compliance and – on data security – that not all encrypted data is personal and not all personal data is encrypted.
Frank Jennings of DMH Stallard LLP (now a partner at Wallace LLP), who chairs the Cloud Industry Forum’s code of practice board, agreed that the key issues have not changed and underlined the importance of recognising that migrating data to the cloud goes beyond having a comprehensive SLA. ‘If there is an outage, you need to get back up and running as quickly as possible. It’s no use waving a contract around: you need a back-up plan,’ he said, highlighting the need for open data standards to ensure data can be retrieved in a usable format.
Jennings offered some interesting considerations specifically for law firms. Law firm CIOs are not afraid of cloud and lawyers are pleased to have mobile and remote access to their firm’s IT resources. The key issues are around managing clients’ data and keeping it confidential. Jennings discussed the risks of public cloud and suggested that hybrid cloud is probably a better practical solution, citing ISO/IEC 27018, the first (voluntary) international standard governing the processing of personal information by cloud service providers.
Legal privilege means that communications between solicitors and clients must remain confidential, so perhaps the rules are slightly harsher on lawyers, he added.
Big data and psychohistory
Big data analytics involves applying algorithms to large volumes of anonymised, unstructured data from different sources and platforms to identify correlations, patterns and trends that can be used to forecast future behaviours, or outcomes relating to particular products and services.
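In miniature, that process amounts to aggregating raw records and extrapolating the patterns found in them. The sketch below is illustrative only: the data and function names are invented, and a simple least-squares trend line stands in for the far more elaborate algorithms real analytics platforms apply.

```python
# Illustrative sketch of trend-finding and forecasting (all numbers invented).
def linear_forecast(series, steps=1):
    """Fit a least-squares trend line to a series and extrapolate it."""
    n = len(series)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(series) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, series))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    intercept = mean_y - slope * mean_x
    # Project the fitted line forward by the requested number of steps.
    return [intercept + slope * (n - 1 + s) for s in range(1, steps + 1)]

# Hypothetical monthly counts aggregated from anonymised event records:
monthly_counts = [120, 135, 151, 166, 180, 196]
print(linear_forecast(monthly_counts, steps=2))  # next two months' forecast
```

The point of the toy is the shape of the exercise, not the model: correlations observed in historical aggregates are projected forward as predictions.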
Last year, the Gazette interviewed David Howarth (20 October 2014), a legal academic who co-leads the government’s ‘Big Data for Law’ project, the first public-private big data research project. Its remit is to apply big data analytics to vast amounts of current legislation. As well as making legislation easier to track and interpret, this will ultimately change the way it is created.
It follows the thinking of Howarth’s book, Law as Engineering: Thinking about what lawyers do, which argues that law is like engineering because it involves creating things that change the social world: regulations, statutes, constitutions and treaties.
Technology advances have produced the ability to predict consumer behaviour using big data analytics. But the concept has been around for a long time. Isaac Asimov’s 1951 novel Foundation explores the fictional science of psychohistory – the mathematical modelling of past collective behaviour to predict the future – in effect, 1950s terminology for what we now call big data.
However, Moritz Hardt’s article on Medium on how big data is unfair is particularly resonant when we consider big data as a means of making and applying laws.
Hardt was motivated by Gillian Tett’s Financial Times article ‘Mapping crime – or stirring hate’ which argues that the benefits of Chicago’s predictive policing initiative outweigh its potential harm in terms of racial bias because it is based on multi-variable equations, which are inherently unbiased. Hardt’s response is that machine learning is based on identifying statistical patterns in historical data which incorporate historical racial and cultural biases.
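Hardt’s objection can be made concrete with a deliberately simplified sketch (all names and numbers invented). Suppose past patrols concentrated on district A, so more incidents were recorded there even if underlying rates were equal; a model that learns from those records sends patrols straight back to A.

```python
# Toy illustration of bias in historical training data (invented figures).
# Recorded incidents reflect where police looked, not underlying crime rates.
recorded_incidents = {"A": 90, "B": 10}

def allocate_patrols(history, total=100):
    """Allocate patrols in proportion to historically recorded incidents."""
    n = sum(history.values())
    return {district: total * count / n for district, count in history.items()}

print(allocate_patrols(recorded_incidents))
# The 'learned' allocation sends 90% of patrols back to district A,
# generating still more recorded incidents there: a feedback loop.
```

The equations themselves are neutral, as Tett argues, but the statistical patterns they fit carry whatever bias shaped the historical record, which is exactly Hardt’s point.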
On a more positive note, he acknowledges that ‘people are catching on to the fact that fairness is a pretty big problem with data-driven decision-making’.
The perceived unfairness of big data analytics supports Asimov’s theory that although mathematical forecasting can predict collective behaviours, machine learning and multi-variable equations cannot identify or predict the reactions of individuals – and particular individuals can have a potentially disproportionate influence on decision making and outcomes. Howarth’s book cites Linklaters’ opinion to Lehman Brothers as a catalyst of the 2008 financial crash.
This is worth considering when applying data analytics to the creation and application of legislation. A potential solution would be to drill down into the detail of the data, rather than simply applying big data principles. Lex Machina includes specific judges and districts in the key variables it applies to its predictive analytics for IP litigation.
Although big data can certainly be applied to many aspects of the legal process, Howarth acknowledges that law is a people business: the legal system is shaped by and shapes society.
To turn Hardt’s point around, analysing historical data with multi-variable equations could help us avoid repeating past mistakes by identifying recurring patterns and changing behaviours or decisions before it is too late. Howarth refers to this when he writes that lawyers need to take responsibility for the consequences of the opinions they give their clients – that in engineering terms, lawyers are in danger of confusing product reliability with product safety.
Perhaps big data can help lawyers and others avoid the next financial crash before it happens.