A shift from manpower to machine. From hand tools to power tools. Scaling up production and efficiency – new machines built to automate the human effort required for a manual process.
You may think I am referring to the present, given the technological advancements of today, but in fact I am describing the Industrial Revolution of the 18th and 19th centuries and the invention of machines like the “Spinning Jenny” – designed to enable a single person to produce multiple spools of thread simultaneously.
Today, we use electricity to light our rooms instead of candles. We use smartphones to communicate with our families instead of writing letters. We use the internet to find answers, the right answers, instead of researching in libraries. Technology is at the center of everything we do.
And so it is no surprise that we find ourselves in a technological industrialization of the skilled professions. The “machines” are now “algorithms”, and such algorithms can create customized adverts, suggest medical diagnoses, and even help solve crimes – to name but a few applications. Algorithms are taking over the world, and no profession is immune.
And yet there has always been pushback against technology, and against innovation in general. Industrialization has never been popular on the ground: machines replace the working man.
Imagine, then, the resistance we would see if we applied algorithms to the legal profession. A recent judgment in the English courts, Pyrrho Investments Ltd & Anor v MWB Property Ltd & Ors, approved the use of just such an algorithm in the disclosure process. Although technology has assisted with disclosure for many years, this ground-breaking judgment approved, for the first time, the use of an algorithm in a tool called predictive coding.
During disclosure, the end goal is to find the right documents – those relevant to the court case in question – and produce them to the opposing side. Since the dawn of electronic data, the haystack containing the right documents has continued to grow, yet the number of needles remains the same. Predictive coding allows a lawyer to show the algorithm examples of what the lawyer considers to be the right documents; the algorithm then begins hunting through the haystack. It cuts down the manual process of lawyers sifting through each strand of hay, lowers costs, and speeds up the entire process. This is beneficial to everyone in some way, but predominantly to the person paying for the disclosure exercise!
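At its core, this is a supervised text-classification problem: the lawyer labels a seed set of documents as relevant or not, and the system scores the rest by similarity to those examples. The sketch below is a deliberately crude word-frequency (log-odds) scorer – an illustration of the learn-from-examples loop, not the algorithm used by any real e-discovery product; all document texts here are invented.

```python
import math
from collections import Counter

def tokenize(text):
    """Lower-case the text and strip basic punctuation."""
    return [w.strip(".,!?;:").lower() for w in text.split()]

def train(labeled):
    """Count word frequencies in the lawyer's seed set.

    labeled: list of (text, is_relevant) pairs reviewed by a human.
    """
    relevant, irrelevant = Counter(), Counter()
    for text, is_relevant in labeled:
        (relevant if is_relevant else irrelevant).update(tokenize(text))
    return relevant, irrelevant

def score(text, relevant, irrelevant):
    """Crude log-odds relevance score: positive values lean 'relevant'.

    Laplace smoothing (+1) stops unseen words from dominating.
    """
    return sum(
        math.log((relevant[w] + 1) / (irrelevant[w] + 1))
        for w in tokenize(text)
    )

# The lawyer reviews a handful of documents by hand...
seed = [
    ("lease agreement for the disputed property", True),
    ("office party invitation and menu", False),
]
rel, irr = train(seed)

# ...and the algorithm ranks the rest of the haystack for human review.
unreviewed = ["draft lease agreement", "party photos", "property schedule"]
ranked = sorted(unreviewed, key=lambda d: score(d, rel, irr), reverse=True)
```

Production tools iterate this loop – the lawyer corrects the algorithm's top suggestions, the model retrains, and the ranking sharpens with each round.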
Before the judgment, as with any industrialization, resistance was plentiful. Cynics argued that an algorithm cannot think like a human and therefore will not perform as well, and that the law requires human judgment which cannot be replicated by an algorithm. In addition, resisters questioned how they would know whether the algorithm had understood them correctly. These are practical concerns, which dissolve once the algorithm is understood and a process is put in place.
If we examine the other algorithms in our lives, we can see that humans remain the drivers. Netflix only knows that I like “dramas with a strong female lead” because I have provided the data by watching similar shows. Facebook only knows that I want a pair of gold shoes for my sister’s wedding because I have been googling them and sifting through websites. To be successful, algorithms need input from humans in order to learn. Predictive coding was never intended to replace the lawyer; it was intended to empower the lawyer by illuminating the dark data.
Further, commitment to “getting it right” should be key. An algorithm does not think like a human: it does not have a human’s intuition, and it does not seek to understand the grey areas or worry about the hard decisions. It is not supposed to. Like Netflix’s, the algorithm is designed to make suggestions for a human to review. An algorithm does not need to sleep. An algorithm does not need to eat. An algorithm does not suffer attacks of self-doubt. An algorithm does not miss its family or feel pressured by deadlines. The algorithm is friend, not foe. And, with a human driving, it will cut through the haystack far quicker, more accurately, and at a much lower cost.
Like any legal process, predictive coding will need appropriate checks and balances. The algorithm will need a defined methodology to confirm that it has understood the examples and remains under control. But that kind of methodology and quality control is no different whether a human or an algorithm is looking at the documents. The presumption that a human has understood the documents better than technology no longer holds.
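One common form of such a check, sketched here under simplifying assumptions, is to have a human review a random sample of the pile the algorithm discarded and estimate how many relevant documents slipped through (e-discovery practice calls this an elusion sample). The function and data below are hypothetical illustrations, not any court-mandated protocol:

```python
import random

def elusion_estimate(discard_pile, sample_size, human_review, seed=0):
    """Estimate the rate of relevant documents the algorithm wrongly discarded.

    discard_pile: documents the algorithm marked as not relevant.
    human_review: callable returning True if a human deems the document relevant.
    """
    rng = random.Random(seed)  # fixed seed so the check is reproducible
    sample = rng.sample(discard_pile, min(sample_size, len(discard_pile)))
    missed = sum(1 for doc in sample if human_review(doc))
    return missed / len(sample)

# Hypothetical discard pile: 100 documents, of which 10 are in fact relevant.
pile = [("doc%d" % i, i < 10) for i in range(100)]
rate = elusion_estimate(pile, sample_size=50, human_review=lambda doc: doc[1])
```

A low elusion rate gives the parties a defensible, measurable basis for saying the algorithm is under control – exactly the kind of methodology a court can scrutinize.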
We are on a journey of technological industrialization. We are past the point of no return, and it is clearly time for the legal profession to embrace progress – even the courts say so! As we witnessed with Netflix, trust in the algorithm’s suggestions will develop until we come to depend on its recommendations. Opinions aside, basic economics will drive the adoption of this algorithm, just as economics thrust the Spinning Jenny into use. It is time to embrace the industrialization of litigation.