Law and order exist for the purpose of establishing justice and when they fail in this purpose they become the dangerously structured dams that block the flow of social progress.

-Martin Luther King, Jr.

I believe that law societies and bar associations are failing in their responsibility to meaningfully address a coming wave of AI technologies.  Law exists to serve the public.  When the law is used exclusively to further the interests of something or someone other than the public, an injustice is done.  Public perceptions that lawyers use the law to benefit themselves lend some credence to the possibility that the law is already being used this way.

The truth is, however, that multiple stakeholders have interests in how law is practiced.  Too much influence in the hands of a single stakeholder shifts the balance away from a fair and equitable provision of services.  If lawyers have too much influence over legal services, the public is harmed.  The same is true if law societies over-regulate or bar associations lobby for legislative changes that run counter to wider public interests.  I also believe that the public risks upsetting that balance if it pushes for legal technologies in a way that seriously undermines the interests of lawyers.

Legal AI technologies have the ability to dramatically shift whatever balance exists today.  The direction and scale of that shift are still uncertain, but I believe legal technology has the potential to push this power too far in one direction.  The shift could go either way: too much regulation means excluding meaningful technologies, while too little risks making people obsolete.  Without some meaningful lobbying or regulation over such technologies, lawyers may be unable to compete in a rapidly evolving legal marketplace.  Technology will win and the profession will lose.

Whether the legal profession can adapt to such changes, or whether those changes are even a bad thing, is impossible to foresee.  Legal futurists such as Darin Thompson argue that automation is only really suited for “high volume, repeatable work”.  Such an assumption underestimates the speed at which AI technologies have advanced in the last decade (autonomous cars replacing bus drivers, data mining replacing student research, the automation of middle-class skilled administrative positions, and automated document discovery were all pipe dreams a decade ago).  I believe that the public interest is served by having a healthy and well-staffed legal profession.  The family doctor crisis in Kamloops and other rural communities provides an example of what happens when the public is unable to draw from pools of needed experts.  Gaps in expert pools already exist today in some legal areas; poverty and family law come to mind as prime examples.

Legal apps could fill these gaps: whatever forces draw lawyers into certain fields are, much like the shortage of doctors, not drawing enough of them into these areas to serve the public in a cost-effective manner.  If, however, these apps move into areas that are already served by lawyers, the public risks pushing people out of the profession and exacerbating existing gaps.  I stress that this is a risk, not a guarantee.  There is also a real possibility that automation will bring justice to people in ways the profession manifestly fails to do effectively today.  What needs to continue is the discussion about the role of automation in legal services.  Lawyers can no longer afford to ignore this debate, and we as professionals need to be honest about, and not blind to, our role in the evolving legal marketplace.