Consumers to Creators: Taking Control of our Tech

Parvej Sidhu

Access to justice is often conceptualized as a gap requiring a bridge. Artificial intelligence (AI) is helping to bridge another gap: the one between the justice system and the tech world. By drawing on the lawyer’s knowledge and the software developer’s expertise, AI is helping legal professionals complete their work faster and with greater accuracy, while also helping the public address their legal needs on their own. The ground-breaking Civil Resolution Tribunal in BC is an excellent example of the latter.

I’ve been learning how to build this kind of AI in Professor Katie Sykes’ class, “Designing Legal Expert Systems: Apps for Access to Justice.” It’s been a welcome exercise in creativity and an exciting introduction to artificial intelligence (made possible by very beginner-friendly software from Neota Logic). It’s also, however, made me question my relationship with technology. In particular, I’ve been thinking about another kind of gap, found between what we wish technology could do for us and what we’re actually using it for in our day-to-day lives.

It’s not always obvious that our relationship with technology evolves as fast as the technology itself, partly because we don’t make many conscious choices about how heavily we rely on it. None of us woke up one morning, for instance, and decided to designate our cell phone as our hand-held computer, GPS, and mobile personal assistant. Most advances in tech, whether in health, communications or artificial intelligence, creep up on us. When we do make choices, they’re constrained by what we are offered on the market as consumers. I think this translates to a lot of wasted potential. The carefully curated features of the latest “smart” devices out there are hardly a response to our cries for help. Many smart products are designed to solve “problems” that don’t exist at all, or at least not for a majority of this planet. I am reminded of this every time my washing machine decides it needs to lock my clothes inside it and I’m forced to unplug it to win them back.

In the course of solving problems that don’t exist, technology also creates problems we’ve never seen before. Earlier this year, news broke about artificial intelligence that can detect, with considerable accuracy, someone’s sexual orientation from their photographs alone. My initial awe quickly gave way to concern about the gross violations of human rights and privacy that would result if this AI were abused. In these murky waters, our relationship with technology devolves further, and we’re relegated from consumers to mere subjects.

As consumers or subjects, what can we really do about useless, invasive or unsettling uses of AI? It’s clear to me that the engineer-consumer divide in how we interact with tech isn’t conducive to socially responsible or responsive innovation. To my mind, challenging this dichotomy is a good place to start, and those of us building “apps for access to justice” have been given the opportunity to do just that. In the legal context there is enormous potential and incentive to harness the power of AI to serve our own needs as well as the needs of our colleagues, our clients, or the public in general. These are specialized needs, and they require tomorrow’s lawyers to experiment as creators and innovators if they are ever going to be met.

Access to justice is a real problem, and real solutions are possible with the use of tools like artificial intelligence. The first step in discovering those solutions is to recognize the role we have to play as creators in control of our tech.


What about the really poor and marginalized?

I have to preface this post by saying I am an AI skeptic. That’s not to say I don’t see the value in certain technologies and apps, especially where they help lawyers save time and therefore help the client save money. However, I would argue that this kind of technology is most likely to help the middle- to upper-class client save money.

What about those people on the lowest socioeconomic rungs, for whom access to this technology is either impossible or impracticable? Even gaining access to the internet at a public library may be beyond the grasp of certain clients who either cannot physically get to the library or cannot understand the technology in the first place. Persons with disabilities and senior citizens readily come to mind here. I think we’re moving in the right direction with classes such as Designing Legal Expert Systems and the CRT Knowledge Engineering courses offered at TRU Law. However, the more we use computers and similar technology to increase access to justice for some people, are we dramatically reducing it for others? Are we in fact widening the gap in access and leaving the already marginalized behind?

Food for thought.