
Legal chatbots: something for nothing?

Published: Friday, 04 August 2017
By Jeff Pettit (US), Norton Rose Fulbright

In June, we introduced the topic of chatbots and highlighted some key risks and concerns associated with this growing area of technology. One business in particular, DoNotPay, made headlines recently by announcing that it would begin building legal chatbots for free.

The claim? In a July 14, 2017, posting to the online publishing platform Medium, Joshua Browder, founder of UK-based DoNotPay, writes, “Starting today, any lawyer, activist, student or charity can create a bot with no technical knowledge in minutes. It is completely free.” Sound too good to be true? To be sure, DoNotPay is not the first company to develop law-related chatbots—these bots are already popping up all over the world. But because this technology is still fairly new, chatbots that are attempting to automate services previously performed by licensed attorneys will almost certainly attract scrutiny.

While platforms such as DoNotPay and Chatfuel seek to lower barriers to entry by providing free tools to get a chatbot up and running, corporations and attorneys who integrate these chatbots into the delivery of legal services could face costly legal exposure. Before submitting your idea for automating legal services, it is important to stop and consider a few threshold questions:

  1. Who owns “your” bot? Property rights in the digital realm can be a murky area, especially when a user’s original content is mixed with a digital platform’s intellectual property. Previously on the blog we have discussed US IP ownership concerns as they relate to APIs, Instagram, and Facebook “likes.” Generally, the terms and conditions of the platform will address ownership rights and define the relationship between platform and user. Currently, DoNotPay does not appear to require that a user agree to any particular terms and conditions when submitting an idea for a chatbot. In the July 14 Medium posting, Browder asks and answers the ownership question this way: “Who owns the rights to the document? You do. However, you give us permission to make a bot for you and send the link.” Would this simple statement be sufficient under US law to allow you to claim a copyright interest in the resulting software code that DoNotPay generates? The answer is far from certain. What if a DoNotPay employee were to take your submission, develop it into a commercial product, and then monetize that product without your consent or involvement? Would you have any recourse? Again, there is very little certainty.
  2. Once the bot is up and running, who will own the input from users? Legal chatbots may need to collect sensitive personal data from the intended users. In this case, who owns (and is legally responsible to protect) the personal information collected by the bot? Privacy compliance should be a serious concern, as this is an area of heavy regulation that is fraught with liability. In today’s digital economy, “big data” and the “internet of things” are valuable commodities. Will the platforms claim any right to profit from the information collected by these bots? Are these types of arrangements being properly disclosed to the intended users so that consumers understand how their personal information is being used? These issues should be carefully addressed with the platform host upfront.
  3. Does this chatbot put you at risk for the unauthorized practice of law? In the legal realm, chatbots are sometimes praised for adding an element of efficiency to routine legal documents with a limited number of variables provided by the client. But this kind of legal document production could violate a jurisdiction’s unauthorized practice of law (“UPL”) rules. Most, if not all, jurisdictions require legal service providers to be licensed, but what constitutes UPL varies greatly from jurisdiction to jurisdiction. For instance, a jurisdiction could provide that a deed conveying an interest in real property may only be prepared by an attorney licensed in that jurisdiction. In the US, the American Bar Association reported in 2012 that 23 states were actively enforcing UPL rules. Courts are also expressing skepticism toward attempts at automating the generation of legal documents. See, e.g., Janson v. LegalZoom.com, Inc., 802 F. Supp. 2d 1053, 1065 (W.D. Mo. 2011) (“A computer sitting at a desk in California cannot prepare a legal document without a human programming it to fill in the document using legal principles derived from Missouri law that are selected for the customer based on the information provided by the customer.”). Broadly speaking, the more interactive the service in question, the higher the risk that the service may run afoul of UPL rules. And this sort of personalized interaction seems to be a primary goal of chatbot technology.

The questions raised in this post are just the tip of the iceberg. Chatbots certainly have the potential to be a major technological disrupter in the legal industry, but there remains a host of risks to consider before integrating this technology into practice.

 
Source: Norton Rose Fulbright
 
 


 
 
Copyright © Stoker Risk & ICT (Pty) Ltd 2004 - 2017.
All Rights Reserved.