‘World’s first robot lawyer’ short-circuited by prosecutors, faces class action suit

DoNotPay chatbot app raises questions, and hackles, about the use of AI for legal services

In recent years, artificial intelligence has been swiftly permeating the legal world. Legal research tools like Westlaw and LexisNexis have used a type of AI, called Natural Language Processing (NLP), for more than a decade, and now ChatGPT has passed the bar exam. But when I first heard the term “robot lawyer,” I just had to find out what that was all about.

Enter DoNotPay, a New York-based tech company that dubbed itself the “world’s first robot lawyer.”

As a child of the ‘80s, I recalled several science fiction films that featured “futuristic” technology like self-driving cars, video calls, and military drones, all of which have since come true. This line of thought inevitably brought to mind the film “The Terminator” and its tech corporation, Cyberdyne Systems, which was responsible for the development of Skynet, a self-aware AI bent on eradicating humanity. I envisioned an Arnold Schwarzenegger lookalike in a suit bursting through the courtroom doors.

However, I learned that DoNotPay’s robot lawyer is just a legal service chatbot (aka lawbot) app, and its creator pulled a publicity stunt on Twitter.

DoNotPay CEO and founder Joshua Browder created the company in 2015 as web-based software to help consumers contest parking tickets. It later became an app built on OpenAI’s GPT-3 platform and expanded to include other legal services, such as generating demand letters and tracking down money from unclaimed inheritances and forgotten refunds.

The landing page of the DoNotPay website boasts that it can help consumers “fight corporations, beat bureaucracy, find hidden money, and sue anyone.”

In January 2023, Browder announced that on Feb. 22, DoNotPay’s “robot lawyer” would represent a defendant fighting a parking ticket in an actual courtroom. Through Apple AirPods in the defendant’s ears, the AI would listen to the case and provide real-time advice to its client.

A few days later, Browder took to Twitter to raise the stakes with the following statement:

“DoNotPay will pay any lawyer or person $1,000,000 with an upcoming case in front of the U.S Supreme Court to wear AirPods and let our robot lawyer argue the case by repeating exactly what it says. We have upcoming cases in municipal (traffic) court next month. But the haters will say ‘traffic court is too simple for GPT.’ So, we are making this serious offer, contingent on us coming to a formal agreement and all rules being followed. Please contact me if interested!”

Browder’s plans never came to fruition. In late January, he tweeted that he was pulling the plug after receiving “threats” from “State Bar prosecutors.” He claimed one of the prosecutors told him that if he proceeded, he could face six months of jail time for the unauthorized practice of law.

He later told the Twitterverse that the company is “postponing our court case and sticking to consumer rights.”

‘People are overestimating it’

Attorney John Weaver, who wrote the book “Robots Are People Too: How Siri, Google Car, and Artificial Intelligence Will Force Us to Change Our Laws,” says he doesn’t think robot lawyers will become a thing anytime soon. Weaver is on the Board of Editors for RAIL: The Journal of Robotics, Artificial Intelligence & Law and writes a column, “Everything Is Not Terminator.”

“People are overestimating what it [AI software] can do in the next two years and underestimating what it can do in the next ten,” Weaver says. “But I suspect that leap from using technology as an assistant in court to actually licensing attorneys that are not human beings is going to be a bridge too far for the foreseeable future.”

Weaver says there are a lot of ways using software like DoNotPay for legal services could go wrong.

“Chef’s kiss as a publicity stunt – very well done,” Weaver says of DoNotPay’s robot lawyer scheme. “But I would say there’s a word of caution – lessons from this story for two groups. One, for bar associations and courts, there’s a certain population where these services and products are appealing. Courts and bar associations should think about how they want to respond to that. The other cautionary tale is for the consumers that use these to think carefully about the quality of the service or representation they are receiving. What data is being used to train it? Do the parties behind these services and applications have ulterior motives? Are they really in the business of providing legal services or are the legal services they claim to provide just a loss leader to fund and support their actual business model?”

On March 3, Chicago-based law firm Edelson PC filed a complaint against DoNotPay in San Francisco Superior Court, seeking class action status. The complaint, filed on behalf of former DoNotPay customer Jonathan Faridian, alleges that the company is practicing law without a license and that it misleads the public about its services.

In the complaint, attorney Jay Edelson says, “Unfortunately for its customers, DoNotPay is not actually a robot, a lawyer, nor a law firm. DoNotPay does not have a law degree, is not barred in any jurisdiction, and is not supervised by any lawyer.”

Browder denies any wrongdoing and says he will vigorously fight the lawsuit. He subsequently took to Twitter once again, saying Faridian’s claims have “no merit” and that DoNotPay is “not going to be bullied by America’s richest class action lawyer.”

Colossus system

“This is just a guy playing P.T. Barnum with something, and it sounds like it backfired on him,” Amherst attorney Kirk Simoneau says. “I don’t think we are close to Skynet.”

Simoneau is all for AI that helps with more immediate access to information, such as Westlaw with its NLP, but for certain practice areas like his own that involve persuasion and being in court, he believes the effectiveness of AI starts to wane.

“Here’s a really good example,” Simoneau says. “I was at the law school this morning with the Webster Scholars. We do a training every year on the DOVE Project, and today was the day the students presented their cases. They had a pretend trial, pretend witnesses, the whole shooting match. All those students had the exact same information. They were all given the same intelligence – if you will – the same law, the same statutes, the same exact fact pattern, the same cases. Every single one of those students presented it differently and with different levels of effectiveness.”

Simoneau adds, “AI can be super helpful in the legal profession to quickly search through every case that’s out there for the relevant precedents. But then what do I do with it? I don’t think machine learning and AI are going to be able to take over the ‘what you do with it?’ part very effectively.”

In the personal injury field, people have been using AI for decades through a system called Colossus. The program uses algorithms to find prior verdicts and uses them to place a value on an injury.

“The computer program says, ‘Well, your client has a broken arm? Here is what the broken arm is worth – we are going to pay you based on what the medical costs are across the country.’ It’s all AI-driven,” Simoneau says. “But what does the computer do when you say, ‘Well, wait a minute. My client with a broken arm is deaf, and they use their arm to communicate using sign language.’ The computer program doesn’t know what to do with that.”

This article is being shared by partners in the Granite State News Collaborative. For more information visit collaborativenh.org.
