
On Thursday, a U.S. judge sanctioned two New York attorneys for submitting a legal brief that contained six bogus case citations generated by the artificial intelligence chatbot ChatGPT.
Judge P. Kevin Castel of the United States District Court for the Southern District of New York ordered attorneys Steven Schwartz and Peter LoDuca, along with their law firm Levidow, Levidow & Oberman, to pay a total fine of $5,000.
The judge concluded that the attorneys had acted in bad faith and had committed “acts of conscious avoidance as well as false and misleading statements to the court.”
In a statement released on Thursday, Levidow, Levidow & Oberman said its attorneys “respectfully” disagreed with the court’s finding that they had acted in bad faith.
“We made a good faith mistake in failing to believe that a piece of technology could be making up cases out of whole cloth,” the firm said.
Schwartz declined to comment, according to his attorneys. LoDuca did not immediately respond to a request for comment; his lawyer said they are reviewing the ruling.
Schwartz said in May that he had used ChatGPT to help research a brief in a client’s personal injury lawsuit against Colombian airline Avianca (AVT_p.CN) and had inadvertently included the fake citations in the document. LoDuca’s name was the only one that appeared on the brief, which Schwartz drafted.
In March, attorneys for Avianca first informed the court that they were unable to find several of the cases cited in the brief.
Bart Banino, an attorney for Avianca, said on Thursday that regardless of the lawyers’ use of ChatGPT, the court reached the “right conclusion” in dismissing the personal injury complaint. In a separate order, the judge granted Avianca’s motion to dismiss the lawsuit because it had been filed too late.
In Thursday’s sanctions ruling, the judge wrote that there is nothing “inherently improper” about attorneys using AI “for assistance,” but that attorney ethics rules “impose a gatekeeping role on attorneys to ensure the accuracy of their filings.”
The judge added that the attorneys “continued to stand by the fake opinions” even after the court and the airline questioned whether those opinions actually existed. His order also required the attorneys to notify each of the judges falsely identified as authors of the phony cases, all of whom are real, of the sanction.