A lawyer now faces the possibility of sanctions after using ChatGPT to do research for a case. The research presented by the lawyer, Steven Schwartz, contained several citations to past court decisions that did not exist.
Schwartz and his law firm have been suing the airline Avianca on behalf of a client who claims he was injured on a flight to New York City. When the airline asked the judge to dismiss the case, Schwartz used ChatGPT to help prepare his 10-page brief arguing why the case should continue — only to suffer consequences and embarrassment.
The brief cited non-existent court decisions, including “Varghese v. China Southern Airlines,” “Martinez v. Delta Airlines” and “Miller v. United Airlines.” Schwartz said that ChatGPT insisted the fake cases were real, and that he was not aware ChatGPT could be factually incorrect.
I’ve mentioned in the past that when ChatGPT doesn’t know the answer to a question, it will often generate a fake one — and I must add that it can be quite convincing at times.
Schwartz said that he “greatly regrets” using ChatGPT and “will never do so in the future without absolute verification of its authenticity.” The hearing regarding the sanctions on Schwartz will be held on June 8.