Artificial intelligence (AI) misled lawyers before a South African court in the coastal province of KwaZulu-Natal, with serious consequences: the AI invented facts.
The lawyers had used ChatGPT, an AI chatbot, to speed up their research and find supplementary case law to support their arguments in a dispute before the High Court in Pietermaritzburg.
The AI appeared to deliver: the legal representatives submitted a notice of appeal citing several authorities and precedents.
The judge conducted his own search, including with ChatGPT, to verify one of the citations. To his astonishment, many of the cited cases appeared in no recognized legal database.
The court ultimately ruled against the plaintiff, stating in the written judgment: "The court has gained the impression that the lawyers placed false trust in the veracity of AI-generated legal research and, out of laziness, failed to check this research."
Facts plucked out of thin air
Tayla Pinto, a lawyer specializing in AI, data protection and IT law, sees a growing threat to the profession: "When asked how this happened and where the citations came from, the legal counsel admitted to using generative AI," Pinto told DW. "This shows that the problem of lawyers not knowing how to use generative AI responsibly and ethically is growing."
According to Pinto, there have been three cases in South Africa in which the legal advisors involved used AI to draft their court documents: a defamation trial in 2023; the Pietermaritzburg High Court case, which caused a stir in 2024 and is now being reviewed by the Legal Practice Council, the provincial bar association; and, in June, a similar misuse of AI in a case brought by mining company Northbound Processing against the South African Diamond and Precious Metals Regulatory Authority.
A human-made problem
The case was brought by Philani Godfrey Mavundla, who was suspended as mayor of the Umvoti municipality in KwaZulu-Natal. At first instance, he even prevailed against the responsible regional authority.
However, the latter lodged an appeal, and Mavundla's lawyers apparently relied blindly on the truthfulness of the case law the AI had supplied before the High Court.
This is not a technological problem, says lawyer Pinto. "We've always used technology in the form of calculators, spelling and grammar checkers and so on. Now it's becoming a human-made problem."
She added: "Given the way and pace at which AI is developing, if we are to use AI, we must ensure that we do so in a way that is ethical, responsible and consistent with the duties we have undertaken as a legal profession."
The court dismissed Mavundla's application for leave to appeal in the municipal leadership case on the grounds that it had poor prospects of success, and criticized the pleading of the case as flawed and unprofessional.
The judge ordered Mavundla's law firm to pay the costs of additional court appearances. With this order, the court expressed its disapproval of the law firm's conduct in submitting unverified and fictitious legal evidence.
A copy of the judgment was sent to the Legal Practice Council in KwaZulu-Natal for investigation and possible disciplinary action against the lawyers involved.
Abuse of AI-generated content
Very few formal complaints have been lodged, although a number of matters are now starting to be referred to the Legal Practice Council (LPC), confirms Kabele Letebele, spokesperson for the LPC in Johannesburg.
The LPC continues to monitor developments and trends around artificial intelligence, he says. "At this stage the LPC holds the view that there is not yet a need for a new ethical rule and that our existing rules, regulations and code of conduct are adequate to deal with complaints regarding the use of AI, even though the debate on this continues within the LPC," Letebele tells DW.
According to Letebele, legal practitioners are cautioned against blindly citing case law retrieved with AI tools, as inaccuracies will be deemed negligence and potentially misleading to the court.
He stresses that the LPC Law Library is available to legal practitioners at no cost, enabling them to verify citations and find the latest case law and legal research when preparing legal matters.
In addition, the LPC conducts awareness webinars for legal practitioners to highlight the issues it is picking up and to show how they can avoid contravening its rules, regulations and code of conduct.
Judges, prosecutors and court officials need to be aware that briefs and arguments can now contain not only human errors, but also AI errors.
"Judges rely heavily on the submissions of lawyers during court hearings, especially on legal questions," says Mbekezeli Benjamin, a human rights lawyer and spokesperson for Judges Matter, in a DW interview. The organization advocates for more transparency and accountability in the judiciary.
Benjamin is greatly concerned that lawyers rely too heavily on AI, whose susceptibility to error can mislead the court. "This significantly weakens the judicial process because, unfortunately, it creates mistrust among judges regarding the accuracy of the statements made by lawyers in their arguments," he says.
Amend code of conduct for legal practitioners
Lawyer Tayla Pinto sees no need for specific regulation of the use of AI for judicial research, but does see a need for special attention to the review of references submitted using AI and compliance with ethical standards.
However, Benjamin says warnings within the legal profession to check AI-assisted work are not sufficient. "The Legal Practice Council should issue clear guidelines, including an amendment to the code of conduct, to regulate how AI may be used in judicial proceedings, and also make it clear that excessive reliance on AI content without reviewing it constitutes professional misconduct."
Benjamin also calls for a revision of the profession's code of conduct so that the improper use of artificial intelligence can be punished as a breach of duty, with hefty fines or even removal from the roll of legal practitioners.
The South African Law Society also warns that even the inadvertent submission of false information can ruin a career.