Careful supervision is necessary to identify hallucinations in work produced using artificial intelligence tools.
Solicitors must work to high ethical standards and ensure their work is accurate and compliant with their professional obligations under the Legal Profession Uniform Law’s Australian Solicitors’ Conduct Rules. This extends to work produced with the assistance of artificial intelligence (AI), especially generative AI (Gen AI).
Seeing Things? Spotting AI Hallucinations
AI usage is fraught with risk for solicitors because it can produce references to fictitious cases and legislative citations, as well as to fabricated secondary sources, such as legal texts, articles or reports that do not exist. These are often referred to as “hallucinations”.
AI may also produce citations that are correct but are not an authority for the proposition they are said to support, or do not take account of legal developments or jurisdictional differences. Hallucinations and errors in AI content can be difficult to identify because the output may appear fluent, persuasive and reliable, and is often interspersed with genuine material.
A concerning number of reported cases in Australian courts and overseas have emerged where solicitors have sought to rely on hallucinated citations.
Beyond disciplinary action, solicitors who rely on false citations or other errors in AI-generated work product face exposure to personal costs orders, along with the time, inconvenience and delay caused to the parties and courts who must deal with the errors.
AI Errors Can Lead to Liability
In the Federal Court decision of Murray on behalf of the Wamba Wemba Native Title Claim Group v Victoria [2025] FCA 731, a law firm was ordered to personally pay the costs of other parties to the proceeding on an indemnity basis. This arose from the reliance on fake citations and references to secondary material in court documents, which resulted in significant wasted time and cost to the parties and court.
The Court held that:
“The error was centrally one of failing to check and verify the output of the search tool, which was contributed to by the inexperience of the junior solicitor and the failure of [the principal] to have systems in place to ensure that [the solicitor’s] work was appropriately supervised and checked” [at 15].
Responsibility rests with principals to supervise and review all client work – including documents produced with the assistance of AI – and apply their own forensic judgment to ensure that the work is accurate and relevant. It is not sufficient merely to check that citations and references exist and are accurately reproduced. Solicitors and their supervisors have a duty to review referenced cases, legislation and source material carefully to ensure that they:
- are an authority for the intended proposition or principle
- reflect current law and have not been distinguished or overturned by a higher court
- are relevant to the client’s case and circumstances
AI should not be used as a substitute for personal research by traditional methods such as authorised law reports, electronic case citators, legal reference books or ebooks and government legislation websites.
Guidelines for Responsible AI Use
The Supreme Court of Victoria, the County Court of Victoria and the Victorian Legal Services Board and Commissioner (VLSB+C) have produced guidelines for the use of AI that all solicitors should read and comply with if AI is used in their practice. The LIV’s AI Hub also provides further resources and guidance for solicitors.
Suggested safeguards and systems for supervising client work generated with the assistance of AI include:
- understand how individual AI products operate – and their shortcomings and risks – and regularly monitor the output of products used by the firm
- implement clear policies and procedures for the firm’s use of AI, including what products may be used and when, as well as how the firm will supervise their use and check work product for accuracy and relevance
- review policies regularly to ensure they are best practice and compliant with court and VLSB+C guidelines
- ensure all staff receive regular training on how to properly use approved AI tools and check work product
- have robust systems in place to monitor compliance with firm policies for AI and supervision – this includes protecting against the risk of solicitors using AI products outside the firm’s authorised tools, such as on their personal devices.
The bottom line is that solicitors and their supervising principals should check all client work and court documents for accuracy and relevance, including carefully reviewing the underlying law. This should be done for all work – whether AI is used or not.