How should lawyers use AI?
We’re at the start of a long journey of generative artificial intelligence (AI) becoming a major part of our lives.
Lawyers are no exception: we are already using AI for a range of tasks in our day-to-day work.
In recent months, a lot of new material has been published by various courts seeking to put a framework in place for the appropriate use of AI by legal practitioners.
In light of these new developments, it’s a good time to analyse the materials and see what lawyers need to know, especially in Victoria and NSW.
Summary
AI is already systematically used for discovery, legal research, data analysis, contract review, drafting, legal writing and other tasks, both with legal-specific applications like Harvey (discussed here), Lexis+ AI and CoCounsel by Thomson Reuters, and with general AI tools such as ChatGPT and Google's NotebookLM. This will increase.
There are many advantages and limitations of AI. On the upside, AI can reduce costs, improve efficiency, analyse large amounts of information, and improve productivity.
On the downside, our duty as lawyers is to apply our independent judgment, care and skill. The integrity of the legal system is undermined if we delegate or surrender our critical and independent professional thought to a computer function that may not be reliable, accurate, unbiased, complete or correct.
The main concern of the courts is to ensure that inaccuracy and laziness do not enter the legal process.
Victorian AI Guideline
Back in May 2024, the Supreme Court of Victoria quietly published a ‘Guideline for litigants: Responsible use of artificial intelligence in litigation’ (Guideline).
The Guideline makes clear that AI tools are not the product of reasoning, nor are they a legal research tool. They use probability to predict a given sequence of words.
The Guideline does not have the status of a practice note.
It clarifies the court’s expectations around the use of AI, but does not contain any requirements for practitioner conduct.
The Guideline says only that parties are encouraged to disclose the use of AI, so that the provenance of a document can be properly understood.
I suspect that as the use of AI increases, this may be upgraded to a requirement to make such a disclosure, as is now the case in NSW.
The strongest wording in the Guideline is that if you sign or certify a document, you are representing that it is accurate and complete.
It says at [9]: “Reliance on the fact that a document was prepared with the assistance of a generative AI tool is unlikely to be an adequate response to a document that contains errors or omissions”.
Regarding evidence like affidavits and witness statements, the Guideline says at [10]: “particular caution needs to be exercised”… “to ensure that documents are sworn/affirmed or finalised in a manner that reflects the person’s own knowledge and words”.
I suspect that in light of more forceful regulations in other jurisdictions (see NSW and Singapore below), Victoria might revisit the Guideline as the use of AI increases.
Victorian Law Reform Commission consultation paper
Before we leave Victoria, the VLRC published a consultation paper on 17 October 2024 about AI in Victoria’s courts and tribunals. It has a good summary of the regulatory framework for legal professionals, especially the paramount duty to the court and the duty to clients at [7.42] to [7.50].
NSW AI Practice Note
On 21 November 2024, the NSW Supreme Court released Practice Note SC Gen 23 – Use of Generative Artificial Intelligence.
On 3 December 2024, the Chief Justice of NSW held a ‘Generative AI Practice Note Briefing’ to discuss the NSW practice note, which can be viewed on YouTube.
The NSW approach is to allow the use of generative AI in some instances (submissions, outlines of argument, etc) and not in others (affidavits, witness statements, etc).
The Practice Note is noteworthy because it introduces new requirements of parties and practitioners, including amendments to the Uniform Civil Procedure Rules, such as:
AI may be used to generate written submissions or outlines of argument, but the author must verify that the authorities and sources relied upon exist, are accurate, and relevant to the proceedings.
Affidavits, witness statements and character references must contain a disclosure that AI was not used to generate their content (unless leave is sought for an exceptional purpose).
Experts may use AI in the preparation of reports (for example to analyse information, large data sets, or conduct statistical analysis) but must disclose how the AI was used, and keep records of the program and variables used.
Generally, AI must not be used to produce, alter or embellish evidence.
Materials subject to Harman undertakings, suppression orders, etc, must not be entered into any AI program.
The NSW practice note commences on 3 February 2025, including the disclosure requirements (that AI was not used to generate affidavits and witness statements), the verification requirement (regarding authorities cited in submissions), and applications for leave to use AI for expert reports.
Separately, in July 2023 the NSW Bar Association published a paper called ‘Issues Arising from the Use of AI Language Models (including ChatGPT) in Legal Practice’. It has a good section on barristers’ ethical rules of independence, competence and diligence.
Singapore AI Practice Note
For those wanting more of a deep-dive into how it’s done elsewhere, on 23 September 2024 the Singapore Supreme Court published its ‘Guide on the Use of Generative Artificial Intelligence Tools by Court Users’, which has guidelines for:
Ensuring accuracy;
Protecting intellectual property rights and confidential or sensitive information;
What you may be required to do if you use AI in court documents; and
Consequences for failure to comply with the practice note (including disregarding evidence, personal costs orders against lawyers, and disciplinary action).
Conclusion
I’ll leave it to you to read the practice notes and watch the NSW briefing. Obviously Victorians are only strictly bound by the Victorian Guideline (for now) and the usual professional conduct and ethics rules. No doubt we will all be watching this space as it evolves. Below is a general framework I’ve gleaned from reading these materials.
General principles for the use of generative AI:
Do:
Do exercise your independent forensic judgment at all times.
Do personally verify all AI-generated sources of legal authority for existence, accuracy and relevance.
Do read cases, statutes and other sources of law yourself. Don’t accept an AI summary or answer at face value.
Do ensure affidavits, witness statements and other evidence reflect the person’s own knowledge and words.
Do exercise caution when entering any information into an AI tool, to ensure you are not breaching an obligation of confidentiality, intellectual property rights, a suppression order, or the Harman undertaking.
Don’t:
Don’t use AI for research without exercising your professional skill, care and judgment.
Don’t generate affidavits or witness statements with AI.
Don’t generate expert reports with AI (unless there is a very good reason, in which case the use should be fully disclosed).
Don’t use AI to generate material ‘in the style of’ a certain person.
Don’t represent that work generated by AI is yours.
Disclosure: this article was written by me.