Expert Witness Standards: Recent Changes Solicitors Should Know

Published March 31, 2026

The past six months have produced a rapid convergence of judicial guidance, professional body standards, and regulatory consultation that together amount to the most significant shift in expert witness practice since the Woolf reforms. For forensic accountants who give evidence in court, the message is straightforward: if you use artificial intelligence in any part of your work, you must disclose it. The question is no longer whether disclosure rules will arrive, but how quickly they will be formalised.

This article sets out the three key developments, explains what they mean for forensic accounting practice, and offers practical guidance for solicitors who instruct expert witnesses.

Three Developments in Quick Succession

Three separate but connected developments between October 2025 and February 2026 have reshaped expectations around AI use by expert witnesses.

October 2025: Judicial AI Guidance. Lord Justice Birss, Deputy Head of Civil Justice, published updated guidance on AI use within the judiciary. While primarily directed at judicial use of AI tools, the guidance set an important tone. It noted that "the use of AI by the judiciary must be consistent with its overarching obligation to protect the integrity of the administration of justice." This framing applies equally to expert witnesses, who share that obligation under CPR Part 35 and FPR Part 25.

January 2026: Academy of Experts AI Guidance. The Academy of Experts published comprehensive guidance on AI use by expert witnesses, carrying a foreword by Lord Neuberger of Abbotsbury, former President of the Supreme Court. The guidance introduces a structured risk framework and makes clear that the expert remains personally responsible for every element of their work product, regardless of any AI tools used in its preparation.

February 2026: CJC Consultation on AI in Court Documents. On 17 February 2026, the Civil Justice Council launched an eight-week consultation (closing 14 April 2026) examining whether new rules are needed to govern AI use in preparing court documents, including expert reports. The consultation proposes specific amendments to Practice Direction 35 that would require experts to declare AI use in their statement of truth.

Taken together, these three developments create a regulatory direction of travel that forensic accountants cannot afford to ignore.

Academy of Experts AI Guidance

The Academy's guidance is the most detailed of the three developments and merits close examination. It introduces a three-tier risk classification for AI use by expert witnesses.

Prohibited uses include the complete outsourcing of the expert's work to an AI tool, uses prohibited under applicable law or regulation, and any use that would breach the expert's overriding duty to the court. The rationale is self-evident: an expert who delegates their analysis entirely to AI cannot honestly sign a statement of truth confirming that the opinions expressed are their own.

High-risk uses require disclosure and justification. These include generating substantive content for the expert's report (including the expert's opinion), undertaking material analysis on which the opinion will be based, and creating counterfactual scenarios used as the basis for analysis. For forensic accountants, this category captures a significant portion of modern practice. Transaction categorisation software that uses machine learning, anomaly detection algorithms applied to bank statement analysis, and predictive modelling within DCF valuations all fall within this tier.

Low-risk uses encompass administrative tasks: transcription, formatting, scheduling, and general administrative support. These do not require disclosure, though the guidance sensibly notes that the boundary between administrative and substantive use is not always obvious and should be assessed honestly.

The core message is that the expert remains ultimately responsible for their work product. AI cannot substitute for independent professional judgment. Where AI tools contribute to the analysis, the expert must understand what the tool did, verify its output independently, and be prepared to explain and defend the methodology under cross-examination.

CJC Consultation and Proposed PD35 Amendments

The CJC consultation, chaired by Sir Colin Birss, goes further than the Academy guidance by proposing amendments to the procedural rules themselves. The key proposal for expert witnesses is an amendment to Practice Direction 35 so that the statement of truth must include confirmation that the expert has "identified and explained any AI which had been used, other than for administrative uses such as transcription" and has "identified the AI tools used."

The consultation document cites "recent cases in which experts inadvertently relied on inaccurate AI-generated content" as the rationale for these proposals. While no specific cases are named in the public consultation paper, the concern is real and familiar to anyone working in the field. Large language models can generate plausible but fabricated case citations, inaccurate financial data, and confident but wrong conclusions. An expert who incorporates such material into a report without independent verification risks professional embarrassment and, more seriously, misleading the court.

The consultation also proposes parallel disclosure requirements for witness statements (where litigators would declare that AI was not used in preparation) and for pleadings. The focus on expert reports is sharper, however, reflecting the particular danger that AI-generated analysis could be presented as the expert's own professional opinion without adequate verification.

The consultation closes on 14 April 2026. Even before formal rule changes take effect, we expect courts to treat the CJC's proposed standards as a benchmark of good practice. Experts who proactively comply will strengthen their credibility; those who do not may find themselves explaining an uncomfortable gap.

What This Means for Forensic Accountants

Forensic accounting sits at the intersection of several high-risk AI use cases identified in the Academy guidance. Our work commonly involves data analytics tools that increasingly incorporate machine learning. Consider the typical workflow in a forensic investigation:

Transaction categorisation. Modern accounting software and forensic tools use pattern recognition to categorise thousands of bank transactions. Where a machine learning algorithm classifies a payment as "personal expenditure" rather than "business expense," that classification may directly affect the forensic accountant's opinion on hidden income or lifestyle inconsistency. Under the Academy guidance, this is a high-risk use requiring disclosure.

Anomaly detection. Statistical tools that flag unusual transactions for further investigation are increasingly common in fraud work. Benford's Law analysis, for example, can be performed manually or by software. Where the software incorporates AI-driven pattern recognition beyond simple statistical tests, the methodology must be disclosed and the expert must be able to explain how the tool identified anomalies.
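To make the boundary concrete: a plain Benford's Law first-digit test is a statistical procedure the expert can implement, explain, and defend line by line, in contrast with an opaque AI-driven detector. A minimal sketch in Python follows; the function names and the chi-square comparison are illustrative, not drawn from any particular forensic tool.

```python
import math

def benford_expected(digit):
    """Expected first-digit frequency under Benford's Law: log10(1 + 1/d)."""
    return math.log10(1 + 1 / digit)

def first_digit(amount):
    """Leading significant digit of a (non-zero) amount in plain decimal form."""
    return int(str(abs(amount)).lstrip("0.")[0])

def benford_test(amounts):
    """Compare observed leading-digit frequencies with Benford expectations.

    Returns a chi-square statistic; larger values indicate greater
    departure from the Benford distribution and may merit investigation.
    """
    n = len(amounts)
    observed = {d: 0 for d in range(1, 10)}
    for a in amounts:
        observed[first_digit(a)] += 1
    chi_sq = 0.0
    for d in range(1, 10):
        expected = n * benford_expected(d)
        chi_sq += (observed[d] - expected) ** 2 / expected
    return chi_sq
```

An expert relying on a test like this can state the formula, the sample, and the threshold applied. Where a vendor tool layers machine-learned pattern recognition on top, the expert needs an equivalent account of what the tool did before the output can safely support an opinion.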

Valuation modelling. A discounted cash flow model that uses AI-assisted revenue forecasting, whether through analysis of market data, competitor performance, or economic indicators, falls squarely within the high-risk category. The expert must disclose the AI tool used, explain how it informed the forecast assumptions, and demonstrate that they independently assessed and validated the output. This is particularly relevant in business valuations where the sensitivity of the DCF model to revenue assumptions is well understood.

Report drafting. Using AI to draft sections of an expert report is a high-risk use. The expert must ensure that every statement of opinion, every factual claim, and every calculation has been independently verified. The signed statement of truth attaches personal responsibility: "I confirm that I have made clear which facts and matters referred to in this report are within my own knowledge and which are not." That obligation is not diminished by the involvement of an AI drafting tool.

Practical Compliance Steps

Drawing on the Academy guidance and the CJC's proposed amendments, we recommend that forensic accountants acting as expert witnesses adopt the following practices now, ahead of any formal rule changes.

Audit your software stack. Identify every tool you use in forensic work that incorporates AI or machine learning. This includes accounting software, data analytics platforms, document review tools, and any generative AI used in research or drafting. Many tools incorporate AI features that are not prominently labelled. Check with vendors if necessary.

Develop a standard AI disclosure paragraph. Include a section in every expert report that identifies any AI tools used in the preparation of the report, describes the purpose for which each tool was used, confirms the steps taken to verify AI-generated output independently, and states that the opinions expressed remain the expert's own professional judgment.

Document your verification process. Where AI tools contribute to the analysis, maintain a clear record of how the output was verified. For transaction categorisation, this might mean a sample check of categorised items against source documents. For valuation modelling, it means documenting the assumptions tested and the sensitivity analysis performed on AI-generated inputs.

Retain AI working papers. Keep records of AI prompts, inputs, and outputs as part of the expert's working papers. These are potentially disclosable and should be treated with the same rigour as any other working paper underlying the expert's opinion.

Prepare for cross-examination on AI use. Opposing counsel will increasingly explore whether and how AI was used in preparing an expert report. The expert should be comfortable explaining their AI use, their verification process, and why the AI output was or was not relied upon. An expert who cannot explain the methodology of a tool they used will struggle to defend their opinion under scrutiny.

What Solicitors Should Ask When Instructing Experts

The changes described above have direct implications for how solicitors manage expert instructions. We suggest the following additions to standard practice.

Address AI in the letter of instruction. Include a provision specifying that the expert must comply with the Academy of Experts AI guidance and, where applicable, identify any AI tools used in preparing the report. This sets expectations from the outset and provides a contractual basis for requiring compliance.

Ask about the expert's AI policy. Before appointment, ask the prospective expert whether they have an AI use policy, what tools they use, and how they verify AI-generated output. An expert who has thought through these questions demonstrates the kind of methodological rigour that strengthens their evidence. One who has not may present a risk.

Consider the cross-examination angle. When reviewing a draft expert report, consider whether the opposing side could challenge the expert's credibility on AI grounds. If the report uses AI-derived analysis without adequate disclosure, flag this before the report is finalised. It is far better to address the issue proactively than to have it exposed in cross-examination.

Monitor the CJC consultation outcome. The consultation closes on 14 April 2026. The resulting rules, if adopted, will apply to all expert reports in civil proceedings. Solicitors should keep their expert witness panels informed of developments and update standard instruction letters accordingly.

The direction of travel is clear. Transparency about AI use is becoming a professional obligation for expert witnesses, not merely a matter of best practice. Forensic accountants who embrace this shift early, by developing clear policies, documenting their processes, and disclosing AI use in their reports, will strengthen their standing as credible, trustworthy experts. Those who treat it as an inconvenience risk finding that their evidence is discounted or their credibility undermined at the worst possible moment: in the witness box.

Key Takeaways

  • The Academy of Experts (January 2026) introduced a three-tier risk classification for AI use by expert witnesses, endorsed by Lord Neuberger
  • The CJC consultation (closing 14 April 2026) proposes amending PD35 to require AI disclosure in the expert's statement of truth
  • Forensic accounting tools involving transaction categorisation, anomaly detection, and valuation modelling commonly fall within the "high-risk" category
  • Experts must be able to explain AI methodology and defend their verification process under cross-examination
  • Solicitors should address AI use in letters of instruction and ask prospective experts about their AI policies

If you need to instruct a forensic accountant expert witness who maintains transparent AI compliance practices, contact Jack Ross or call 0161 832 4451.

Last updated: March 2026

Frequently Asked Questions

Do expert witnesses currently have to disclose AI use in their reports?

There is no formal rule requiring disclosure yet, but the Academy of Experts guidance (January 2026) sets a professional standard that courts are likely to treat as best practice. The CJC consultation proposes amending Practice Direction 35 to make AI disclosure a formal requirement in the statement of truth. Prudent experts are already including voluntary AI declarations in their reports.

What counts as high-risk AI use in forensic accounting?

Under the Academy of Experts framework, high-risk uses include generating substantive report content, undertaking material analysis on which the expert's opinion is based, and creating counterfactual scenarios for analysis. In forensic accounting, this covers machine learning-based transaction categorisation, AI-driven anomaly detection in bank statements, AI-assisted revenue forecasting in DCF valuations, and any use of generative AI in drafting opinion sections of the report.

Can an expert use AI to help draft their report?

Using AI for drafting is not prohibited, but it is classified as a high-risk use. The expert must independently verify every factual claim, calculation, and opinion statement. The signed statement of truth confirms the opinions are the expert's own. Complete outsourcing of the report to AI is prohibited. The practical requirement is that the expert must be able to explain and defend every part of the report as their own work, regardless of the tools used in preparation.

When will the proposed CJC rules take effect?

The CJC consultation closes on 14 April 2026. After that, responses will be analysed and recommendations made to the Civil Procedure Rule Committee. There is no fixed timetable for implementation, but given the strong support from the judiciary and professional bodies, amendments could take effect within 2026. Courts are likely to expect voluntary compliance with the proposed standards in the interim.