Written by James Conomos
Artificial intelligence is no longer just a business tool. It is increasingly part of the evidentiary landscape, and it is beginning to affect how disputes are run and how digital material is assessed in litigation. Emails drafted with AI assistance, AI-generated reports, deepfake-style audio or video, altered voice recordings and synthetic documents are now realistic possibilities in commercial disputes.

Australian courts are now confronting a practical question: can digital evidence still be trusted in the same way it once was?

For businesses, this is not an abstract concern. Many disputes turn on what was said in an email, what was agreed in a message thread, or what was captured in a recording. If those materials are challenged as manipulated or fabricated, the dispute can quickly shift away from the underlying commercial issue and into a technical contest about authenticity and proof. Understanding how courts approach authenticity is becoming critical.

The Rise of Synthetic Content

AI tools can now generate convincing text, images, audio and video in seconds. Many of these tools are used legitimately and responsibly in everyday business operations. However, the same technology can also be used to fabricate documents, mimic voices, alter recordings or create realistic but false digital material.

The issue is not simply that this can be done, but that it can be done cheaply and without specialist expertise. A fabricated email chain can be made to look genuine. A recording can be edited to alter meaning without obvious signs of tampering. A document can be subtly amended before it is produced in proceedings. These are no longer remote or hypothetical scenarios.

In commercial litigation, where digital documents often form the backbone of a claim or defence, this creates a new layer of risk.

Authenticity and Proof in Australian Courts

The rules of evidence have not fundamentally changed. 
Whether proceedings are governed by the Uniform Evidence Acts (including the Evidence Act 1995 (Cth) and equivalent State legislation) or Queensland's Evidence Act 1977 (Qld), parties must still establish that the material they rely upon is what it purports to be and can be treated as reliable.

Historically, authenticity disputes have often been resolved through relatively straightforward evidence about authorship, storage and alteration. In many cases, the reliability of a document was assumed unless there was a clear basis to doubt it.

AI alters that landscape. Manipulation may not leave obvious traces. AI-generated content can appear coherent, professional and entirely plausible, even if it is false. This does not mean courts will accept AI-affected material without scrutiny. If anything, it increases the importance of demonstrating the provenance and integrity of digital records.

Metadata, audit trails, server logs and, in some cases, forensic expert evidence are likely to become increasingly significant.

The Burden of Proof Has Not Changed

Even as technology evolves, the legal principles remain steady. The party relying on a document or recording continues to bear the burden of proving authenticity and relevance. What is changing is the complexity of proving it.

A dispute that once focused squarely on commercial facts may instead become dominated by questions about document history, file creation, access permissions, version control and whether AI tools were used to generate or modify the material. That shift can add cost, delay and uncertainty to proceedings, even where the evidence ultimately proves genuine.

For businesses involved in high-value disputes, strong document governance and secure record-keeping are becoming strategic safeguards.

Deepfakes and Altered Recordings

One of the most concerning developments is the rise of deepfake audio and video. In commercial disputes, recordings of meetings, negotiations or phone calls can be decisive. 
If a party alleges that a recording has been altered or synthetically generated, the evidentiary dispute may become as significant as the underlying commercial claim.

Even where allegations are unfounded, the widespread availability of AI tools makes authenticity easier to challenge. This can delay proceedings, complicate settlement discussions and require additional expense for technical analysis or expert reports.

Courts are alert to these risks, and challenges to electronic recordings are likely to become more frequent as the technology evolves.

Courts Are Already Responding

Australian courts have begun addressing AI-related risks directly, particularly in relation to litigation documents and submissions. In September 2025, the Supreme Court of Queensland issued Practice Direction No. 5 of 2025, acknowledging the increasing use of artificial intelligence in litigation while warning that generative AI tools may produce apparently plausible but inaccurate or fictitious material. The Court emphasised that responsibility for accuracy and integrity remains with the party and their legal representatives.

The message is clear: AI may be used as a tool, but it does not dilute professional and evidentiary obligations.

What Businesses Should Be Doing Now

The most effective response to authenticity challenges is preparation. Businesses should not wait for litigation to consider whether their digital records would withstand close scrutiny.

Robust document management systems, secure storage practices and disciplined version control are increasingly important. Preserving metadata and maintaining clear audit trails can be critical if authenticity is questioned.

Internal awareness also matters. Staff should understand that AI tools can introduce risk, particularly in sensitive communications, contract drafting or internal reporting. 
Clear policies about when AI may be used and how that use is documented can significantly strengthen a business's position if evidence is challenged.

These measures do not eliminate risk, but they enhance credibility - and credibility often shapes outcomes.

AI as Evidence in Its Own Right

In some disputes, AI systems themselves may become the subject of evidence. A claim may turn on how an algorithm reached a decision, whether it relied on flawed data, or whether an automated process operated as intended.

In those cases, transparency and documentation become critical. Businesses deploying AI tools in operational or decision-making contexts should be able to explain how those systems function, what data they rely upon and what safeguards are in place. Without that clarity, defending a claim may become considerably more difficult.

Early Legal Strategy Matters

Disputes involving digital evidence and allegations of manipulation are rarely straightforward. They often require early strategic decisions about preservation of records, forensic investigation and expert engagement. Acting too late can result in loss of metadata, incomplete audit trails or an inability to clearly explain how evidence was created and stored.

Early advice and timely evidence preservation can shape the direction and sometimes the outcome of a dispute.

A New Evidentiary Landscape

AI is not replacing the legal system, but it is reshaping how evidence is created, challenged and proved. As synthetic content becomes more sophisticated and accessible, authenticity is likely to become a more frequent battleground in commercial litigation.

Strong governance, secure systems and proactive planning are increasingly essential. 
Where the authenticity of digital evidence is questioned, preparation may determine whether a business can prove its case, defend its position or resolve the dispute efficiently.

If your organisation is navigating a dispute involving digital material, or is seeking to strengthen its internal safeguards before one arises, careful legal planning now may avoid far greater difficulty later. Our team is here to assist. Contact us to start protecting your position with confidence.
