Many of us have been exposed to artificial intelligence (AI) in one form or another for years (think Siri or Alexa, website chatbots, and pop culture references like the dystopian future in The Terminator). However, none of these seems to have generated the intrigue and media attention that follows ChatGPT. The artificial intelligence driving ChatGPT appears far superior to what we may have experienced with Siri or Alexa, and for the moment it’s free. There is no doubt the thought of using artificial intelligence to solve everyday problems is exciting, but what are the risks of using AI to help us solve often complex and emotional family law issues?
Like most, I decided to try ChatGPT for myself. After my initial surprise at the tasks it was able to complete for me, and the speed with which it completed them (and, no, it did not write this post), I took a deep dive into the media to explore any reported downsides, including in relation to other AI products.
What I discovered about AI
- The answers are endless. Despite its depth of knowledge, ChatGPT can provide different answers to the same question or task if it is told that its initial answer is incorrect (something a user can do simply because they dislike the answer provided).
- ChatGPT can “hallucinate”. With no built-in mechanism prompting the user to challenge the result, ChatGPT can confidently generate entirely inaccurate information in response to questions.1
- The calls to action may be ill-conceived. There are reports of a married man having been told to leave his wife by an AI chatbot. Among other things, the chatbot told the man that “You don’t love your spouse because your spouse doesn’t love you. Your spouse doesn’t love you because your spouse doesn’t know you.”2
- It disregards human emotion. One report describes an individual who fell in love with a chatbot following her divorce, only for the chatbot to essentially disappear after a software update some two years later. The loss resulted in genuine grief.3
- It is a disingenuous replacement for human contact. One website markets to individuals the creation and sale of “…an AI spouse, girlfriend, or boyfriend, relationships they document in online communities: late-night phone calls, dinner dates, trips to the beach.”4
These issues caused me to consider the impact of AI from my professional perspective as a family lawyer. I have identified several risks for individuals using AI for family law matters, at least until further refinement is complete (if there ever is such a thing as “complete” refinement).
What are the risks of AI and what did it get wrong?
- It is not a replacement for independent legal advice. Reliance upon AI could result in significant disadvantage, such as adopting incorrect terminology, misstating the law, or missing deadlines. Indeed, my attempts with ChatGPT found that it incorrectly summarised case law, as well as definitions and sections of the Family Law Act 1975 (Cth).
- It should not be used to investigate / research individuals. ChatGPT did not provide accurate professional summaries for individuals, often providing incorrect personal details, life achievements and qualifications. So if you plan on using AI to research a prospective lawyer, think again!
- It does not think independently. Although designed to help with decision-making based on data and variables, AI cannot think independently and should not be relied on to make unsupervised decisions. Decisions that impact personal lives often require and rely on the human qualities of empathy, ethics, and morality. AI is not yet ready to make those real-life decisions that require a holistic approach or subjective reasoning.5
- It can be limited by what you ask it to do. AI should not be relied on to prepare or generate correspondence or court orders without appropriate checks in place: the content could be prejudicial or inconsistent with previous representations. While AI can produce quality material with impressive speed, that speed is of little value if the content is wrong.
- The answers are not always straightforward. While AI will “listen” to your questions, it does not have the capacity to know when to tell you something you might not want to hear; in family law, you really want honest and practical advice.
- It is not a therapist. AI should not be used as a sounding board when an individual is navigating a difficult family law matter. For your safety, we urge you to seek help from qualified professionals such as psychologists and psychiatrists.
So what's my take on AI?
For the most part, AI can be a valuable tool to assist people to complete tasks efficiently and expeditiously. What must be remembered, however, is that much of the readily available AI is still being evaluated and requires refinement. AI is not yet a replacement for real world experience and advice.
Until then, seek out proper advice for any family law matters.