
Risks of Using ChatGPT in Family Law Matters
20 March 2025
By Anthony Worrall
Artificial intelligence (AI) is becoming more common in everyday life, including legal matters. While AI tools can provide general guidance, using them as a party in a family law case in Western Australia can lead to serious risks and unintended consequences.
Inaccuracy and Misinterpretation
AI systems rely on algorithms and databases that may not reflect the latest legal precedents or local family law rules. Australian family law, particularly in Western Australia, operates under the Family Court Act 1997 (WA), which differs from the federal Family Law Act 1975 (Cth). AI-generated advice may not account for these differences, leading to incorrect or misleading information.
Lack of Human Judgment
Family law cases are highly emotional and complex, involving parenting arrangements, property settlements, and financial disputes. AI lacks the ability to understand the nuances of personal relationships, child welfare concerns, or the specific circumstances of a case. A legal professional considers human factors and legal strategy, something AI cannot replicate.
Privacy and Data Security Risks
When using AI tools, users often input personal details, financial records, and sensitive information. If the AI platform is not secure, this data may be stored, shared, or even misused. In a legal dispute, leaked or misinterpreted information can be harmful.
No Legal Standing in Court
Courts only recognise submissions from individuals, lawyers, or legal representatives. AI cannot file documents, negotiate settlements, or appear in court on your behalf. Relying on AI instead of a lawyer could result in procedural mistakes or even loss of rights.
Conclusion
While AI can assist in researching legal topics, it should never replace professional legal advice. If you are involved in a family law matter in Western Australia, consult a qualified family lawyer to ensure your rights and interests are properly protected.
…That was a bit dull, wasn’t it? That was what ChatGPT produced for me when I asked it for a 300-word article for a general audience warning against the risks of using AI as a party in a family law matter in Western Australia.
It has all the hallmarks of an AI piece: repetitive language and sentence structure, and an over-reliance on vague concepts to make a point.
However, a lack of creative flourish is not the biggest issue to be aware of when using AI as a party in a family law matter. If you are recently separated and seeking even surface-level knowledge from AI about how family law works, there is a good chance it will give you information that is plainly wrong.
Quite fittingly, the following statement under point 1 (Inaccuracy and Misinterpretation) above proves this point for me:
“Australian family law, particularly in Western Australia, operates under the Family Court Act 1997 (WA), which differs from the federal Family Law Act 1975 (Cth).”
This isn’t entirely right. De facto couples in WA are subject to the Family Court Act 1997 (WA); married parties, however, are subject to the Commonwealth Family Law Act 1975. If ChatGPT can’t even identify the correct governing legislation, how can you be confident that any other tips you’re getting are correct?
As the article itself correctly raises under point 3 (Privacy and Data Security Risks), there are significant privacy risks in providing your personal details to ChatGPT. Because OpenAI is constantly developing and training its models, ChatGPT collects the data users input and may retain that information indefinitely. As its predictive capabilities improve, it can make deductions from the information provided to it, including inferring things from what people tell it about themselves.
For example, if you ask ChatGPT to help draft a letter to your ex’s lawyer that includes information about you, the care arrangements for your children, your suburb of residence, and your children’s names and ages, ChatGPT can infer with reasonable accuracy which school your children attend and build a fuller picture of their weekly schedule.
But why does that matter? As the Office of the Victorian Information Commissioner has outlined, the concern is what can be done with this information. Because the information is inferred rather than directly provided, the line of what is protected by privacy law becomes blurred, and these private details simply may not be protected. This is particularly problematic given that OpenAI has had several data breaches in the last few years.
If that information were leaked and not subject to any data privacy protections, your children’s schooling and regular whereabouts could be collected and used by bad actors, with a range of potential consequences.
ChatGPT is a very effective and time-saving tool for a lot of things, but you should think twice before involving it in your family law matter. It may simply not give you the right information and there is no knowing how the information you provide may be used in the future.
Disclaimer
This article is not legal advice and the views and comments are of a general nature only. This article is not to be relied upon in substitution for detailed legal advice.
Please contact us if you would like to republish this article.