Use of automated systems – applying for jobs, completing tax returns, ordering tickets – is widespread, particularly by government agencies. For information gathering and routine administrative functions, there are undoubted benefits. But when decisions are made by an automated or expert system, there are lurking administrative law dangers.
Traditionally, decision-making has been undertaken by humans. The mechanisms for challenging government decisions are predicated on that assumption, and our administrative law institutions are built on the same expectation. These expectations need adaptation to navigate the artificial intelligence (AI) landscape.
Risks arise because automated systems:
- do not easily translate complex legislation or principles derived from case law;
- may not be programmed to include rules from different parts of an Act, or from generic legislation (eg interpretation Acts, FOI Acts or competition laws), or to capture courts’ and tribunals’ interpretation of legislation;
- may fail to take account of unwritten common law rules, such as statutory presumptions, or of context, such as beneficial intent; and
- may, through their business rules, unlawfully fetter discretionary powers.
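As a purely illustrative sketch of the last risk (the rule, threshold and names here are invented, not drawn from any real system), hard-coding a policy test into business rules can fetter a discretion that the Act intends a decision-maker to exercise case by case:

```python
# Hypothetical example: suppose an Act gives a delegate a discretion to
# waive a debt "in special circumstances", but the business rule reduces
# that open-ended discretion to a single hard-coded test.

def waive_debt(debt_amount: float, is_pensioner: bool) -> bool:
    # Fettered: policy says waive only small pensioner debts, so the
    # system can never consider any other "special circumstances".
    return is_pensioner and debt_amount < 200

# A non-pensioner facing genuine hardship is refused automatically,
# without a human ever turning their mind to the statutory discretion.
```

The administrative law problem is not the code itself but the design choice: the discretion has been exhausted in advance by policy, which is a classic ground of review.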
Risks may trigger administrative law grounds
Issues which have arisen include:
- Is it a ‘decision’ if it was not made by the Secretary, a delegate or an agent (the ‘black box’ problem)?
- Has there been a breach of procedural fairness if there was no explicit mental engagement by the decision-maker, or if the citizen was not given an opportunity to explain discrepancies (eg Robodebt)?
- Has the system identified the correct question? Applied the correct principle of law? Unnecessarily fettered a discretionary power in order to comply with policy? Breached reasons requirements?

Strategies to mitigate risks
Ways to avoid adverse consequences are listed in the ARC’s Automated Assistance in Administrative Decision Making: Better Practice Guide (2007).
The guide recommends:
- do not automate the exercise of discretion;
- pre-test systems on users;
- ensure agency-specific legislation deems decisions made by expert systems to be lawful;
- include systems expertise, as well as legal and policy experience, in design teams;
- train officers to explain automated decisions and to make decisions manually when the system malfunctions;
- review, update and maintain systems regularly;
- build in reasons provisions;
- ensure systems are FOI- and privacy-compliant; and
- ensure decisions are legible to those affected (cf the EU General Data Protection Regulation).
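Several of these recommendations can be sketched in code. The following is a minimal, hypothetical illustration (all names, thresholds and criteria are invented): automate only the non-discretionary eligibility test, refer any discretionary matter to an officer, and record a statement of reasons for every outcome.

```python
# Hypothetical sketch of the ARC guide's recommendations: the system
# applies a fixed statutory test, never exercises discretion itself,
# and builds reasons into every decision it produces.

from dataclasses import dataclass, field

@dataclass
class Outcome:
    decision: str                      # "granted", "refused" or "refer_to_officer"
    reasons: list = field(default_factory=list)  # built-in reasons provision

def assess_claim(income: float, special_circumstances: bool) -> Outcome:
    if special_circumstances:
        # Discretion is not automated: a human delegate decides.
        return Outcome("refer_to_officer",
                       ["Special circumstances claimed: referred to a delegate."])
    if income <= 30_000:  # illustrative, invented statutory income test
        return Outcome("granted",
                       ["Income within the statutory threshold: criteria met."])
    return Outcome("refused",
                   ["Income exceeds the statutory threshold: criteria not met."])
```

Because every path returns reasons, officers can explain the decision to the citizen, and the referral path keeps the discretionary power in human hands rather than fettering it in code.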
Following these recommendations will help protect a system against administrative law challenges and ensure that the advantages of automation as a useful tool of government are realised.