Bipartisan committee members are among many who are concerned about the federal government's growing reliance on computers for decision-making. They fear that automation may compromise crucial protections that human judgment offers.
A Senate committee has questioned moves by the home affairs minister, Clare O'Neil, and the agriculture minister, Murray Watt, to expand the use of automated decision-making in immigration and biosecurity. The committee is urging the government to heed the findings of the robodebt royal commission and the commonwealth ombudsman's artificial intelligence guidelines.
Both ministers have issued new regulations to devolve some decisions normally made by them and their officials to AI computer programs. Also known as delegated legislation, regulations enable ministers to extend the reach of existing laws by decree, bypassing a parliamentary vote.
In its latest monitoring report published last week, the committee for the scrutiny of delegated legislation warned the moves could impede the powers of ministerial discretion designed to guard against one-size-fits-all decision-making.
“The committee considers that the use of an automated decision-making process may operate as a fetter on discretionary power by inflexibly applying predetermined criteria to decisions that should be made on the merits of each individual case,” it said.
The immigration regulation relates to existing national security restrictions on certain visa-holders undertaking courses of study into critical technology. Currently, the minister has legal discretion in assessing exemption applications. The new regulation allows a computer to adjudicate instead.
Another regulation allows AI to determine when those in charge of – or suspected of having knowledge about – vessels, aircraft or other “conveyances” that enter Australian waters and are red-flagged on biosecurity grounds can be compelled to hand over documents or other information.
A separate Senate committee has also previously raised concerns about the introduction of automated decision-making across government, including the Treasury’s move last year to allow the use of AI in assessing applications for registration as a financial adviser.
Another immigration measure introduced automation into the process for assessing applications under the Pacific visa scheme.
On the new migration regulation, the committee asked O’Neil which aspects of the exemption decisions the computer program would decide, and how and where discretion could still be applied. It wanted more detail on what will inform the program’s decisions, why automation is considered “necessary and appropriate”, what safeguards are in place to ensure the powers of ministerial discretion are still exercised, and how a merits-review process will work.
In response to questions previously raised by the committee about the biosecurity regulation, Watt said the government was “considering opportunities for legislative reform” arising from the robodebt royal commission.
He allayed some of the committee’s concerns, but it has asked for more information to be added to the regulation’s explanatory memorandum. It also pointed out some errors in the memorandum. Watt undertook to have them corrected.
The committee pointed to the commonwealth ombudsman’s 2019 guidelines on automated decision-making, which said the existence of discretionary powers does not preclude using automation, but that programs must properly reflect them.
The guidelines said discretionary powers were important because they were “a tool to avoid unfair or unjust outcomes” that ensures legislation “is sufficiently flexible”.
“Agencies should be particularly careful that the system does not constrain the decision-maker in exercising any discretion he or she has been given (under relevant legislation, policy or procedure) or lead to a failure to consider relevant matters which are expressly or impliedly required by the statute,” it warned.
The acting committee chair, the Liberal senator Paul Scarr, said automation was “becoming a recurring issue”.
“We’re having to repeatedly raise the same scrutiny concerns,” Scarr told Guardian Australia. “The more substantial the decision, the greater impact it has on people’s rights and liberties and the more concerned we are.”