Following the release of the Commonwealth Ombudsman’s report, What if the Computer is Wrong?, which warned of over-reliance on automation in visa decision-making, The Koala News asked the Department of Home Affairs (DHA) whether similar automated systems are used in the student visa program. The questions it asked were as follows:
- Is automated processing utilised in the student visa system and/or other programs, either in an onshore or offshore context? If so, for what programs and for what applicants or applicant cohorts?
- Are computer programs of any kind used to determine or influence the outcome of a Student visa application in either an onshore or offshore context?
- To what extent is a human involved in making each student visa application decision?
- To what extent is a human involved in each decision to grant a student visa?
- To what extent is a human involved in each decision to refuse a student visa?
- Are any AI systems utilised in the assessment of student visa applications? If so, can you explain their role and how they interact with human assessment?
- Noting the Ombudsman’s report, are similar automated features used in the student visa program in either an onshore or offshore context?
In response, the Department confirmed that computer-assisted processing has been a feature of Australia’s visa system for more than two decades, including within the student visa stream.
“For more than 20 years, the Department has used computer-assisted processes to assess aspects of a visa application against the relevant legislative criteria set out in the Migration Regulations 1994 and the Migration Act 1958,” a DHA spokesperson said.
“This includes decisions to grant visas for particular visa programs, including student visas.”
The Department added that while automation supports parts of the process, decisions to refuse a visa are always made by a human officer.
“A visa officer will make a decision to refuse a visa application where it does not meet the legislative criteria to be granted – this is never undertaken through computer-assisted processing,” the spokesperson said.
Human oversight, but automation informs scrutiny
The Department confirmed that automation is used to apply “risk settings” that determine the level of scrutiny applied to particular caseloads or applicant cohorts.
“The Department uses risk settings to determine the level of scrutiny to apply to a particular caseload, including the use of computer-assisted processing. These settings are closely monitored and adjusted as necessary.”
This suggests that while human case officers remain responsible for the final decision, computer systems may influence how quickly and thoroughly applications are reviewed, and whether they receive manual attention at all, particularly in low-risk or high-volume streams.
The Department did not confirm the use of artificial intelligence (AI) specifically but noted that it “routinely looks for opportunities to adjust and modernise the visa system to more readily support changes to immigration settings.”
Context: Ombudsman warns on automation pitfalls
The Ombudsman’s report, published in September, found that the Department unlawfully cancelled a visa using an automated process that did not meet the legal threshold for a valid decision. The case sparked wider questions about the role of computer-assisted decision-making in immigration operations.
The report recommended greater transparency and accountability in how automation is used, particularly where technology may influence or trigger legal outcomes that affect individuals.
In that context, the Department’s confirmation that similar computer-assisted systems are active in the student visa program will likely raise further discussion across the international education sector — especially as universities, VET institutes, ELICOS providers and agents report significant fluctuations in student visa outcomes and processing times this year.
Sector watching closely
The Department’s statement may reassure some that visa refusal decisions remain in human hands. However, the response does not clarify whether such a “decision” amounts to a human endorsing a recommendation from an AI or automated system to refuse the visa, or a human actually making an informed and reasoned decision on the merits of the application.
Stakeholders have long raised concerns about how automated risk algorithms and triage systems might disadvantage certain cohorts or source countries. While the Department of Home Affairs maintains that these settings are “closely monitored,” the criteria behind them remain undisclosed — a persistent frustration for education providers left to interpret policy shifts through changing visa outcomes rather than clear, transparent guidance.
As one university representative told The Koala News following the Ombudsman’s report:
“Automation may improve efficiency, but the system still needs to be explainable. When the computer flags risk and the outcome changes, everyone deserves to understand why.”
Transparency is the next test
With automation playing a clear role in visa processing, the next question is how much visibility applicants and institutions should have into how those systems operate.
The Ombudsman’s report and the Department’s response mark a turning point in that conversation. The challenge now will be ensuring that the balance between efficiency and accountability — between machine assistance and human judgment — remains firmly in public view.
Final word
The Department’s response confirms that automation is part of student visa decisions, but it also echoes the questions raised by Mr D’s case in the Ombudsman’s report. That decision showed what can happen when a system built for efficiency slips outside its legal bounds. While Home Affairs insists human officers make the final call, Mr D’s experience is a reminder that even “assisted” decisions can have very human consequences.
The Koala News precursor story “‘What if the Computer is Wrong?’ Ombudsman Warns of DHA Automation Pitfalls” can be seen here.