The Guardian - AU
Comment
Georgia van Toorn

Australians who need care because of age or disability shouldn’t be reduced to an algorithm

‘Once the domain of health professionals, these care decisions used to rely on a combination of clinical expertise and the basic human ability to recognise and respond to the needs of others.’ Photograph: Rosemary Roberts/Alamy

A quiet transformation is taking place across Australia’s care systems. Every day, hundreds of elders and people living with disability are assessed for essential supports, such as home care, mobility aids, home modifications and therapies that help them live safely with dignity at home and in their communities.

Once the domain of health professionals, these decisions used to rely on a combination of clinical expertise and the basic human ability to recognise and respond to the needs of others. Computers have neither of these qualities. Yet in the current era of AI hype and the fetishisation of all things automated, we are increasingly turning to computers for guidance on fundamentally human questions of care, vulnerability and need.

The shift towards algorithmic decision-making, as Guardian Australia revealed in February, can be seen in the new Integrated Assessment Tool (IAT), introduced on 1 November 2025 under the Albanese government’s Aged Care Act. The IAT is a rules-based algorithm that sorts aged care applicants into one of eight funding levels, determining both the amount of home care they receive and where they are placed in the queue for services.

The tool was intended to enable a faster, fairer and more consistent process for determining eligibility for subsidised aged care. It works like a computerised questionnaire, using scored questions and rules to place applicants into categories of need. The assessments are conducted face to face, but the assessor’s job is mostly limited to feeding information into the algorithm.

In this regard, the IAT mirrors tools used in the national disability insurance scheme, where human discretion is also being reduced in favour of standardised, algorithmic assessments. Across both systems there is a perverse role reversal, where the deeply human question of what it means to age or live with disability is relegated to a machine, while the professional assessor is increasingly robotised, reduced to a mere ancillary of the algorithm.

In both disability and aged care, the promise of algorithmic efficiency has been overshadowed by stories of delay, frustration and systemic neglect. Aged care clinicians and carers have described the tools as “cruel” and “inhumane”, stripping away clinical expertise and leaving elderly people with inadequate support. In one case, a South Australian woman feared losing her independence after a government assessment reduced her funding.

From the middle of this year, changes to the NDIS will mean a person can have their support needs algorithmically reclassified and supports cut with no rights to appeal against the final decision. Aged care has gone even further, removing altogether any mechanism for human override.

The risks of automation in these contexts are stark. When human judgment is removed, outcomes are determined entirely by rules and scores. If the data is incomplete, if variables fail to capture what really matters, or if factors are weighted incorrectly, the system misreads the situation, and the person is left under-supported or without support altogether.

Such algorithmically generated decisions appear fair but systematically disadvantage those whose lives cannot be neatly reduced to numbers. These are typically people with complex, fluctuating, or atypical support needs. Cultural or language barriers, limited capacity, lack of resources, or poor assessment practices can magnify the problem, resulting in a distorted or partial view of individual circumstances. What the system captures is taken as truth, even when it fails to reflect the person’s lived reality.

We need only to look to the United States and elsewhere to see where this leads. In the state of Arkansas, an algorithm was introduced to ration care for people living with severe impairment. In a system prioritising savings over care, recipients saw their support hours drastically cut by the algorithm. In a Senate committee hearing, lawmakers heard evidence of “people lying in their own waste, going without food, going without any sort of community contact”.

All public resource systems face a tension between consistent process and fair outcomes. By almost entirely removing human discretion, we risk creating systems that sacrifice nuance for the sake of consistency.

Consistency of process does not guarantee fair outcomes. In fact, it can entrench inequality and amplify harm. Excessive standardisation creates impersonal processes that overlook individual needs and the complexities of lived experience. This is called algorithmic mis-recognition: a form of moral injury we encounter when our lived experience is ignored, erased or treated as irrelevant by the very systems meant to support us.

Social services demand a different approach. They require systems that are attentive to lived experience and fairness of outcomes, which means providing the right support for a person’s unique life circumstances. Well-governed systems support human judgment and accountability, using technology in limited and safe ways to inform, not replace, decision-making.

The IAT is a warning. Care cannot be reduced to rules and scores alone. Ageing and disability are human experiences, and decisions about care have profound, life-altering consequences. When we relinquish human judgment and capacity to override automated decisions, we put lives at the mercy of a flawed system.

  • Georgia van Toorn is a senior lecturer in the School of Social Sciences at the University of New South Wales and an associate investigator at the ARC Centre of Excellence for Automated Decision-Making and Society
