The Guardian - AU
Melissa Davey, Medical editor

Algorithm-based tool for home support funding is cruel and inhumane, Australian aged care workers warn

‘The government valued the algorithm more than people with skills’ … Clinicians and carers say an algorithm-based aged care assessment tool is failing elderly Australians. Photograph: Rosemary Roberts/Alamy

Aged care clinicians and carers say an algorithm-based assessment tool that determines federal home support funding packages is “cruel” and “inhumane”, stripping away clinical expertise and leaving elderly people with inadequate support.

The integrated assessment tool (IAT), introduced in November, is used across aged care to determine eligibility and classification for services, including residential care.

Mark Aitken, a registered nurse for 39 years who spent 16 years in aged care roles including assessing elderly people for support and funding, said he quit his job in regional Victoria just four months into using the tool.

The way the IAT assesses home support eligibility has become a central concern, with the government’s IAT user guide showing the tool generates a classification of need that must be accepted by assessors to secure support.

There are only limited circumstances in which the decision of the IAT can be overridden by assessors, and these do not include disagreeing with the classification of need generated.


“We weren’t allowed to use it (the override button), and even my manager, who had 25 years of experience in aged care assessment, wasn’t allowed to use it,” Aitken said.

“There was no ability for anyone to say: ‘The algorithm has it wrong, we need a human to adjust this’.”

He gave the example of an elderly woman who was living with family and had good cognitive skills; he assessed her as well supported and would have recommended a “decent” mid-range level of home assistance.

The IAT instead classified her at a much higher level of need with high priority.

Aitken compared this with an assessment he conducted for a woman in her 70s with advanced dementia, who he believed was experiencing neglect and who had no services in place.

He assessed her as high risk and in urgent need of substantial support, but the algorithm classified her as lower need with no priority, potentially leaving her waiting up to another year.

“Eight times out of 10, the outcome was different to one that I would have recommended, or my colleagues would have recommended,” Aitken said.

It follows previous controversies over automated decision-making tools being used by the government, including the robodebt welfare scandal, and concerns about algorithm-driven disability funding through the NDIS.

The IAT user guide does not explain how the algorithm weighs risk, need or complexity, and Aitken said this information was never revealed to assessors.


When he asked at a government seminar about the evaluation framework, including what data was being collected, how accuracy would be assessed, and whether results would be publicly reported, he said he felt “shut down”.

“I left my job because I didn’t want to be part of a system that removed the ultimate decision-making about support from real, experienced people who care,” he said.

“The government valued the algorithm more than people with skills, intelligence and knowledge.”

He said some assessors began “gaming” the system, inputting information they knew would generate the level of care the person needed even if that information did not accurately reflect their situation.

“People shouldn’t have to put in fake information,” Aitken said. “I just started to feel like it was going to be another robodebt, I became very uncomfortable, and just felt the tool wasn’t ethical.”

Independent MP Dr Monique Ryan questioned the IAT algorithm during question time and was told by the minister for aged care and seniors, Sam Rae: “The IAT classification algorithm does not replace assessor input”.

Ryan told Guardian Australia that Rae’s response “misses the point”.

“The input might be right, but the algorithmic output can be entirely wrong.”

Ryan said she is increasingly hearing from constituents concerned that the IAT “is stripping the sector of clinical judgement and nuance”.

Ryan said while earlier policy guidance permitted aged care assessors to override the algorithm, “This is no longer the case”.

She said she is concerned by a lack of transparency “on how this tool weighs and balances vulnerabilities, complexities and other factors, and limited information on exactly how the tool was evaluated before it was rolled out”.

She described the IAT as “effectively robo-aged-care”.

Linda Nicholson, a Queensland support coordinator, said she recently helped a client on a level 3 care package to undergo an assessment due to clear evidence of escalating needs.

Her client lives alone in a remote area, has severe incontinence, cognitive decline and is at high risk of falls, with no family nearby.

Her client was denied an upgrade in support in January, after being assessed using the IAT.

“We were all shocked, including the assessor,” Nicholson said.

When Nicholson asked how to appeal the decision, she was told to write a letter and that she might receive a response in 90 days.

“Aged care assessors with years of experience are having their professional assessments overridden by a rigid algorithmic system that doesn’t account for individual complexity,” Nicholson said.

“This algorithm is inhumane; it’s a debacle, and it’s just cruel.”

A spokesperson for the Department of Health, Disability and Ageing said the algorithm “consolidates key information about an older person’s needs across several domains, providing a holistic view of their health and circumstances”.

  • Do you know more? Email melissa.davey@theguardian.com
