Crikey
Cam Wilson

This Australian firm says its AI product can predict suicide attempts in prison

An Australian company claims it has developed artificial intelligence that can detect suicide attempts in prisons after training on real footage of incidents — and prisons around the world are already using the technology.

iOmniscient was one of dozens of companies exhibiting at the Security Exhibition & Conference in Sydney this week. Companies showcased new uses of artificial intelligence, facial recognition, licence plate scanners and robotics as surveillance methods that promise to keep people, businesses and their property safe. 

iQ-Prison is iOmniscient’s surveillance product for prisons that, according to the company, can do everything from “detecting aggressive behaviours” like fighting to “track inmates’ movements with facial analytics” by using a combination of CCTV and artificial intelligence technology. 

A brochure promoting iOmniscient’s prison artificial intelligence CCTV product (Image: iOmniscient)

A promise to detect persons “attempting to commit suicide” stands out as its most unusual claim. A salesperson told Crikey the technology had been trained on footage “of the 30 seconds leading up to attempts” so it could recognise patterns. A 2020 brochure promoting the feature says Hong Kong prisons have implemented the system through the company’s subsidiary Wildfaces.

iOmniscient’s iQ-Prison is one of the company’s many offerings to industries including policing, schools, healthcare and transport — among them Sydney’s driverless trains project, according to its promotional material. 

The company has courted controversy over the uses of its technology in the past. In 2017, Toowoomba Council faced a backlash for a trial using its products. A 2016 report found the company was selling surveillance technology to the Bahrain government as it battled mass demonstrations. In 2019, Bloomberg reported that there were suspicions the technology was being used by Hong Kong police — who had already used the company’s products for years — to crack down on the pro-democracy protesters. 

An iOmniscient salesperson brought up the Bloomberg reporting while talking about the company’s media coverage in the past.

“[The media] blamed us for the technology being used to track protesters. The police had purchased it to track lost children,” he said. “What can we do? It’s just a tool.”

New applications of artificial intelligence for surveillance were a trend at the show, with a number of companies offering products. The Artificial Intelligence Group spruiked its ability to create a “custom algorithm” to fit customers’ needs. Examples offered by the company include monitoring staff productivity, theft and even “abnormal human behaviour”.

“Think of it as using your existing CCTV system as eyes and including our systems as a superhuman brain, recognising patterns and reporting abnormal events,” its promotional material says.

Another company, Network Optix, showed off how its software can be used as a backend to support AI in recognising objects such as mobile phones or guns in real-time video footage.

Former human rights commissioner and UTS Professor Ed Santow recommends taking claims from companies about artificial intelligence and facial recognition with a grain of salt. 

When dealing with claims for more obscure uses of the technology like determining someone’s emotion through facial recognition, Santow disputes their effectiveness: “It’s largely junk science; it literally doesn’t work.”

Licence plate scanners are another popular form of technology promoted by security companies. This technology can turn a camera into a device that records a vehicle’s type, model and colour, its licence plate and state of registration, the time it was seen, and even which direction it was headed. While heavily advertised to police — who’ve used the technology in Australia for nearly a decade — individuals and businesses are also now target markets for licence plate scanners.

“Engineered for roadway speeds, the system can be deployed in neighbourhoods, campuses, business districts,” one company’s promotional material said. “Help keep your community safer, easily and affordably,” read another. 

Other technologies promoted at the expo include high-tech safes, RFID scanners, body cameras, thermal fire detection cameras, biometric employee management systems and even robotics. 

KABAM Robotics’ Co-Lab Indoor Security Robot at the Security Expo (Image: Private Media)

KABAM Robotics’ Co-Lab Indoor Security Robot patrolled the expo floor throughout the day. Roughly five feet tall, the robot used 360-degree camera vision and Lidar (similar to radar, but using light rather than radio waves) to navigate its way around the company’s booth at a slow pace, not unlike a Roomba. 

The company sells it as a way to constantly monitor spaces, seemingly in lieu of security staff: “Its concierge capabilities and PA System and Siren Alert functions make Co-Lab your perfect surveillance partner and an essential member of your security team.”

With AI keeping an eye on you, do you feel safer… or scared? Let us know your thoughts by writing to letters@crikey.com.au. Please include your full name to be considered for publication. We reserve the right to edit for length and clarity.
