The Guardian - AU
Josh Taylor, Technology reporter

From zero to neo-Nazis: what under-16s may see under Australia’s social media ban, simply by not logging in

Examples of content presented to logged-out users of TikTok and YouTube include an AI-generated video branded with the gambling company Stake’s logo (L) and anti-immigration rally videos. Composite: Guardian design/TikTok

It was a news report related to Charlie Kirk’s assassination that flicked the switch.

In an experiment over a week in September, on an iPhone that had been wiped and factory reset, YouTube Shorts and TikTok were accessed without logging into either app. Both platforms allow users to endlessly scroll short-form videos without needing to log in.

On TikTok, the videos were initially a chaotic mix of non-English clips and AI slop, including one carrying the branding of the gambling company Stake. But once the Kirk video was presented unprompted and viewed, the algorithm began shifting the type of content that appeared. The feed became dominated by videos praising the neo-Nazi Thomas Sewell and the anti-immigration rallies, anti-Anthony Albanese content – including an AI-generated video of the prime minister being chased by trucks – and even a Port Arthur conspiracy theory video.

The process was replicated twice more by resetting the recommendation algorithm in the TikTok app.

On YouTube Shorts, a similar process resulted in a flood of videos related to K-Pop Demon Hunters, but when a news report on Kirk’s death was presented, the algorithm shifted to include much more rightwing and anti-immigration content, videos of violent public incidents, and footage of what appeared to be a hostage incident, with someone held at gunpoint.

Under Australia’s forthcoming social media ban, platforms will be expected to remove the accounts of users under 16 by 10 December, and prevent them from registering new accounts.

Instagram and X require users to log in to be able to access their algorithmic feeds. But YouTube and TikTok do not, and the government has made clear children will still be able to access platforms without logging in.

In a speech to the National Press Club in June, the eSafety commissioner, Julie Inman Grant, said the ban was necessary to combat what she called the “darker side” of social media, which included “algorithmic manipulation, predatory design features such as streaks, constant notifications and endless scroll to encourage compulsive usage, as well as exposure to increasingly graphic and violent online content”.

In guidelines released last week for companies subject to the under-16s ban, platforms deemed “higher-risk” – because they use algorithmic content recommendations or other features considered harmful to children – were told they would need to employ “more robust measures” to prevent age-restricted users from having an account.

But the test of TikTok and YouTube illustrates that children may still be exposed to gambling content, violent images, far-right material and conspiracy theories simply by not logging in to the sites.


The platforms tested do implement some safety checks. Some adult content could not be viewed without logging in, and when searching for terms around body image issues, both TikTok and YouTube returned no results and highlighted contact information for the Butterfly Foundation.

A spokesperson for the eSafety commissioner confirmed the logged-out experience was not subject to the ban.

“Under Australia’s social media minimum age legislation, which takes effect from 10 December 2025, age-restricted social media platforms will be required to take reasonable steps to ensure Australians under 16 do not have or maintain accounts,” the spokesperson said.

“The law does not prevent under 16s from accessing or viewing content without an account.”

The spokesperson said recommendation features may still operate but the platforms will be expected to not “undermine the intent of the law or expose young Australians to harmful or age-inappropriate content”.

“We will continue to monitor platform approaches closely and consider regulatory action where appropriate,” they said.

The spokesperson said the other codes established under the Online Safety Act would also apply to the platforms from December to keep children from being exposed to pornography and other high-impact material.

Nicholas Carah, an associate professor in digital media at the University of Queensland, said the business model of video-based sites such as TikTok and YouTube might be relatively unimpeded by the social media ban.

Under-16s would not be able to comment on the videos, but would be able to share them to other platforms not covered by the ban, such as messaging apps, he said.

“Young people will just use it in a logged-out state. They’ll find that the algorithm quickly adjusts to their interests, and there will be no change to consumption of media on those two platforms.”

He said logged-in accounts allowed platforms to filter the inappropriate content a user sees based on their age or interests, but that was less possible for a user who was logged out.

“The logged-out state is a pretty big loophole,” he said.

“I think if you’re YouTube, you have a very clear commercial interest in using algorithmic recommendation in both the logged-in and logged-out state.”

Guardian Australia approached TikTok and YouTube for comment.
