Meta has been leading the way with smart glasses technology since launching its Ray-Ban model in 2021. Now the company may have plans to introduce facial recognition to its AI-enabled glasses, according to an internal memo leaked to The New York Times.
Meanwhile, other industry players including Google, Apple, Samsung and Chinese tech giants Alibaba and Xiaomi are also investing in a global market projected to reach US$8.2 billion by 2030.
As a member of sex worker communities, and a researcher studying the intersection of sex work, law and technological surveillance, I am concerned about the integration of AI-enabled smart glasses with facial recognition technologies.
I am especially worried about the impact on sex workers and other vulnerable members of our population.
Photos and video without consent
Meta sold more than seven million pairs of smart glasses during 2025 and research analysts predict up to 20 million pairs of smart glasses will be sold globally this year.
Read more: Is someone watching you? Facial recognition tech is here and Canada offers little privacy protection
These glasses already enable wearers to take photos and videos in public and private spaces without consent. In 2025, a number of Instagram accounts uploaded videos of men entering massage parlours and soliciting women for sexual services. The videos, uploaded without the women’s knowledge or consent, garnered millions of views.
There have also been more recent media reports from around the world of women being filmed without consent by wearers of these glasses.
In theory, smart glasses have a warning light in the frame that indicates when the wearer is filming. But research shows that a single flashing LED is insufficient to alert bystanders that they are being recorded. The safety measure is also easily defeated by covers or device alterations that disable the light altogether.
Elevated risks for sex workers
Sex workers are particularly at risk of non-consensual filming while at work. The consequences of this can include being outed to friends and family as a sex worker, blackmail and job loss. This can lead to loss of health, financial and housing services. Covert filming can also lead to stalking, abuse and violence.
Migrant sex workers are likely at the greatest risk because they could face deportation for violating the terms of their immigration status.
What difference does facial recognition make?
These risks are amplified by the possibility of facial recognition being integrated into Meta smart glasses. Facial recognition could theoretically allow Meta Glasses wearers to access information about people who enter their line of vision using a potential feature called Name Tag.
Many sex workers use social media, just like everyone else. Some use it for work, while others only use it to connect with friends and family. Some use it for both. The issue with facial recognition is that sex workers’ efforts to remain anonymous could be easily trampled, especially if Meta explores options to identify people via Facebook and Instagram accounts.
While Meta will likely include safety features if it does pursue the integration of facial recognition into smart glasses, recent technological developments have shown that AI often behaves outside the scope of what it was designed to do.
Safety mechanisms do not always protect people from becoming victims of sexualized deepfakes, including pornographic images of children.
AI and facial recognition technologies have also been found to exhibit racial biases, enabling increased racism, misidentification and the wrongful arrest of people of colour. These biases arise because AI systems learn from deeply flawed data rooted in systemic racism.
Read more: Grok fallout: Tech giants must be held accountable for technology-assisted gender-based violence
Tools of resistance
Sex workers’ rights are intimately tied to women’s and children’s rights. When a system or technology threatens the safety and well-being of sex workers, it also threatens women and children. Predatory technologies that allow vulnerable people to be secretly recorded and have their identities revealed without consent will inevitably lead to harm.
One way we can protect ourselves is through an app called Nearby Glasses, which allows users to scan their vicinity for Bluetooth signals from smart glasses with camera functionality. It alerts users to possible recording and identifies the manufacturer of any smart glasses detected. The developer designed the app as a form of resistance to expanding surveillance technology.
Is mass surveillance inevitable?
Legal scholar Woodrow Hartzog and his colleagues have characterized facial recognition technology as “the most dangerous surveillance tool ever invented,” posing unique threats to “privacy, civil liberties, human flourishing and democracy.” They speak of a slippery slope towards inevitable mass surveillance.
Meanwhile, Meta CEO Mark Zuckerberg recently declared: “It’s hard to imagine a world in several years where most glasses that people wear aren’t AI glasses.”
To protect the anonymity of sex workers and other vulnerable persons, it is imperative that we speak up and raise awareness of the consequences of such a world.
Brynn Colledge does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.
This article was originally published on The Conversation. Read the original article.