Craig Sellars

Understanding the privacy risks of wearable tech

(Image: Litelok Gold Wearable)

We’re all already cyborgs. Whether you’re wearing glasses, sensing the dimensions of your car as you’re parking, or even “feeling” the texture of your food as you skewer it with a fork, you are directly experiencing the mind’s ability to extend itself and enmesh itself with technology – no cyberpunk implants required. Because we’re cyborgs, we have a natural inclination to pull tools and technology closer to ourselves, which has played out in the rise of everything from the Walkman to wearable fitness trackers to, most recently, the Apple Vision Pro.

But today’s smart devices do more than just expand our minds. Sitting as they do at the intersection of the physical and digital worlds, they generate data that combine to describe almost every aspect of a person’s online and offline life, creating precisely the kind of comprehensive asset that hungry advertisers, governments, and even career cybercriminals dream of. Wearable technology has vast promise, but it comes with unprecedented risks to our privacy. Only by understanding the dimensions of those risks can we preserve the benefits of wearables while promoting safety and transparency.

Wearable smart devices make us vulnerable by collecting health-related information, biometric data, location data, and more, while often commingling those data points, somewhere along the line, with more ordinary digital information such as contact details, purchase history, and browsing behavior. The hybrid nature of the data and devices ensures that the risks they pose are just as multifaceted, but it’s possible to think of them in three broad categories:

1. Technological

Technological vulnerabilities are typically most people’s first thought when assessing the potential dangers of new tech. It’s a good instinct – in many respects hardware and software security constitute the first and last line of defense for user data. It’s therefore necessary to pay close attention to the attack surface that wearables present, as it’s larger and more complex than one might expect.

First, there’s the device itself, which could be vulnerable to proximity-based attacks via however it communicates with the outside world. From there, wearables might transmit data to an intermediary device like a smartphone, which itself is vulnerable, after which the data makes its way to permanent, centralized storage on proprietary servers. Each step in this process can be attacked in creative ways via hardware and software, and it’s increasingly likely that bad actors will try due to the richness of the target. Thankfully, standards such as Bluetooth and WiFi have robust security mechanisms to prevent most such attacks, but they’re not perfect. Healthcare data breaches more than doubled from 2013 to 2023, and it’s likely that this trend will be reflected in healthcare-adjacent data, too.

2. Regulatory

As is so often the case, privacy protections have failed to keep pace with advancements in technology, and what protections do exist are piecemeal and surprisingly narrow. Most Americans have a vague sense that their healthcare data is protected (and they’re vaguely correct), while an informed minority know the Health Insurance Portability and Accountability Act (HIPAA) exists to safeguard healthcare data. What’s not commonly understood is that HIPAA applies only to healthcare providers and insurers, and only for personally identifiable records used by those entities. This means there’s a potential regulatory distinction between healthcare data and biometric data produced by a fitness tracker even if the data point being tracked is identical.

Generalized privacy regulations attempt to fill in this gap, but they’re mostly case-by-case. While the EU has one standard (GDPR), as does Canada (PIPEDA), the United States has a state-by-state patchwork of uneven regulation that remains difficult to navigate. The Federal Trade Commission has also tried to backstop health data privacy, citing both GoodRx and BetterHelp in 2023 alone. Absent more specific privacy protections, however, this type of enforcement will necessarily come after privacy has been violated, and almost always on the basis of “deceptive business practices” rather than due to inherent biometric data safeguards.

3. Educational

Just as regulators trail technology, so too does consumer understanding of what’s being tracked, how data can be used, and by whom, all of which are necessary to give informed consent. First and foremost, people need to get into the habit of thinking about everything on their wearables as potentially valuable data. Your daily step count, your heart rate, your sleep quality – all of the fun and useful insights your wearables generate – begin painting a comprehensive picture of you that can seriously erode individual privacy, and it’s all above-board.

This kind of data tracking becomes even more impactful when you think about today’s most powerful devices. The Apple Vision Pro by default knows where you are, what you’re browsing, the features of your environment, and even where you’re looking and how you move your body. So much data aggregation allows for deep, profound inferences about individuals that could be used (and, one hopes, not misused) in ways ranging from anodyne to alarming: more targeted ads based on implied preferences; increased insurance premiums due to lifestyle choices or poor treatment compliance; hacker groups revealing someone’s house is empty in real time; the list goes on.

Data rollups

Data rollups aren't confined to devices as powerful as the Apple Vision Pro, either. Consumers need to be made aware of how big tech companies can connect their individual dots across multiple devices and services. For example, consumers are broadly aware Google has location data from Android phones along with search and browsing history, but fewer know that Google acquired Fitbit in 2021, thereby making all Fitbit-generated data a de facto part of the Google ecosystem. There’s nothing intrinsically wrong with this, but consumers require an ecosystem-level understanding of the entities controlling their data to make informed choices.

None of this is to say that the situation is beyond repair. In fact, we have to fix it so that we can safely enjoy the benefits of life-changing technology. In order to do that we need solutions that are as comprehensive as the problems. After all, privacy is a team sport.

Comprehensive security

First, we need to embrace more comprehensive security and default encryption at every step, on every device, for all data. Blockchains have much to offer in terms of restricting device access, securing data, and leveraging decentralized infrastructure in order to reduce the honeypot effect of vast data troves.
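What “encryption at every step” might look like can be sketched in a few lines. The example below is a minimal illustration, not a description of any vendor’s actual pipeline: a device seals each sensor reading with an authentication tag before it ever leaves the wearable, so tampering in transit is detectable. It uses the Python standard library’s HMAC for brevity; a real device would use authenticated encryption such as AES-GCM, which also hides the payload’s contents.

```python
import hashlib
import hmac
import json
import secrets

def seal_reading(reading: dict, key: bytes) -> dict:
    """Attach an HMAC-SHA256 tag to a sensor reading before transmission.
    (Illustrative only: production devices would use authenticated
    encryption, e.g. AES-GCM, to provide confidentiality as well.)"""
    payload = json.dumps(reading, sort_keys=True)
    tag = hmac.new(key, payload.encode(), hashlib.sha256).hexdigest()
    return {"payload": payload, "tag": tag}

def verify_reading(sealed: dict, key: bytes) -> bool:
    """Recompute the tag and compare in constant time."""
    expected = hmac.new(key, sealed["payload"].encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, sealed["tag"])

# A device-specific key, provisioned at pairing time (hypothetical flow).
key = secrets.token_bytes(32)
sealed = seal_reading({"steps": 8432, "hr_bpm": 61}, key)
```

The point of the sketch is the default: every hop, from wrist to phone to server, should refuse unauthenticated, unencrypted data rather than treat protection as an opt-in feature.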

Second, as noted above, informed visibility is a strict prerequisite for informed consent, so consumers must demand – and privacy-conscious companies must embrace – absolute transparency in terms of what data is collected, how it’s used (and by whom), and what other data it might be commingled with. It’s even possible to envision a world in which companies disclose the information they’re looking to derive based on the data points they’re aggregating, and consumers in turn have the ability to accept or reject the proposition.
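One way to make that transparency concrete is a machine-readable disclosure that answers questions like “who sees my location?” directly. The manifest format below is entirely hypothetical, a sketch of the kind of artifact a vendor could publish, not an existing standard:

```python
# Hypothetical disclosure manifest a wearable vendor could publish.
# Field names and categories are illustrative, not from any real spec.
MANIFEST = {
    "collected": {
        "heart_rate": {"shared_with": ["analytics"], "retention_days": 365},
        "location": {"shared_with": ["advertising"], "retention_days": 90},
    },
    # Inferences the vendor intends to derive from the raw data points.
    "derived": ["sleep quality", "commute pattern"],
}

def shared_with(manifest: dict, category: str) -> list:
    """Answer 'who sees this data?' straight from the disclosure."""
    return manifest["collected"][category]["shared_with"]
```

Because such a manifest is structured rather than buried in a privacy policy, a consumer (or a watchdog tool acting on their behalf) could compare it across vendors, or reject a device whose declared inferences go further than they’re comfortable with.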

That leads us to the final piece, which is nuanced control of one’s data. Among its many flaws, the standard model of extracting data from users generally presents them with binary choices: consent and participation, or opting out entirely. Consumers need finer-grained control over what data they share and for what purposes it may be used rather than being strong-armed into an all-or-nothing model. Once again, they should demand it, and privacy-conscious companies can earn immense goodwill by giving it to them.
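The difference between all-or-nothing consent and fine-grained control is easy to show in code. The sketch below, a toy model under assumed category and purpose names, grants consent per data category and per purpose, and drops anything the user hasn’t approved before it leaves the device:

```python
from dataclasses import dataclass, field

@dataclass
class ConsentPolicy:
    """Per-category, per-purpose grants instead of a single opt-in toggle.
    Example: {"steps": {"health_insights"}} shares step counts for health
    insights only, and nothing else for anything else."""
    grants: dict = field(default_factory=dict)

    def allows(self, category: str, purpose: str) -> bool:
        return purpose in self.grants.get(category, set())

def filter_outbound(readings: dict, policy: ConsentPolicy, purpose: str) -> dict:
    """Drop every data point the user hasn't approved for this purpose."""
    return {k: v for k, v in readings.items() if policy.allows(k, purpose)}

# Hypothetical user: shares steps for health insights, location for nothing.
policy = ConsentPolicy({"steps": {"health_insights"}, "location": set()})
readings = {"steps": 8432, "location": (51.5, -0.1)}
```

Under this model, the same reading can flow for one purpose and be withheld for another, which is exactly the nuance the binary consent screens of today’s devices fail to offer.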

Ultimately there’s nothing to be gained by assuming that privacy is doomed to become a quaint notion from a bygone era. We should instead be hopeful that we can find a balance between the unprecedented benefits of wearable technology and the risks it poses to privacy, but we can’t afford to wait around for regulators. Instead, it’s incumbent upon everyday people to educate themselves on threats to their privacy and speak up not just in favor of better regulation, but in defense of their right to own and control the data they’re creating. If enough people say it, the industry has to listen.


This article was produced as part of TechRadarPro's Expert Insights channel, where we feature the best and brightest minds in the technology industry today. The views expressed here are those of the author and are not necessarily those of TechRadarPro or Future plc.
