Wearable Tech & Privacy · Jenntelligence.ai
The device on your wrist tracks your heart, your sleep, your location, and your daily routine. The glasses on your face may be recording everyone around you — without their knowledge or consent.
7M+ pairs of Meta smart glasses sold in 2025 alone — tripling prior years combined
61 million fitness tracker records exposed in a single data breach (2021)
Under 60 seconds — the time Harvard students needed to identify any stranger using smart glasses and AI
Understanding which category of wearable you are dealing with determines which risks apply to you — and which apply to everyone around you.
The wearables market has expanded from niche fitness enthusiasts to hundreds of millions of everyday users. Smartwatches, fitness trackers, smart rings, smart glasses, and AI-enhanced earbuds now generate a continuous stream of intimate data. Most users have a reasonable sense of what they're getting in return — health insights, navigation, emergency detection. What they have a much hazier picture of is where that data goes afterward.
Two fundamentally different categories of wearable have emerged, and they carry different privacy implications. The first collects data about the person wearing the device: fitness trackers and smartwatches log your biometrics, location, and health patterns — data the wearer has at least technically consented to share, however buried in terms and conditions that consent may be. The second collects data about everyone the wearer encounters: smart glasses and AI-enhanced audio devices record the world around them. A person walking past someone wearing Meta Ray-Ban glasses has consented to nothing. They may not know they are being recorded. They have no practical recourse.
The asymmetry that matters: Health data from your fitness tracker is sensitive and its exposure can harm you. But data collected by someone else's smart glasses can affect you without your knowledge — and often without the wearer intending harm either. This guide covers both categories. The second is the more urgent emerging risk.
These are not obscure products. They are devices owned by hundreds of millions of people, with privacy implications most users have never been told about.
Meta Ray-Ban Smart Glasses
7M+ sold in 2025 · Oakley Meta also available
Cameras embedded in the frame record photos and video. Any use of AI features — including "Hey Meta, look and tell" — sends footage to Meta's servers. Voice recordings triggered by the wake word are stored in the cloud for up to a year by default, with no meaningful opt-out. In April 2025, Meta updated its privacy policy to make AI data collection the default for many features. A class action lawsuit filed March 5, 2026 (Bartone v. Meta Platforms Inc.) alleges false advertising, citing Meta's marketing language "designed for privacy, controlled by you" against documented practices. As of April 2026 the lawsuit is pending; no verdict has been issued. The UK Information Commissioner's Office opened an inquiry the same week the suit was filed.
The small LED indicator is the only signal the camera is active. It's easy to miss at conversational distance — which is the core problem.
Fitness trackers & smartwatches
Fitbit · Apple Watch · Garmin · Whoop · Oura
Continuously collect heart rate, blood oxygen, sleep patterns, menstrual cycles, skin temperature, GPS location, and movement. A 2025 peer-reviewed study published in npj Digital Medicine evaluated 17 wearable manufacturers and found that 76% received High Risk ratings for transparency reporting. Apple — widely marketed as privacy-forward — settled a $95 million class-action lawsuit in 2025 over Siri recording conversations via accidental activations on Apple Watches, without disclosing that human contractors would review those recordings. Fitbit, now owned by Google, explicitly states in its privacy policy that data may be used to deliver targeted advertising. A 2021 breach exposed 61 million fitness tracker records including Apple HealthKit data. Health data collected by consumer wearables is not protected by HIPAA.
Health data is permanent. You can change a password. You cannot change your cardiac signature, your biometric identifiers, or your menstrual cycle history. Once exposed, this data cannot be recalled.
Next-generation AI smart glasses
Halo X · Google × Warby Parker · Apple (rumored)
The Harvard students who built I-XRAY — the facial recognition demo that went viral in October 2024 — have since dropped out of Harvard to build Halo X: always-on AI smart glasses with a microphone that records every conversation. Google has announced a partnership with Warby Parker for AI-powered glasses. Apple is rumored to be developing a competing product. FCC filings from March 10, 2026 revealed two new Meta Ray-Ban models (codenamed "Scriber" and "Blazer") in development. The next generation will be harder to identify as recording devices and more capable than anything currently on the market.
The I-XRAY demo identified dozens of strangers — pulling their names, home addresses, and phone numbers — in under 60 seconds using only publicly available tools. The students never released the code, but stated the capability is not secret and bad actors already know it is possible.
Smart earbuds & AI hearing devices
AirPods · Galaxy Buds · AI-enhanced hearing aids
Modern earbuds with always-on microphones for voice assistants present continuous audio monitoring risks. AI-enhanced hearing aids — a fast-growing category — can process and transmit audio from every conversation the wearer has or overhears. The FDA's 2026 wellness device guidance clarified these devices occupy a legal gray zone: not medical devices, not standard consumer electronics, subject to inconsistent privacy rules depending on the manufacturer's country of origin.
Unlike smart glasses, earbuds have no visible recording indicator. Anyone in range of a hearing device may be recorded with no practical way to know it is happening.
I-XRAY was not a hack. It used only publicly available technology. That is precisely what made it alarming.
Harvard juniors AnhPhu Nguyen and Caine Ardayfio built I-XRAY by combining three existing systems: Meta Ray-Ban smart glasses, which livestream video to Instagram; PimEyes, a publicly available reverse facial image search engine; and large language models that cross-referenced search results with people-search databases to compile personal profiles. The result could identify any stranger — pulling their name, home address, phone number, and in some cases partial Social Security numbers — within 60 seconds of seeing their face.
The students demonstrated I-XRAY on strangers at Boston's MBTA stations without their knowledge. In one documented case, a student approached a stranger on a train, greeted them by name, and referenced their professional work — all information pulled from the glasses in real time. The demonstration video received over 20 million views. The students did not release the code.
What Nguyen said about the implications: "Some dude could just find some girl's home address on the train and just follow them home." The students were explicit that bad actors already know this capability exists. "The bad actors are already aware they can do this," said Ardayfio. The purpose of the demonstration was to raise public awareness — not to provide a blueprint.
I-XRAY used only Meta's glasses as the camera input — but Nguyen noted this was largely arbitrary. Any camera with sufficient resolution could be used. The glasses were chosen because they look like ordinary eyewear, making the demonstration more viscerally illustrative of the surveillance risk.
Meta's planned response — "Name Tag": In February 2026, the New York Times obtained internal Meta documents revealing the company's plan to add facial recognition to its Ray-Ban smart glasses under a feature internally called "Name Tag." The documents showed Meta intended to launch during a period when privacy groups were "focused on other concerns." Meta had shut down facial recognition on Facebook in 2021 after paying approximately $2 billion in biometric privacy settlements — the same technology it now plans to reintroduce on glasses worn in public.
What you can do about facial recognition databases right now: I-XRAY's identifying power depended on public people-search sites and reverse image search engines. Both can be partially mitigated. Opt out of PimEyes (pimeyes.com) and Facecheck.id directly. Remove yourself from people-search aggregators including FastPeopleSearch, Spokeo, and BeenVerified. This does not make you unidentifiable — but it meaningfully raises the cost of identifying you.
The following incidents are verified and sourced. The sources section at the end of this guide provides primary references for each.
Apple · Siri recordings on Apple Watch · human review undisclosed
Apple settled a $95 million class-action lawsuit over Siri recording conversations via accidental activations on Apple Watches and other devices — without disclosing that human contractors would review those recordings. Apple's privacy policy at the time of the recordings failed to explicitly state that audio captured by accidental activations would undergo human review. This case illustrates a recurring pattern: companies with strong public privacy reputations whose actual data handling practices diverge from their marketing. (Source: Reuters, 2025; npj Digital Medicine, 2025)
GetHealth breach · Fitbit & Apple HealthKit data exposed
Third-party health sync company GetHealth left a database of 61 million fitness tracker records unencrypted and unpassword-protected, exposing data from Fitbit and Apple HealthKit users. Exposed information included names, birthdates, weight, height, gender, and geographic location. This breach illustrates the central risk of third-party data sharing: even if the primary device manufacturer has adequate security, every third party the data is shared with is an additional vulnerability — and users typically have no visibility into how many such third parties exist. (Source: Fierce Healthcare, 2021)
Meta · contractor footage review · Swedish investigation
A joint investigation by Swedish newspapers Svenska Dagbladet and Göteborgs-Posten revealed that footage captured by Meta smart glasses — including videos of users undressing, using the toilet, engaging in sexual activity, and handling financial documents — had been reviewed by contractors at a Nairobi subcontractor. A worker told the journalists: "In some videos, you can see someone going to the toilet, or getting undressed. I don't think they know, because if they knew, they wouldn't be recording." Meta's US terms acknowledge footage "may be reviewed" by humans. The UK Information Commissioner's Office wrote to Meta demanding urgent clarification. A class action lawsuit was filed in US federal court March 5, 2026. (Sources: Svenska Dagbladet, Göteborgs-Posten, TechCrunch, ALM Corp, 2026)
Covert recording · University of San Francisco campus
The University of San Francisco issued a campus-wide warning in October 2025 after reports that a man wearing Meta Ray-Ban glasses was covertly filming women on campus. Multiple women separately told BBC News they had been filmed without consent by people wearing smart glasses in public. One woman reported having a normal conversation with a man wearing what appeared to be ordinary sunglasses, then later discovering a video of her had been posted online with nearly one million views.
Fitness app data exposing military operations and world leaders
Strava's public activity data has repeatedly exposed sensitive security information. In 2018, its global heatmap revealed US military base locations in Afghanistan and Syria, prompting a Pentagon review. In 2024–2025, Le Monde's #StravaLeaks investigation showed how Strava data could be used to predict President Macron's travel locations by identifying his security detail's running routes. In March 2025, a journalist identified former Canadian Prime Minister Justin Trudeau's jogging route at his official residence via a bodyguard's Strava profile. In March 2026, a French navy officer's uploaded run exposed the location of the aircraft carrier Charles de Gaulle in the Mediterranean. Strava notes these are user privacy setting failures — not system breaches — and has added privacy alerts over time.
Meta · Texas biometric privacy settlement
Meta paid $1.4 billion to settle a Texas biometric privacy lawsuit in 2024 related to facial recognition technology — the same system Meta had shut down on Facebook in 2021 after paying approximately $650 million in Illinois settlements. In February 2026, internal Meta documents obtained by the New York Times revealed plans to reintroduce facial recognition on its Ray-Ban glasses under a feature called "Name Tag," with launch timing chosen for a period when privacy groups were "focused on other concerns."
Smart glasses are the most visible example of a broader problem: consumer technology deployed at scale before any legal framework exists to govern it.
Recording in public is generally legal in the United States — a doctrine established long before any device existed that could record continuously, identify every face in frame within 60 seconds, and transmit that footage to a cloud server for AI training while looking like an ordinary pair of sunglasses. Courts haven't caught up because the technology moved faster than anyone expected.
Health data from consumer wearables sits in a similar gap. HIPAA covers healthcare providers, health plans, and their business associates — not consumer wearable companies. Your Apple Watch data, Fitbit sleep patterns, and Oura ring biometrics fall under each company's own privacy policy, which can change with notice. A 2025 peer-reviewed study of 17 wearable manufacturers found 76% received High Risk ratings on transparency reporting — meaning most wearable companies don't clearly explain what they collect or what happens to it.
The Illinois Biometric Information Privacy Act (BIPA) remains the most significant US law governing biometric data collection. It provides a private right of action — individuals can sue — and statutory damages of $1,000 to $5,000 per violation. The $1.4 billion Meta settlement and the $51.75 million Clearview AI settlement (2025) were both driven significantly by BIPA exposure. But BIPA applies in one state.
The EU contrast: The EU AI Act, whose prohibitions took effect in February 2025, bans real-time remote biometric identification in public spaces with narrow exceptions for law enforcement. A person wearing smart glasses and running facial recognition on passersby in most European cities is operating in violation of the Act — subject to fines of up to €35 million or 7% of global annual turnover. The same behavior in most US cities is entirely legal.
The FDA's 2026 wellness device guidance clarified that fitness trackers and smartwatches are "general wellness" devices — not medical devices — even as they collect increasingly clinical-grade biometric data. In practice, the same device that tracks your menstrual cycle, cardiac rhythms, and sleep staging faces lighter federal oversight than a blood pressure cuff sold in a pharmacy.
What is changing: The Clearview AI settlement of $51.75 million in 2025 demonstrated that BIPA enforcement at scale is possible. The Meta class action (Bartone v. Meta, March 2026) is the first major lawsuit specifically targeting smart glasses privacy practices. The UK ICO investigation of Meta is ongoing. Generation Z is pushing back culturally on smart glasses in public. Apps that detect the Bluetooth signals from Meta Ray-Bans have emerged. The regulatory picture is moving — but slowly relative to the technology.
An honest measure of what you now know — and what you can share with others.
These steps use rights and tools that already exist. None require legislation to pass first. Start with the ones most relevant to the devices you own.
Remove yourself from facial recognition databases
The databases that power tools like I-XRAY are public people-search sites and reverse image search engines — and most offer opt-out. Start with PimEyes and Facecheck.id, both of which allow free opt-outs. Then remove yourself from people-search aggregators: FastPeopleSearch, Spokeo, BeenVerified, and Whitepages. This does not make you completely unidentifiable, but it meaningfully raises the practical cost of identifying you through automated tools.
DeleteMe can automate ongoing removals from people-search sites →
Audit what your wearables share — and with whom
Open the companion app for your smartwatch or fitness tracker. Look for "Privacy," "Data Sharing," or "Connected Apps." Revoke access for any third-party apps you no longer use. Check whether your data is shared with "research partners," "health partners," or "advertising partners" — many devices opt you into these arrangements by default. On iPhone: Settings → Privacy & Security → Health to review which apps can read your health data.
Disable cloud backup of health data where possible
On iPhone: Settings → your name → iCloud → turn off Health sync if you do not specifically need cross-device health access. On Android: review Google Fit or Samsung Health cloud sharing settings. The fewer cloud copies of your health data that exist, the fewer breach points exist. Data stored only on your device cannot be exposed in a company's server breach.
If you own Meta smart glasses, disable cloud media upload
In the Meta View app: Settings → Privacy → disable "Cloud media" to prevent footage from being automatically uploaded to Meta's servers. Disable voice recordings: Settings → Hey Meta → Voice History → delete existing recordings and limit future retention. Note: disabling cloud features significantly limits AI functionality. That is the tradeoff Meta has built into the product design.
Know your rights regarding recording in private spaces
Recording in public is generally legal in the US under existing law. Recording in a private space — an office, a medical appointment, a home — without consent may be illegal depending on your state. In two-party consent states (California, Florida, Illinois, Maryland, Massachusetts, Pennsylvania, Washington, and others), all parties must consent to being recorded. If you are in a private setting and someone is wearing smart glasses, you have the legal right in many states to ask them to stop recording or to leave.
Reporters Committee recording law guide by state →
Request your data and exercise deletion rights
Under the California Consumer Privacy Act (CCPA), Colorado Privacy Act, Virginia Consumer Data Protection Act, and related state laws, you have the right to request a copy of the data a wearable company holds about you — and to request deletion. Go to each company's website and search "privacy request" or "data deletion." This right applies to residents of covered states regardless of where the company is located, and some companies honor it globally as a matter of policy.
Tighten your Strava privacy settings — and encourage others in sensitive roles to do the same
If you use Strava or similar fitness apps: set your profile to private, enable "hide start and end" on routes, and use the Flybys privacy control to prevent strangers from seeing when your route intersected with theirs. These are not default settings. If you know military personnel, security professionals, or public officials who use fitness tracking apps, the Strava cases documented in this guide are worth sharing with them directly.
Read the privacy policy before purchasing any wearable
The 2025 npj Digital Medicine study of 17 wearable manufacturers found that privacy policies averaged 6,113 words. Ask three questions before accepting: Does it share data with third parties? Can it sell or transfer data if the company is acquired? Does it retain data after you delete your account? A 2025 ZDNet analysis identified Apple, Oura, Whoop, Withings, Coros, Dexcom, and Medtronic as having the most responsible data sharing practices. Xiaomi, Wyze, and Huawei received the highest risk scores in the academic study.
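At 6,000-plus words, no one reads these policies end to end, but the three questions above can be screened for mechanically. The sketch below is illustrative only — the phrase lists and categories are my own assumptions, not an official taxonomy, and a keyword match is no substitute for reading the relevant clauses yourself.

```python
import re

# Phrases that commonly signal each of the three questions above.
# These lists are illustrative assumptions, not a definitive rubric.
RED_FLAGS = {
    "third-party sharing": ["third part", "share your", "partners"],
    "sale/transfer on acquisition": ["sell", "merger", "acquisition"],
    "retention after deletion": ["retain", "retention", "after you delete"],
}

def screen_policy(text: str) -> dict:
    """Report a policy's word count and which red-flag phrases it contains."""
    lowered = text.lower()
    hits = {
        category: [p for p in phrases if p in lowered]
        for category, phrases in RED_FLAGS.items()
    }
    return {
        "word_count": len(re.findall(r"\S+", text)),  # whitespace-split count
        "flags": {k: v for k, v in hits.items() if v},  # keep non-empty only
    }

# Hypothetical policy excerpt for demonstration.
sample = ("We may share your data with third parties and advertising partners. "
          "We retain information after you delete your account.")
report = screen_policy(sample)
print(report["word_count"])       # → 19
print(sorted(report["flags"]))    # which question categories were triggered
```

Any category the scan flags tells you where to start reading; a clean scan tells you nothing, since policies can describe the same practices in language no keyword list anticipates.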
I'm Jennifer Stivers, founder of Jenntelligence.ai, a division of MarketMind Consulting. I have a psychology degree and spent my career in marketing — at Apple, at a venture-backed startup that went public, at organizations like Coursera and GlobalEnglish. I built these guides using AI tools. The research questions, editorial decisions, and responsibility for accuracy are mine. Every claim in this guide is drawn from primary sources, investigative journalism, peer-reviewed research, or regulatory filings — all listed in the sources section.
Every fact in this guide is drawn from the sources below. Where a specific claim is disputed or contested, that is noted in the relevant module.
A note on accuracy
These guides reflect my research and editorial judgment as of the date shown. Privacy law, wearable technology, and the legal cases covered here change quickly — sometimes faster than any guide can track. I update content when I become aware of significant changes, but I cannot guarantee real-time accuracy. Pending legal cases are noted as such and should not be read as verdicts. If you find something that needs correction, I want to know. Contact me here. Links to external sources are provided for reference; I am not responsible for changes to third-party content after publication.