Ray-Ban Meta Smart Glasses: When Fashion Tech Becomes a Privacy Nightmare


[Image: Ray-Ban Meta smart glasses. Photo: cavebear42, CC BY-SA 4.0, via Wikimedia Commons]

Something uncomfortable happened on February 27, 2026. Two Swedish newspapers, Svenska Dagbladet and Göteborgs-Posten, published an investigation that made a lot of Ray-Ban Meta owners stop and think about what they’d been wearing on their faces.

The glasses sold over 7 million units in 2025. Meta marketed them as “designed for privacy, controlled by you.” Turns out, that’s not quite how things work.

What Workers in Kenya Are Actually Seeing

The investigation focused on Sama, a Nairobi-based company that Meta hired to train its AI systems. Journalists talked to more than thirty employees. All of them spoke anonymously because they were worried about losing their jobs.

What they described was not subtle.

“In some videos, you can see someone going to the toilet, or getting undressed,” one worker said. “I don’t think they know, because if they knew, they wouldn’t be recording.”

Another put it more bluntly: “We see everything—from living rooms to naked bodies. Meta has that type of content in its databases.”

The list of things contractors have watched includes bathroom footage, people changing clothes, sexual activity, credit card numbers visible in recordings, and private conversations about relationships and alleged wrongdoing. One worker described watching a man leave his glasses on a bedside table, then seeing his wife walk in and undress—completely unaware she was being recorded.

The workers draw bounding boxes around objects, assign labels, check transcriptions. Standard AI training work. Except the material isn’t stock photos. It’s other people’s lives.

The Privacy Features That Don’t Work

Meta says faces get blurred before contractors see anything. According to the workers, that’s hit or miss at best.

A former Meta employee confirmed it to the Swedish reporters: “The algorithms sometimes miss. Especially in difficult lighting conditions, certain faces and bodies become visible.”

The glasses have a small LED that’s supposed to light up when recording. In theory, people around you can see it and know what’s happening. In practice? Bright sunlight washes it out. Crowded rooms make it easy to miss. Some people just cover it up.

There’s no real opt-out either. If you want the AI features—the translations, the object identification, the hands-free assistant—you have to agree to let Meta process your footage. The AI doesn’t work offline at all. Journalists tested it.

Meta’s terms do mention that “in some cases, Meta will review your interactions with AIs, including the content of your conversations with or messages to AIs, and this review may be automated or manual (human).”

Manual. Human. Workers in Kenya watching you in your bathroom.

The Lawsuits and Investigations Piling Up

Things moved fast after the story broke.

On March 5, Clarkson Law Firm filed a class action in the Northern District of California. Plaintiffs Gina Bartone from New Jersey and Mateo Canu from California bought the glasses believing Meta’s marketing. The lawsuit also names EssilorLuxottica, Ray-Ban’s parent company.

Ryan Clarkson, the firm’s managing partner, wasn’t gentle: “Meta made a promise to millions of consumers while knowing full well it could not keep it. Workers thousands of miles away have been watching footage from inside people’s bedrooms all along. That is not a technicality or an oversight—that is a system working exactly as designed.”

The UK’s Information Commissioner’s Office announced it would contact Meta for information about data protection compliance, calling the allegations “concerning.”

In the EU, 17 Members of the European Parliament from four different political groups formally asked the European Commission whether Meta is complying with the GDPR. The problem? Kenya doesn’t have EU “adequacy” status, meaning its data protection standards haven’t been recognized as equivalent to European law. The EU and Kenya only began adequacy talks in May 2024.

In Kenya, The Oversight Lab filed a petition with the Office of the Data Protection Commissioner. Sama is already dealing with human trafficking and labor exploitation claims from former Facebook content moderators. A Kenyan court ruled Meta can be sued there. That case is moving forward.

What This Means for Fashion Tech

Smart wearables are becoming normal. That’s the whole point—the glasses look like regular Ray-Bans. You can’t tell someone is wearing a camera on their face anymore.

This is the trade-off nobody talked about clearly enough.

There’s the bystander problem. People around someone wearing these glasses have no idea they’re being filmed. That little LED isn’t doing much. There’s the cross-border data problem. Your footage can end up on a computer in Nairobi whether you’re in Stockholm or San Francisco. There’s the human review problem that Meta buried in its terms of service.

Kleanthi Sardeli, a data protection lawyer with the privacy group NOYB, called it “a clear transparency problem.” Once footage gets fed into AI models, she said, “the user in practice loses control over how it is used.”

Petter Flink, a security specialist at the Swedish data protection authority, put it simply: users have “really no idea what is happening behind the scenes.”

Meta’s Position

The company has responded with carefully drafted statements.

“Ray-Ban Meta glasses help you use AI, hands-free, to answer questions about the world around you. Unless users choose to share media they’ve captured with Meta or others, that media stays on the user’s device. When people share content with Meta AI, we sometimes use contractors to review this data for the purpose of improving people’s experience, as many other companies do. We take steps to filter this data to protect people’s privacy and to help prevent identifying information from being reviewed.”

When the Swedish journalists visited ten eyewear stores, employees often didn’t know what data the glasses transmit, where it goes, or how recordings get processed. The people selling the product couldn’t explain what it actually does with your information.

If You Own These Glasses

Check your settings. Go to Privacy, then Data Sharing. Understand what you’ve agreed to.

Know that any time you ask the AI to “look” at something, that image could end up in front of a human reviewer.

Don’t leave them in private spaces. The bedside table is the worst possible spot.

Treat them like a live camera that someone else might be watching. Because sometimes, they are.

The Uncomfortable Question

One early adopter posted on social media after the investigation came out: “The day I found out my glasses were sending video to Kenya, I stopped wearing them.”

The post got two million views.

Fashion tech is supposed to make life easier. These glasses can translate languages, identify landmarks, take hands-free photos. The features are genuinely useful.

But someone in Nairobi is watching footage of strangers in their bedrooms to make those features work. The face-blurring doesn’t always blur. The privacy promises have asterisks. The terms of service are unreadable on purpose.

Is smart fashion worth it? That depends on how much you trust the system. And whether you think your bathroom should be part of someone’s workday.

 

Sources:

  1. Svenska Dagbladet and Göteborgs-Posten Joint Investigation (February 27, 2026) — Primary source for contractor testimony
  2. TechCrunch: “Meta sued over AI smart glasses’ privacy concerns” (March 5, 2026)
  3. Clarkson Law Firm Press Release: “Meta AI Glasses Class Action Lawsuit Filed” (March 5, 2026)
  4. Euronews: “Meta faces privacy lawsuit over AI smart glasses” (March 6, 2026)
  5. Digital Watch Observatory: “EU pressures Meta over alleged smart glasses privacy breaches” (March 2026)
  6. The Register: “Meta smart glasses face UK privacy probe” (March 5, 2026)
  7. Help Net Security: “Workers reviewing Meta Ray-Ban footage encounter users’ intimate moments” (March 5, 2026)
  8. HapaKenya: “Oversight Lab petitions ODPC to probe Ray-Ban Meta glasses” (March 9, 2026)
  9. TechCabal: “Kenyan workers say Meta Ray-Ban AI glasses expose intimate moments” (March 4, 2026)
  10. Decrypt: “Inside the Ray-Ban Smart Glasses Controversy Plaguing Meta” (March 2026) — Cited for 7 million units sold
  11. Engadget: “Meta hit with a class action lawsuit over smart glasses’ privacy claims” (March 5, 2026)

 

Disclaimer: This article is for informational purposes and does not constitute legal advice.
