
On March 31, 2026, the Office of the Data Protection Commissioner (ODPC) revealed it had launched suo moto investigations into Ray-Ban Meta smart glasses and the handling of personally identifiable information for Meta AI training purposes. The confirmation came after The Oversight Lab submitted a petition on March 6 that gathered over 150 signatures. In the same week, a Los Angeles jury found Meta and Google liable for designing addictive social media platforms and awarded $6 million in damages. A day earlier, a New Mexico jury ordered Meta to pay $375 million for concealing its knowledge of child exploitation on its platforms. Three jurisdictions; three proceedings; one entity.
Let's begin with the events in Nairobi. A joint investigation published on February 27 by two Swedish newspapers, Svenska Dagbladet and Göteborgs-Posten, together with investigative journalist Naipanoi Lepapa, revealed that footage captured by Ray-Ban Meta glasses around the world was being routed to Samasource Kenya EPZ Limited (Sama), a Meta subcontractor in Nairobi, for annotation: the labelling of data used to train machine learning models. Sama employees in Kenya were reviewing first-person videos of customers using restrooms, undressing, having sex and handling bank cards. Their task was to label everything so that Meta's AI systems could learn what they were looking at. While they did so, their phones were confiscated at the door, office cameras observed the observers, and workers who questioned the material were shown the exit.
The privacy problem has several layers that a Kenyan audience should grasp. A person wearing the glasses in Stockholm or San Francisco did not tell the person being filmed that their image would appear on a screen at an EPZ off Mombasa Road. The person being filmed never agreed to anything. And the Kenyan worker doing the tagging has no real choice in a market where data annotation pays between Sh20,000 and Sh50,000 a month and the alternative is unemployment. Three groups of people had their autonomy subordinated to Meta's AI training pipeline, and none of them had a say in how that pipeline was built.
And the law here is not ambiguous. Section 25 of the Data Protection Act, 2019, requires that personal data be “processed in accordance with the right to privacy of the data subject” and “processed lawfully, fairly and in a transparent manner in relation to any data subject”. Section 48 governs cross-border transfers, requiring the Data Commissioner to be satisfied that the recipient country provides adequate data protection safeguards. Even though in this case the footage moved into Kenya rather than out of it, the underlying principle holds: data crossed borders without transparency, without a data protection impact assessment and without informed consent. If the ODPC follows these provisions to their logical conclusion, Meta will face a situation that a form-letter response cannot resolve.
The EU’s General Data Protection Regulation (GDPR) raises parallel questions. Article 14 requires that where personal data is not obtained directly from the data subject, the controller must inform that person, within one month, of the categories of data collected, the purposes of processing and any intended transfers to third countries. None of that happened. The people recorded by the glasses were bystanders, filmed without knowledge and processed without notice; if the processing involved EU users’ data, both transparency and a legal basis are lacking. And since Kenya does not hold an EU adequacy decision, the transfer of that footage to Sama required standard contractual clauses or equivalent safeguards. Whether those were in place is precisely what regulators are now asking.
Other regulators moved fast. The UK's Information Commissioner's Office described the allegations as "concerning" and wrote to Meta requesting answers, stating unequivocally: "Devices processing personal data, including smart glasses, should put users in control and provide appropriate transparency." In the US, the Electronic Privacy Information Center asked the California Privacy Protection Agency to examine the glasses' biometric data safeguards. EU legislators sent formal questions to the European Commission. The regulatory pile-on is worldwide.
What makes the ODPC investigation consequential is that the data was processed in Nairobi: the annotation happened here, and Kenyan workers sat in front of those screens. The question is whether the Data Commissioner will treat this as the enforcement test it is. Section 63 of the Data Protection Act empowers the ODPC to issue compliance orders, impose fines of up to Sh5 million and refer cases for prosecution. It remains to be seen whether the Data Commissioner will use the tools the Act provides, or whether this probe will end in a thoughtful, courteous letter. The answer will determine whether Kenya's data protection regime is operational or cosmetic.
Mutua Mutuku & Quency Otieno
Privacy Practitioners