Meta believes smart glasses are the future of augmented reality (AR). It might have a point. Sticking a camera in a pair of stylish glasses without anyone being any wiser is an impressive feat – one that was bound to be exploited in Terminator-level scenarios eventually. A pair of Harvard students have done just that by turning Meta’s smart glasses into a privacy nightmare with some simple code and facial recognition software.
AnhPhu Nguyen, one of those Harvard students, posted a video showcasing the Orwellian world we currently face, centring on Meta’s Ray-Ban smart glasses doxxing unsuspecting folks. All it takes is a simple glance at people walking by. Think Watch Dogs but with a more interesting sub-plot.
This is just the beginning
Are we ready for a world where our data is exposed at a glance? @CaineArdayfio and I offer an answer to protect yourself here: https://t.co/LhxModhDpk pic.twitter.com/Oo35TxBNtD
— AnhPhu Nguyen (@AnhPhuNguyen1) September 30, 2024
Meta, obviously, didn’t intend for its smart glasses to be misappropriated this way. They’re meant for simpler stuff: live-streaming to Instagram, chatting with Meta AI, or answering the age-old question: Dude, where’s my car? Besides, Meta already has all the information it’ll ever need about you, and it isn’t in the mood to share.
As for how a pair of Harvard students managed to whip up something quite so terrifying? Dubbed I-XRAY, the process is unfortunately a rather simple one, and easy to imagine being pushed much further. It involves live streaming the glasses’ feed to Instagram, where an AI monitors the footage and identifies any faces it sees. Those faces are then fed into face search engines and public databases to dig up names, relatives, addresses, and even phone numbers.
Read More: Meta’s Orion glasses promise to be the future of augmented reality
That’s all then fed right back into an app where the information is neatly wrapped up, ready for whatever nefarious schemes you – or in this case, AnhPhu Nguyen and Caine Ardayfio – might have cooking.
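To make the flow concrete, here’s a minimal Python sketch of what a pipeline like this could look like. Everything specific in it is an assumption: the stream URL, the face search endpoint, and the records lookup are hypothetical placeholders standing in for whatever the students actually used, since their real code has deliberately been kept private.

```python
# Illustrative sketch of an I-XRAY-style pipeline – not the students' code.
# The stream URL and both web services below are hypothetical placeholders.
import cv2                    # pip install opencv-python
import face_recognition       # pip install face_recognition
import requests

STREAM_URL = "https://example.com/live-feed"           # placeholder for the glasses' livestream
FACE_SEARCH_API = "https://face-search.example/query"  # placeholder face search engine
PEOPLE_DB_API = "https://people-db.example/lookup"     # placeholder public-records lookup


def frames_from_stream(url, every_n=30):
    """Yield one frame out of every `every_n` frames from a video stream."""
    capture = cv2.VideoCapture(url)
    count = 0
    while capture.isOpened():
        ok, frame = capture.read()
        if not ok:
            break
        if count % every_n == 0:
            # face_recognition expects RGB; OpenCV delivers BGR
            yield cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
        count += 1
    capture.release()


def identify_faces(frame):
    """Detect faces in a frame and (hypothetically) resolve them to names."""
    names = []
    for top, right, bottom, left in face_recognition.face_locations(frame):
        crop = frame[top:bottom, left:right]
        ok, jpeg = cv2.imencode(".jpg", cv2.cvtColor(crop, cv2.COLOR_RGB2BGR))
        if not ok:
            continue
        # Hypothetical reverse face search: image in, candidate name out.
        resp = requests.post(FACE_SEARCH_API, files={"image": jpeg.tobytes()})
        name = resp.json().get("name")
        if name:
            names.append(name)
    return names


def lookup_person(name):
    """Hypothetical people-search query: name in, public records out."""
    resp = requests.get(PEOPLE_DB_API, params={"name": name})
    return resp.json()  # e.g. {"address": ..., "relatives": ..., "phone": ...}


if __name__ == "__main__":
    for frame in frames_from_stream(STREAM_URL):
        for name in identify_faces(frame):
            print(name, lookup_person(name))
```

The unsettling part is how little glue is involved: every stage leans on tools and databases that already exist and are publicly reachable.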
Fortunately, those students aren’t using their newfound power for evil. Instead, they built I-XRAY from already-existing facial recognition tech and LLMs to highlight the privacy concerns behind tech like this – and they won’t be releasing it to the public.
“Initially started as a side project, I-XRAY quickly highlighted significant privacy concerns. The purpose of building this tool is not for misuse, and we are not releasing it. Our goal is to demonstrate the current capabilities of smart glasses, face search engines, LLMs, and public databases, raising awareness that extracting someone’s home address and other personal details from just their face on the street is possible today,” reads a document explaining the project.