What facial recognition steals from us

If you upload a photo into Google’s reverse
image search, it’ll find websites where that picture has appeared, or provide “visually
similar images” that have the same coloring and composition.
The leading search engine in Russia, Yandex, has reverse image search too, but it
doesn’t work the same way. It’s not looking for visually similar images. It’s looking
for similar faces, even the same face. The difference between these search engines
is that Google hasn’t switched on facial recognition and Yandex has. On Google, you can enter a
name and look for a face. But on Yandex, you can enter a face and look for a name.
And that distinction represents a potentially enormous shift in our offline lives, where
we usually decide who we introduce ourselves to.
Now that computer scientists have created tools that can turn faces into nametags, it’s
worth reflecting on how we got here and what we stand to lose. A computer’s facial recognition system has
broadly the same components as your own facial recognition system. You see someone with your
eyes, your mind processes the features of their face, and recalls their identity from
your memory. Now imagine you could have eyes in lots
of places and could download and store memories from other people. Then you’d have something
more like the automated version of facial recognition, which has only come together
in the past five years or so. Its eyes are digital cameras, revolutionary
machines that turn light into data. “It’s a state of the art digital model, which records
images on memory chips instead of photographic film.” Digital photography went mainstream in the early
2000s, which coincided with the arrival of the social internet. So right when we were
able to take an unlimited number of pictures, Facebook, Flickr, YouTube, and other sites
told us our images had a home online. “100 million photos are being tagged every
day on Facebook.” Professional photography also went up on websites,
news articles, and photo libraries, and Google’s web crawlers gathered those images into Image Search.
And then the computer vision researchers went to work. The millions of digital photos posted
to the internet, like the Facebook pictures where we tagged our friends or Google Image
results of celebrities, were used to build the “mind” of facial recognition
systems. That mind is made up of a series of algorithms.
They locate faces in an image, map facial features to correct for head rotation, and
then take over 100 measurements that define that individual face.
Those measurements are usually described as the distance between the eyes, the length
of the nose, the width of the mouth. But the truth is, nobody knows exactly what’s being
measured. That’s determined by a deep learning algorithm looking for correlations in raw
pixel data. To train that algorithm, engineers give it
sets of triplets: an anchor photo, another photo of the same person, and a photo of a
different person. The algorithm is tasked with deciding what to measure so that the
distance between the two matching photos is as small as possible while the distance
between the non-matching photos is as large as possible.
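The triplet idea described above can be sketched in a few lines. This is a toy illustration, not any vendor’s actual system: the `triplet_loss` function and the random stand-in “measurements” are hypothetical, and a real system would learn the measurements with a deep network trained on millions of labeled face photos.

```python
import numpy as np

def triplet_loss(anchor, positive, negative, margin=0.2):
    """Encourage the anchor-positive distance to be smaller than the
    anchor-negative distance by at least `margin`; zero loss once achieved."""
    d_pos = np.sum((anchor - positive) ** 2)  # squared distance, same person
    d_neg = np.sum((anchor - negative) ** 2)  # squared distance, different person
    return max(d_pos - d_neg + margin, 0.0)

# Toy 128-dimensional "measurements" standing in for learned face embeddings.
rng = np.random.default_rng(0)
anchor = rng.normal(size=128)
positive = anchor + rng.normal(scale=0.05, size=128)  # same face, slight noise
negative = rng.normal(size=128)                       # a different face

loss = triplet_loss(anchor, positive, negative)
# Training adjusts the embedding function so this loss shrinks toward zero
# across millions of such triplets.
```

Because the loss only depends on the two distances, the network is free to choose whatever pixel correlations drive them apart, which is why nobody can say exactly what is being “measured.”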
These algorithms are refined through millions of examples, but they still don’t perform
equally well on all types of people or on all types of photos.
That hasn’t stopped them from being packaged and distributed as ready-to-use software.
But whoever uses that software won’t be able to identify you until you’re in their
database of known faces. That’s the “memory” of the system – and it’s separate from the
training images. In the case of the iPhone’s Face ID, it’s
a database of one: you volunteer to store your face on your device in exchange for easily
unlocking your phone. Companies like Facebook and Google also keep
databases of their users. But it’s governments that typically have access to the largest
databases of names and faces, so facial recognition significantly expands the power of the state.
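The matching step against such a database amounts to a nearest-neighbor search over face measurements. Here is a minimal sketch under stated assumptions: the names, the `identify` function, and the distance threshold are all illustrative, not drawn from any real deployment.

```python
import numpy as np

def identify(probe, database, threshold=0.6):
    """Return the name of the closest enrolled embedding, or None if
    nothing is within `threshold` (Euclidean distance)."""
    best_name, best_dist = None, float("inf")
    for name, embedding in database.items():
        dist = np.linalg.norm(probe - embedding)
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name if best_dist <= threshold else None

# Enroll two hypothetical people with random stand-in embeddings.
rng = np.random.default_rng(1)
database = {name: rng.normal(size=128) for name in ["alice", "bob"]}

# A probe very close to alice's enrolled embedding matches her;
# a face that was never enrolled matches nobody.
probe = database["alice"] + rng.normal(scale=0.01, size=128)
stranger = rng.normal(size=128)
print(identify(probe, database))     # → alice
print(identify(stranger, database))  # → None
```

The sketch makes the key point concrete: the system can only name faces that are already enrolled, so whoever controls the largest database of labeled faces controls who can be recognized.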
Governments collected these images for other reasons, and now they’re repurposing them for facial
recognition without telling us or obtaining our consent, which is why several US cities
have banned government use of facial recognition. Retail stores, banks, and stadiums can create
or buy watchlists of known shoplifters, valued customers, or other persons of interest, so
they’re notified if one of those people shows up.
And then there’s another source of labeled photos. Those are the ones we’ve been labeling
ourselves by setting up profiles on social media networks.
It’s typically against the terms of use to program bots that download faces and
names from LinkedIn, Twitter, or Facebook, but it’s doable.
And what’s at stake is something that most of us take for granted: our ability to move
through public spaces anonymously. “So we typically think of public and private
as being opposites. But is there such a thing as having privacy when we’re in public?”
“I would like to think so.” Evan Selinger is a professor of philosophy
who argues that facial recognition is a threat to “obscurity,” which is the idea that
personal information is safer when it is hard to obtain or understand.
“So we have natural sort of limitations in what we can perceive and what we can hear.
Even the human mind has sort of basic limits in how much information it can store. So one
of the things that technologies do is they reduce the transaction
costs of being able to find information, being able to store information, being able to share
information, and being able to correctly interpret information. And so facial recognition is
probably the most obscurity-eviscerating technology ever invented.”
We don’t have to imagine how this could play out. It’s already happening with
photos from the Russian social media network VK.
Aric Toler, a journalist who covers Europe for Bellingcat, showed me how it works with
a random video of Russian soccer fans picking fights in Poland.
“There’s about 10 or so of these soccer hooligans in this video and for every single one of
them you can find their profiles on VK. OK I’ll get this guy in the background. Let me
save him. OK so here’s the first result. This guy. So if you click the photo here it will
take you directly to the photo’s link. And here he is. I think he’s wearing I think he’s
wearing the same shirt. Yeah he’s wearing the same shirt even. This is him too. So this
is probably like his buddy who uploaded a photo. Yeah. So this is this guy’s profile
and here’s his buddy right here. Yeah so here he is during a baptism, probably.” “And the
photo you uploaded is not particularly clear or high resolution.” “No, not at all, right,
it’s just 200 by 100. So it does feel weird when you do this and you have access to way
more information than you should, is what it feels like. But also we only publish what
we’re like one thousand percent sure of and if possible we maybe don’t include the
names of the people.” How you feel about this technology probably
depends on how much you sympathize with the person being identified.
Bellingcat has used these tools to identify people linked to the attack on flight MH17
in Eastern Ukraine. The same tools have been used to doxx police officers accused of brutality,
anti-corruption activists protesting against Vladimir Putin, random strangers as part of
an art project, and sex workers, porn performers, and others who have posted anonymous photos
online. “The way that we share our images and our
names on social media, LinkedIn, Twitter, Instagram, it seems to suggest that we don’t want to
be obscure or we’re not really looking to be anonymous. Are we allowed to want to share
and connect with other people online and still be able to expect not to be recognized when
we’re offline in our regular lives?” “I would say absolutely. In fact I would go
further and say if we ever create a society where that’s not a reasonable expectation,
a lot of the things that are fundamental to being a human being are really going to be
compromised. Having any individuality requires experimenting
in life and experimenting requires the protections of some obscurity. But also intimacy requires
obscurity. Right. If you want to be able to share different parts of your life with different
people, and I think most of us do, right? We don’t want to come into work and behave the
same way we do with our friends. We don’t want to treat our partners in the same way
we do acquaintances. And the concern, when you lose too much obscurity, is that these
domains bleed into one another and create what’s called context collapse. And it doesn’t
mean that one is more real or one is more authentic. Leading a rich life requires us
to be able to express ourselves in these diverse ways.”
The photos we took to share with friends, or document history, or simply get a government
ID have been used to build and operate a technology that strips away the protections that obscurity
has always provided us. It’s nothing less than a massive bait-and-switch. One that could
change the meaning of the human face forever.

