station10
Oh hi there! – Facial recognition and data capture

 

Who owns your face? If that seems like a stupid question then great, it should, because at the most basic level, it is. Obviously, you own your beautiful/ugly (delete as appropriate to your current level of self-esteem) mug. It’s attached to you, you’ve always had it – what am I even talking about?

What about images of your face though? Let’s say someone takes a picture of you, in your lovely smiling glory. Who owns the rights to that image? You? The photographer? Now assume the photo is digital. Somewhere in the information stored by the camera is the string of 0s and 1s used to recompose a perfect image of you, anytime, anywhere – enough copies to satisfy even the most deluded narcissist. Do you own that data? Do you care if you do?

One more step then. How about if that picture was taken with only your tacit consent, either because you were in a public place, or using a corporate provided service that ostensibly you’d accepted the conditions for (you know, like every piece of software with an unread user-license agreement you’ve ever used. Don’t worry, we all do it). That picture is digitised, reduced to a pattern of binary data and stored in a database for use in pattern-matching algorithms for facial recognition technologies. Do you care who owns that data now? I think you probably should, because it likely isn’t you anymore.

Facial recognition technology is a current hot topic, undergoing testing under various guises and for various purposes around the world. The response so far has been mixed. In China, law enforcement agencies are using pattern-matching software in combination with extensive CCTV networks to keep an eye on the public, ostensibly to reduce crime. However, coming from a state many already perceive as heavy-handed on surveillance, this has raised more than a few concerned eyebrows. Elsewhere, San Francisco has already introduced legislation banning city agencies, including the police, from using facial recognition on the grounds that it infringes human rights – but commercial applications are still viable and raise a whole raft of other considerations.

As is often the case, questions of privacy versus convenience come to the fore in these kinds of situations, and in most cases (as with less obvious digital privacy concerns like cookie policies and tracked advertising) legislators have been somewhat behind the curve – struggling to keep pace not just with the technologies themselves, but with their potential future applications and implications.

For example, checkout-free stores could hypothetically use facial recognition technology to identify that you came in, link you back to your loyalty card account and payment details, and connect that information together automatically for a completely seamless shopping experience. Facial recognition cameras could simplify multiple minor tasks that would otherwise require human interaction or identity verification. Forgotten your door pass to your workplace? No worries, the camera knows it’s you and the doors unlock. Need to collect a prescription from the chemist’s? No need to remember that piece of paper, the camera clicks and it’s ready for you.

Sounds good? Sure, but this data is all going to come and go from somewhere. Do you trust companies to keep that data safe? To treat it with respect and not use it nefariously for, say, demographic profiling? Bring law enforcement and government agencies in as well and the picture (of your face) gets even murkier – would the data be shared? Could corporate entities have access to police databases, with cameras notifying staff in a store that a person with a shoplifting conviction has just walked in?

Whether this is a problem to you as an individual will depend on a lot of factors – how accurate the technology is, where and how it’s used. Anecdotally, I’ve seen a lot of people take the view that “I’ve got nothing to hide”, so why would they worry if it was used to reduce crime or increase convenience? Until the system mistakenly identifies them as a criminal, of course – remember the “Ross from Friends” shoplifter CCTV? Now multiply the consequences exponentially.

Businesses considering this kind of technology do need to worry, though, even if the public don’t. Because facial recognition is so obviously a data-driven method, with potentially serious implications for the interaction between corporate and personal privacy, it has crystallised the existing issues around data tracking and storage.

Simply because facial recognition is a more obvious use of personal data, the implications for data protection, corporate responsibility and the prevention of data breaches are equally more obvious. This in turn means that businesses can’t afford to sit on the fence and wait for legislation to catch up, as they have for more traditional forms of personal data security. From cookies and advertising tracking right up to future possibilities like biometric or genetic information, corporate policy has to advance, because no-one wants their company to be the one that gets caught out as the go-to example of a massive data breach.

The technology is in its infancy right now, and the current issues are mostly that it doesn’t yet work properly. In recent UK police trials, 95% of the matches the system identified as potential miscreants were actually completely innocent people, and the systems also struggle to pattern-match BAME faces, leading to accusations of racial bias. No system is ever perfect, so allowing for the margins of error inherent in any such technology will always be necessary. Nevertheless, the technology will progress, more and more commercial applications will be found, and more and more companies will get caught out for not having robust data policies to make use of it safely and reasonably. Really, as with any kind of technology like this, there are two questions: “Just because we can, does it mean that we should?” and “If we’re going to take advantage of it, how do we do it in the right way?”
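The scale of that false-positive problem follows directly from base rates: when genuine targets are rare in a crowd, even a fairly accurate system will flag mostly innocent people. Here is a minimal sketch of the arithmetic – every number below is a hypothetical assumption for illustration, not a figure from the trials:

```python
# Hypothetical base-rate illustration (all numbers are assumptions,
# not the actual UK trial figures).
population = 100_000          # faces scanned by the system
targets = 100                 # people genuinely on the watch list
sensitivity = 0.90            # chance a real target is correctly flagged
false_positive_rate = 0.01    # chance an innocent face is wrongly flagged

true_positives = targets * sensitivity                          # 90 correct alerts
false_positives = (population - targets) * false_positive_rate  # 999 wrong alerts

# Of everyone the system flags, what share are innocent?
innocent_share = false_positives / (true_positives + false_positives)
print(f"{innocent_share:.0%} of flagged faces would be innocent")  # → 92%
```

Even with a 1% error rate on individual faces, the sheer number of innocent people scanned swamps the handful of real matches – which is how a headline figure like 95% innocent flags can arise without the underlying system being wildly inaccurate on any single comparison.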

Station10 can’t necessarily help you with the first, but as experts in data policy structure, maybe we can help with the second.