Phil Schiller of Apple at the launch of the new iPhone Reuters/Stephen Lam
The new iPhone X puts face recognition front and centre. Why? Because it is the quickest and easiest way to unlock your phone. Before you've even thought about placing your finger on the scanner or entering a PIN, the camera will have already processed your face, checked it's you, and granted access.
Despite its handiness, however, the technology is far from proven. Apple keeps its algorithms under lock and key, meaning we don't know for sure how well they work, but even the best in the business have woefully low reliability. "I wouldn't recommend anyone trust an iPhone's face verification," says Stefanos Zafeiriou at Imperial College London.
And Apple isn't the only one using software to recognise faces. Everyone from border control to high-street shops to police forces is trying to get in on the action. How much can we actually trust the technology? And do we want to live in a society where your face is always being tracked?
The best current face-recognition algorithms use a form of artificial intelligence called deep learning. Companies trawl the web to gather billions of images and use them to train an algorithm inspired by neurons in the brain, called a deep neural network. Slowly, the algorithm learns to extract features from an image that are relevant to your identity, like the position of your nose or the gap between your eyes. The more images it sees, the better it becomes.
When your face appears in front of a camera, this network kicks into action. The iPhone X also has another trick: a 3D camera that means if your head is tilted away from the screen, the algorithm can work out what it would look like if you were facing straight at it. Then if enough features match, bingo! You are successfully verified and your phone is unlocked. If not, access is denied.
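Stripped of the deep network itself, that matching step is simple to sketch. The snippet below is a toy illustration only, not Apple's actual pipeline: the feature vectors ("embeddings") and the threshold are invented, and a real network would produce vectors with hundreds of dimensions.

```python
import math

def cosine_similarity(a, b):
    # Similarity between two feature vectors: 1.0 means identical direction.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def verify(enrolled, probe, threshold=0.8):
    # Unlock only if the probe face is close enough to the enrolled one.
    return cosine_similarity(enrolled, probe) >= threshold

# Toy 4-dimensional embeddings; all values are made up for illustration.
enrolled = [0.9, 0.1, 0.3, 0.5]
same_person = [0.88, 0.12, 0.28, 0.52]  # small pose/lighting variation
stranger = [0.1, 0.9, 0.7, 0.05]

print(verify(enrolled, same_person))  # True
print(verify(enrolled, stranger))     # False
```

The interesting engineering is in producing embeddings where the same face always lands close together; the comparison itself is just a distance check against a threshold.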
"The chance that a random person in the population could look at your iPhone X and unlock it with their face is one in a million," said Apple's Philip Schiller during a press conference on 12 September.
But in the real world…
However, Apple is only talking about one type of error that could happen with its Face ID algorithm: someone else gaining access. But what if your phone refuses to recognise your face, and doesn't let you in? That's a problem too. "The two errors are related; for a given biometric system, if you reduce one type of error, the other type of error goes up," says Anil Jain at Michigan State University.
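That trade-off is easy to see with a toy simulation. The match scores and thresholds below are invented for illustration; real systems tune the threshold on large test sets.

```python
# Toy match scores: higher means the probe face looks more like the owner.
genuine_scores = [0.95, 0.90, 0.72, 0.88, 0.65]   # the owner's own attempts
impostor_scores = [0.30, 0.55, 0.75, 0.20, 0.40]  # other people's attempts

def error_rates(threshold):
    # False reject: the owner scores below the threshold and is locked out.
    false_rejects = sum(s < threshold for s in genuine_scores) / len(genuine_scores)
    # False accept: a stranger scores above the threshold and gets in.
    false_accepts = sum(s >= threshold for s in impostor_scores) / len(impostor_scores)
    return false_rejects, false_accepts

for t in (0.5, 0.7, 0.9):
    frr, far = error_rates(t)
    print(f"threshold={t}: false-reject rate={frr:.0%}, false-accept rate={far:.0%}")
```

Raising the threshold drives the false-accept rate down but locks the owner out more often; lowering it does the reverse. Apple's one-in-a-million figure describes only the first kind of error.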
In normal conditions, without perfect lighting, a perfectly still camera and a perfectly posed subject, in other words the real world in which we use our phones, publicly tested algorithms still make errors at a significant rate. That means a real risk of your phone failing to unlock, or, worse, unlocking for someone else.
When demonstrating Face ID on stage at the press conference, Apple鈥檚 Craig Federighi found this out for himself. The software failed to recognise him and he ended up having to enter a passcode to unlock the phone.
"We have to wait till November [when the phone is released] to see the usability based on people's experience when they get the phone," says Jain.
Until then, as Apple hasn't demonstrated Face ID's resilience publicly, it seems unwise to put too much faith in its software. And while you might be comfortable with using your face as a password, increasingly there is more at stake than a few text messages.
Family pictures, personal emails, those salacious WhatsApp messages, and even banking apps are all vulnerable once your phone is unlocked. Apps like Apple Pay and Android Pay let you flash your phone instead of your credit card, so in theory anyone with access to your unlocked iPhone can spend an unlimited amount, although Android currently sets a limit of £30 per transaction in the UK. The iPhone X will allow users to use Face ID to authorise transactions instead of a PIN, and Android is expected to follow suit with something similar soon.
Why the long face?
Face recognition on its own may not be secure enough to protect our digital vault, but that doesn't mean we should ditch it altogether. Instead, we can combine it with other biometrics. Fingerprints, iris scans and even voice recognition could all help provide a speedy authentication method.
There is an air of inevitability in the payments world that something about you will be your password in the future, rather than something you have to remember, says David Baker at UK Finance, the trade association for the UK banking sector. Although biometrics still have a long way to go before they beat a PIN for ease of use, he adds.
Are biometrics hackable?
Last year, police asked Anil Jain at Michigan State University to help unlock a murder victim's Samsung Galaxy S6 phone. They thought it contained important information for the case, but it was fingerprint-locked. Unfortunately for the police, they couldn't just borrow the victim's hand. A dead finger doesn't have the surface electrical charge of a living finger that touch sensors rely on, so won't unlock a phone.
They had the victim's fingerprints on file from a previous arrest, giving Jain something to work with. After a month of digitally enhancing the fingerprints, and producing several different prosthetic prototypes, each with a micrometre-thick coating of metal on the surface, they eventually cracked the phone.
With face recognition, similar problems occur. As the latest devices like the iPhone X use 3D cameras, they can't be fooled by simply holding up a flat picture. It could be possible to 3D-print a realistic-looking face mask, but that won't get you into a phone in a hurry.
Of course, if you're in custody, the police could just hold your phone up to your face and gain access. Apple says there is a simple solution if you don't want to let them in: close your eyes, and Face ID won't activate.
Faces aren't just taking off on the iPhone. What most people know as "face recognition" is more properly called "face verification", the task of confirming that one person's face matches their records. True face recognition involves matching a face to a vast database of faces, like police trawling CCTV images for potential suspects. Although this is a much harder task, lower degrees of accuracy are required, because a human police officer will make the decision to further investigate a match.
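The difference between the two tasks can be made concrete. In this hypothetical sketch, verification is one comparison against one enrolled template, while identification ranks a whole database and hands the best candidates to a human; the similarity measure, names and vectors are all invented for illustration.

```python
def similarity(a, b):
    # Crude similarity between two feature vectors: negative squared distance,
    # so that a larger value means a closer match.
    return -sum((x - y) ** 2 for x, y in zip(a, b))

def identify(probe, database, top_k=3):
    # 1:N identification: rank every enrolled face by similarity to the probe
    # and return the top candidates for a human officer to review.
    ranked = sorted(database, key=lambda name: similarity(probe, database[name]),
                    reverse=True)
    return ranked[:top_k]

# Toy two-dimensional "face embeddings" for three enrolled people.
database = {
    "alice": [0.1, 0.9],
    "bob":   [0.8, 0.2],
    "carol": [0.5, 0.5],
}
probe = [0.75, 0.25]
print(identify(probe, database, top_k=2))  # ['bob', 'carol']
```

Because the output is a shortlist rather than a yes/no decision, a weaker algorithm is tolerable here, but only so long as a human genuinely scrutinises the candidates instead of trusting the ranking.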
This is already happening in the UK, though the tech is still pretty basic. Police have been trialling the software at various large events, such as the Champions League final and Notting Hill Carnival. At this year's carnival the system flagged a string of incorrect matches, so there's still some way to go. Currently in the UK, the software is only used with bespoke cameras mounted on top of vans, but the same tools could be integrated with CCTV that is already installed.
Lack of transparency
Even though the technology is still in its infancy, the databases used alongside it are not. In England and Wales, the police have amassed a database of more than 20 million facial images. Many of these are mugshots, but many are also of innocent people who have been questioned during a case. In 2012, the High Court ruled that retaining images of innocent people was unlawful, but the government has said the police do not need to delete them unless they receive specific requests from the individuals concerned.
Things are no better in the US. Approximately half of US adults appear in a face database amassed by the FBI, and about 80 per cent of the photos are of people who haven't been connected with a crime. They include driving licence and passport pictures.
These huge databases are worrying, but what is more worrying is that we don't fully know how they are being used. The software isn't accurate enough to reliably issue arrest warrants or fines without a human checking the match first, but individual officers may not know that, and could be influenced by unconscious bias when presented with an "infallible" machine's results.
"People thought deep learning would solve everything. But companies like Google have trained their algorithms on billions of images and they are still not very good in arbitrary conditions," says Zafeiriou.
Not that improving the algorithms would solve this thorny issue. If CCTV cameras could identify every person who passes them, the temptation to track everyone all of the time might just be too great for states to resist. So when you get in line for your new iPhone, think carefully about the world you are signing up for.