But is the use of biometrics really the answer?
Let's start by explaining what is meant by the term "biometric". Biometric technology uses electronic methods to identify a person by a variety of unique physical characteristics. The best known are face and fingerprint recognition, and iris scanning. Others continue to be developed and some of the more recent forms of biometric technology utilise not just your physical characteristics but also behaviours eg the way in which you walk.
Face recognition and, in particular, fingerprint recognition made an appearance several years ago on some laptops. Neither caught on. The primary reasons were:
- For any chosen biometric feature the recognition algorithm has to allow for some "flex". No living thing stays exactly the same from moment to moment, never mind between logins. However, nothing annoys users more than false rejection: legitimate users expect to be recognised 100% of the time, not 90% or 80%. This led to some recognition systems having such large tolerances that they erred on the side of granting access, for example to a system, when there wasn't a real match. The only thing that annoys users more than being locked out of their own system is when the system erroneously grants access to others.
- Historically, biometric systems have not been very good at recognising living creatures (especially humans). Hence, there have been many stories in the press about, for example, fingerprint systems being fooled using everything from a photograph to a gummy bear. Even the recent iPhone fingerprint recognition system was allegedly hacked by the Chaos Computer Club using "lifted" fingerprints within days of release of the device, although it's not quite as easy as the video makes it appear.
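The tolerance trade-off in the first point can be sketched as a simple threshold decision over a match score. Everything below is illustrative: the scores and threshold values are made up for demonstration and are not any vendor's real algorithm or data.

```python
# Illustrative sketch of the false-reject / false-accept trade-off.
# Real biometric matchers produce a similarity score from feature
# comparison; granting access is then a threshold decision.

genuine_scores = [0.91, 0.85, 0.78, 0.95, 0.70, 0.88]   # the legitimate user
impostor_scores = [0.30, 0.55, 0.62, 0.41, 0.75, 0.20]  # other people

def error_rates(threshold):
    """Return (false_reject_rate, false_accept_rate) at a given threshold."""
    frr = sum(s < threshold for s in genuine_scores) / len(genuine_scores)
    far = sum(s >= threshold for s in impostor_scores) / len(impostor_scores)
    return frr, far

# A strict threshold locks the real user out more often...
print(error_rates(0.9))  # high false-reject rate, zero false-accepts
# ...while a lax one starts letting impostors in.
print(error_rates(0.5))  # no false rejects, but impostors get through
```

Tightening the threshold to stop impostors inevitably raises the false-reject rate for the legitimate user, which is exactly the annoyance trade-off the early laptop systems got wrong.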
Plus, the detection devices are improving.
The iPhone sensor is what is known as a capacitive sensor: rather than simply capturing an image of the fingerprint, it measures the profile between the tops of your fingerprint ridges and the troughs between them.
But sadly it doesn't look like it is yet foolproof, even if it is making life more difficult for would-be hackers.
I suspect this is something of a renaissance for fingerprint recognition. Smartphones are a slightly better platform: not only has the technology evolved since it first appeared on laptops, but mobile phones also have the potential to act as a means of providing universal two-factor authentication.
We have already seen online services from major vendors using text messages to send authentication codes to supplement passwords. With the move to make security as transparent as possible, fingerprint recognition is an obvious way to prevent such a code "easily" falling into the hands of someone who has unauthorised access to both your password and your phone.
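The server side of that SMS-code pattern can be sketched in a few lines. This is a hypothetical illustration of the general technique, not any particular vendor's implementation: generate a short-lived random code, send it out-of-band, and compare it in constant time when the user types it back.

```python
import hmac
import secrets

def issue_code(digits=6):
    """Generate a random numeric code to send out-of-band (e.g. by SMS)."""
    return "".join(str(secrets.randbelow(10)) for _ in range(digits))

def verify_code(expected, supplied):
    """Constant-time comparison avoids leaking timing information."""
    return hmac.compare_digest(expected, supplied)

code = issue_code()
print(code)                      # a fresh random code each run
print(verify_code(code, code))   # True
```

The `secrets` module is used rather than `random` because one-time codes are security tokens; a real deployment would also expire the code after a few minutes and limit verification attempts.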
All of which raises a question: how secure is that biometric data? If someone stole your phone, could they then steal your biometric data and impersonate you on other systems?
As always, the devil is in the detail. Fingerprints are usually not stored as an image; instead, bifurcations (changes of direction or splits) in the ridges are mapped, and it is those that are stored and compared when you place your finger on the sensor. But this data is still useful and could potentially be misused, which is why it is vital that it is stored securely: encrypted, or held in some other secure store that prevents unauthorised users simply walking away with your biometric data.
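The comparison step described above can be sketched as matching point sets. This is a deliberately simplified toy, with made-up coordinates: real matchers align for rotation, translation and skin distortion, whereas this only checks that each stored bifurcation point has a close-enough counterpart in the fresh scan.

```python
import math

# Each minutia is stored as (x, y, ridge angle in degrees).
stored_template = [(12, 40, 30), (55, 23, 90), (78, 66, 210)]

def match_score(template, probe, dist_tol=3.0, angle_tol=15.0):
    """Fraction of stored minutiae with a nearby minutia in the probe scan."""
    hits = 0
    for (x, y, a) in template:
        for (px, py, pa) in probe:
            if math.hypot(x - px, y - py) <= dist_tol and abs(a - pa) <= angle_tol:
                hits += 1
                break
    return hits / len(template)

# The same finger, measured with slight sensor noise, still scores high:
noisy_probe = [(13, 41, 33), (54, 22, 85), (79, 67, 205)]
print(match_score(stored_template, noisy_probe))  # 1.0
```

Note that even this stripped-down template is enough to re-identify a finger, which is why the stored map, not just a raw image, needs protecting.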
We should assume the use of biometrics in security is here to stay, but whenever you see it in use I would recommend you explore two questions:
1. Is it the only means of securing your device? If so, be very careful that it has not already been circumvented.
2. Is it stored securely? Many will use woolly terms such as "encrypted" but it's important that the manufacturers state, for example, what encryption is being used.
As ever in security, the weakest point in the security chain defines the true strength of the security. Don't rely upon something that has a weak link.