Biometric authentication is reality not fiction
Biometric authentication is finding more and more parts of the human body to prove we really are who we say we are. But will it ever fulfil the promise of so many sci-fi representations? And will it ever be worth pursuing in preference to simpler checks?
Biometrics are at once unique and universal, and for decades they have been the focus of efforts to improve security, personal identification and even access to electronic devices. Our biometric traits – from brainwave patterns down to the characteristics of locomotion, or gait – can in theory differentiate one individual from another, and the study of how best to tell them apart has produced the technological enclave of biometric authentication.
Authentication typically takes one of three forms: something the user knows, such as a password or personal identification number; something they have, such as a security token or smart card; or something they are – their own physical or behavioural biometric traits. 'Biometric' implies identification or authentication by human characteristics, such as DNA, fingerprints, facial features, irises and voice patterns, which are securely measured, analysed and matched.
As with any authentication process there are challenges, but the need to provide secure access to all kinds of technology has grown alongside the scale of threats such as identity theft, which means that security procedures involving some degree of biometrics have expanded. In conventional enterprise IT, weak passwords remain a root cause of corporate data breaches and a common way in for cybercriminals. Credit cards, smart cards and security tokens can be lost, stolen or hacked, or their secrets swiped by keystroke loggers, and even the most advanced encryption techniques may eventually be broken.
Small wonder, then, that the security industry is investigating biometric authentication's potential as the next generation of primary identity assurance, since human characteristics are unique to each individual. Biometrics are often used as one factor in two-factor authentication, partly out of sheer convenience: finger or voice traits cannot be lost or stolen.
However, tokenless authentication experts SecurEnvoy say that, while two-factor authentication is necessary, biometric authentication is still flawed. "With so many biometric technologies out there the security risks differ but if you take face recognition for example, it is currently way too easy to confuse the system," explains Andy Kemshall, technical director at SecurEnvoy. "Face recognition with non-3D based cameras can easily be fooled with a single photo of the user's face. Unless the user has a very high-resolution camera, or even two cameras, to recognise their face it is far too simple to bypass the application.
"Another example of an unreliable method is the speed-of-typing recognition system. It detects the way a user enters characters and the strokes they use when on a keyboard. However, this system assumes the user will always be using the same keyboard and type in the same manner. People are unpredictable – a change in mood or sense of urgency can affect the way they type and easily cause problems when authenticating."
Biometric authentication technology has been something of a slow burn in adoption terms. Because of its very advanced nature, it is – and probably always will be – expensive, putting it beyond the reach of many organisations.

"I don't believe organisations are in fact changing their attitudes towards biometric authentication, but they are moving towards two-factor authentication and realise they now need more than just a password in order to reliably authenticate," says SecurEnvoy's Kemshall. "Cost and reliability are both reasons why organisations are moving towards this: it is far cheaper and more reliable to authenticate using technology someone already owns."
Another aspect of wide-scale adoption is common technological standards, which have emerged only comparatively recently. In 2011, the International Organization for Standardization and the International Electrotechnical Commission jointly published a security and privacy standard to safeguard biometric data used for online authentication and ensure it cannot be compromised. The 'ISO/IEC 24745:2011 Information Technology – Security Techniques – Biometric Information Protection' standard has been published as a guideline with advice on the management and processing of biometric data used for authentication.
The standard outlines specific 'solid countermeasures' to protect individuals, among them: analysis of threats to, and countermeasures inherent in, biometric and biometric-system application models; security requirements for binding between a biometric reference and an identity reference; biometric system application models with different scenarios for the storage and comparison of biometric references; and guidance on protecting an individual's privacy during the processing of biometric information.
'Biometrics' refers to the measurement of human physical and behavioural characteristics, such as fingers, hands, ears, teeth, veins, voice and eyes. Biometric authentication means using these traits to verify a person's identity, the appeal being that they are difficult to steal, lose or duplicate.
Biometric authentication has become a vital element of security in the face of unauthorised immigration, visa fraud and border intrusion, and is increasingly being deployed at security checkpoints at airports.
The different biometrics
Fingerprint scanning is becoming a widely used method of verification, whether to log into computer systems, at passport control or for premises entry control. Biometrics have even found their way to Disneyland in the US, where a system has been designed to deter visitors from buying fake tickets from scammers. iPhone manufacturer Apple is also taking advantage of fingerprint technology; it is thought that the iPhone 5S and iPhone 6 handsets will be embedded with a fingerprint scanner for added security.
Iris recognition is a fairly well-established authentication method; it works by analysing the characteristics of the coloured tissue surrounding the pupil – including its rings, furrows and freckles – which typically offers more than 200 points of reference for comparison. Facial recognition measures distinctive facial characteristics, including the distances between the eyes, nose, mouth and jaw edges; the measurements are stored and compared when an individual's face is scanned again. Iris and facial biometrics can be used in aviation security, for accessing computers, buildings and homes, and at border crossings.

Biometric authentication systems have already been installed in schools, where pupils use fingerprint and hand scanners for attendance registration, cashless catering and site access. Airports too are using authentication systems to control border security; in 2011, for instance, Gatwick Airport installed biometric identification company Human Recognition Systems' MFlow Track. The iris recognition system, reportedly part of the £45m upgrade of the South Terminal, aims to speed up security checks.
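Those 200-plus points of reference are, in systems descended from John Daugman's widely used approach, typically encoded as a binary 'iris code' and compared by fractional Hamming distance. The sketch below uses an invented 12-bit code, a small rotation window and an illustrative decision threshold; real codes run to thousands of bits.

```python
# A minimal sketch of iris-code matching: each iris is reduced to a
# binary code, and two codes are compared by the fraction of bits
# that differ. Codes, threshold and shift window are illustrative.

def hamming_distance(code_a, code_b):
    """Fraction of bits that differ between two equal-length iris codes."""
    diff = sum(a != b for a, b in zip(code_a, code_b))
    return diff / len(code_a)

def match(code_a, code_b, max_shift=2, threshold=0.32):
    """Try small circular shifts to absorb head tilt; accept if the
    best (lowest) distance falls under the decision threshold."""
    best = min(
        hamming_distance(code_a, code_b[s:] + code_b[:s])
        for s in range(-max_shift, max_shift + 1)
    )
    return best, best < threshold

enrolled = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0, 1]
probe    = [1, 0, 1, 1, 0, 1, 1, 0, 1, 1, 0, 1]   # one bit differs
distance, accepted = match(enrolled, probe)
```

The shift loop is what lets the comparison tolerate a slightly rotated eye without re-enrolling the user.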
Voice recognition identifies who is speaking by digitising the voice; the recording is then dissected into small, recognisable speech bits called phonemes and stored. Once the voice recognition software recognises the phonemes, the complex process of identification and contextual analysis begins, as it compares and pairs up each recorded phoneme against text equivalents in its memory. Voice biometrics are commonly used for remote authentication, for instance by adding another level of security to smartphones.
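The comparison stage described above has to cope with utterances spoken at different speeds and lengths. One classic technique for aligning them is dynamic time warping (DTW); this toy sketch uses made-up per-frame values rather than real acoustic features such as MFCCs, purely to show the idea.

```python
# A toy illustration of comparing two utterances of different lengths
# using dynamic time warping over per-frame features. The numbers are
# invented frame energies, not a real voiceprint format.

def dtw_distance(seq_a, seq_b):
    """Minimum cumulative frame-to-frame cost to align two sequences."""
    n, m = len(seq_a), len(seq_b)
    inf = float("inf")
    cost = [[inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(seq_a[i - 1] - seq_b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # skip a frame of A
                                 cost[i][j - 1],      # stretch a frame of A
                                 cost[i - 1][j - 1])  # match frames
    return cost[n][m]

enrolled  = [0.1, 0.5, 0.9, 0.4, 0.1]        # stored voiceprint frames
same_user = [0.1, 0.5, 0.5, 0.9, 0.4, 0.1]   # slower delivery, same shape
impostor  = [0.9, 0.1, 0.2, 0.9, 0.8]

same_score = dtw_distance(enrolled, same_user)
imp_score  = dtw_distance(enrolled, impostor)
```

Because DTW can stretch or compress either sequence, the genuine speaker's slower delivery still aligns perfectly, while the impostor's pattern does not.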
Analyst firm Gartner has identified biometric authentication as a key technology to watch in its 2012 'Hype Cycle for Emerging Technologies' report. This comes as no surprise: US research company MarketsandMarkets, in its 'Next Generation Biometric Technologies Market – Global Forecast & Analysis 2012-2017' report, predicts the total biometric technologies market will reach $13.89bn by 2017. The report explains that, although voice, signature, vein and DNA recognition are all in use, face, fingerprint and iris recognition are the most common in industries such as finance and government, and are gradually being deployed in defence, consumer electronics, healthcare, and home and commercial security.
The report also reveals why various sectors use biometric authentication: government applications cover voting, personal ID and building access; travel and immigration use the technology for border access control and for detection of explosives at airports; and the finance sector requires biometric authentication for account access and cashpoint security.
While technologies are evolving, some industry experts are taking the traditional biometric elements and refining them. Hand and fingerprint are currently the leaders, but IT companies Hitachi and Fujitsu have developed biometric systems specialising in vein authentication technology.
Hitachi has created 'SecuaVeinAttestor', a finger vein authentication system in which near-infrared rays generated by a set of light-emitting diodes penetrate the finger. The haemoglobin in the blood absorbs the rays, so the veins appear as dark areas in an image captured by a charge-coupled camera embedded in the device.
The image is processed to construct a finger vein pattern, which is then compressed, digitised and registered as the user's unique template. Fujitsu has developed a similar system, 'PalmSecure', which likewise captures the vein pattern using infrared rays; it can only recognise the pattern if haemoglobin is actively flowing through the veins.
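The registration-and-match flow the two vendors describe can be caricatured in a few lines: threshold the infrared capture into a binary vein pattern, store that as the template, and score later captures by pixel overlap. Everything below – the tiny 'image', the threshold and the scoring rule – is invented for illustration; real systems extract and align far richer pattern features.

```python
# A simplified sketch of vein-template matching: veins absorb
# infrared, so dark (low-valued) pixels become 1 in the template,
# and a probe capture is scored against it by overlap.

def to_pattern(image, threshold=80):
    """Binarise a greyscale capture: dark vein pixels become 1."""
    return [[1 if px < threshold else 0 for px in row] for row in image]

def overlap_score(template, probe):
    """Fraction of vein pixels in the template also present in probe."""
    matches = vein_pixels = 0
    for t_row, p_row in zip(template, probe):
        for t, p in zip(t_row, p_row):
            if t:
                vein_pixels += 1
                matches += (p == 1)
    return matches / vein_pixels if vein_pixels else 0.0

enrolled_capture = [[200, 60, 210, 220, 55, 200],
                    [199, 62, 205, 215, 58, 198],
                    [201, 61, 208, 218, 57, 202]]
template = to_pattern(enrolled_capture)

probe = to_pattern([[205, 58, 212, 219, 120, 197],   # one vein pixel lost
                    [198, 65, 204, 214,  60, 199],
                    [202, 59, 207, 250,  56, 203]])
score = overlap_score(template, probe)   # accept if above a set threshold
```

The liveness property the vendors emphasise comes before this step: without flowing haemoglobin there are no dark pixels to binarise in the first place.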
Other advances in biometrics include brain waves and heart rhythms; though both are still in their infancy, scientists and researchers claim these two vectors are impossible to imitate or alter. Heart rhythms are unique to each individual, although the patterns can change under stress or if an individual has a heart problem. "Patterns from the heart do possess uniqueness, but this is trickier as conditions such as exercise, emotional states can alter the heart patterns. Hence specific methods to normalise the changes will need to be developed," says Dr Palaniappan Ramaswamy, senior lecturer in engineering at the University of Wolverhampton.
He continues to explain why brain waves are less complicated to test: "It is easier to normalise brain patterns, especially as you can develop the necessary two state paradigms – one state as baseline measure and another the actual 'test scenario' and use the baseline information to normalise any variations."
Researchers at the University of Wolverhampton have investigated whether the brain's electroencephalogram (EEG) and the heart's electrocardiogram (ECG) signals could be used as biometric tools; in theory these could deter fraud, but the technology is not yet accurate enough and is therefore still classed as 'futuristic'.
How movies have ridden the biometric bandwagon
From reality to Hollywood, biometric technologies have been represented for decades in movies of all genres, ranging from science fiction films 'Terminator', 'X-Men' and 'Demolition Man' to action films such as 'The Dark Knight' and 'Mission Impossible'. Some films undoubtedly misrepresent biometric technology somewhat – for instance the gruesome eye-gouging scene in Marco Brambilla's 1993 'Demolition Man', where the character Simon Phoenix (Wesley Snipes) uses a gouged eye to gain access through a secure biometric-enabled door.
Other films are more accurate and do represent the advances in biometric authentication now occurring. Steven Spielberg's oft-cited 2002 film 'Minority Report' features extensive use of casual iris and retina scanning for both personal identification and point-of-sale applications. During the film, the character John Anderton (Tom Cruise) changes his identity by having an eye transplant, enabling him to access a security system.
'Gattaca' (1997) portrayed a society where DNA determined people's fate. The film showed a world in which humans were classified by their genetics: those genetically engineered to be superior were 'Valid' and led privileged lives, whereas those deemed inferior were 'Invalid' and endured restrictions. This was controlled by a biometric device that pricked the finger and sampled DNA from the blood droplet.
In 1995's 'Judge Dredd', the eponymous character carried a voice-activated smart gun which was biometrically synchronised. Similarly, in the latest James Bond outing, 'Skyfall', 007 used a smart gun linked to his biometric palm print. Today, smart guns are a reality: German manufacturer Armatix pairs a handgun with a watch, and many prototypes are being trialled. The New Jersey Institute of Technology has produced a prototype personalised gun which relies on biometric sensors in the grip and trigger that record the owner's hand size, strength and dynamic grip style.
Human odour is another biometric giveaway: depending on diet, environmental and cosmetic factors, each individual has a distinct scent. In 'Alien: Resurrection' (1997), directed by Jean-Pierre Jeunet, the character General Perez used a breath-recognition device for access control; it took him two tries to enter, and the device was later fooled with a spoofed spray.
Biometric vectors from head to toe (and beyond)
The idea that the patterns of our brainwaves are highly individual should probably come as no surprise. No two people think the same. There are two ways in which instruments could collect enough statistics to provide a biometric signature, although neither is likely to become mainstream. One is to collect EEG data from multiple points, which involves the user wearing a cap full of electrodes. The other is to use magnetic resonance imaging (MRI), for which the equipment is bulky and requires the subject to stay still for long periods. It is the kind of thing you can imagine being used to identify a senior military commander, but perhaps not to access a bank account.
Although the overall structure of the arteries and veins throughout the body is much the same, they grow subtly differently in each person. Hitachi patented a system in 2005 that uses a near-infrared LED to illuminate a finger, and which is used in some cash machines. The haemoglobin in the blood absorbs that frequency of light so that, for the most part, arteries and veins appear as a pattern of dark lines. The advantage over fingerprints is that the method needs an active bloodstream – it does not work for fake or severed digits. The technique has been expanded by Fujitsu to cover the entire palm rather than a single finger.
The iris is as individual as the fingerprint and changes little with time: it does not wear out in the way that a fingerprint can with age or manual work. An iris scan is relatively unintrusive; it does not need bright lighting and is one of the few techniques to make it into large-scale systems, primarily for checking users against their passports. It is, however, possible to fool iris scanners using contact lenses and even good photographs, although it is hard to imagine someone getting away with the latter in a closely monitored immigration hall.
The fingerprint scan is the most familiar of biometrics, thanks to numerous crime shows on TV, but electronic scanner design makes it relatively straightforward, if finicky, to fool one. Researchers have used gelatine to make effective false fingertips. Age and wear and tear also take their toll on fingerprints, making it hard to identify some people as they get older; and some people just don't have good fingerprints to start with.
Your face should be a good biometric, but in practice error rates are high for 2D face recognition systems. This is a problem when you consider that images stored on passports are in 2D. Moving to 3D scanning improves accuracy, but it could take years for that technology to move to ID cards and passports.
Hand geometry got off to a flying start in trials of border-control systems based on biometrics. It is surprisingly robust when it comes to identifying people, but it is possible to fool such systems with casts of hands in the right materials.
People tend to produce different odours – some more noticeable than others. Electronic noses are under development that can pick up the differences, but this is one biometric that is arguably not likely to prove acceptable to the general public.
The pinnae – the ridges of cartilage and skin that surround your ears – have distinctive shapes that can be used to identify you. However, over-ambitious use of ear prints in some court cases has called into question the usefulness of the ear as a realistic means of identification.
Dental records have proved to be one of the most useful and reliable means of identifying someone when there is very little left to go on. However, regular dental work means keeping the records up to date, which is an issue, and it is far from practical as an automated biometric.
The way you talk could be one way that a computer identifies you. It's unlikely that lip movement will be used on its own but could be used with speech recognition to make a more robust biometric. It has the advantage of being relatively cheap to implement as all you need is a webcam and microphone.
The way you walk can act as a long-distance biometric, letting CCTV systems work out who you are; that makes it a popular choice among law enforcement agencies, but less so among civil liberties campaigners. Recent work using the shadows cast by clothing has improved the accuracy of limb tracking.
In much the same way that EEG signals can reveal information about your brain, so can the shape of your heartbeat. In principle, this technique would be easier to use than EEG or MRI scanning as it is possible to detect the signals through a fingertip sensor. However, some doubt this technique's ability to provide a unique ID for users – it might instead be used in combination with fingerprint or vein scanning to provide a more useful combination biometric vector.
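One common way to build such a combination biometric is score-level fusion: each modality reports a match score in [0, 1], and a weighted sum drives a single accept/reject decision. The weights, scores and threshold below are purely illustrative.

```python
# A sketch of score-level fusion: a weak modality (heartbeat) is
# backed up by stronger ones so that agreement across all three
# lifts the fused score over the decision threshold.

def fuse(scores, weights):
    """Weighted average of per-modality match scores."""
    total = sum(weights.values())
    return sum(scores[m] * weights[m] for m in scores) / total

weights = {"heartbeat": 0.3, "fingerprint": 0.4, "vein": 0.3}

# Heartbeat alone scores poorly (stress shifts the pattern), but the
# other modalities agree, so the combined decision still passes.
scores = {"heartbeat": 0.55, "fingerprint": 0.92, "vein": 0.88}
fused = fuse(scores, weights)
accepted = fused > 0.7
```

Weighting lets a deployment lean on its most reliable sensor while still getting some benefit from the weaker ones.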
If you've been to the optician recently, you've almost certainly experienced the discomfort of having your retina examined – and spent several minutes waiting for the green haze from the bright light used to illuminate the retina to fade. Biometrically, retina scanning is a good performer, but it is never going to be a popular choice among users. As a result, it is more likely to remain restricted to science fiction flicks.
Keyboard and mouse
Some researchers believe the way that people type on a keyboard and use a mouse can tell you who they are. It cannot be used on its own but, combined with a password, it could prove more secure than other non-biometric techniques. Needing only a keyboard or mouse, the technique is cheap to implement, but more work is needed to build effective ways of classifying users. One related technique is eye movement, which captures the personal idiosyncrasies in the way users look around a screen – movement that could be captured by a standard webcam.
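A keystroke-dynamics verifier can be as simple as comparing inter-key timings against an enrolled profile – which also shows why, as SecurEnvoy warns, a change of mood or keyboard can trip it up: shifted timings push the deviation past the tolerance. All timings and the tolerance below are invented for illustration.

```python
# A minimal sketch of keystroke-dynamics verification: enrol a user
# from several typing samples of the same passphrase (inter-key
# intervals in milliseconds), then score a new attempt by its mean
# absolute deviation from the enrolled profile.

def enrol(samples):
    """Average each inter-key interval across enrolment samples."""
    return [sum(col) / len(col) for col in zip(*samples)]

def deviation(profile, attempt):
    """Mean absolute difference from the enrolled profile, in ms."""
    return sum(abs(p - a) for p, a in zip(profile, attempt)) / len(profile)

enrolment = [[120, 95, 180, 110],   # three samples of the same phrase
             [125, 90, 175, 115],
             [118, 98, 182, 108]]
profile = enrol(enrolment)

genuine  = [122, 94, 178, 112]      # close to the enrolled rhythm
impostor = [200, 150, 90, 230]      # same phrase, different rhythm

accept_genuine  = deviation(profile, genuine)  < 15   # ms tolerance
accept_impostor = deviation(profile, impostor) < 15
```

Real systems add more features (key hold times, pressure, mouse trajectories) and retrain the profile over time to follow gradual drift in a user's typing.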
Voice recognition, another staple of sci-fi films, could be the biometric you use most of all. It has the advantage of being one of the few biometrics that can be used on the move, earmarking it as a technique for automated telephone banking. Experts reckon the technique is not secure enough to do everything, so what you can do based on just what you say to the computer will be limited to smaller payments and balance checks.
In the sci-fi film 'Gattaca', the characters have to live with a daily routine of pinpricks as the system does a quick bit of chemistry on their DNA. Using more or less the same techniques as forensic fingerprinting, DNA-based biometric systems may appear in the future for high-security systems. But they are more likely to use a little bit of spit. Even so, it's unlikely to be a popular choice for taking money out of an ATM.
By Chris Edwards
A brief history of biometric authentication
There are examples of biometric authentication dating back as far as prehistoric times, when caves were decorated with pictures 'signed' with the fingerprint stamps of their authors. Fingerprint stamps were also used in Babylon around 500 BC to record business transactions on clay tablets. Writing in the 16th century, Portuguese explorer João de Barros described the practice of Chinese merchants stamping children's palm prints and footprints to distinguish one from another.

In the 19th century Alphonse Bertillon, a police record clerk in Paris, invented anthropometry – the use of body measurements to identify criminals. The system recorded body measurements and distinctive bodily marks, such as warts and tattoos; it was later adopted by both American and British police forces and was soon dubbed 'Bertillonage'. In 1892, British scientist Sir Francis Galton published a classification system for fingerprints, having established that no two individuals share the same prints; his system is still largely in use today.

In the 20th century, ophthalmologist Frank Burch proposed the concept of using iris patterns to identify an individual; in 1987 two ophthalmologists, Aran Safir and Leonard Flom, patented the idea, and two years later working algorithms were created. Face recognition and speaker recognition were also developed and heavily adopted by the Federal Bureau of Investigation. In 1980, the term 'biometrics' – methods of automated human identification – was coined.