Eye-tracking tech: Countdown to lift-off?

Can eye-tracking technology move beyond niche applications and into the mainstream – and would you want it to?

What if you could pause your TV with a glance? Or scroll through a PDF on your computer just by moving your eyes down the screen? Or even activate controls in your car by briefly staring at them – while keeping your hands on the wheel?

Eye tracking has been around for decades, used by researchers studying the physiology and psychology of human vision. But with rapidly falling costs and recent technological advances, the tech is finally moving beyond passive analysis and into active interaction. People with disabilities use it to move the cursor on a computer screen, and trials are under way to let them control prosthetic limbs with their gaze alone. Eye-tracking video gaming is starting to take off, and many firms are beginning to use state-of-the-art headsets with integrated eye tracking to understand how their customers react to adverts and shop displays.

'The fact that eye tracking has become so cheap to make means anyone can now integrate it into essentially anything – there will be new applications I can't even think of,' says Andrew Schall, principal researcher at user experience (UX) firm Key Lime Interactive. 'It's finally getting to the point where it's possible to track people wherever they are.' While this will bring obvious benefits, it is also likely to raise privacy issues like never before.

It all goes back to the 18th century, when scientists first began to investigate eye movements. Early methods were primitive, often involving devices directly attached to the eye. The 20th century saw the development of less invasive solutions, and in 1947 Paul Fitts, a psychologist at Ohio State University, carried out the first video-based eye-tracking study.

In the following decades this approach flourished, with a host of techniques that relied on recording reflections from light bouncing off the eye. Specifically, academics looked at how tracking the eye's endless sequence of fixations (pauses on a specific area) and saccades (rapid movements between fixations) could help them better understand the visual system and even spot neurological disorders; once computers became widely available, researchers could even compute and analyse subtle differences in gaze patterns that a human observer would never notice.
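
To make the idea concrete, here is a minimal sketch (in Python) of dispersion-based fixation detection, one classic way of separating fixations from saccades in raw gaze data. The thresholds are illustrative values, not those of any particular system.

    # Minimal sketch of dispersion-threshold fixation detection: gaze
    # samples that stay within a small spatial window for long enough
    # count as a fixation; the jumps between windows are saccades.
    # Thresholds below are illustrative, not from any real system.

    def detect_fixations(samples, max_dispersion=30.0, min_duration=0.1):
        """samples: list of (timestamp_s, x_px, y_px) gaze points.
        Returns (start_s, end_s, centroid_x, centroid_y) per fixation."""
        fixations, window = [], []

        def flush(win):
            # Record the window as a fixation if it lasted long enough.
            if win and win[-1][0] - win[0][0] >= min_duration:
                cx = sum(x for _, x, _ in win) / len(win)
                cy = sum(y for _, _, y in win) / len(win)
                fixations.append((win[0][0], win[-1][0], cx, cy))

        for sample in samples:
            window.append(sample)
            xs = [x for _, x, _ in window]
            ys = [y for _, _, y in window]
            # Dispersion: how far the gaze wanders within the window.
            if (max(xs) - min(xs)) + (max(ys) - min(ys)) > max_dispersion:
                flush(window[:-1])   # the samples before the jump
                window = [sample]    # the jump itself was a saccade
        flush(window)                # any fixation open at the end
        return fixations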

Gaze for video games

Today, though, the technology is no longer limited to analysis; already, it has found its niche as an active alternative input for people with disabilities. South Korean consumer electronics giant Samsung has recently revealed a new eye-tracking gadget dubbed EyeCan+ that replaces a mouse and keyboard. Shaped like a portable box, the device allows disabled people to write documents or browse the web using eye movements alone.

Other companies, such as Swedish eye-tracking expert Tobii and Germany's SensoMotoric Instruments (SMI), have also integrated their technology into assistive devices. Tobii makes eye-controlled tablets and speech generators, and also produces a PC-compatible eye tracker that controls an onscreen cursor. The device is mounted on the monitor, where it tracks the user's eye movements by shining a light onto the eye, with high-resolution cameras recording reflections from the cornea and pupil. These are analysed by image-processing algorithms to determine the subject's gaze position, and the data is used as an input for the cursor.
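
As a rough illustration of that final step, the sketch below shows how a stream of gaze estimates might be smoothed into cursor positions. The gaze_stream input stands in for whatever a tracker's SDK delivers and is an assumption, not Tobii's actual API.

    # Hypothetical sketch of the last stage of the pipeline: raw gaze
    # estimates are noisy, so they are typically smoothed before being
    # used to drive a cursor. gaze_stream is a stand-in for the SDK.

    def cursor_positions(gaze_stream, alpha=0.2):
        """Exponentially smooth (x, y) gaze samples into cursor moves."""
        cx = cy = None
        for raw_x, raw_y in gaze_stream:
            if cx is None:
                cx, cy = raw_x, raw_y   # first sample: jump straight there
            else:
                # Blend each new estimate with the previous position to
                # suppress jitter, at the cost of a little lag.
                cx = alpha * raw_x + (1 - alpha) * cx
                cy = alpha * raw_y + (1 - alpha) * cy
            yield cx, cy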

Just five years ago, the cost of such a device was in the thousands of dollars, but today, basic eye trackers are much cheaper. For instance, Tobii's EyeX developer kit is $139, while Danish firm The Eye Tribe has released the first eye tracker for under $100. Plummeting costs have prompted eye tracking to spill into the commercial world. 'We think the technology is ready for the first early niches of consumer markets,' says Oscar Werner, president of Tobii's Tech business unit.

The company is now targeting the gaming industry – and in January, it signed a deal with gaming equipment provider SteelSeries to create the Sentry Eye Tracker, which links to a PC via USB to enable gaze control of video games. The number of compatible games is still limited, but one major coup was the recent release of Assassin's Creed Rogue, a video game by software developer Ubisoft, which allows users to control the third-person camera through the Sentry Eye Tracker. The device can also be used by professional gamers to analyse their eye movements while playing DOTA 2 and StarCraft 2, helping them improve their technique or giving fans unprecedented insight into how they play.

The Eye Tribe is confident that eye tracking for gaming on mobile gadgets is next. Last year, it demonstrated how its low-cost eye tracker could be used to play the popular mobile game Fruit Ninja, slicing the fruit with a mere glance, and at the Consumer Electronics Show (CES) in Las Vegas this January the company showed how the same device could be used to build on-screen Lego with your eyes.

While eye tracking for computer games is still in its infancy, the potential is huge. The input could replace the mouse or joystick to enable gaze-controlled aiming in first-person shooters, or act as a complementary input to create an extra level of control. It could also allow more realistic interaction with characters by integrating eye contact as a gameplay element.

One of the major applications for the technology could be virtual reality headsets. Oculus Rift and Sony's Project Morpheus have dabbled with eye tracking, albeit without concrete results so far. SMI, however, has released an upgrade that integrates eye tracking into the Oculus Rift DK2. Meanwhile, American start-up FOVE plans to ship developer kits of the first VR headset with integrated eye tracking by early 2016 – and in June, Samsung gave the company even more visibility by announcing plans to invest in it. As well as offering the same functionality as PC or console eye tracking, the technology can also reduce the VR-induced motion sickness that plagues such devices.

FOVE's chief executive Yuka Kojima says eye tracking can also optimise graphics by diverting processor power to the area a user is focusing on. 'This reduction of rendering power by up to a sixth means a full virtual reality experience could be run from a smartphone in the near future,' she says. Integrating eye tracking into VR headsets will be the first step towards applying the technology to augmented reality, where virtual elements are overlaid onto the real world. SMI has already released a prototype upgrade for the Google Glass AR headset, which makes it possible to select menu options, navigate picture galleries, scroll text and browse maps hands-free.
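
The technique Kojima describes is usually called foveated rendering: each region of the screen is rendered at a quality that falls off with its angular distance from the gaze point. The sketch below illustrates the core decision; the eccentricity bands and tile scheme are illustrative assumptions, not FOVE's actual implementation.

    # Illustrative sketch of foveated rendering's core decision: full
    # detail only where the eye is pointed, coarser further out. The
    # eccentricity bands below are assumptions, not FOVE's numbers.

    import math

    def render_scale(tile_center, gaze_point, pixels_per_degree=40.0):
        """Return a resolution scale (1.0 = full) for a screen tile."""
        dx = tile_center[0] - gaze_point[0]
        dy = tile_center[1] - gaze_point[1]
        # Angular distance of the tile from the point of gaze.
        eccentricity = math.hypot(dx, dy) / pixels_per_degree
        if eccentricity < 5:    # fovea: render at full resolution
            return 1.0
        if eccentricity < 15:   # near periphery: half resolution
            return 0.5
        return 0.25             # far periphery: quarter resolution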

Gaming is likely to drive the technology's breakthrough into the wider consumer electronics market. Werner is cautiously optimistic. 'I think it's gaining momentum. It's just that bringing new technology into large markets involves a lot of obstacles,' he says, adding that the main one relates to creating a solid and intuitive user interface.

In an attempt to drive adoption, The Eye Tribe has focused on making a product from low-cost standard components that can be integrated easily and cheaply into the next generation of mobile devices, rather than the higher-quality custom-made components used by companies such as Tobii and SMI. The Eye Tribe's chief executive Sune Alstrup says that while the details are confidential, consumer devices with the firm's tech should start to appear from 2016.

Real-world applications

One drawback eye tracking needs to overcome, though, is the so-called Midas Touch problem. Named after King Midas, who was cursed so that everything he touched turned to gold, the problem describes the fact that gaze is always 'on', making it hard to differentiate between deliberate interaction and eye movements aimed simply at perception. While eye tracking is good at indicating interest, it struggles to discern intention. As a result, eye-tracking interaction solutions are often combined with touchscreens or voice controls to activate commands.
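
Another widely used workaround, dwell-time selection, can be sketched in a few lines: a glance only becomes a command once the gaze has rested on the same target for a set period. The 800ms threshold below is an illustrative assumption, not a standard value.

    # Sketch of dwell-time selection, a common answer to the Midas
    # Touch problem: looking around does nothing, but holding the gaze
    # on one target long enough fires it once. Threshold is illustrative.

    import time

    class DwellSelector:
        def __init__(self, dwell_seconds=0.8):
            self.dwell_seconds = dwell_seconds
            self.target = None
            self.since = 0.0
            self.fired = False

        def update(self, target):
            """Call every frame with the element under the gaze (or None).
            Returns that element once it has been stared at long enough."""
            now = time.monotonic()
            if target != self.target:
                # Gaze moved on: restart the timer, so casual scanning
                # of the screen never triggers anything.
                self.target, self.since, self.fired = target, now, False
                return None
            if (target is not None and not self.fired
                    and now - self.since >= self.dwell_seconds):
                self.fired = True   # fire once, until the gaze moves away
                return target
            return None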

To solve the problem, the industry needs to properly define the use cases, thinks Eberhard Schmidt, managing director of SMI. 'As soon as you use it for more complicated interactions like pointing or grabbing, things you normally do with the hand, it becomes more complicated,' he says.

Soon, though, eye tracking could move from computer monitors into the real world. A group at the University of Bradford, led by engineer Prashant Pillai, has integrated eye-tracking cameras into an electric wheelchair to control the direction of travel. Still just a prototype, it can be rigged up to a gaze-controlled tablet that governs a home automation system, letting users control things like lights and thermostats. The team plans to commercialise the system, but still needs to fine-tune it for real-world situations with dynamic lighting. 'These commercial systems work well because they're normally indoors with a single light source,' explains Pillai. 'Using this for something like wheelchairs where you've got to constantly adapt to the environment is quite challenging.'

Venturing further into the realm of science fiction, a group from the Johns Hopkins University Applied Physics Laboratory (APL) in Baltimore is using eye tracking combined with electrodes placed on the brain to control a robotic arm that could one day become a prosthetic. The system uses Microsoft's Kinect infrared motion sensor to capture video and detect objects in front of the user. The video is shown on a monitor fitted with an eye tracker, letting the user select objects by fixing their gaze on them. The electrodes then detect the intention to reach for an object, triggering the arm to grasp it and drop it into a container. Users can control the arm using brain signals alone, though Brock Wester, supervisor of APL's Applied Neuroscience Section, says that combining them with eye tracking increases operation speeds considerably. 'Their high-level intent gets converted into this train of robotic motions they don't have to worry about,' he says.

Emotions and privacy

As the limitations of using eye tracking for physical interaction are overcome, the real future of the technology lies in combining it with other biometric measurements, says Peter Hartzbech, head of biometric research company iMotions. Founded in 2005, the Danish firm was an early pioneer of eye-tracking software, but four years ago it decided to provide a one-stop solution that combines eye-tracking data with facial expression analysis, skin conductivity, heart rate and brain activity in real time to measure emotional response to stimuli. Being able to match emotional responses with gaze-tracking data could help machines understand us. Hartzbech says the hardware still requires work, but he believes these kinds of systems could be integrated into cars in the next five years.

Helping machines to empathise with humans is also a long-term goal for SMI's Schmidt. 'I think the next big thing will be using user behaviour and perception to modify content or experience 'on the fly' to better fit the intentions and capability of the user,' he says. Real-time analysis of behaviour and perception would enable the technology to personalise user experiences to match their preferences, skill level and focus in applications from watching movies to driving a car, as well as allowing future service robots to interpret our needs.

The growing number of sensors this will require is likely to raise questions over privacy. Google has recently patented so-called 'pay-per-gaze' technology for online advertising, and some companies are beginning to experiment with eye-tracking solutions to study how people react to things like shop displays. But with smartphones already exposing everything from our location to the content of our emails, Key Lime's Schall says we need to ask how much of an issue eye tracking will really add, especially when weighed against the genuinely private information we already give away.

Combining the technology with emotional analysis could muddy the waters further, says Schall, but if it proves useful, many people could actually turn a blind eye. 'It's going to allow computers to be a lot smarter about humans who aren't always very predictable or rational,' he says. 'The fact that it can be anywhere and everywhere is really how you get to a true state where computers are operating in the background and kind of anticipating your needs.'

While such lofty ambitions are still some way off realisation, eye tracking is on an upward trajectory, with the potential applications apparently limited only by the imagination of developers.

'I remember the first time I navigated Google Maps via touchscreen very clearly and it was a magical experience,' says The Eye Tribe's Alstrup. 'We believe that eye tracking belongs to everyone, should get out to everyone, just like the touchscreen has got out to everyone and made our lives easier.'
