Sensors on the faceplate of an Oculus Rift headset read the wearer's emotions

Emotion tracking sensors add feelings to virtual reality

A new technology for virtual reality replaces external cameras with sensors attached to a headset to better track emotions. 

The system, developed by Brighton-based start-up Emteq, consists of miniaturised sensors that read the electrical activity of the wearer's facial muscles as well as their heart rate through a virtual reality headset.

Artificial intelligence algorithms translate these signals immediately into emotions, which are then expressed by the digital avatar representing the user in the virtual reality environment. In the past, similar results could only be achieved with complex external cameras.
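Emteq has not published how its model works, so as a purely illustrative sketch, the mapping from sensor readings to emotion labels might look something like the hand-written rules below; the feature names, thresholds and labels are all invented stand-ins for the company's actual machine-learning system.

```python
# Illustrative only: Emteq's real model is unpublished. Feature names,
# thresholds and emotion labels here are invented for demonstration.

def classify_emotion(cheek_emg: float, brow_emg: float, heart_rate: float) -> str:
    """Map simplified facial-EMG amplitudes (normalised 0-1) and heart
    rate (bpm) to a coarse emotion label using hand-written rules, as a
    stand-in for a trained classifier."""
    if cheek_emg > 0.6 and brow_emg < 0.3:
        return "happy"        # strong cheek (smile) activity, relaxed brow
    if brow_emg > 0.6 and heart_rate > 90:
        return "stressed"     # furrowed brow plus elevated heart rate
    if cheek_emg < 0.2 and brow_emg < 0.2 and heart_rate < 65:
        return "tired"        # little muscle activity, low heart rate
    return "neutral"
```

A game could poll such a function every frame and drive the avatar's expression from the returned label; a production system would replace the rules with a model trained on labelled sensor data.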

“Our machine learning combined with facial sensor technologies represents a significant leap forward in functionality and form for developers looking to introduce human expression and mood into a digital environment,” said Graeme Cox, Emteq's CEO and co-founder.

“Imagine playing an immersive role playing game where you need to emotionally engage with a character to progress. With our technology, that is possible – it literally enables your computer to know when you are having a bad day or when you are tired.”

Dubbed Faceteq, the technology, which is built directly into a VR headset such as the Oculus Rift, reads a wider range of parameters than camera-based systems, including eye movements and the activity of cheek and forehead muscles.

In addition to game developers, the technology could be of interest to marketers who wish to accurately assess the effect of video advertisements and films on audiences.

The company is currently developing an application that would allow the technology to be used for the rehabilitation of patients who have lost control over their facial muscles due to conditions such as stroke or facial palsy.

“People with facial paralysis or stroke can have a very limited awareness or control of any abnormal facial movements they may have,” explained Charles Nduka, Chief Scientific Officer of Emteq, who is also a plastic surgeon at the Queen Victoria Hospital in West Sussex.

“Without proper feedback, their condition may worsen and lead to permanently abnormal movements. By wearing a pair of glasses which provide real-time muscle feedback, patients would be able to practice their exercises without having to stare at themselves in the mirror regularly.”

In cooperation with Nottingham Trent University, Emteq is developing an eyewear-based device fitted with the same technology as the virtual reality headset that would create personalised rehabilitation programmes.

The system, controllable via a smartphone app, will provide a schedule of routines and give live feedback, along with data on muscle tone, the number of repetitions, weekly progress and historical records.
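Since the app is still in development, the data it tracks can only be guessed at; as a minimal sketch, the session fields and weekly summary below are assumptions based solely on the features listed above.

```python
from dataclasses import dataclass, field
from statistics import mean

# Hypothetical sketch: Emteq's rehabilitation app is unreleased, so the
# session fields and summary figures below are illustrative assumptions.

@dataclass
class ExerciseSession:
    exercise: str            # e.g. "cheek raise" (invented example name)
    repetitions: int
    mean_muscle_tone: float  # normalised 0-1 EMG amplitude for the session

@dataclass
class RehabLog:
    sessions: list = field(default_factory=list)

    def record(self, session: ExerciseSession) -> None:
        """Store one completed exercise session."""
        self.sessions.append(session)

    def weekly_summary(self) -> dict:
        """Aggregate stored sessions into the kind of progress figures
        the article describes: repetitions and muscle tone."""
        return {
            "total_repetitions": sum(s.repetitions for s in self.sessions),
            "average_muscle_tone": mean(s.mean_muscle_tone for s in self.sessions),
        }
```

For example, recording two sessions of ten and twenty repetitions would yield a weekly total of thirty, with muscle tone averaged across both; a real app would persist these records to build the historical view mentioned above.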
