Smart necklace tracks detailed facial expressions
Researchers in the US have developed a smart necklace that can continuously track full facial expressions without the constraint of a frontal camera.
NeckFace, one of the first necklace-type wearable sensing technologies, was developed by researchers at Cornell University. The team, led by Cheng Zhang, assistant professor of information science in the Cornell Bowers College of Computing and Information Science, said the technology can continuously track full facial expressions by using infrared cameras to capture images of the chin and face from beneath the neck.
NeckFace is the next generation of Zhang’s previous work, which resulted in C-Face, a similar device but in a headset format. According to Zhang, NeckFace provides significant improvement in performance and privacy, and gives the wearer the option of a less-obtrusive neck-mounted device.
Besides potential emotion-tracking, Zhang sees many applications for this technology. These include virtual conferencing when a front-facing camera is not an option; facial expression detection in virtual reality scenarios; and silent speech recognition.
“The goal is having the user be able to track their own behaviours, through continuous tracking of facial movements,” Zhang explained. “And this hopefully can tell us a lot of information about your physical activity and mental activities.”
To test the effectiveness of NeckFace, Zhang and his collaborators conducted a user study with 13 participants, each of whom was asked to perform eight facial expressions while sitting and eight more while walking. In the sitting scenarios, the participants were also asked to rotate the head while performing the facial expressions and remove and remount the device in one session.
NeckFace was tested in two designs: a neckband, draped around the back of the neck with twin cameras just below collarbone level; and a necklace, with a pendant-like infrared (IR) camera device hanging below the neck.
The group collected baseline facial movement data using the TrueDepth 3D camera on an iPhone X, then compared it with the data collected by NeckFace. Across the sitting, walking and remounting sessions, study participants expressed 52 facial shapes in total.
Using a deep learning model, the team found that NeckFace detected facial movement with nearly the same accuracy as direct measurement with the phone camera. The neckband proved more accurate than the necklace, the researchers said, possibly because its two cameras could capture information from both sides of the face, whereas the necklace relied on a single centre-mounted camera.
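The comparison described here, a predicted facial shape measured against the TrueDepth baseline, can be sketched as a simple per-landmark error metric. This is an illustrative assumption only: the function name and the (x, y, z) landmark representation are hypothetical, and the researchers' actual evaluation metric is not detailed in this article.

```python
import math

def mean_landmark_error(predicted, baseline):
    """Mean Euclidean distance between predicted and baseline 3D landmarks.

    Each argument is a sequence of frames; each frame is a sequence of
    (x, y, z) landmark points. Lower values mean closer agreement with
    the baseline capture.
    """
    total, count = 0.0, 0
    for pred_frame, base_frame in zip(predicted, baseline):
        for pred_point, base_point in zip(pred_frame, base_frame):
            # Distance between the predicted and reference landmark.
            total += math.dist(pred_point, base_point)
            count += 1
    return total / count

# Toy usage: one frame with two landmarks, one of which is off by 1 unit.
error = mean_landmark_error(
    [[(0, 0, 0), (1, 0, 0)]],
    [[(0, 0, 0), (0, 0, 0)]],
)
print(error)  # 0.5
```

A metric of this shape lets two sensor designs (here, neckband versus necklace) be ranked on the same footing, since both are scored against the same phone-camera baseline.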
Zhang said the device, when optimised, could be useful in the mental health realm, for tracking people’s emotions over the course of a day. While people don’t always wear their emotions on their face, he said, the amount of facial expression change over time could show emotional swings.
“Can we actually see how your emotion varies throughout a day?” he said. “With this technology, we could have a database on how you’re doing physically and mentally throughout the day, and that means you could track your own behaviours. And also, a doctor could use the information to support a decision.”