Face becomes speed control

A computer science postgraduate has turned his face into a remote control for video as part of a project to help develop robots that can understand human facial expressions better.

Jacob Whitehill, a computer science postgraduate student from UC San Diego’s Jacobs School of Engineering, is leading the project. In a recent pilot study, Whitehill and colleagues demonstrated that the facial expressions people make while watching recorded video lectures can be used to predict both the speed at which a person prefers to watch the video and how difficult they find the lecture at each moment in time.

This new work is at the intersection of facial expression recognition research and automated tutoring systems.

“If I am a student dealing with a robot teacher and I am completely puzzled, and yet the robot keeps presenting new material, then that’s not going to be very useful to me. If, instead, the robot stops and says, ‘Oh, maybe you’re confused,’ and I say, ‘Yes, thank you for stopping,’ that’s really good,” said Whitehill.

In the pilot study, the facial movements people made when they found the lecture difficult varied widely from person to person. Most of the eight test subjects, however, blinked less frequently during difficult parts of the lecture than during easier portions, a pattern consistent with findings in psychology.

One of the next steps for the project is to determine what facial movements a person naturally makes when exposed to difficult or easy lecture material. From there, Whitehill reckons he could train a user-specific model that predicts when a lecture should be sped up or slowed down, based on the spontaneous facial expressions the person makes.
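The article does not describe the model itself, but the idea of a user-specific predictor could be sketched as a simple regression from an observed facial cue to a preferred playback speed. The sketch below is purely illustrative: the feature (blink rate, motivated by the study's blink finding), the toy data, and the one-variable least-squares fit are all assumptions, not the researchers' actual pipeline.

```python
# Illustrative sketch: fit a per-user linear model mapping a facial cue
# (here, blinks per minute) to a preferred playback speed multiplier.
# Feature choice and data are hypothetical, not from the study.

def fit_speed_model(samples):
    """Ordinary least squares for speed = a * blink_rate + b,
    given (blink_rate, preferred_speed) pairs."""
    n = len(samples)
    mean_x = sum(x for x, _ in samples) / n
    mean_y = sum(y for _, y in samples) / n
    var_x = sum((x - mean_x) ** 2 for x, _ in samples)
    cov_xy = sum((x - mean_x) * (y - mean_y) for x, y in samples)
    a = cov_xy / var_x
    b = mean_y - a * mean_x
    return a, b

def predict_speed(model, blink_rate):
    a, b = model
    return a * blink_rate + b

# Toy training data: fewer blinks (harder material) -> slower playback.
data = [(5, 0.8), (10, 1.0), (15, 1.2), (20, 1.4)]
model = fit_speed_model(data)
print(round(predict_speed(model, 12), 2))  # -> 1.08
```

In a real system the input would be a richer set of automatically detected facial action units rather than a single hand-picked cue, and the model would be retrained per user as the article suggests.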

To collect examples of the kinds of facial expressions involved in teaching and learning, Whitehill taught a group of people in his lab about German grammar and recorded the sessions using video conferencing software.

“I wanted to see the kinds of cues that students and teachers use to try to modulate or enrich the instruction. To me, it’s about understanding and optimising interactions between students and teachers,” said Whitehill.

Whitehill said if someone is nodding, “that suggests to me that you’re understanding, that I can keep going with what I am saying. If you give me a puzzled look, I might back up for a second.”
