Geminoid-F on display

How human should a humanoid robot look?

Sci-fi films such as 'Terminator' and 'A.I.' presume robots should look and act like humans, but the natural human tendency to be unnerved by the not-quite-human means robot designers will have to think again.

The Hong Kong shopping public were recently the subject of what was, on the surface, a fairly standard (if slightly brutal) media market-research field study. What, they were asked, did they think of this up-and-coming celebrity? What did they think of her aesthetic appeal, likeability and the quality of her feedback to their questions?

The difference was that in this case the celebrity was a humanoid robot.

The designer of this experiment was roboticist Hiroshi Ishiguro, but the true star of this show was his humanoid robot Geminoid-F, an exceptionally humanlike prototype robot modelled on a young Japanese woman. The field study was conducted in an effort to discover how the public's reaction could influence humanoid robot design, and in turn allow these types of robots to be accepted into mainstream society.

During the test, Geminoid-F conversed, laughed with and sang to shoppers. The robot uses an air servosystem developed by Japanese company Kokoro Co to move her upper body, although she is stationary from the waist down. She interacted with shoppers using a language system influenced by Alan Turing's 1950 Turing Test for 'intelligent' machine behaviour, and a movement-tracking camera from a Microsoft Kinect. To enable her to react appropriately to humans, Geminoid-F's emotional state was programmed electronically via 65 different human behavioural scenarios, including surprise, embarrassment and happiness.
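The description of 65 pre-programmed behavioural scenarios suggests a table-driven control design. A minimal sketch in Python follows; the scenario names and mapping are assumptions for illustration, not Kokoro's or Ishiguro's actual implementation.

```python
# Hypothetical scenario table: detected stimulus -> pre-programmed behaviour.
# The real system reportedly holds 65 such scenarios; three are shown here.
SCENARIOS = {
    "sudden_movement": "surprise",
    "personal_compliment": "embarrassment",
    "friendly_greeting": "happiness",
}

def react(stimulus):
    """Select a behavioural scenario, falling back to a neutral idle state."""
    return SCENARIOS.get(stimulus, "neutral")

print(react("friendly_greeting"))
```

The fallback to a neutral state matters in practice: a robot in a shopping mall constantly encounters stimuli its table does not cover.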

The language system used by all of Ishiguro's robots is a web application called Cleverbot, a publicly available $0.99 artificial-intelligence app that learns from humans as it is exposed to human language. Cleverbot searches a database of past responses from humans to questions on any subject and formulates its own text-to-speech answer based on these responses.
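This retrieval-based approach can be sketched in a few lines of Python: match the user's question against a database of past exchanges and reuse the reply attached to the best match. The corpus and the word-overlap scoring below are hypothetical stand-ins, not Cleverbot's actual code.

```python
def _tokens(text):
    """Lower-case bag of words for crude similarity scoring."""
    return set(text.lower().split())

def best_response(question, corpus):
    """Return the stored reply whose prompt best overlaps the question
    (Jaccard similarity on word sets)."""
    q_words = _tokens(question)

    def score(pair):
        prompt_words = _tokens(pair[0])
        union = q_words | prompt_words
        return len(q_words & prompt_words) / len(union) if union else 0.0

    return max(corpus, key=score)[1]

# Hypothetical database of past human exchanges
corpus = [
    ("what is your name", "My name is Geminoid-F."),
    ("can you sing", "Yes, I like to sing."),
    ("where are you from", "I was built in Japan."),
]

print(best_response("what is your name", corpus))
```

A production system would use a far larger corpus and more sophisticated matching, but the principle is the same: the robot never composes an answer from scratch, it recycles what humans have already said.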

Ishiguro and his team deemed the shopping-mall experiment a success; participants accepted the geminoid in social interactions and on the whole participants were surprisingly receptive towards Geminoid-F. "The funny thing was that people thought of her as their friend, they built up a relationship with her. They wanted to believe she was real. Or they wanted to buy her."

Conversation stopper

Currently, Ishiguro's humanoid robots, along with most of the robots that populate the sparse humanoid market, are unable to hold a conversation with members of the public to the extent that they can answer questions independently via voice recognition. Instead, questions are posed via external interfaces such as tablet computers, which the internal computer processes and answers using Cleverbot software.

Ishiguro predicts that conversation via voice recognition will become possible, but says the limitations of human-to-robot interaction are still obvious. He recounts a scene between a robot and a human during a field test in which a humanoid robot conducted a conversation with a human counterpart. The conversation is stilted but relevant until a question is posed regarding the robot's favourite food, which it is unable to answer as its systems possess no prior knowledge of taste or smell.

Ishiguro sees cloud computing as a solution to this problem: a direct wireless link between a robot's electronic memory and databases such as those of Wikipedia and Google may become a viable way to provide a robot with knowledge of which it has no direct experience.

"Now we have a powerful cloud-computing system which allows intelligent engagement. If people can react to realistic scenes with a robot system then we can gather meaningful data; it will allow us to make conversation between a human being and an intelligent agent autonomous."

Aping the apes

Geminoid-F is currently among the most advanced humanoid robots on the market, but her lack of lower motor skills highlights the progress humanoid robotics still has to make. Due to the complexity of developing actuators for multi-limbed robots, the majority of android and humanoid robots still have a limited range of movement.

So why develop complex robots that look and act like humans when the lucrative industrial automation market is developing at such a rate? Whilst industrial manipulator and mobile robots need to adapt and drastically change every time their environment does, in theory humanoid robots should be able to work directly in the same environment as humans with little need for modification. Unfortunately, the development of humanoid robotics is maturing so slowly that these theoretical benefits are not being realised.

This delay is in part due to the complex communications involved in controlling several limbs from a robot's central processor; this often results in jerky movement. Compared with the quick precision of movement in automotive or other forms of manufacturing automation, humanoid robots soon lose their appeal.

But Ayse Saygin, associate professor in the department of cognitive science at the University of California, says that the two strains of robotics are practically incomparable. She says industrial robotic arms seen on the factory floor such as those manufactured by Kuka are primitive in comparison to the multi-jointed limbs required by android robotics.

Sci-fi has a lot to answer for: what society may perceive as slow progress in robotics is predominantly influenced by the aesthetic expectations we place on android and humanoid robots. Saygin says the parameters of 'acceptable' motion for these robots are extremely high, thanks to the highly sophisticated movement of 'robots' in films such as 'Terminator', which mimic human movement exactly. Industrial robots are mercifully not subject to such strictures.

"The human brain only mirrors 'human' motion, and not robot motion," says Saygin. "This means that humanoid robots can perform recognised actions, but that it is difficult for humans to accept the traditional 'jerky' motion associated with current robot's limited range of movement."

"If you look humanlike but your motion is jerky or you can't make proper eye contact, those are the things that make them uncanny. I think the key is that when you make appearances humanlike, you raise expectations for the brain. When those are not met, then you have the problem in the human brain."

Uncanny Valley

Although research into android and humanoid robotics has slowly progressed, acceptance by the public (particularly in the West) has not. Sci-fi films promote an assumption that humans are more receptive to robots that look, act and move in a way that mirrors our own behaviour as closely as possible. In theory, the more 'human' a robot appears the more accepting we should be of the machine; in reality this does not follow.

The natural discomfort humans feel in the presence of something not-quite-human has been researched extensively by engineers, designers and scientists in relation to a study by Japanese roboticist Masahiro Mori called 'The Uncanny Valley' (more accurately translated as 'The Valley of Eeriness'), published in the obscure journal Energy over 40 years ago. The title refers to a graph charting human responses to entities that appear human, tracking a subject's aesthetic acceptance. A distinct 'dip' occurs at the point where a human being experiences a sense of eeriness, escalating into strong revulsion. This withdrawal of acceptance is experienced not only in the case of human-like robots, but sadly also with burns victims and people who have undergone extensive cosmetic or reconstructive surgery.
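The shape of Mori's graph can be illustrated with a toy function; Mori's paper gives only a sketch, not an equation, so the curve below is purely illustrative: affinity rises with human likeness, and a sharp Gaussian dip near full likeness produces the valley.

```python
import math

def affinity(likeness):
    """Toy uncanny-valley curve: affinity grows with human likeness
    (0.0 = clearly mechanical, 1.0 = indistinguishable from human),
    minus a sharp 'eeriness' dip centred near 85% likeness.
    Illustrative only -- the centre and depth are assumed values."""
    dip = 2.5 * math.exp(-((likeness - 0.85) ** 2) / 0.005)
    return likeness - dip

# The valley: a very human-like but imperfect robot scores worse
# than either a cartoonish robot or a perfect replica.
for h in (0.5, 0.85, 1.0):
    print(h, round(affinity(h), 3))
```

The key qualitative feature, reproduced here, is that acceptance is not monotonic in realism: the almost-human scores lower than the obviously mechanical.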

Researchers including Saygin and Karl MacDorman of Indiana University believe that this feeling of 'uncanniness' does not necessarily follow directly from Mori's slightly outdated concept, but is triggered by an artificial entity that behaves 'real' enough to be classed as human.

Karl MacDorman, associate professor in the human-computer interaction programme of the school of informatics, Indiana University, has observed with interest Japan's acceptance of robots in public life. Tech-savvy Japanese even have a word to describe a feeling of familiarity, likeability, comfort and affinity with a non-human entity: shinwakan.

"This discomfort experienced by most human beings in the Western world is prompted by your visual system telling you something is human, but your sensory system telling you something different," says MacDorman. "If the speed of a smile is half speed, rather than happy it becomes sinister.

"This match in levels of realism is very important to overcome if robots are to become an acceptable part of society, but the problem is as you increase the realism you increase familiarity, and in this depth of the valley people report higher levels of eeriness. Even if a robot looks human there are so many aspects that must be accurate. The smell of burning from motors or oil from hydraulics rather than human skin can be very off-putting."

Humans are good at detecting flaws in technology, and in the case of humanoid robotics we will often search out these flaws. MacDorman says that our hard-wired, primitive instinct in mate selection makes us experts in judging human skin quality, sanity and health, and if these qualities seem synthetic we will unconsciously reject them.

"We have developed a natural disgust over years of evolution towards things that appear dead, which has taught our organisms not to eat our own bodies; it's a type of moral disgust. We have a hard-wired natural fear of something that's dead to avoid exposing ourselves to poisonous pathogens."

He believes the key to acceptance of robotics could lie in the future of android-appearance robots, rather than humanoid ones. The likes of Asimo, the human-proportioned but clearly machine-based robot created by Honda, have been shown in studies to be preferable to Ishiguro's very human-like geminoids. "We as roboticists can produce both soft and rigid movements. Asimo walks with his knees bent in a very unnatural way, but as he is clearly a robot we are used to this representation of a robot. But if we saw a human walking like this it would be classed as strange."

Beyond the valley of the uncanny

So how does the 'uncanny valley' affect robotic design? MacDorman says roboticists should focus less on making robots look like carbon copies of humans, and concentrate more on projecting human personality. Facial proportions can be rendered more accurately through CGI on LCD screens, as can photorealistic skin.

One roboticist has factored slick product design into his line of android robots. Tomotake Takahashi is CEO of Robogarage, research associate professor at the University of Tokyo and visiting professor at Osaka Electro-Communication University. "Appearance, size, motion, function... these things are all deeply related to engineering," he says. "Movement is extremely important and smooth motion must be achieved in robotics. Contrary to popular belief, although human beings do stand upright they do not have straight limbs."

Robogarage's vision was to manufacture a stable of limber robots with an ambitious 23 degrees of freedom of movement that are fully autonomous. A unique biped walking mechanism called SHIN-walk was developed by the research laboratory to help achieve this goal, and Takahashi has incorporated it into his ROPID ('rapid robot') design. This allows his robots to make quick movements, including jumping and running.

Takahashi tackled the issues of movement via an unusual study which garnered the company local celebrity status in Japan - the world's first female biped robot. Enter 'FT': 800g and 35cm of curvaceous lithium polymer, she features female proportions and a 'Vogue-style' catwalk gait based on Takahashi's observation of catwalk models.

"I wanted to challenge the public's perception of a robot. What makes a robot feminine? Women and children are scientifically proven to be more flexible than men; they experience a 30 per cent extra free degree of movement in reverse angles. I wanted to challenge the preconception of a masculine robot."

Takahashi's robots are supported from the outside by an exoskeleton rather than an internal metal skeleton, meaning that movement is more fluid and that they are lighter and more energy-efficient to manufacture. Modelled initially in wood, the exoskeleton is then vacuum-moulded in plastic and strengthened with carbon-fibre components.

Takahashi, whose astronaut robot Kibo is due to enter orbit later this year, believes that the key to meeting human expectations of android and humanoid robotics is to start small. "People expect more from a larger, full-size robot. To correctly meet human expectation we must nano-engineer. If a robot is smaller, less dominant, we expect less."
