Vol 8, Issue 9

How human should humanoid robotics look?

16 September 2013
By Abi Grogan
Hiroshi Ishiguro with a human-size robot

Q&A with Hiroshi Ishiguro


Hiroshi Ishiguro’s goal is to see a human ‘believably affectionate’ towards a robot companion

Robots could become potential life partners if Japanese roboticist Hiroshi Ishiguro has his way. The world-renowned scientist and university lecturer tells E&T about the robotic skeletons that lie in the cupboards of Osaka University's Intelligent Robotics Laboratory.

Hiroshi Ishiguro sometimes asks interviewers whether they believe the man seated in front of them is a human or a robot. From anyone else the question would be implausible, but he is the man who created a robot in his own image and has helped others, for large sums of money and a four-month development period, to do the same.

His radical predictions of futuristic human and robot social hierarchies, together with his pioneering research in the field of humanoid robotics, have made him a minor celebrity in tech-obsessed Japan. Ishiguro has received funding from the Japanese government and is a professor at two of the country's top universities, Osaka and Kyoto.

What are the key drivers of your research into humanoid robotics?

Initially my purpose was to develop interactive robots; I wanted to have an 'ideal design' prototype, and this is why I developed the very humanlike android robots. Then I developed a complex teleoperated geminoid robot. Today our key driver is exploring how robot operators adapt to working with a 'human' body. In our case a computer captures the operator's movements and voice, which are then duplicated in the robot. The operator recognises the geminoid body as his or her own.

Like a phantom limb, we are using the geminoid as a 'phantom body'. We are also developing minimal robot designs, trying to make them simpler for practical use. We can learn a lot about minimal design from the geminoid, as it is a very complex and expensive machine and so is not practical.
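As a rough illustration of the teleoperation setup Ishiguro describes, the sketch below captures an operator's pose and voice and streams both to the remote android. It is a minimal Python sketch with invented placeholder classes (MotionCapture, AudioCapture, GeminoidClient); the laboratory's actual geminoid software is not public.

```python
# Hypothetical teleoperation loop: capture the operator's pose and voice,
# then mirror both on the remote android. All interfaces here are invented
# placeholders, not the actual geminoid software.

import time

class MotionCapture:
    """Stub for a tracking system that follows the operator's head and face."""
    def read_pose(self):
        # Would return joint targets, e.g. {'head_yaw': 0.1, 'jaw': 0.3, ...}
        return {}

class AudioCapture:
    """Stub for the operator's microphone."""
    def read_frame(self):
        # Would return a short buffer of PCM audio samples.
        return b""

class GeminoidClient:
    """Stub network client for the remote android."""
    def send(self, pose, audio):
        # Would transmit actuator targets and audio for playback.
        pass

def teleoperate(rate_hz=30):
    mocap, mic, robot = MotionCapture(), AudioCapture(), GeminoidClient()
    period = 1.0 / rate_hz
    while True:
        pose = mocap.read_pose()   # the operator's movements...
        audio = mic.read_frame()   # ...and voice
        robot.send(pose, audio)    # duplicated in the robot
        time.sleep(period)
```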

Today society is accepting of robotics in an industrial environment, but what role do you see humanoid robots playing in the future?

The most important aspect of an interactive robot is its role as a social partner for a human. A human can project many things onto a robot, so studying the social relationship between a human and a robot essentially allows us to comment on human society in general. We need to study phenomena that happen in 'real society' before we can discuss the possibility of integrating robots into society. Before, it was mostly important to have practical robots, but now the next challenges in robotics are three things: to minimise further, to use the human shape, and to represent the human soul. Do you believe that we have a soul? This is why we build humanoid and telenoid robots: to project the human soul.

Do you think a human being could ever become genuinely attached to a robot as a social partner?

Yes, that is my goal: for a human to become believably affectionate towards a robot social partner. Belief is the single most important aspect of a human being. You believe that I am a human, right? The human brain is just guessing, perceiving and believing. Everything is just a kind of illusion, or a trick, because the human brain cannot process everything. Everything is subjective.

It is often suggested that people's reliance on computers has damaged human relationships to a certain extent. Is introducing robots into the home environment not a similarly dangerous concept?

Well, before, society said the computer is dangerous; now you say the robot is dangerous. It is the same thing; a robot is just a simple extension of the computer. A computer, processors, actuators: that is a robot. The reason I am interested in humanoid robots is that they are a sort of intermediary between the digital world and the physical world.

It's been observed that Europeans display a different level of receptiveness to robots in comparison with the Japanese, who are more accepting. Why do you think that is?

Relatively speaking, I don't see any differences between Japanese and European people. Of course, when they are talking about robots they have differing opinions, but when they come face to face with an actual robot their reactions are very similar. Asimo (Honda's humanoid robot) has been presented in many countries and the reaction of children in particular, regardless of background, is exactly the same. Their preconceptions may be different, but once they start to interact with the robot they forget them. It all lies in education. French people, for example, are more accepting because they love Japanese cartoons, in which robots are commonly depicted. We will share the culture soon.

From a technical perspective, humanoid robots are designed in human likeness with human-inspired sensory capability. How are the speech systems of your robots configured?

Our studies are ongoing, but in the future I would like to integrate online search engines and information banks such as Google and Wikipedia as a direct point of reference for the computer. It's quite difficult to develop an autonomous interactive robot, so for now we are mostly studying conversational patterns and analysing sensory data.

A lot of our speech system is inspired by the Turing test, but we have many things to improve: the computer has to be very powerful to process such a huge amount of data while it holds a conversation and still react quickly, in a humanlike way. We also need to work on the android's logical flow of conversation, and the android needs to be more emotional.
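To give a flavour of the kind of integration he mentions, the sketch below answers a question by pulling a short summary from Wikipedia's public REST API. This is not Ishiguro's dialogue system; the crude topic extraction and the use of this particular endpoint are assumptions made purely for illustration.

```python
# Illustrative only: reply to a question by looking up a topic summary on
# Wikipedia's public REST API. Not the lab's actual dialogue system.

import requests

WIKI_SUMMARY = "https://en.wikipedia.org/api/rest_v1/page/summary/{title}"

def lookup_summary(topic: str) -> str:
    """Fetch the lead paragraph of the Wikipedia article for `topic`."""
    resp = requests.get(WIKI_SUMMARY.format(title=topic.replace(" ", "_")),
                        timeout=5)
    resp.raise_for_status()
    return resp.json().get("extract", "I could not find anything on that.")

def reply(utterance: str) -> str:
    # Crude 'conversational pattern': treat anything after 'about' as the topic.
    if "about" in utterance.lower():
        topic = utterance.lower().split("about", 1)[1].strip(" ?.")
        return lookup_summary(topic)
    return "Tell me what you would like to talk about."

if __name__ == "__main__":
    print(reply("Can you tell me about Osaka University?"))
```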

To what extent is the visual system similar to a human's and how does it 'see'?

Computer vision technology is advancing very quickly at the moment; if you go to a computer vision conference you'll be amazed at what is on offer there. Before the Kinect, an average laser scanner was very expensive and, although we used one, most research labs couldn't afford to. Plus, the Kinect works with our android system and it is a format that is familiar to everyone. With this kind of pattern recognition the vision system is now at a human level, or at least it is much better than an elderly person's sight!
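As a small example of the commodity pattern recognition he is referring to, the sketch below detects people in an image using OpenCV's built-in HOG pedestrian detector. It is an off-the-shelf illustration, not the lab's vision system.

```python
# Illustrative only: person detection with OpenCV's built-in HOG + SVM
# pedestrian detector, an example of commodity pattern recognition.

import cv2

def detect_people(image_path: str):
    image = cv2.imread(image_path)
    if image is None:
        raise FileNotFoundError(image_path)
    hog = cv2.HOGDescriptor()
    hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())
    boxes, weights = hog.detectMultiScale(image, winStride=(8, 8))
    return boxes  # one (x, y, w, h) rectangle per detected person

if __name__ == "__main__":
    print(detect_people("frame.png"))
```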

How do your robots 'feel'? What sensory arrays do they use?

My geminoids have a full-body sensory array; the only things we can't really integrate are taste and smell. We do actually have a team of researchers in Japan working on these two remaining senses, so eventually, if we wanted to, we could install them. But the flip side is that an android has no need to eat, so it would be a relatively pointless and expensive integration.

How do your robots process what they are 'hearing'?

Hearing is definitely the biggest challenge. If there is just one person in the room you can use voice recognition software, but of course that person needs to speak clearly and slowly. If more than one person is involved, though, the recognition software becomes confused; it cannot separate the voices.

Siri for the iPhone is a very good example of this; it never works with background noise, especially if you have an accent. This is made worse by the fact that in Japan it is very difficult to be alone: nobody here uses the Japanese version of Siri, or even a Bluetooth headset, because of the noise pollution.

The next step is to train our systems to understand the human voice, whether that is one voice or several. In order to have one model we need to have one computer, for example the android operating system. We also need to scale the hardware down to powerful microprocessors, but this will take time, maybe ten years or so.
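The single-speaker case he says already works can be tried with the off-the-shelf SpeechRecognition Python package, as in the sketch below. The choice of library and of Google's recogniser is an assumption for illustration, not the lab's software; with one clear voice it does reasonably well, but it has no mechanism for separating overlapping speakers.

```python
# Single-speaker recognition with the off-the-shelf SpeechRecognition package.
# Works reasonably well for one clear voice; it cannot separate overlapping
# speakers, which is the limitation described above.

import speech_recognition as sr

def transcribe_once() -> str:
    recognizer = sr.Recognizer()
    with sr.Microphone() as source:
        recognizer.adjust_for_ambient_noise(source)  # helps a little with noise
        audio = recognizer.listen(source)
    try:
        return recognizer.recognize_google(audio, language="ja-JP")
    except sr.UnknownValueError:
        return "(could not understand the audio)"

if __name__ == "__main__":
    print(transcribe_once())
```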

Currently your androids have upper body movement but no locomotion. What solution are you developing to allow your robots to walk?

At the moment my main priority is my research, and right now we are just focusing on the human-likeness aspect of humanoid robots. We are working with Honda, who have pretty good biped technology for making robots walk, but that is not really our role.

With more funding, though, we will be able to create a biped android, which is our next challenge. The most important components for walking are the actuators, and although we've spent six years developing the current ones we still need more powerful ones.

Your androids famously need their own lorry to transport them to conferences. How are you making your new models more portable?

We have just changed our mechanical design policy, adjusting the position of the joints and reducing the number of actuators. Our original design featured 60 actuators; our newest model features only 12. Once we defined the purpose of the android – communication with human beings – we could focus on the areas most in need of complex actuation, in this case the facial muscles for nuanced facial expression.

How are the different systems – sight, sound, movement – all networked together?

Unfortunately the human network is actually a very poor model. There are many parts in the human brain and it is a very powerful processor, but the connections are not actually that tight or dense. When you are walking, you do not know which muscle is moving, your brain is just telling your body subconsciously how to move.

Your robots are famously modelled on people that you know or admire. Where do you take inspiration for the design?

My first android was modelled on my daughter. Because I am a scientist, it is very important for me to have a reason for my work. When I began the copy of my daughter, most research into humanoid robotics was based on scaled-down versions of a human being; I wanted to make the comparison at the same size. I could not make this comparison with just any child, so I chose my daughter. The second female android was intended for use in an exhibition, so the choice of model was quite difficult. I chose a Japanese newscaster, someone who appears on TV every day and is watched by many people. She is almost like a product of her TV show, a well-known brand.

You have famously created a robot in your own image. What effect did this have on you when you interacted with it?

It was like meeting my twin brother. But what's strange is that the human body does not know its own face. Nobody knows their own face.

How important is it that future robots are built in a human likeness?

It depends on the situation. We will have more choices; we will use many types of robot. When the robot is an interface between a human and a computer, that is when it should have a human likeness. The brain has a function: to recognise the human shape. The world at the moment is geared towards the physical shape of a human, with two legs and two arms; anything else needs specially purposed machines. Any situation where human eyes, speech and body are needed is where a humanoid robot will be used – where there is an information exchange, a guide, a newscaster – which is a pretty wide scope.

Isn't there a danger that a robot that looks very humanlike may cause human beings to project an unfair expectation that it can operate exactly as a human does?

For a given human situation we can design humanlike behaviour, but I don't think it's fair, or even appropriate, to put those kinds of expectations on a robot. For example, a human being is capable of dancing. You can dance, I can dance, so shall we dance now? No, not here and not now, because it is not the right situation. It is the same with robots; they will not be able to do everything that a human being is capable of doing, because it will not always be appropriate or relevant.

You have been quoted as saying that an android robot could become the ideal partner for a human.

All of my staff have formed a relationship with, and become very attached to, the android they have been developing, as they can touch her in a way that they cannot touch other human beings, almost like a lover. But of course the difference is that physically she is often more attractive than their girlfriend or boyfriend. She is like the ideal wife.
