RoboThespian - the robot entertainer

E&T meets the designers of the RoboThespian, the all-singing, all-dancing humanoid robot, which is entertaining and informing audiences around the world.

Let's be honest, your average industrial robot is a bit of a let-down. True, being able to execute a pre-defined repertoire of pick-and-place movements with tireless precision can be enormously useful, but it's an awfully long way from the robotic ideal of the popular imagination.

It may be a bit irrational, but deep down most of us believe robots ought to be quite a bit like people. They should have arms and legs, heads and bodies, and, at a pinch, should be able to make a fair stab at something resembling intelligent conversation.

If this is what a robot means to you, then you'll love RoboThespian, the creation of Engineered Arts, based in Penryn, Cornwall. A classic humanoid robot, RoboThespian can talk, sing, act, and even dance (currently limited to a very restrained shimmy, but that's not too dissimilar to the dancing of many a man, and they are working on it).

Engineered Arts specialises in the design and build of mixed-media exhibition installations, involving a combination of art and engineering. Will Jackson, the company's director, studied 3D design (which he describes as being "on the borderline between arts and crafts and industrial design") and spent seven years in the film and television industry before branching out on his own some 15 years ago. He started by building bespoke slot machines, and two of these, The Crankenstein and Brain Washer, are currently still pulling in the punters on Southwold pier, Suffolk.

Through the slot machine connection he became friendly with artist-engineer Tim Hunkin, responsible for Southwold's 'Under the Pier' slot machine collection, and the 'Secret Life Of...' series for Channel Four television. Jackson and Hunkin subsequently worked together on the Science Museum's 'The Secret Life of the Home' exhibition, where Jackson designed the popular cut-in-half toilet, illustrating the principle of the siphon. This features an ingenious mechanism that, after each flush, catches the 'synthetic poo', and returns it to the bowl.

RoboThespian born

The idea for RoboThespian was born out of a number of commissions involving tableaux of mechanical figures. In 1999 Engineered Arts produced an installation for Cornwall's Eden Project, designed to illustrate the fate of a world without plants. This features wooden figures of a man and woman, who ultimately lose their clothes (demonstrating the outcome of having no natural fibres) and then drop down dead through lack of oxygen. This was followed by a project at the Glasgow Science Centre comprising three small tableaux telling the stories of the world's first heart transplant (using Barbie dolls), the cloning of Dolly the sheep, and Laika, the first dog in space.

All these projects involved simple mechanical figures, animated by standard industrial controllers. The role of 'mechanical actors' took a significant step forward when, early in 2005, the company began work on the Mechanical Theatre, another Eden Project commission. This involved three figures, with storylines focused on genetic modification and the patenting of genetic material. Rather than designing another ad hoc set of figures for this new commission, it was decided to develop a generic programmable figure that would be used for the Mechanical Theatre, and the succession of similar commissions that would, hopefully, follow. The result was RoboThespian Mark 1 (RT1).

The current RoboThespian, RT2, follows the basic physical design for RT1. It's a life-sized theatrical robot, 1.7m high. The body, limbs and hands are made from welded aluminium. The torso and face are formed from thermoplastic resin (PET), and the eyes are LCD displays. RT2's principal advances over RT1 include the ability to interact with an audience and greatly enhanced mobility.

A robot's capacity for movement is defined by how many axes of rotation it supports - when you lift your forearm it's rotating about an axis passing through your elbow. RT2 has 31 axes of rotation - more than twice the number supported by RT1: six per arm, two per leg, four per hand, four in the head, two in the body, plus a 180° rotation of the upper torso.
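That tally can be checked with a quick sum (a trivial sketch; the group labels are ours, following the breakdown in the text):

```python
# Axes of rotation on RT2, grouped as listed in the text
axes = {
    "arms":  6 * 2,    # six per arm
    "legs":  2 * 2,    # two per leg
    "hands": 4 * 2,    # four per hand
    "head":  4,
    "body":  2,
    "upper_torso": 1,  # the 180-degree torso rotation
}
total = sum(axes.values())
print(total)  # 31
```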

Festo fluidic muscles

In the human body, movement is controlled by muscular contractions - it's the contracting biceps muscle, for example, that lifts your forearm towards your shoulder. All major movements on RT2 are powered by Festo fluidic muscles - the pneumatic equivalent of human muscles. The operating principle is simplicity itself. Each muscle is formed from a pressure-tight length of rubber hose; when pressurised air is admitted, the tube expands in the peripheral direction and contracts in the longitudinal direction. Plus points include relatively low cost, mechanical simplicity, the ability to apply a large contraction force, and natural compliance - like a real muscle, it 'backs off' when acting against an opposing force. The downside is a non-linear response, which changes with the applied load, necessitating a highly complex control algorithm.
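To see why that control problem is hard, consider a toy simulation (entirely illustrative - the muscle model, gains and units below are invented, not Engineered Arts' algorithm): the muscle's contraction responds non-linearly to pressure and is reduced by the opposing load, so even a simple controller has to schedule its gain against the load to converge on a target contraction.

```python
def muscle_contraction(pressure, load, max_contraction=0.25):
    """Toy muscle model: contraction rises non-linearly with pressure
    and is reduced by the opposing load (illustrative numbers only)."""
    effective = max(pressure - 0.5 * load, 0.0)
    return max_contraction * (1.0 - 1.0 / (1.0 + effective) ** 2)

def control_step(pressure, target, current, load, dt=0.01):
    """One step of a crude integral controller whose gain is
    scheduled against the load, compensating the flattened response."""
    gain = 6.0 * (1.0 + 0.5 * load)
    return max(pressure + gain * (target - current) * dt, 0.0)

# Drive the toy muscle towards 20 per cent contraction under load
pressure, load, target = 0.0, 1.0, 0.20
for _ in range(2000):
    current = muscle_contraction(pressure, load)
    pressure = control_step(pressure, target, current, load)
print(round(muscle_contraction(pressure, load), 3))
```

A real controller also has to cope with the dynamics of the air supply and hose; this sketch only captures the load-dependent, non-linear static response.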

The remaining actuator technologies used on RT2 comprise Maxon DC motors, used for axes lying along the length of a limb - for example, to rotate the forearm about its longitudinal axis (a complex linkage with three pulleys would be needed to do this pneumatically) - and simple pneumatic cylinders controlling the four fingers. Their valves operate in only one of two states, so the fingers are either closed or fully extended.

There's a position sensor on every axis (accurate to one-tenth of a degree) and all positional information is fed to two motion-control boards, which slot into the back of the robot. Each board can control nine axes at a time - four motor-driven and five pneumatically operated. Somewhat surprisingly, the control boards, the position sensors, and even the network protocol linking the two, are Engineered Arts' own designs.

"As a rule, we try not to use any proprietary kit anywhere on the motion control side," explains Jackson. "It's all designed for factory automation, so it's brilliant for pick and place, but terrible for what we're doing. For us, most of the time, the standard parts just don't work. So we have to go back to basics and design exactly what we want. If you don't, you end up with something that is very large, very clumsy and doesn't work very well."

The software that gives life to all this bespoke hardware is organised into three levels. At the top level are control messages that initiate some robotic action, for example turning the head. These are referred to as sources. At the bottom are the destinations, or targets, of the control messages: for example, the axis that turns the head. Between these two layers lies the I/O Server, responsible for ensuring that every piece of control information gets to the right place (the relevant target) at the right time. Jackson likens the I/O Server to a telephone exchange with a built-in tape recorder (used to play back sequences of control messages).
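Jackson's telephone-exchange analogy can be sketched in a few lines (the names and message format here are invented for illustration, not Engineered Arts' actual protocol): sources send messages, the server routes each one to its registered target, and the 'tape recorder' can replay a captured sequence.

```python
class IOServer:
    """Minimal 'telephone exchange with a tape recorder': routes
    control messages to named targets and can replay a sequence."""
    def __init__(self):
        self.targets = {}    # target name -> handler function
        self.tape = []       # recorded (target, value) messages
        self.recording = False

    def register(self, name, handler):
        self.targets[name] = handler

    def send(self, target, value):
        if self.recording:
            self.tape.append((target, value))
        self.targets[target](value)

    def play_back(self):
        was_recording = self.recording
        self.recording = False            # don't re-record the replay
        for target, value in self.tape:
            self.targets[target](value)
        self.recording = was_recording

# A 'head yaw' target that just remembers its commanded angles
head = []
server = IOServer()
server.register("head/yaw", head.append)
server.recording = True
server.send("head/yaw", 30)   # source: turn the head 30 degrees
server.send("head/yaw", 0)    # source: turn it back
server.play_back()            # replay the recorded sequence
print(head)                   # [30, 0, 30, 0]
```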

The control system is built on standard Web technology, allowing the robot to be controlled by any device that can display a Web browser. If you know the password, you can access and control individual targets for any RoboThespian anywhere in the world.

Control messages can be generated in a variety of ways. In general, audience interaction with RoboThespian is via a touch-screen interface. Control options available include the re-enactment of scenes from famous films (selected from a pre-defined library), changing RoboThespian's eye or cheek colour, gesture and pose, and composing new RoboThespian performance sequences.

Blender 3D animation

Inevitably, the level of control possible via a touch screen is going to be somewhat limited - it's constrained by the menu options available on the screen. Generating the sort of highly complex sequence of actions required for something like Eden's Mechanical Theatre is now done via a virtual 3D RoboThespian, generated by Blender, an open-source 3D animation package. By clicking and dragging on an anchor point on the virtual model you can change the model's pose, automatically generating the sequence of control messages that will make the real robot behave in exactly the same way. The virtual RoboThespian doesn't just look like the real thing; critically, it also behaves like the real thing, and you can't make the 3D model adopt a pose that couldn't be realised by the real robot.
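That last property amounts to clamping every requested joint angle to the physical robot's limits before any control message is generated. A minimal sketch, with invented joint names and limits:

```python
# Illustrative joint limits in degrees (not RT2's real values)
JOINT_LIMITS = {
    "elbow": (0, 135),
    "shoulder_pitch": (-45, 180),
    "torso_rotation": (-90, 90),
}

def clamp_pose(requested):
    """Clamp a requested pose to the joint limits, so the virtual
    model can only adopt poses the physical robot could realise."""
    clamped = {}
    for joint, angle in requested.items():
        low, high = JOINT_LIMITS[joint]
        clamped[joint] = min(max(angle, low), high)
    return clamped

pose = clamp_pose({"elbow": 170, "torso_rotation": -30})
print(pose)  # {'elbow': 135, 'torso_rotation': -30}
```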

Future software developments will, Jackson says, be focused on facilitating more subtle and convincing social interactions, citing the example of the robot receptionist he encountered during a recent visit to Carnegie Mellon University in the US. The basic set-up at the university is very simple. There's a screen showing the receptionist's face, and a camera that locks onto the face of the user. However, the receptionist has a notably warm, helpful personality. "It's a very fun bit of software, but the hardware is mechanically very crude," says Jackson.

"There's excellent software development work being done in areas like speech and face recognition, but without the hardware to bring it to life, it's of limited value. So one of our missions is to team up with as many universities and academics as we can, harnessing all this great software on our hardware, and making RoboThespian much more intuitively interactive."

RT3 and beyond

Over the next 12 months Engineered Arts will be rolling out RT3, a more mobile and independent variant of RT2. In the human body, muscles operate in pairs - the biceps pulls your forearm up and the triceps pulls it down. On RT2 the fluidic muscles act singly, but on RT3 paired fluidic muscles will be introduced. This will enable better dynamic performance, and allow RoboThespian to adopt hitherto unrealisable poses. Inevitably there's a cost: the need to control the relative tension between the paired muscles adds to control complexity.
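A toy calculation shows where the extra complexity comes from (the linear muscle model and all constants are invented for illustration): with an antagonistic pair, the difference between the two pressures sets the net torque, while their sum sets the co-contraction, and hence the stiffness of the joint - so every commanded torque now involves a second choice.

```python
def paired_pressures(torque, stiffness, k=2.0, radius=0.5):
    """Crude linear model of an antagonistic pair: each muscle pulls
    with force k * pressure on a pulley of the given radius. The
    pressure difference sets net torque; the pressure sum is chosen
    for co-contraction (stiffness). Returns (agonist, antagonist)."""
    diff = torque / (k * radius)
    agonist = (stiffness + diff) / 2.0
    antagonist = (stiffness - diff) / 2.0
    if antagonist < 0.0:
        raise ValueError("co-contraction too low for requested torque")
    return agonist, antagonist

# The same joint torque at two stiffness settings: soft, then stiff
print(paired_pressures(torque=3.0, stiffness=4.0))  # (3.5, 0.5)
print(paired_pressures(torque=3.0, stiffness=8.0))  # (5.5, 2.5)
```

A single muscle working against a spring or gravity has no such freedom: one pressure, one torque. The pair buys stiffness control at the price of a second coupled variable to regulate.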

The second significant innovation for RT3 involves moving the valves controlling the admission of pressurised air to the fluidic muscles from their current position - in a console under the touch screen - to within the body of the robot itself. The existing valves are essentially digital: they're either open or closed, so getting a small quantity of air into a muscle means they have to open and close very quickly, in around 5ms. This, in turn, means they have to be fairly large, which is why they can't fit into RT2. On RT3, air flow will be controlled by an analogue valve that can be partly opened, just like a conventional tap, to admit a small quantity of air. This eliminates the need for speed, allowing a much smaller valve assembly able to fit comfortably into the back of RT3.
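The trade-off can be sketched as a toy fill calculation (illustrative numbers throughout): a digital valve must meter a small charge of air as a train of fast full-flow bursts, while a proportional valve simply throttles a continuous trickle.

```python
import math

def digital_pulses(target_volume, full_flow=10.0, pulse_time=0.005):
    """Digital (on/off) valve: air is metered in fixed ~5 ms bursts
    at full flow, so a small charge needs many fast pulses."""
    per_pulse = full_flow * pulse_time
    return math.ceil(target_volume / per_pulse)

def analogue_fill_time(target_volume, opening=0.1, full_flow=10.0):
    """Proportional valve: open it 10 per cent and let a continuous
    trickle deliver the same charge; no fast switching needed."""
    return target_volume / (full_flow * opening)

print(digital_pulses(0.5))        # 10 short full-flow bursts
print(analogue_fill_time(0.5))    # 0.5 seconds at a gentle trickle
```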

With all the valves in the robot, the 16 air hoses currently linking the valve console to the robot can be replaced by a single hose. Not quite cutting the umbilical cord, but certainly trimming it down to size.

RT2 is somewhat deficient in the legs department, and the robot is supported, rather ignominiously, by an aluminium tube. RT3 will have a new pelvis design, allowing a degree of hip flexing, and will have a credible set of leg muscles, so that he will be able to stand unsupported - robo-erectus at last.

There could be more. Jackson is very keen to visit a German professor responsible for a robot that seems to be able to manage a particularly brisk walk, on the basis of an encouragingly simple leg design. Who knows, by the time we get to RT4 it could be a case of goodbye restrained shimmy; hello John Travolta.
