
I Am Mother: who is programming whom?

Image credit: Netflix

‘I Am Mother’ provokes interesting questions about how human a non-human can be.

The day after the human race becomes extinct, a droid (known as ‘Mother’) selects a human embryo and grows it, within a day, into a healthy screaming baby girl. It then nurtures this child (‘Daughter’) through to young womanhood, fulfilling all the necessary maternal duties along the way.

While the first half of the film carries on in a fairly linear fashion in terms of a typical girl growing up in a typical one-droid-parent environment, the second part introduces a number of clever twists and turns that make the whole scenario a lot more memorable than it might at first sound. The role Mother plays in this rebirth of the human race has to be re-evaluated several times, particularly when a rogue survivor from the extinction disrupts the lives of Mother and Daughter and opens up Daughter’s eyes to the possibilities of life beyond the laboratory that has been her home since birth.

For the record, extinction came about as a consequence of a war, but what the war was about, and who fought it, is not revealed until the end – so to avoid spoiling the twists I’ll say no more. The important point is that it left Mother and Daughter to exist together, unable to go outdoors or elsewhere because of the toxic environment the war had left behind.

Key to the technology here is AI, of course. The droid Mother always has to remain one step ahead of the human Daughter, and as any parent will know, that is a near-impossible task with teenage girls. (Maybe it would be easier if it were a boy!)

There is much to ponder here in how a child develops, most particularly in terms of what behaviours are learned and what are instinctive. Would a teenager know how to slam a door, say “I hate you” and storm up to a bedroom to watch inappropriate films if they hadn’t already seen others behave that way, either in real life or on TV?

In ‘I Am Mother’, until the appearance of the woman from the outside, Daughter seems to be impeccably behaved and clearly very bright. As her social development has been limited to her interactions with Mother and past recordings of the ‘Late Show’ on a small tablet-like device, we have to assume that Mother did a phenomenal job of turning her into a well-balanced young adult.

Beyond the flexibility offered by the AI of the future, one of the key skills that the droid Mother needed to master was how to assess what her human charge is feeling. Indeed, when we speculate about the use of robots for home assistance, their usefulness and acceptance could be defined by how well tuned into the human mood they are. We will not be able to get robots to respond correctly if the situation is not properly assessed in the first place.

So the question is, could we build a robot that can read a human’s mood? It’s easy with babies – they gurgle contentedly, cry or sleep. But if we jump forward 15 years, it’s a much more complex problem.

Let’s face it, even at the IET, where we like to think of ourselves as reasonably competent human beings, we are encouraged to go on ‘Emotional Wellbeing in the Workplace’ courses to make sure we are keeping a compassionate eye on our colleagues. So robots, without either the benefit of a lifetime engaging with moody humans or emotional wellbeing training, have a pretty steep learning curve.

Irrespective of how the robot needs to respond in an empathetic fashion, the initial challenge is to be able to program that robot – or equip it with the necessary AI – to identify and measure human moods in the first place. Emotion processing, in other words. Can this be done?

“There could be a whole host of gadgets or devices that humans have to enable emotion processing and, therefore, to enable empathy,” says Dr Punit Shah of the University of Bath’s Department of Psychology.

One of these ‘gadgets’, and one that has been steadily refined, is the Facial Action Coding System (FACS). This was originally a manual identification system – if one muscle moves one way and another moves another way, the combination could be identified as a smile and therefore happiness could be inferred. “There’s a whole combination of things that can be put together to create emotional combinations and therefore visual perception of emotion,” says Shah. This process has over the decades been automated, and software analysis of either still photos or short video clips can now code the amount of each emotion present. “It’s really quite accurate, a lot of this stuff,” adds Shah.
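As a rough illustration of the coding idea, the sketch below maps detected facial action units (AUs) to the basic emotions they prototypically signal. The AU numbers follow the published FACS convention (AU6 is the cheek raiser, AU12 the lip corner puller), but the scoring rules are a deliberately simplified assumption, nothing like the full system:

```python
# Toy sketch of automated FACS-style emotion coding.
# AU numbers follow the FACS convention, but these emotion rules are a
# simplified illustration, not the real coding manual.

# Prototypical AU combinations for three of the basic emotions
EMOTION_RULES = {
    "happiness": {6, 12},        # cheek raiser + lip corner puller
    "sadness":   {1, 4, 15},     # inner brow raiser + brow lowerer + lip corner depressor
    "surprise":  {1, 2, 5, 26},  # brow raisers + upper lid raiser + jaw drop
}

def code_emotions(active_aus):
    """Score each emotion by the fraction of its rule's AUs that are active."""
    scores = {}
    for emotion, required in EMOTION_RULES.items():
        scores[emotion] = len(required & active_aus) / len(required)
    return scores

# AUs detected in a (hypothetical) face image
scores = code_emotions({6, 12, 25})
best = max(scores, key=scores.get)
print(best, scores[best])  # happiness scores highest here
```

A real automated coder would first have to locate the face and estimate each AU's activation from pixels, which is where the machine learning comes in.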

But it is only accurate with the easy stuff – the six universal human emotions (anger, disgust, fear, happiness, sadness and surprise). What if a smile is in fact a fake smile, a sneer or a dose of self-satisfied smugness? “That’s where these machines really struggle in disentangling the subtlety,” admits Shah. “It’s certainly being refined over time, and machine-learning algorithms are now being used and being trained on a huge corpus of images to be improved. But it’s definitely a work in progress.”
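The fake-smile problem does have one classic marker: genuine (Duchenne) smiles recruit the muscles around the eyes (AU6) as well as the mouth (AU12), while posed smiles often move the mouth alone. A toy sketch of that single distinction – with intensity values and a threshold that are purely illustrative assumptions – might look like this:

```python
# Toy sketch: genuine (Duchenne) smile vs posed smile.
# The psychological marker is real -- genuine smiles engage the muscles
# around the eyes (AU6) as well as the mouth (AU12) -- but the intensities
# and threshold below are illustrative assumptions, not measured data.

def classify_smile(au_intensity, threshold=0.3):
    """au_intensity: dict mapping AU number -> activation strength in [0, 1]."""
    mouth = au_intensity.get(12, 0.0)  # lip corner puller
    eyes = au_intensity.get(6, 0.0)    # cheek raiser (orbicularis oculi)
    if mouth < threshold:
        return "no smile"
    return "genuine smile" if eyes >= threshold else "posed smile"

print(classify_smile({12: 0.8, 6: 0.7}))  # eyes engaged -> genuine smile
print(classify_smile({12: 0.8, 6: 0.1}))  # mouth only -> posed smile
```

Even this one rule shows why the subtlety is hard: the signal separating a real smile from a fake one is a small difference in a couple of muscle groups, easily lost in noisy images.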

These visual techniques are leading the field in the interpretation of emotions, but they are not the only option. Shah says: “The visual domain is just one aspect of recognising someone’s emotions, and other domains of emotion processing are a lot more complicated. It’s difficult for a machine to – in the way that humans can – detect the subtleties in how another person sounds in combination with how they look, in combination with how they smell, perhaps, as well. We don’t really know how each of those individual things work and combine in order to generate emotion detection.”
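One common engineering answer to the combination problem Shah describes is ‘late fusion’: each channel produces its own probability distribution over emotions, and a weighted average merges them. This is a minimal sketch with made-up scores and weights, not a description of any particular system:

```python
# Sketch of late fusion across modalities: each channel (face, voice, ...)
# outputs its own probability distribution over emotions, and a weighted
# average combines them. All scores and weights here are illustrative.

EMOTIONS = ["anger", "happiness", "sadness"]

def fuse(modality_scores, weights):
    """modality_scores: {modality: [p_anger, p_happiness, p_sadness]}."""
    fused = [0.0] * len(EMOTIONS)
    total = sum(weights[m] for m in modality_scores)
    for m, probs in modality_scores.items():
        w = weights[m] / total  # normalise so the fused result is a distribution
        for i, p in enumerate(probs):
            fused[i] += w * p
    return dict(zip(EMOTIONS, fused))

combined = fuse(
    {"face":  [0.1, 0.7, 0.2],    # vision says happy
     "voice": [0.5, 0.2, 0.3]},   # audio leans angry
    weights={"face": 0.6, "voice": 0.4},
)
print(max(combined, key=combined.get))  # happiness wins on these numbers
```

The hard part, as Shah notes, is not the averaging but knowing what the weights should be – humans seem to do this fluidly, and we don’t yet understand how.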

There are plenty of other possibilities, such as eye-movement tracking, pupil dilation, even skin-colour monitoring – Shah suggests that changes in heart rate can be detected through flushing or paling of the skin, revealing something about levels of excitement or arousal, for example.
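The skin-colour idea has a name in the research literature: remote photoplethysmography, where the heartbeat leaves a tiny periodic colour change in facial skin that can be recovered from ordinary video. The sketch below stands in a synthetic 72 bpm sine wave for the real per-frame signal – an actual pipeline would need face tracking and heavy noise filtering on top:

```python
# Sketch of remote photoplethysmography (rPPG): recover heart rate from
# the periodic colour change of skin in video. A synthetic 72 bpm signal
# stands in for the real per-frame green-channel average of a face region.
import numpy as np

fps = 30.0                      # video frame rate
t = np.arange(0, 10, 1 / fps)   # 10 seconds of frames
bpm_true = 72.0
# Simulated mean green value of the skin region, frame by frame
signal = 0.5 + 0.01 * np.sin(2 * np.pi * (bpm_true / 60.0) * t)

# Find the dominant frequency within a plausible heart-rate band (0.7-4 Hz)
spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
freqs = np.fft.rfftfreq(len(signal), d=1 / fps)
band = (freqs >= 0.7) & (freqs <= 4.0)
peak_hz = freqs[band][np.argmax(spectrum[band])]
print(round(peak_hz * 60))      # estimated heart rate in bpm
```

On a clean synthetic signal this recovers the pulse exactly; on real video the colour change is a fraction of a percent and easily swamped by lighting and motion, which is why Shah cautions that “we are not even close”.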

However, as Shah says: “We are not even close to this. I’m unfortunately sometimes stuck in this realm of reality of thinking in terms of what we can do right now and that’s not much. But yes, maybe a combination of all of these things, looking at behaviour as measured through observation and trying to decode behaviour from that in addition to the biosensor type devices, is possible.”

It seems we are decades away from this at best, and unless we can measure human emotions, programming a robot to respond empathetically to human behaviour appears impossible.

Ultimately, ‘I Am Mother’ offers a different and thought-provoking take on how humans program robots – or, as you will see if you watch the film, is it the other way round?

‘I Am Mother’ is available on Netflix

