Mammoth in the Mars Yard at the Powerhouse Museum in Sydney

G'day Robots! Field robotics research in Australia

We visit the largest civilian field robotics and intelligent systems group in the world, the Australian Centre for Field Robotics at Sydney University, to see what autonomous machines can do to make people’s lives easier, and what the future holds.

“It’s one of those things, you don’t need them but they’re nice to have...” Professor Salah Sukkarieh is talking about robot vacuum cleaners. And also about mobile phones, it transpires. He doesn’t have one of those. Their usefulness doesn’t yet outweigh their disruptiveness, he says.

He is Professor of Robotics and Intelligent Systems and director of research and innovation at the Australian Centre for Field Robotics (ACFR) at Sydney University. Sukkarieh knows the university well: he studied here as both an undergraduate and a postgraduate, working on navigation systems for ground and air vehicles.

Last year, Sukkarieh won two awards: one from AUSVEG (Australia’s industry body for vegetable growers), mainly for his work with the field robot called Ladybird, and the New South Wales Science and Engineering award for the application of robotics in agriculture and the environment.

The Centre’s claim to primacy depends on which metric you choose, as there is a similar institute at Carnegie Mellon University in the USA, but Sydney’s Field Robotics Centre is the largest such research group with a focus on civilian (rather than defence) applications.

Dr Robert Fitch, manager of systems planning at the ACFR and organiser of the first Summer School on Agricultural Robotics at Sydney University, has high hopes for the future of robotics. He told the Australian Broadcasting Corporation’s Rural Service that more people would be needed to build robots and write apps for them. He was also hoping that robots would entice youth back into farming.

Beyond agriculture

As well as developing ACFR’s star agricultural robots, known as Shrimp, Mantis and Ladybird, Sukkarieh and the team are involved in other fascinating projects like underwater archaeology at Pavlopetri, a 5,000-year-old submerged city off the coast of the Peloponnese in southern Greece, and Rio Tinto’s Mine of the Future in Western Australia.

Pavlopetri was a sophisticated Neolithic and Bronze Age settlement, thought to have sunk due to earthquake activity. Acoustic mapping of the site can be done from surface vehicles, but deep-water optical surveys need remotely operated vehicles (ROVs) or autonomous underwater vehicles (AUVs) to collect high-resolution stereo imagery. Here SLAM (Simultaneous Localisation and Mapping), an algorithm that lets a robot build a map of its surroundings while tracking its own position within it, without GPS, was used for the first time in underwater archaeology.
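
The idea behind SLAM can be sketched in miniature. The toy 1-D example below is purely illustrative (it is not the ACFR's actual algorithm, and the function name and numbers are invented): a robot dead-reckons from noisy odometry, adds each newly sighted landmark to its map, and on re-sighting a known landmark splits the disagreement between its own position estimate and the map, so neither drifts unchecked.

```python
def slam_1d(odometry, sightings):
    """Toy 1-D SLAM sketch.

    odometry:  list of reported moves (each may carry error).
    sightings: per-step list of (landmark_id, measured_range) or None.
    Returns (trajectory_estimates, landmark_map).
    """
    x = 0.0           # estimated robot position
    landmarks = {}    # landmark_id -> estimated position
    trajectory = [x]
    for move, sighting in zip(odometry, sightings):
        x += move     # dead reckoning: errors accumulate without correction
        if sighting is not None:
            lm, rng = sighting
            if lm not in landmarks:
                landmarks[lm] = x + rng   # first sighting: add to the map
            else:
                # Re-sighting: the landmark implies a robot position of
                # landmarks[lm] - rng; split the disagreement 50/50
                # between the robot estimate and the map.
                innovation = (landmarks[lm] - rng) - x
                x += 0.5 * innovation
                landmarks[lm] -= 0.5 * innovation
        trajectory.append(x)
    return trajectory, landmarks
```

Real SLAM systems weight that correction by estimated uncertainty (as in an extended Kalman filter) rather than splitting it evenly, but the loop-closure idea is the same.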

Sukkarieh says that in 2007 Rio Tinto set out an internal vision of an automated mine, the Mine of the Future, and the ACFR received funding towards it. The Mine now has autonomous trains, drilling and trucks. According to Sukkarieh, the Mine runs a whole smorgasbord of sensors - GPS, laser, radar and vision - because you can't rely on a single sensor. The sensors use different radar frequencies for different penetrations (high resolution or broad spectrum), and the choice also depends on what frequencies are permitted. "For instance, a 94GHz dual-polarisation signal is a frequency allowed in Australia and is used in dusty, foggy and rainy environments," he says. He adds that the mine is monitored for tele-operational purposes, but it is a question of monitoring versus control.
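
The "can't rely on a single sensor" point can be sketched as a minimal fusion rule. The example below (an inverse-variance weighted average; the sensor readings and noise figures are illustrative assumptions, not Rio Tinto's) shows how an estimate automatically leans on whichever sensors are currently most reliable:

```python
def fuse(readings):
    """readings: list of (measurement, variance) pairs.
    Returns the inverse-variance weighted estimate: noisier
    sensors contribute less, automatically."""
    weights = [1.0 / var for _, var in readings]
    return sum(w * m for w, (m, _) in zip(weights, readings)) / sum(weights)

# In clear weather two sensors agree and share the weight evenly;
# in dust, the vision channel's variance balloons and its pull collapses.
clear = fuse([(10.0, 1.0), (10.4, 1.0)])    # -> 10.2
dusty = fuse([(10.0, 1.0), (14.0, 100.0)])  # stays close to 10.0
```

Production systems use full Kalman or particle filters over time, but the weighting principle is the same: trust is proportional to demonstrated precision.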

Meet the Robots

Unmanned Aerial Vehicles (UAVs)

Professor Sukkarieh is leading the ACFR’s robotic aircraft project. They are targeting invasive species. Not aliens, but other undesirable life forms like weeds, feral animals and locusts.

A series of rotary- and fixed-wing UAVs track and detect targets, and the information is logged on large-scale regional maps. Targeted applications and algorithms can learn the characteristics of certain types of weed, detect them and then aerially auto-spray them.
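
"Learn the characteristics of a weed, then detect it" can be reduced to a toy classifier. The sketch below uses a nearest-centroid rule over two made-up spectral features (say, green and near-infrared reflectance); the feature values and class names are invented for illustration, not the ACFR's models:

```python
def centroid(samples):
    """Mean feature vector of a list of equal-length tuples."""
    n = len(samples)
    return tuple(sum(s[i] for s in samples) / n for i in range(len(samples[0])))

def train(labelled):
    """labelled: dict of class_name -> list of feature tuples."""
    return {name: centroid(samples) for name, samples in labelled.items()}

def classify(model, features):
    """Assign the class whose centroid is nearest (squared distance)."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(model, key=lambda name: dist2(model[name], features))

model = train({
    "crop": [(0.60, 0.80), (0.55, 0.85)],
    "weed": [(0.30, 0.40), (0.35, 0.45)],
})
print(classify(model, (0.32, 0.42)))  # prints "weed"
```

The real pipelines use far richer features (hyper-spectral bands, shape, texture) and stronger classifiers, but the learn-then-detect loop is the same shape.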

Ladybird

With her rounded red panels, she looks like a giant invertebrate that might have come from Roald Dahl's 'James and the Giant Peach'. Ladybird is powered entirely by solar panels and batteries, and on a sunny, clear-blue day she (Sukkarieh animates this robot with a personal pronoun) can run indefinitely; otherwise she needs charging after six to eight hours.

Ladybird has been developed with industry funding and has a sensing system that recognises weeds and collects data using machine-learning algorithms. Her hyper-spectral sensing can map a vegetable farm for tasks such as crop-yield estimation and insect management. She can detect, classify and target individual weeds, and a manipulator measures and spot-sprays them. "It may be slow compared to broad-acre crop-spraying, but do you want broad-acre crop-spraying?" Sukkarieh quips. Ladybird also cuts herbicide and fertiliser costs.

The sensing systems and algorithms can be supplied on a bespoke basis, depending on whether the farmer wants just a weeding bot or the complete capability. Given funding for a certain type of crop, Sukkarieh predicts that robotic harvesting should be available in around five years.

Sukkarieh acknowledges that he doesn't live in the commercial world, so the cost implications are hard for him to judge. Automated milking systems like Dairybot cost over a million Australian dollars (about half a million pounds), which dairy farmers pay back over six to seven years, but vegetable farms are a different story, he explains. They have much smaller land holdings and "for a weeding bot to take six or seven years before it returns on the investment, it would be shot down." A couple of years at most, he says.
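
The arithmetic behind that point is simple but stark. Using the article's figures (the function name is just for illustration), halving the acceptable payback period from roughly six and a half years to two roughly triples the annual saving the same million-dollar machine must deliver:

```python
def required_annual_saving(price, payback_years):
    """Annual saving a machine must deliver to repay its price in time."""
    return price / payback_years

dairy = required_annual_saving(1_000_000, 6.5)  # ~A$154,000 per year
weeder = required_annual_saving(1_000_000, 2)   # A$500,000 per year
```

Hence the pressure on vegetable-farm robots to be far cheaper than dairy automation, not merely as capable.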

Shrimp

Imagine a taller version of the Disney robot WALL-E. Shrimp works primarily in orchards; its brief is to collect data from different sensors and, through algorithms, convert this into high-level information and high-resolution maps charting colour, crop yield and vigour, giving the farmer a daily or weekly report. Each tree can be mapped individually.

Sukkarieh says that so far Shrimp has mapped almonds, bananas, lychees, avocados, mangoes and apples. One farmer discovered why one side of his orchard had fewer apples: the scanned data showed that pollination was lower there. Another would appreciate not having to get up in the middle of the night and wander round with a torch doing insect surveys.

Ultimately, says Sukkarieh, Shrimp should be able to analyse the data collected and make its own decisions as to whether it should spray, add more fertiliser in a crop row, selectively harvest or remove insects, for instance.

But surely autonomous decision-making must remain limited, because one can't account for infinite possibilities? Or is it more like chess, where the scope is vast but the possibilities are finite and predictions can be made from patterns? Sukkarieh says you can task the robots around how the farmer structures operations, in the same way as chess: these are the rules, this is the scope and size of the operation, and the opponent is weather or pests.

Mantis

Shrimp and Mantis share the same platform but carry different sensors. "We wanted co-operative robots, sharing info," says Sukkarieh. "We chose the names because a mantis shrimp has high perceptibility; it can sample space in 16 frequencies. These robots were built for perception-related tasks: RGB, infra-red, radar, laser, thermal, hyper-spectral robots with lots of computational power."

Mantis has been doing cow-herding. "If something is coming toward them, dairy cows move away. But beef cattle are more aggressive. Can one scan a cow? It has been done in the beef industry," says Sukkarieh. Meanwhile, the ACFR is moving towards determining cow health in the paddock. Many questions are being asked: why is the cow foraging there, is the grass better there, and how does this affect the nutrients in the milk?

Mawson, Continuum and Mammoth

These are planetary rovers, but they won't be dropping onto a planet any time soon. Australia doesn't have a space agency, so they will be used for research into robot kinematics, dynamics and robotic intelligence. Sukkarieh describes Mawson as being like Nasa's rovers Spirit and Opportunity, Continuum as having a robotic arm, and Mammoth as being multi-hinged. The prehistoric elephant's namesake has wheels and a hip-knee mechanism, meaning it can walk, clamber and roll energy-efficiently. Currently the robots are in a 'Mars yard' - 150 square metres of Sydney's Museum of Applied Arts and Sciences (more commonly known as the Powerhouse) - where they interact with schoolchildren on site and online.

Robot takeover?

Sukkarieh, like many others, thinks driverless cars are the future, as long as regulations allow. He points out that they have been trialled in America for the last three years, and trials are also taking place in London. Driverless cars using high-resolution, high-frequency sensors are generally safer than cars steered by error-prone humans, he says, and they don't infringe traffic regulations either.

He has described robots as cost-effective and transformational, but they are also doing us humans out of jobs. Web entrepreneur Andrew Keen told the Daily Mail that robots could replace the middle classes as teachers, lawyers and doctors. “Machines have always been doing us out of jobs,” Sukkarieh comments. “They replace physical activity and computers replace mental activity. Robots are a combination of physical and mental.”

Society is always driving towards efficiency, Sukkarieh says, but the difference is that now it’s all happening in a much shorter space of time. He adds that research is ongoing into ethics and robotics and positive computing; there are government agency reports into what type of white-collar work will start to disappear. And education needs to keep up. By the time students graduate, he says, their intended jobs may not be there. “What skills will they need? Is it important to address maths, physics or systems thinking? Do they need soft skills or hard skills? If, say, you want to be a physiotherapist, what aspects of computing technology will you need to know?”

The ACFR also has a social robotics department, which is working on getting robots to move like humans. But can a robot have emotional intelligence? "My personal opinion: what a robot can do is compute a lot faster and more accurately than a human. If certain emotions can be reduced to a bunch of computations - that is, I detect by certain facial expressions that you are smiling, hence happy, smiling condescendingly, or smiling but only kidding me - then a robot can deduce that. But the jury is still out as to whether emotions are computations." If even subtle motions of the face, like smiling with the eyes, can be computed, they can be replicated, and then computers can do emotional intelligence. "But there may be other vibes - smell, sounds, sight - that are not computational but abstract. The relationship between them is a lot harder to compute."

Many people wonder whether a robot can react originally in conversation. "I can always randomise something in code. I can always make a robot do something unexpected by changing random numbers," Sukkarieh comments. IBM's Watson could answer questions more accurately than humans, but it needed a whole cloud-bank of computers behind it, he adds. But could Watson be witty? "Whether or not those things can be brought down to equations or algorithms, you would still probably need 10,000 computers in the background to compute all that. I agree that the brain is a sophisticated multi-parallel-processed computer."
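
Sukkarieh's "randomise something in code" can be taken quite literally. In the sketch below (the canned replies are invented for illustration), the same prompt can draw different responses, so the behaviour looks unexpected, but the novelty comes entirely from sampling, not from understanding:

```python
import random

REPLIES = ["Tell me more.", "Why do you say that?", "How original of you."]

def respond(prompt, rng=random):
    # The prompt is ignored entirely - which is rather the point.
    return rng.choice(REPLIES)

random.seed(2015)
print(respond("Robots are taking over."))  # one of REPLIES, depending on the seed
```

Surprise is cheap; wit, as he notes, is another matter.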

Meanwhile, Professor Stephen Hawking has warned of the dangers of artificial intelligence, precisely because of its speedy evolution. Are Big Brother bots going to take over? Sukkarieh doesn't believe that a robot that can walk into an office block and start typing is about to take over, "but what it's capable of is a specific task, and robots can do specific tasks better than humans."

Taking aeroplanes as an example, he says that with enough sensing and algorithms, computers can fly more accurately and for longer periods, and can handle navigation and flight control better - which is why we have autopilots, and why the number of pilots in a 747 cockpit has fallen from four to two. "But largely computers still can't deal with risk and uncertainty fast enough, as compared to human pilots. And also we humans don't want to hop on a plane with a computer," he adds. So for now, the people are still in charge.

Image credits: Australian Centre for Field Robotics

