University of Sheffield researchers are teaching UAVs to learn from the environment and to communicate with each other

Intelligent copters capable of learning

University of Sheffield researchers have developed software enabling UAV fleets to autonomously coordinate their actions and learn from the environment.

The quadcopters used by the Sheffield researchers are fitted with forward-facing cameras that monitor the environment ahead of the aircraft. The images from the cameras are used to build a 3D map of the surroundings, including all key objects and obstacles. The visual data is supplemented with readings from barometric and ultrasonic sensors to provide a complete picture of the situation around the UAV.
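As a rough illustration of how two such altitude sensors might be combined (not the Sheffield team's actual code), the sketch below blends a drift-prone barometric altitude with an ultrasonic range reading that is only reliable close to the ground. The function name, weighting and range limit are assumptions made purely for illustration.

```python
# Hypothetical sketch: blending barometric and ultrasonic altitude readings.
# The weighting and the 5 m ultrasonic range limit are illustrative assumptions.

def fuse_altitude(baro_alt, ultrasonic_range, ultra_weight=0.8, max_range=5.0):
    """Return a fused altitude estimate in metres.

    Barometric altitude works at any height but drifts slowly;
    an ultrasonic ranger is accurate only close to the ground.
    """
    if ultrasonic_range is not None and ultrasonic_range < max_range:
        # Near the ground, trust the ultrasonic reading more heavily.
        return ultra_weight * ultrasonic_range + (1.0 - ultra_weight) * baro_alt
    # Out of ultrasonic range: fall back on the barometer alone.
    return baro_alt


print(fuse_altitude(baro_alt=1.42, ultrasonic_range=1.35))  # close to the ground
print(fuse_altitude(baro_alt=12.7, ultrasonic_range=None))  # ranger out of range
```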

“We are used to the robots of science fiction films being able to act independently, recognise objects and individuals and make decisions,” said Professor Sandor Veres, who is leading the research. “In the real world, however, although robots can be extremely intelligent individually, their ability to co-operate and interact with each other and with humans is still very limited.”

The new software could help to change that. With the sensor data fed into the autopilot software, the quadcopters can not only navigate safely, but also learn about nearby objects and fly to specific items.

The program developed by the Sheffield team also enables a quadcopter to communicate with the other machines in the fleet without overloading conventional communication infrastructure – a feature of critical importance for possible deployment in disaster zones, where communication networks may be disabled or already overloaded.

“The learning process the robots use here is similar to when two people meet in the street and need to get round each other,” explains research fellow, Jonathan Aitken. “They will simultaneously go to their left or right until they coordinate and avoid collision.”

The robots learn as they go: they start off flying at the same altitude and then work out independently which of them should fly higher and which lower so that they can pass each other.

The researchers used a mathematical framework called game theory to program the quadcopters. In this framework, each robot is a player in the game and must complete its given task in order to ‘win’ the game.

If the robots play the game repeatedly they start to learn each other’s behaviour. They can then perform their task successfully – in this case getting past the other robot – by using previous experience to estimate the behaviour of the other robot.
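The idea can be illustrated with a short, heavily simplified sketch – not the Sheffield software. Each robot keeps a running count of the other's past choices of passing altitude and picks the action that best responds to that estimate, a textbook learning rule known as fictitious play. The two-action game, the payoffs and the Python code below are illustrative assumptions only.

```python
# Hypothetical sketch of the repeated-game idea: two robots repeatedly choose
# to pass 'high' or 'low', count how often the other chose each option, and
# best-respond to that estimate (fictitious play). Payoffs are assumptions.

import random

ACTIONS = ("high", "low")


def payoff(mine, theirs):
    # The robots only pass safely if they pick different altitudes.
    return 1 if mine != theirs else 0


class RobotPlayer:
    def __init__(self):
        # Counts of the other robot's observed choices, starting uniform.
        self.opponent_counts = {a: 1 for a in ACTIONS}

    def choose(self):
        # Estimate the opponent's behaviour from past rounds and pick the
        # action with the best expected payoff against that estimate.
        total = sum(self.opponent_counts.values())
        freq = {a: self.opponent_counts[a] / total for a in ACTIONS}
        expected = {a: sum(freq[b] * payoff(a, b) for b in ACTIONS)
                    for a in ACTIONS}
        best = max(expected.values())
        return random.choice([a for a in ACTIONS if expected[a] == best])

    def observe(self, opponent_action):
        self.opponent_counts[opponent_action] += 1


robot_a, robot_b = RobotPlayer(), RobotPlayer()
for round_number in range(20):
    a, b = robot_a.choose(), robot_b.choose()
    robot_a.observe(b)
    robot_b.observe(a)
    print(round_number, a, b, "pass" if a != b else "collide")
```

After a few rounds the two players settle into complementary choices – one consistently passing high, the other low – which mirrors how repeated play lets the quadcopters anticipate each other's behaviour.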

“These simple tasks are part of a major research effort in the field of robotics at Sheffield University,” said Professor Veres. “The next step is to extend the programming capability so that multiple robots can collaborate with each other, enabling fleets of machines to interact and collaborate on more complex tasks.”

In future, the system could be deployed in areas too hazardous for human workers, such as zones contaminated by radiation.

Such autonomous drone fleets could also help to survey disaster zones and assist in search and rescue operations.

