Robots watch and learn to cooperate
Robot design is changing to embrace the cloud so that machines can cooperate more
Since the 1970s, the robot has threatened to take over manufacturing. But it’s a revolution that is taking a long time. A number of industries, despite heavy investment in automation, still use relatively few traditional robots, and those they do have are limited to specific tasks, confined to a safety cage.
Rethink Robotics chief marketing officer Jim Lawton said at the recent Supply Chain Insights Global Summit: “If you look around a factory that uses robots, they are usually either welding something or painting something. They are unsafe and they take hours to program for each job. It was an OECD study that said 90 per cent of jobs in production can’t get done with [traditional] robots.”
Now, the robot is changing to try to eat into that 90 per cent. Suppliers are making robots easier to teach and able to handle a much wider range of tasks, sometimes even pushing the software that organises them into the cloud. And it is not just the familiar robot arm that is changing.
Makers of traditional automation equipment have begun to embrace ideas from robotics, such as mechatronic control, to make automation more flexible, says Paul Goossens, vice president of engineering solutions at Maplesoft: “The kind of areas where we are starting to see increased interest is coming from traditional production-machine manufacturers in areas such as packaging. They are machines that do very specific things. We are now seeing a move away from that towards more mechatronic systems to give their customers more flexibility. If they need to reconfigure for a job, they can just enter new numbers and the system is automatically reconfigured.”
Another area where robots are mingling with traditional automation equipment is in electronics. The industry moved to surface-mount components to take advantage of progress in automated board “stuffing”. Pick-and-place machines can pop thousands of components onto a PCB in a matter of minutes before those boards are sent into a reflow-solder oven to be secured in place.
The problem for electronics manufacturers is the necessary components that are stubbornly resistant to automated placement. These are the high-voltage power transistors, various connectors and other fiddly parts that cannot fit into pick-and-place machinery. And then there are the final assembly steps that put the various circuit boards, displays and control panels into the final product. This has led to the development of robots such as Kuka Robotics’ KR3 Agilus, which can take on the parts that do not fit readily into a surface-mount flow, inserting difficult components into boards before they are transferred to the solder oven to be fixed in place.
Flexible robots potentially allow many different parts of a production line or cell to be brought under automated control. But the traditional problem has been one of integration. Machines on the production line often use manufacturer-specific protocols that require a lot of software investment to integrate.
By putting a webcam with a regular USB3 interface on just about any robot arm, cloud-software company Tend aims to expand the idea of using robots as machine attendants to a wide array of automation scenarios and to make it easier for them to work together with other machines.
“We are trying to solve a problem that many manufacturers have been trying to solve in different ways for some years. The concept and approach of Industry 4.0 and smart manufacturing in general is to connect all the robots along a production line,” says Eric Foellmer, chief marketing officer at Tend. “To do that they need to fit into a common communication protocol.”
Tend’s offering exploits the fact that many machines, from ovens to 3D printers and computer numerical control (CNC) lathes, have operator panels set up for humans to read status and set controls using buttons and knobs. The robot uses the camera to look at the user interfaces of production machinery to check on their status and, guided by what it sees, to perform the same button presses an operator would, as well as moving assemblies and parts around the work cell.
“No PLC is required and no integration. The interpretation is all done in the cloud,” Foellmer argues. “It’s a simplified approach: we are just reading the human-machine interface. We are looking forward to the time when these machines can share data directly but in the short term we can get the data this way.”
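The camera-reads-the-panel idea can be sketched in a few lines. The snippet below is a hypothetical illustration, not Tend’s actual pipeline: it classifies a machine’s status lamp from the average colour of a cropped region of a webcam frame. The function names, colour thresholds and mock pixel data are all invented for the example.

```python
# Hypothetical sketch of reading a machine's status from its panel lamp.
# A real system would crop the lamp region out of a live camera frame;
# here the crop is simulated as a list of (r, g, b) pixel tuples.

def average_colour(pixels):
    """Average (r, g, b) over a list of pixel tuples (a cropped lamp region)."""
    n = len(pixels)
    r = sum(p[0] for p in pixels) / n
    g = sum(p[1] for p in pixels) / n
    b = sum(p[2] for p in pixels) / n
    return (r, g, b)

def lamp_status(pixels):
    """Map the lamp's dominant colour to an assumed machine state."""
    r, g, b = average_colour(pixels)
    if g > 150 and g > r and g > b:
        return "RUNNING"   # green lamp
    if r > 150 and r > g and r > b:
        return "FAULT"     # red lamp
    return "IDLE"          # dim or amber

# A mock crop of a green status lamp
green_lamp = [(20, 200, 30)] * 16
print(lamp_status(green_lamp))  # -> RUNNING
```

In practice the interpretation would run in the cloud, as Foellmer describes, with the robot simply streaming frames upward.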
Rethink has taken a similar approach with the Intera 5 software for its Sawyer robots, using a combination of cameras and force sensors for both reacting to changes on the production line and for training. To show a robot what to do the operator moves Sawyer’s manipulator arms into different positions and the robot registers itself using the cameras.
At Tuthill Plastics, a Sawyer unit picks parts from a conveyor belt and communicates with a CNC machine to start and stop operations, moving the parts in and out of the tool. Using the sensors to apply a precise level of force while placing the part, Tuthill says it has been able to improve part quality and consistency, reducing a length defect on the part by 98 per cent since introducing the robot.
Richard Curtain, president of Tuthill, says: “Part placement is extremely critical to our machining process. Sawyer is able to effectively ensure product quality and consistency, handle the variability of the production line, and automatically re-register to the environment in the event that any parts move.”
As well as guiding the robot, the camera and software can act as a quality-control inspector. “You can use the camera to determine that, if there are meant to be ten screws, they are all in place. If there are only nine, the software triggers an alert,” says Foellmer.
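The check Foellmer describes reduces to comparing a detected count against an expected one. The sketch below is illustrative only: it assumes a vision detector already exists and takes its output (a list of bounding boxes) as given, with the count and alert text made up for the example.

```python
# Illustrative screw-count quality check: compare the number of detected
# fasteners against the expected count and raise an alert when any are
# missing. The detector producing the bounding boxes is assumed.

EXPECTED_SCREWS = 10

def inspect(detections, expected=EXPECTED_SCREWS):
    """Return (passed, message) for a quality-control check."""
    found = len(detections)
    if found < expected:
        return False, f"ALERT: only {found} of {expected} screws in place"
    return True, f"OK: all {expected} screws in place"

# Simulated detector output: nine bounding boxes instead of ten
boxes = [(i * 10, 0, 8, 8) for i in range(9)]
passed, msg = inspect(boxes)
print(msg)  # -> ALERT: only 9 of 10 screws in place
```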
In one demonstration by Rethink, the Sawyer robot looks first to see if cables are attached to an assembly and then uses the force sensors on its arm as it pulls gently at one to ensure it’s been secured.
With the Tend system, the alert can go to the smartphone of an operator or integrator who need not be on-site. “If you are a systems integrator who is remote from the customer’s facility, you can receive the alert and check on the robot’s status from the app. It could be any form of mobile device, but the smartphone is interesting from a control perspective. It’s a simplified user interface that is consistent across all types of robot.”
The app is not just for monitoring and alerts. The aim is to support round-the-clock automated production. “If your order quantity has changed you can pause production or modify the component count without having to go in,” Foellmer explains. “Another aspect of smart manufacturing is the distributed nature of what we are doing. We are pursuing the concept of goal-based manufacturing, with X robots to create Y widgets. Through the cloud-based approach you can distribute the workload across a fleet of robots.”
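The “X robots to create Y widgets” idea boils down to a scheduling problem. A toy sketch of the simplest possible allocator, with fleet names invented for illustration, might divide a target quantity across a fleet so that the counts differ by at most one:

```python
# Toy goal-based allocation: split a target widget count across a fleet
# of robots as evenly as possible. divmod spreads the remainder so no
# robot's share differs from another's by more than one.

def distribute(target, robots):
    """Assign each robot a share of the target quantity."""
    base, extra = divmod(target, len(robots))
    return {
        name: base + (1 if i < extra else 0)
        for i, name in enumerate(robots)
    }

fleet = ["cell-1", "cell-2", "cell-3"]
plan = distribute(1000, fleet)
print(plan)  # -> {'cell-1': 334, 'cell-2': 333, 'cell-3': 333}
```

A real cloud scheduler would weigh cycle times, material availability and machine health rather than splitting evenly, but the goal-first shape is the same.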
The workload could potentially be distributed across multiple facilities, opening up the possibility of cloud-based manufacturing in which general-purpose production lines that may comprise a bank of 3D printers, CNC machinery and assembly robots are rented for several hours at a time to fulfil periods of high demand before being returned to a pool and used for other systems. “You may run out of raw materials in one region, so you shift production to another,” Foellmer says. “Eventually there will be convergence with other types of cloud robots, such as autonomous vehicles.”
Goossens says physics-simulation tools are key to modelling the more flexible mechatronic systems that are appearing, to the extent of creating a ‘digital twin’ of the target machine in computer memory. One company uses the approach to perform “virtual commissioning”, he explains. “They can have an operator define what the machine is going to do on the shopfloor. They can put a task plan on the machine: ‘This is what we want the machine to do’. If it works as expected, they can hit commit and the instruction goes down to the actual machine and is implemented. It saves a huge amount of time and effort.”
The digital twin need not be switched off once the programs have been developed. “What I can see is more and more of these digital twins being run in parallel with the real machine. If the machine starts drifting away from the model, then we know something is wrong and the machine may need preventive maintenance.”
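The drift check Goossens describes can be reduced to comparing the twin’s prediction against the sensor reading and flagging maintenance when the deviation exceeds a tolerance. The sketch below is a minimal illustration of that idea; the signal, threshold and numbers are invented.

```python
# Minimal digital-twin drift check: flag preventive maintenance when the
# measured signal wanders from the model's prediction beyond a tolerance.
# The 5 per cent threshold is an assumption for illustration.

DRIFT_THRESHOLD = 0.05  # maximum tolerated relative deviation

def drifting(predicted, measured, threshold=DRIFT_THRESHOLD):
    """True if the machine deviates from its digital twin beyond tolerance."""
    return abs(measured - predicted) / abs(predicted) > threshold

# Twin predicts a spindle torque of 12.0 Nm; the sensor reads 13.1 Nm,
# a deviation of about 9 per cent, so the check fires.
if drifting(12.0, 13.1):
    print("Deviation from digital twin: schedule preventive maintenance")
```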
Data from the shop floor can inform the creation of future models and more optimised control strategies using techniques such as machine learning, Goossens adds.
It does not mean the entire factory will be automated with parts flowing back and forth on self-driving trucks loaded by robotic fork lifts. Foellmer expects robots to work in collaboration with people who may walk in and out of a robot’s work cell to handle irregular jobs while the robot gets on with repetitive tasks. “The collaborative space is where we started in terms of early development. But, at the moment, there is simply a larger installed base of traditional industrial robots,” he says.
As robots become more commonplace and move out of their traditional safety cages and into shared work cells, they will need to incorporate hardware and software interlocks to stop them from inadvertently injuring the people working around them. TUV Süd Product Service points to standards such as ISO/TS 15066, which is being prepared to provide guidance on how robot designers should implement safety checks to reduce the risk of injury.
Makers of collaborative robots such as Rethink have already put in place mechanisms to stop their robots from becoming a workplace menace. The Baxter and Sawyer robots, for example, use their force sensors to detect obstructions and recoil from them, rather than simply pushing through the way a traditional industrial robot might. A hand or arm in the robot’s path presents enough resistance for it to stop and move back.
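The stop-and-recoil behaviour amounts to a guard inside the motion loop. The following is a hedged sketch of that logic only, not Rethink’s actual control code: the force limit, sample values and function names are all illustrative.

```python
# Hedged sketch of a collaborative safety guard: walk through force
# samples taken during a move and, if measured resistance exceeds a
# threshold, stop and back off instead of pushing through.

CONTACT_LIMIT_N = 25.0  # newtons of resistance treated as an obstruction

def step_with_guard(force_readings, limit=CONTACT_LIMIT_N):
    """Step through force samples; on contact, stop and recoil."""
    for step, force in enumerate(force_readings):
        if force > limit:
            return f"stopped at step {step}, recoiling"
    return "move completed"

# Free motion, then a hand in the arm's path at the fourth sample
print(step_with_guard([2.1, 2.3, 2.0, 40.5, 2.2]))
# -> stopped at step 3, recoiling
```

A production controller would run this check at the servo rate and command an actual retract trajectory; the point here is only the threshold-and-back-off structure.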
These different strands of development, from cloud to collaboration, point to a much more automated future and a revolution that deserves the tag Industry 4.0.