Road to robotic automation in food processing
The food industry is entering an era similar to the one the automobile industry went through decades ago. Automation is coming; that much we know. How, and to what extent, remains to be seen.
Robotics was featured prominently at PACK EXPO Las Vegas last fall. Machines that can grip, sort and pack could be found throughout the massive show floor, and there was even a Robotics Zone area dedicated solely to the latest technology.
For produce processors, many of whom are already struggling to find both skilled and unskilled labor, replacing human workers with robotics will be front of mind in the coming years. With computerized vision technology and artificial intelligence (AI) still evolving, these could be tricky waters to navigate.
Rick Brookshire, director of product development at Epson Robots, offered a few suggestions on what companies should pin down when they start thinking about automation.
“Instead of just saying, ‘We make these. Can you help?’ we want people to be thinking about how many they want to make in a month,” Brookshire said. “That helps us determine cycle times and how many robots.
“Also, do you have your current process documented? Is it a five-part process? Is it a 20-part process? What kind of tooling do you think you’re going to need? What is your payload? These are all things you need to think about before you go talk to a robot vendor.”
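Brookshire's monthly-volume question reduces to simple arithmetic: the available production time divided by the target volume gives a per-unit time budget, and comparing that budget to a robot's cycle time suggests how many robots the line needs. The sketch below illustrates that back-of-the-envelope sizing; all of the numbers (hours per day, days per month, a 4-second pick cycle) are hypothetical, and a real estimate would also account for changeovers, downtime, and tooling.

```python
import math

def robots_needed(units_per_month, hours_per_day, days_per_month, robot_cycle_s):
    """Rough estimate of how many robots a target monthly volume implies.

    All inputs are hypothetical illustration values, not vendor figures.
    """
    available_s = hours_per_day * 3600 * days_per_month   # run time per month
    required_cycle_s = available_s / units_per_month      # time budget per unit
    # If one robot's cycle time exceeds the budget, more robots run in parallel.
    return math.ceil(robot_cycle_s / required_cycle_s)

# Example: 500,000 units/month, 16 h/day, 22 days/month, 4 s per pick
print(robots_needed(500_000, 16, 22, 4.0))  # → 2
```

The same framing answers Brookshire's other questions in reverse: a documented five-part or 20-part process fixes the realistic cycle time per unit, which is exactly the `robot_cycle_s` input above.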
The safety regulations and transparency requirements that food producers deal with are another important consideration when thinking about automation, said Lisa Donnelly, vice president of marketing for Soft Robotics.
“Primary and secondary food contact requires different food safety requirements. Robotic companies who do not have food experience may not be aware of all of the regulatory and safety requirements,” Donnelly said. “This is important when you consider these two industry forces at play: the food industry is a growing market for automation, while automotive is declining. Those robotics companies that have serviced the automotive industry are looking to enter new markets — such as the food industry — but lack expertise.”
It’s important to take into account the difference between industrial and collaborative robots (cobots), Donnelly added. With fresh produce, both delicate handling and speed come into play.
“Collaborative robots are growing in popularity due to their ease of deployment and use,” she said. “A common mistake is not understanding the differences in speed capability with collaborative robots, which are purposefully slower than industrial robots due to the safety aspects of collaborative robots. Collaborative robots also aren’t always less expensive than industrial robots.”
Gripping and picking apparatuses, such as Soft Robotics’ mGrip, should be able to adapt to the size and shape of the fruit or vegetable, minimizing tooling changes so the produce does not get damaged or bruised.
Computer vision technology can detect defects in items, even defects too small for the naked eye, on production lines where the items are supposed to be uniform in shape and size. No two pieces of produce are exactly the same, however.
“In the industrial space, vision systems are trained to look for something very specific. Let’s say for a pen, the outside of the pen is always the same,” Brookshire said. “But if you put five strawberries down, they all look different.”
Currently, people are needed to pick out defective produce on a processing or production line. Brookshire said AI technology is close to enabling robots to learn what to look for in a defective piece of produce, the same way an experienced human picker has learned.
“Imagine if you could put 100 good fruits and 100 bad fruits in front of a vision system and it could learn the differences. Slowly, it builds up a database,” Brookshire said. “It’s not exactly what a good one looks like, but what the characteristics are of a good one and what the characteristics are of a bad one. You could put a fruit in front of it, and it would determine, based on what it knows right now, that’s a bad one.
“That’s really no different than a human. If I’m a new person on the line, I probably won’t be able to detect as many bad ones as someone who has been doing it for five years.”
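The learning loop Brookshire describes, showing a system labeled good and bad examples so it accumulates the characteristics of each, can be sketched as a toy classifier. The version below uses a simple nearest-centroid rule over two made-up features (a color score and a firmness score); a real vision system would extract far richer features, and none of these names or numbers come from Epson.

```python
def centroid(samples):
    """Average feature vector of a list of labeled samples."""
    n = len(samples)
    return [sum(s[i] for s in samples) / n for i in range(len(samples[0]))]

def classify(item, good_centroid, bad_centroid):
    """Label an item by whichever learned profile it sits closer to."""
    dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b))
    return "good" if dist(item, good_centroid) <= dist(item, bad_centroid) else "bad"

# Hypothetical training data: [color_score, firmness], both on a 0-1 scale
good_fruit = [[0.90, 0.80], [0.85, 0.90], [0.95, 0.85]]
bad_fruit  = [[0.30, 0.40], [0.20, 0.50], [0.35, 0.30]]

# "Slowly, it builds up a database": here, just two averaged profiles
good_c, bad_c = centroid(good_fruit), centroid(bad_fruit)

print(classify([0.88, 0.82], good_c, bad_c))  # → good
print(classify([0.25, 0.45], good_c, bad_c))  # → bad
```

As with the new person on the line, the system's judgments improve as more labeled examples shift the learned profiles toward what good and bad actually look like.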
Brookshire said such technology could be available to producers in the next five years.
“We’ll be able to do it in some scope,” he said. “Will it be perfect? No, but we’ll be able to start doing it. It might not be in the masses at that point; but in 10 years, it probably will be.”
With any new technology, companies are reluctant to depend on it until it’s been proven to work in a real production environment. During an interactive forum at PACK EXPO, titled “Robotics: Identifying Applications and Justifying the Investments,” some show attendees were openly skeptical of computerized vision defect detection, even for ready-to-ship packaged products, much less raw produce.
“Robotics hasn’t caught up to detecting end-of-line defects the way people can,” said one attendee, who added he’s avoided replacing human packers with automation for that reason. “If I need someone to inspect the product, I may as well have them box it too.”
Brookshire said automation in food production will likely occur in small measures, perhaps one trial production line for larger producers.
“It depends on the company. Some of the big ones are very quick to take this kind of thing on,” he said. “For the small ones, they might look at it as a way to compete and get their costs down.
“Once it’s proven and factories see other factories doing it, they’ll say, ‘We have to do that.’”
Ultimately, there is going to be trial and error with robotics in food production.
“Really, it’s the same thing that happened with the automobile industry 30 to 40 years ago,” Brookshire said. “We tried to do everything with robots, and we couldn’t.
“You have to learn which things you can do well with robots and which you can’t.”
— By Zeke Jennings, managing editor