New robotic technology appeals to Australian apple industry
A Monash University research team has developed a robot capable of identifying, picking and depositing apples in as little as seven seconds.
A research team led by Chao Chen in Monash University’s Department of Mechanical and Aerospace Engineering has developed an autonomous harvesting robot capable of identifying, picking and depositing apples in as little as seven seconds at full capacity.
Following extensive trials in February and March at Fankhauser Apples in Drouin, Victoria, Australia, the robot was able to harvest more than 85% of all reachable apples in the canopy as identified by its vision system.
Of all apples harvested, less than 6% were damaged due to stem removal. Apples without stems can still be sold, but may not meet the cosmetic standards of some retailers.
With the robot limited to half its maximum speed, the median harvest rate was 12.6 seconds per apple. In streamlined pick-and-drop scenarios, the cycle time reduced to roughly nine seconds.
At full speed, harvesting time can drop to as little as seven seconds per apple.
“Our developed vision system can not only positively identify apples in a tree within its range in an outdoor orchard environment by means of deep learning, but also identify and categorize obstacles, such as leaves and branches, to calculate the optimum trajectory for apple extraction,” Chen said.
Automated harvesting robots are a promising technology for the agricultural industry, but they pose significant technical challenges for fruit and vegetable growers.
Robotic harvesting of fruit and vegetables requires the vision system to detect and localize the produce. To increase the success rate and reduce damage to produce during harvesting, information on the fruit’s shape and on the location and orientation of the stem-branch joint is also required.
To address this problem, the researchers created a state-of-the-art motion-planning algorithm that rapidly generates collision-free trajectories, minimizing processing and travel time between apples, reducing overall harvesting time and maximizing the number of apples that can be harvested from a single location.
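The planning idea described above, quickly finding collision-free paths between many targets, can be illustrated with a deliberately simplified sketch. This is not the team’s algorithm: it assumes apples and obstacles are reduced to 3-D points and spheres, checks only straight-line segments for collisions, and orders targets greedily by distance. The function names (`segment_clear`, `plan_order`) and the spherical-obstacle model are invented for illustration.

```python
import numpy as np

def segment_clear(p0, p1, obstacles, radius):
    """Return True if the straight segment p0 -> p1 stays at least
    `radius` away from every spherical obstacle centre."""
    d = p1 - p0
    length_sq = float(np.dot(d, d))
    if length_sq == 0.0:
        return True
    for c in obstacles:
        # Closest point on the segment to the obstacle centre.
        t = np.clip(np.dot(c - p0, d) / length_sq, 0.0, 1.0)
        closest = p0 + t * d
        if np.linalg.norm(c - closest) < radius:
            return False
    return True

def plan_order(start, apples, obstacles, radius=0.05):
    """Greedily visit the nearest apple reachable by a collision-free
    straight-line move, skipping apples that are blocked."""
    remaining = [np.asarray(a, float) for a in apples]
    pos, order = np.asarray(start, float), []
    while remaining:
        reachable = [a for a in remaining
                     if segment_clear(pos, a, obstacles, radius)]
        if not reachable:
            break
        nxt = min(reachable, key=lambda a: np.linalg.norm(a - pos))
        order.append(nxt)
        remaining = [a for a in remaining if not np.array_equal(a, nxt)]
        pos = nxt
    return order
```

A real planner would reason about the full arm geometry and branch occlusions rather than point-to-point segments, but the same trade-off applies: cheaper collision checks mean more apples planned per second.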
The robot’s vision system can identify more than 90% of apples within the camera’s view from a distance of approximately 1.2m. The system works in all lighting and weather conditions, including intense sunlight and rain, and takes less than 200 milliseconds to process the image of an apple.
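As a rough illustration of the detect-and-localize step, not the team’s actual deep-learning pipeline, the sketch below filters hypothetical detector output for confident apple detections and back-projects each detection centre to a 3-D target using a pinhole camera model. The detection format, class IDs, focal lengths and confidence threshold are all assumptions made up for this example.

```python
import numpy as np

# Hypothetical detector output: (x, y, w, h, confidence, class_id),
# where class 0 = apple and class 1 = obstacle (leaf/branch).
detections = np.array([
    [320.0, 240.0, 40.0, 40.0, 0.95, 0],   # confident apple
    [100.0, 400.0, 60.0, 30.0, 0.40, 0],   # low-confidence apple, rejected
    [500.0, 120.0, 80.0, 50.0, 0.90, 1],   # branch, rejected
])

FX = FY = 600.0          # assumed focal lengths in pixels
CX, CY = 320.0, 240.0    # assumed principal point

def apples_in_view(dets, depth_lookup, conf_thresh=0.5):
    """Keep confident apple detections and back-project each centre
    to a 3-D grasp target (metres) with the pinhole model."""
    targets = []
    for x, y, w, h, conf, cls in dets:
        if cls != 0 or conf < conf_thresh:
            continue
        z = depth_lookup(x, y)   # depth in metres, e.g. from an RGB-D camera
        targets.append(((x - CX) * z / FX, (y - CY) * z / FY, z))
    return targets

# Constant-depth stand-in for a depth camera, at the article's ~1.2 m range.
targets = apples_in_view(detections, lambda x, y: 1.2)
```

Only the first detection survives the filter, yielding a single 3-D target roughly 1.2 m in front of the camera; the real system would feed such targets to the motion planner.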
“We also implemented a ‘path-planning’ algorithm that was able to generate collision-free trajectories for more than 95% of all reachable apples in the canopy. It takes just eight seconds to plan the entire trajectory for the robot to grasp and deposit an apple,” Chen said. “The robot grasps apples with a specially designed, pneumatically powered, soft gripper with four independently actuated fingers and suction system that grasps and extracts apples efficiently, while minimizing damage to the fruit and the tree itself.
“In addition, the suction system draws the apple from the canopy into the gripper, reducing the need for the gripper to reach into the canopy and potentially damaging its surroundings. The gripper can extract more than 85% of all apples from the canopy that were planned for harvesting.”
Chen said the system can help address the current labour shortage in Australia’s agricultural sector, as well as a looming food crisis as the population grows and arable land declines. He said such technological advances could also increase fruit productivity and attract younger people to working on farms.
The research team includes Chao Chen, Wesley Au, Xing Wang, Hugh Zhou, and Hanwen Kang in LMGA at Monash. The project is funded by the Australian Research Council Industrial Transformation Research Hubs scheme.