Berry Picking Bots

MSU scientists have an "eye" on blackberry harvesting

By: Meg Henderson

MAFES scientists, collaborating with Georgia Tech and the University of Arkansas, have developed a robotic blackberry picker. (Photo by Anthony Gunderman, Georgia Tech and University of Arkansas System Division of Agriculture)


Many modern agricultural crops are harvested by machine, but blackberries destined for the fresh market must be hand-picked to preserve the fruit's delicate structure at peak ripeness. Scientists in the Mississippi Agricultural and Forestry Experiment Station, or MAFES, however, are turning to advanced technologies to automate the labor-intensive work of harvesting this high-value specialty crop.

Dr. Xin Zhang, an assistant professor in the agricultural and biological engineering department, is co-principal investigator on a $1 million multi-institutional effort, funded by the USDA National Institute of Food and Agriculture's National Robotics Initiative 3.0 (NRI-3.0) program in collaboration with the National Science Foundation, to design a robotic harvesting system powered by state-of-the-art deep learning. Zhang's contribution to the project is an efficient blackberry detection and localization system: the "eyes" and "brain" of the harvester.

The perception system designed by Zhang and her team is powered by YOLOv8 (You Only Look Once), a vision-based object detection model that quickly and accurately identifies and locates objects of interest, in this case, ripe blackberries. The same class of technology supports robots, surveillance systems, and self-driving cars.
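
The article does not name the team's software stack, but a minimal detection pass with the open-source ultralytics YOLOv8 package looks roughly like the sketch below; the weights file and image name are placeholders, not the project's actual files.

```python
# Minimal YOLOv8-style detection sketch, assuming the open-source
# "ultralytics" package; "blackberry_yolov8.pt" and "canopy_image.jpg"
# are hypothetical stand-ins for the team's trained model and data.
from ultralytics import YOLO

model = YOLO("blackberry_yolov8.pt")        # load detection weights
results = model("canopy_image.jpg")         # run inference on one image

for box in results[0].boxes:                # each detected berry
    cls_id = int(box.cls)                   # predicted class index
    conf = float(box.conf)                  # detection confidence
    x1, y1, x2, y2 = box.xyxy[0].tolist()   # bounding-box corners (pixels)
    print(results[0].names[cls_id], conf, (x1, y1, x2, y2))
```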

Zhang and her team trained a series of YOLO models to not only identify each blackberry on a bush but also classify its level of ripeness as ripe (black in color), ripening (red), or unripe (green). The models were trained on more than 1,000 images of plant canopies collected in commercial Arkansas orchards and programmed to single out ripe berries for harvest while tracking the others in preparation for the next round of harvest.
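
A hedged sketch of how such a three-class model might be trained with the same package follows; the dataset file name, epoch count, and image size are illustrative assumptions, since the article only states that more than 1,000 annotated canopy images were used.

```python
# Hypothetical training run for the three ripeness classes. The dataset
# config "blackberries.yaml" (not described in the article) would point
# YOLO at the annotated canopy images and declare the class names
# ["ripe", "ripening", "unripe"].
from ultralytics import YOLO

model = YOLO("yolov8n.pt")        # start from a small pretrained backbone
model.train(
    data="blackberries.yaml",     # images plus the three ripeness classes
    epochs=100,                   # illustrative values only
    imgsz=640,
)
```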

"Our main focus is detecting the ripe berries for the robotic harvester, but we added the other two categories to provide a total berry count," she said. "This way, the system not only identifies harvest-ready berries but also helps provide growers estimates of their total harvest."

In tests of multiple configurations and variants, the best-performing model was 94 percent accurate in identifying ripe berries, 91 percent for ripening berries, and 88 percent for unripe berries. It also processed high-resolution images in real time, at 21.5 milliseconds per image.

While the MSU team has been developing this critical component of the automated harvester, their partners at the Georgia Institute of Technology are working on a patented soft-touch robotic gripper, a soft arm, and a bipedal mobile platform that will work hand-in-glove with the MSU-trained perception system. The prototype gripper is equipped with sensors at its tips, like tiny fingertips, that allow it to grasp and pick a berry without squeezing and damaging it.

"The perception system identifies the berry and sends out 3-D coordinates, including distance, to the robotic arm, which uses that feedback to reach out and pick the berry," Zhang said. "It is critical that our perception system communicates quickly and accurately with the arm and gripper system.

Aside from their contributions to the harvester, Zhang and her team are beginning to develop a mobile app based on their image detection system.

"The app is a separate project, but it would give growers a quick and easy way to forecast their total harvest at the beginning of the harvest season so that they can swiftly adjust their marketing strategy," she said.


The research is funded by USDA NIFA's National Robotics Initiative 3.0 (NRI-3.0) program in collaboration with the National Science Foundation.
