Recognition and localization of strawberries from 3D binocular cameras for a strawberry picking robot using coupled YOLO/Mask R-CNN
Abstract
Keywords: strawberry detection, 3D point cloud, mean-shift, clustering method
DOI: 10.25165/j.ijabe.20221506.7306
Citation: Hu H M, Kaizu Y, Zhang H D, Xu Y W, Imou K, Li M, et al. Recognition and localization of strawberries from 3D binocular cameras for a strawberry picking robot using coupled YOLO/Mask R-CNN. Int J Agric & Biol Eng, 2022; 15(6): 175–179.
References
Hikawa-Endo M. Improvement in the shelf-life of Japanese strawberry fruits by breeding and post-harvest techniques. The Horticulture Journal, 2020; 89(2): 115–123.
Nishizawa T. Current status and future prospect of strawberry production in East Asia and Southeast Asia. In: Proceedings of the IX International Strawberry Symposium, 2021; pp.395–402.
Yoshida T, Fukao T, Hasegawa T. Fast detection of tomato peduncle using point cloud with a harvesting robot. Journal of Robotics and Mechatronics, 2018; 30(2): 180–186.
Takenaga. Strawberry harvesting robot for greenhouses. Japan Strawberry Seminar 1998 and Added Information. Tokyo, Japan: The Chemical Daily, 1998; pp.6–11. (in Japanese)
Hayashi S, Takahashi K, Yamamoto S, Saito S, Komeda T. Gentle handling of strawberries using a suction device. Biosystems Engineering, 2011; 109(4): 348–356.
Han K S, Kim S C, Lee Y B, Kim S C, Im D H, Choi H K, et al. Strawberry harvesting robot for bench-type cultivation. Journal of Biosystems Engineering, 2012; 37(1): 65–74.
Xiong Y, Ge Y, Grimstad L, From P J. An autonomous strawberry-harvesting robot: Design, development, integration, and field evaluation. Journal of Field Robotics, 2020; 37(2): 202–224.
Xiong Y, Peng C, Grimstad L, From P J, Isler V. Development and field evaluation of a strawberry harvesting robot with a cable-driven gripper. Computers and Electronics in Agriculture, 2019; 157: 392–402.
Feng Q C, Wang X, Zheng W G, Qiu Q, Jiang K. A new strawberry harvesting robot for elevated-trough culture. Int J Agric & Biol Eng, 2012; 5(2): 1–8.
De Preter A, Anthonis J, De Baerdemaeker J. Development of a robot for harvesting strawberries. IFAC-PapersOnLine, 2018; 51(17): 14–19.
Cui Y, Gejima Y, Kobayashi T, Hiyoshi K, Nagata M. Study on cartesian-type strawberry-harvesting robot. Sensor Letters, 2013; 11(6-7): 1223–1228.
Yu Y, Zhang K, Liu H, Yang L, Zhang D. Real-time visual localization of the picking points for a ridge-planting strawberry harvesting robot. IEEE Access, 2020; 8: 116556–116568.
Xu L M, Zhang T Z. Influence of light intensity on extracted colour feature values of different maturity in strawberry. New Zealand Journal of Agricultural Research, 2007; 50(5): 559–565.
Zhang L, Ma X, Liu G, Zhou W, Zhang M. Recognition and positioning of strawberry fruits for harvesting robot based on convex hull. In: 2014 Montreal, Quebec Canada July 13–16, ASABE, 2014; doi: 10.13031/aim.20141902612.
Lei H, Huang K, Jiao Z, Tang Y, Zhong Z, Cai Y. Bayberry segmentation in a complex environment based on a multi-module convolutional neural network. Applied Soft Computing, 2022; 119: 108556. doi: 10.1016/j.asoc.2022.108556.
Kai H, Huan L, Zeyu J, Tianlun H, Zaili C, Nan W. Bayberry maturity estimation algorithm based on multi-feature fusion. In: 2021 IEEE International Conference on Artificial Intelligence and Computer Applications (ICAICA), 2021; pp.514–518. doi: 10.1109/ICAICA52286.2021.9498084.
Liu J, Wang X. Plant diseases and pests detection based on deep learning: a review. Plant Methods, 2021; 17(1): 1–18.
Redmon J, Divvala S, Girshick R, Farhadi A. You only look once: Unified, real-time object detection. In: Proceedings of the IEEE conference on computer vision and pattern recognition, 2016.
He K, Gkioxari G, Dollár P, Girshick R. Mask R-CNN. In: Proceedings of the IEEE International Conference on Computer Vision, 2017; pp.2961–2969.
Yu Y, Zhang K, Yang L, Zhang D. Fruit detection for strawberry harvesting robot in non-structural environment based on Mask-RCNN. Computers and Electronics in Agriculture, 2019; 163: 104846. doi: 10.1016/j.compag.2019.06.001.
Kirsten E, Inocencio L C, Veronez M R, da Silveira L G, Bordin F, Marson F P. 3D data acquisition using stereo camera. In: IEEE International Geoscience and Remote Sensing Symposium, 2018; pp.9214–9217. doi: 10.1109/igarss.2018.8519568.
Ortiz L E, Cabrera E V, Gonçalves L M. Depth data error modeling of the ZED 3D vision sensor from stereolabs. ELCVIA: Electronic Letters on Computer Vision and Image Analysis, 2018; 17(1): 1–15. doi: 10.5565/rev/elcvia.1084.
Gupta T, Li H. Indoor mapping for smart cities—an affordable approach: Using kinect sensor and ZED stereo camera. In: 2017 International Conference on Indoor Positioning and Indoor Navigation (IPIN), 2017; pp.1–8.
Tagarakis A C, Kalaitzidis D, Filippou E, Benos L, Bochtis D. 3D scenery construction of agricultural environments for robotics awareness. Information and Communication Technologies for Agriculture—Theme III: Decision. Cham: Springer, 2022; pp. 125–142. doi: 10.1007/978-3-030-84152-2_6.
Rahul Y, Nair B B. Camera-based object detection, identification and distance estimation. In: 2nd International Conference on Micro-Electronics and Telecommunication Engineering (ICMETE), 2018; pp.203–205. doi: 10.1109/icmete.2018.00052.
Zhang Z. A flexible new technique for camera calibration. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2000; 22(11): 1330–1334.
Copyright (c) 2022 International Journal of Agricultural and Biological Engineering
This work is licensed under a Creative Commons Attribution 4.0 International License.