Fine-tuning faster region-based convolution neural networks for detecting poultry feeding behaviors
Abstract
Keywords: faster R-CNN, feeding behavior, poultry, feature extractor, hyperparameter
DOI: 10.25165/j.ijabe.20241801.6344
Citation: Hui X, Zhang D L, Jin W, Ma Y C, Li G M. Fine-tuning faster region-based convolution neural networks for detecting poultry feeding behaviors. Int J Agric & Biol Eng, 2025; 18(1): 64–73.
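The study fine-tunes Faster R-CNN detectors with different feature extractors and hyperparameters to recognize poultry feeding behaviors. For orientation only, the short sketch below shows one common way to fine-tune a pretrained Faster R-CNN for a single "feeding bird" class using PyTorch/torchvision (version 0.13 or later assumed); the framework, backbone, class name, and hyperparameter values here are illustrative assumptions, not the configuration reported in the paper.

import torch
import torchvision
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor

# Hypothetical two-class problem: background + "feeding bird".
num_classes = 2

# Load a COCO-pretrained detector and swap in a new box-prediction head
# sized for the target classes (the standard torchvision fine-tuning recipe).
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
in_features = model.roi_heads.box_predictor.cls_score.in_features
model.roi_heads.box_predictor = FastRCNNPredictor(in_features, num_classes)

# Placeholder hyperparameters; the paper compares such settings, but these
# values are not taken from its results.
optimizer = torch.optim.SGD(
    [p for p in model.parameters() if p.requires_grad],
    lr=0.005, momentum=0.9, weight_decay=0.0005,
)

# One dummy training step: images are CHW tensors scaled to [0, 1]; targets
# carry ground-truth boxes (xmin, ymin, xmax, ymax) and integer class labels.
model.train()
images = [torch.rand(3, 480, 640)]
targets = [{"boxes": torch.tensor([[100.0, 120.0, 220.0, 260.0]]),
            "labels": torch.tensor([1])}]
loss_dict = model(images, targets)  # returns RPN and ROI-head losses
loss = sum(loss_dict.values())
optimizer.zero_grad()
loss.backward()
optimizer.step()

In a study like this one, the feature extractor (e.g., ResNet or Inception variants) and hyperparameters such as the learning rate would be varied and evaluated on labelled images of birds at the feeder, which is the comparison the title and keywords point to.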
References
Wang S Y, Jiang G P, Pan C H, Santos T, Elhadidi Y, Jado A, et al. Light demand characteristics, production performance, and changes in the feeding patterns of broilers. Int J Agric & Biol Eng, 2024; 17(2): 68–73.
Howie J A, Tolkamp B J, Avendano S, Kyriazakis I. The structure of feeding behavior in commercial broiler lines selected for different growth rates. Poultry Science, 2009; 88(6): 1143–1150.
Collins L M, Sumpter D J T. The feeding dynamics of broiler chickens. Journal of the Royal Society Interface, 2006; 4(12): 65–72.
Ferket P R, Gernat A G. Factors that affect feed intake of meat birds: A review. International Journal of Poultry Science, 2006; 5(10): 905–911.
Urton G, Keyserlingk M A G V, Weary D M. Feeding behavior identifies dairy cows at risk for metritis. Journal of Dairy Science, 2005; 88(8): 2843–2849.
Li L, Zhao Y, Oliveira J, Verhoijsen W, Liu K, Xin H. A UHF RFID system for studying individual feeding and nesting behaviors of group-housed laying hens. Transactions of the ASABE, 2017; 60(4): 1337–1347.
Li G, Zhao Y, Hailey R, Zhang N, Liang Y, Purswell J L. An ultra-high frequency radio frequency identification system for studying individual feeding and drinking behaviors of group-housed broilers. Animal, 2019; 13(9): 2060–2069.
Yang X, Dai H, Wu Z, Bist R B, Subedi S, Sun J, et al. An innovative segment anything model for precision poultry monitoring. Computers and Electronics in Agriculture, 2024; 222: 109045.
Li G, Zhao Y, Chesser D, Lowe J W, Purswell J L. Image processing for analyzing broiler feeding and drinking behaviors. ASABE Annual International Meeting, 2019. DOI: 10.13031/aim.201900165
Zou X G, Yin Z L, Li Y H, Gong F, Bai Y G, Zhao Z H, et al. Novel multiple object tracking method for yellow feather broilers in a flat breeding chamber based on improved YOLOv3 and deep SORT. Int J Agric & Biol Eng, 2023; 16(5): 44–55.
Zhou M, Zhu J H, Cui Z H, Wang H Y, Sun X Q. Detection of abnormal chicken droppings based on improved Faster R-CNN. Int J Agric & Biol Eng, 2023; 16(1): 243–249.
Gené-Mola J, Vilaplana V, Rosell-Polo J R, Morros J R, Ruiz-Hidalgo J, Gregorio E. Multi-modal deep learning for Fuji apple detection using RGB-D cameras and their radiometric capabilities. Computers and Electronics in Agriculture, 2019; 162: 689–698.
Kang H, Chen C. Fast implementation of real-time fruit detection in apple orchards using deep learning. Computers and Electronics in Agriculture, 2020; 168: 105108.
Kang H, Chen C. Fruit detection, segmentation and 3D visualisation of environments in apple orchards. Computers and Electronics in Agriculture, 2020; 171: 105302.
Liang C, Xiong J, Zheng Z, Zhong Z, Li Z, Chen S, Yang Z. A visual detection method for nighttime litchi fruits and fruiting stems. Computers and Electronics in Agriculture, 2020; 169: 105192.
Li G, Ji B, Li B, Shi Z, Zhao Y, Dou Y, et al. Assessment of layer pullet drinking behaviors under selectable light colors using convolutional neural network. Computers and Electronics in Agriculture, 2020; 172: 105333.
Li R, Wang R, Xie C, Liu L, Zhang J, Wang F, et al. A coarse-to-fine network for aphid recognition and detection in the field. Biosystems Engineering, 2019; 187: 39–52.
Li G, Xu Y, Zhao Y, Du Q, Huang Y. Evaluating convolutional neural networks for cage-free floor egg detection. Sensors, 2020; 20(2): 332.
Nasirahmadi A, Sturm B, Edwards S, Jeppsson K H, Olsson A C, Müller S, et al. Deep learning and machine vision approaches for posture detection of individual pigs. Sensors, 2019; 19(17): 3738.
Tu S, Xue Y, Zheng C, Qi Y, Wan H, Mao L. Detection of passion fruits and maturity classification using Red-Green-Blue Depth images. Biosystems Engineering, 2018; 175: 156–167.
Wang D, Tang J, Zhu W, Li H, Xin J, He D. Dairy goat detection based on Faster R-CNN from surveillance video. Computers and Electronics in Agriculture, 2018; 154: 443–449.
Xu B, Wang W, Falzon G, Kwan P, Guo L, Chen G, et al. Automated cattle counting using Mask R-CNN in quadcopter vision system. Computers and Electronics in Agriculture, 2020; 171: 105300.
Yang Q, Xiao D, Lin S. Feeding behavior recognition for group-housed pigs with the Faster R-CNN. Computers and Electronics in Agriculture, 2018; 155: 453–460.
Zhang Y, Cai J, Xiao D, Li Z, Xiong B. Real-time sow behavior detection based on deep learning. Computers and Electronics in Agriculture, 2019; 163: 104884.
Zhu X, Chen C, Zheng B, Yang X, Gan H, Zheng C, et al. Automatic recognition of lactating sow postures by refined two-stream RGB-D faster R-CNN. Biosystems Engineering, 2020; 189: 116–132.
Li G, Li B, Shi Z, Zhao Y, Ma H. Design and evaluation of a lighting preference test system for laying hens. Computers and Electronics in Agriculture, 2018; 147: 118–125.
Hy-Line International. Growing management of commercial pullets. 2013. http://www.hyline.com/UserDocs/Pages/TB_PULLET_MGMT_ENG.pdf. Accessed on [2021-08-09].
Ren S, He K, Girshick R, Sun J. Faster R-CNN: Towards real-time object detection with region proposal networks. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2017; 39(6): 1137–1146.
Szegedy C, Vanhoucke V, Ioffe S, Shlens J, Wojna Z. Rethinking the inception architecture for computer vision. IEEE Conference on Computer Vision and Pattern Recognition, 2016; pp. 2818–2826. DOI: 10.1109/CVPR.2016.308
He K, Zhang X, Ren S, Sun J. Deep residual learning for image recognition. IEEE Conference on Computer Vision and Pattern Recognition, 2016; pp. 770–778. DOI: 10.1109/CVPR.2016.90
Szegedy C, Ioffe S, Vanhoucke V, Alemi A A. Inception-v4, Inception-ResNet and the impact of residual connections on learning. 31st AAAI Conference on Artificial Intelligence, 2017; pp. 4278–4284. DOI: 10.48550/arXiv.1602.07261
Lotter W, Sorensen G, Cox D. A multi-scale CNN and curriculum learning strategy for mammogram classification. Deep Learning in Medical Image Analysis and Multimodal Learning for Clinical Decision Support, Springer, 2017; 10553: 169–177.
Lin T Y, Maire M, Belongie S, Hays J, Perona P, Ramanan D, et al. Microsoft COCO: Common objects in context. 13th European Conference on Computer Vision, Springer, 2014; 8693: 740–755.
Geiger A, Lenz P, Stiller C, Urtasun R. Vision meets robotics: The KITTI dataset. The International Journal of Robotics Research, 2013; 32(11): 1231–1237.
Gu C, Sun C, Ross D A, Vondrick C, Pantofaru C, Li Y, et al. AVA: A video dataset of spatio-temporally localized atomic visual actions. IEEE Conference on Computer Vision and Pattern Recognition, 2018; pp. 6047–6056. DOI: 10.1109/CVPR.2018.00633
Ruder S. An overview of gradient descent optimization algorithms. arXiv preprint, 2016; arXiv: 1609.04747.
Everingham M, Van Gool L, Williams C K, Winn J, Zisserman A. The PASCAL visual object classes (VOC) challenge. International Journal of Computer Vision, 2010; 88(2): 303–338.
Bylinskii Z, Judd T, Oliva A, Torralba A, Durand F. What do different evaluation metrics tell us about saliency models? IEEE Transactions on Pattern Analysis and Machine Intelligence, 2018; 41(3): 740–757.
Dinga R, Penninx B W, Veltman D J, Schmaal L, Marquand A F. Beyond accuracy: Measures for assessing machine learning models, pitfalls and guidelines. bioRxiv, 2019; 743138. DOI: 10.1101/743138
Razavian A S, Azizpour H, Sullivan J, Carlsson S. CNN features off-the-shelf: An astounding baseline for recognition. IEEE Conference on Computer Vision and Pattern Recognition Workshops, 2014; pp. 512–519. DOI: 10.1109/CVPRW.2014.131
Copyright (c) 2025 International Journal of Agricultural and Biological Engineering

This work is licensed under a Creative Commons Attribution 4.0 International License.