Crop monitoring is essential for the efficient and stable performance of tasks such as planting, spraying, and harvesting, and several studies are therefore underway to develop and improve crop monitoring robots. Deep learning algorithms are also increasingly applied in the development of agricultural robots, since convolutional neural networks have proven to deliver outstanding performance in image classification, segmentation, and object detection.
However, most of these applications focus on harvesting robots, and only a few studies use deep learning to improve and develop monitoring robots. For this reason, the research team aimed to develop a real-time robotic monitoring system for the generative growth of tomatoes. The presented method detects tomato fruits grown in hydroponic greenhouses using a Faster R-CNN (region-based convolutional neural network). In addition, the team sought a color model robust to external light and used hue values to develop an image-based maturity standard for tomato fruits. The developed maturity standard was then verified through comparison with expert classification.
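A hue-based maturity rule of the kind described might be sketched as follows. This is an illustrative example only: the hue thresholds and class names below are hypothetical placeholders, not the standard developed in the paper.

```python
import colorsys

def mean_hue_degrees(pixels):
    """Mean hue (0-360 degrees) of a list of RGB pixels (0-255 per channel).

    Note: a simple arithmetic mean ignores the circular wrap-around of hue
    near red (0/360 degrees); a robust implementation would use circular
    statistics.
    """
    hues = [colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)[0] * 360
            for r, g, b in pixels]
    return sum(hues) / len(hues)

def classify_maturity(hue_deg):
    """Map a fruit's mean hue to a maturity class.

    The thresholds here are illustrative guesses (red ~ ripe, green ~ unripe);
    the paper's actual standard is not reproduced here.
    """
    if hue_deg < 25 or hue_deg > 340:
        return "ripe"       # reddish
    elif hue_deg < 70:
        return "turning"    # orange/yellow
    else:
        return "unripe"     # greenish
```

Working in hue rather than raw RGB is what makes such a rule comparatively robust to changes in external lighting, since illumination changes mostly affect value (brightness) and saturation.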
Finally, the number of tomatoes was counted using a centroid-based tracking algorithm. The team trained the detection model on an open dataset and tested the whole system in real time in a hydroponic greenhouse. A total of 53 tomato fruits were used to verify the developed system, which achieved 88.6% detection accuracy when fruits completely obscured from the camera were included, and 90.2% when obscured fruits were excluded. For maturity classification, the team conducted qualitative evaluations with the assistance of experts.
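The core idea of centroid-based tracking is to assign a persistent ID to each detected fruit by matching the centroids of new detections to the nearest previously tracked centroid, so that the same fruit is not counted twice across frames. A minimal sketch of that idea (not the paper's implementation; the matching threshold and greedy nearest-neighbor strategy are simplifying assumptions) could look like:

```python
import math

class CentroidTracker:
    """Minimal centroid-based tracker: matches each new detection to the
    nearest existing track within max_dist, otherwise starts a new track."""

    def __init__(self, max_dist=50.0):
        self.next_id = 0       # IDs issued so far == unique objects counted
        self.objects = {}      # track id -> last known (x, y) centroid
        self.max_dist = max_dist

    def update(self, centroids):
        """Register this frame's detected centroids; return {id: centroid}."""
        assigned = {}
        unmatched_ids = set(self.objects)
        for c in centroids:
            # Greedily match to the nearest unclaimed track within max_dist.
            best_id, best_d = None, self.max_dist
            for oid in unmatched_ids:
                d = math.dist(c, self.objects[oid])
                if d < best_d:
                    best_id, best_d = oid, d
            if best_id is None:
                best_id = self.next_id   # new object enters the scene
                self.next_id += 1
            else:
                unmatched_ids.discard(best_id)
            self.objects[best_id] = c
            assigned[best_id] = c
        # Drop tracks that received no detection this frame.
        for oid in unmatched_ids:
            del self.objects[oid]
        return assigned

    @property
    def total_count(self):
        """Number of unique objects seen so far."""
        return self.next_id
```

In a real system the detections would come from the Faster R-CNN bounding boxes (each box reduced to its center point), and a production tracker would typically tolerate short occlusions before deleting a track.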
Read the complete research at www.mdpi.com.
Seo, D.; Cho, B.-H.; Kim, K. Development of Monitoring Robot System for Tomato Fruits in Hydroponic Greenhouses. Agronomy 2021, 11, 2211. https://doi.org/10.3390/agronomy11112211