
US (OH): Hongkai Yu Receives Funding for Improving Crop Yields in Greenhouses

Dr. Hongkai Yu, an associate professor in the Department of Electrical and Computer Engineering (ECE), will join researchers from the University of Kentucky to apply computer vision and artificial intelligence to improving crop quality and yields in large-scale greenhouses. The work is funded by a nearly $1.2 million grant from the National Science Foundation, titled "An Autonomous Robotic System for Precision and High-Throughput Tomato Phenotyping in Large-Scale Greenhouses."

Dr. Yu and his students will design computer vision and AI algorithms that use camera images of the plants for phenotyping. Phenotypes (visible, observable traits) such as fruit size and weight, fruit count, and plant architecture can be used to optimize yield. The computer vision and AI system will improve the speed and quality of plant data collection and analysis while also reducing potential work hazards for greenhouse workers.
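
To give a rough sense of what image-based phenotyping involves, the sketch below counts ripe (red) fruit in a single photo using simple color segmentation with OpenCV. It is only an illustrative baseline under assumed thresholds and a hypothetical file name; it does not represent the project's actual AI models, which will go well beyond color rules.

import cv2
import numpy as np

def count_ripe_tomatoes(image_path: str, min_area: int = 500) -> int:
    """Rough count of red (ripe) fruit in one image via color segmentation.

    Illustrative baseline only; thresholds and file names are assumptions.
    """
    bgr = cv2.imread(image_path)
    if bgr is None:
        raise FileNotFoundError(image_path)

    hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)

    # Red wraps around the hue axis in HSV, so combine two hue ranges.
    lower_red = cv2.inRange(hsv, (0, 100, 80), (10, 255, 255))
    upper_red = cv2.inRange(hsv, (170, 100, 80), (180, 255, 255))
    mask = cv2.bitwise_or(lower_red, upper_red)

    # Remove speckle noise before counting connected blobs.
    kernel = np.ones((5, 5), np.uint8)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)

    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

    # Treat each sufficiently large red blob as one fruit.
    return sum(1 for c in contours if cv2.contourArea(c) >= min_area)

if __name__ == "__main__":
    print(count_ripe_tomatoes("greenhouse_row.jpg"))  # hypothetical image file

In practice, a learned detector would replace the color rules, but the overall pipeline, from camera image to per-plant trait measurements such as fruit count, is the same kind of task described above.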


Dr. Yu directs the Cleveland Vision & AI Lab, which pursues innovative research in computer vision, machine learning, deep learning, and artificial intelligence. His lab recently published CSU's first paper at the high-impact IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR). CSU PhD students Jinlong Li and Xinyu Liu presented the paper at CVPR 2024.

Source: Cleveland State University
