Precision plant phenotyping is an essential practice for breeders to make informed decisions to meet the demands of modern agriculture and to contribute to resilient and sustainable crop production. 

Decades of extensive research have established genetic resources for upland cotton, a globally significant crop of tremendous economic importance. With an annual impact of $123 billion in the United States, cotton production contributes substantially to global trade and the textile manufacturing industry. In this context, cotton phenotyping plays a crucial role by equipping agronomists with high-quality data and valuable insights that further advance research in the field. 

There is an urgent need for advanced technologies, such as robotics and computer vision, that can precisely capture plant phenotypes. With these technologies in place, wide-scale genomic selection and cotton breeding can be undertaken. To achieve this goal, a team of researchers at the Robotic Vision Laboratory (RVL) at the University of Texas at Arlington is using Husky UGV for automated cotton phenotyping.

Tech Marvel Meets Cotton Fields


Manual crop measurements are labor-intensive, time-consuming and susceptible to error. Without the assistance of robotics, data collection would require manual effort across multiple crops, with measurements then extrapolated to the entire farm. This exemplifies the long-standing challenge in plant science of obtaining data with sufficient resolution in both time and space. Robotics enables the team to conduct field experiments at wide scale and in remote conditions, acquiring spatial and temporal data with exceptional precision. In this case, the team is remotely operating the Husky robot within cotton fields to capture multimodal, high-resolution plant phenotyping data. 

Furthermore, data acquisition must be repeated consistently throughout the growth cycle, imposing a significant burden when estimating plant traits at large scale. Additional challenges arise from variable natural lighting, high temperatures, unpredictable rainfall and the occlusion of plant features by neighboring plants. To address these concerns, the RVL, in collaboration with agronomists experienced in precision plant phenotyping, is using data collected by Husky to identify key plant features. These valuable datasets will be used to train machine learning algorithms that propel advancements in plant science.

Preparing Husky for its Phenotyping Venture


Husky with a Custom-Designed Sensing Arm: Equipped with Instruments Providing Symmetric Views of the Cotton Canopy


A Stereo Camera and LED Light Ring Installed at the Bottom of the Husky for Estimating Visual Odometry


The team mounted a custom-designed sensing arm with a sensor system that includes dual stereoscopic cameras. This innovative setup enables comprehensive observation of both sides of a cotton plant as the robot navigates through crop rows. Additionally, a high-precision inertial measurement unit (IMU) and global positioning system (GPS) sensors were mounted at the base of the arm to accurately estimate the robot's position. The data is time-stamped and fused in real time on an onboard computer (NVIDIA Xavier) to ensure precise time synchronization between the various sensors on the robot. Lastly, a stereo camera was paired with an LED light ring that illuminates the surface directly beneath the robot for visually estimating the robot's odometry.
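The article does not detail the team's fusion pipeline, but the core idea of time synchronization across sensors running at different rates can be illustrated simply. Purely as a sketch (the function name, rates and tolerance below are assumptions, not the team's implementation), pairing each camera frame with the nearest IMU sample by timestamp might look like this:

```python
from bisect import bisect_left

def nearest_match(target_ts, timestamps, tolerance=0.005):
    """Return the index of the timestamp closest to target_ts, or None
    if the best candidate is farther away than tolerance (seconds).
    Assumes `timestamps` is sorted in ascending order."""
    i = bisect_left(timestamps, target_ts)
    candidates = [j for j in (i - 1, i) if 0 <= j < len(timestamps)]
    if not candidates:
        return None
    best = min(candidates, key=lambda j: abs(timestamps[j] - target_ts))
    return best if abs(timestamps[best] - target_ts) <= tolerance else None

# Hypothetical streams: camera frames at ~30 Hz, IMU samples at 200 Hz.
camera_ts = [0.000, 0.033, 0.066]
imu_ts = [t * 0.005 for t in range(20)]  # 0.000, 0.005, ..., 0.095

# Pair every camera frame with its closest-in-time IMU sample.
pairs = [(c, nearest_match(c, imu_ts)) for c in camera_ts]
```

In practice, frameworks such as ROS provide approximate-time synchronizers that implement this matching across many topics at once; the sketch only shows the underlying nearest-timestamp principle.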

Predicting Cotton Yields Like Never Before

By employing Husky for cotton phenotyping, the team was able to create a variable rate deep compression architecture that operates on raw 3D point clouds captured by the robot. This architecture aims to strike a balance between compression efficiency and preserving essential features of the data when processing large datasets. In addition, the team developed a framework to detect and count cotton bolls, the most significant cotton phenotype, from in-field video data collected by Husky.
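The team's actual counting framework is the NTrack multiple-object tracker listed under Publications; the sketch below is not that method. As a simplified illustration of the general idea behind counting objects in video, it greedily links detections across frames to the nearest existing track and counts the unique track IDs created (all names, the distance threshold and the toy coordinates are assumptions):

```python
import math

def track_counts(frames, max_dist=30.0):
    """Greedy nearest-centroid tracker: links each detection to the
    closest track within max_dist pixels and returns the number of
    unique objects seen. `frames` is a list of per-frame detection
    centroid lists [(x, y), ...]."""
    next_id = 0
    tracks = {}  # track id -> last known centroid (kept across frames)
    for detections in frames:
        assigned = {}
        unused = dict(tracks)  # each track may claim one detection per frame
        for (x, y) in detections:
            best_id, best_d = None, max_dist
            for tid, (tx, ty) in unused.items():
                d = math.hypot(x - tx, y - ty)
                if d < best_d:
                    best_id, best_d = tid, d
            if best_id is None:
                best_id = next_id  # no track nearby: start a new one
                next_id += 1
            else:
                del unused[best_id]
            assigned[best_id] = (x, y)
        tracks = {**tracks, **assigned}
    return next_id

# Toy example: two bolls drifting slightly between frames, a third
# appearing only in the last frame.
frames = [
    [(10, 10), (100, 50)],
    [(12, 11), (101, 52)],
    [(14, 12), (103, 53), (200, 200)],
]
count = track_counts(frames)  # → 3
```

Real trackers like NTrack must additionally handle occlusion by neighboring plants, missed detections and camera motion, which is what makes in-field boll counting a research problem rather than a bookkeeping exercise.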

Infield Sensor Data Captured by Husky

Husky's top performance traits, as noted by the team, were its reliability, flexibility and battery runtime. 

“We had no issues with the robot failing in the field and no concerns about its ability to support the weight and power needs of our payloads,” said Dr. William Beksi, Assistant Professor in the Department of Computer Science and Engineering at the University of Texas at Arlington. “With the lithium battery option, we were able to operate a full day in the field without worrying about running out of power. Furthermore, when searching for a well-reputed UGV platform, we had to consider important factors such as hardware and open-source software support,” Dr. Beksi added. 

These requirements led the team to choose Husky for their ambitious undertaking. 

Watch a video of in-field experiments conducted with Husky:

Future of Smart Farming

The team has formed partnerships with Texas Tech University and Texas A&M University and has ambitious plans to enhance its research on estimating cotton yield via a stereoscopic multi-view system for detecting and tracking cotton bolls. Furthermore, the team aims to investigate autonomously navigating the robot through dense crop rows using visual odometry and movement primitives. Lastly, the objective is to identify and incorporate reduced sensor payloads suitable for long-term robot deployment, ensuring data collection throughout the entire growing season. 

The team members involved in this project are Joseph Cloud (Graduate Research Assistant), Md Ahmed Al Muzaddid (Graduate Research Assistant), Adly Noore (Graduate Research Assistant), Noah Wood (Undergraduate Research Assistant), Anthony Aiyedun (Undergraduate Research Assistant), and William Beksi (Principal Investigator).


Publications

M.A.A. Muzaddid and W.J. Beksi, “NTrack: A Multiple-Object Tracker and Dataset for Infield Cotton Boll Counting,” IEEE Transactions on Automation Science and Engineering, 2023 (conditionally accepted).

M.A.A. Muzaddid and W.J. Beksi, “Variable Rate Compression for Raw 3D Point Clouds,” IEEE International Conference on Robotics and Automation (ICRA), pp. 8748-8755, 2022.

If you would like to learn more about Robotic Vision Lab and their work, you can visit their website here.

If you would like to learn more about Husky UGV, you can visit our website here.