Precision agriculture has grown immensely in the past decades and is especially important for the future of sustainable agricultural production. To aid that future, robots are being deployed to manage crops and control weeds. Machine learning for computer vision plays a key role in robustly finding the targeted plants in such operations. Today, deep learning is a popular choice for image-analysis tasks such as object detection and segmentation, allowing a robot to turn collected images into actionable results.

Weed detection in grasslands and dairy farming is notably more challenging than in crop cultivation. In crop cultivation, plants are clearly separable from the soil by simple means such as color thresholding or near-infrared (NIR) imagery, in which chlorophyll-rich vegetation reflects strongly. In grasslands, however, weeds and the surrounding plants share similar colors and chlorophyll content, making dataset creation more expensive and weed detection more difficult. To tackle this challenge, a team of researchers from the Technical University of Denmark is using Husky UGV to develop an agricultural robot platform that removes grassland weeds in organic dairy farming without the use of herbicides.
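The color-based separation that works in crop cultivation can be illustrated with a common vegetation index. The sketch below uses the Excess Green (ExG) index with a simple threshold; the function name, threshold value, and synthetic pixels are illustrative assumptions, not part of the DTU project's actual pipeline.

```python
import numpy as np

def exg_mask(rgb, threshold=0.1):
    """Boolean plant mask from an RGB image (H x W x 3, values 0-255).

    The Excess Green index 2G - R - B is large for green vegetation
    and near zero for bare soil; the threshold is a hypothetical choice.
    """
    norm = rgb.astype(np.float64) / 255.0
    r, g, b = norm[..., 0], norm[..., 1], norm[..., 2]
    exg = 2.0 * g - r - b
    return exg > threshold

# Tiny synthetic example: one green (plant-like) pixel, one brown (soil-like) pixel.
img = np.array([[[30, 200, 40], [120, 90, 60]]], dtype=np.uint8)
mask = exg_mask(img)
```

In grasslands, of course, both the weeds and the background grass would pass such a test, which is exactly why the DTU team turns to learned detectors instead.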

The Technical University of Denmark (DTU) is an elite technical university dedicated to fulfilling the vision of H.C. Ørsted—the father of electromagnetism—who founded the university in 1829 to develop and create value using the natural sciences and the technical sciences to benefit society. DTU believes in technology as a means for necessary change, especially through the UN’s 17 Sustainable Development Goals as a platform for the university’s activities.

Improving the Future of Organic Farming

The Ph.D. project, executed by Ronja Güldenring (M.Sc./Ph.D. student) and Lazaros Nalpantidis (Ph.D./Associate Professor), focuses on a weed identification and localization system and is part of the “GALIRUMI” EU collaborative research project. GALIRUMI aims to reduce manual weeding labor, weed management costs, herbicide damage to grassland, the environmental impact of dairy farming, and dairy cow discomfort.

With the ultimate goal of creating a robotic weeder, the DTU team knew they wanted a robot that would replace hard, back-breaking manual labor, which is often expensive and difficult to supply in sufficient quantity. Their proposed platform would ease some of the above-mentioned challenges in organic farming and could eventually motivate conventional farms to transition to organic farming, with a meaningful impact on global sustainability. Within this effort, DTU’s Ph.D. project is responsible for the vision system on the robot platform, which includes vision sensor selection, weed detection in the fields, and more detailed weed analysis to provide relevant information for weed treatment (such as root center and stem prediction).

“Our Husky UGV works every single time as expected, despite mud and dirt. It is comforting to have one less degree of uncertainty when going for field experiments.”

Lazaros Nalpantidis, Associate Professor

Identifying Plants vs. Weeds

To be able to focus on developing the vision system, the DTU team relied on Husky UGV as a mature, robust, and trustworthy robot platform. For the project, Husky UGV carries cameras, roams grasslands, and looks out for weeds. Its goal is to find and localize weeds so that it can afterward eliminate them in an herbicide-free manner. To this end, the team also plans to ultimately equip Husky UGV with organic-friendly weed control mechanisms (e.g. laser- or electrocution-based ones) and to guide the robot through highly accurate Galileo-based localization (EGNSS).

For the vision system, Husky UGV is equipped with the following sensors: an IMU, an EGNSS receiver, wheel odometry, a front camera, and a downward-facing lidar camera mounted over a hole in the robot’s chassis (the “hole lidar camera”). The robot drives through the field, detecting the rough locations of weeds. Once a weed is detected, the robot drives over the plant and analyzes it more closely through the hole with the lidar camera, estimating features such as the root center and stem positions. Since the lidar camera is sensitive to ambient light, these close-up measurements are taken inside a covered enclosure that provides controlled lighting conditions. The resulting information can then be passed on to the weeding systems. Plant detection and analysis are performed with deep learning algorithms running on an NVIDIA Jetson Xavier NX.
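The detect-then-inspect behaviour described above can be sketched as a small state machine. Everything here (the state names, the transition conditions, the idea of a `step` function) is an illustrative assumption about how such a loop might be structured, not the project’s actual software.

```python
from enum import Enum, auto

class Mode(Enum):
    DRIVE = auto()    # roam the field, running coarse weed detection
    INSPECT = auto()  # weed under the hole: close-up lidar-camera analysis
    TREAT = auto()    # hand root/stem estimates to the weeding system

def step(mode, weed_detected, over_weed, analysis_done):
    """One tick of a simplified detect-then-inspect loop."""
    if mode is Mode.DRIVE and weed_detected and over_weed:
        return Mode.INSPECT       # positioned over a detected weed
    if mode is Mode.INSPECT and analysis_done:
        return Mode.TREAT         # close-up analysis finished
    if mode is Mode.TREAT:
        return Mode.DRIVE         # resume roaming after treatment
    return mode                   # otherwise stay in the current mode
```

In the real system this logic would sit alongside the EGNSS localization and navigation stack, but even this toy version captures the key design choice: coarse detection while driving, then a separate close-up analysis pass under controlled lighting.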

The project’s core set-up, therefore, included the following:

  • IMU: Xsens MTI-630
  • EGNSS: NaviLock GNSS receiver
  • Front Camera: Basler ace 2 and a Basler Lens with a focal length of 4mm
  • Hole Lidar Camera: Intel Realsense L515

For the front camera, the team chose an RGB camera with a global shutter in order to capture sharp images from a moving platform. Combined with suitable exposure control, it handles the varying lighting conditions in outdoor fields.

For the hole lidar camera, the team chose an RGB-D solution that works well at short range. The additional depth map is expected to improve robustness during plant analysis. They chose components that kept the payload relatively small: all of the above-mentioned sensors and mounts weigh no more than 20 kg.

A Rugged Solution for Rugged Terrain

Since the GALIRUMI project takes place in grasslands, having a platform as rugged as Husky UGV was also essential. The out-of-the-box-ready solution let the team set up a vision-equipped robot quickly, concentrating fully on mounting their application-specific vision sensors and connecting them to Husky UGV’s well-documented ROS infrastructure. Other highlights included the open-source ROS software, which made Husky UGV customizable to their needs, its robust design, and a size and payload capacity well matched to their application. As Associate Professor Lazaros Nalpantidis said: “Our Husky UGV works every single time as expected, despite mud and dirt. It is comforting to have one less degree of uncertainty when going for field experiments.”

Without Husky UGV, the team would have resorted to finding or building another suitable platform, investing significant time and resources. Such an alternative, however, would not have allowed the DTU team to focus all of their resources on their main objective: developing their vision system.

But the work is not done for the DTU team. The Ph.D. project is still ongoing and is expected to conclude in 2023. So far, they have robustly integrated the above-mentioned sensors with Husky UGV, and they collected datasets during the summer and autumn of 2021 using the front camera and the hole lidar camera. The data will be annotated and will serve as training data for the team’s weed detection algorithms.

Supporting the Future of Farming Robotics

Future plans include collecting further real-world data and making the Grassland Weed Dataset publicly available. Building on their baseline results, the team also plans to make the weed detection system more robust using, for example, unsupervised contrastive learning or neural network training on different image views. In spring/summer 2022, the main focus will be integrating all subsystems of the overall GALIRUMI project: localization via EGNSS, navigation on grassland fields, weed detection, and herbicide-free weed treatment. The goal is to showcase the functionality and effectiveness of the platform as a whole.

The Ph.D. project has been supported by the European Commission and the European GNSS Agency through the project “Galileo-assisted robot to tackle the weed Rumex obtusifolius and increase the profitability and sustainability of dairy farming (GALIRUMI)”. Other European partners are closely involved and responsible for other components of the weeding robot, including Stichting Wageningen Research, Acorde Technologies, Steketee BV, Pekkeriet Dalfsen BV, Institut de l’Élevage, and Koonstra BV.

Finally, their current work has been published at the 2021 IEEE/RSJ International Conference on Intelligent Robots and Systems under the title “Few-leaf Learning: Weed Segmentation in Grasslands” and was among the four finalists for the IROS 2021 Best Paper Award on Agri-Robotics. You can watch their accompanying video here.

To learn more about the Technical University of Denmark, visit their website here.

To learn more about the GALIRUMI project, visit their website here.

To learn more about Husky UGV, visit our website here.
