Modern robotic algorithms and systems frequently lack the robustness required to operate reliably in unknown environments. Visual systems cannot always perform reliably in unstructured environments such as forests due to highly repetitive patterns or occlusion.

The University of Michigan's Computational Autonomy and Robotics Laboratory (CURLY) was established to design autonomous robotic systems that perform well in unknown and unstructured environments. Their research covers essential aspects of autonomy such as state estimation, SLAM, semantic mapping, motion planning, robot control, and learning. Alongside their research publications, they also create open-source libraries and test them in the field.

The CURLY Lab's most recent project employed a Husky UGV to advance the state of the art in autonomous mobile robots. The lab has already succeeded with autonomous robotic systems in urban settings (e.g., self-driving cars and delivery robots); however, robotic systems continue to face challenges in unstructured environments such as forests or mountains. The group therefore concentrates on developing algorithms that allow robotic systems to operate in harsh environments. Such technology will aid search and rescue, first-responder missions, and scientific exploration.

Understanding the Environment With Deep Learning

Using localization and mapping tests performed with Husky UGV, the team is developing its own algorithms to better navigate unstructured environments. Mapping algorithms build a three-dimensional model of the environment so the robot can understand its surroundings, while the localization task determines where the robot is within that 3D map. The team's localization algorithm, the Invariant Extended Kalman Filter (InEKF), fuses IMU and velocity measurements to estimate the robot's position and orientation, leveraging the symmetry-preserving properties of matrix Lie groups. Their mapping algorithm, in turn, builds the map using deep learning. The team is currently working on a full SLAM pipeline that provides more robust robot pose estimation through multi-sensor data fusion.
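To make the Lie-group idea concrete, here is a minimal illustrative sketch (not the team's InEKF implementation) of how a planar pose can be propagated on the matrix Lie group SE(2) from body-frame velocity and yaw-rate measurements, using the group exponential map. The velocity values and time step are hypothetical.

```python
import numpy as np

def se2_exp(xi):
    """Exponential map from an se(2) twist [v_x, v_y, omega] (already
    scaled by dt) to an SE(2) homogeneous transformation matrix."""
    vx, vy, w = xi
    if abs(w) < 1e-9:
        V = np.eye(2)  # straight-line motion limit
    else:
        V = np.array([[np.sin(w), -(1 - np.cos(w))],
                      [1 - np.cos(w), np.sin(w)]]) / w
    R = np.array([[np.cos(w), -np.sin(w)],
                  [np.sin(w),  np.cos(w)]])
    T = np.eye(3)
    T[:2, :2] = R
    T[:2, 2] = V @ np.array([vx, vy])
    return T

# Propagate the pose by composing group elements: each measurement
# update is a right-multiplication, which keeps X on the group and
# preserves its structure (rotation stays orthonormal by construction).
X = np.eye(3)   # initial pose as a 3x3 SE(2) matrix
dt = 0.1        # hypothetical sample period, seconds
for _ in range(10):
    u = np.array([1.0, 0.0, 0.2]) * dt   # [v_x, v_y, omega] * dt
    X = X @ se2_exp(u)

print(np.round(X[:2, 2], 3))  # planar position after 1 s of motion
```

A full InEKF additionally propagates an invariant error covariance and corrects with sensor measurements; this sketch only shows the symmetry-preserving state representation that motivates the approach.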

Focus Your Energies

The CURLY Lab’s project was primarily focused on software development. The team did not want to invest the time, effort, and resources required to build a robot from the ground up. Furthermore, their testing process required the integration of specific sensors, which can be difficult when developing a robotic system. Instead, Husky UGV provided a simple solution that avoided the time-consuming nature of hardware development.

The charts above show two sequences of trajectories recorded at the University of Michigan's M-Air motion capture facility.

The green lines show the InEKF estimated trajectories using velocity estimated from the wheel encoders, and the blue lines show the InEKF results using velocity from the motion capture system.
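Comparisons like the one above are typically quantified with a translational error metric between the estimated trajectory and the motion-capture reference. The sketch below is a hypothetical example (not the team's evaluation code) computing a root-mean-square error between two time-synchronized 2D trajectories, using synthetic data in place of real wheel-encoder and motion-capture logs.

```python
import numpy as np

def trajectory_rmse(estimated, ground_truth):
    """Root-mean-square translational error between two aligned,
    time-synchronized (N, 2) trajectories, in the same units (meters)."""
    diffs = estimated - ground_truth
    return np.sqrt(np.mean(np.sum(diffs**2, axis=1)))

# Synthetic stand-ins: a circular reference path and a noisy estimate.
t = np.linspace(0, 2 * np.pi, 100)
ground_truth = np.column_stack([np.cos(t), np.sin(t)])
rng = np.random.default_rng(0)
estimated = ground_truth + 0.05 * rng.normal(size=(100, 2))

rmse = trajectory_rmse(estimated, ground_truth)
print(f"RMSE: {rmse:.3f} m")
```

In practice the two trajectories would first be time-associated and, if expressed in different frames, rigidly aligned before computing the error.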

The high-precision IMUs were important components of the Husky UGV. In environments where the vision sensors were unusable, the team was able to achieve accurate and robust localization. Furthermore, the Husky UGV was small enough for the CURLY Lab to ensure student safety while also being large enough to maintain stability in unstructured environments. As Assistant Professor Maani Ghaffari put it:

“This is especially important because it allows us to focus on developing autonomous algorithms without having to worry about the robot’s stability control. Furthermore, Husky UGV is integrated with ROS, which simplifies communication between algorithms and sensors.”

Maani Ghaffari, Assistant Professor, University of Michigan

Ultimately, Husky UGV's rugged build and native ROS support made it a compelling platform for the team.

The team's project, in collaboration with the National Science Foundation, Toyota Research Institute, the US Army DEVCOM Ground Vehicle Systems Center, and NVIDIA (via hardware support), has produced several open-source libraries for robot state estimation, SLAM, and mapping. Their work is also published in Frontiers in Robotics and AI; you can read their full paper here. In the future, the team plans to integrate each module, including planning, SLAM, and control, into a fully autonomous system on Husky UGV.

The CURLY Lab team involved with this project consists of Tzu-Yuan (Justin) Lin, Ray Zhang, Chien Erh (Cynthia) Lin, Sangli Teng, Joseph Wilson, Tingjun Li, Theodor Chakhachiro, Wenzhe Tong, Yuewei Fu, and Xihang Yu.

To learn more about the CURLY Lab, visit their website here.

To learn more about Husky UGV, visit our website here.