We provide some of the world’s top researchers with unmanned vehicles for their R&D initiatives, and the U.S. Army Research Lab (ARL) is no exception! This week, we had a chance to learn how they’re using two Clearpath Huskies to field test vision-based autonomous terrain classification.

Vision-based autonomous terrain classification

The ARL is a division of the US Army. David Baran, Computer Engineer and team lead at ARL, is heading up their latest project: developing and field testing a novel terrain classification algorithm that visually identifies different terrain types and enables the unmanned vehicle to choose – or stay on – the optimal path.

The traditional method of labeling data points for terrain classification requires that a robot’s vision system draw polygons outlining all pixels in its field of view. This process is tedious and inefficient because the system must categorize each identified data point individually. Now, however, Baran and his team have developed a vision-based autonomous terrain classification method. It allows the robot to observe the ground with cameras, assess the associated risks, and navigate along the optimal path. “We want the robot to stay on the sidewalk or a trail instead of a potentially shorter path that leads to its goal,” explains Baran. “…instead of driving across a field and a path, the vehicle can recognize the grouped pixels representing the path and navigate to stay on it.”
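The article doesn’t describe ARL’s planner, but the idea of preferring a sidewalk over a shorter cut across a field can be sketched as a cost map built from per-pixel terrain labels. The class names and cost values below are purely illustrative assumptions, not ARL’s actual parameters:

```python
import numpy as np

# Hypothetical traversal costs per terrain class (illustrative values only).
TERRAIN_COST = {
    "path": 1.0,        # prefer staying on sidewalks and trails
    "grass": 5.0,       # traversable, but discouraged
    "vegetation": 50.0, # dense brush: effectively avoid
}

def costmap_from_labels(label_image, class_names):
    """Convert a per-pixel terrain label image into a traversal cost map.

    label_image: 2-D array of integer class indices, one per pixel.
    class_names: list mapping class index -> terrain class name.
    """
    cost = np.full(label_image.shape, 10.0)  # default cost for unknown terrain
    for idx, name in enumerate(class_names):
        cost[label_image == idx] = TERRAIN_COST.get(name, 10.0)
    return cost

# A path planner (e.g. A* over this cost map) then favours the low-cost
# "path" pixels even when a straight line across the grass would be shorter.
```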

Fork in the road? No problem!

The ARL process uses unsupervised machine learning to cluster similar-looking regions in an image, and then presents the user with a large cluster of superpixels (similar regions) that can be grouped under a single label. Training these classifiers takes far less operator effort to generate training data. In other words, Husky is able to distinguish between paths, vegetation and other terrain classes with no human input. All of the processing is done on board the robot itself.
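ARL’s code isn’t published in the post, but the general pattern it describes – oversegment the image into superpixels, then cluster similar-looking ones so a single label covers many pixels – can be sketched with off-the-shelf tools. Here scikit-image’s SLIC and scikit-learn’s KMeans are stand-ins, not the team’s actual components:

```python
import numpy as np
from skimage.segmentation import slic
from skimage.color import rgb2lab
from sklearn.cluster import KMeans

def cluster_superpixels(image, n_segments=400, n_clusters=6):
    """Oversegment an RGB image into superpixels, then group
    similar-looking superpixels so each cluster can take one label."""
    # 1. Superpixels: contiguous regions of visually similar pixels.
    segments = slic(image, n_segments=n_segments, start_label=0)

    # 2. Describe each superpixel by its mean colour in Lab space.
    lab = rgb2lab(image)
    features = np.array([
        lab[segments == s].mean(axis=0) for s in range(segments.max() + 1)
    ])

    # 3. Unsupervised clustering: similar superpixels fall into one cluster,
    #    so a whole cluster can be labeled at once instead of outlining
    #    individual pixels.
    cluster_of_superpixel = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(features)

    # Map every pixel to the cluster id of its superpixel.
    return cluster_of_superpixel[segments]
```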

Baran’s team has two Huskies: one running the traditional training approach (which draws polygons and maps pixels), and another running the new visual terrain classification method. By operating both robots simultaneously, they can test whether any errors introduced by the new training method are inconsequential.

Ready for rugged terrain

The ARL team needed a rugged mobile robot for their work, and one that could easily integrate with a variety of sensors. Husky runs ROS out of the box, so it was easy to integrate two Nano-ITX computers (one for navigation and autonomy, the other for vision processing), a Prosilica high-resolution scientific camera, and a Velodyne laser scanner for obstacle avoidance. “The platform is rugged enough to handle on-road and off-road situations, so it’s the one we’re using to prove out our results.”
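Because Husky exposes its sensors and drive interface as ROS topics, wiring a classifier into the system is mostly a matter of subscribing and publishing. The sketch below shows that pattern only; the topic names and the classify() call are placeholders, not ARL’s actual interfaces:

```python
#!/usr/bin/env python
import rospy
from sensor_msgs.msg import Image
from cv_bridge import CvBridge

class TerrainClassifierNode(object):
    """Subscribes to camera images and republishes per-pixel terrain labels."""

    def __init__(self):
        self.bridge = CvBridge()
        # Topic names are assumptions; actual names depend on the camera driver.
        self.label_pub = rospy.Publisher("terrain/labels", Image, queue_size=1)
        rospy.Subscriber("camera/image_raw", Image, self.on_image, queue_size=1)

    def on_image(self, msg):
        frame = self.bridge.imgmsg_to_cv2(msg, desired_encoding="bgr8")
        labels = classify(frame)  # hypothetical classifier, e.g. the clustering sketch above
        out = self.bridge.cv2_to_imgmsg(labels.astype("uint8"), encoding="mono8")
        out.header = msg.header
        self.label_pub.publish(out)

if __name__ == "__main__":
    rospy.init_node("terrain_classifier")
    TerrainClassifierNode()
    rospy.spin()
```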

 
