With robotics and AI making their way into every field, space exploration is no exception. Rovers, landers, and other space vehicles are deployed for surface exploration of planets and other celestial bodies to make observations. Not only do these devices collect information about terrain, but they can also be tasked to identify lunar and planetary rocks during surface exploration missions.
Cognition: Aiming to Improve Data Processing and Rover Autonomy
In the competitive space sector, reducing overall cost while yielding substantial results from space missions is becoming increasingly important. For this reason, researchers are now exploring cost-effective on-board rover processing using commercial off-the-shelf parts. This is the goal of project Cognition, developed by the company KP Labs together with the Computer Vision Lab and the Perception for Physical Interaction Lab at Poznań University of Technology (PUT).
Cognition is part of the European Space Agency’s Open Space Innovation Platform (OSIP) program. The collaborative research aims to evaluate the possibilities of advanced on-board data processing on spacecraft while reducing transmitted data payloads to only the essential information. Furthermore, the teams aim to increase the level of autonomy of rovers, landers, and other space vehicles.
Preparing for an Analog Lunar Mission
Husky Rover at the LunAres Research Station
In order for the researchers to achieve their ambitious goal, they had to deploy a four-wheeled robotic platform rugged enough to survive challenging terrain. In addition, the platform needed to have various sensors integrated on it for data collection and processing. For such a task, the researchers at PUT chose a Husky Rover as a base platform. The rover would be used for platform teleoperation and deep learning inference on its analog lunar mission at the LunAres Research Station.
Data Processing and Deep Learning Inference
One of the goals of the project was to create a deep learning model to detect rocks on lunar surfaces during the analog lunar mission. To prepare Husky for its mission, the team replaced its onboard PC with a Xilinx Versal VCK190 developer kit, an embedded Field Programmable Gate Array (FPGA) board. The kit is built around the Xilinx Versal Adaptive Compute Acceleration Platform (ACAP), a programmable chip that can be used in various applications. The VCK190 also includes the Xilinx Versal AI Engine, a dedicated hardware accelerator for deep learning models. This FPGA board was used to decrease power consumption while enabling parallel data processing and deep learning inference.
ROS 2 Framework and Hardware Integration
To create a ROS 2-based framework for the Husky rover, the first step was to add ROS 2 packages to a PetaLinux image. The meta-ros repository was used to build the image with ROS 2 packages, and the team opted for ROS 2 Foxy due to its better support for their hardware. To build meta layers for the Husky rover package, the team had to modify the original packages to make them compatible with the Xilinx platform.
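As a rough illustration of this build step: integrating meta-ros into a PetaLinux/Yocto build typically means registering the meta-ros layers and selecting the ROS distro in the build configuration. The layer paths and the `ros-core` package name below are assumptions for illustration only; the exact layout depends on the PetaLinux project and the meta-ros release in use.

```shell
# Hypothetical sketch of wiring meta-ros into a PetaLinux/Yocto build.
# Paths and package names are illustrative, not taken from the project.

# conf/bblayers.conf -- register the meta-ros layers alongside the
# PetaLinux-provided Yocto layers:
BBLAYERS += " \
    ${PROOT}/components/yocto/layers/meta-ros/meta-ros-common \
    ${PROOT}/components/yocto/layers/meta-ros/meta-ros2 \
    ${PROOT}/components/yocto/layers/meta-ros/meta-ros2-foxy \
"

# conf/local.conf -- select the ROS 2 release and pull ROS packages
# into the image (old-style override syntax, as used in Foxy-era builds):
ROS_DISTRO = "foxy"
IMAGE_INSTALL_append = " ros-core"
```

Custom meta layers (such as ones wrapping the Husky rover packages, the IMU, and the depth camera drivers mentioned below) would be registered the same way, each contributing its own recipes to the image.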
In addition, they created a ROS 2-based meta layer for an IMU sensor to provide odometry data. Another meta layer was created for depth camera integration, capturing and providing images and depth data to the rover. The last component integrated into the ROS 2-based framework was an AI Engine endpoint: a Python script created by KP Labs to receive images and output segmentation maps.
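The article does not show the endpoint script itself, but its final step, turning per-class scores from the accelerator into a per-pixel segmentation map, can be sketched in plain Python. The function name and the two-class (background/rock) layout are illustrative assumptions, not KP Labs' actual code:

```python
# Minimal sketch (hypothetical names) of the post-processing an AI Engine
# endpoint might perform: converting per-pixel class scores into a
# segmentation map by taking the argmax over classes at each pixel.

def scores_to_segmentation(scores):
    """scores: H x W x C nested lists of per-class scores.

    Returns an H x W label map where each entry is the index of the
    highest-scoring class (e.g. 0 = background, 1 = rock).
    """
    seg = []
    for row in scores:
        seg.append([max(range(len(px)), key=px.__getitem__) for px in row])
    return seg

# Toy 2x2 "image" with two classes: background (0) and rock (1).
scores = [
    [[0.9, 0.1], [0.2, 0.8]],
    [[0.6, 0.4], [0.3, 0.7]],
]
print(scores_to_segmentation(scores))  # [[0, 1], [0, 1]]
```

In the real system this logic would sit inside a ROS 2 node, subscribing to camera image topics and publishing the resulting segmentation maps for downstream consumers.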
Segmentation on Acquired Camera Images
Husky Rover: An Ideal Platform for the Mission
The team successfully created a ROS 2-based framework for their Husky rover. The rover was able to navigate under remote control, detect obstacles, and perform segmentation on acquired camera images. Watch a video of the Husky rover driving between rocks during the analog lunar mission conducted at the LunAres Research Station:
Given the project’s short timeframe, the team did not want to spend valuable resources building their own platform. Integrating components into a system of uncertain reliability could have caused setbacks in a time-sensitive project.
“Husky performs well on difficult, sandy terrain. In addition, it makes it easier to carry out modifications, both on hardware and software. Clearpath Robotics also provides packages for ROS 2, which is crucial for our project and its future development.” – Bartosz Ptak, PhD Student, Computer Vision Lab, Poznań University of Technology.
The team plans to extend their project to planetary visual odometry with an aim to increase the autonomy of the Husky rover.
An abstract of their research, ‘Integration of Heterogeneous Computational Platform-Based, AI-Capable Planetary Rover Using ROS 2’, has been accepted for IEEE IGARSS 2023.
The Cognition project was developed by joint teams from KP Labs (Project Leader) and Poznań University of Technology.
If you would like to learn more about the Computer Vision Lab at PUT, click here.
If you would like to learn more about KP Labs, you can visit their website here.
If you would like to learn about Husky UGV, you can visit our website here.