A central question in robotics research and development today is how robots can share space with humans to perform useful tasks. To collaborate with people, robots must be able to navigate carefully and interact safely with the objects and people around them.

Robotic applications in medical and hospital environments are growing rapidly, driven in part by the increasing workload of medical personnel. Repetitive, time-consuming tasks such as on-demand deliveries of materials are mostly performed by nurses, which often takes time away from value-added patient care.

Harmony: Assistive Robots for Healthcare

This is where Harmony comes in: an EU-funded project that aims to provide a compelling solution to this problem with autonomous mobile manipulation robots. The multidisciplinary teams involved in Harmony aim to deliver robust, flexible, and safe autonomous mobile robots that can be integrated easily into human-centered environments such as hospitals.

Project Harmony has two targeted use cases: 

  • Automation of on-demand delivery tasks around the hospital so that medical staff can focus on other critical needs of patients. 
  • Automation of sample collection to improve efficiency of the hospital bio-analysis pipeline.

Facing Important Technical Challenges

One of the teams playing an integral role in this project is a group of researchers at the Stachniss Lab at the University of Bonn. Their main focus for the project is long-term localization and mapping in dynamic hospital environments. Traditional localization and mapping techniques, which often rely on low-level geometric features or assume a static world, can prove insufficient when deployed in ever-changing, complex surroundings.

To address this, the researchers at the Stachniss Lab aim to construct an object-based map that enables socially aware navigation and manipulation in dynamic environments. This object-based representation allows the robot to build a database of objects from semantic and geometric data collected over time, which serves as the basis for map building, motion planning, and task representation.
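The lab's exact map representation is not detailed here, but as a rough illustration of the idea, a minimal object-based map might pair each semantic label with a pose, an extent, and some bookkeeping for long-term updates. The class and field names below are hypothetical and only sketch the concept; they are not the lab's implementation.

```python
from dataclasses import dataclass
from typing import List
import numpy as np

@dataclass
class MapObject:
    """One hypothetical entry of an object-based map."""
    label: str              # semantic class, e.g. "bed" or "infusion pump"
    pose: np.ndarray        # 4x4 homogeneous transform in the map frame
    extent: np.ndarray      # bounding-box size (x, y, z) in metres
    last_seen: float        # timestamp of the most recent observation
    observation_count: int  # how often the object has been re-observed

class ObjectMap:
    """Accumulates objects observed over repeated runs of the robot."""

    def __init__(self):
        self.objects: List[MapObject] = []

    def update(self, detection: MapObject, dist_thresh: float = 0.5) -> None:
        """Associate a detection with an existing object of the same label
        within dist_thresh metres, or add it as a new map entry."""
        for obj in self.objects:
            dist = np.linalg.norm(obj.pose[:3, 3] - detection.pose[:3, 3])
            if obj.label == detection.label and dist < dist_thresh:
                obj.pose = detection.pose        # keep the newest pose estimate
                obj.last_seen = detection.last_seen
                obj.observation_count += 1
                return
        self.objects.append(detection)
```

In a long-term setting, fields like the observation count and last-seen timestamp are what would let a mapping system distinguish stable landmarks from objects that have likely moved or disappeared.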

Dingo-O Rises to the Challenge

One of the main challenges in implementing such a system is collecting large amounts of training and test data that capture the variety of objects, dynamics, and long-term structural changes expected in a busy, ever-changing environment. Collecting such data directly in hospitals has been difficult for the team because of privacy concerns. Given these constraints, they opted for the Dingo-O as an easy and reliable solution.

 

Integrated Sensors

The Dingo-O was used for data collection and for online, on-board testing of the algorithms the lab developed to build object-based maps of dynamic hospital environments. The sleek, compact indoor platform was equipped with a sensor suite of four Intel RealSense D455 depth cameras mounted on top, recording RGB-D streams that cover a 360° field of view. In addition, a FLIR Blackfly S GigE was used as a fisheye camera to extract ground-truth poses from the localization infrastructure. The team also added a pair of 2D SICK LiDARs and used wheel odometry for additional data collection.

Intel RealSense D455 and FLIR Blackfly S GigE Output on RViz
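As a rough illustration of how such a multi-camera setup might be consumed in ROS Noetic, the sketch below subscribes to the colour streams of the four depth cameras and logs approximately synchronised frame sets. The camera namespaces and topic names are assumptions made for illustration, not the lab's actual launch configuration.

```python
#!/usr/bin/env python3
"""Minimal sketch (not the lab's code): subscribe to four RealSense colour
streams and log roughly synchronised frame sets."""
import rospy
import message_filters
from sensor_msgs.msg import Image

# Hypothetical camera namespaces for the four D455s mounted on the Dingo-O.
CAMERAS = ["cam_front", "cam_left", "cam_rear", "cam_right"]

def callback(*frames):
    stamps = [f.header.stamp.to_sec() for f in frames]
    rospy.loginfo("Synchronised frame set, max stamp spread: %.3f s",
                  max(stamps) - min(stamps))

def main():
    rospy.init_node("rgbd_sync_logger")
    subs = [message_filters.Subscriber(f"/{cam}/color/image_raw", Image)
            for cam in CAMERAS]
    # Allow up to 50 ms of slop between the four cameras.
    sync = message_filters.ApproximateTimeSynchronizer(subs, queue_size=10,
                                                       slop=0.05)
    sync.registerCallback(callback)
    rospy.spin()

if __name__ == "__main__":
    main()
```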

Onboard the Dingo-O platform were an Intel NUC10i7FNK running ROS Noetic and an Nvidia Jetson Xavier AGX, both on Ubuntu 20.04. The NUC ran the sensor and Dingo-O drivers, while the Jetson handled deep-learning applications such as object detection. The team's localization approach ran on the NUC, with mapping done on the Jetson.
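To give a flavour of how this split between the two computers might look in practice, here is a minimal, hypothetical sketch of a Jetson-side node: it runs a stubbed-out detector on incoming colour images and republishes a plain-text summary for downstream nodes. The topic names, message type, and detector are assumptions, not the lab's actual interfaces.

```python
#!/usr/bin/env python3
"""Illustrative sketch only: a Jetson-side ROS node that runs an object
detector on incoming colour images and republishes the results for the
mapping pipeline. The detector itself is a placeholder."""
import rospy
from sensor_msgs.msg import Image
from std_msgs.msg import String

def detect_objects(image_msg):
    # Placeholder for a GPU-based detector running on the Jetson.
    # Returns a list of (label, confidence) pairs.
    return [("hospital_bed", 0.9)]

class DetectionNode:
    def __init__(self):
        # Hypothetical topic names; real ones depend on the camera launch files.
        self.pub = rospy.Publisher("/detections", String, queue_size=10)
        rospy.Subscriber("/cam_front/color/image_raw", Image, self.on_image,
                         queue_size=1, buff_size=2**24)

    def on_image(self, msg):
        detections = detect_objects(msg)
        summary = ", ".join(f"{label} ({conf:.2f})" for label, conf in detections)
        self.pub.publish(String(data=summary))

if __name__ == "__main__":
    rospy.init_node("object_detection_node")
    DetectionNode()
    rospy.spin()
```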

 

Long-Term Indoor Localization With Metric-Semantic Mapping

By employing the Dingo-O for the task, the team was able to collect data on the fly without depending on third parties. Its smooth drive and its ability to power several sensors and computers are what led the team to choose the Dingo-O over simulation-based alternatives; setting up a simulation with realistic visuals can be an arduous and time-consuming task.

“Dingo-O had the ability to support all the equipment we wanted to mount on it. Its size was suitable for indoor operation and it was within our price range.” – Nicky Zimmerman, Lead Maintainer and Robotics PhD candidate, Stachniss Lab, University of Bonn

Dingo-O will also be featured in a live demo as a part of the lab’s IROS 2023 project submission.

The Future of Assistive Robots in Healthcare

The team at the Stachniss Lab aims to continue its research on long-term localization and mapping in indoor environments using the Dingo-O, and hopes to run a successful demo in a hospital environment.

The Stachniss Lab members involved in this project are Nicky Zimmerman (Lead Maintainer) and Matteo Sodano, Elias Marks, and Haofei Kuang (Support). The team would also like to thank Holger Milz, Michael Plech, and Ralf Becker for their contributions to sensor integration.

If you would like to learn more about the Stachniss Lab at the University of Bonn, you can visit their website here.

If you would like to learn more about Project Harmony, you can visit their website here.

If you would like to learn about Dingo, visit our website here.
