While the Defense Advanced Research Projects Agency (DARPA) Subterranean (SubT) Challenge might be postponed because of COVID-19, CSIRO’s Data61 Robotics and Autonomous Systems Group recently undertook a home-grown challenge of their own at the Chillagoe-Mungana Caves in North Queensland.

Team members from Data61, Emesent and the Georgia Institute of Technology (who were unfortunately unable to attend in person) deployed a fleet of autonomous robots to search the cave system, reporting the location of objects including mannequins (representing human survivors), ropes, backpacks, and cell phones. The robot team included two BIA5 OzBot Titan tracked platforms, one CSIRO-built Dynamic Tracked Robot, and two Emesent DJI M210 drones.

The cave was separated into two courses – Alpha and Beta – with human supervisors receiving no information about the location of objects, similar to a real-life search-and-rescue operation. According to team member Dr Navinda Kottege, the autonomous system controlling the robots was often more accurate at perceiving traversable terrain than human operators. “Over time, the operator learned to trust the autonomy of the vehicles and allow them to perform tasks independently,” explains Dr Kottege.

“In an urban environment, human operators can make some assumptions about the geometry of a location, such as flat ground and vertical walls, based on the point cloud data received from robots and their camera feeds. These visual cues are not present to the same extent in random cave formations and may prove to be an untrustworthy source of input for operators when making decisions.”

With that in mind, one of the biggest challenges of the three-day expedition was the complex and deformable nature of the caves. “In a cave environment, there are many soft and breakable ledges that may not bear the weight of a robot well. We encountered several instances of robots sliding and rolling as a result of this,” recalls Dr Kottege. “As a result, we will investigate more cautious approaches to ledges and an improved model for track placement and prediction of a robot’s orientation.”
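The cautious-ledge idea above can be sketched in a few lines. The following is a hypothetical illustration only (not CSIRO's actual pipeline): it fits a plane to a local patch of LiDAR points, computes the slope, and applies an extra safety margin near ledges. The threshold and margin values are assumptions for the sketch; real traversability systems use far richer costmaps.

```python
import numpy as np

# Assumed limits for a tracked platform (illustrative values only).
MAX_SLOPE_DEG = 25.0       # nominal tilt limit
SAFETY_MARGIN_DEG = 5.0    # extra caution near soft or breakable ledges

def patch_slope_deg(points: np.ndarray) -> float:
    """Fit a plane z = ax + by + c to an (N, 3) point patch and return
    the steepest slope of that plane in degrees."""
    A = np.c_[points[:, 0], points[:, 1], np.ones(len(points))]
    coeffs, *_ = np.linalg.lstsq(A, points[:, 2], rcond=None)
    a, b, _ = coeffs
    # Gradient magnitude of the plane gives the tangent of the slope.
    return float(np.degrees(np.arctan(np.hypot(a, b))))

def is_traversable(points: np.ndarray, near_ledge: bool = False) -> bool:
    """Flag a patch as traversable, tightening the limit near ledges."""
    limit = MAX_SLOPE_DEG - (SAFETY_MARGIN_DEG if near_ledge else 0.0)
    return patch_slope_deg(points) <= limit

# A flat patch passes; a 45-degree ramp does not.
flat = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [1, 1, 0]], float)
ramp = np.array([[0, 0, 0], [1, 0, 1], [0, 1, 0], [1, 1, 1]], float)
```

The `near_ledge` flag captures the "more cautious approaches to ledges" mentioned in the quote: the same terrain that passes in open ground can be rejected next to a breakable edge.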

Ground robots were equipped with Data61’s CatPack, a perception system running the group’s Wildcat SLAM software, which can be attached to any ground robot or vehicle to provide real-time localisation and mapping data for autonomous operation and high-fidelity mapping. Drones were enabled with Emesent’s Hovermap, an award-winning 3D LiDAR mapping and autonomy payload that features omni-directional collision avoidance, GPS-denied flight and advanced autonomy functions.

“We’re developing a third ATR (All-terrain Tracked Robot) platform named Drop Bear while making some design and control improvements to our Dynamic Tracked Robot (DTR) Pumpkin,” says Dr Kottege. “Our fleet is also expanding to include Ghost Robotics’ Vision60 quadrupeds, with legged robots having a definite advantage in traversing very rough terrain such as that found in natural caves.”

“Since we’re using an approach of homogeneous perception and autonomy with heterogeneous platforms, we can add new robot platforms to our fleet while still using the same systems with minor configuration changes.”
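One way to picture "homogeneous autonomy on heterogeneous platforms" is a shared stack parameterised by a per-platform configuration, so that a new robot is a new entry rather than new autonomy code. The sketch below is purely illustrative: the schema, field names, and speed values are assumptions, not CSIRO's actual configuration format.

```python
from dataclasses import dataclass

@dataclass
class PlatformConfig:
    """Hypothetical per-platform settings fed to a shared autonomy stack."""
    name: str
    locomotion: str        # "tracked", "legged", or "aerial"
    max_speed_mps: float   # illustrative value, not a real spec
    perception_payload: str

# The fleet from the article, expressed as configuration entries.
FLEET = {
    "titan": PlatformConfig("BIA5 OzBot Titan", "tracked", 1.5, "CatPack"),
    "pumpkin": PlatformConfig("DTR Pumpkin", "tracked", 2.0, "CatPack"),
    "m210": PlatformConfig("Emesent DJI M210", "aerial", 10.0, "Hovermap"),
}

# Adding the Vision60 quadruped is a minor configuration change,
# not a rewrite of the perception or autonomy systems.
FLEET["vision60"] = PlatformConfig(
    "Ghost Robotics Vision60", "legged", 1.0, "CatPack"
)
```

The design point is that perception and autonomy remain identical across the fleet; only the platform-specific parameters differ.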

Get a closer look at the courses, robots and teams in action with the highlights reel below.


A special thank you to team members Katrina Lo Surdo, Dennis Frousheger and Nicholas Hudson for providing photos and videos of the challenge.