
By Alison Donnellan · 3 November 2021 · 4 min read

Beating competitors from globally renowned institutions such as NASA Jet Propulsion Laboratory, MIT, California Institute of Technology and Carnegie Mellon University, team CSIRO’s Data61 claimed silver at the DARPA (Defense Advanced Research Projects Agency) Subterranean Challenge Final Event in September.

As the first Australian team to make it to a DARPA Challenge Final and place in the top three, team CSIRO’s Data61 have undoubtedly cemented their position as world leaders in robotic innovation.  

Here, Team Leader of team CSIRO’s Data61 and Group Leader of Data61’s Robotics and Autonomous Systems Group Dr Navinda Kottege shares a behind-the-scenes look at the techniques, strategies, and technology the team employed during the Challenge.  

Walk us through the tiebreaker in the final round between team CERBERUS and team CSIRO’s Data61.  

It was very, very close. We were both at the top of the leaderboard after the final run with 23 points each.

Tiebreaker rules were invoked, with the winner determined by which team scored the last point the fastest. We were behind Team CERBERUS by less than 46 seconds. 

A photo finish! Team CSIRO’s Data61 did, however, outdo the competition with their mapping and navigation capabilities?

That’s right. The team won ‘Single Most Accurate Report’ because of our outstanding object detection and localisation aided by our Wildcat SLAM technology. Unfortunately, this wasn’t one of the scoring metrics, but it did contribute to us performing as well as we did!  

DARPA compared the mapping performance of the eight finalists, and we were the only team to have 0% deviation from the map DARPA created themselves. 

How did team CSIRO’s Data61 achieve such an excellent result? You mentioned Wildcat SLAM technology.   

Wildcat is our cutting-edge 3D SLAM (Simultaneous Localisation and Mapping) software, enabling mapping and localisation for robots working in a team in challenging GPS-denied environments. The maps and pose estimates it produces are used by our autonomy system to allow the robots to perform multi-agent navigation and effectively explore unknown environments.

The primary sensory modality on the UGVs (Uncrewed Ground Vehicles) was our ‘Catpack’ technology. It consists of a spinning LiDAR (Light Detection and Ranging) sensor, an IMU (Inertial Measurement Unit), and four cameras for object detection and localisation.

Catpacks were mounted on the UGVs, while the drones had an equivalent capability in the Hovermap payload developed by our teammates Emesent. Having this uniform sensing method across all the robots made it much easier for us to develop high-quality and consistent maps that allowed the robots to localise and navigate the course.
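As a rough illustration of the underlying idea (a toy sketch only, not Wildcat’s actual algorithm): the IMU dead-reckons a high-rate pose prediction, and a lower-rate LiDAR scan-match result corrects the accumulated drift. All numbers and the blending weight below are invented for illustration.

```python
import numpy as np

# Toy 2D illustration of LiDAR + IMU fusion (NOT Wildcat's actual algorithm):
# the IMU dead-reckons a high-rate pose prediction, and a lower-rate LiDAR
# scan-match estimate corrects the accumulated drift via a weighted blend.

def imu_predict(pose, velocity, yaw_rate, dt):
    """Propagate (x, y, heading) forward using IMU-style dead reckoning."""
    x, y, theta = pose
    theta_new = theta + yaw_rate * dt
    x_new = x + velocity * dt * np.cos(theta_new)
    y_new = y + velocity * dt * np.sin(theta_new)
    return np.array([x_new, y_new, theta_new])

def lidar_correct(predicted_pose, scan_match_pose, scan_match_weight=0.7):
    """Blend the IMU prediction with a (hypothetical) LiDAR scan-match pose."""
    return (1.0 - scan_match_weight) * predicted_pose + scan_match_weight * scan_match_pose

pose = np.zeros(3)                      # start at the origin, heading along x
for step in range(100):                 # 100 IMU updates at 100 Hz
    pose = imu_predict(pose, velocity=1.0, yaw_rate=0.05, dt=0.01)

# Pretend a scan match against the shared map returned this pose estimate.
scan_match_pose = np.array([0.98, 0.03, 0.05])
pose = lidar_correct(pose, scan_match_pose)
print("fused pose (x, y, heading):", pose)
```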

The Wildcat SLAM system was matured throughout our involvement in the DARPA SubT Challenge and is now commercially available through the early adopter program and in Emesent’s products.

Let’s talk about object detection. Teams were tasked with locating models representing lost or injured humans, backpacks, or phones, as well as variable conditions such as pockets of gas. What kind of technology did team CSIRO’s Data61 use to identify these items?  

Our main sensing modality was vision-based, complemented by carbon dioxide and Wi-Fi detection. DeNet, a real-time object detection system developed by Data61’s Imaging and Computer Vision group, enabled vision-based object detection for all robots.

The Robotics and Autonomous Systems Group tracked and localised these detections using LiDAR-based point clouds.
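Conceptually, that pairing takes a 2D camera detection and the LiDAR points that fall inside it to recover a 3D position. Below is a minimal, hedged sketch of that idea; the pinhole intrinsics, frame assumptions and thresholds are illustrative, not the team’s actual code.

```python
import numpy as np

# Minimal sketch: localise a camera detection in 3D using LiDAR points.
# Assumes the LiDAR points are already expressed in the camera frame and a
# simple pinhole camera model; all numbers are illustrative.

K = np.array([[600.0,   0.0, 320.0],   # hypothetical pinhole intrinsics
              [  0.0, 600.0, 240.0],
              [  0.0,   0.0,   1.0]])

def localise_detection(bbox, lidar_points_cam):
    """Return the median 3D point (camera frame) whose projection lies in bbox."""
    u_min, v_min, u_max, v_max = bbox
    in_front = lidar_points_cam[lidar_points_cam[:, 2] > 0.1]  # points ahead of the camera
    pixels = (K @ in_front.T).T                                # project to the image plane
    pixels = pixels[:, :2] / pixels[:, 2:3]
    mask = ((pixels[:, 0] >= u_min) & (pixels[:, 0] <= u_max) &
            (pixels[:, 1] >= v_min) & (pixels[:, 1] <= v_max))
    hits = in_front[mask]
    if len(hits) == 0:
        return None                                            # no LiDAR support for this detection
    return np.median(hits, axis=0)                             # robust 3D estimate

# Example: a detector (e.g. DeNet) reports a backpack bounding box in pixels.
bbox = (300, 200, 360, 280)
points = np.random.uniform([-2, -2, 1], [2, 2, 8], size=(5000, 3))  # fake LiDAR cloud
print("estimated 3D location (camera frame):", localise_detection(bbox, points))
```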

DARPA SubT Challenge Final Event course map created by CSIRO's Data61.

And what about communication? As part of the Challenge, each team’s robots had to communicate back to a single human operator at the base station. When you are sending these bots so far underground, how was that possible?

It was a challenge, but we had experienced similar circumstances in previous DARPA circuit events. The 2020 Urban Circuit was held in an uncommissioned nuclear facility with two-metre-thick lead and concrete walls, making radio transmission impossible.

We knew relying solely on radio communications was untenable, so we equipped the robots – particularly the all-terrain tracked robots – with communication nodes that could be dropped throughout the course to create a communications backbone. The robots were equipped with enough autonomy to go beyond communication range to explore the course and identify objects, before returning to within range to share that data with the base station.

If one of the bots entered the non-comms range between another robot and the base station, it acted as a mobile comms node to hop the data back from the first robot to the base station.
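In software terms this is essentially store-and-forward messaging: a robot buffers its reports while out of range and flushes them whenever a path to the base station (direct, via a dropped node, or via another robot) becomes available. A toy sketch of that behaviour follows; the class and method names are invented, not the team’s stack.

```python
# Toy store-and-forward relay sketch: robots queue reports while out of comms
# range and forward them when a link towards the base station exists.

class BaseStation:
    def __init__(self):
        self.reports = []

    def receive(self, report):
        self.reports.append(report)

class Robot:
    def __init__(self, name):
        self.name = name
        self.outbox = []            # reports buffered while out of range

    def observe(self, report):
        self.outbox.append(report)

    def receive(self, report):
        self.outbox.append(report)  # a relaying robot re-buffers and forwards later

    def try_flush(self, link):
        """Send everything in the outbox if a link (base or relay robot) is available."""
        if link is None:
            return                  # still out of range: keep buffering
        while self.outbox:
            link.receive(self.outbox.pop(0))

base = BaseStation()
explorer = Robot("explorer")
relay = Robot("relay")

explorer.observe("backpack at (12.3, -4.1, 0.8)")
explorer.try_flush(link=None)       # deep in the course: no comms, report stays buffered
explorer.try_flush(link=relay)      # relay robot in range: hop the report to it
relay.try_flush(link=base)          # relay is in range of base: data arrives
print(base.reports)
```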

How do the robots know when to return within the time frame if they have gone beyond communication range? Are they equipped with a weighted decision matrix?   

Developed in collaboration with team partner the Georgia Institute of Technology, the multi-robot task allocation (MRTA) framework enabled the robots to coordinate and successfully execute tasks, including relaying information and returning to communications range.

The team’s human operator at the base station set the timing of these synchronised tasks.
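One simple way to picture a task allocation step (purely illustrative; the actual MRTA framework developed with Georgia Tech is more sophisticated) is a greedy assignment that gives each pending task to the robot that can service it at the lowest cost:

```python
# Illustrative greedy task allocation (not the actual MRTA framework):
# assign each pending task to the currently cheapest robot, where cost is
# straight-line distance from the robot's last known position.

import math

robots = {"tracked_1": (0.0, 0.0), "tracked_2": (30.0, 5.0), "drone_1": (10.0, 20.0)}
tasks = [("explore_branch_A", (40.0, 10.0)),
         ("return_to_comms", (5.0, 2.0)),
         ("explore_branch_B", (12.0, 35.0))]

def distance(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

assignments = {}
available = dict(robots)
for task_name, task_pos in tasks:
    # Pick the closest still-available robot for this task.
    best = min(available, key=lambda r: distance(available[r], task_pos))
    assignments[task_name] = best
    del available[best]             # one task per robot in this toy example

print(assignments)
```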

What was the role of the sole human supervisor during the Challenge?  

The human supervisor plays an extremely important role, even though the robots are autonomous. Mr. Brendan Tidd was the team’s human supervisor and performed remarkably well.

He set the high-level mission tasks for the robots and adapted the strategy as needed. This included deciding which robot was best suited to tackling certain types of terrain, setting exclusion zones, determining the best locations to drop communications nodes, and when and where to launch the drones.

What’s next for the team?  

The SubT Challenge has been an excellent testing ground for Data61 technology, with a suite of our matured solutions now available for commercialisation and use.

One of those is our Wildcat SLAM technology, which enables a vehicle to map and localise itself in a GPS-denied environment without any human intervention. Australian and international SMEs are already using this technology to save resources, time, and money. Commercialisation of our multi-agent navigation stack is underway.

This robotic technology can be used in any scenario where you need to gather situational awareness data without putting humans in harm’s way. This could be in mining automation, confined space inspection, search and rescue, structural integrity surveying, and more.

The team also want this achievement to inspire the next generation of leaders in robotics and AI.

Numerous opportunities are now available for talented engineers and research scientists to join our team and help solve some of Australia’s greatest challenges using data and digital science.

So, if you’re interested in robot perception and learning, robot-world interaction, and autonomous vehicle navigation in complex environments, apply!

CSIRO thanks DARPA Challenge partners and team members Emesent and the Georgia Institute of Technology, as well as Queensland University of Technology and The University of Queensland for their collaboration. 
