
Video Shows Fleet of Drones Could Speed Up Search and Rescue Operations

Harnessing autonomous car technology for good.

Coordinating the efforts of experienced rangers, armies of volunteers, specially trained dogs, and deafening helicopters is no easy task. And yet, for all the resources it demands, search and rescue is obviously necessary.

Fortunately, these operations are getting a lot more efficient. According to a Boston Globe report, the National Park Service resolved 93 percent of its search and rescue calls within 24 hours between 2004 and 2014. Now, new research from NASA’s Langley Research Center and MIT has the potential to speed the process up even more by focusing on one of the world’s easiest places to get lost: forests.

Led by graduate student Yulun Tian, the group released a video last Thursday debuting an autonomous system of quadrotor drones designed to search an area and compile a map quickly and efficiently. Rangers watching the map from a ground station would be free to focus on the rescue itself. The group will present the research at the International Symposium on Experimental Robotics next week.

Let Drones Do the Dirty Work

Though no search and rescue operation is simple, forests can prove particularly challenging. Helicopters cannot see through dense canopies, for one, and weak GPS signals can make drone use impractical.

Tian’s team, however, didn’t want to give up on drones just because GPS fails under the canopy: their ability to bob and weave between branches could drastically reduce the number of eyes needed to carry out a search. To get around the GPS problem, the group took a page from autonomous cars (think Waymo) and used LIDAR to navigate.

When the signatures of tree clusters match, the system combines the maps to form a complete map.

MIT

LIDAR (Light Detection and Ranging) uses laser pulses to measure the distance to an object. It’s difficult for drones to differentiate individual trees, but with LIDAR they can instead look at clusters of trees. By measuring the distances between the trees in a cluster, each drone can create a signature of its location and draw a map. When the system recognizes matching signatures from different drones, it knows they’ve visited the same spot, and it can use that overlap to knit the individual maps together.
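To make the idea concrete, here is a minimal sketch of how that kind of signature matching might work, assuming each drone reduces a cluster to the sorted distances between trunks. The function names and the matching tolerance are illustrative; this is not the team’s code.

```python
# Sketch only: represent a cluster of trees by the sorted pairwise distances
# between trunks, and treat two clusters as the same place when those
# "signatures" nearly match.
import itertools
import math

def cluster_signature(trees):
    """trees: list of (x, y) trunk positions detected by LIDAR.
    Sorted pairwise distances are unchanged by rotation and translation."""
    return sorted(math.dist(a, b) for a, b in itertools.combinations(trees, 2))

def same_place(sig_a, sig_b, tol=0.2):
    """True if two signatures agree within `tol` meters on every distance."""
    if len(sig_a) != len(sig_b):
        return False
    return all(abs(d1 - d2) < tol for d1, d2 in zip(sig_a, sig_b))

# Two drones see the same three trees from different positions and headings.
drone_1 = [(0.0, 0.0), (2.0, 0.5), (1.0, 3.0)]
drone_2 = [(10.5, -2.0), (10.0, -4.0), (13.0, -3.0)]  # same triangle, shifted and rotated

if same_place(cluster_signature(drone_1), cluster_signature(drone_2)):
    print("Overlap found -- merge the two drones' maps here.")
```

Because the signature depends only on the spacing between trees, it looks the same no matter which direction a drone approached the cluster from, which is what lets the ground station spot overlaps without GPS.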

Calculating the Speediest Course

The group’s research is a step up not only from human searchers, but also from previous drone applications. To maximize efficiency, earlier search and rescue drones would simply fly to the closest unexplored area next. Sounds reasonable, right? But the “closest” spot can come at the price of reorientation.

“That doesn’t respect dynamics of drone [movement],” Tian says in a statement. “It has to stop and turn, so that means it’s very inefficient in terms of time and energy, and you can’t really pick up speed.”

The drones use LIDAR to navigate, making them GPS and satellite-free.

Melanie Gonick, MIT

In Tian’s system, drones calculate the closest path while taking their current orientation into account, which results in a spiraling trajectory that lets them maintain momentum, conserving energy and time. And in search and rescue, every second counts.
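Here is a rough sketch of that kind of orientation-aware selection, assuming the planner scores each candidate spot by its distance plus a penalty for how sharply the drone would have to turn to point at it. The weighting and function names are my own illustration, not the paper’s planner.

```python
# Sketch only: instead of always flying to the nearest unexplored spot,
# score each candidate by distance *plus* the heading change needed to reach it.
import math

def pick_next(position, heading, candidates, turn_weight=2.0):
    """position: (x, y); heading: radians; candidates: list of (x, y) spots.
    Returns the candidate that minimizes distance plus weighted heading change."""
    def cost(c):
        dx, dy = c[0] - position[0], c[1] - position[1]
        distance = math.hypot(dx, dy)
        # Absolute heading change needed to face the candidate, in [0, pi].
        turn = abs(math.remainder(math.atan2(dy, dx) - heading, 2 * math.pi))
        return distance + turn_weight * turn
    return min(candidates, key=cost)

# Drone at the origin, flying along +x. A spot slightly behind it is nearer,
# but the planner prefers the one ahead so the drone can keep its speed.
frontiers = [(-3.0, 0.5), (4.0, 1.0)]
print(pick_next((0.0, 0.0), 0.0, frontiers))  # -> (4.0, 1.0)
```

The turn penalty is what nudges the drones into sweeping, spiral-like paths instead of stop-and-turn hops.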

The group tested the system in simulations and flew two drones in a real forest, successfully mapping 20-square-meter areas in two to five minutes. For full deployment, the drones would be fitted with object detection systems that could identify a human form and drop a pin at the person’s location, kicking off a rescue mission.
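That pin-dropping step could be as simple as converting a sighting from a drone’s local frame into coordinates on the shared map. The sketch below is a hypothetical illustration of that handoff, not part of the published system.

```python
# Sketch only: turn a detection in the drone's local frame into a map pin.
import math

def drop_pin(drone_xy, drone_heading, sighting_range, sighting_bearing):
    """drone_xy: (x, y) on the shared map; angles in radians.
    Returns the map coordinates to hand off to the rescue team."""
    angle = drone_heading + sighting_bearing
    return (
        drone_xy[0] + sighting_range * math.cos(angle),
        drone_xy[1] + sighting_range * math.sin(angle),
    )

# Drone at (12, 7) on the map, facing northeast, sees someone 6 meters away,
# 30 degrees to its right.
pin = drop_pin((12.0, 7.0), math.pi / 4, 6.0, -math.pi / 6)
print(f"Send rescuers to approximately ({pin[0]:.1f}, {pin[1]:.1f})")
```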

By sidestepping both human and drone inefficiencies, MIT’s fleet of quadrotors could make a dent in the $51.4 million the National Park Service spent on search and rescue between 2004 and 2014. More importantly, the new system could push that 93 percent success rate even higher.
