Drones can often be faster and more efficient than helicopter or plane fly-overs, or rescue parties scouring terrain on foot, when searching forests for lost or immobilized people. Still, their efficacy is limited by onboard tech's inability to see through dense tree canopies. That may soon change.
Specialized forest-penetrating search and rescue drone
Researchers from the Computer Science Department at Johannes Kepler University in Austria have devised a drone search and rescue system that can cut through forest obstruction to scan ground-level terrain for human presence. The results of their testing are in a report published under the informative (if mouthful) title, “An Autonomous Drone for Search and Rescue in Forests Using Airborne Optical Sectioning.” In it, they describe building a system around an uncrewed aerial vehicle (UAV) fitted with a thermal camera, paired with onboard machine vision tech that analyzes the video stream in real time to determine, and autonomously execute, the most promising search options based on available ground evidence.
The system essentially gives the drone “X-ray specs,” peering through the trees to look directly for human heat below, then deciding which course of search action to adopt next.
At the risk of getting overly wonky about the hardware involved, the Johannes Kepler University team used a German MikroKopter Okto XL 6S12 drone equipped with a Teledyne FLIR Vue Pro thermal camera with a 9 mm fixed focal length lens, imaging in the 7.5 to 13.5 µm spectral band. (Let us pause for breath.) They then bunged on a Raspberry Pi 4B single-board computer and an Intel Neural Compute Stick 2, along with other gizmos including a Sixfab 3G/4G/LTE communications module – all mounted on a rotatable gimbal that points the camera toward ground level during flight.
Here’s how the assembly works: The single-board computer pilots the UAV’s flight and controls its thermal camera. It downloads images from the camera’s memory, preprocesses them, and assembles a final, combined image of the terrain below. All of that occurs in a matter of seconds as the drone continues its flight.
Super-fast onboard tech peers through trees and processes images to locate human heat signatures
As it does, a computational imaging algorithm developed by the researchers under the name Airborne Optical Sectioning (AOS) essentially erases the forest canopy from the raw thermal camera images. That permits the UAV to clearly see the forest floor beneath it. Deep learning classification algorithms take over from there, identifying any thermal images that reveal human heat signatures.
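The core idea behind AOS is to treat many single-viewpoint images as one giant "synthetic aperture": each frame is registered to a common ground plane and the stack is averaged, so canopy pixels (which project to different ground positions from different viewpoints) blur away while ground-level heat sources stay aligned and reinforce. Here is a minimal sketch of that averaging step; the function name and the precomputed per-frame pixel shifts are illustrative assumptions, not the authors' actual implementation.

```python
import numpy as np

def integrate_aos(frames, shifts):
    """Average thermal frames after registering each to a common ground
    plane. `shifts` are hypothetical per-frame pixel offsets that would,
    in practice, be derived from the drone's pose. Occluders (leaves,
    branches) land at different ground positions in each frame and get
    averaged away; a person on the ground stays aligned and stands out."""
    acc = np.zeros_like(frames[0], dtype=np.float64)
    for frame, (dy, dx) in zip(frames, shifts):
        # Shift the frame so its ground plane lines up with the others.
        acc += np.roll(frame, shift=(dy, dx), axis=(0, 1))
    return acc / len(frames)
```

In the integrated result, a stationary heat signature keeps its full intensity while each occluding leaf contributes only 1/N of its brightness, which is what lets the classifier see "through" the canopy.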
Though the onboard system can be fed navigational instructions from rescue managers monitoring the flight, it is designed to make autonomous search decisions based on its own onboard analyses. It’s programmed, for example, to make an immediate second pass over an area to verify or invalidate initial readings suggesting human presence. If the UAV confirms people are below, the communication module sends an alert for rescue teams to scramble to the location, including an AOS thermal image of the detected person, the GPS coordinates involved, and the neural network’s confidence score that the location is correct.
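That verify-then-alert behavior can be sketched as a simple decision loop. Everything here is an illustrative assumption (the threshold value, field names, and function are not from the paper); it just shows the shape of the logic: a promising first-pass detection triggers a second pass, and only a confirming second pass produces an alert.

```python
from dataclasses import dataclass

# Hypothetical cutoff for "promising" detections; the paper does not
# publish the actual threshold used onboard.
CONFIDENCE_THRESHOLD = 0.5

@dataclass
class Detection:
    lat: float
    lon: float
    confidence: float  # neural network's confidence score

def plan_next_action(first_pass, second_pass=None):
    """Sketch of the drone's autonomous decision step: second-pass
    confirmation before alerting rescue teams."""
    if first_pass.confidence < CONFIDENCE_THRESHOLD:
        return ("continue_search", None)
    if second_pass is None:
        # Promising reading: fly back over the same spot to verify.
        return ("second_pass", (first_pass.lat, first_pass.lon))
    if second_pass.confidence >= CONFIDENCE_THRESHOLD:
        # Confirmed: package what rescuers need, per the article --
        # AOS thermal image, GPS coordinates, and confidence score.
        alert = {
            "thermal_image": "aos_frame.png",  # placeholder filename
            "gps": (second_pass.lat, second_pass.lon),
            "confidence": second_pass.confidence,
        }
        return ("send_alert", alert)
    return ("continue_search", None)
```

The key design point is that a single noisy thermal hit never triggers an alert on its own; rescuers are only scrambled after the drone has independently re-verified the reading.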
The AOS system was tested in 17 field trials, with flights conducted over varied forest types and weather conditions. In pre-established flight modes, the UAV’s detection rate was 86% – finding 30 of 34 volunteers posing as lost persons. In missions left to fully autonomous operation, the rescue drone’s own flight decisions came up perfect, finding eight out of eight missing subjects.
Photo: Oliver Bimber
FTC: DroneDJ is reader supported, we may earn income on affiliate links