Drone detection in dense city centers poses major challenges – when it’s not flat-out impossible. But one Duke University researcher thinks he may be on the way to solving that problem by pairing old-school radar technology with self-teaching computers.
IDing potential drone security threats to cities
Radar is pretty good at identifying airborne planes and distinguishing those from, say, hang gliders. Ditto friendly jets from hostile aircraft. By contrast, it starts to struggle when beamed down into crowded city streets. It does okay with cars and buses, but throw joggers, skateboarders, or bikes moving with irregular trajectories and speeds into the mix, and things get confusing.
Drones, meanwhile, can come off looking like stationary objects or the spinning blades of air conditioning fans amid that visual stew. Failing to stand out, they become effectively invisible to surveillance technologies. And that would pose a security problem should someone decide to use a craft for destructive purposes.
So why not use networks of AI-enhanced friendly drones to spot possibly hostile craft? That’s been done before, but the approach is usually limited to a relatively tight zone or single spot being secured. Extending detection across wide urban swaths proves far trickier.
That’s where Duke University professor of electrical and computer engineering Jeffrey Krolik steps in. He’s developing a system capable of detecting drone activity across a significantly wider zone using radar and a deep neural network (DNN) machine learning model.
The trick, he says, is to first get computers to identify and differentiate the various objects commonly present in the area designated for protection, so that any drone introduced later stands out as an anomaly.
Mistaken drone-bird identities
Krolik has performed trials on the Duke campus in spaces initially kept drone-free. Radar feeds into the computer allow the DNN program to learn the kinematics of the objects spotted. Their so-called micro-Doppler signatures and distinctive routes allow the computers to eventually distinguish them. Cars tend to follow roadways at steady speeds, for example, while bikes and people each move in more varied yet still distinctive fashion.
Once the DNN-equipped computer has stored a base of data on commonly present objects, a drone is introduced, which the program immediately flags as an anomaly. Thus far, the computer has weeded drones out from everything else with 98% accuracy.
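Krolik’s actual system isn’t public, but the learn-the-baseline-then-flag-outliers idea can be sketched in a few lines. The feature values below (average speed and heading variability) are hypothetical stand-ins for real micro-Doppler and kinematic measurements, and the nearest-neighbor threshold is a simple illustration rather than the DNN approach described above:

```python
import math

# Hypothetical kinematic signatures: (average speed in m/s, heading variability).
# A real system would learn micro-Doppler signatures; these are illustrative only.
baseline = [
    (13.0, 0.05),  # car on a roadway: fast, steady heading
    (12.5, 0.06),  # bus
    (5.5, 0.30),   # cyclist: moderate speed, weaving
    (2.0, 0.50),   # jogger: slow, irregular path
    (1.5, 0.55),   # pedestrian
]

def nearest_distance(sample, known):
    """Euclidean distance from a sample to its closest known-object signature."""
    return min(math.dist(sample, k) for k in known)

def is_anomaly(sample, known, threshold=2.0):
    """Flag anything too far from every object learned during the baseline phase."""
    return nearest_distance(sample, known) > threshold

# A hovering drone: near-zero ground speed but high rotor-driven signature
# variability (folded into the second feature here for illustration).
print(is_anomaly((0.5, 3.0), baseline))    # flagged as an anomaly
print(is_anomaly((12.8, 0.05), baseline))  # a car-like track passes as normal
```

The key design point mirrors Krolik’s quote below: nothing drone-specific is ever trained, so the detector adapts to whatever its local environment contains and treats the unfamiliar as suspect.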
“Most systems are designed in a laboratory to be taken out into the field,” Krolik told WRAL TechWire. “This one learns from its environment, because most of the time a drone isn’t there.”
The next challenge Krolik encountered was that, when viewed through the speed and bearing of radar feedback, birds can wind up looking a lot like drones. That meant going back to the drawing board with the feathered set, letting the DNN file various kinds of birds into the computer’s data bank. Once that was done, Krolik reintroduced drones onto the scene as the notable outlier.
Thus far, the machines have managed to distinguish the two types of flying bodies with 97% accuracy. Less clear, however, is whether the computers can tell a robin from a sparrow by their calls.
Photo: Duke University