Visual detection of drones could become more precise and reliable by combining cameras with machine learning, according to researchers with the U.S. Department of Homeland Security (DHS) Science and Technology (S&T) Directorate and Sandia National Laboratories.
DHS and Sandia are working to make visual drone detection as effective as detection based on thermal signatures, acoustic signatures or radio signals. Acoustic detection is less reliable in urban areas, where background noise can mask a drone’s presence. And as autonomous operations become more common, detecting a drone through the radio signals used to control it becomes more difficult.
Bryana Woo with Sandia National Laboratories said, “Current systems rely upon exploiting electronic signals emitted from drones, but they can’t detect drones that do not transmit and receive these signals.”
Currently, video-based drone detection is limited to raw data analysis, capturing and learning from the video alone. The novel temporal frequency analysis (TFA) being tested at Sandia digs deeper into the image. Instead of relying on heat signatures, acoustic signatures or the video at face value, TFA analyzes the frequency of pixel fluctuations in an image over time, eventually obtaining a “temporal frequency signature” for the drones it has observed.
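The core idea can be sketched in a few lines of Python. This is a minimal illustration of analyzing pixel fluctuation frequencies, not Sandia's actual implementation: the function name, signature, and the choice to average the per-pixel magnitude spectrum into one vector are all assumptions.

```python
import numpy as np

def temporal_frequency_signature(frames, fps):
    """Illustrative sketch: summarize how pixel intensities fluctuate
    over time in a stack of grayscale video frames.

    frames: array of shape (T, H, W) -- T frames of H x W intensities
    fps: frames per second of the video
    Returns (freqs, signature): frequencies in Hz and the mean
    magnitude spectrum across all pixels.
    """
    frames = np.asarray(frames, dtype=float)
    t = frames.shape[0]
    # Subtract each pixel's mean so the constant (DC) term
    # doesn't dominate the spectrum
    detrended = frames - frames.mean(axis=0, keepdims=True)
    # FFT along the time axis gives each pixel's fluctuation spectrum
    spectrum = np.abs(np.fft.rfft(detrended, axis=0))
    # Average over all pixels into a single signature for the clip
    signature = spectrum.reshape(spectrum.shape[0], -1).mean(axis=1)
    freqs = np.fft.rfftfreq(t, d=1.0 / fps)
    return freqs, signature
```

A periodic source in the scene, such as a rotor blinking light and shadow at a fixed rate, would show up as a peak at the corresponding frequency in this signature.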
TFA captures tens of thousands of frames in a video, enabling a machine to learn about an object and its associations from how it moves through time. If mastered, TFA could be the most precise discrimination method to date. “If you have a video of something, you can kind of identify it based on certain characteristics,” explained Jeff Randorf, an S&T engineering advisor. “You can train neural networks to recognize patterns, and the algorithm can begin to pick up on certain features.”
During tests at Sandia, impressions of three different multirotor drones were captured on a streaming video camera. Each drone traveled forward and back, side to side and up and down. The camera captured spatial and temporal locations. A machine learning algorithm was trained on the frames taken. The analysis renders the full flight path of the target object in all its directions.
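As a toy stand-in for that training step, the sketch below uses a nearest-centroid rule rather than the neural network the researchers describe: each class (say, "drone" or "bird") is summarized by the average of its frequency signatures, and a new signature is assigned to the closest class. The function names and labels are hypothetical.

```python
import numpy as np

def train_centroids(signatures, labels):
    """Average the frequency signatures of each labeled class.
    signatures: list of 1-D NumPy arrays; labels: parallel list of strings.
    Returns a dict mapping each label to its centroid signature."""
    return {
        lab: np.mean([s for s, l in zip(signatures, labels) if l == lab], axis=0)
        for lab in set(labels)
    }

def classify(signature, centroids):
    """Assign the label whose centroid is closest in Euclidean distance."""
    return min(centroids, key=lambda lab: np.linalg.norm(signature - centroids[lab]))
```

A real system would replace the centroid rule with a trained network, but the interface is the same: signatures in, class labels out.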
To challenge the system, Sandia researchers fed it more complex data from a cluttered environment around the drone, including birds, cars and helicopters. Over time, they noticed a considerable improvement in the system’s ability to discern whether an object was a drone or a bird.
The TFA work with Sandia is part of a larger S&T effort to stay abreast of the latest drone technologies. The number of commercial and personal drones in the sky is expected to nearly triple within the current decade, raising concerns about traffic management, how nefarious drones can be identified, and how drones can be distinguished from their surrounding environment.
The DHS S&T Directorate has taken on the challenges of identifying nefarious drones from multiple angles. Demonstrations at Camp Shelby in Mississippi enable law enforcement to develop a DHS interface for a future unmanned aircraft system (UAS) traffic management system, while keeping up with state-of-the-art counter-UAS capabilities.
Originally published at http://uasmagazine.com/articles/1950/dhs-explores-machine-learning-to-improve-drone-detection and imported automatically with the kind permission of The UAS Magazine; the original is in English. This article does not necessarily reflect the views of UAV DACH e.V.