MD5, in collaboration with Army Futures Command, challenged developers, designers and hackers to come together in Austin, Texas, for a hackathon exploring nontraditional, innovative methods to counter small unmanned aerial systems (sUAS). The “A” Team, a multidisciplinary team from Texas A&M University and the Army Research Laboratory (ARL), stepped up to the challenge and was one of three teams that won the MD5 A-Hack-of-the-Drones 2018 hackathon, earning $15,000 to further develop its ideas.
There is a growing need for technology and solutions to counter commercial, off-the-shelf sUAS, which pose a potential threat when used by terrorists, state-sponsored operatives and others who mean to cause harm. The hackathon brought together individuals and teams who created solutions to this problem, including prototypes, models, software applications and platforms.
The hackathon drew 127 participants, with 14 submissions for final evaluation. Participants came from many organizations, including Texas A&M, The University of Texas, the U.S. Army Armament Research, Development and Engineering Center, the 75th Innovation Command, the Naval Research Laboratory, QuantiTech and IBM Watson.
The “A” Team’s inspiration came from a major concern shared by the United States and allies like South Korea: the growing ubiquity of low-cost sUAS allows anyone with one of these devices to enter regions of civil or military interest and wreak havoc in previously unimaginable ways. The team believed it is critically important for agencies engaged in national security to be able to detect and track these devices to better protect the interests of those they serve.
A member of the team is an active-duty officer in the South Korean Air Force and is keenly aware of the swarms of sUAS operated by the North Korean military for reconnaissance and surveillance across the demilitarized zone between the two nations.
The “A” Team presented a computer vision solution to detect, track and intercept enemy drones in real time. From a live video stream, their solution uses a motion detection algorithm to detect any object moving in front of the camera. When motion is detected, machine learning is used to classify the moving object as 'drone' or 'not-drone.' This helps remove false positives when a bird or some other object moves in front of the camera.
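The article does not name the specific libraries or models the team used, so the following is only a minimal sketch of that detect-then-classify pattern, assuming OpenCV background subtraction for the motion step and a hypothetical classify_drone function standing in for a trained drone/not-drone classifier:

```python
import cv2

def classify_drone(crop) -> bool:
    """Placeholder for a trained 'drone' vs. 'not-drone' image classifier."""
    raise NotImplementedError("plug in a trained model here")

def detect_drones(video_source=0, min_area=500):
    """Yield pixel coordinates of moving objects classified as drones."""
    cap = cv2.VideoCapture(video_source)
    subtractor = cv2.createBackgroundSubtractorMOG2(detectShadows=False)
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        mask = subtractor.apply(frame)                          # motion mask
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)   # suppress speckle noise
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        for contour in contours:
            if cv2.contourArea(contour) < min_area:             # ignore tiny motions
                continue
            x, y, w, h = cv2.boundingRect(contour)
            crop = frame[y:y + h, x:x + w]
            if classify_drone(crop):                            # drop birds and other clutter
                yield (x + w / 2.0, y + h / 2.0)                # drone center in pixels
    cap.release()
```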
The same algorithm is then replicated on a second camera. If both cameras detect the same drone, that information is used to triangulate and track the enemy drone's 3D position using estimation algorithms. With the drone's 3D position available in real time, a vehicle can be sent to intercept it and minimize the threat.
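As another hedged illustration rather than the team's actual code, the geometric core of that triangulation step could look like the sketch below, assuming the two cameras have already been calibrated so their 3x4 projection matrices P1 and P2 are known; the downstream estimation and filtering the team used (for example, a Kalman-style tracker) is not described in the article, so only the position fix is shown:

```python
import cv2
import numpy as np

def triangulate_drone(P1, P2, pt1, pt2):
    """Estimate the drone's 3D position from matched detections in two cameras.

    P1, P2   : 3x4 camera projection matrices from a prior calibration.
    pt1, pt2 : (u, v) pixel coordinates of the same drone in each camera.
    """
    p1 = np.asarray(pt1, dtype=float).reshape(2, 1)
    p2 = np.asarray(pt2, dtype=float).reshape(2, 1)
    point_h = cv2.triangulatePoints(P1, P2, p1, p2)   # homogeneous 4x1 result
    return (point_h[:3] / point_h[3]).ravel()         # Euclidean (x, y, z)
```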
The ability to accurately detect, estimate and track objects in the field also has significant implications for private industry, especially companies that employ pattern and feature recognition in their services.
“It was great to work with many capable students and researchers during the competition. The multidisciplinary team was the key to the success during this competition: we were able to distribute the tasks according to each participant's strongest skills,” said team member Vinicius G. Goecks, a doctoral student in the Department of Aerospace Engineering at Texas A&M.
“We had great support from the mentors during the competition, which definitely helped to shape our final product. The ARL researchers helped us to define our computer vision approach and shape the product according to the Army's need.”
“A” Team members included Edan Coben, Emily Fojtik, Garrett Jares, Grayson Woods, Humberto Ramos, Nicholas Waytowich, Niladri Das, Sunsoo Kim, Vedang Deshpande, Venkata Tadiparthi, Vernon Lawhern and Goecks.
Support for the team came from faculty members Dr. Daniel Ragsdale of the Department of Computer Science and Engineering; Dr. John Valasek, Dr. John Hurtado and Dr. Raktim Bhattacharya of the Department of Aerospace Engineering; and the Army Research Laboratory.