Researchers Chuhao Liu and Shaojie Shen from the Hong Kong University of Science and Technology revealed an intriguing new way to control drones that uses holographic augmented reality hardware to create live 3D terrain maps, enabling drone pilots to simply point at targets visualized above any flat surface.

The holographic interface relies on a combination of interesting technologies: On the display side, a Microsoft HoloLens headset generates the augmented reality content as a colorful voxel map that can be viewed from any angle, built from an autonomous drone’s depth cameras and raycasting for real-time location data. Critically, the system provides a live, spatial sense of environmental elevation and depth, allowing the drone to be easily seen from a third-person perspective and repositioned.
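The paper's exact mapping pipeline isn't reproduced here, but the core idea of turning depth-camera readings into a voxel map can be sketched in a few lines. The function names, the 0.1 m resolution, and the dict-based occupancy store below are illustrative assumptions, not the researchers' implementation:

```python
import numpy as np

VOXEL_SIZE = 0.1  # hypothetical resolution: meters per voxel edge


def world_to_voxel(point, voxel_size=VOXEL_SIZE):
    """Quantize a 3D world-frame point (meters) to an integer voxel index."""
    return tuple(np.floor(np.asarray(point, dtype=float) / voxel_size).astype(int))


def update_voxel_map(voxel_map, depth_points, color=True):
    """Insert depth-camera points into an occupancy map.

    voxel_map:    dict mapping voxel index -> color value.
    depth_points: iterable of (x, y, z) points, assumed already transformed
                  into the world frame using the drone's pose estimate.
    """
    for p in depth_points:
        voxel_map[world_to_voxel(p)] = color
    return voxel_map
```

Because the map is keyed by discrete voxel indices rather than raw points, repeated observations of the same surface collapse into one cell, which keeps the data small enough to stream to the headset.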

Then the HoloLens feeds commands back to the drone, determining its next target within the holographic map by turning the wearer’s hand gestures and gaze into point-and-click-like controls. The autonomous drone then flies to the new location, updating the 3D map as it travels. A demonstration video the researchers provided looks straight out of a sci-fi movie — at least on the holography side. Due to bandwidth limitations, the drone only supplies 3D map data to the AR interface, not the accompanying first-person video.
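Selecting a target this way amounts to casting a ray from the wearer's gaze into the voxel map and taking the first occupied cell it hits. A minimal sketch of that lookup follows; the fixed-step ray march and the `pick_target` name are assumptions for clarity (a real system would likely use a proper DDA voxel traversal and the headset SDK's gaze ray):

```python
import numpy as np


def pick_target(voxel_map, origin, direction, voxel_size=0.1,
                max_range=20.0, step=0.02):
    """Walk a gaze ray through a voxel occupancy map and return the first
    occupied voxel index hit, or None if the ray exits the map's range.

    voxel_map: dict keyed by integer (i, j, k) voxel indices.
    origin:    ray start in world coordinates (e.g. the headset position).
    direction: gaze direction vector (need not be normalized).
    """
    d = np.asarray(direction, dtype=float)
    d /= np.linalg.norm(d)
    o = np.asarray(origin, dtype=float)
    t = 0.0
    while t < max_range:
        # Quantize the current sample point to its voxel index.
        idx = tuple(np.floor((o + t * d) / voxel_size).astype(int))
        if idx in voxel_map:
            return idx
        t += step
    return None
```

The returned voxel index (scaled back to world coordinates) would then serve as the goal position handed to the drone's planner.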

The HKUST team still has a way to go before the holographic drone control system is ready to be deployed. Initially, the drone’s data was shared using Wi-Fi within an indoor testing space, though low-latency 5G cellular connections will likely work outdoors, once 5G networks progress past their currently drone-limited stage. The researchers also noted that HoloLens’ “very limited field of [AR] view … caused frequent complaints” in a group of testers, an issue that could be addressed using HoloLens 2 or another AR headset. Additionally, testers required practice to become proficient at 3D targeting, despite their prior familiarity with AR hardware, an issue that might trace to gesture recognition or an imperfect 3D UI.

The 6-page document can be accessed here.

Source: VentureBeat

UAV DACH: Original post at https://www.uasvision.com/2020/08/13/ar-interaction-interface-for-autonomous-drone-navigation/, automatically imported with the kind permission of UAS Vision. The post does not necessarily reflect the opinion or position of UAV DACH e.V. The original is in English. UAV DACH e.V. is not responsible for the content.