DronePose: Photorealistic UAV-Assistant Dataset Synthesis for 3D Pose Estimation via a Smooth Silhouette Loss
Jan 3, 2021
1 min read
Georgios Albanis
Nikolaos Zioulis
Anastasios Dimou
Dimitrios Zarpalas
Petros Daras
Abstract
In this work we consider UAVs as cooperative agents supporting human users in their operations. In this context, the 3D localisation of the UAV assistant is an important task that can facilitate the exchange of spatial information between the user and the UAV. To address this in a data-driven manner, we design a data synthesis pipeline to create a realistic multimodal dataset that includes both the exocentric user view and the egocentric UAV view. We then exploit the joint availability of photorealistic and synthesized inputs to train a single-shot monocular pose estimation model. During training we leverage differentiable rendering to supplement a state-of-the-art direct regression objective with a novel smooth silhouette loss. Our results demonstrate its qualitative and quantitative performance gains over traditional silhouette objectives. Our data and code are publicly available.
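The smooth silhouette objective compares a differentiably rendered silhouette of the UAV at the predicted pose against the ground-truth mask. Below is a minimal, hedged PyTorch sketch of one way such a smoothed comparison could be set up: the Gaussian softening, the L1 comparison, and all function names are illustrative assumptions rather than the paper's exact formulation, and the differentiable rendering step itself is only stubbed with random tensors.

```python
# Illustrative sketch of a "smooth" silhouette loss: both the rendered and the
# ground-truth silhouettes are softened with a Gaussian kernel before being
# compared, so gradients appear in a band around the contour rather than only
# on exactly overlapping pixels. Kernel size, sigma, and the L1 comparison are
# assumptions for demonstration purposes.
import torch
import torch.nn.functional as F


def gaussian_kernel(size: int = 11, sigma: float = 3.0) -> torch.Tensor:
    """Build a normalized 2D Gaussian kernel of shape (1, 1, size, size)."""
    coords = torch.arange(size, dtype=torch.float32) - (size - 1) / 2.0
    g = torch.exp(-(coords ** 2) / (2.0 * sigma ** 2))
    kernel_2d = g[:, None] * g[None, :]
    kernel_2d = kernel_2d / kernel_2d.sum()
    return kernel_2d.view(1, 1, size, size)


def smooth_silhouette_loss(rendered: torch.Tensor,
                           target: torch.Tensor,
                           kernel: torch.Tensor) -> torch.Tensor:
    """L1 distance between Gaussian-smoothed silhouettes.

    rendered: (B, 1, H, W) soft silhouette from a differentiable renderer.
    target:   (B, 1, H, W) binary ground-truth mask.
    """
    pad = kernel.shape[-1] // 2
    smooth_rendered = F.conv2d(rendered, kernel, padding=pad)
    smooth_target = F.conv2d(target, kernel, padding=pad)
    return F.l1_loss(smooth_rendered, smooth_target)


# Toy usage: in the actual pipeline `rendered` would come from rendering the
# UAV model at the predicted 6DoF pose with a differentiable renderer.
rendered = torch.rand(2, 1, 128, 128, requires_grad=True)
target = (torch.rand(2, 1, 128, 128) > 0.5).float()
loss = smooth_silhouette_loss(rendered, target, gaussian_kernel())
loss.backward()
```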
Publication
In 2020 European Conference on Computer Vision Workshops (ECCVW)