Authors: Thomas Maugey, Xin Su, Christine Guillemot
Title: Reference camera selection for virtual view synthesis, submitted to IEEE Signal Processing Letters
Abstract: View synthesis using image-based rendering algorithms relies on one or more reference images. These have to be as close as possible to the virtual view that is generated. The notion of "closeness" is straightforward when the virtual view is parallel to the reference ones. Indeed, the geometric transformation between the cameras is then a simple translation, whose amplitude can naturally be measured with a norm. However, we show in this paper that when the camera trajectory becomes general (i.e., both translation and rotation are involved), no intuitive distance metric exists. In that case, choosing the best reference camera for view synthesis becomes a difficult problem. Some similarity metrics have been proposed in the literature, but they depend on the scene content and are thus complex to compute. In this paper, we propose a distance metric that relies only on the camera parameters and is therefore very simple to compute. We then use this distance to formulate and solve a reference camera selection problem in a general camera configuration. The obtained results show that our distance leads to an efficient and accurate choice of the reference views compared to a "naive" Euclidean distance between camera parameters.
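
As context for the baseline the abstract refers to, the following is a minimal sketch, assuming cameras are summarized by their 3D optical centers; the function name and representation are illustrative, not taken from the paper. It computes the "naive" Euclidean distance between camera parameters and shows why it can be misleading once rotations are involved; the proposed metric itself is not reproduced here.

import numpy as np

def naive_camera_distance(center_a, center_b):
    # Euclidean distance between two camera optical centers.
    # Orientation is ignored entirely, which is why this baseline can
    # mislead the reference-view choice when cameras also rotate.
    return np.linalg.norm(np.asarray(center_a, dtype=float) - np.asarray(center_b, dtype=float))

# Two cameras sharing the same center but looking in opposite directions
# would get distance 0 under this baseline, despite observing little common content.
print(naive_camera_distance([0.0, 0.0, 0.0], [1.0, 0.0, 0.0]))  # 1.0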