With the growing use of head-mounted displays for virtual reality (VR), generating 3D content for these devices has become an important topic in computer vision. For capturing full 360-degree panoramas in a single shot, Spherical Panoramic Cameras (SPCs) are gaining in popularity. However, estimating depth from an SPC remains a challenging problem. In this paper, we propose a practical method that generates an all-around dense depth map from a narrow-baseline video clip captured by an SPC. While existing methods for depth from small motion rely on perspective cameras, we introduce a new bundle adjustment approach tailored for SPCs that minimizes the re-projection error directly on the unit sphere. This enables the estimation of approximately metric camera poses and 3D points. Additionally, we present a novel dense matching method called the sphere sweeping algorithm, which allows us to take advantage of the overlapping regions between the cameras. To validate the effectiveness of the proposed method, we evaluate our approach on both synthetic and real-world data. As an example application, we also present stereoscopic panorama images generated from our depth results.
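The bundle adjustment described above minimizes re-projection error on the unit sphere rather than on a planar image. A minimal sketch of such a spherical residual is shown below; the function and parameter names are illustrative assumptions, not the paper's actual implementation, and the paper's exact parameterization and robustifier may differ.

```python
import numpy as np

def sphere_reprojection_error(R, t, X, bearing):
    """Angular residual between a predicted and an observed ray direction.

    R       : 3x3 camera rotation (world -> camera)
    X       : 3D point in world coordinates
    t       : camera translation
    bearing : observed unit direction vector on the sphere
    (Hypothetical names; shown only to illustrate a unit-sphere residual.)
    """
    p = R @ X + t                      # point expressed in the camera frame
    p_unit = p / np.linalg.norm(p)     # project onto the unit sphere
    # Angle between predicted and observed directions is the residual
    cos_angle = np.clip(p_unit @ bearing, -1.0, 1.0)
    return np.arccos(cos_angle)
```

Such residuals would typically be stacked over all observations and minimized with a nonlinear least-squares solver over the camera poses and 3D points.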
|Authors||Sunghoon Im, Hyowon Ha, Francois Rameau, Hae-Gon Jeon, Gyeongmin Choe, In So Kweon|
|Venue||European Conference on Computer Vision (ECCV)|
|Notes||This work was supported by the National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIP) (No. 2010-0028680). Sunghoon Im and Hae-Gon Jeon were partially supported by the Global Ph.D. Fellowship Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education (NRF-2016907531, NRF-2015034617).|