Dense Depth Estimation from Stereo Endoscopy Videos Using Unsupervised Optical Flow Methods
In the context of Minimally Invasive Surgery, estimating depth from stereo endoscopy plays a crucial role in three-dimensional (3D) reconstruction, surgical navigation, and augmented reality (AR) visualization. However, the challenges associated with this task are three-fold: 1) feature-less surface representations, often polluted by artifacts, make correspondences difficult to identify; 2) ground truth depth is difficult to obtain; and 3) endoscopic image acquisition with accurately calibrated camera parameters is rare, as the camera is often adjusted during an intervention. To address these difficulties, we propose an unsupervised depth estimation framework (END-flow) based on an unsupervised optical flow network trained on unrectified binocular videos without calibrated camera parameters. The proposed END-flow architecture is compared with traditional stereo matching, self-supervised depth estimation, unsupervised optical flow, and supervised methods on the Stereo Correspondence and Reconstruction of Endoscopic Data (SCARED) Challenge dataset. Experimental results show that our method outperforms several state-of-the-art techniques and achieves performance close to that of supervised methods.
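The abstract gives only a high-level view of the approach. As a rough illustration of the underlying principle (a sketch of generic unsupervised optical flow training, not the authors' END-flow implementation), the snippet below shows how a flow network can be supervised without ground truth: the predicted flow warps the right view onto the left view, and the photometric reconstruction error plus a smoothness prior forms the loss. The function names and the smoothness weight are illustrative assumptions.

```python
# Minimal sketch of unsupervised optical-flow supervision (assumed names,
# NOT the paper's END-flow code): warp the right image with the predicted
# flow and penalize the photometric difference to the left image.
import torch
import torch.nn.functional as F

def warp_with_flow(img: torch.Tensor, flow: torch.Tensor) -> torch.Tensor:
    """Backward-warp img (B, C, H, W) with flow (B, 2, H, W) via grid_sample."""
    _, _, h, w = img.shape
    ys, xs = torch.meshgrid(
        torch.arange(h, device=img.device, dtype=img.dtype),
        torch.arange(w, device=img.device, dtype=img.dtype),
        indexing="ij",
    )
    grid_x = xs.unsqueeze(0) + flow[:, 0]  # sampling position x + u
    grid_y = ys.unsqueeze(0) + flow[:, 1]  # sampling position y + v
    # Normalize sampling coordinates to [-1, 1], as grid_sample expects.
    grid = torch.stack(
        (2.0 * grid_x / (w - 1) - 1.0, 2.0 * grid_y / (h - 1) - 1.0), dim=-1
    )
    return F.grid_sample(img, grid, align_corners=True)

def unsupervised_flow_loss(left, right, flow, smooth_weight=0.1):
    """Photometric L1 loss between the left view and the flow-warped right
    view, plus a first-order smoothness penalty on the flow field."""
    right_warped = warp_with_flow(right, flow)
    photometric = (left - right_warped).abs().mean()
    smoothness = (
        (flow[:, :, :, 1:] - flow[:, :, :, :-1]).abs().mean()
        + (flow[:, :, 1:, :] - flow[:, :, :-1, :]).abs().mean()
    )
    return photometric + smooth_weight * smoothness
```

Because the stereo pairs are unrectified, the flow keeps both horizontal and vertical components; with rectified, calibrated cameras the vertical component would vanish and the horizontal flow reduces to disparity d, from which metric depth follows as fB/d for focal length f and baseline B.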
| Published in | Lecture Notes in Computer Science Vol. 12722; pp. 337-349 |
|---|---|
| Main Authors | , , , |
| Format | Book Chapter; Journal Article |
| Language | English |
| Published | Switzerland : Springer International Publishing, 01.01.2021 |
| Series | Lecture Notes in Computer Science |
| Subjects | |
| ISBN | 3030804313 9783030804312 |
| ISSN | 0302-9743 1611-3349 |
| DOI | 10.1007/978-3-030-80432-9_26 |