Abstract
Sensors that capture 3D scene information provide useful data for tasks in vehicle navigation, gesture recognition, human pose estimation, and geometric reconstruction. Active-illumination time-of-flight sensors in particular have become widely used to estimate a 3D representation of a scene. However, the maximum range, density of acquired spatial samples, and overall acquisition time of these sensors are fundamentally limited by the minimum signal required to estimate depth reliably. In this paper, we propose a data-driven method for photon-efficient 3D imaging which leverages sensor fusion and computational reconstruction to rapidly and robustly estimate a dense depth map from low photon counts. Our sensor fusion approach uses measurements of single-photon arrival times from a low-resolution single-photon detector array and an intensity image from a conventional high-resolution camera. Using a multi-scale deep convolutional network, we jointly process the raw measurements from both sensors and output a high-resolution depth map. To demonstrate the efficacy of our approach, we implement a hardware prototype and show results using captured data. At low signal-to-background levels, our depth reconstruction algorithm with sensor fusion outperforms other methods for depth estimation from noisy measurements of photon arrival times.
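As a rough illustration of the fusion idea described above, the sketch below combines a low-resolution SPAD photon-arrival histogram with a high-resolution intensity image in a small convolutional network that regresses a high-resolution depth map. This is not the network from the paper: the architecture, layer widths, tensor shapes, and the use of PyTorch are assumptions made only for the example.

```python
# Minimal sketch (not the authors' architecture): fuse a low-resolution SPAD
# photon-arrival histogram with a high-resolution intensity image to predict
# a high-resolution depth map. Shapes and layer sizes are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F

class FusionDepthNet(nn.Module):
    def __init__(self, n_bins=64, feat=32):
        super().__init__()
        # Encode the time-binned SPAD histogram (time bins treated as channels).
        self.spad_enc = nn.Sequential(
            nn.Conv2d(n_bins, feat, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(feat, feat, 3, padding=1), nn.ReLU(inplace=True),
        )
        # Encode the high-resolution intensity image.
        self.img_enc = nn.Sequential(
            nn.Conv2d(1, feat, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(feat, feat, 3, padding=1), nn.ReLU(inplace=True),
        )
        # Fuse upsampled SPAD features with intensity features; regress depth.
        self.fuse = nn.Sequential(
            nn.Conv2d(2 * feat, feat, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(feat, 1, 3, padding=1),
        )

    def forward(self, spad_hist, intensity):
        # spad_hist:  (B, n_bins, h, w) low-resolution photon-count histogram
        # intensity:  (B, 1, H, W) high-resolution camera image, with H > h, W > w
        f_spad = self.spad_enc(spad_hist)
        f_spad = F.interpolate(f_spad, size=intensity.shape[-2:],
                               mode="bilinear", align_corners=False)
        f_img = self.img_enc(intensity)
        return self.fuse(torch.cat([f_spad, f_img], dim=1))  # (B, 1, H, W) depth

if __name__ == "__main__":
    net = FusionDepthNet()
    spad = torch.randn(1, 64, 32, 32)   # simulated low-resolution histogram
    img = torch.randn(1, 1, 256, 256)   # simulated high-resolution intensity
    print(net(spad, img).shape)         # torch.Size([1, 1, 256, 256])
```

The paper's multi-scale network is substantially deeper; the point of the sketch is only the input/output structure of the fusion problem: per-pixel time-binned photon counts at low spatial resolution, guidance from a conventional image at high resolution, and a dense depth map at the guidance resolution.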