
Slide 1. Depth Enhancement Technique by Sensor Fusion: Joint Bilateral Filter Approaches. Speaker: Min-Koo Kang, November 14, 2012.

Slide 2. (2012-11-14 / Computer Vision Laboratory Seminar: Depth enhancement technique by sensor fusion)
Outline
1. Introduction
- Why depth data is important
- How to acquire depth data
- Depth upsampling: state-of-the-art approaches
2. Background
- Interpolation filters: Nearest Neighbor / Bilinear / Bicubic / Bilateral
3. Bilateral filter-based depth upsampling
- Joint Bilateral Upsampling (JBU) filter / SIGGRAPH 2007
- Pixel Weighted Average Strategy (PWAS) / ICIP 2010
- Unified Multi-Lateral (UML) filter / AVSS 2011
- Generalized depth enhancement framework / ECCV 2012
4. Concluding remarks

Slide 3. Introduction
- Why depth data is important
- How to acquire depth data
- State-of-the-art approaches

Slide 4. Why is depth data important?
- Depth is used in various fields and is one of the most important data sources in computer vision.
- Important factors: speed, accuracy, resolution.
- Example applications: 3D reconstruction, virtual view generation in 3DTV, human-computer interaction.

Slide 5. How to acquire depth data?
- Depth acquisition methods compared: laser scanning, stereo vision sensor, range sensor.
- The range sensor offers the most suitable overall performance except for its low resolution.
- The low resolution can be overcome by depth-map upsampling.

Slide 6. Problem definition
- Disparity estimation by a range sensor delivers only a low-resolution depth map, while rendering requires a full-resolution depth map.
- Main objectives / requirements:
  - Cost-effective (potential for real time on consumer-electronics platforms)
  - Align depth-map edges with image edges
  - Remove inaccuracies (caused by heuristics in disparity estimation)
  - Temporal stability (especially at edges and areas with detail)
(Figure: upsampling and refinement stages)

Slide 7. Depth upsampling
- Definition: conversion of a low-resolution depth map into a high-resolution one.
- Approach: most state-of-the-art methods are based on sensor fusion, i.e., they use an image sensor and a range sensor together.
(Figures: depth-map upsampling by bicubic interpolation vs. by fusing image and range sensors)

Slide 8. Background
- Interpolation filters: Nearest Neighbor / Bilinear / Bicubic / Bilateral

Slide 9. Single image-based interpolation
- The conventional interpolation filters.
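As an illustration of the conventional single-image filters listed above, a minimal bilinear upsampler can be sketched as follows (the function name and the half-pixel grid mapping are illustrative choices, not from the slides):

```python
import numpy as np

def bilinear_upsample(img, factor):
    """Upsample a 2-D array by an integer factor with bilinear interpolation."""
    h, w = img.shape
    out_h, out_w = h * factor, w * factor
    # map each output pixel center back to input coordinates
    ys = np.clip((np.arange(out_h) + 0.5) / factor - 0.5, 0, h - 1)
    xs = np.clip((np.arange(out_w) + 0.5) / factor - 0.5, 0, w - 1)
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, h - 1)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, w - 1)
    wy = (ys - y0)[:, None]   # vertical interpolation weights
    wx = (xs - x0)[None, :]   # horizontal interpolation weights
    img = img.astype(float)
    top = img[np.ix_(y0, x0)] * (1 - wx) + img[np.ix_(y0, x1)] * wx
    bot = img[np.ix_(y1, x0)] * (1 - wx) + img[np.ix_(y1, x1)] * wx
    return top * (1 - wy) + bot * wy
```

Nearest-neighbor would simply round the mapped coordinates instead of blending, which is what produces the jagged edges shown on the next slide.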

Slide 10. Upsampling examples
- The main types of artifacts are most easily seen at sharp edges, and include aliasing (jagged edges), blurring, and edge halos (see the illustration).
(Figure: input vs. nearest-neighbor, bilinear, and bicubic results, at 0%, 16.7%, and 25% sharpening)

Slide 11. Single image-based interpolation
- Bilateral filtering: smoothing an image without blurring its edges.
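The bilateral idea can be sketched with a brute-force reference implementation: each output pixel is a weighted average where the weight combines a spatial Gaussian with a range Gaussian on intensity difference. This is a slow illustrative version, not the slides' code, and the default σ values are arbitrary:

```python
import numpy as np

def bilateral_filter(img, sigma_s=2.0, sigma_r=0.1, radius=3):
    """Edge-preserving smoothing: spatial * range Gaussian weights."""
    h, w = img.shape
    out = np.zeros((h, w))
    # precompute the spatial Gaussian over the window
    ax = np.arange(-radius, radius + 1)
    dy, dx = np.meshgrid(ax, ax, indexing="ij")
    spatial = np.exp(-(dy**2 + dx**2) / (2 * sigma_s**2))
    pad = np.pad(img.astype(float), radius, mode="edge")
    for i in range(h):
        for j in range(w):
            win = pad[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            # range kernel: pixels with similar values get high weight
            rng = np.exp(-(win - img[i, j])**2 / (2 * sigma_r**2))
            wgt = spatial * rng
            out[i, j] = (wgt * win).sum() / wgt.sum()
    return out
```

With a small σ_r, pixels across an intensity edge receive negligible weight, which is exactly why the edge survives the smoothing.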

Slide 12. Bilateral filtering applications
(Figure: input, Gaussian smoothing, bilateral smoothing. A noisy image is naively denoised by a Gaussian filter and better denoised by a bilateral filter.)

Slide 13. Bilateral filter-based depth upsampling
- Joint Bilateral Upsampling (JBU) filter / SIGGRAPH 2007
- Pixel Weighted Average Strategy (PWAS) / ICIP 2010
- Unified Multi-Lateral (UML) filter / AVSS 2011
- Generalized depth enhancement framework / ECCV 2012

Slide 14. Joint bilateral filtering
- Multi-modal filtering: the range term is defined in one modality, while the filtering is performed on another modality.
- Propagates properties from one modality to another.
- Edge-preserving properties.

Slide 15. Joint bilateral upsampling (JBU)
- First publication on bilateral filters for upsampling, at SIGGRAPH 2007; J. Kopf (Univ. of Konstanz, Germany) provided reference software.
- [Kopf2007] solution: high-resolution image in the range term; low-resolution input -> high-resolution output.
Kopf et al., "Joint Bilateral Upsampling", SIGGRAPH 2007

Slide 16. Joint bilateral upsampling (JBU)
- Representative formulation: the upsampled depth at p is a normalized weighted average, D~(p) = (1/k_p) * sum over q in N(p) of D(q) * f_S(||p - q||) * f_I(||I(p) - I(q)||).
- N(p): the neighborhood of the target pixel p = (i, j). f_S(.): spatial weighting term, applied to the pixel position. f_I(.): range weighting term, applied to the guidance-image value I(q). f_S(.) and f_I(.) are Gaussian functions with standard deviations σ_S and σ_I, respectively.
Kopf et al., "Joint Bilateral Upsampling", SIGGRAPH 2007
(Figures: upsampled depth map; rendered 3D view)
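Following that formulation, a JBU sketch in plain Python might look like this. The spatial kernel runs over the low-resolution depth neighborhood while the range kernel compares high-resolution guidance values; the round-to-nearest mapping between the two grids is a simplifying assumption of this sketch:

```python
import numpy as np

def joint_bilateral_upsample(depth_lo, image_hi, factor,
                             sigma_s=1.0, sigma_i=0.1, radius=2):
    """JBU sketch: spatial weights in low-res depth coordinates,
    range weights from the high-res guidance image."""
    H, W = image_hi.shape
    h_lo, w_lo = depth_lo.shape
    out = np.zeros((H, W))
    for i in range(H):
        for j in range(W):
            pi, pj = i / factor, j / factor        # p in low-res coords
            ci, cj = int(round(pi)), int(round(pj))
            num = den = 0.0
            for qi in range(max(ci - radius, 0), min(ci + radius + 1, h_lo)):
                for qj in range(max(cj - radius, 0), min(cj + radius + 1, w_lo)):
                    fs = np.exp(-((pi - qi)**2 + (pj - qj)**2) / (2 * sigma_s**2))
                    # guidance value at q's corresponding high-res position
                    Iq = image_hi[min(qi * factor, H - 1), min(qj * factor, W - 1)]
                    fi = np.exp(-(image_hi[i, j] - Iq)**2 / (2 * sigma_i**2))
                    w = fs * fi
                    num += w * depth_lo[qi, qj]
                    den += w
            out[i, j] = num / den
    return out
```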

Slide 17. Is JBU good enough?
- Limitation of JBU: it starts from a heuristic assumption about the relationship between depth and intensity data, but sometimes a depth edge has no corresponding edge in the 2-D image.
- Remaining problems: erroneous copying of 2-D texture into actually smooth geometry in the depth map, and an unwanted artifact known as edge blurring.
(Figures: high-resolution guidance image (red = non-visible depth discontinuities); low-resolution depth map (red = zoomed area); JBU-enhanced depth map (zoomed))

Slide 18. Pixel Weighted Average Strategy (PWAS)
- "Pixel Weighted Average Strategy for Depth Sensor Data Fusion", proposed by F. Garcia at ICIP 2010.
- [Garcia2010] solution: use a *credibility map to cope with texture copy and edge blurring; the credibility map indicates unreliable regions in the depth map.
- Representative formulation: the JBU weights are extended with a credibility term, where D is the given depth map, Q the credibility map, and I the guiding intensity image.
Garcia et al., "Pixel Weighted Average Strategy for Depth Sensor Data Fusion", ICIP 2010
*credibility: trustworthiness, reliability
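A common way to build such a credibility map, matching the slide's description that it flags unreliable regions around depth discontinuities, is to penalize strong depth gradients. The Gaussian form and the σ_Q value below are assumptions for illustration, not the exact formula from the paper:

```python
import numpy as np

def credibility_map(depth, sigma_q=10.0):
    """Credibility map sketch: credibility drops near strong depth
    gradients, where range-sensor measurements tend to be unreliable."""
    gy, gx = np.gradient(depth.astype(float))
    grad_mag = np.hypot(gx, gy)               # per-pixel gradient magnitude
    return np.exp(-grad_mag**2 / (2 * sigma_q**2))
```

In PWAS this map Q(q) multiplies the JBU weight of each neighbor, so depth samples near discontinuities contribute little to the average.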

Slide 19. (Figures: high-resolution image, low-resolution depth, JBU result, PWAS result)

Slide 20. Again, is PWAS good enough?
- Limitation of PWAS: the degree of smoothing depends on the gradients of the low-resolution depth map.
- Remaining problems:
  - Erroneous depth values around depth edges are not compensated well, which contradicts the spatial weighting term f_S(.).
  - The texture copy issue still remains in homogeneous regions of the depth map.
(Figures: high-resolution guidance image (red = non-visible depth discontinuities); JBU-enhanced depth map (zoomed); PWAS-enhanced depth map (zoomed))

Slide 21. Unified Multi-Lateral (UML) filter
- To reduce the texture copy issue, the same author proposed a combination of two PWAS-style filters (F. Garcia, AVSS 2011).
- [Garcia2011] solution:
  - Combine two PWAS filters, where the second filter has both its spatial and range kernels acting on D.
  - Use the credibility map Q as the blending function, i.e., β = Q.
- Representative formulation: depth pixels with high reliability are not influenced by the 2-D data, avoiding texture copying.
Garcia et al., "A New Multilateral Filter for Real-Time Depth Enhancement", AVSS 2011
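Given the two precomputed filter outputs, the UML blending step itself is a per-pixel convex combination with β = Q. A sketch, assuming J2 (image-guided PWAS output) and J3 (depth-guided output) have already been computed:

```python
import numpy as np

def uml_blend(j2, j3, q):
    """UML sketch: blend the image-guided result J2 with the depth-guided
    result J3 using the credibility map Q as the blending function.
    High credibility -> trust J3, so reliable depth is untouched by 2-D
    texture; low credibility -> fall back to the image-guided J2."""
    q = np.asarray(q, dtype=float)
    return (1.0 - q) * np.asarray(j2, float) + q * np.asarray(j3, float)
```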

Slide 22. Depth map enhancement examples
(Figures: 2-D guidance image with JBU, PWAS, and UML results, shown for two scenes)

Slide 23. Again, is UML good enough?
- Limitation of UML: the behavior of the filter strongly depends on the credibility map.
  - If the reference pixel value in the credibility map is low, the filter works like the normal PWAS filter, reducing the edge blurring artifact by weakening the smoothing effect around depth edges.
  - If the reference pixel value in the credibility map is high, a relatively high weight is allocated to J_3, and the filter works toward reducing the texture copy artifact.
- Remaining problems:
  - Is the credibility map really credible? It only considers the depth gradient, but occlusion, shadowing, and homogeneous regions are also genuinely unreliable in general depth data.
  - The edge blurring artifact still exists when a depth edge has no corresponding image edge because the object colors are similar.

Slide 24. Depth map enhancement examples
(Figures: ground truth; intensity image; downsampled (9x) depth; JBU, PWAS, and UML results)

Slide 25. Generalized depth enhancement filter by sensor fusion
- Generalizes the previous UML filter from active sensors (RGB-D) to more traditional passive stereo cameras as well (F. Garcia, ECCV 2012).
- [Garcia2012] solution:
  - Passive sensing: extension of the credibility map to general depth data; object boundaries, occlusions, and homogeneous regions are considered.
  - Active sensing: the blending function β(p) becomes adaptive to cope with the edge blurring issue, and the second term (J_3(p)) in UML is substituted by D(p).
- Representative formulation:
  - The smoothing effect is reduced in credible depth regions.
  - The computational complexity is the same as PWAS.
  - The new β(p) prevents edge blurring when image edges have similar colors.
Garcia et al., "Generalized depth enhancement filter by sensor fusion", ECCV 2012

Slide 26. Generalized depth enhancement filter by sensor fusion
- Formulation of a new credibility map Q(p), built from three components:
  - Boundary map Q_b(p): plays the role of Q(p) in J_2.
  - Occlusion map Q_o(p): obtained via a left/right consistency check.
  - Homogeneity map Q_h(p): the shape of the correlation cost at each pixel is analyzed. A homogeneous region gives a flat correlation cost; a repetitive pattern gives multiple cost minima. The first minimum, at depth d1, gives C(p, d1); the second minimum, at d2, gives C(p, d2).
Garcia et al., "Generalized depth enhancement filter by sensor fusion", ECCV 2012
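The two-minima analysis behind Q_h can be sketched as a uniqueness-style ratio test over a stereo cost volume: if the best cost C(p, d1) is clearly below the second-best C(p, d2), the match is distinctive; if the two are nearly equal (flat cost or repetitive pattern), confidence is low. The exact formula in [Garcia2012] is not reproduced here, so treat this as an assumed variant:

```python
import numpy as np

def homogeneity_map(cost_volume, eps=1e-6):
    """Q_h sketch from a stereo cost volume of shape (H, W, D):
    confidence = 1 - C(p, d1) / C(p, d2), using the two smallest
    matching costs per pixel. Flat or repetitive cost profiles give
    two nearly equal minima and hence near-zero confidence."""
    sorted_costs = np.sort(cost_volume, axis=2)
    c1 = sorted_costs[..., 0]   # best (smallest) cost C(p, d1)
    c2 = sorted_costs[..., 1]   # second-best cost C(p, d2)
    return 1.0 - (c1 + eps) / (c2 + eps)
```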

Slide 27. Generalized depth enhancement filter by sensor fusion
- Formulation of the blending function β(p):
  - Q_I is defined analogously to Q_D but considering ∇I; the function u(·) is a step function.
  - If the edge blurring condition on Q_D and Q_I is satisfied, β(p) = 1; the constants τ_I and τ_D are empirically chosen thresholds.
  - If not, β(p) = Q_D(p), and J_5(p) works like the conventional UML filter.
Garcia et al., "Generalized depth enhancement filter by sensor fusion", ECCV 2012
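A sketch of the adaptive β(p) follows. Since the exact threshold condition is not spelled out on the slide, the condition below, a depth edge (low Q_D) with no matching image edge (high Q_I), is an assumption consistent with the edge blurring scenario described on slide 23:

```python
import numpy as np

def blending_function(q_d, q_i, tau_d=0.5, tau_i=0.5):
    """Adaptive beta(p) sketch (assumed condition, not the paper's exact
    formula): where a depth edge (Q_D below tau_D) has no matching image
    edge (Q_I above tau_I), set beta = 1 so the filter falls back to the
    raw depth D(p); elsewhere beta = Q_D as in the conventional UML filter."""
    q_d = np.asarray(q_d, dtype=float)
    q_i = np.asarray(q_i, dtype=float)
    edge_blur = (q_d < tau_d) & (q_i > tau_i)
    return np.where(edge_blur, 1.0, q_d)
```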

Slide 28. Experimental results: passive sensing
Garcia et al., "Generalized depth enhancement filter by sensor fusion", ECCV 2012

Slide 29. Experimental results: passive sensing
- Evaluation metrics: RMS (Root Mean Square error), PBMP (Percentage of Bad Matching Pixels), SSIM (Structural SIMilarity).
Garcia et al., "Generalized depth enhancement filter by sensor fusion", ECCV 2012

Slide 30. Experimental results: active sensing
(Figures: image, U_I, Q_D, β)
Garcia et al., "Generalized depth enhancement filter by sensor fusion", ECCV 2012

Slide 31. Experimental results: active sensing
Garcia et al., "Generalized depth enhancement filter by sensor fusion", ECCV 2012

Slide 32. Now then, do we have an optimal solution?
- Limitations:
  - When the credibility of the initial depth is low and the image value at that position is also problematic, there is no way to improve the depth there; for example, texture copying can still occur in occluded and homogeneous regions.
  - This conflicts with the UML filter concept: under the edge blurring condition, distortion can spread around depth edges.
- Remaining problems:
  - Because the roles of Q_b, Q_o, and Q_h are not fully independent, there is a risk of over-weighting; for example, boundary and occlusion regions may overlap, and a depth value wrongly estimated in a homogeneous region may be classified as occlusion by the left/right consistency check.

Slide 33. Conclusion
- The joint bilateral upsampling approach propagates properties from one modality to another.
- The credibility map decides the system performance, and defining the blending function can be another critical factor.
- Many empirical parameters make the practical, automated use of such fusion filters challenging.
- Another open question is a clear rule for when smoothing by filtering should be avoided and when a simple binary decision should be taken instead.

