L1 sparse reconstruction of sharp point set surfaces


1 L1 sparse reconstruction of sharp point set surfaces
HAIM AVRON, ANDREI SHARF, CHEN GREIF and DANIEL COHEN-OR

2 Index
- 3D surface reconstruction
- Reconstruction model: moving least squares
- Moving away from least squares [l1 sparse reconstruction]
- Reconstruction model: re-weighted l1
- Results and discussions

3 3D surface reconstruction

4 Moving least squares
Input: a dense set of sample points that lie near a closed surface F, with approximate surface normals. [In practice the normals are obtained by a local least-squares fit of a plane to the sample points in a neighborhood.]
Output: a surface passing near the sample points.
How does one do that? Fit a linear point function that represents the local shape of the surface near each point s, then combine these functions by a weighted average to produce a 3D function I; the surface is the zero level set of I.
How good is it? Measured by how close the function I is to the signed distance function.
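To make this concrete, here is a minimal Python sketch of a simple implicit MLS-style function; the Gaussian weight radius h and the specific IMLS form (a weighted average of tangent-plane distances) are illustrative assumptions, not the exact scheme on the slide.

```python
import numpy as np

def imls_value(x, points, normals, h=0.1):
    """Simple implicit MLS-style value at a query point x.

    points : (N, 3) samples near the surface
    normals: (N, 3) unit normals at the samples
    h      : Gaussian weight radius (assumed parameter)

    I(x) is a weighted average of signed distances to the samples'
    tangent planes; its zero level set approximates the surface.
    """
    d = x - points                                 # offsets to all samples
    w = np.exp(-np.sum(d * d, axis=1) / h**2)      # Gaussian weights
    plane_dist = np.sum(d * normals, axis=1)       # signed distance to each tangent plane
    return np.dot(w, plane_dist) / np.sum(w)
```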

5 2D -> 1D

6 Total variation
The l1-sparsity paradigm has been applied successfully to image denoising and de-blurring using Total Variation (TV) methods [Rudin 1987; Rudin et al. 1992; Chan et al. 2001; Levin et al. 2007]. Total variation exploits the sparsity of gradients in an image: TV(u) = Σ_ij |D_ij u|, where D_ij is the discrete gradient operator and u is the vector of scalar pixel values. The corresponding notion of a gradient on a mesh is the normal of each simplex (triangle).
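As a small illustration, a minimal sketch of the anisotropic discrete TV of a 2-D image; this particular forward-difference discretization is an assumption, one of several common choices.

```python
import numpy as np

def total_variation(u):
    """Anisotropic discrete TV of a 2-D image: sum of |forward differences|."""
    dx = np.abs(np.diff(u, axis=1))   # horizontal gradients
    dy = np.abs(np.diff(u, axis=0))   # vertical gradients
    return dx.sum() + dy.sum()
```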

7 Reconstruction model
Error term: smooth surfaces have smoothly varying normals, so the penalty function (error) is defined on the normals. Total curvature is a quadratic penalty and smooths away sharp features; instead, use the pairwise normal difference ||n_i - n_j|| (l2 norm), where p_i and p_j are adjacent points. Summing over adjacent pairs gives the pairwise penalty.
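A minimal sketch of this pairwise error term, assuming the adjacency is given as an (M, 2) array of point-index pairs (a hypothetical representation; the paper's neighborhood construction may differ):

```python
import numpy as np

def pairwise_normal_error(normals, edges):
    """Sum of l2 normal differences over adjacent point pairs (i, j).

    normals: (N, 3) unit normals
    edges  : (M, 2) integer index pairs of adjacent points
    """
    diffs = normals[edges[:, 0]] - normals[edges[:, 1]]
    return np.linalg.norm(diffs, axis=1).sum()
```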

8 Reweighted l1
Consists of solving a sequence of weighted l1 minimization problems, where the weights used for the next iteration are computed from the value of the current solution. Each iteration solves a convex optimization problem, but the overall algorithm is not convex. [Enhancing Sparsity by Reweighted l1 Minimization, Candès et al. 2008]
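A minimal sketch of the reweighting loop for the equality-constrained problem min Σ_i w_i |x_i| s.t. Ax = b, with each weighted l1 step posed as a linear program; the solver choice, eps, and iteration count are assumptions, not the paper's setup.

```python
import numpy as np
from scipy.optimize import linprog

def reweighted_l1(A, b, iters=5, eps=0.1):
    """Candes-style reweighted l1:  min sum_i w_i |x_i|  s.t.  A x = b.

    Each pass solves one weighted l1 problem as a linear program in
    the stacked variables [x, t], with -t <= x <= t elementwise.
    """
    m, n = A.shape
    w = np.ones(n)                                    # first pass is plain l1
    x = np.zeros(n)
    for _ in range(iters):
        c = np.concatenate([np.zeros(n), w])          # minimize w^T t
        A_ub = np.block([[ np.eye(n), -np.eye(n)],    #  x - t <= 0
                         [-np.eye(n), -np.eye(n)]])   # -x - t <= 0
        b_ub = np.zeros(2 * n)
        A_eq = np.hstack([A, np.zeros((m, n))])       # enforce A x = b
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b,
                      bounds=[(None, None)] * n + [(0, None)] * n)
        x = res.x[:n]
        w = 1.0 / (np.abs(x) + eps)   # small entries get large weights next pass
    return x
```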

9 Reweighted l1
What is the key difference between l1 and l0? Dependence on magnitude: l0 only counts the non-zero entries, while l1 also penalizes their size. Choosing weights w_i ≈ 1/|x_i| cancels that magnitude dependence, so the weighted l1 penalty behaves much more like l0.
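A tiny numeric illustration of that magnitude dependence (the values are chosen arbitrarily):

```python
import numpy as np

x = np.array([5.0, 0.0, 0.001])
print(np.count_nonzero(x))      # l0: 2 -- blind to magnitude
print(np.abs(x).sum())          # l1: 5.001 -- dominated by the large entry
w = 1.0 / (np.abs(x) + 1e-3)    # reweighting, w_i ~ 1/|x_i|
print((w * np.abs(x)).sum())    # ~1.5 -- non-zeros count roughly by presence, like l0
```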

10 Reweighted l1

11 Geometric view of the error
- Minimize the L2 norm [sum of squared errors]
- Minimize the L1 norm [sum of absolute differences]
- Minimize the L0 norm [number of non-zero terms]

12 2 steps: first reconstruct the orientations (the normals), then the point positions.

13 Orientation reconstruction
Orientation minimization consists of two terms:
- a global l1 minimization of orientation (normal) distances;
- constraining the solution to be close to the initial orientation.
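Schematically, the two terms can be written as follows; the trade-off weight λ, the weights w_ij, and the exact form of the fidelity term are assumed notation, not necessarily the paper's formulation:

\[
\min_{n}\ \sum_{(i,j)\,\text{adjacent}} w_{ij}\,\lVert n_i - n_j \rVert_2
\;+\; \lambda \sum_i \lVert n_i - n_i^{0} \rVert_2
\]

where the n_i^0 are the initial normals.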

14 Orientation reconstruction ctd
Orientation minimization consists of two terms; the first is a global l1 minimization of orientation (normal) distances. For a piece-wise smooth surface the set of pairwise normal differences {||n_i - n_j||} is sparse. Why? On smooth regions adjacent normals are nearly identical, so non-zero differences occur only across sharp features, which involve only a small fraction of the adjacent pairs. This motivates a globally weighted l1 penalty function.
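Presumably the weights follow the reweighting scheme of the Candès et al. reference from slide 8, applied to the normal differences; the small stabilizer ε below is an assumed detail:

\[
w_{ij}^{(k+1)} = \frac{1}{\lVert n_i^{(k)} - n_j^{(k)} \rVert + \epsilon}
\]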

15 Orientation reconstruction ctd
For a piece-wise smooth surface the set of pairwise normal differences is sparse, and the globally weighted l1 penalty function exploits exactly this sparsity.

16 Orientation reconstruction
Orientation minimization consists of two terms:
- a global l1 minimization of orientation (normal) distances;
- constraining the solution to be close to the initial orientation.

17 Key idea !!!

18 Geometric view of the error
- Minimize the L2 norm [sum of squared errors]
- Minimize the L1 norm [sum of absolute differences]
- Minimize the L0 norm [number of non-zero terms]

23 Results and discussions
Advantages: a global framework; until now, sharpness had been treated as a purely local concept.
Criticisms: slow; in practice the convex optimization, although off-the-shelf solvers are readily available, is a slow process.

24 Discussions
A lot of room for improvement. Can I express this in a different form? Especially something like the low-rank plus sparse-error form we had before.

