Detail Preserving Shape Deformation in Image Editing


1

2 Detail Preserving Shape Deformation in Image Editing
Hui Fang, Google, Inc.
John C. Hart, University of Illinois at Urbana-Champaign

3 Original Morphed Retextured

4 Relation to Previous Work
Detail-preserving image deformation relies on texture synthesis
- Pixel based: [Heeger and Bergen 1995; Wei and Levoy 2000]
- Patch based: [Efros and Freeman 2001; Kwatra et al. 2003]
Using the original image as the texture source
- Image Analogies [Hertzmann et al. 2001]
- Texture by Numbers [Brooks et al. 2003]
We pay additional attention to the behavior of the texture synthesized around a user-specified control curve
- Feature map [Wu and Yu 2004]
- Guidance vector field [Kwatra et al. 2005]

Texture synthesis generates arbitrary amounts of texture from a sample texture swatch, and patch-based synthesis has been particularly successful in recent years. Texture synthesis is also used in image editing: the input image is used as the texture source, as in Image Analogies and Texture by Numbers, or to guide the synthesis, as in Textureshop. Additional user control also helps, for example through a feature map or a guidance vector field. We pay additional attention to the behavior of the texture synthesized around the control curve.

5 Relation to Previous Work
vs. Image Completion [Sun et al. 2005]
- extended isotropic image textures via linear motion, whereas we deform each texture patch non-linearly to fit a deformed silhouette
vs. Object-Based Image Editing [Barrett and Cheney 2002]
- populated masked regions with cut-and-paste patches, whereas our textures are feature aligned at pixel scale

Image Completion extended an image's textures by linear motion of isotropic textures; our approach deforms each texture patch non-linearly to fit a new silhouette of arbitrary shape. Object-Based Image Editing populates masked regions with cut-and-paste patches, whereas our textures are feature aligned at pixel scale.

6 Feature Aligned Retexturing
- Deformation of feature curves
- Curvilinear coordinates generated from the feature curves
- Textured patch generation
- Image synthesis

Our approach can be outlined in four steps. The user first manually selects and deforms several feature curves. Curvilinear coordinates are then generated from the feature curves to capture the desired deformation. Texture patches are sampled from the input image, with an additional deformation according to the curvilinear coordinates. Finally, the new image is synthesized by trying different texture patches and picking the one with the least difference in its overlap with already-synthesized patches, much like the approach in Graphcut Textures.

7 Deformation of Feature Curves
Each pair of corresponding pixels on the control curves defines a deformation vector
The deformation vectors are interpolated smoothly across the whole image by solving the Laplace equation ∇²D(x,y) = 0

In the first step, the user manually selects several feature curves. Pixel-accurate feature detection is not necessary here: even if an image feature does not exactly match the selected curve, it can still be matched well after deformation, thanks to the neighborhood search in the image synthesis step.
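As a rough illustration of this interpolation (a minimal sketch, not the authors' code), the deformation vectors on the feature curves can be treated as fixed boundary values and the discrete Laplace equation relaxed everywhere else; the grid size, Jacobi iteration count, and array names below are assumptions. The same relaxation can be reused for the tangent field T on the following slides.

```python
import numpy as np

def interpolate_field(constraints, mask, iterations=2000):
    """Smoothly interpolate a sparse 2D vector field by relaxing the
    discrete Laplace equation with Jacobi iterations (constrained pixels stay fixed)."""
    field = np.where(mask[..., None], constraints, 0.0).astype(np.float64)
    for _ in range(iterations):
        # The Laplace condition is satisfied when each free pixel equals the
        # average of its four neighbours (np.roll wraps at the border, a
        # simplification of proper boundary handling).
        avg = 0.25 * (np.roll(field, 1, 0) + np.roll(field, -1, 0) +
                      np.roll(field, 1, 1) + np.roll(field, -1, 1))
        field = np.where(mask[..., None], field, avg)
    return field

# Hypothetical usage: deformation vectors fixed at two feature-curve pixels.
H, W = 64, 64
constraints = np.zeros((H, W, 2))
mask = np.zeros((H, W), dtype=bool)
constraints[20, 20] = (5.0, -3.0); mask[20, 20] = True
constraints[40, 40] = (-2.0, 4.0); mask[40, 40] = True
D = interpolate_field(constraints, mask)
```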

8 Curvilinear Coordinates
We create matching sets of curvilinear coordinates around source & target feature curves

9 Curvilinear Coordinates
We create matching sets of curvilinear coordinates around source & target feature curves
We first define tangents T along the source curve

10 Curvilinear Coordinates
We create matching sets of curvilinear coordinates around source & target feature curves
We first define tangents T along the source curve
And then solve the Laplace equation ∇²T(x,y) = 0 to interpolate it across the source image

11 Curvilinear Coordinates
We create matching sets of curvilinear coordinates around source & target feature curves
We first define tangents T along the source curve
And then solve the Laplace equation ∇²T(x,y) = 0 to interpolate it across the source image
Do the same for the target

12 Curvilinear Coordinates
We apply Euler integration to the source & target tangent fields
- Extend the source & target spine curves parallel to the tangent field: q_{j+1,0} = q_{j,0} + ε T(q_{j,0})
- Extend the source & target rib curves perpendicular to the tangent field: q_{j,k+1} = q_{j,k} + ε T⊥(q_{j,k}), where T⊥ is the tangent rotated 90°
- Smooth the result by iterating q_{j,k} ← q_{j,k} − 0.7 ∇²q_{j,k}

We use Euler integration to extend a "spine" curve of positions q_{j,0} parallel to the feature curve, and perpendicular "rib" curves of positions q_{j,k}.
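For concreteness, here is a small sketch of growing such a curvilinear grid by Euler integration. It is an illustration under assumptions, not the authors' implementation: the tangent field is sampled with nearest-neighbor lookup, the rib direction is the tangent rotated 90°, and the step size, grid extents, and smoothing weight (interpreted as moving 70% toward the neighbor average) are made up for the example.

```python
import numpy as np

def sample(T, q):
    """Nearest-neighbour lookup of the tangent field T (H, W, 2) at position q = (x, y)."""
    x = int(np.clip(round(q[0]), 0, T.shape[1] - 1))
    y = int(np.clip(round(q[1]), 0, T.shape[0] - 1))
    return T[y, x]

def build_curvilinear_grid(T, spine_start, n_spine=40, n_rib=10, eps=1.0, smooth_iters=20):
    """Euler-integrate a spine along T and ribs perpendicular to T, then smooth the grid."""
    q = np.zeros((n_spine, 2 * n_rib + 1, 2))
    q[0, n_rib] = spine_start
    # Spine: q_{j+1,0} = q_{j,0} + eps * T(q_{j,0})
    for j in range(n_spine - 1):
        q[j + 1, n_rib] = q[j, n_rib] + eps * sample(T, q[j, n_rib])
    # Ribs: step perpendicular to the tangent, growing outward on both sides of the spine.
    for j in range(n_spine):
        for k in range(n_rib):
            tx, ty = sample(T, q[j, n_rib + k])
            q[j, n_rib + k + 1] = q[j, n_rib + k] + eps * np.array([ty, -tx])
            tx, ty = sample(T, q[j, n_rib - k])
            q[j, n_rib - k - 1] = q[j, n_rib - k] - eps * np.array([ty, -tx])
    # Smoothing: move each interior grid point 70% of the way toward the average
    # of its four neighbours (one reading of q <- q - 0.7 * Laplacian(q)).
    for _ in range(smooth_iters):
        avg = 0.25 * (q[:-2, 1:-1] + q[2:, 1:-1] + q[1:-1, :-2] + q[1:-1, 2:])
        q[1:-1, 1:-1] -= 0.7 * (q[1:-1, 1:-1] - avg)
    return q

# Hypothetical usage: a constant horizontal tangent field on a 64x64 image grid.
T = np.tile(np.array([1.0, 0.0]), (64, 64, 1))
grid = build_curvilinear_grid(T, spine_start=(5.0, 32.0))
```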

13 Textured Patch Generation
Synthesize texture on the target coordinate patch using texture from the source coordinate patch
- Source colors sampled at curvilinear coordinates are bilinearly filtered from the surrounding integer pixel locations
- Target colors defined at curvilinear coordinates are splatted onto the surrounding image pixels with a unit-radius cone filter

We synthesize a texture on the destination patch using the source coordinate patch as a texture swatch. A bilinear filter computes the color at each curvilinear coordinate patch position from its surrounding (integer) pixel locations in the source image. A unit-radius cone filter centered at each destination pixel location accumulates the synthesized texture from the elements of the destination curvilinear coordinate patch.
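A rough sketch of the two resampling operations described above (illustration only; function names, bounds handling, and normalization are assumptions): bilinear lookup of the source image at non-integer curvilinear coordinates, and cone-filter splatting of the sampled colors onto integer target pixels.

```python
import numpy as np

def bilinear_sample(img, x, y):
    """Bilinearly interpolate img (H, W, C) at a non-integer position (x, y)
    assumed to lie inside the image."""
    x0, y0 = int(np.floor(x)), int(np.floor(y))
    x1, y1 = min(x0 + 1, img.shape[1] - 1), min(y0 + 1, img.shape[0] - 1)
    fx, fy = x - x0, y - y0
    top = (1 - fx) * img[y0, x0] + fx * img[y0, x1]
    bot = (1 - fx) * img[y1, x0] + fx * img[y1, x1]
    return (1 - fy) * top + fy * bot

def cone_splat(target, weights, x, y, color):
    """Accumulate a color into target (H, W, C) around (x, y) with a unit-radius
    cone filter; weights (H, W) collects the filter weights for later normalization."""
    h, w = target.shape[:2]
    for py in range(max(0, int(np.floor(y))), min(h, int(np.ceil(y)) + 1)):
        for px in range(max(0, int(np.floor(x))), min(w, int(np.ceil(x)) + 1)):
            wgt = max(0.0, 1.0 - np.hypot(px - x, py - y))  # falls linearly to 0 at radius 1
            target[py, px] += wgt * color
            weights[py, px] += wgt
```

After splatting every element of the destination coordinate patch, dividing target by weights (where nonzero) yields the synthesized patch colors.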

14 Image Synthesis
Individual feature-aligned texture patches are grown and merged using GraphCut [Kwatra et al. 2003]
Patch synthesis order is prioritized by proximity to the feature curve and to previously synthesized patches
Poisson Image Editing [Perez et al. 2003] further hides the boundaries between neighboring patches

We use GraphCut to grow and merge these small feature-aligned synthesized texture patches into an even texturing over the entire image region surrounding each destination feature curve. Order of synthesis: patches whose origin pixel is closest to the feature curve are generated first, preferring those adjacent to a previously synthesized patch. Poisson Image Editing is used to further hide the boundaries between neighboring patches.
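As an illustration of that ordering rule alone (the graph-cut merging and Poisson blending are not sketched here), below is a greedy scheduler that repeatedly picks the not-yet-synthesized patch closest to the feature curve, favoring patches adjacent to one already synthesized; the patch representation, adjacency input, and distance measure are assumptions.

```python
import math

def order_patches(origins, curve_points, adjacency):
    """Greedy synthesis order: closest to the feature curve first, preferring
    patches adjacent to an already-synthesized patch.

    origins      : list of (x, y) patch-origin pixels
    curve_points : list of (x, y) samples along the target feature curve
    adjacency    : dict mapping a patch index to the set of its neighbouring patch indices
    """
    dist = [min(math.hypot(px - cx, py - cy) for cx, cy in curve_points)
            for px, py in origins]
    remaining, done, order = set(range(len(origins))), set(), []
    while remaining:
        # Prefer patches touching the already-synthesized region, if there are any.
        frontier = {i for i in remaining if adjacency.get(i, set()) & done} or remaining
        nxt = min(frontier, key=lambda i: dist[i])
        order.append(nxt)
        done.add(nxt)
        remaining.discard(nxt)
    return order
```

The first patch chosen is simply the one nearest the curve; synthesis then grows outward from the region already covered.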

15

16

17

18 Scale Adaptive Retexturing
The deformation field can compress large source areas into small target areas
This causes texture continuity problems and blockiness
We use smaller texture patches in areas where the deformation is highly compressive
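One plausible way to measure such compression (a sketch under assumptions; the paper's exact criterion and constants are not reproduced here) is through the Jacobian of the mapping x ↦ x + D(x): where its determinant drops well below 1, the source is being squeezed and the patch size can be shrunk proportionally.

```python
import numpy as np

def adaptive_patch_size(D, base_size=32, min_size=8):
    """Per-pixel patch size derived from the local area scaling of x -> x + D(x).

    D : (H, W, 2) deformation field, D[..., 0] = x displacement, D[..., 1] = y displacement
    """
    # Jacobian of the mapping: identity plus the gradients of the deformation field.
    dDx_dy, dDx_dx = np.gradient(D[..., 0])   # axis 0 varies y, axis 1 varies x
    dDy_dy, dDy_dx = np.gradient(D[..., 1])
    det = (1.0 + dDx_dx) * (1.0 + dDy_dy) - dDx_dy * dDy_dx
    # |det| < 1 indicates local compression; shrink the patch accordingly.
    scale = np.clip(np.abs(det), min_size / base_size, 1.0)
    return (base_size * scale).astype(int)
```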

19 Video demo

20 Original Morphed Retextured

21 Original Morphed Retextured

22 Hair & Beards Morphed Retextured

23 Original Morphed Retextured

24 Morphed Retextured

25

26 Failure case
This is a failure case that used only one feature curve. When sharp image changes (such as shading changes) are not identified by feature curves, they are treated as isotropic texture and can cause unrealistic discontinuities in the result. Poisson image editing hides some of these artifacts, for example by softly blending the misaligned petals.

27 Running time

28 Questions?

