1 Synthesis of Progressively-Variant Textures on Arbitrary Surfaces
SIGGRAPH 2003. Jingdan Zhang, Kun Zhou, Luiz Velho, Baining Guo, Heung-Yeung Shum

2
• We present an approach for decorating surfaces with progressively-variant textures
  ◦ Can model local texture variations: scale, orientation, color, and shape
• For 2D texture modeling, our feature-based warping technique lets the user control the shape variations of texture elements
• Our feature-based blending technique can create a smooth transition between two given homogeneous textures
• We propose an algorithm based on texton masks
  ◦ Prevents texture elements from breaking apart as they progressively vary

3
• Most previous work on surface texture synthesis concentrated on homogeneous textures
  ◦ However, many textures, including the coat patterns of animals such as the tiger, cannot be described by stationary stochastic models
  ◦ Intuitively, their texture elements change in a progressive fashion

4
• Exemplar-based surface texture synthesis
  ◦ Gorla et al. 2001, Turk 2001, Wei and Levoy 2001, Ying et al. 2001, Dischler et al. 2002
  ◦ Only for homogeneous texture synthesis

5
• Reaction-diffusion textures
  ◦ Procedural
  ◦ Turk [1991], Witkin and Kass [1991]
  ◦ Results are highly sensitive to parameter tweaking
  ◦ Only a few kinds of materials can be synthesized

6
• Integrating Shape and Pattern in Mammalian Models
  ◦ Walter et al., SIGGRAPH 2001
  ◦ By biological simulation

7
• Garber 1981, Popat and Picard 1993, Efros and Leung 1999, Wei and Levoy 2000, Ashikhmin 2001, Hertzmann et al. 2001, Tong et al. 2002, …
• [Images: exemplar and synthesized result]

8
• We represent a progressively-variant texture by a tuple (T, M, F, V)
  ◦ Texture image T
  ◦ Texton mask M
    ▪ Marks which type of texture element each pixel p belongs to
  ◦ Transition function F
    ▪ A scalar function whose gradient determines how fast the texture T changes
  ◦ Orientation field V
    ▪ A unit vector field
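
As a reading aid, here is a minimal sketch of this tuple as a plain data structure in Python; the array shapes and field names are my own assumptions, not taken from the paper.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class ProgressivelyVariantTexture:
    """A 2D progressively-variant texture sample (T, M, F, V)."""
    T: np.ndarray  # texture image, shape (H, W, 3)
    M: np.ndarray  # texton mask, shape (H, W), a small number of integer labels
    F: np.ndarray  # transition function, shape (H, W), scalar values
    V: np.ndarray  # orientation field, shape (H, W, 2), unit vectors

    def __post_init__(self):
        # V is assumed to be a unit vector field.
        norms = np.linalg.norm(self.V, axis=-1)
        assert np.allclose(norms, 1.0, atol=1e-3), "V must contain unit vectors"
```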

9
• A progressively-variant 2D texture can be created with our field distortion or feature-based techniques
• The field distortion algorithm generates a texture by scaling and rotating the local coordinate frame at each pixel
  ◦ Using F and V
• The feature-based techniques create texton masks first, which then guide the synthesis of the target textures
  ◦ Feature-based warping and blending

10
• To synthesize a progressively-variant texture on a mesh, we start with a 2D progressively-variant texture sample (T_o, M_o, F_o, V_o)
  ◦ The user needs to specify F_s and V_s over the target mesh
• The synthesis algorithm controls the scale and orientation variation of texture elements by matching F_s and V_s with their 2D counterparts
• Our algorithm synthesizes a texton mask M_s in conjunction with the target texture T_s and uses M_s to prevent texture elements from breaking apart

11

12
• Synthesizes a progressively-variant texture T_o
• The user specifies scale and orientation vectors at a few locations
  ◦ These "key" scales and orientations are interpolated to generate the entire F_o and V_o using radial basis functions (see the sketch below)
• Extends [Wei and Levoy 2000] by incorporating scale and orientation variations controlled by F_o and V_o
  ◦ Pyramid-based sequential neighborhood-matching algorithm
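
A sketch of how sparse key scales and orientations could be interpolated over the 2D domain with radial basis functions. scipy's RBFInterpolator is used as a stand-in (the slide does not name a kernel), and interpolating orientations as renormalized vectors is my own simplification.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

def interpolate_fields(key_xy, key_scale, key_angle, height, width):
    """Interpolate user-specified key scales and orientation angles (radians)
    at key_xy (k, 2) positions over the whole (height, width) domain."""
    key_xy = np.asarray(key_xy, dtype=float)
    ys, xs = np.mgrid[0:height, 0:width]
    grid = np.stack([xs.ravel(), ys.ravel()], axis=1).astype(float)

    # Scalar transition/scale field F_o.
    F = RBFInterpolator(key_xy, np.asarray(key_scale, float))(grid)
    F = F.reshape(height, width)

    # Interpolate orientations as 2D vectors and renormalize, which avoids
    # angle wrap-around at the cost of some blending of directions.
    vecs = np.stack([np.cos(key_angle), np.sin(key_angle)], axis=1)
    V = RBFInterpolator(key_xy, vecs)(grid).reshape(height, width, 2)
    V /= np.linalg.norm(V, axis=-1, keepdims=True) + 1e-12
    return F, V
```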

13
• F_o and V_o control the target texture through the construction of the neighborhood N(p)
  ◦ N(p) is scaled using F_o(p) and rotated using V_o(p)
  ◦ Pixels in N(p) are resampled from T_o using bilinear interpolation (see the sketch below)
• The resampling does not consider pixel coverage
• The synthesis order has a large effect on the synthesis quality
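
A sketch of resampling a scaled and rotated neighborhood with bilinear interpolation; the function names and the clamping at image borders are assumptions made to keep the example self-contained.

```python
import numpy as np

def bilinear_sample(image, x, y):
    """Bilinearly sample image (H, W, C) at real-valued coordinates, clamped to the border."""
    h, w = image.shape[:2]
    x = np.clip(x, 0, w - 1.001)
    y = np.clip(y, 0, h - 1.001)
    x0, y0 = np.floor(x).astype(int), np.floor(y).astype(int)
    fx, fy = x - x0, y - y0
    top = (1 - fx)[..., None] * image[y0, x0] + fx[..., None] * image[y0, x0 + 1]
    bot = (1 - fx)[..., None] * image[y0 + 1, x0] + fx[..., None] * image[y0 + 1, x0 + 1]
    return (1 - fy)[..., None] * top + fy[..., None] * bot

def sample_neighborhood(T, center, scale, orientation, radius):
    """Resample a square neighborhood of T around center=(x, y), scaled by
    `scale` and rotated so its axes align with the unit vector `orientation`."""
    dx, dy = np.meshgrid(np.arange(-radius, radius + 1, dtype=float),
                         np.arange(-radius, radius + 1, dtype=float))
    ox, oy = orientation
    # Rotate the sampling offsets into the local frame, then scale them.
    rx = scale * (ox * dx - oy * dy)
    ry = scale * (oy * dx + ox * dy)
    return bilinear_sample(T, center[0] + rx, center[1] + ry)
```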

14

15
• To apply the feature-based techniques, the user must specify a texton mask on a given texture
• Our user interface is based on color thresholding
  ◦ The user picks one or two pixel colors
  ◦ Dilation and erosion are provided for refining texton masks
• Our experience suggests that a texton mask marking one or two types of the most prominent texture elements is sufficient
  ◦ This works well for most textures
  ◦ More sophisticated segmentation methods can be used to generate better texton masks
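
A sketch of the core operation behind this interface: threshold the distance to the picked colors, then clean the mask up with dilation and erosion. The threshold value and the use of scipy.ndimage are assumptions, not the paper's implementation.

```python
import numpy as np
from scipy import ndimage

def texton_mask_by_color(T, picked_colors, threshold=40.0, refine_iters=1):
    """Build a texton mask by thresholding the RGB distance to one or two
    user-picked colors, then refine it with dilation and erosion.

    T:             (H, W, 3) texture image
    picked_colors: list of one or two RGB colors picked by the user
    threshold:     maximum Euclidean RGB distance for a pixel to be labeled
    """
    T = T.astype(float)
    mask = np.zeros(T.shape[:2], dtype=bool)
    for color in picked_colors:
        dist = np.linalg.norm(T - np.asarray(color, dtype=float), axis=-1)
        mask |= dist < threshold
    # Dilation followed by erosion closes small holes and removes specks.
    mask = ndimage.binary_dilation(mask, iterations=refine_iters)
    mask = ndimage.binary_erosion(mask, iterations=refine_iters)
    return mask.astype(np.uint8)
```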

16
• With an input texture T_i and texton mask M_i
  ◦ Produce a new mask M_o
  ◦ Use F_o to control the parameters of the editing operations
• Our system then synthesizes a progressively-variant texture T_o from the two texton masks M_i and M_o and the known texture T_i
  ◦ An application of image analogies [Hertzmann et al. 2001]
  ◦ We refer to this step as "texton mask filtering"
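
A hedged sketch of the kind of neighborhood distance such an analogy-style search could use when choosing a source pixel: the candidate's mask neighborhood in M_i is compared against M_o, and its color neighborhood in T_i against the partially synthesized T_o. The weighting and names are assumptions, not the paper's exact formulation.

```python
import numpy as np

def analogy_distance(mask_nbr_Mi, mask_nbr_Mo, color_nbr_Ti, color_nbr_To, w_mask=0.5):
    """Distance for texton mask filtering: match the texton-mask neighborhoods
    (M_i vs. M_o) and the color neighborhoods (T_i vs. the growing T_o)."""
    d_mask = np.sum((np.asarray(mask_nbr_Mi, float) - np.asarray(mask_nbr_Mo, float)) ** 2)
    d_color = np.sum((np.asarray(color_nbr_Ti, float) - np.asarray(color_nbr_To, float)) ** 2)
    return w_mask * d_mask + (1.0 - w_mask) * d_color
```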

17

18

19
• All masks used in this paper have fewer than four colors, and usually the mask is binary
  ◦ We can easily apply morphological operations such as dilation and erosion
• We can also apply image warping techniques such as mesh warping, field warping, and warping using radial basis functions
  ◦ These require feature points and feature lines
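
A sketch of one such operator: warping a mask with an RBF displacement field derived from feature-point correspondences, using backward mapping. Only the RBF variant is shown; mesh and field warping would follow the same pattern. Function and parameter names are assumptions.

```python
import numpy as np
from scipy import ndimage
from scipy.interpolate import RBFInterpolator

def warp_mask_rbf(mask, src_pts, dst_pts):
    """Warp a texton mask so that feature points src_pts move to dst_pts."""
    src_pts = np.asarray(src_pts, dtype=float)  # (k, 2) as (x, y)
    dst_pts = np.asarray(dst_pts, dtype=float)
    h, w = mask.shape
    ys, xs = np.mgrid[0:h, 0:w]
    grid = np.stack([xs.ravel(), ys.ravel()], axis=1).astype(float)

    # Backward mapping: for every output pixel, interpolate where it should
    # be fetched from in the input (dst -> src displacement).
    displacement = RBFInterpolator(dst_pts, src_pts - dst_pts)(grid)
    src_xy = grid + displacement

    # Nearest-neighbor lookup keeps the mask's discrete labels intact.
    warped = ndimage.map_coordinates(mask, [src_xy[:, 1], src_xy[:, 0]],
                                     order=0, mode='nearest')
    return warped.reshape(h, w)
```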

20
• Takes two homogeneous textures T_0 and T_1 and generates a progressively-variant texture T_b
  ◦ We assume T_0, T_1, and T_b are all the same size and are defined on a unit square
  ◦ We also use a simple linear transition function and the texton masks M_0, M_1
    ▪ F_b(x, y) = x
• T_b can be obtained by color blending T_0' and T_1'
  ◦ T_0' and T_1' are obtained by synthesizing T_0 and T_1 using M_b
  ◦ T_0' and T_1' have their features aligned, so color blending them does not cause ghosting

21
• The key to generating T_0' and T_1' is the construction of M_b
• We want M_b(x, y) to be like M_0 when x ≈ 0 and like M_1 when x ≈ 1
  ◦ M(x, y) = x·M_1(x, y) + (1 − x)·M_0(x, y)
  ◦ Gaussian-blur M(x, y) to reduce discontinuities
  ◦ Convert M(x, y) to M_b using a user-provided threshold (see the sketch below)
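
A sketch of this construction, plus the color blend from the previous slide. The blur sigma and threshold are user parameters in the paper and placeholders here.

```python
import numpy as np
from scipy import ndimage

def build_blend_mask(M0, M1, blur_sigma=2.0, threshold=0.5):
    """Construct the blend texton mask M_b from two binary masks M0, M1
    defined on the same pixel grid (the unit square)."""
    h, w = M0.shape
    # x runs from 0 at the left edge to 1 at the right edge, matching F_b(x, y) = x.
    x = np.linspace(0.0, 1.0, w)[None, :]
    M = x * M1.astype(float) + (1.0 - x) * M0.astype(float)
    M = ndimage.gaussian_filter(M, sigma=blur_sigma)  # reduce discontinuities
    return (M > threshold).astype(np.uint8)           # back to a binary mask

def blend_colors(T0p, T1p):
    """Color-blend the feature-aligned textures T_0' and T_1' into T_b."""
    w = T0p.shape[1]
    x = np.linspace(0.0, 1.0, w)[None, :, None]
    return (1.0 - x) * T0p + x * T1p
```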

22

23
• Given (T_o, M_o, F_o, V_o), synthesize T_s over the mesh surface
  ◦ The user needs to specify V_s and F_s at some key locations
  ◦ These are interpolated over the entire surface
• The standard L2-norm is a poor perceptual measure of neighborhood similarity
  ◦ Synthesis without a texton mask may break texture elements apart

24
• Our algorithm synthesizes a texton mask M_s in conjunction with the texture T_s
  ◦ Texton masks are resistant to damage caused by deficiencies in the L2-norm

25

26
• A candidate pool C(v, ε) is constructed for each vertex v of the mesh
  ◦ A candidate pixel p from T_o must satisfy the condition |F_o(p) − F_s(v)| < ε, with ε = 0.1 (see the sketch below)
• The neighborhoods N_m(v) and N_c(v) lie in the tangent plane of the surface at v, with the same orientation as V_s(v)
  ◦ N_m(p) and N_c(p) are taken from T_o, with the same orientation as V_o(p)
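
Written out directly, the transition-function condition selects candidate pixels as below. This is a brute-force version for illustration; the k-coherence construction two slides later restricts the pool further. Names are assumptions.

```python
import numpy as np

def candidate_pixels(F_o, F_s_at_v, eps=0.1):
    """All pixels p of T_o that satisfy |F_o(p) - F_s(v)| < eps for a vertex v."""
    ys, xs = np.nonzero(np.abs(F_o - F_s_at_v) < eps)
    return list(zip(ys.tolist(), xs.tolist()))
```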

27
• We use larger neighborhoods when searching for texton mask values and smaller ones for color values
  ◦ Texton masks determine the layout of texture elements, whereas the synthesis of pixel colors is simply a step to fill in the details
• N_c(p) should really be N_c(p, s), where s = F_o(p) is the scale at p
  ◦ Let N_c(p, s_min) be the smallest neighborhood and N_c(p, s_max) the largest
  ◦ We determine the size of N_c(p, s) by linearly interpolating between the sizes of N_c(p, s_min) and N_c(p, s_max) and rounding the result up to the nearest integer (see the sketch below)
  ◦ This applies to all types of neighborhoods
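
The size interpolation in the last bullet, written out as a minimal sketch; n_min and n_max stand for the neighborhood widths used at s_min and s_max.

```python
import math

def neighborhood_size(s, s_min, s_max, n_min, n_max):
    """Width of N_c(p, s): linear interpolation between the widths used at
    s_min and s_max, rounded up to the nearest integer."""
    t = (s - s_min) / (s_max - s_min)
    return math.ceil(n_min + t * (n_max - n_min))
```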

28
• We populate C(v, ε) using the k-coherence technique
  ◦ With an additional check for the transition-function condition
• We pre-compute the k nearest neighbors for each pixel
  ◦ We use k = 20 (see the sketch below)
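
A sketch of the two pieces described here: pre-computing per-pixel nearest neighbors in a neighborhood-feature space, and assembling the candidate pool from the source pixels already assigned around v, with the extra transition-function check. The feature descriptor, the offset bookkeeping of full k-coherence, and all names are simplifications and assumptions.

```python
import numpy as np
from scipy.spatial import cKDTree

def precompute_knn(features, k=20):
    """For every source pixel, find its k most similar pixels, where
    features is an (H, W, D) array of per-pixel neighborhood descriptors."""
    h, w, d = features.shape
    flat = features.reshape(-1, d)
    _, idx = cKDTree(flat).query(flat, k=k + 1)
    idx = idx[:, 1:]  # drop each pixel's trivial self-match
    return {(i // w, i % w): [(int(j) // w, int(j) % w) for j in idx[i]]
            for i in range(h * w)}

def candidate_pool(assigned_pixels, knn, F_o, F_s_at_v, eps=0.1):
    """Candidate pool C(v, eps): source pixels already used around v plus
    their precomputed similar pixels, filtered by |F_o(p) - F_s(v)| < eps."""
    pool = set()
    for p in assigned_pixels:
        for q in [p] + knn[p]:
            if abs(F_o[q] - F_s_at_v) < eps:
                pool.add(q)
    return pool
```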

29

30
• An alternative approach to handling transition functions is to put the function values in the alpha channel
  ◦ However, this may not satisfy the condition from Equation 1
• We need an orientation field for the input texture as well

31
• Texton masks are also useful for homogeneous texture synthesis
  ◦ Previous methods would break some texture elements due to insufficient texture measurement

32

33
• Although color thresholding may not always generate a good segmentation in the traditional sense, the resulting texton masks are usually good enough
• We hand-painted a texton mask when color thresholding failed
  ◦ Our algorithm was still able to produce good results

34
• Our main contribution in this paper is a framework for progressively-variant textures on arbitrary surfaces
  ◦ Feature-based warping and blending
  ◦ The general framework we propose should be applicable to most textures
• One area of future work is to add more user control to feature-based blending
  ◦ The user may want more control over the way the texture changes
• Another topic is to explore multi-way transitions among more than two textures
• Finally, we are interested in other ways to control the local variations of textures

