Presentation transcript: "Real-Time Hatching"

1 Real-Time Hatching
Emil Praun, Hugues Hoppe, Matthew Webb, Adam Finkelstein
Princeton University / Microsoft Research
Hello. My name is Emil Praun. I will present "Real-Time Hatching", a project I worked on together with Hugues Hoppe from Microsoft, and with Matthew Webb and my advisor Adam Finkelstein from Princeton.

2 Goal
Stroke-based rendering of 3D models
Strokes convey: tone, material, shape
[Demo]
The goal of our project is the interactive rendering of 3D models in a non-photorealistic style based on strokes. We consider strokes interesting since, in addition to providing pleasant imagery, they simultaneously convey tone and material. As you can see in the figure, the arrangement of strokes can also enhance our perception of the object's shape. Here is what our system looks like.

3 Challenges
Interactive camera and lighting control
Temporal (frame-to-frame) coherence
Spatial continuity
Artistic freedom
There are several challenges one has to address in such a project. We would like interactive control over viewing and lighting conditions. The animation has to look smooth, that is, to have frame-to-frame coherence. We also don't want any visible discontinuities in the image. Finally, we would like our system to offer some degree of artistic freedom, allowing different stroke styles.

4 Approach
[Pipeline figure: example stroke → set of textures (preprocess); mesh → result (real-time)]
Here is an overview of our approach. The main idea is to leverage the texturing power of recent graphics hardware, and to bundle strokes together in textures rather than draw them as individual primitives. We start with a bitmap of a stroke that an artist draws, or that we import from a drawing package. As a preprocess, we create a set of textures using several instances of our example stroke. At runtime, we select and blend a subset of these texture images, applying them to the mesh surface.

5 Previous Work: Off-line
[Hertzmann et al. 2000]
[Winkenbach et al. '94, '96]
[Sousa et al. '99] & many others…
There has been a very large body of work in non-photorealistic rendering in general. For hatching, most systems have concentrated on producing high-quality renderings of static scenes. Most of these systems, however, are fairly slow and do not address frame-to-frame coherence, and are therefore unsuitable for on-line animation.

6 Previous Work: Real-Time NPR
Technical Illustration [Gooch et al. '99]
Graftals [Kowalski et al. '99, …]
Silhouette rendering [Markosian et al. '97] [Hertzmann et al. 2000] [Sander et al. 2000]
In the realm of real-time NPR, Gooch et al. presented a system for interactive technical illustration. Kowalski and co-workers render scenes in NPR style by drawing small geometric elements called graftals, in a view-dependent way. Also, several systems have addressed the problem of finding object silhouettes efficiently.

7 Previous Work: Real-Time Hatching
Screen-space "filter" [Lake et al. 2000]
Fixed-density strokes [Elber '99]
Two notable systems have addressed real-time hatching. The system presented by Lake et al. at NPAR 2000 renders scenes using strokes embedded in screen space. We are aiming for a different kind of artistic effect in our project, one in which strokes are associated with the object, helping reveal its shape both through the stroke direction and the optical flow created by motion. Elber presented a system with object-space coherence, using strokes with fixed density on the object. In contrast, our system allows us to dynamically vary the stroke density in order to maintain constant spacing of strokes in the image.

8 Previous Work – Stroke Collections
Prioritized Stroke Textures [Salisbury et al. '94] [Winkenbach et al. '94]
Art Maps [Klein et al. 2000]
[Figure: stroke textures arranged along scale and tone axes]
In a moment, I will present a construction that uses two previous ideas involving stroke collections. Prioritized stroke textures were introduced by Salisbury et al. and Winkenbach et al. at SIGGRAPH in '94. They represent an ordered collection of strokes, where each stroke is annotated with the tone that would be produced by rendering it and all the previous strokes in the collection. You can see here several tones being rendered using prioritized stroke textures; at the bottom of the figure, you can see a continuous tone gradient. Klein and co-workers presented a system for NPR rendering of virtual environments. The basic idea was to apply NPR filters to the textures in the scene. The goal was to keep the character and size of the strokes regardless of the resolution at which the surfaces were being rendered. They achieved this by constructing custom mipmap levels, each level being produced by running the NPR filter with the same stroke size. Since the different levels were produced independently, there was no coherence between strokes at different levels, so the strokes appeared to swim as one zoomed in and out from a surface.

9 Tonal Art Maps
Collection of stroke images, parameterized by tone and scale
Will blend → design with high coherence
Stroke nesting property [demo]
Combining the two approaches, we produce Tonal Art Maps, a collection of stroke textures parameterized by both tone and resolution. When rendering a 3D object we will blend some of these textures, so we need to design them with a very high degree of coherence in order to avoid blending artifacts. Specifically, we impose a stroke nesting property: all the strokes in one image appear in all the finer-resolution images and all the darker-tone images. By transitive closure, they appear in all the images to the right or down in the Tonal Art Map array. [Demo] You can see that as the square gets darker, all the strokes persist, and a few others are gradually blended in. The same thing happens as we zoom in.
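To make the layout concrete, here is a minimal C++ sketch of a TAM as a small grid of stroke images indexed by tone and resolution. The names and the level count are hypothetical; the talk itself only fixes the number of tones at six.

#include <array>
#include <vector>

// One grayscale stroke image (one mipmap level of one tone column).
struct StrokeImage {
    int width = 0, height = 0;
    std::vector<unsigned char> pixels; // 0 = black ink, 255 = white paper
};

constexpr int kTones  = 6; // the talk packs 6 tones into 2 RGB textures
constexpr int kLevels = 4; // number of custom mipmap levels (assumed)

// Hypothetical Tonal Art Map layout: kTones columns (light to dark),
// each holding kLevels resolutions (coarse to fine).
struct TonalArtMap {
    std::array<std::array<StrokeImage, kLevels>, kTones> images;

    // Stroke nesting property: every stroke in images[t][l] also appears
    // in all darker tones t' >= t and all finer levels l' >= l, so
    // blending neighboring images never makes strokes pop.
    const StrokeImage& at(int tone, int level) const {
        return images[tone][level];
    }
};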

10 Stroke Nesting Property
Specifically, we impose a stroke nesting property: all the strokes in one image appear in all the darker and finer-resolution images. Here is an example. [Demo: square, changing lighting from light to dark] As the tone gets darker, we keep the strokes that we have and gradually fade in additional strokes. [Zoom out quickly, then zoom in slowly, going back and forth around the mipmap level transition zones] This also happens when we zoom in and out: many strokes persist while a few are being faded in or out.

11 Stroke Nesting Property
Strokes persist in finer & darker images. (Backup slide in case the demo doesn't work.)

12 Approach
[Pipeline figure: example stroke → Tonal Art Map (preprocess); mesh → result (real-time)]
The off-line part of our system consists of the creation of the Tonal Art Maps.

13 Generating Tonal Art Maps
Draw or import a bitmap for one stroke
Automatically fill the TAM with strokes:
When placing a stroke in an image, add it to all finer & darker images
Fill the table column by column, coarse to fine
Space strokes evenly
We fill the Tonal Art Map table column by column. We start adding strokes to the lowest-resolution image until it reaches the desired tone. We add all these strokes at the same time to all the finer images. We then fix that image and continue adding strokes only to the higher-resolution images. When the column is complete, we copy it over to the next column, and proceed to add strokes until we reach the tone corresponding to this new column. This fill loop is sketched below.
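A rough sketch of this fill loop, reusing the TonalArtMap sketch from slide 9. The Stroke type and the helpers averageTone and drawStroke are hypothetical, and chooseBestStroke stands in for the candidate selection described on the next slides.

struct Stroke { float x, y, angle, length; }; // illustrative fields

float averageTone(const StrokeImage& img);    // mean luminance, 1 = white
void  drawStroke(StrokeImage& img, const Stroke& s);
Stroke chooseBestStroke(const TonalArtMap& tam, int tone, int level);

void fillTAM(TonalArtMap& tam, const std::array<float, kTones>& target) {
    for (int tone = 0; tone < kTones; ++tone) {
        // Start from a copy of the lighter column, so its strokes
        // nest inside this darker one.
        if (tone > 0)
            tam.images[tone] = tam.images[tone - 1];
        // Coarse to fine: once a level reaches its target tone it is
        // fixed, and further strokes go only into the finer levels.
        for (int level = 0; level < kLevels; ++level) {
            while (averageTone(tam.at(tone, level)) > target[tone]) {
                Stroke s = chooseBestStroke(tam, tone, level);
                // A placed stroke is drawn into all finer images too.
                for (int l = level; l < kLevels; ++l)
                    drawStroke(tam.images[tone][l], s);
            }
        }
    }
}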

14 Even Spacing of Strokes
Choose the best stroke from a large candidate pool
Fitness = uniformity & progress towards tone
[Figure: candidate strokes]
To achieve an even distribution of strokes, we choose each stroke that we place from a large pool of candidate strokes. We measure the fitness of each candidate as the uniformity of all the images in which it is placed, combined with the progress towards the desired tone.

15 Even Spacing of Strokes
Choose the best stroke from a large candidate pool
Fitness = uniformity & progress towards tone
[Figure: candidate stroke in one TAM column (same tone)]
We measure uniformity using the stroke's overlap with the previous strokes, in all the images of the column to which the stroke is being added.

16 Even Spacing of Strokes
Choose the best stroke from a large candidate pool
Fitness = uniformity & progress towards tone
Keep a Gaussian pyramid for all TAM images
[Figure: one TAM column (same tone)]
Since in a fine-resolution image we could place a stroke near, though not quite overlapping, an existing stroke, we measure uniformity more accurately by keeping a Gaussian pyramid for all the images in the column. This fitness computation is sketched below.
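A hedged sketch of that fitness computation, continuing the sketches above. gaussianPyramid, toneDarkening, inkOverlap, and randomStroke are assumed helpers, and the additive score is illustrative rather than the paper's exact formula.

std::vector<StrokeImage> gaussianPyramid(const StrokeImage& img);
float toneDarkening(const StrokeImage& img, const Stroke& s); // new ink added
float inkOverlap(const StrokeImage& img, const Stroke& s);    // redundant ink
Stroke randomStroke();

float strokeFitness(const TonalArtMap& tam, int tone, int level,
                    const Stroke& cand) {
    float progress = 0.f, overlap = 0.f;
    // Score against every image the stroke would be drawn into:
    // this level and all finer levels of the current tone column.
    for (int l = level; l < kLevels; ++l) {
        // Coarse pyramid levels blur existing ink outward, so a stroke
        // placed merely near an old one is also penalized.
        for (const StrokeImage& img : gaussianPyramid(tam.at(tone, l))) {
            progress += toneDarkening(img, cand);
            overlap  += inkOverlap(img, cand);
        }
    }
    return progress - overlap;
}

Stroke chooseBestStroke(const TonalArtMap& tam, int tone, int level) {
    Stroke best{};
    float bestScore = -1e30f;
    for (int i = 0; i < 1000; ++i) {          // large random candidate pool
        Stroke c = randomStroke();
        float score = strokeFitness(tam, tone, level, c);
        if (score > bestScore) { bestScore = score; best = c; }
    }
    return best;
}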

17 Approach
[Pipeline figure: example stroke → Tonal Art Map (preprocess); mesh → result (real-time)]
At runtime, for each mesh face we select and blend several TAM textures.

18 Continuity
Stroke size continuity → mipmapping
Tone continuity → blend multiple textures
Spatial continuity: same contribution for a texture on both sides of an edge
Temporal continuity: no "popping" [demo]
[Figure: example of spatial discontinuity]
For the resolution continuity of our renderings we rely on the mipmapping mechanism. For tone continuity, we need to impose certain restrictions on our blending scheme. There are two kinds of continuity to worry about. One is spatial continuity; here is an example of a scheme that does not possess spatial continuity: you can easily notice the face boundaries. We also desire temporal continuity; in other words, we want to avoid popping. Here is an example of popping. [Demo]

19 Texture Blending tone tone v1 v2 v3 6-way blend  final
To produce continuous tone, we blend up to six textures per face. Every vertex gets a blend of two textures, the floor and the ceiling of its desired tone value. We render this 2-way blend over the triangle, modulated by a linear ramp, corresponding to the vertex’ barycentric contribution to points inside the triangle. The ramp drops to 0 (no influence) at the edge opposing the vertex. According to its barycentric coordinates, each point inside the surface thus receives a linear combination of the 3 vertices, or up to 6 different textures. v1 v2 v3 6-way blend  final
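A small CPU-side sketch of the weight computation just described; in the actual system these weights are evaluated in a vertex program, shown on the next slide. The names are hypothetical, and kTones comes from the earlier sketch.

#include <algorithm>
#include <cmath>

// Split one vertex's barycentric contribution between the two TAM tone
// columns bracketing its desired tone. Three vertices times two tones
// gives the up-to-six textures blended per triangle.
struct VertexBlend {
    int   toneFloor, toneCeil; // the two bracketing TAM columns
    float wFloor, wCeil;       // weights, already scaled by the ramp
};

VertexBlend blendWeights(float tone, // desired tone in [0, kTones - 1]
                         float bary) // ramp: 1 at this vertex, 0 on the
                                     // opposite edge of the triangle
{
    VertexBlend b;
    b.toneFloor = (int)std::floor(tone);
    b.toneCeil  = std::min(b.toneFloor + 1, kTones - 1);
    float f = tone - (float)b.toneFloor; // frac(tone)
    b.wFloor = (1.f - f) * bary;
    b.wCeil  = f * bary;
    return b;
}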

20 Texture Blending
Pack grayscale tones in R,G,B channels → 6 tones in 2 textures
Use multitexture engine → single-pass 6-way blend
Vertex programs compute blend weights → static vertex data

!!VP1.0
# Vertex Program for Real-Time Hatching.
// output vertex homogeneous coordinates
DP4 R2.x, c[0], v[OPOS];
DP4 R2.y, c[1], v[OPOS];
DP4 R2.z, c[2], v[OPOS];
DP4 R2.w, c[3], v[OPOS];
MOV o[HPOS], R2;
// stroke texture coordinates, transformed
DP3 o[TEX0].x, c[4], v[TEX0];
DP3 o[TEX0].y, c[5], v[TEX0];
DP3 o[TEX1].x, c[4], v[TEX0];
DP3 o[TEX1].y, c[5], v[TEX0];
// splotch mask coordinates
MOV o[TEX2], v[TEX0];
// get the Gouraud shade
DP3 R1, c[8], v[NRML];
// apply clamp-linear tone transfer function
MUL R1, R1, c[9].x;
ADD R1, R1, c[9].y;
MAX R1, R1, c[9].z;
MIN R1, R1, c[9].w;
// now look up the weights for the TAM blending
EXP R2.y, R1.x; // frac(tone)
ARL A0.x, R1.x;
MOV R3, c[A0.x + 10];
MAD R3, -R2.y, R3, R3;
MAD o[COL1], R2.y, c[A0.x + 11], R3;
MOV R4, c[A0.x + 20];
MAD R4, -R2.y, R4, R4;
MAD o[COL0], R2.y, c[A0.x + 21], R4;
END

While blending 6 textures may seem daunting at first, it can be done very efficiently on current graphics hardware. Since our Tonal Art Maps are grayscale, we can pack them into the R, G, B channels of normal textures. We found that six tones are quite sufficient for representing scenes, so we can pack them into 2 traditional mipmapped textures. We can blend these using the multitexturing engine, using dot products. On graphics cards that support programmable vertex shading, such as the NVIDIA GeForce3, the blend weights can be computed on the card, so all the vertex data is static on the card. This leads to very low bandwidth requirements between the CPU and the graphics card, and high frame rates. For illustration purposes, here is the vertex program that does the shading. As you can see, it is quite small: it has 23 instructions, compared to around 15 for normal textured rendering.
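The packing step is easy to illustrate: three consecutive grayscale tone images become the R, G, and B channels of one texture, so six tones fit in two textures and a single per-channel dot product blends three tones at once. This sketch reuses the hypothetical StrokeImage type from earlier, here holding three interleaved channels per pixel.

// Pack three grayscale TAM tone images into one RGB texture so the
// multitexture engine can weight three tones with one dot product.
// Assumes all three images share the same dimensions.
StrokeImage packRGB(const StrokeImage& t0, const StrokeImage& t1,
                    const StrokeImage& t2) {
    StrokeImage out;
    out.width  = t0.width;
    out.height = t0.height;
    out.pixels.resize(3 * t0.width * t0.height); // interleaved RGB
    for (int i = 0; i < t0.width * t0.height; ++i) {
        out.pixels[3 * i + 0] = t0.pixels[i]; // tone n   -> red
        out.pixels[3 * i + 1] = t1.pixels[i]; // tone n+1 -> green
        out.pixels[3 * i + 2] = t2.pixels[i]; // tone n+2 -> blue
    }
    return out;
}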

21 Approach
[Pipeline figure: example stroke → Tonal Art Map (preprocess); mesh → lapped texture → result (real-time)]
I actually lied to you when I said we had one preprocessing step. To texture arbitrary surfaces, we first have to create a lapped texture, in an off-line process.

22 Texturing Arbitrary Surfaces
Lapped Textures [Praun et al. 2000]
We presented Lapped Textures at SIGGRAPH last year. The idea is that you can texture a surface by cutting patches from an example image and repeatedly pasting them in random places onto the surface until it is completely covered. For stochastic images, a small blend region is enough to hide the boundaries. Hatch-stroke images are by their nature very stochastic, and ideal to use with lapped textures. Of course, for this project we just use the lapped parameterization, since we don't actually paste single images, but the whole stack of tones from our TAMs. However, the basic principle is the same.

23 Direction Field
Based on surface principal curvatures
Optimized to be smooth [Hertzmann & Zorin 2000]
Symmetry: 180° instead of 90°
Sampled on faces
For anisotropic lapped textures, or textures that are not rotationally self-similar, you need a direction field. We use a method very similar to Hertzmann and Zorin's. We initialize the direction field with principal curvature directions computed on a smooth, coarse-resolution version of the mesh. We then smooth this field using a non-linear energy minimization. Our direction fields have 180-degree symmetry rather than 90 degrees as in Hertzmann's work, and we sample them on faces instead of vertices.
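One common way to average directions that are only defined up to 180° is to double each angle first, so that θ and θ+180° map to the same vector, average over neighbors, and halve the result. The relaxation pass below is a minimal sketch of that general idea, not the exact nonlinear energy minimization used in the system.

#include <cmath>
#include <vector>

// One smoothing pass over a per-face direction field with 180-degree
// symmetry. angles[f] is the direction on face f in radians;
// neighbors[f] lists the faces sharing an edge with f.
void smoothDirectionField(std::vector<float>& angles,
                          const std::vector<std::vector<int>>& neighbors) {
    std::vector<float> next(angles.size());
    for (std::size_t f = 0; f < angles.size(); ++f) {
        // Doubling the angle identifies theta with theta + pi, so
        // opposite directions reinforce instead of canceling out.
        float x = std::cos(2.f * angles[f]);
        float y = std::sin(2.f * angles[f]);
        for (int n : neighbors[f]) {
            x += std::cos(2.f * angles[n]);
            y += std::sin(2.f * angles[n]);
        }
        next[f] = 0.5f * std::atan2(y, x); // halve back to a direction
    }
    angles = next; // repeat for several passes until the field settles
}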

24 Demo
Here is a demo of our system. Notice when we zoom out how strokes fade out in order to simultaneously maintain the same tone and the same stroke width and density. Using our lapped parameterization, we can achieve a series of fun dynamic effects, such as rotating the strokes, translating them, or changing their size. As we zoom out, in the limit, the rendering reduces to standard Gouraud shading. [Venus] The Venus is rendered with a Tonal Art Map consisting of curved strokes. [Rocker arm] Notice how the direction field helps reveal the shape of this mechanical piece: it follows the curvature along the inside cylinder and is parallel in the planar region. We can also animate the light. [Bunny] The TAM applied to this bunny model has short strokes oriented at various angles, suggestive of fur. We painted the eyes and nose by hand, indicating that those regions of the mesh should be rendered using the darkest tone, regardless of the light position.

25 Demo [Gargoyle]
Here is a stippling TAM. When applied to this gargoyle model, it gives the illusion of stone or rough concrete.

26 Demo [Venus: chalk and charcoal on gray paper]
This demo shows the Venus rendered in a chalk-and-charcoal-on-gray-paper style. Notice that the light tones and highlights are obtained using white strokes, while the shadow regions are rendered using black strokes. We use a tone transfer function on this model to increase the size of the gray region, while reducing the clutter of strokes.

27 Result (back-up slide)

28 Result (back-up slide)

29 Summary
Real-time hatching for NPR
Strokes rendered as textures
High-coherence TAMs prevent blend artifacts
6-way blend very fast on modern graphics hardware
In summary, we have demonstrated a system for the real-time rendering of 3D objects in stroke-based hatching styles. We render strokes efficiently by grouping them together in textures. To avoid blending artifacts, we design our TAMs with a high degree of coherence between images. Finally, our high-valence blend can be done very efficiently on modern graphics hardware, using the multitexturing engine.

30 Future Work
More general TAMs
View-dependent stroke direction
Automatic indication
[Elephant drawing by Bill Plympton]
As future work, we would like to investigate more general TAMs: what kinds of artistic effects can be achieved, what can be represented better using color, and so on. It is interesting to consider view-dependent stroke direction. It is known that you cannot comb a hairy ball statically; however, if you can change the direction field smoothly and quickly enough, maybe you can always hide the discontinuity pole at the back of the ball. Finally, an interesting and challenging future direction is automatic indication. Indication is the process by which artists draw only some graphical elements, as examples, letting the viewer mentally fill in the rest. In this example picture, we do not perceive the elephant as being white, even though very few strokes are actually drawn across its back.

31 Acknowledgements
Support: Microsoft Research, NSF
Hardware: NVIDIA, Dell
Models: Viewpoint, Cyberware, Stanford, MIT
Thanks: Georges Winkenbach, Lee Markosian, Grady Klein
Thank you.

32 NPAR 2002
International Symposium on Non-Photorealistic Animation and Rendering
Annecy, France
Submissions: November 12, 2001
Conference: June 3-5, 2002

