1
CHAPTER 10 Alpha Blending and Fog © 2008 Cengage Learning EMEA
2
LEARNING OBJECTIVES In this chapter you will learn about:
– Alpha blending
– Implementing blending
– Implementing blending using shaders
– Alpha testing
– Fog
– The physics of fog
– Implementing fog
3
ALPHA BLENDING Alpha blending is a technique used to render the transparent and/or semi-transparent areas in textures. An alpha channel can be considered a fourth color component, i.e. an add-on to our current red–green–blue (RGB) color model, controlling the transparency of a pixel. The alpha channel, just like the RGB color components, stores an 8-bit value ranging from 0 to 255 (0 indicating full transparency and 255 a solid, fully opaque area).
4
ALPHA BLENDING
5
Alpha blending works by merging the per- pixel color of one texture with that of another – specifically by combining the color of an alpha-enabled texture with a texture already present at the corresponding screen location.
6
ALPHA BLENDING The alpha blending equation calculates the final blended color by adding the overlaid pixel color, overlaidPixelColor (weighted by its alpha percentage, alphaPercentage), to the pixel color of the opaque surface, originalPixelColor, weighted by the alpha percentage subtracted from one:

finalPixelColor = (alphaPercentage × overlaidPixelColor) + ((1 − alphaPercentage) × originalPixelColor)
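This weighted sum can be sketched per color channel as follows (a minimal illustration, with alpha normalized to the 0–1 range; the function name is illustrative, not from the textbook):

```python
def alpha_blend(overlaid, original, alpha):
    """Blend one color channel of an overlaid pixel onto an opaque
    surface pixel; alpha is the overlaid pixel's opacity in [0, 1]."""
    return alpha * overlaid + (1.0 - alpha) * original

# A 50%-transparent white pixel (255) over a black surface (0):
print(alpha_blend(255, 0, 0.5))  # 127.5
```

At alpha = 1.0 the overlaid pixel completely replaces the surface color; at alpha = 0.0 the surface color is left untouched.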
7
ALPHA BLENDING The Truevision TGA (.tga) image format offers a solution to the bitmap file format’s lack of compression and transparency and its limited use.
– It was one of the first formats to support truecolor (millions of colors) and is a 32-bit image format with support for transparency and RLE (run-length encoding) compression.
8
ALPHA BLENDING
9
Implementing Blending [see the textbook and online source code for a detailed example and discussion].
10
Implementing Blending using Shaders Blending can also be performed using pixel and vertex shaders; specifically, textures are combined via fragment program operations and subsequently written to the frame buffer. Shaders also allow significantly more control over the blending operation, leading to a vast array of advanced blending effects. [see the textbook for a detailed example and discussion].
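As an illustration of the per-fragment operation (not the textbook’s shader code), the blend a fragment program performs before the frame-buffer write can be sketched for a single RGBA texel over an existing RGB frame-buffer pixel:

```python
def blend_fragment(texel_rgba, framebuffer_rgb):
    """Combine an RGBA texel with the color already present in the
    frame buffer, as a fragment program would before write-back."""
    r, g, b, a = texel_rgba
    alpha = a / 255.0  # normalize the 8-bit alpha channel to [0, 1]
    return tuple(round(alpha * src + (1.0 - alpha) * dst)
                 for src, dst in zip((r, g, b), framebuffer_rgb))

# A half-transparent red texel over a blue background:
print(blend_fragment((255, 0, 0, 128), (0, 0, 255)))  # (128, 0, 127)
```

A real shader would express the same interpolation with an intrinsic such as HLSL’s lerp, operating on the whole color vector at once.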
11
Alpha Testing Alpha testing is a technique that controls whether pixels are written to the render target. Each pixel of a texture image is either rendered or discarded based on its alpha value.
– For example, a pixel might only be written to the render target if its alpha value exceeds 0.5.
Alpha testing is much simpler to implement than blending as it doesn’t require the initialization of an alpha blend state. [see the textbook and online source code for a detailed example and discussion].
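The test itself reduces to a threshold comparison; a minimal sketch (the 0.5 reference value follows the example above, and the function name is illustrative):

```python
def alpha_test(alpha, reference=0.5):
    """Return True if the pixel should be written to the render
    target, False if it should be discarded."""
    return alpha > reference

# Pixels at or below the reference are discarded outright:
print(alpha_test(0.8))  # True
print(alpha_test(0.3))  # False
```

Because failing pixels are discarded rather than blended, no read-modify-write of the frame buffer is needed, which is why no blend state has to be configured.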
12
FOG Fog can be described as a dense vapour of condensed particles that reduces visibility to less than a kilometre.
13
FOG
14
Fog Fog is used for countless environmental effects; its implementation adds to the mood of outdoor environments while at the same time improving rendering performance when combined with the culling of fully obscured objects.
15
Fog Blending is controlled using a linear, exponential, or Gaussian equation.
– Varying these equations changes the blending factor.
The blending factor is simply a measure of exactly how much a certain color value will be blended with the fog color at a given time. For example, linear fog (as shown in Figure 10-8) will ‘dense up’ at a constant rate as an object’s distance from the viewer increases. Exponential fog will, on the other hand, thicken much more rapidly after a specific distance from the viewer.
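These three equations can be sketched as follows (parameter names are illustrative; the formulas mirror the standard linear, exponential, and squared-exponential fog factors, where a factor of 1 means no fog and 0 means full fog):

```python
import math

def linear_fog(distance, start, end):
    """Fog factor falls off at a constant rate between start and end."""
    return max(0.0, min(1.0, (end - distance) / (end - start)))

def exp_fog(distance, density):
    """Fog thickens increasingly quickly as distance grows."""
    return math.exp(-density * distance)

def exp2_fog(distance, density):
    """Squared-exponential (Gaussian-style) fall-off."""
    return math.exp(-(density * distance) ** 2)

# Halfway between a fog start of 10 and a fog end of 110:
print(linear_fog(60.0, 10.0, 110.0))  # 0.5
```

Plotting these over distance reproduces the curves the chapter describes: a straight line for linear fog and increasingly steep fall-off for the exponential variants.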
16
The Physics of Fog Fog particles, such as smoke and condensed vapour, scatter and absorb light travelling from one point to another. Scattered light can be directed either toward or away from the viewer, resulting in a cloudy white color for water vapour and a greyish color for smog, smoke particles having a higher light absorption rate (depending on their density). The calculation of fog, as previously mentioned, is based on a distance function – specifically, either a linear function (fog increases linearly from the point of view to an end point) or an exponential function (fog intensifies exponentially from the point of view to an end point).
17
Implementing Fog Implementing fog in a Direct3D 10 environment requires emulation of Direct3D 9’s fixed-function pipeline via an HLSL vertex and pixel shader. The implementation is relatively simple, with a pre-calculated fog factor determining the amount of fog obscuring a particular pixel. Instead of calculating the fog distance using the planar eye distance (cameraPosition.z), we can also use the HLSL or Cg length function to return the floating-point Euclidean eye distance. OpenGL offers support for Gaussian (GL_EXP2), exponential (GL_EXP), and linear (GL_LINEAR) fog. [see the textbook and online source code for a detailed example and discussion].
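The overall step can be sketched as follows (not the textbook’s shader code; function names and the linear fog range are hypothetical): compute the Euclidean eye distance, derive a fog factor from it, and interpolate between the pixel color and the fog color.

```python
import math

def euclidean_eye_distance(camera_pos, vertex_pos):
    """Length of the camera-to-vertex vector (what HLSL's length()
    intrinsic would return), rather than the planar z distance alone."""
    return math.sqrt(sum((v - c) ** 2
                         for c, v in zip(camera_pos, vertex_pos)))

def apply_fog(pixel_rgb, fog_rgb, fog_factor):
    """Interpolate towards the fog color; a factor of 1 leaves the
    pixel untouched, a factor of 0 yields pure fog."""
    return tuple(fog_factor * p + (1.0 - fog_factor) * f
                 for p, f in zip(pixel_rgb, fog_rgb))

dist = euclidean_eye_distance((0.0, 0.0, 0.0), (3.0, 4.0, 0.0))  # 5.0
fog_factor = max(0.0, min(1.0, (10.0 - dist) / (10.0 - 0.0)))    # 0.5
print(apply_fog((1.0, 0.0, 0.0), (0.5, 0.5, 0.5), fog_factor))
```

In an actual HLSL pixel shader the interpolation step would be a single lerp between the fog color and the lit pixel color, using the fog factor computed in the vertex stage.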