1 Week 8 - Wednesday

2
What did we talk about last time?
- Textures
- Volume textures
- Cube maps
- Texture caching and compression
- Procedural texturing
- Texture animation
- Material mapping
- Alpha mapping
- Bump mapping
- Normal maps
- Parallax mapping
- Relief mapping
- Heightfield texturing

6
- Radiometry is the measurement of electromagnetic radiation (for us, specifically light)
- Light is the flow of photons
- We'll generally think of photons as particles, rather than waves
- Photon characteristics:
  - Frequency ν = c/λ (Hertz)
  - Wavelength λ = c/ν (meters)
  - Energy Q = hν (joules) [h is Planck's constant]
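A quick worked example (my numbers, not from the slide): green light with λ = 550 nm has

$$\nu = \frac{c}{\lambda} = \frac{3.00 \times 10^{8}\,\text{m/s}}{550 \times 10^{-9}\,\text{m}} \approx 5.45 \times 10^{14}\,\text{Hz}, \qquad Q = h\nu \approx (6.626 \times 10^{-34}\,\text{J·s})(5.45 \times 10^{14}\,\text{Hz}) \approx 3.6 \times 10^{-19}\,\text{J}$$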

7
We'll be interested in the following radiometric quantities:

Quantity            Unit
Radiant energy      joule (J)
Radiant flux        watt (W)
Irradiance          W/m²
Radiant intensity   W/sr
Radiance            W/(m² sr)

8
- Radiant flux: energy per unit time (power)
- Irradiance: energy per unit time through a surface
- Intensity: energy per unit time per steradian
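Written as derivatives (the standard radiometric definitions, consistent with the units in the table above; Q is radiant energy, A is area, ω is solid angle):

$$\Phi = \frac{dQ}{dt}, \qquad E = \frac{d\Phi}{dA}, \qquad I = \frac{d\Phi}{d\omega}$$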

9
- The radiance L is what we care about, since that's what sensors detect
- We can think of radiance as the portion of irradiance within a solid angle
- Or, we can think of radiance as the portion of a light's intensity that flows through a surface
- Radiance doesn't change with distance
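In the same notation, radiance is flux per unit projected area per unit solid angle (the standard definition, not spelled out on the slide; θ is the angle between the ray and the surface normal):

$$L = \frac{d^{2}\Phi}{dA\, d\omega \cos\theta}$$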

11
- Radiometry just deals with physics
- Photometry takes everything from radiometry and weights it by the sensitivity of the human eye
- Photometry is just trying to account for the eye's differing sensitivity to different wavelengths
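Concretely, each photometric quantity is its radiometric counterpart weighted by the CIE luminous efficiency curve V(λ). For flux (the standard CIE formula, not shown on the slide):

$$\Phi_v = 683 \int_{380\,\text{nm}}^{780\,\text{nm}} V(\lambda)\, \Phi_e(\lambda)\, d\lambda$$

where Φ_e(λ) is spectral radiant flux in W/nm and the 683 lm/W factor anchors the scale at the eye's peak sensitivity (555 nm).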

12
- Because they're just rescalings of radiometric units, every photometric unit is based on a radiometric one
- Luminance is often used to describe the brightness of surfaces, such as LCD screens

Radiometric Quantity   Unit         Photometric Quantity   Unit
Radiant energy         joule (J)    Luminous energy        talbot
Radiant flux           watt (W)     Luminous flux          lumen
Irradiance             W/m²         Illuminance            lux
Radiant intensity      W/sr         Luminous intensity     candela
Radiance               W/(m² sr)    Luminance              nit

13
- Colorimetry is the science of quantifying human color perception
- The CIE defined a system of three non-monochromatic colors X, Y, and Z for describing the human-perceivable color space
- RGB is a transform from these values into monochromatic red, green, and blue colors
- RGB can only express the colors inside the triangle (its gamut) that the three primaries span on the chromaticity diagram
- As you know, there are other color models (HSV, HSL, etc.)
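As an illustration of that transform, here is a minimal HLSL sketch, assuming the standard linear-sRGB primaries with a D65 white point; the function name XYZToLinearRGB and the choice to return unclamped values are mine, not from the slides:

// Convert a CIE XYZ color to linear sRGB (D65 white point).
// Uses the standard sRGB matrix; colors outside the sRGB gamut
// come back with components outside [0, 1].
float3 XYZToLinearRGB(float3 xyz)
{
    const float3x3 M = float3x3(
         3.2406, -1.5372, -0.4986,
        -0.9689,  1.8758,  0.0415,
         0.0557, -0.2040,  1.0570);
    return mul(M, xyz);
}

Clamping (or properly gamut-mapping) the result is exactly the "can only express colors inside the triangle" limitation from the bullet above.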

15
- Real light behaves consistently (but in a complex way)
- For rendering purposes, we often divide light into categories that are easy to model:
  - Directional lights (like the sun)
  - Omni lights (located at a point, but evenly illuminate in all directions)
  - Spotlights (located at a point, with intensity that varies with direction)
  - Textured lights (give light projections variety in shape or color)
    - Similar to gobos, if you know anything about stage lighting

16
- With a programmable pipeline, you can express lighting models of limitless complexity
- The old DirectX fixed-function pipeline provided a few stock lighting models:
  - Ambient lights
  - Omni lights
  - Spotlights
  - Directional lights
- All lights have diffuse, specular, and ambient color
- Let's see how to implement these lighting models with shaders

17
- Ambient lights are very simple to implement in shaders
- We've already seen the code
- The vertex shader must simply transform the vertex into clip space (world × view × projection)
- The pixel shader colors each fragment a constant color
  - We could modulate this by a texture if we were using one

18
float4x4 World;
float4x4 View;
float4x4 Projection;

float4 AmbientColor = float4(1, 1, 1, 1);
float AmbientIntensity;

struct VertexShaderInput
{
    float4 Position : SV_Position;
};

struct VertexShaderOutput
{
    float4 Position : SV_Position;
};

19
VertexShaderOutput VertexShaderFunction(VertexShaderInput input)
{
    VertexShaderOutput output;
    float4 worldPosition = mul(input.Position, World);
    float4 viewPosition = mul(worldPosition, View);
    output.Position = mul(viewPosition, Projection);
    return output;
}

20
float4 PixelShaderFunction(VertexShaderOutput input) : SV_Target
{
    return AmbientColor * AmbientIntensity;
}

technique Ambient
{
    pass Pass1
    {
        VertexShader = compile vs_2_0 VertexShaderFunction();
        PixelShader = compile ps_2_0 PixelShaderFunction();
    }
}

21
- Directional lights model light from a source so distant that its rays are parallel, like the sun
- A directional light has only color (specular and diffuse) and direction
- They are virtually free from a computational perspective
- Directional lights are also the standard model for BasicEffect
  - You don't have to use a shader to do them
- Let's look at a diffuse shader first

22
- We add values for the diffuse light intensity and direction
- We add a WorldInverseTranspose matrix to transform the normals
- We also add normals to our input and color to our output

float4x4 World;
float4x4 View;
float4x4 Projection;
float4x4 WorldInverseTranspose;

float4 AmbientColor = float4(1, 1, 1, 1);
float AmbientIntensity = 0.1;

float3 DiffuseLightDirection = float3(1, 1, 0);
float4 DiffuseColor = float4(1, 1, 1, 1);
float DiffuseIntensity = 0.7;

struct VertexShaderInput
{
    float4 Position : SV_POSITION;
    float3 Normal : NORMAL;
};

struct VertexShaderOutput
{
    float4 Position : SV_POSITION;
    float4 Color : COLOR;
};

23
Color depends on the surface normal dotted with the light vector:

VertexShaderOutput VertexShaderFunction(VertexShaderInput input)
{
    VertexShaderOutput output;
    float4 worldPosition = mul(input.Position, World);
    float4 viewPosition = mul(worldPosition, View);
    output.Position = mul(viewPosition, Projection);

    float3 normal = mul(input.Normal, (float3x3)WorldInverseTranspose);
    float lightIntensity = dot(normalize(normal), normalize(DiffuseLightDirection));
    output.Color = saturate(DiffuseColor * DiffuseIntensity * lightIntensity);
    return output;
}

24
- No real differences here
- The diffuse and ambient colors are added together
- The technique is exactly the same

float4 PixelShaderFunction(VertexShaderOutput input) : SV_Target
{
    return saturate(input.Color + AmbientColor * AmbientIntensity);
}

25
- Adding a specular component to the diffuse shader requires incorporating the view vector
- It will be included in the shader file and set as a parameter in the C# code

26
- The camera location is added to the declarations
- So are the specular color, intensity, and shininess parameters

float4x4 World;
float4x4 View;
float4x4 Projection;
float4x4 WorldInverseTranspose;

float3 Camera;
static const float PI = 3.14159265f;

float4 AmbientColor = float4(1, 1, 1, 1);
float AmbientIntensity = 0.1;

float3 DiffuseLightDirection = float3(1, 1, 0);
float4 DiffuseColor = float4(1, 1, 1, 1);
float DiffuseIntensity = 0.7;

float Shininess;
float4 SpecularColor = float4(1, 1, 1, 1);
float SpecularIntensity = 0.5;

27
- The output adds a normal so that the reflection vector can be computed in the pixel shader
- A world position lets us compute the view vector to the camera

struct VertexShaderInput
{
    float4 Position : SV_POSITION;
    float3 Normal : NORMAL;
};

struct VertexShaderOutput
{
    float4 Position : SV_POSITION;
    float4 Color : COLOR;
    float3 Normal : NORMAL;
    float4 WorldPosition : POSITIONT;
};

28
The same computations as the diffuse shader, but we store the normal and the transformed world position in the output:

VertexShaderOutput VertexShaderFunction(VertexShaderInput input)
{
    VertexShaderOutput output;
    float4 worldPosition = mul(input.Position, World);
    output.WorldPosition = worldPosition;
    float4 viewPosition = mul(worldPosition, View);
    output.Position = mul(viewPosition, Projection);

    float3 normal = normalize(mul(input.Normal, (float3x3)WorldInverseTranspose));
    float lightIntensity = dot(normal, normalize(DiffuseLightDirection));
    output.Color = saturate(DiffuseColor * DiffuseIntensity * lightIntensity);
    output.Normal = normal;
    return output;
}

29
- Here we finally have a real computation, because we need to use the pixel normal (interpolated from the vertex normals) in combination with the view vector
- The technique is the same

float4 PixelShaderFunction(VertexShaderOutput input) : SV_Target
{
    float3 light = normalize(DiffuseLightDirection);
    float3 normal = normalize(input.Normal);
    // Reflect the light direction about the surface normal
    float3 reflect = normalize(2 * dot(light, normal) * normal - light);
    // View vector runs from the surface to the camera
    float3 view = normalize(Camera - (float3)input.WorldPosition);
    float dotProduct = dot(reflect, view);
    float4 specular = (8 + Shininess) / (8 * PI) * SpecularIntensity * SpecularColor *
        max(pow(dotProduct, Shininess), 0) * length(input.Color);
    return saturate(input.Color + AmbientColor * AmbientIntensity + specular);
}
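The (8 + Shininess)/(8π) factor in the shader is the usual normalization for a Phong specular lobe with exponent n; it keeps the total reflected energy roughly constant as the highlight sharpens (a standard result, not derived on the slide):

$$k_{\text{spec}} \propto \frac{8 + n}{8\pi} \left(\mathbf{r} \cdot \mathbf{v}\right)^{n}$$

Without it, raising Shininess would make the highlight dimmer overall instead of just tighter.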

30
- Point lights model omni lights at a specific position
- They generally attenuate (get dimmer) over distance and have a maximum range
- Fixed-function DirectX offered constant, linear, and quadratic attenuation terms
  - With shaders, you can choose whatever attenuation function you want (see the sketch below)
- Point lights are more computationally expensive than directional lights because a light vector has to be computed for every pixel
- It is possible to implement point lights in a deferred shader, lighting only those pixels that actually get used
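Here is a minimal HLSL sketch of such an attenuation function, using the classic constant/linear/quadratic form; the helper name Attenuate and the kc/kl/kq parameters are illustrative, not part of the course shaders (which use a simpler radius-based falloff on slide 34):

// Classic fixed-function-style attenuation:
// 1 / (kc + kl*d + kq*d^2), clamped to [0, 1].
// kc, kl, kq are illustrative names, not DirectX built-ins.
float Attenuate(float3 lightPosition, float3 worldPosition,
                float kc, float kl, float kq)
{
    float d = length(lightPosition - worldPosition);
    return saturate(1.0f / (kc + kl * d + kq * d * d));
}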

31
We add a light position and a light radius:

float4x4 World;
float4x4 View;
float4x4 Projection;
float4x4 WorldInverseTranspose;

float3 LightPosition;
float3 Camera;
static const float PI = 3.14159265f;

float4 AmbientColor = float4(1, 1, 1, 1);
float AmbientIntensity = 0.1f;

float LightRadius = 50;
float4 DiffuseColor = float4(1, 1, 1, 1);
float DiffuseIntensity = 0.7;

float Shininess;
float4 SpecularColor = float4(1, 1, 1, 1);
float SpecularIntensity = 0.5f;

32
- We no longer need color in the output
- We do need the vector from the surface to the camera
- So we keep the world position at each fragment

struct VertexShaderInput
{
    float4 Position : SV_POSITION;
    float3 Normal : NORMAL;
};

struct VertexShaderOutput
{
    float4 Position : SV_POSITION;
    float4 WorldPosition : POSITIONT;
    float3 Normal : NORMAL;
};

33
We compute the normal and the world position:

VertexShaderOutput VertexShaderFunction(VertexShaderInput input)
{
    VertexShaderOutput output;
    float4 worldPosition = mul(input.Position, World);
    output.WorldPosition = worldPosition;
    float4 viewPosition = mul(worldPosition, View);
    output.Position = mul(viewPosition, Projection);

    float3 normal = normalize(mul(input.Normal, (float3x3)WorldInverseTranspose));
    output.Normal = normal;
    return output;
}

34
- There's a lot going on in here
- We attenuate the light with distance, then compute the diffuse and specular terms per pixel

float4 PixelShaderFunction(VertexShaderOutput input) : SV_Target
{
    float3 normal = normalize(input.Normal);
    float3 lightDirection = LightPosition - (float3)input.WorldPosition;
    // Attenuation falls off quadratically, reaching zero at LightRadius
    float attenuation = pow(1.0f - saturate(length(lightDirection) / LightRadius), 2);
    lightDirection = normalize(lightDirection); // normalize after taking the length
    float3 view = normalize(Camera - (float3)input.WorldPosition);

    float lightIntensity = dot(normal, lightDirection);
    float4 diffuse = attenuation * DiffuseColor * DiffuseIntensity * saturate(lightIntensity);

    float3 reflect = normalize(2 * lightIntensity * normal - lightDirection);
    float dotProduct = dot(reflect, view);
    float4 specular = (8 + Shininess) / (8 * PI) * SpecularIntensity * SpecularColor *
        max(pow(dotProduct, Shininess), 0) * length(diffuse);

    return saturate(diffuse + AmbientColor * AmbientIntensity + specular);
}

37
- BRDFs
- Implementing BRDFs
- Texture mapping in shaders

38
- Finish reading Chapter 7
- Summer REU opportunity:
  - Machine learning at the Florida Institute of Technology
  - Deadline March 31, 2015
  - http://www.amalthea-reu.org/

