
1
Week 8 - Friday

2
What did we talk about last time?
- Radiometry
- Photometry
- Colorimetry
- Lighting with shader code
  - Ambient
  - Directional (diffuse and specular)
  - Point

6
Adding a specular component to the diffuse shader requires incorporating the view vector. It is declared in the shader file and set as a parameter in the C# code.

7
The camera location is added to the declarations, as are specular colors and a shininess parameter:

float4x4 World;
float4x4 View;
float4x4 Projection;
float4x4 WorldInverseTranspose;
float3 Camera;

static const float PI = 3.14159265f;

float4 AmbientColor = float4(1, 1, 1, 1);
float AmbientIntensity = 0.1;

float3 DiffuseLightDirection = float3(1, 1, 0);
float4 DiffuseColor = float4(1, 1, 1, 1);
float DiffuseIntensity = 0.7;

float Shininess;
float4 SpecularColor = float4(1, 1, 1, 1);
float SpecularIntensity = 0.5;

8
The output adds a normal so that the reflection vector can be computed in the pixel shader. A world position lets us compute the view vector to the camera:

struct VertexShaderInput
{
    float4 Position : SV_POSITION;
    float3 Normal : NORMAL;
};

struct VertexShaderOutput
{
    float4 Position : SV_POSITION;
    float4 Color : COLOR;
    float3 Normal : NORMAL;
    float4 WorldPosition : POSITIONT;
};

9
The same computations as the diffuse shader, but we store the normal and the transformed world position in the output:

VertexShaderOutput VertexShaderFunction(VertexShaderInput input)
{
    VertexShaderOutput output;
    float4 worldPosition = mul(input.Position, World);
    output.WorldPosition = worldPosition;
    float4 viewPosition = mul(worldPosition, View);
    output.Position = mul(viewPosition, Projection);

    float3 normal = normalize(mul(input.Normal, (float3x3)WorldInverseTranspose));
    float lightIntensity = dot(normal, normalize(DiffuseLightDirection));
    output.Color = saturate(DiffuseColor * DiffuseIntensity * lightIntensity);
    output.Normal = normal;
    return output;
}
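Transforming the normal by WorldInverseTranspose instead of World matters whenever the world matrix contains a non-uniform scale. A quick numeric check (Python, illustrative only; not part of the shader) shows why:

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

# World matrix with a non-uniform scale: x stretched by 2
def world(v):
    return [2.0 * v[0], v[1], v[2]]

# (World^-1)^T for that matrix: diag(0.5, 1, 1)
def world_inverse_transpose(v):
    return [0.5 * v[0], v[1], v[2]]

tangent = [1.0, 1.0, 0.0]   # lies in the surface
normal = [1.0, -1.0, 0.0]   # perpendicular to the tangent

wt = world(tangent)
print(dot(wt, world(normal)))                    # 3.0: no longer perpendicular
print(dot(wt, world_inverse_transpose(normal)))  # 0.0: perpendicularity preserved
```

Transforming the normal by the world matrix itself leaves it tilted relative to the stretched surface; the inverse transpose keeps it perpendicular.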

10
Here we finally have a real computation because we need to use the pixel normal (interpolated from the vertex normals) in combination with the view vector. The technique is the same:

float4 PixelShaderFunction(VertexShaderOutput input) : SV_Target
{
    float3 light = normalize(DiffuseLightDirection);
    float3 normal = normalize(input.Normal);
    float3 reflect = normalize(2 * dot(light, normal) * normal - light);
    // View vector from the surface point to the camera
    float3 view = normalize(Camera - (float3)input.WorldPosition);
    float dotProduct = dot(reflect, view);

    float4 specular = (8 + Shininess) / (8 * PI) * SpecularIntensity *
        SpecularColor * max(pow(dotProduct, Shininess), 0) * length(input.Color);

    return saturate(input.Color + AmbientColor * AmbientIntensity + specular);
}
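The specular math can be checked numerically. This Python sketch (illustrative only; `specular_factor` is a hypothetical helper mirroring the shader's reflection vector and the (8 + m)/(8π) normalization) shows the highlight peaking along the mirror direction:

```python
import math

def normalize(v):
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def specular_factor(light, normal, view, shininess):
    l, n, v = normalize(light), normalize(normal), normalize(view)
    # r = 2 (l . n) n - l: the mirror reflection of the light direction
    r = normalize([2 * dot(l, n) * ni - li for ni, li in zip(n, l)])
    # (8 + m) / (8 pi) keeps total reflected energy roughly constant
    # as increasing shininess m narrows the highlight
    return (8 + shininess) / (8 * math.pi) * max(dot(r, v), 0.0) ** shininess

# View lined up with the mirror direction: maximum highlight
peak = specular_factor([0, 1, 1], [0, 0, 1], [0, -1, 1], shininess=32)
# View 90 degrees off the mirror direction: no highlight at all
off = specular_factor([0, 1, 1], [0, 0, 1], [0, 1, 1], shininess=32)
print(peak > off)  # True
```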

11
Point lights model omni lights at a specific position
- They generally attenuate (get dimmer) over distance and have a maximum range
- DirectX has constant, linear, and quadratic attenuation
- You can choose attenuation levels through shaders
- They are more computationally expensive than directional lights because a light vector has to be computed for every pixel
- It is possible to implement point lights in a deferred shader, lighting only those pixels that actually get used
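The attenuation can be sketched numerically. This Python helper (hypothetical, mirroring the quadratic falloff this deck's point-light shader uses) fades the intensity from 1 at the light to 0 at the radius:

```python
def point_light_intensity(distance, light_radius):
    # Smooth quadratic falloff: saturate clamps distance/radius to [0, 1],
    # then (1 - t)^2 dims the light toward zero at the maximum range
    t = min(max(distance / light_radius, 0.0), 1.0)
    return (1.0 - t) ** 2

print(point_light_intensity(0, 50))   # 1.0  (at the light)
print(point_light_intensity(25, 50))  # 0.25 (halfway to the radius)
print(point_light_intensity(80, 50))  # 0.0  (beyond the radius)
```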

12
We add a light position:

float4x4 World;
float4x4 View;
float4x4 Projection;
float4x4 WorldInverseTranspose;
float3 LightPosition;
float3 Camera;

static const float PI = 3.14159265f;

float4 AmbientColor = float4(1, 1, 1, 1);
float AmbientIntensity = 0.1f;

float LightRadius = 50;
float4 DiffuseColor = float4(1, 1, 1, 1);
float DiffuseIntensity = 0.7;

float Shininess;
float4 SpecularColor = float4(1, 1, 1, 1);
float SpecularIntensity = 0.5f;

13
We no longer need a color in the output. We keep the world position of the fragment so that the vector to the camera can be computed there:

struct VertexShaderInput
{
    float4 Position : SV_POSITION;
    float3 Normal : NORMAL;
};

struct VertexShaderOutput
{
    float4 Position : SV_POSITION;
    float4 WorldPosition : POSITIONT;
    float3 Normal : NORMAL;
};

14
We compute the normal and the world position:

VertexShaderOutput VertexShaderFunction(VertexShaderInput input)
{
    VertexShaderOutput output;
    float4 worldPosition = mul(input.Position, World);
    output.WorldPosition = worldPosition;
    float4 viewPosition = mul(worldPosition, View);
    output.Position = mul(viewPosition, Projection);

    float3 normal = normalize(mul(input.Normal, (float3x3)WorldInverseTranspose));
    output.Normal = normal;
    return output;
}

15
Lots of junk in here: the pixel shader now computes attenuation, diffuse, and specular all at once:

float4 PixelShaderFunction(VertexShaderOutput input) : SV_Target
{
    float3 normal = normalize(input.Normal);
    float3 lightDirection = LightPosition - (float3)input.WorldPosition;
    // Quadratic falloff from 1 at the light to 0 at LightRadius
    float intensity = pow(1.0f - saturate(length(lightDirection) / LightRadius), 2);
    lightDirection = normalize(lightDirection); // normalize after taking the length
    float3 view = normalize(Camera - (float3)input.WorldPosition);

    float lightIntensity = dot(normal, lightDirection);
    float4 diffuseColor = intensity * DiffuseIntensity * DiffuseColor *
        saturate(lightIntensity);
    float3 reflect = normalize(2 * lightIntensity * normal - lightDirection);
    float dotProduct = dot(reflect, view);

    float4 specular = intensity * (8 + Shininess) / (8 * PI) * SpecularIntensity *
        SpecularColor * max(pow(dotProduct, Shininess), 0) * length(diffuseColor);

    return saturate(diffuseColor + AmbientColor * AmbientIntensity + specular);
}

18
The bidirectional reflectance distribution function (BRDF) describes the ratio of outgoing radiance to incoming irradiance
This function changes based on:
- Wavelength
- Angle of light to surface
- Angle of viewer from surface
For point or directional lights, we do not need differentials and can write the BRDF:

f(l, v) = L_o(v) / (E_L cos θ_i)

19
We've been talking about lighting models: Lambertian, specular, etc. A BRDF is an attempt to model the physics slightly better. A big difference is that different wavelengths are absorbed and reflected differently by different materials. Rendering models in real time with (more) accurate BRDFs is still an open research problem.

20
They also have global lighting (shadows and reflections). Image taken from www.kevinbeason.com

21
The BRDF is supposed to account for all the light interactions we discussed in Chapter 5 (reflection and refraction). We can see the similarity to the lighting equation from Chapter 5, now with a BRDF:

L_o(v) = Σ_k f(l_k, v) ⊗ E_L_k cos θ_i_k

22
If the subsurface scattering effects are significant, the size of the pixel may matter. Then a bidirectional surface scattering reflectance distribution function (BSSRDF) is needed. If the surface characteristics change in different areas, you need a spatially varying BRDF. And so on…

23
Helmholtz reciprocity: f(l, v) = f(v, l)
Conservation of energy: outgoing energy cannot be greater than incoming energy
The simplest BRDF is Lambertian shading
We assume that energy is scattered equally in all directions
Integrating the cosine factor over the hemisphere gives a factor of π
Dividing by π gives us exactly what we saw before:

f(l, v) = c_diff / π
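The factor of π comes from integrating the cosine over the hemisphere, which can be verified numerically (Python, illustrative only):

```python
import math

# Integrate the cosine factor over the hemisphere with the midpoint rule:
#   integral of cos(theta) * sin(theta) dtheta dphi,
#   theta in [0, pi/2], phi in [0, 2*pi]
steps = 2000
dtheta = (math.pi / 2) / steps
total = 0.0
for i in range(steps):
    theta = (i + 0.5) * dtheta
    total += math.cos(theta) * math.sin(theta) * dtheta
total *= 2 * math.pi  # the phi integral contributes 2*pi

print(abs(total - math.pi) < 1e-3)  # True: the integral is pi
```

This is why dividing the diffuse color by π conserves energy.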

25
We'll start with our specular shader for directional light and add textures to it

26
The texture for the ship is below:

27
We add a texture variable called ModelTexture. We also add a SamplerState structure that specifies how to filter the texture:

Texture2D ModelTexture;

SamplerState ModelTextureSampler
{
    Filter = MIN_MAG_MIP_LINEAR;
    AddressU = Clamp;
    AddressV = Clamp;
};

28
We add a texture coordinate to the input and the output of the vertex shader:

struct VertexShaderInput
{
    float4 Position : SV_POSITION;
    float3 Normal : NORMAL;
    float2 Texture : TEXCOORD;
};

struct VertexShaderOutput
{
    float4 Position : SV_POSITION;
    float4 Color : COLOR;
    float3 Normal : NORMAL;
    float4 WorldPosition : POSITIONT;
    float2 Texture : TEXCOORD;
};

29
Almost nothing changes here except that we copy the input texture coordinate into the output:

VertexShaderOutput VertexShaderFunction(VertexShaderInput input)
{
    VertexShaderOutput output;
    float4 worldPosition = mul(input.Position, World);
    output.WorldPosition = worldPosition;
    float4 viewPosition = mul(worldPosition, View);
    output.Position = mul(viewPosition, Projection);

    float3 normal = normalize(mul(input.Normal, (float3x3)WorldInverseTranspose));
    float lightIntensity = dot(normal, normalize(DiffuseLightDirection));
    output.Color = saturate(DiffuseColor * DiffuseIntensity * lightIntensity);
    output.Normal = normal;
    output.Texture = input.Texture;
    return output;
}

30
We have to pull the color from the texture and set its alpha to 1, then scale the components of the lit color by the texture color:

float4 PixelShaderFunction(VertexShaderOutput input) : SV_Target
{
    float3 light = normalize(DiffuseLightDirection);
    float3 normal = normalize(input.Normal);
    float3 reflect = normalize(2 * dot(light, normal) * normal - light);
    // View vector from the surface point to the camera
    float3 view = normalize(Camera - (float3)input.WorldPosition);
    float dotProduct = dot(reflect, view);

    float4 specular = (8 + Shininess) / (8 * PI) * SpecularIntensity *
        SpecularColor * max(pow(dotProduct, Shininess), 0) * length(input.Color);

    float4 textureColor = ModelTexture.Sample(ModelTextureSampler, input.Texture);
    textureColor.a = 1;

    return saturate(textureColor * input.Color + AmbientColor * AmbientIntensity + specular);
}
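The modulation step can be checked numerically. This Python sketch (hypothetical `shade` helper, not the HLSL) mirrors forcing the texel's alpha to 1, scaling the lit color by the texel, and saturating the result:

```python
def saturate(c):
    # Clamp every component to [0, 1], like HLSL's saturate
    return tuple(min(max(x, 0.0), 1.0) for x in c)

def shade(texture_color, lighting_color, ambient, specular):
    # Force the texture's alpha to 1, then modulate lighting by the texel
    r, g, b, _ = texture_color
    tex = (r, g, b, 1.0)
    return saturate(tuple(t * l + a + s for t, l, a, s in
                          zip(tex, lighting_color, ambient, specular)))

out = shade((0.5, 0.5, 0.5, 0.0),   # mid-gray texel, alpha ignored
            (1.0, 0.8, 0.6, 1.0),   # diffuse lighting color
            (0.1, 0.1, 0.1, 0.0),   # ambient contribution
            (0.0, 0.0, 0.0, 0.0))   # no specular here
print(out)  # first component is 0.5 * 1.0 + 0.1, i.e. about 0.6; alpha is 1.0
```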

31
To use a texture, we naturally have to load a texture. We also have to set it on the effect, but as a resource, not as a value:

texture = Content.Load<Texture2D>("ShipTexture");

effect.Parameters["ModelTexture"].SetResource(texture);

32
It's easiest to do bump mapping in SharpDX using a normal map. Of course, a normal map is hard to create by hand. What's more common is to create a height map and then use a tool to generate a normal map from it. xNormal is a free utility that does this: http://www.xnormal.net/Downloads.aspx

33
The conversion from a grayscale height map to a normal map looks like this
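A minimal Python sketch of that conversion (the `height_to_normal` helper is hypothetical; it uses central differences over the height field, the same idea tools like xNormal apply per texel):

```python
import math

def height_to_normal(height, x, y, strength=1.0):
    # Central differences approximate the slope of the height field;
    # the (unnormalized) normal is (-dh/dx, -dh/dy, 1)
    h, w = len(height), len(height[0])
    dx = (height[y][min(x + 1, w - 1)] - height[y][max(x - 1, 0)]) * strength
    dy = (height[min(y + 1, h - 1)][x] - height[max(y - 1, 0)][x]) * strength
    n = math.sqrt(dx * dx + dy * dy + 1.0)
    return (-dx / n, -dy / n, 1.0 / n)

# A flat height map gives the straight-up normal; any slope tilts it
flat = [[0.5] * 4 for _ in range(4)]
print(height_to_normal(flat, 1, 1))
```

`strength` scales how strongly height differences tilt the normal; a real tool also remaps each component from [-1, 1] into [0, 255] to store it in an RGB image.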

34
We have a normal to a surface, but there are also tangent directions
- We call these the tangent and the binormal
- Apparently serious mathematicians think it should be called the bitangent
- The binormal is tangent to the surface and orthogonal to the other tangent
- We distort the normal with weighted sums of the tangent and binormal (stored in our normal map)
(Figure: the normal, tangent, and binormal vectors at a surface point)
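The weighted-sum distortion can be sketched numerically (hypothetical `perturb_normal` helper; this is the standard tangent-space transform, not code from the deck):

```python
def perturb_normal(tangent_space_normal, tangent, binormal, normal):
    # Weighted sum: the sampled (x, y) components bend the surface normal
    # along the tangent and binormal; z scales the original normal
    tx, ty, tz = tangent_space_normal
    return tuple(tx * t + ty * b + tz * n
                 for t, b, n in zip(tangent, binormal, normal))

# The "flat" normal-map value (0, 0, 1) leaves the surface normal unchanged
print(perturb_normal((0, 0, 1), (1, 0, 0), (0, 1, 0), (0, 0, 1)))  # (0, 0, 1)
```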

36
- Choosing BRDFs
- Implementing BRDFs
- Image-based approaches to sampling BRDFs

37
- Finish reading Chapter 7
- Finish Project 2: due tonight by midnight
