Week 5 - Friday CS361.

1 Week 5 - Friday CS361

2 Last time
What did we talk about last time?
- Quaternions
- Vertex blending
- Morphing

3 Questions?

4 Project 2

5 Morphing
Morphing is a technique for interpolating between two complete 3D models. It has two problems:
- Vertex correspondence: what if there is not a 1-to-1 correspondence between vertices?
- Interpolation: how do we combine the two models?

6 Morphing continued
We're going to ignore the correspondence problem (because it's really hard). If there's a 1-to-1 correspondence, we use a parameter s ∈ [0,1] to indicate where we are between the models and then find the new location m based on the two locations p0 and p1: m = (1 - s)p0 + s·p1. Morph targets are another technique that adds weighted poses to a neutral model.
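For the 1-to-1 case, the interpolation step can be sketched in plain Python (the vertex lists and the `morph` function name here are hypothetical illustrations, not the course's actual code):

```python
def morph(p0, p1, s):
    """Linearly interpolate each corresponding vertex pair:
    m = (1 - s) * p0 + s * p1, for s in [0, 1]."""
    return [tuple((1 - s) * a + s * b for a, b in zip(v0, v1))
            for v0, v1 in zip(p0, p1)]

# Two corresponding triangles (same vertex count, same ordering)
model_a = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
model_b = [(0.0, 0.0, 2.0), (2.0, 0.0, 0.0), (0.0, 2.0, 0.0)]

halfway = morph(model_a, model_b, 0.5)
# s = 0 reproduces model_a; s = 1 reproduces model_b
```

Note that this only works because the two vertex lists line up one-to-one; the hard correspondence problem is exactly what establishes that alignment.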

7 Projections
Finally, we deal with the issue of projecting points into view space. Since we only have a 2D screen, we need to map everything to x and y coordinates. Like other transforms, we can accomplish projection with a 4 × 4 matrix. Unlike affine transforms, projection transforms can affect the w component in homogeneous notation.

8 Orthographic projections
An orthographic projection maintains the property that parallel lines are still parallel after projection. The most basic orthographic projection matrix simply removes all the z values. This projection is not ideal because z values are lost:
- Things behind the camera appear in front
- z-buffer algorithms don't work

9 Canonical view volume
To maintain relative depths and allow for clipping, we usually set up a canonical view volume based on (l, r, b, t, n, f). These letters refer to the six bounding planes of the cube:
- Left and Right
- Bottom and Top
- Near and Far
The (OpenGL) projection matrix translates all points and scales them into this canonical view volume.

10 Stupid MonoGame
OpenGL normalizes to a canonical view volume of [-1,1] in x, [-1,1] in y, and [-1,1] in z. Just to be silly, MonoGame normalizes to a canonical view volume of [-1,1] in x, [-1,1] in y, and [0,1] in z. Its projection matrix differs accordingly in the z row.
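As a sketch of the difference, here are the two orthographic mappings in plain Python (row-major 4 × 4 lists; sign and handedness conventions are simplified here, so treat this as illustrative rather than as either API's exact matrix):

```python
def ortho_matrix(l, r, b, t, n, f, z_zero_to_one=False):
    """Scale and translate an axis-aligned box (l..r, b..t, n..f)
    into the canonical view volume.  Illustrative only: z in [n, f]
    is mapped directly to [-1, 1] (OpenGL-style) or [0, 1]
    (MonoGame-style), ignoring handedness flips."""
    if z_zero_to_one:                      # MonoGame-style z range
        sz, tz = 1.0 / (f - n), -n / (f - n)
    else:                                  # OpenGL-style z range
        sz, tz = 2.0 / (f - n), -(f + n) / (f - n)
    return [
        [2.0 / (r - l), 0.0, 0.0, -(r + l) / (r - l)],
        [0.0, 2.0 / (t - b), 0.0, -(t + b) / (t - b)],
        [0.0, 0.0, sz, tz],
        [0.0, 0.0, 0.0, 1.0],
    ]

def transform(m, p):
    """Apply a 4x4 matrix to the point (x, y, z, 1)."""
    v = (p[0], p[1], p[2], 1.0)
    return tuple(sum(row[i] * v[i] for i in range(4)) for row in m)

gl = ortho_matrix(-1, 1, -1, 1, 1, 10)
mg = ortho_matrix(-1, 1, -1, 1, 1, 10, z_zero_to_one=True)
# A point on the near plane lands at z = -1 (OpenGL) but z = 0 (MonoGame)
```

The x and y rows are identical; only the z row's scale and translation change between the two conventions.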

11 Perspective projection
A perspective projection does not preserve parallel lines. Lines that are farther from the camera will appear smaller. Thus, a view frustum must be normalized to a canonical view volume. Because points actually move (in x and y) based on their z distance, there is a distorting term in the w row of the projection matrix.

12 Perspective projection matrix
The MonoGame perspective projection matrix again differs from the OpenGL one because it only uses [0,1] for z.

13 MonoGame Projection Examples

14 Student Lecture: Light and Materials

15 Light

16 Light
"[Light] …travels so fast that it takes most races thousands of years to realise that it travels at all…" (Douglas Adams)
Light is one of the most complex phenomena in the universe: there are quantum effects and its dual wave/particle nature. We will constantly be approximating the effect of light, since computing its real behavior is virtually impossible.

17 Break it down
We will consider three processes in lighting a scene:
- Emitting light: from the sun, light bulbs, or other sources
- Interaction of light: light is absorbed by and scatters off objects in a scene
- Detection by a sensor: a human eye (or a robot eye), a camera, or a piece of film senses the light
We have to give at least cursory attention to each process to get realistic rendering.

18 Light sources
One of the easiest light sources to model is a directional light, such as the sun. With directional lights, all the light travels in the same direction, which we can model with a light vector l:
- We assume that l is a unit vector
- l is defined in the opposite direction from the one the light is traveling

19 Intensity of illumination
Besides direction, we need to know the amount of light. Radiometry is the science of measuring light, and we'll talk more about it in two weeks. Irradiance is the light's power passing through a unit-area surface perpendicular to l. Light can be colored by using RGB components.

20 Surface irradiance
Most light is not perpendicular to your surface. The surface irradiance is the perpendicular irradiance times cos θ, where θ is the angle between l and the surface normal n. This is why l points opposite the direction the light is flowing (so that we don't have to negate it). Also, we clamp cos θ to [0,1] (no negative values).

21 Additive irradiance
Real light comes from many different directions, and the final effect of irradiance is additive: just sum up all the individual light contributions. Although we use RGB for light, there is not necessarily a maximum value. Light is perceived logarithmically by humans. High dynamic range displays and floating-point color models allow a better expression of light energy.

22 Material

23 Material
Once we know how much light we're dealing with and what direction it's coming from, the material it hits impacts the final effect a great deal. These impacts are of two kinds:
- Scattering
- Absorption

24 Scattering
Scattering is caused by an optical discontinuity:
- A difference in structure
- A change in density
Scattering does not change the amount of light, only its direction. There are two types of scattering:
- Refraction (or transmission)
- Reflection

25 Refraction
With refraction (or transmission) in (partially) transparent objects, the light continues through the object and may light other objects. There are light-bending effects, and the Z-buffer algorithm doesn't work anymore. We won't deal with that now.

26 Specular and diffuse components
Light that is reflected will have a different direction and color than light that was transmitted into the surface, partially absorbed, and scattered back out. We simplify by dividing into two terms:
- The specular term (the reflected light)
- The diffuse term (the re-transmitted light)

27 Exitance
Illumination reaching a surface is irradiance (E). Illumination leaving a surface is exitance (M). Although our perception of light is logarithmic, light-matter interaction is linear: double the irradiance and you'll double the exitance. The ratio between exitance and irradiance is essentially the surface color that you see:
Surface color c = specular color + diffuse color

28 Modeling specular color
We will often assume that diffuse light has no directionality. Specular light, however, bounces off a surface and spreads out less if the surface is smoother. Color, texture, and the smoothness parameter are not absolute; we may change them depending on how far we are from the object.

29 Sensors

30 Don’t believe your eyes
We are going to describe mathematical models of sensors. But how did humans investigate the nature of sensors in the first place? Can you trust your own sensors? Consider the following slide.

31 [Mach banding demonstration image]

32 Mach banding
That slide is an example of Mach banding. In Mach banding, a lighter color on the edge of a darker color will appear to grow lighter as you get close to the border between them; the darker color will do the reverse. It's part of our brain's edge-detection algorithm.

33 Real sensors
In general, sensors are made up of many tiny sensors:
- Rods and cones in the eye
- Photodiodes attached to a CCD in a digital camera
- Dye particles in traditional film
Typically, an aperture restricts the directions from which the light can come. Then, a lens focuses the light onto the sensor elements.

34 Radiance
Irradiance sensors can't produce an image because they average over all directions. A lens plus an aperture makes a sensor directionally specific. Consequently, the sensors measure radiance (L), the density of light per flow area and incoming direction.

35 Idealized sensors
In a rendering system, radiance is computed rather than measured. A radiance sample for each imaginary sensor element is taken along a ray that goes through the point representing the sensor and point p, the center of projection for the perspective transform. The sample is computed by using a shading equation along the view ray v.

36 Shading

37 Shading equations
After all this hoopla, we need a mathematical equation that says what the color (radiance) at a particular pixel is. There are many equations to use, and people still do research on how to make them better. Remember, these are all rule-of-thumb approximations and are only distantly related to physical law.

38 Lambertian shading
Diffuse exitance: M_diff = c_diff ⊗ E_L cos θ
Lambertian (diffuse) shading assumes that outgoing radiance is (linearly) proportional to irradiance. Because diffuse radiance is assumed to be the same in all directions, we divide by π (explained later).
Final Lambertian radiance: L_diff = c_diff ⊗ E_L (cos θ)/π
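A minimal per-channel sketch of Lambertian shading in Python (function name and sample values are hypothetical; cos θ is clamped as on the surface-irradiance slide):

```python
import math

def lambertian_radiance(c_diff, e_l, cos_theta):
    """L_diff = c_diff * E_L * max(cos(theta), 0) / pi, per RGB channel.
    The division by pi makes the (direction-independent) radiance
    integrate over the hemisphere to the diffuse exitance."""
    k = max(cos_theta, 0.0) / math.pi
    return tuple(c * e * k for c, e in zip(c_diff, e_l))

# A mid-gray surface lit head-on (cos theta = 1):
lambertian_radiance((0.5, 0.5, 0.5), (3.0, 3.0, 3.0), 1.0)
# each channel: 0.5 * 3.0 / pi, roughly 0.477
```

Note the clamp: a light below the surface (negative cos θ) contributes nothing rather than subtracting radiance.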

39 Upcoming

40 Next time…
- Shading: Lambertian, Gouraud, and Phong
- Anti-aliasing

41 Reminders
- Keep working on Project 2, due Friday, March 17
- Keep reading Chapter 5
- Exam 1 is next Friday in class: start reviewing everything up to today

