Lecture 28: Photometric Stereo


1 Lecture 28: Photometric Stereo
CS4670/5760: Computer Vision Kavita Bala Lecture 28: Photometric Stereo Thanks to Scott Wehrwein

2 Announcements
PA 3 due at 1pm on Monday
PA 4 out on Monday
HW 2 out on the weekend
Next week: MVS, SfM

3 Last Time: Two-View Stereo

4 Last Time: Two-View Stereo
Key Idea: use feature motion to understand shape

5 Today: Photometric Stereo
Key Idea: use pixel brightness to understand shape

7 Photometric Stereo
What results can you get?
[Figure: input image (1 of 12); normals (RGB colormap); normals (vectors); shaded 3D rendering; textured 3D rendering]

8

9 Modeling Image Formation
Now we need to reason about:
How light interacts with the scene
How a pixel value is related to light energy in the world
Track a "ray" of light all the way from light source to the sensor

10 Directional Lighting Key property: all rays are parallel
Equivalent to an infinitely distant point source

11 Lambertian Reflectance
Image intensity I, surface normal N, light direction L:
I ∝ N · L, i.e. image intensity is proportional to cos(angle between N and L).
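A minimal NumPy sketch of this relationship (the example vectors below are made up): intensity falls off as the cosine of the angle between N and L, clipped at zero when the surface faces away from the light.

```python
import numpy as np

def lambertian_intensity(normal, light):
    """Image intensity under the Lambertian model: I = max(N . L, 0).
    A negative dot product means the surface faces away from the light."""
    n = np.asarray(normal, float)
    l = np.asarray(light, float)
    n, l = n / np.linalg.norm(n), l / np.linalg.norm(l)
    return max(float(n @ l), 0.0)

# Example: light from straight above, surface tilted 60 degrees away.
N = [np.sin(np.pi / 3), 0.0, np.cos(np.pi / 3)]
L = [0.0, 0.0, 1.0]
print(lambertian_intensity(N, L))  # ~0.5 = cos(60 degrees)
```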

12 Materials - Three Forms
There are several types of surfaces that are assumed to make up the world.
Ideal diffuse (Lambertian): similar to a diffuse emitter, it sends incident light equally in all outgoing directions; when light hits the surface it is reflected uniformly over the hemisphere above the surface.
Ideal specular (perfect mirror): light is reflected exactly in the mirror direction of the incoming light.
Directional diffuse (almost all materials in the world): incoming light is reflected into the hemisphere to different extents; if the object is more glossy, more of the incoming energy reflects close to the mirror direction, and if it is more diffuse the reflection is more uniform.
© Kavita Bala, Computer Science, Cornell University

13 Reflectance—Three Forms
Why spheres?!
[Figure: three spheres rendered with ideal diffuse (Lambertian), ideal specular, and directional diffuse reflectance]
© Kavita Bala, Computer Science, Cornell University

14 Ideal Diffuse Reflection
Characteristic of multiple scattering materials
An idealization but reasonable for matte surfaces
Basis of most radiosity methods
© Kavita Bala, Computer Science, Cornell University

15 Ideal Diffuse: Lambert's Law
© Kavita Bala, Computer Science, Cornell University

16 Ideal Specular Reflection
Calculated from Fresnel's equations
Exact for polished surfaces
Basis of early ray-tracing methods
© Kavita Bala, Computer Science, Cornell University

17 Lambertian Reflectance
Reflected energy is proportional to cosine of angle between L and N (incoming)
Measured intensity is viewpoint-independent (outgoing)

18 Lambertian Reflectance: Incoming
Reflected energy is proportional to cosine of angle between L and N

20 Lambertian Reflectance: Incoming
Reflected energy is proportional to cosine of angle between L and N Light hitting surface is proportional to the cosine

21 Lambertian Reflectance: Outgoing
Radiance (what we see) is viewpoint-independent

22 Lambertian Reflectance: Outgoing
Radiance (what the eye sees) is viewpoint-independent.
Emission rate of photons: proportional to B0 cos(theta).
Radiance = (photons / second) / (area x steradian).
Looking straight down: B0 / dA.
Looking at angle theta: B0 cos(theta) / (dA cos(theta)) = B0 / dA, the same.

23 Lambertian Reflectance: Outgoing
Measured intensity is viewpoint-independent

24 Lambertian Reflectance: Outgoing
Measured intensity is viewpoint-independent: the emitted energy falls off as cos(theta), but so does the foreshortened area A cos(theta), so the radiance (what the eye sees) is the same from every viewing direction.

25 Image Formation Model: Final
Diffuse albedo: what fraction of incoming light is reflected? Introduce a scale factor.
Light intensity: how much light is arriving? Compensate with camera exposure (global scale factor).
Camera response function: assume pixel value is linearly proportional to incoming energy (perform radiometric calibration if not).
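Putting these pieces together, a hedged sketch of the final model for a single pixel under a linear (radiometrically calibrated) camera; the parameter names albedo, light_intensity, and exposure are illustrative, not from the slides.

```python
import numpy as np

def pixel_value(albedo, normal, light_dir, light_intensity=1.0, exposure=1.0):
    """value = exposure * light_intensity * albedo * max(N . L, 0),
    assuming pixel value is linearly proportional to incoming energy."""
    n = np.asarray(normal, float)
    l = np.asarray(light_dir, float)
    n, l = n / np.linalg.norm(n), l / np.linalg.norm(l)
    shading = max(float(n @ l), 0.0)
    return float(np.clip(exposure * light_intensity * albedo * shading, 0.0, 1.0))

print(pixel_value(albedo=0.8, normal=[0, 0, 1], light_dir=[1, 1, 1]))  # ~0.46
```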

26 A Single Image: Shape from Shading
Assume the diffuse albedo is 1 for now. What can we measure from one image? arccos(I) is the angle between N and L.
Add assumptions: a few known normals (e.g. silhouettes), smoothness of normals.
In practice, SFS doesn't work very well: the assumptions are too restrictive, and there is too much ambiguity in nontrivial scenes.

27 Multiple Images: Photometric Stereo
[Figure: surface point with normal N and view direction V, imaged under several known light directions]
Write this as a matrix equation: stack the observed pixel intensities into I and the light directions into L, and solve for G = albedo * N.

28 Solving the Equations

29 Solving the Equations When is L nonsingular (invertible)?
L is nonsingular (invertible) when there are >= 3 light directions that are linearly independent, i.e. the light direction vectors do not all lie in a plane.
What if we have more than one pixel? Stack them all into one big system.
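A minimal sketch of the three-light case for a single pixel (NumPy; the light directions and the "true" pixel values are made up). As long as the rows of L are linearly independent, L is invertible and G = albedo * N is recovered exactly, after which the albedo is ||G|| and the normal is G / ||G||.

```python
import numpy as np

# Three known, non-coplanar unit light directions (one per row) -- made up.
L = np.array([[0.0, 0.0, 1.0],
              [0.8, 0.0, 0.6],
              [0.0, 0.8, 0.6]])

# Simulate the three observed intensities of one pixel.
true_N = np.array([0.3, 0.2, 1.0]) / np.linalg.norm([0.3, 0.2, 1.0])
true_albedo = 0.8
I = L @ (true_albedo * true_N)

G = np.linalg.solve(L, I)     # exact solve: G = albedo * N
albedo, N = np.linalg.norm(G), G / np.linalg.norm(G)
print(albedo, N)              # recovers 0.8 and true_N
```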

30 More than Three Lights Solve using least squares (normal equations):
Equivalently, use the SVD. Given G, solve for N and the albedo as before.

31 More than one pixel
Previously, for a single pixel: I = G * L, where I is 1 x #images, G = albedo * N is 1 x 3, and L is 3 x #images.

32 More than one pixel Stack all pixels into one system: Solve as before.
I = G * L, where I is p x #images, G (one row of albedo * N per pixel) is p x 3, and L is 3 x #images.
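A hedged sketch of the stacked least-squares solve (NumPy). Here the light matrix is stored as lights with one unit direction per row, i.e. the transpose of the 3 x #images L above; the function and variable names are illustrative.

```python
import numpy as np

def photometric_stereo(I, lights):
    """I: (num_pixels, num_images) observed intensities.
    lights: (num_images, 3) unit light directions, one per image.
    Returns per-pixel albedo (num_pixels,) and unit normals (num_pixels, 3)."""
    # Least-squares solve of  lights @ G.T = I.T  for G = albedo * N.
    Gt, *_ = np.linalg.lstsq(lights, I.T, rcond=None)
    G = Gt.T                                         # (num_pixels, 3)
    albedo = np.linalg.norm(G, axis=1)
    normals = G / np.maximum(albedo[:, None], 1e-8)  # avoid divide-by-zero
    return albedo, normals

# Tiny synthetic check: 4 lights, 2 pixels (all values made up).
lights = np.array([[0.0, 0.0, 1.0], [0.6, 0.0, 0.8],
                   [0.0, 0.6, 0.8], [-0.6, 0.0, 0.8]])
G_true = np.array([[0.1, 0.2, 0.9], [0.0, 0.0, 0.5]])
I = G_true @ lights.T
albedo, normals = photometric_stereo(I, lights)
print(albedo, normals)
```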

33 Depth Map from Normal Map
We now have a surface normal at each pixel, but how do we get depth?
Assume a smooth surface and orthographic projection. The vector V1 = (1, 0, z(x+1, y) - z(x, y)) between neighboring surface points lies in the surface, so N · V1 = 0; we get a similar equation for V2 = (0, 1, z(x, y+1) - z(x, y)).
Each normal gives us two linear constraints on z; compute the z values by solving a matrix equation.
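A hedged sketch of that linear system (dense NumPy for clarity; a real implementation would use a sparse solver, and the image-axis conventions here are assumptions). Each pixel contributes up to two equations, nz * (z(x+1, y) - z(x, y)) = -nx and nz * (z(x, y+1) - z(x, y)) = -ny, and least squares recovers the depth map up to an additive constant.

```python
import numpy as np

def depth_from_normals(normals):
    """normals: (H, W, 3) unit normals with the z-axis toward the camera.
    Returns a depth map z of shape (H, W), determined up to a constant."""
    H, W = normals.shape[:2]
    idx = lambda y, x: y * W + x                   # flatten pixel coordinates
    rows, rhs = [], []
    for y in range(H):
        for x in range(W):
            nx, ny, nz = normals[y, x]
            if x + 1 < W:    # N . (1, 0, z(x+1,y) - z(x,y)) = 0
                r = np.zeros(H * W)
                r[idx(y, x + 1)], r[idx(y, x)] = nz, -nz
                rows.append(r); rhs.append(-nx)
            if y + 1 < H:    # N . (0, 1, z(x,y+1) - z(x,y)) = 0
                r = np.zeros(H * W)
                r[idx(y + 1, x)], r[idx(y, x)] = nz, -nz
                rows.append(r); rhs.append(-ny)
    A, b = np.array(rows), np.array(rhs)
    z, *_ = np.linalg.lstsq(A, b, rcond=None)      # least-squares depth
    return z.reshape(H, W)
```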

34 Determining Light Directions
Trick: Place a mirror ball in the scene. The location of the highlight is determined by the light source direction.

35 Real-World HDR Lighting Environments
Funston Beach, Eucalyptus Grove, Grace Cathedral, Uffizi Gallery
Lighting Environments from the Light Probe Image Gallery: © Kavita Bala, Computer Science, Cornell University

36 Mirrored Sphere
© Kavita Bala, Computer Science, Cornell University

37 © Kavita Bala, Computer Science, Cornell University

38 Acquiring the Light Probe
© Kavita Bala, Computer Science, Cornell University

39 Assembling the Light Probe
© Kavita Bala, Computer Science, Cornell University

40 Extreme HDR Image Series
The f-number is the focal length in terms of the aperture (pupil) diameter: f/4 means the focal length is 4 times the pupil diameter, and f/16 gets less light than f/4.
1 sec f/4, 1/4 sec f/4, 1/30 sec f/4, 1/30 sec f/16, 1/250 sec f/16, 1/1000 sec f/16, 1/8000 sec f/16
© Kavita Bala, Computer Science, Cornell University

41 Extreme HDR Image Series sun closeup
1 sec f/4, 1/4 sec f/4, 1/30 sec f/4, 1/30 sec f/16, 1/250 sec f/16, 1/1000 sec f/16, 1/8000 sec f/16 (the only image that does not saturate!)
© Kavita Bala, Computer Science, Cornell University
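A quick sketch of why the series has to span so many settings: the light reaching the sensor scales roughly as shutter time divided by the square of the f-number, so the longest and shortest exposures above differ by a factor of about 128,000.

```python
# Relative exposure ~ shutter_time / f_number**2 (up to a constant factor).
settings = [(1.0, 4), (1 / 4, 4), (1 / 30, 4), (1 / 30, 16),
            (1 / 250, 16), (1 / 1000, 16), (1 / 8000, 16)]
exposures = [t / f ** 2 for t, f in settings]
print([round(e / exposures[-1]) for e in exposures])
# [128000, 32000, 4267, 267, 32, 8, 1]
```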

42 HDRI Sky Probe
© Kavita Bala, Computer Science, Cornell University

43 Determining Light Directions
For a perfect mirror, the light is reflected across N: the viewing direction R satisfies R = 2 (N · L) N - L.
So the light source direction is given by: L = 2 (N · R) N - R.
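A direct transcription of that formula (NumPy); here R is the unit direction from the surface point toward the camera and N is the unit normal at that point.

```python
import numpy as np

def light_direction(N, R):
    """Mirror reflection: L = 2 (N . R) N - R (all vectors unit length)."""
    return 2.0 * np.dot(N, R) * N - R
```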

44 Determining Light Directions
For a sphere with center C and highlight at point H:
R = direction of the camera from C.
Compute N: the normal at the highlight is N = (H - C) / ||H - C||.
Then apply the mirror relation from the previous slide to get L.
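Combining the two steps, a hedged sketch that goes from a highlight pixel on the mirror ball to a light direction. It assumes an orthographic view along R = (0, 0, 1) and works in image coordinates, with the ball's image center (cx, cy) and radius r measured in pixels; all names are illustrative.

```python
import numpy as np

def light_from_highlight(hx, hy, cx, cy, r):
    """Highlight pixel (hx, hy) on a mirror ball with image center (cx, cy)
    and radius r, viewed orthographically along R = (0, 0, 1)."""
    x, y = (hx - cx) / r, (hy - cy) / r
    # Surface normal at the highlight (z chosen so N is unit length).
    N = np.array([x, y, np.sqrt(max(1.0 - x * x - y * y, 0.0))])
    R = np.array([0.0, 0.0, 1.0])
    L = 2.0 * np.dot(N, R) * N - R          # reflect R about N
    return L / np.linalg.norm(L)

print(light_from_highlight(60, 50, 50, 50, 40))  # highlight right of center
```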

45 Results
from Athos Georghiades

46 Results
[Figure: input image (1 of 12); normals (RGB colormap); normals (vectors); shaded 3D rendering; textured 3D rendering]

47 Questions?


Download ppt "Lecture 28: Photometric Stereo"

Similar presentations


Ads by Google