Computer Vision CS 776 Spring 2014 Cameras & Photogrammetry 3 Prof. Alex Berg (Slide credits to many folks on individual slides)

Cameras & Photogrammetry 3 (title slide; the image-credit URLs on the original slide are truncated fragments in this transcript – they include a Gamma Scientific image and an integrating-sphere photograph)

Stories of light – Alex Berg 2012; Jensen & Buhler, SIGGRAPH 2002

Image formation
What determines the brightness of an image pixel?
–Light source properties
–Surface shape and orientation
–Surface reflectance properties
–Optics
–Exposure
–Sensor characteristics
Slide by L. Fei-Fei

Fundamental radiometric relation
L: radiance emitted from scene point P toward image point P'
E: irradiance falling on P' from the lens
What is the relationship between E and L?
(Figure after Szeliski: scene point P at distance z from a lens of diameter d, image point P' at distance f behind the lens, off-axis angle α.)

Fundamental radiometric relation
Image irradiance is linearly related to scene radiance.
Irradiance is proportional to the area of the lens aperture and inversely proportional to the squared distance between the lens and the image plane.
The irradiance falls off as the angle α between the viewing ray and the optical axis increases.
(Figure after Szeliski, same setup as above.)
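Putting these statements together gives the usual closed form of the relation. The equation did not survive the transcript; this is the standard textbook expression (e.g. Szeliski), with d the aperture diameter, f the focal distance, and α the off-axis angle:

E = L · (π/4) · (d/f)² · cos⁴α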

The interaction of light and surfaces
What happens when a light ray hits a point on an object?
–Some of the light gets absorbed – converted to other forms of energy (e.g., heat)
–Some gets transmitted through the object – possibly bent, through "refraction", or scattered inside the object (subsurface scattering)
–Some gets reflected – possibly in multiple directions at once
–Really complicated things can happen – e.g., fluorescence
Bidirectional reflectance distribution function (BRDF): how bright a surface appears when viewed from one direction when light falls on it from another.
Slide by Steve Seitz

BRDFs can be incredibly complicated… Slide by Svetlana Lazebnik

Computer Vision - A Modern Approach Set: Radiometry Slides by D.A. Forsyth
Radiometry
Questions:
–how "bright" will surfaces be?
–what is "brightness"?
Radiometry is about measuring light and the interactions between light and surfaces.
Core idea: think about light arriving at a surface – the set of directions from which light can reach any point is a hemisphere.
The simplest problems can be dealt with by reasoning about this hemisphere.

Computer Vision - A Modern Approach Set: Radiometry Slides by D.A. Forsyth
Lambert's wall

Computer Vision - A Modern Approach Set: Radiometry Slides by D.A. Forsyth More complex wall

Computer Vision - A Modern Approach Set: Radiometry Slides by D.A. Forsyth
Foreshortening
Principle: two sources that look the same to a receiver must have the same effect on the receiver.
Principle: two receivers that look the same to a source must receive the same amount of energy.
"Look the same" means they produce the same input hemisphere (or output hemisphere).
Reason: what else can a receiver know about a source but what appears on its input hemisphere? (ditto, swapping receiver and source)
Crucial consequence: a big source (resp. receiver), viewed at a glancing angle, must produce (resp. experience) the same effect as a small source (resp. receiver) viewed frontally.

Computer Vision - A Modern Approach Set: Radiometry Slides by D.A. Forsyth
Solid Angle
By analogy with angle (in radians), the solid angle subtended by a region at a point is the area the region projects onto a unit sphere centered at that point.
The solid angle subtended by a patch of area dA, and another useful expression in spherical coordinates, are given below.
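The slide's two expressions are missing from the transcript; the standard forms (as in Forsyth & Ponce) are: for a patch of area dA at distance r whose normal makes angle θ with the line of sight,

dω = (dA cos θ) / r²

and, writing directions in spherical coordinates (θ, φ),

dω = sin θ dθ dφ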

Computer Vision - A Modern Approach Set: Radiometry Slides by D.A. Forsyth
Measuring Light in Free Space
Desirable property: in a vacuum, the relevant unit should not go down along a straight line. How do we get a unit with this property?
Think about the power transferred from an infinitesimal source s to an infinitesimal receiver r. We have: total power leaving s toward r = total power arriving at r from s.
Also, the power arriving at r is proportional to:
–the solid angle subtended by s at r (because if s looked bigger from r, there'd be more)
–the foreshortened area of r (because a bigger r will collect more power)

Computer Vision - A Modern Approach Set: Radiometry Slides by D.A. Forsyth
Radiance
All this suggests that the light transferred from source to receiver should be measured as radiant power per unit foreshortened area per unit solid angle. This is radiance.
Units: watts per square meter per steradian (W m⁻² sr⁻¹)
Usually written as L(x, θ, φ): the radiance at point x in the direction (θ, φ).
Crucial property: in a vacuum, the radiance leaving p in the direction of q is the same as the radiance arriving at q from p – which is how we got to the unit.

Computer Vision - A Modern Approach Set: Radiometry Slides by D.A. Forsyth
Radiance is constant along straight lines
Write the power going from patch 1 to patch 2 in two ways: as power leaving 1, and as power arriving at 2. In a vacuum these must be the same, so the two radiances are equal (expressions below).
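The two power expressions are missing from the transcript; in the standard form (Forsyth & Ponce), for patches of area dA₁ and dA₂ a distance r apart, with θ₁ and θ₂ the angles between the line joining them and the respective surface normals:

Power 1→2, leaving 1:    L(x₁ → x₂) · (cos θ₁ dA₁) · (cos θ₂ dA₂ / r²)
Power 1→2, arriving at 2: L(x₂ ← x₁) · (cos θ₂ dA₂) · (cos θ₁ dA₁ / r²)

Equating the two gives L(x₁ → x₂) = L(x₂ ← x₁): radiance is constant along straight lines in a vacuum.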

Computer Vision - A Modern Approach Set: Radiometry Slides by D.A. Forsyth
Irradiance
How much light is arriving at a surface? The sensible unit is irradiance: incident power per unit area, not foreshortened. This is a function of incoming angle.
A surface experiencing radiance L(x, θ, φ) coming in through solid angle dω experiences an irradiance proportional to that radiance, foreshortened by the cosine of the incidence angle (expressions below).
Crucial property: the total power arriving at the surface is given by adding irradiance over all incoming angles – this is why it's a natural unit.
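The slide's expressions are missing; in the standard notation, the irradiance contributed by radiance L(x, θ, φ) arriving through solid angle dω at incidence angle θ is

dE = L(x, θ, φ) cos θ dω

and the total power per unit area is obtained by integrating over the incoming hemisphere Ω:

E(x) = ∫_Ω L(x, θ, φ) cos θ sin θ dθ dφ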

Computer Vision - A Modern Approach Set: Radiometry Slides by D.A. Forsyth
Light at surfaces
Many effects when light strikes a surface – it could be:
–absorbed
–transmitted (e.g., skin)
–reflected (e.g., a mirror)
–scattered (e.g., milk)
–or travel along the surface and leave at some other point (e.g., sweaty skin)
Assume that:
–surfaces don't fluoresce (e.g., scorpions and washing powder do)
–surfaces don't emit light (i.e., are cool)
–all the light leaving a point is due to light arriving at that point

Computer Vision - A Modern Approach Set: Radiometry Slides by D.A. Forsyth
The BRDF
Assuming that:
–surfaces don't fluoresce
–surfaces don't emit light (i.e., are cool)
–all the light leaving a point is due to light arriving at that point
we can model this situation with the Bidirectional Reflectance Distribution Function (BRDF): the ratio of the radiance in the outgoing direction to the incident irradiance.
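Written out (supplied here in the standard Forsyth & Ponce notation, since the slide's equation is missing), with (θᵢ, φᵢ) the incoming direction and (θₒ, φₒ) the outgoing direction:

ρ_bd(θᵢ, φᵢ, θₒ, φₒ) = L_o(x, θₒ, φₒ) / ( L_i(x, θᵢ, φᵢ) cos θᵢ dωᵢ )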

Computer Vision - A Modern Approach Set: Radiometry Slides by D.A. Forsyth
BRDF
Units: inverse steradians (sr⁻¹)
Symmetric in incoming and outgoing directions – this is the Helmholtz reciprocity principle.
Radiance leaving a surface in a particular direction:
–add contributions from every incoming direction (integral below)
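The corresponding integral, again given in its standard form because the slide equation is missing:

L_o(x, θₒ, φₒ) = ∫_Ω ρ_bd(θᵢ, φᵢ, θₒ, φₒ) L_i(x, θᵢ, φᵢ) cos θᵢ dωᵢ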

Computer Vision - A Modern Approach Set: Radiometry Slides by D.A. Forsyth
Suppressing Angles - Radiosity
In many situations, we do not really need angle coordinates
–e.g., cotton cloth, where the reflected light is not dependent on angle
The appropriate radiometric unit is radiosity:
–total power leaving a point on the surface, per unit area on the surface (W m⁻²)
–note that this is independent of direction
Radiosity from radiance?
–sum the radiance leaving the surface over all exit directions, multiplying by a cosine, because radiosity is per unit area, not per unit foreshortened area

Computer Vision - A Modern Approach Set: Radiometry Slides by D.A. Forsyth
Radiosity
Important relationship:
–the radiosity of a surface whose radiance is independent of angle (e.g., that cotton cloth) takes a particularly simple form (below).
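The missing relationship, reconstructed in its standard form: radiosity is the cosine-weighted integral of exitant radiance over the hemisphere,

B(x) = ∫_Ω L(x, θ, φ) cos θ sin θ dθ dφ

and if the radiance is independent of direction the angular integral evaluates to π, giving

B(x) = π L(x)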

Computer Vision - A Modern Approach Set: Radiometry Slides by D.A. Forsyth
Suppressing the angles in the BRDF
The BRDF is a very general notion
–some surfaces need it (underside of a CD; tiger's eye; etc.)
–very hard to measure: illuminate from one direction, view from another, repeat
–very unstable: minor surface damage can change the BRDF, e.g. ridges of oil left by contact with the skin can act as lenses
For many surfaces, light leaving the surface is largely independent of exit angle – surface roughness is one source of this property.

Computer Vision - A Modern Approach Set: Radiometry Slides by D.A. Forsyth
Directional hemispheric reflectance
Directional hemispheric reflectance (DHR):
–the fraction of the incident irradiance in a given direction that is reflected by the surface (whatever the direction of reflection)
–unitless; range is 0–1
Note that the DHR varies with incoming direction
–e.g., a ridged surface where left-facing ridges are absorbent and right-facing ridges reflect

Computer Vision - A Modern Approach Set: Radiometry Slides by D.A. Forsyth
Lambertian surfaces and albedo
For some surfaces, the DHR is independent of illumination direction too
–cotton cloth, carpets, matte paper, matte paints, etc.
For such surfaces, radiance leaving the surface is independent of angle.
These are called Lambertian surfaces (after the same Lambert) or ideal diffuse surfaces.
Use radiosity as the unit to describe light leaving the surface.
The DHR is often called diffuse reflectance, or albedo.
For a Lambertian surface, the BRDF is independent of angle, too. Useful fact (below).
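The "useful fact" did not survive the transcript; for a Lambertian surface it is the standard relation between the (constant) BRDF and the albedo ρ_d:

ρ_bd = ρ_d / π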

Computer Vision - A Modern Approach Set: Radiometry Slides by D.A. Forsyth
Specular surfaces
Another important class of surfaces is specular, or mirror-like.
–radiation arriving along a direction leaves along the specular direction (the incoming direction reflected about the normal)
–some fraction is absorbed, some reflected
–on real surfaces, energy usually goes into a lobe of directions
–one can write a BRDF, but it requires the use of funny functions (delta functions for an ideal mirror)

Computer Vision - A Modern Approach Set: Radiometry Slides by D.A. Forsyth
Phong's model
There are very few cases where the exact shape of the specular lobe matters. Typically:
–very, very small – mirror
–small – blurry mirror
–bigger – see only light sources as "specularities"
–very big – faint specularities
Phong's model: reflected energy falls off as an exponentiated cosine of the angle from the specular direction (expression below).
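The falloff expression is missing from the transcript; Phong's model is usually written, with δθ the angle between the view direction and the specular direction and n the specular exponent, as

reflected energy ∝ cosⁿ(δθ)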

Computer Vision - A Modern Approach Set: Radiometry Slides by D.A. Forsyth
Lambertian + specular
A widespread model:
–all surfaces are Lambertian plus a specular component
Advantages:
–easy to manipulate
–very often quite close to true
Disadvantages:
–some surfaces are not well described: e.g. the underside of CDs, feathers of many birds, blue spots on many marine crustaceans and fish, most rough surfaces, oil films (skin!), wet surfaces
–generally, there is very little advantage in modelling the behaviour of light at a surface in more detail – it is quite difficult enough to understand the behaviour of L+S surfaces

Diffuse reflection
Light is reflected equally in all directions: dull, matte surfaces like chalk or latex paint.
Microfacets scatter incoming light randomly; the effect is that light is reflected equally in all directions.
The brightness of the surface depends on the angle of incidence of the illumination (a patch facing the source appears brighter; a patch illuminated at a grazing angle appears darker).
Slide by Svetlana Lazebnik

Diffuse reflection: Lambert's law
B: radiosity (total power leaving the surface per unit area)
ρ: albedo (fraction of incident irradiance reflected by the surface)
N: unit normal
S: source vector (magnitude proportional to the intensity of the source)
θ: angle between N and S
(Figure: surface with normal N, source direction S, angle θ.)
Slide by Svetlana Lazebnik
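Lambert's law itself (the slide equation is missing; this is the standard statement, using the definitions above):

B(x) = ρ(x) (N(x) · S(x)) = ρ(x) |S| cos θ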

Specular reflection
Radiation arriving along a source direction leaves along the specular direction (the source direction reflected about the normal).
Some fraction is absorbed, some reflected.
On real surfaces, energy usually goes into a lobe of directions.
Phong model: reflected energy falls off as a power of the cosine of the angle from the specular direction.
Lambertian + specular model: sum of a diffuse and a specular term (below).
Slide by Svetlana Lazebnik
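One common way to write the combined model (the slide's expressions are missing; symbols as above, with ρ_d the albedo, ρ_s a specular coefficient, δθ the angle from the specular direction, and n the Phong exponent):

I = ρ_d (N · S) + ρ_s cosⁿ(δθ)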

Specular reflection
(Figures: effect of moving the light source; effect of changing the Phong exponent.)
Slide by Svetlana Lazebnik

Fundamental radiometric relation
Application: S. B. Kang and R. Weiss, "Can we calibrate a camera using an image of a flat, textureless Lambertian surface?" ECCV 2000.
Slide by Svetlana Lazebnik

Photometric stereo (shape from shading) Can we reconstruct the shape of an object based on shading cues? Luca della Robbia, Cantoria, 1438 Slide by Svetlana Lazebnik

Photometric stereo
Assume:
–a Lambertian object
–a local shading model (each point on a surface receives light only from sources visible at that point)
–a set of known light source directions
–a set of pictures of the object, obtained in exactly the same camera/object configuration but using different sources
–orthographic projection
Goal: reconstruct object shape and albedo.
(Figure: the object imaged under sources S1, S2, …, Sn; the surface is the unknown.)
F&P 2nd ed.

Surface model: Monge patch (figure). F&P 2nd ed.

Image model
Known: source vectors S_j and pixel values I_j(x,y).
Unknown: normal N(x,y) and albedo ρ(x,y).
Assume that the response function of the camera is a linear scaling by a factor of k.
Lambert's law then gives the image model (relation below).
F&P 2nd ed.
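The slide's equations are missing; the standard F&P formulation is: writing V_j = k S_j and g(x,y) = ρ(x,y) N(x,y),

I_j(x,y) = k B(x,y) = k ρ(x,y) (N(x,y) · S_j) = g(x,y) · V_j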

Least squares problem
For each pixel, set up a linear system: the n measurements I_j(x,y) form a known (n × 1) vector, the n scaled source vectors V_j form a known (n × 3) matrix, and the unknown is the (3 × 1) vector g(x,y).
Obtain the least-squares solution for g(x,y) (which we defined as ρ(x,y) N(x,y)).
Since N(x,y) is a unit normal, ρ(x,y) is given by the magnitude of g(x,y).
Finally, N(x,y) = g(x,y) / ρ(x,y).
F&P 2nd ed.
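A minimal NumPy sketch of this per-pixel least-squares step (not from the slides; the function name and the assumption that the inputs are linear intensities and known, scaled light vectors are mine):

import numpy as np

def photometric_stereo(images, light_dirs):
    """Recover albedo and unit normals from n images under known lights.

    images:     (n, H, W) array of linear pixel intensities I_j(x, y)
    light_dirs: (n, 3) array whose rows are the scaled source vectors V_j = k * S_j
    returns:    albedo of shape (H, W) and unit normals of shape (3, H, W)
    """
    n, H, W = images.shape
    I = images.reshape(n, -1)                      # stack measurements: (n, H*W)
    # Solve V g = I for every pixel at once (least squares); g has shape (3, H*W)
    g, _, _, _ = np.linalg.lstsq(light_dirs, I, rcond=None)
    albedo = np.linalg.norm(g, axis=0)             # rho(x, y) = |g(x, y)|
    normals = g / np.maximum(albedo, 1e-12)        # N(x, y) = g(x, y) / rho(x, y)
    return albedo.reshape(H, W), normals.reshape(3, H, W)

In practice one would mask out shadowed or saturated pixels before solving, since Lambert's law only holds where the source is visible and the sensor response is linear.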

Example: recovered albedo and recovered normal field (figures). F&P 2nd ed.

Recovering a surface from normals
Recall that the surface is written as a height function over the image plane; this means the normal has a particular form in terms of the partial derivatives of that height function. If we write the estimated vector g in components, we obtain values for the partial derivatives of the surface (expressions below).
F&P 2nd ed.
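The missing expressions, reconstructed in the usual form (with the upward-pointing normal convention; the signs flip if the opposite normal is chosen): the surface is (x, y, f(x, y)), so

N(x, y) = (−f_x, −f_y, 1) / √(1 + f_x² + f_y²)

Writing the estimated vector as g = (g_1, g_2, g_3)ᵀ = ρ N, the partial derivatives of the surface follow as

f_x(x, y) = −g_1(x, y) / g_3(x, y),   f_y(x, y) = −g_2(x, y) / g_3(x, y)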

Recovering a surface from normals
Integrability: for the surface f to exist, the mixed second partial derivatives must be equal (in practice, the estimated values should at least be similar).
We can then recover the surface height at any point by integrating the partial derivatives along some path; for robustness, one can take integrals over many different paths and average the results (see below).
F&P 2nd ed.
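Reconstructed in the usual form: the integrability condition is

∂f_x/∂y = ∂f_y/∂x

and one possible integration path goes up the y-axis and then along a row, giving

f(x, y) = f(0, 0) + ∫₀^y f_y(0, t) dt + ∫₀^x f_x(s, y) ds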

Surface recovered by integration (figure). F&P 2nd ed.

Finding the direction of the light source
I(x,y) = N(x,y) · S(x,y) + A, where A is a constant ambient term.
Full 3D case: with known normals, each pixel gives one linear equation in the unknown source S and ambient term A, so the source can be estimated by least squares.
For points on the occluding contour: the normal lies in the image plane, perpendicular to the contour, so a two-dimensional version of the same system can be solved from the contour alone.
P. Nillius and J.-O. Eklundh, "Automatic estimation of the projected light source direction," CVPR 2001.
Slide by Svetlana Lazebnik

Finding the direction of the light source P. Nillius and J.-O. Eklundh, “Automatic estimation of the projected light source direction,” CVPR 2001 Slide by Svetlana Lazebnik

Application: Detecting composite photos
(Figures: fake photo vs. real photo.)
M. K. Johnson and H. Farid, "Exposing Digital Forgeries by Detecting Inconsistencies in Lighting," ACM Multimedia and Security Workshop, 2005.
Slide by Svetlana Lazebnik

From light rays to pixel values
Camera response function: the mapping f from irradiance to pixel values.
Useful if we want to estimate material properties.
Enables us to create high dynamic range images.
For more info: P. E. Debevec and J. Malik, "Recovering High Dynamic Range Radiance Maps from Photographs," SIGGRAPH 97.
Slide by Svetlana Lazebnik

Limitations
–Orthographic camera model
–Simplistic reflectance and lighting model
–No shadows
–No interreflections
–No missing data
–Integration is tricky
Slide by Svetlana Lazebnik

More reading & thought problems
(reconstructing from a single image using priors)
Shape, Illumination, and Reflectance from Shading. Jonathan T. Barron and Jitendra Malik. Tech report, 2013.
(implementing this one is the extra credit)
Recovering High Dynamic Range Radiance Maps from Photographs. Paul E. Debevec and Jitendra Malik. SIGGRAPH 1997.
(people can perceive reflectance)
Surface reflectance estimation and natural illumination statistics. R. O. Dror, E. H. Adelson, and A. S. Willsky. Workshop on Statistical and Computational Theories of Vision, 2001.