Light and Shading Computer Vision Derek Hoiem, University of Illinois 01/19/12 “Empire of Light”, Magritte

Outline
Light and wavelengths
The eye
– Physiology (rods and cones)
– Receptivity, non-linear response
– Trichromacy
The camera and display
– Bayer filter revisited
– RGB display
Physics of light and reflection
– Direct light: spectrum of wavelengths
– Reflection: influence of albedo, orientation, reflectivity, BRDF
– Inter-reflection: light may be influenced by multiple surfaces
– Illusions, ambiguity of albedo and illumination color, color constancy
– Light sources and shadows
Color spaces
– RGB, HSV, YCrCb, XYZ, Lab

Outline
1. How the observed intensity of one pixel depends on geometry, material, lighting, and viewpoint
– Basic pinhole model
– Diffuse reflectance, specular reflectance
– Lambertian surfaces, Lambert’s law of cosines
– Effects due to surroundings: interreflections, shadows
– Color: wavelengths, Bayer filter
– What does the pixel intensity tell us? Not much
2. How can we infer scene properties from a set of pixels?
– Local analysis (next class)
– Look at differences in intensity
– Lighting normalization
– Photometric stereo

Today’s class
What determines a pixel’s brightness?
– How is light reflected?
– How is light measured?
What can be inferred about the world from those brightness values?

Administrative stuff
Some people are still waiting to get in
Tentative office hours
– Derek: Wed 4pm, 8pm (Skype only)
– Ruiqi: Mon 7pm, Thurs 4pm
Homework 1 released soon
Any questions?

How light is recorded

Digital camera A digital camera replaces film with a sensor array. Each cell in the array is a light-sensitive diode that converts photons to electrons. Two common types: Charge Coupled Device (CCD) and CMOS. Slide by Steve Seitz

Sensor Array CMOS sensor. Each sensor cell records the amount of light coming in at a small range of orientations.

The raster image (pixel matrix)
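A minimal sketch of that idea in Python (assuming NumPy and scikit-image are installed; the filename 'example.png' is just a placeholder): the raster image is simply a matrix of intensity values indexed by row and column.

import numpy as np
from skimage import io, img_as_float

im = img_as_float(io.imread('example.png', as_gray=True))  # 2D array of values in [0, 1]
print(im.shape)        # (rows, columns)
print(im[100, 200])    # intensity of the pixel at row 100, column 200
print(im.min(), im.max())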

Today’s class: Light and Shading What determines a pixel’s intensity? What can we infer about the scene from pixel intensities?

How does a pixel get its value? Light is emitted by a source; a fraction of that light reflects off the scene surface, passes through the lens, and reaches the sensor.

How does a pixel get its value? Major factors:
– Illumination strength and direction
– Surface geometry
– Surface material
– Nearby surfaces
– Camera gain/exposure
(Diagram: light emitted by the source, reflected by the surface to the camera sensor)

Basic models of reflection
Specular: light bounces off at the incident angle – e.g., mirror
Diffuse: light scatters in all directions – e.g., brick, cloth, rough wood

Lambertian reflectance model: incoming light from the source is partly absorbed and the rest is reflected diffusely (scattered in all directions).

Diffuse reflection: Lambert’s cosine law
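For reference, a standard statement of Lambert’s cosine law (notation assumed here, not from the slide: ρ is albedo, N the unit surface normal, S the source direction scaled by source intensity, θ the angle between N and S):

I(x) = ρ(x) (N(x) · S) = ρ(x) ‖S‖ cos θ

with the intensity clamped to zero when cos θ < 0, i.e., when the surface faces away from the light.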

Specular Reflection
Reflected direction depends on light orientation and surface normal
– E.g., mirrors are fully specular
– Most surfaces can be modeled with a mixture of diffuse and specular components
(Photos: Flickr, by suzysputnik; Flickr, by piratejohnny)

Most surfaces have both specular and diffuse components. Specularity = a spot where specular reflection dominates (typically a reflection of the light source). Typically, the specular component is small. (Photo: northcountryhardwoodfloors.com)
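As a rough illustration only (a Phong-style sketch with made-up parameter values, not the exact model from the slides): a Lambertian diffuse term plus a small specular lobe for a single point light.

import numpy as np

def normalize(v):
    v = np.asarray(v, dtype=float)
    return v / np.linalg.norm(v)

def shade(normal, light_dir, view_dir, albedo=0.7, k_s=0.2, shininess=50, light_intensity=1.0):
    # Phong-style mixture: Lambertian diffuse term + small specular lobe.
    # All direction vectors point away from the surface point.
    n, l, v = normalize(normal), normalize(light_dir), normalize(view_dir)
    diffuse = albedo * max(np.dot(n, l), 0.0)              # Lambert's cosine law
    r = 2.0 * np.dot(n, l) * n - l                         # mirror reflection of the light direction
    specular = k_s * max(np.dot(r, v), 0.0) ** shininess   # bright only near the mirror direction
    return light_intensity * (diffuse + specular)

# Example: surface tilted 45 degrees away from the light, camera looking straight down the normal
print(shade(normal=[0, 0, 1], light_dir=[1, 0, 1], view_dir=[0, 0, 1]))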

Intensity and Surface Orientation Slide: Forsyth

Recap: light hitting a surface is partly reflected specularly, partly reflected diffusely, and partly absorbed.

Other possible effects: transparency, refraction.

Fluorescence: light absorbed at wavelength λ1 is re-emitted at a different wavelength λ2. Phosphorescence: light absorbed at time t=1 is re-emitted later (t>1).

Subsurface scattering: light enters the material, scatters beneath the surface, and exits at a different point.

BRDF: Bidirectional Reflectance Distribution Function. A model of local reflection that tells how bright a surface appears when viewed from one direction when light falls on it from another. (Diagram shows the surface normal and the incoming/outgoing directions. Slide credit: S. Savarese)
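For reference, the usual radiometric definition (standard notation, not from the slide): with (θi, φi) the incoming and (θo, φo) the outgoing directions measured relative to the surface normal,

f(θi, φi; θo, φo) = dLo(θo, φo) / (Li(θi, φi) cos θi dωi),

i.e., the ratio of reflected radiance to incoming irradiance. For an ideal Lambertian surface the BRDF is constant, f = ρ/π, so apparent brightness does not depend on the viewing direction.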

Application: photometric stereo
Assume:
– a set of point sources that are infinitely distant
– a set of pictures of an object, obtained in exactly the same camera/object configuration but using different sources
– a Lambertian object (or the specular component has been identified and removed)
Photometric stereo slides by Forsyth

Each image gives, at each pixel, I_j(x, y) = V_j · g(x, y) (the standard formulation), where g(x, y) = ρ(x, y) N(x, y) is the albedo times the unit surface normal and V_j is the j-th known source vector. So if we have enough images with known sources, we can solve for g, the albedo times the 3D normal.

And the albedo (shown here) is given by ρ(x, y) = ‖g(x, y)‖, since the normal is a unit vector; the normal itself is recovered as g(x, y) / ‖g(x, y)‖.
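A minimal least-squares sketch of this recipe (variable names and array shapes are illustrative assumptions, not from the slides): stack the known source vectors into a matrix V and the measured intensities into I, then solve for g at every pixel.

import numpy as np

def photometric_stereo(I, V):
    # I: (num_images, num_pixels) measured intensities, one row per image
    # V: (num_images, 3) known source direction times intensity for each image
    # Solve V @ g = I in the least-squares sense; g = albedo * normal at each pixel.
    g, _, _, _ = np.linalg.lstsq(V, I, rcond=None)   # g has shape (3, num_pixels)
    albedo = np.linalg.norm(g, axis=0)               # |g| is the albedo (normal is unit length)
    normals = (g / np.maximum(albedo, 1e-8)).T       # g / |g| is the unit normal, (num_pixels, 3)
    return albedo, normals

With at least three non-coplanar sources the system is well determined; in practice, shadowed measurements (near-zero intensities) should be detected and excluded before solving.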

Dynamic range and camera response Typical scenes have a huge dynamic range Camera response is roughly linear in the mid range (15 to 240) but non-linear at the extremes – called saturation or undersaturation

Color. Light is composed of a spectrum of wavelengths. (Figure: human luminance sensitivity function. Slide credit: Efros)

Some examples of the spectra of light sources © Stephen E. Palmer, 2002

Some examples of the reflectance spectra of surfaces: red, yellow, blue, and purple, plotted as % photons reflected versus wavelength (nm). © Stephen E. Palmer, 2002

More spectra: metamers – different spectra that produce the same color sensation.

The color of objects
Colored light arriving at the camera involves two effects:
– The color of the light source (illumination + inter-reflections)
– The color of the surface
Slide: Forsyth

Color Sensing: Bayer Grid. Estimate RGB at each cell from neighboring values (demosaicing). Slide by Steve Seitz
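A rough bilinear-demosaicing sketch of that estimate (assumptions: an RGGB mosaic layout, NumPy and SciPy available; the function and array names are illustrative):

import numpy as np
from scipy.signal import convolve2d

def bilinear_demosaic(raw):
    # raw: 2D array of sensor values under an assumed RGGB Bayer pattern
    H, W = raw.shape
    r_mask = np.zeros((H, W)); r_mask[0::2, 0::2] = 1.0   # red sites
    b_mask = np.zeros((H, W)); b_mask[1::2, 1::2] = 1.0   # blue sites
    g_mask = 1.0 - r_mask - b_mask                        # green sites (the rest)

    def interp(mask, kernel):
        # Normalized convolution: average the known neighbors at each missing value
        num = convolve2d(raw * mask, kernel, mode='same')
        den = convolve2d(mask, kernel, mode='same')
        return num / np.maximum(den, 1e-6)

    k_rb = np.array([[1., 2., 1.], [2., 4., 2.], [1., 2., 1.]])  # bilinear weights for R and B
    k_g  = np.array([[0., 1., 0.], [1., 4., 1.], [0., 1., 0.]])  # bilinear weights for G
    return np.dstack([interp(r_mask, k_rb), interp(g_mask, k_g), interp(b_mask, k_rb)])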

Color image and its R, G, B channels.

Why RGB? If light is a spectrum, why are images RGB?

Human color receptors
Long (red), Medium (green), and Short (blue) cones, plus intensity rods
Fun facts
– The “M” and “L” genes are on the X chromosome; that’s why men are more likely to be color blind (see what it’s like:
– “L” has high variation, so some women are tetrachromatic
– Some animals have 1 (night animals), 2 (e.g., dogs), 4 (fish, birds), 5 (pigeons, some reptiles/amphibians), or even 12 (mantis shrimp) types of cones

So far: light → surface → camera. This is called a local illumination model. But much light comes from surrounding surfaces. (From Koenderink slides on image texture and the flow of light)

Inter-reflection is a major source of light

Inter-reflection affects the apparent color of objects From Koenderink slides on image texture and the flow of light

Scene surfaces also cause shadows Shadow: reduction in intensity due to a blocked source

Shadows

Models of light sources
Distant point source
– One illumination direction
– E.g., sun
Area source
– E.g., white walls, diffuser lamps, sky
Ambient light
– Substitute for dealing with interreflections
Global illumination model
– Accounts for interreflections in the modeled scene

Recap Possible factors: albedo, shadows, texture, specularities, curvature, lighting direction

Area sources. A point receives light as the sum of the contributions of small elements over the whole source. Slide: Forsyth

What does the intensity of a pixel tell us? im(234, 452) = 0.58

The plight of the poor pixel
A pixel’s brightness is determined by:
– Light source (strength, direction, color)
– Surface orientation
– Surface material and albedo
– Reflected light and shadows from surrounding surfaces
– Gain on the sensor
A pixel’s brightness tells us nothing by itself

Photo by nickwheeleroz, Flickr Slide: Forsyth

And yet we can interpret images… Key idea: for nearby scene points, most factors do not change much The information is mainly contained in local differences of brightness

Darkness = Large Difference in Neighboring Pixels
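A quick sketch of how such a visualization might be computed (assuming `im` is a grayscale image as a float NumPy array; this is illustrative, not the exact recipe behind the figure):

import numpy as np

def difference_image(im):
    # Dark = large difference between neighboring pixels
    dy, dx = np.gradient(im.astype(float))      # finite differences along rows and columns
    mag = np.sqrt(dx**2 + dy**2)                # local difference (gradient) magnitude
    return 1.0 - mag / (mag.max() + 1e-8)       # invert so strong changes appear dark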

What is this?

What differences in intensity tell us about shape
– Changes in surface normal
– Texture
– Proximity
– Indents and bumps
– Grooves and creases
Photos: Koenderink slides on image texture and the flow of light

Shadows as cues. (From Koenderink slides on image texture and the flow of light. Slide: Forsyth)

Color constancy: interpret a surface in terms of its albedo or “true color”, rather than the observed intensity. Humans are good at it; computers are not nearly as good.

One source of constancy: local comparisons

Perception of Intensity from Ted Adelson

Perception of Intensity from Ted Adelson

Color Correction
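One simple, commonly used correction, shown here only as an illustrative sketch (gray-world white balance; not necessarily the method intended in the homework): scale each channel so the average color of the image becomes neutral gray.

import numpy as np

def gray_world_balance(im):
    # im: float RGB image of shape (H, W, 3) with values in [0, 1]
    channel_means = im.reshape(-1, 3).mean(axis=0)            # average R, G, B over all pixels
    gains = channel_means.mean() / np.maximum(channel_means, 1e-8)
    return np.clip(im * gains, 0.0, 1.0)                      # per-channel gains, broadcast over pixels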

HW 1, Problem 1a

HW 1, Problem 1b

Things to remember
– Important terms: diffuse/specular reflectance, albedo, umbra/penumbra
– Observed intensity depends on light sources, geometry/material of reflecting surface, surrounding objects, camera settings
– Objects cast light and shadows on each other
– Differences in intensity are primary cues for shape

Thank you Next class: Image Filters