Midterm Review

The world is practically continuous in time, space, color, and brightness, with a dynamic range of brightness of about 10^11.

Digital World
Quantized in:
–time: how many frames/second?
–space: how many pixels?
–color: how many primaries?
–brightness: how many bits/pixel?
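As a minimal sketch of what "how many bits/pixel" means in practice (assuming NumPy; the function name quantize_brightness is made up for illustration), the snippet below uniformly quantizes brightness values in [0, 1] to 2^bits gray levels.

```python
import numpy as np

def quantize_brightness(values, bits):
    """Uniformly quantize values in [0, 1] to 2**bits gray levels."""
    levels = 2 ** bits
    codes = np.round(values * (levels - 1))   # nearest of the available codes
    return codes / (levels - 1)               # back to [0, 1] for display

# A smooth ramp kept at 3 bits/pixel has only 8 distinct brightnesses.
ramp = np.linspace(0.0, 1.0, 11)
print(quantize_brightness(ramp, 3))
```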

How TV Worked (in the old days)
What is a cathode? What is a cathode ray? What is a cathode ray tube (CRT)?
What is a phosphor? What are "phosphorescence" and "fluorescence"?
How does a CRT work? How was the TV image acquired?
What is a "raster" scan? What are horizontal and vertical "sync"?
What is "interlace"? Why do that?
What determines the "resolution"?
What is "gamma" and why should it be corrected?
How did they get color?
How do decisions made in the 40's affect CG, CV, and IP today?

Impulse Response Function / Point Spread Function
What is the image of a point?
–The shape of the pinhole, for points at infinity
–Typically a little blob for a good lens
–Could have aberrations that are distance-, color-, or position-dependent
What happens as we enlarge the pinhole?

Blurring as convolution: IRFs and apertures
Blurring by convolution with the impulse response function:
    I_blurred(x) = ∫ I_input(y) h(x−y) dy
–Replace each point y by y's intensity times the IRF h centered at y; h(x−y) is the effect of a fixed y over the various image points x
–Sum up over all such points
Aperture in image space:
–The effect of the various y on a fixed x: h(−[y−x])
–That is, the weighting of the various input image points in producing the image at a fixed point x
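A minimal discrete sketch of blurring as convolution, assuming NumPy; blur_1d and the box-shaped IRF are illustrative choices, not anything prescribed by the slides.

```python
import numpy as np

def blur_1d(signal, irf):
    """Discrete version of I_blurred(x) = sum over y of I_input(y) * h(x - y)."""
    return np.convolve(signal, irf, mode="same")   # 'same' keeps output aligned with input

# A step edge blurred by a small box-shaped impulse response.
edge = np.concatenate([np.zeros(5), np.ones(5)])
box_irf = np.ones(3) / 3.0                         # weights sum to 1
print(blur_1d(edge, box_irf))
```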

Linear Systems
Favorite model, because we have great tools
F(a+b) = F(a) + F(b);  F(k·a) = k·F(a)
Shift invariant / time invariant
Is camera projection linear?
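A numeric check of these definitions, assuming NumPy: the system F below is a circular convolution, chosen so that shift invariance holds exactly under np.roll; the kernel values and signal sizes are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 32
a, b = rng.normal(size=n), rng.normal(size=n)

# A circular convolution, implemented through the FFT, as the system F.
h = np.zeros(n)
h[:3] = [0.25, 0.5, 0.25]                      # a small smoothing kernel
F = lambda x: np.real(np.fft.ifft(np.fft.fft(x) * np.fft.fft(h)))

# Additivity and homogeneity.
print(np.allclose(F(a + b), F(a) + F(b)))      # True
print(np.allclose(F(3.0 * a), 3.0 * F(a)))     # True

# Shift invariance: circularly shifting the input circularly shifts the output.
print(np.allclose(F(np.roll(a, 5)), np.roll(F(a), 5)))   # True
```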

Properties of convolution
I_out(x) = ∫ I_in(y) h(x−y) dy;  h(x) is called the convolution kernel
Linear in both inputs, I_in and h
Symmetric in its inputs, I_in and h
Cascading convolutions is convolution with the convolution of the two kernels: (I * h1) * h2 = I * (h1 * h2)
–Thus cascading the convolutions of two Gaussians produces a Gaussian with σ = (σ1² + σ2²)^½ (checked numerically in the sketch below)
Any linear, shift-invariant operator can be written as a convolution, or as a limit of one
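A small sketch, assuming NumPy, that checks the Gaussian-cascading claim numerically; the sigmas 2 and 3 and the helper gaussian_kernel are illustrative.

```python
import numpy as np

def gaussian_kernel(sigma, radius=40):
    """Sampled, normalized 1-D Gaussian with the given sigma (illustrative helper)."""
    x = np.arange(-radius, radius + 1, dtype=float)
    g = np.exp(-x**2 / (2.0 * sigma**2))
    return g / g.sum()

g1, g2 = gaussian_kernel(2.0), gaussian_kernel(3.0)

# Cascading the two blurs is a single blur with the kernel h1 * h2 ...
cascade = np.convolve(g1, g2)                       # full convolution, length 161

# ... and that kernel matches a Gaussian with sigma = sqrt(2^2 + 3^2).
combined = gaussian_kernel(np.sqrt(2.0**2 + 3.0**2), radius=80)
print(np.max(np.abs(cascade - combined)))           # tiny: the two kernels agree
```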

Linear Shift-Invariant Operators
Blurring with an IRF that is constant over the scene
Viewing the scene through any fixed aperture
All derivatives D
–So D(I * h1) = DI * h1 = I * Dh1 (see the sketch below)
Designed operations, e.g., for smoothing, noise removal, sharpening, etc.
Can be applied to parametrized functions of u, e.g., smoothing surfaces
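The sketch below, assuming NumPy, checks that a finite-difference "derivative" (itself a convolution) commutes with convolution by a smoothing kernel; the particular kernels are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)
I = rng.normal(size=50)                           # an arbitrary 1-D "image"
h = np.array([1.0, 4.0, 6.0, 4.0, 1.0]) / 16.0    # a small smoothing kernel
d = np.array([1.0, -1.0])                         # finite-difference "derivative" D

# D(I * h) == (D I) * h == I * (D h), because D is itself a convolution.
lhs    = np.convolve(np.convolve(I, h), d)
middle = np.convolve(np.convolve(I, d), h)
rhs    = np.convolve(I, np.convolve(h, d))
print(np.allclose(lhs, middle), np.allclose(middle, rhs))   # True True
```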

Example Impulse Responses

Properties of Convolution
Commutative: a*b = b*a
Associative: (a*b)*c = a*(b*c)
Distributive: a*b + a*c = a*(b+c)
Central Limit Theorem: convolve a pulse with itself enough times and you get a Gaussian
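A short illustration of the Central Limit Theorem bullet, assuming NumPy: repeatedly convolve a box pulse with itself and compare the result to a Gaussian of matching mean and variance. The box width and iteration count are arbitrary.

```python
import numpy as np

# Convolving a box pulse with itself repeatedly yields a kernel whose shape
# approaches a Gaussian.
box = np.ones(5) / 5.0
kernel = box.copy()
for _ in range(7):                       # 8 boxes convolved together in total
    kernel = np.convolve(kernel, box)

# Compare against a Gaussian with the same mean and variance.
x = np.arange(kernel.size, dtype=float)
mean = np.sum(x * kernel)
var = np.sum((x - mean) ** 2 * kernel)
gauss = np.exp(-(x - mean) ** 2 / (2.0 * var))
gauss /= gauss.sum()
print(np.max(np.abs(kernel - gauss)))    # small, and shrinks with more convolutions
```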

Properties of convolution, continued
For any convolution kernel h(x), if the input I_in(x) is a sinusoid with wavelength (level of detail) 1/ν, i.e.,
    I_in(x) = A cos(2πνx) + B sin(2πνx),
then the output of the convolution is a sinusoid with the same wavelength (level of detail), i.e.,
    (I_in * h)(x) = C cos(2πνx) + D sin(2πνx),
for some C and D dependent on A, B, and h(x).
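A quick numeric illustration, assuming NumPy: push a sinusoid through an arbitrary (here circular) convolution and confirm that all of the output energy stays at the same frequency.

```python
import numpy as np

n = 256
x = np.arange(n)
nu = 5.0 / n                                  # 5 cycles across the signal
A, B = 2.0, -1.0
signal = A * np.cos(2 * np.pi * nu * x) + B * np.sin(2 * np.pi * nu * x)

# Convolve (circularly, via the FFT) with an arbitrary kernel h.
h = np.zeros(n)
h[:7] = np.hanning(7)
out = np.real(np.fft.ifft(np.fft.fft(signal) * np.fft.fft(h)))

# All of the output energy is still at 5 cycles: same frequency, new C and D.
spectrum = np.abs(np.fft.fft(out))
print(np.argmax(spectrum[1:n // 2]) + 1)      # 5
```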

Sampling and integration (digital images)
Model:
–Within-pixel integration at all points; this has its own IRF, typically rectangular
–Then sampling
Sampling = multiplication by the pixel area × the brush function
–The brush function is a sum of impulses at the pixel centers
–Sampling brings aliasing: in the sinusoidal decomposition, higher-frequency components masquerade as, and thus pollute, lower-frequency components (see the sketch below)
–Nyquist frequency: how finely to sample so that the effect of aliasing is adequately low
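A minimal aliasing sketch, assuming NumPy: a 7 Hz cosine sampled at 10 Hz (whose Nyquist limit is 5 Hz) is indistinguishable from a 3 Hz cosine. The frequencies and duration are illustrative.

```python
import numpy as np

# Aliasing: a sinusoid sampled below the Nyquist rate masquerades as a
# lower-frequency sinusoid.
f_signal = 7.0                         # Hz
f_sample = 10.0                        # Hz; Nyquist limit is 5 Hz, so 7 Hz aliases
t = np.arange(0.0, 2.0, 1.0 / f_sample)
samples = np.cos(2 * np.pi * f_signal * t)

# The samples are indistinguishable from a 3 Hz cosine, since 7 = 10 - 3.
alias = np.cos(2 * np.pi * 3.0 * t)
print(np.allclose(samples, alias))     # True
```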

The eye and retina

Retina
About 120 million rod cells (scotopic vision) and 6 million cone cells (photopic vision)
The sensors operate by polarization of proteins by photons
They produce a "pulse train" with rate proportional to the log of intensity

Rod and cone density in the retina

Dynamic Range
11 orders of magnitude!
Single photons when dark adapted!
Scotopic in the dark; photopic in the light
Maybe only 20 levels at one point
Maybe only 1000 levels at one average brightness

Visual Performance
20/20 vision corresponds to 1 arc minute
Fovea: 20 arc minutes of uniform maximum density; a 2-degree "rod-free" area
Visible wavelengths: 400 to 700 nanometers
510 nm: maximum rod sensitivity (green)
560 nm: maximum cone sensitivity (orange)

Contrast Sensitivity

A Model of Human Vision with Limited Feedback

Receptive Fields

Perception of Brightness
Affected strongly by the boundariness signals of the form system
Determined by relative intensity changes
–Weber's law: the just-noticeable difference keeps ΔI/I constant
–Averaging within boundaries
–Enhancement/sharpening at boundaries (Mach effect)

Weber's Law: just-noticeable differences
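A tiny worked example of Weber's law in code; the Weber fraction k = 0.02 is an illustrative value, not a measured one.

```python
# Weber's law sketch: the just-noticeable difference scales with the base
# intensity, so that ΔI/I ≈ k. The value k = 0.02 is illustrative only.
k = 0.02
for I in (10.0, 100.0, 1000.0):
    print(f"I = {I:7.1f}  ->  just-noticeable ΔI ≈ {k * I:.1f}")
```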

“Simultaneous Contrast” -- Brightness Determined by Relative Luminance

Fourier Transform
Pairs, and how to use them in reasoning
Interpretation of the DFT as a matrix (see the sketch below)
Computing the FT in N dimensions
Lowest and highest frequencies present
Convolution
Effects of non-linearity
Resizing images
Sampling and reconstruction
Physical origins of bandlimits
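One way to make the "DFT as a matrix" item concrete, assuming NumPy: build the N×N matrix F[j, k] = exp(−2πi·jk/N) and check that multiplying by it reproduces np.fft.fft.

```python
import numpy as np

# The DFT as an N x N matrix: F[j, k] = exp(-2*pi*i*j*k / N).
N = 8
j, k = np.meshgrid(np.arange(N), np.arange(N), indexing="ij")
F = np.exp(-2j * np.pi * j * k / N)

x = np.random.default_rng(2).normal(size=N)
print(np.allclose(F @ x, np.fft.fft(x)))      # True: the matrix product is the DFT
```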

More FT
Apply what we learned to some other problem:
–What is the FT of a Gabor function?
–What is the impulse response of a low-pass filter? How about a band-pass filter?
–Suppose you wanted to estimate which note on a piano keyboard best corresponds to a signal? (see the sketch after this list)
–What is the likely effect of those on-center/off-surround receptive fields in the visual field on our visual sensitivity to varying frequencies?
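A sketch for the piano-note question, assuming NumPy: find the dominant frequency with an FFT and snap it to the nearest equal-tempered key, taking A4 = 440 Hz as key 49 of an 88-key piano. The 330 Hz test tone and sampling rate are made up.

```python
import numpy as np

# Estimate which piano key best matches a signal: FFT, find the dominant
# frequency, snap to the nearest equal-tempered key (A4 = 440 Hz = key 49).
fs = 8000.0
t = np.arange(0.0, 1.0, 1.0 / fs)
signal = np.sin(2 * np.pi * 330.0 * t)          # a test tone near E4

spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
f_peak = freqs[np.argmax(spectrum[1:]) + 1]     # skip the DC bin

key = int(round(49 + 12 * np.log2(f_peak / 440.0)))
print(f_peak, key)                              # ~330 Hz -> key 44 (E4)
```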

The Exam
Wednesday, 25 October
Open books, notes, etc.
You can use a calculator or computer, but no communication with other people.
Bring your own paper and pencil.
Write legibly! Pledge your paper.
Don't Panic!