Introduction to Computer Vision CS / ECE 181B Thursday, April 13, 2004  Image Formation.


Course web site Not a substitute for attending the class –Note that not all lectures are in PPT. I will be using the board for most “math” related stuff.

Prereqs and background knowledge E.g., I assume you know: –Basic linear algebra –Basic probability –Basic calculus –Programming languages (C, C++) or MATLAB  You had two sessions on MATLAB

Your job You are expected to: –Attend the lectures and discussion sessions  You're responsible for everything that transpires in class and discussion session (not just what’s on the slides) –Ask questions in class – participate! –Do the homework assignments on time and with integrity  “Honest effort” will get you credit –Give us feedback during the quarter

First part of course: Image Formation Geometry of image formation (Camera models and calibration) –Where? Radiometry of image formation –How bright?

Digital images We’re interested in digital images, which may come from –An image originally recorded on film  Digitized from negative or from print –Analog video camera  Digitized by frame grabber –Digital still camera or video camera –Sonar, radar, ladar (laser radar) –Various kinds of spectral or multispectral sensors  Infrared, X-ray, Landsat… Normally, we’ll assume a digital camera (or digitized analog camera) to be our source, and most generally a video camera (spatial and temporal sampling)

What is a Camera? A camera has many components –Optics: lens, filters, prisms, mirrors, aperture –Imager: array of sensing elements (1D or 2D) –Scanning electronics –Signal processing –ADC: sampling, quantizing, encoding, compression  May be done by external frame grabber (“digitizer”) And many descriptive features –Imager type: CCD or CMOS –Imager number –SNR –Lens mount –Color or B/W –Analog or digital (output) –Frame rate –Manual/automatic controls –Shutter speeds –Size, weight –Cost

Camera output: A raster image Raster scan – A series of horizontal scan lines, top to bottom –Progressive scan – Line 1, then line 2, then line 3, … –Interlaced scan – Odd lines then even lines (Figure: raster pattern; progressive scan; interlaced scan.)
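The two scan orderings above can be sketched in a few lines. This is an illustrative helper (the name scan_order is mine, not from the slides); lines are numbered from 1, as on the slide, and the interlaced case reads the odd field before the even field:

```python
def scan_order(n_lines, interlaced=False):
    """Return the order in which raster lines are read out.

    Progressive scan reads line 1, then line 2, then line 3, ...
    Interlaced scan reads the odd lines (first field), then the
    even lines (second field).
    """
    lines = range(1, n_lines + 1)  # lines numbered from 1, as on the slide
    if not interlaced:
        return list(lines)
    odd = [i for i in lines if i % 2 == 1]   # first field
    even = [i for i in lines if i % 2 == 0]  # second field
    return odd + even

print(scan_order(6))                   # [1, 2, 3, 4, 5, 6]
print(scan_order(6, interlaced=True))  # [1, 3, 5, 2, 4, 6]
```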

Pixels Each line of the image comprises many picture elements, or pixels –Typically 8-12 bits (grayscale) or 24 bits (color) A 640x480 image: –480 rows and 640 columns –480 lines each with 640 pixels –640x480 = 307,200 pixels At 8 bits per pixel, 30 images per second –640x480x8x30 = 73.7 Mbps or 9.2 MB/s At 24 bits per pixel (color) –640x480x24x30 = 221 Mbps or 27.6 MB/s
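The data-rate arithmetic above can be packaged as a small calculator (the function name video_data_rate is my own; "mega" is taken as 10^6, which matches the slide's figures):

```python
def video_data_rate(width, height, bits_per_pixel, fps):
    """Raw (uncompressed) video data rate.

    Returns (megabits/s, megabytes/s), using decimal mega (10^6).
    """
    bits_per_second = width * height * bits_per_pixel * fps
    mbps = bits_per_second / 1e6        # megabits per second
    mbytes = bits_per_second / 8 / 1e6  # megabytes per second
    return mbps, mbytes

print(video_data_rate(640, 480, 8, 30))   # (73.728, 9.216) -> "73.7 Mbps or 9.2 MB/s"
print(video_data_rate(640, 480, 24, 30))  # (221.184, 27.648)
```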

Aspect ratio Image aspect ratio – width to height ratio of the raster –4:3 for TV, 16:9 for HDTV, 1.85:1 to 2.35:1 for movies –We also care about pixel aspect ratio (not the same thing)  Square or non-square pixels

Sensor, Imager, Pixel An imager (sensor array) typically comprises n x m sensors –320x240 to 7000x9000 or more (high-end astronomy) –Sensor sizes range from 15x15 µm down to 3x3 µm or smaller Each sensor contains a photodetector and devices for readout Technically: –Imager – a rectangular array of sensors upon which the scene is focused (photosensor array) –Sensor (photosensor) – a single photosensitive element that generates and stores an electric charge when illuminated. Usually includes the circuitry that stores and transfers its charge to a shift register –Pixel (picture element) – atomic component of the image (technically not the sensor, but…) However, these are often intermingled

Color sensors CCD and CMOS chips do not have any inherent ability to discriminate color (i.e., photon wavelength/energy) –They sense “number of photons”, not wavelengths –Essentially grayscale sensors – need filters to discriminate colors! Approaches to sensing color –3-chip color: Split the incident light into its primary colors (usually red, green and blue) by filters and prisms  Three separate imagers –Single-chip color: Use filters on the imager, then reconstruct color in the camera electronics  Filters absorb light (2/3 or more), so sensitivity is low

3-chip color (Figure: incident light passes through the lens, a neutral density filter, an infrared filter, and a low-pass filter; prisms then split it toward the R, G, and B imagers.) How much light energy reaches each sensor?

Single-chip color (Figure: incident light passes through a mosaic color filter to the imager.) Uses a mosaic color filter –Each photosensor is covered by a single filter –Must reconstruct (R, G, B) values via interpolation
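The reconstruction step can be sketched in miniature. This is a toy stand-in for real demosaicing, which interpolates a full-resolution (R, G, B) value at every photosite; here I simply average each 2x2 cell of an assumed RGGB mosaic layout (the helper name demosaic_2x2 and the half-resolution output are my simplifications):

```python
import numpy as np

def demosaic_2x2(raw):
    """Crude demosaic of an RGGB Bayer-style mosaic.

    raw: (2H, 2W) array sampled through the repeating filter pattern
        R G
        G B
    Returns an (H, W, 3) RGB image at half resolution: R and B are
    taken directly, and the two green samples in each cell are averaged.
    """
    r = raw[0::2, 0::2]
    g = (raw[0::2, 1::2] + raw[1::2, 0::2]) / 2.0  # two green samples per cell
    b = raw[1::2, 1::2]
    return np.stack([r, g, b], axis=-1)

# A uniform gray scene should come back as uniform gray:
raw = np.full((4, 4), 100.0)
print(demosaic_2x2(raw).shape)  # (2, 2, 3)
```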

Eye is not a (digital) camera! (or, is it?)

Image Formation Projection Geometry Radiometry (Image Brightness) - to be discussed later in SFS.

Pinhole Camera (source: A Guided tour of computer vision/Vic Nalwa)

Perspective Projection (source: A Guided tour of computer vision/Vic Nalwa)

Perspective Projection

Some Observations/questions Note that under perspective projection, straight lines in 3-D project as straight lines in the 2-D image plane. Can you prove this analytically? –What is the shape of the image of a sphere? –What is the shape of the image of a circular disk? Assume that the disk lies in a plane that is tilted with respect to the image plane. What would be the image of a set of parallel lines? –Do they remain parallel in the image plane?
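The first observation can be checked numerically: project sample points of a 3-D line through a pinhole and verify that the image points are collinear. This is an illustrative sketch (the helper name project, the focal length f = 1, and the particular line are my own choices):

```python
import numpy as np

def project(points, f=1.0):
    """Perspective (pinhole) projection: (X, Y, Z) -> (f X / Z, f Y / Z).

    points: (N, 3) array of 3-D points with Z > 0 (in front of the camera).
    """
    points = np.asarray(points, dtype=float)
    return f * points[:, :2] / points[:, 2:3]

# Sample a 3-D line P(t) = P0 + t * d and project it:
t = np.linspace(0.0, 1.0, 5)[:, None]
line3d = np.array([1.0, 2.0, 4.0]) + t * np.array([3.0, -1.0, 2.0])
uv = project(line3d)

# Collinearity check: every displacement from the first image point
# must be parallel to the last one (2-D cross products vanish).
d = uv - uv[0]
cross = d[:, 0] * d[-1, 1] - d[:, 1] * d[-1, 0]
print(np.allclose(cross, 0.0))  # True
```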

Note: Equation for a line in 3-D (and in 2-D) Line in 3-D: (X, Y, Z) = (X0, Y0, Z0) + t (a, b, c). Line in 2-D: A u + B v + C = 0. By using the perspective projection equations, it is easy to show that a line in 3-D projects as a line in 2-D.
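One way to sketch the proof, assuming the standard pinhole equations u = fX/Z, v = fY/Z (notation mine):

```latex
% Parametric 3-D line and its perspective projection:
\begin{align*}
  (X, Y, Z) &= (X_0 + a t,\; Y_0 + b t,\; Z_0 + c t), \\
  u(t) &= f\,\frac{X_0 + a t}{Z_0 + c t}, \qquad
  v(t) = f\,\frac{Y_0 + b t}{Z_0 + c t}.
\end{align*}
% Cross-multiplying each equation gives two relations linear in t;
% eliminating t between them leaves a single linear constraint
%   A u + B v + C = 0,
% i.e., the image of a 3-D line is a 2-D line (degenerating to a
% point when the line passes through the center of projection).
```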

Vanishing Point The vanishing point of a straight line under perspective projection is that point in the image beyond which the projection of the straight line cannot extend. –I.e., if the straight line were infinitely long in space, the line would appear to vanish at its vanishing point in the image. –The vanishing point of a line depends ONLY on its orientation in space, and not on its position. –Thus, parallel lines in space appear to meet at their vanishing point in the image.
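The orientation-only property can be made concrete: as a point recedes along a line with direction (a, b, c), its projection (fX/Z, fY/Z) tends to (fa/c, fb/c), with no dependence on where the line starts. A minimal sketch (the function name vanishing_point and the default f = 1 are my assumptions):

```python
def vanishing_point(direction, f=1.0):
    """Vanishing point of any 3-D line with direction (a, b, c).

    Limit of (f X / Z, f Y / Z) along the line as the point recedes:
    (f a / c, f b / c).  Depends only on the direction, so all
    parallel lines share it.  Lines parallel to the image plane
    (c == 0) have no finite vanishing point.
    """
    a, b, c = direction
    if c == 0:
        return None  # vanishing point at infinity
    return (f * a / c, f * b / c)

# Two parallel lines (same direction, different base points) share it:
print(vanishing_point((3.0, -1.0, 2.0)))  # (1.5, -0.5)
print(vanishing_point((1.0, 0.0, 0.0)))   # None
```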

Vanishing Point (source: A Guided tour of computer vision/Vic Nalwa)

The Vanishing Point (source: A Guided tour of computer vision/Vic Nalwa)

Vanishing point (last slide!) For any given spatial orientation, the vanishing point is located at that point on the projection surface where a straight line passing through the center of projection with the given orientation would intersect the projection surface.

Planar vs Spherical Perspective Projection (source: A Guided tour of computer vision/Vic Nalwa)

Spherical Perspective Projection Under planar perspective projection, straight lines map onto straight lines. Question: What do straight lines map onto under spherical perspective projection?

Orthographic Projection Projection onto a plane by a set of parallel rays orthogonal to this plane. (source: A Guided tour of computer vision/Vic Nalwa)

Approximation of Perspective Projection A. Object dimensions are small compared to the distance of the object from the center of projection. B. Compared to this distance, the object is close to the straight line that passes through the center of projection (COP) and is orthogonal to the image plane (IP).

Approximation by Parallel Projection (source: A Guided tour of computer vision/Vic Nalwa)

Parallel Projection Parallel Projection is a generalization of orthographic projection in which the object is projected onto the image plane by a set of parallel rays that are not necessarily orthogonal to this plane. Perspective projection can be approximated by parallel projection up to a uniform scale factor whenever the object’s dimensions are small compared to the average distance of the object from the center of projection.
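The approximation claim can be illustrated numerically: for a small object far from the camera, parallel projection followed by the uniform scale f / Z_avg nearly reproduces true perspective. This sketch uses my own helper names (perspective, weak_perspective) and an arbitrary test object:

```python
import numpy as np

def perspective(p, f=1.0):
    """True pinhole projection of one point (X, Y, Z)."""
    p = np.asarray(p, dtype=float)
    return f * p[:2] / p[2]

def weak_perspective(p, z_avg, f=1.0):
    """Parallel projection onto the image plane, then a uniform
    scale f / z_avg -- accurate when depth variation is small
    relative to z_avg."""
    p = np.asarray(p, dtype=float)
    return (f / z_avg) * p[:2]

# A small object about 100 focal lengths away:
pts = np.array([[0.1, 0.2, 100.0], [-0.2, 0.1, 100.5]])
z_avg = pts[:, 2].mean()
for p in pts:
    print(perspective(p), weak_perspective(p, z_avg))  # nearly identical pairs
```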

Note: Imaging with a lens

Misfocus Blur

Brightness Irradiance, as a measure of image brightness –Irradiance is the power per unit area (watts per square meter) of radiant energy falling on a surface.

Brightness Scene Brightness – Radiance –Radiance is the power emitted per unit area into a cone of directions having unit solid angle (watts per square meter per steradian).

Image Formation: Summary –Projection Geometry  What determines the position of a 3D point in the image? –Image Brightness  What determines the brightness of the image of some surface?  This we will discuss later when we talk about shape from shading.

Summary Projection Geometry - determines the position of a 3D point in the image. –Perspective projection –approximations using  orthographic projection  parallel projection –terminology  center of projection  vanishing point  optic axis  focal point, focal length