Presentation on theme: "Computer Vision Cameras, lenses and sensors Cosimo Distante Introduction to Image Processing Image."— Presentation transcript:

1 Computer Vision Cameras, lenses and sensors Cosimo Distante Cosimo.distante@cnr.it Cosimo.distante@unisalento.it Introduction to Image Processing Image Processing

2 Image Processing Camera Models – Pinhole Perspective Projection Camera with Lenses Sensing The Human Eye Cameras, lenses and sensors

3 Image Processing Images are two-dimensional patterns of brightness values. They are formed by the projection of 3D objects. Figure from US Navy Manual of Basic Optics and Optical Instruments, prepared by Bureau of Naval Personnel. Reprinted by Dover Publications, Inc., 1969.

4 Image Processing Animal eye: a looonnng time ago. Pinhole perspective projection: Brunelleschi, XVth century. Camera obscura: XVIth century. Photographic camera: Niepce, 1816.

5 Image Processing Pinhole model

6 Image Processing Distant objects appear smaller

7 Image Processing Parallel lines meet at a vanishing point

8 Image Processing Vanishing points (figure labels: VPL, VPR, H, VP1, VP2, VP3). Different directions correspond to different vanishing points.

9 Image Processing Geometric properties of projection: points go to points; lines go to lines; planes go to the whole image or a half-plane; polygons go to polygons. Degenerate cases: a line through the focal point yields a point; a plane through the focal point yields a line.

10 Image Processing Pinhole Camera Model (side view in the X–Z plane): a scene point P = (X, Z) projects through the optical center O onto the image plane (at distance f along the optical axis) at P' = (x, f), with x/f = X/Z.

11 Image Processing Pinhole Camera Model (side view in the Y–Z plane): a scene point P = (Y, Z) projects onto the image plane at P' = (y, f), with y/f = Y/Z.

12 Image Processing Pinhole Perspective Equation: with the camera frame centered at the pinhole, the optical axis along Z and the image plane at the focal length f, a scene/world point (X, Y, Z) projects to x = f X / Z, y = f Y / Z.
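A minimal sketch of the pinhole perspective equation above; the function name and the example values are illustrative only.

```python
# Pinhole perspective projection: a camera-frame point (X, Y, Z) projects to
# x = f*X/Z, y = f*Y/Z on an image plane at distance f from the pinhole.

def project_pinhole(point_3d, f):
    X, Y, Z = point_3d
    if Z <= 0:
        raise ValueError("point must lie in front of the camera (Z > 0)")
    return (f * X / Z, f * Y / Z)

# Distant objects appear smaller: doubling Z halves the projected coordinates.
print(project_pinhole((1.0, 0.5, 2.0), f=0.035))   # (0.0175, 0.00875)
print(project_pinhole((1.0, 0.5, 4.0), f=0.035))   # (0.00875, 0.004375)
```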

13 Image Processing Affine projection models: weak perspective projection. m = f / Z0 (the ratio of the focal length to a reference scene depth Z0) is the magnification. When the scene relief is small compared to its distance from the camera, m can be taken as constant: weak perspective projection, x = m X, y = m Y.

14 Image Processing Affine projection models: Orthographic projection When the camera is at a (roughly constant) distance from the scene, take m=1.
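A small sketch of the two affine models from slides 13–14. The weak-perspective magnification m = f / Z0 is the standard formulation and is assumed here (the original slide's symbols were lost in the transcript); the example values are illustrative.

```python
# Weak perspective: all points share a constant magnification m = f / Z0
# (Z0 = reference scene depth). Orthographic projection simply takes m = 1.

def project_weak_perspective(point_3d, f, Z0):
    X, Y, _ = point_3d          # the point's own depth is ignored (relief << Z0)
    m = f / Z0
    return (m * X, m * Y)

def project_orthographic(point_3d):
    X, Y, _ = point_3d          # m = 1: depth dropped entirely
    return (X, Y)

print(project_weak_perspective((1.0, 0.5, 10.2), f=0.05, Z0=10.0))  # (0.005, 0.0025)
print(project_orthographic((1.0, 0.5, 10.2)))                       # (1.0, 0.5)
```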

15 Image Processing Figure: planar pinhole perspective vs. orthographic projection vs. spherical pinhole perspective.

16 Image Processing Limits for pinhole cameras

17 Image Processing Limits for pinhole cameras

18 Image Processing Camera obscura + lens 

19 Image Processing Lenses. Snell's law (also known as Descartes' law): n1 sin α1 = n2 sin α2.

20 Image Processing Paraxial (or first-order) optics. Snell's law: n1 sin α1 = n2 sin α2. Small angles: n1 α1 ≈ n2 α2.
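A quick numerical check of how good the small-angle (paraxial) approximation is; the refractive indices below (air to glass) are just illustrative values.

```python
import math

# Compare exact Snell refraction n1*sin(a1) = n2*sin(a2) with the paraxial
# approximation n1*a1 ≈ n2*a2 for a few incidence angles (air -> glass).

def refract_exact(n1, n2, a1):
    return math.asin(n1 * math.sin(a1) / n2)

def refract_paraxial(n1, n2, a1):
    return n1 * a1 / n2

n1, n2 = 1.0, 1.5
for deg in (1, 5, 15, 30):
    a1 = math.radians(deg)
    print(f"{deg:2d} deg: exact {math.degrees(refract_exact(n1, n2, a1)):6.3f}, "
          f"paraxial {math.degrees(refract_paraxial(n1, n2, a1)):6.3f}")
```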

21 Image Processing Thin Lenses: spherical lens surfaces; incoming light approximately parallel to the axis; thickness << radii; same refractive index on both sides.

22 Image Processing Thin Lenses http://www.phy.ntnu.edu.tw/java/Lens/lens_e.html
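The thin-lens figure itself did not survive the transcript, so here is a sketch based on the standard thin-lens equation 1/d_o + 1/d_i = 1/f (supplied as the usual formulation, not quoted from the slide; both distances are taken positive).

```python
# Thin-lens equation: 1/d_o + 1/d_i = 1/f, with object distance d_o,
# image distance d_i and focal length f.

def image_distance(f, d_o):
    """Distance behind the lens at which an object at distance d_o is in focus."""
    if d_o <= f:
        raise ValueError("object at or inside the focal distance: no real image")
    return 1.0 / (1.0 / f - 1.0 / d_o)

print(image_distance(f=0.05, d_o=2.0))   # ~0.0513 m, just beyond the focal point
print(image_distance(f=0.05, d_o=0.5))   # ~0.0556 m, closer objects focus farther back
```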

23 Image Processing Field of view

24 Parameters of an optical system. Two parameters characterize an optical system: the focal length f and the diameter D, which determines the amount of light hitting the image plane. Other elements in the figure: the focal point F and the optical center (center of projection).

25 Parameters of an optical system. The relative aperture is the ratio D/f. Its inverse is called the diaphragm aperture a (the f-number, written f/#), defined as a = f/D. The diaphragm is a mechanism that limits the amount of light passing through the optical system and reaching the image plane where the photosensors are deposited (e.g. a CCD sensor). The diaphragm is composed of many lamellae hinged on a ring, which rotate in a synchronized manner to vary the size of the circular opening, thus limiting the passage of light. (Figure: diaphragm, focal point F.)

26 Parameters of an optical system. The aperture scale progresses in steps of √2, so the admitted light (proportional to the square of the relative aperture) halves at each step: the first value is 1, and the following values are 1.4, 2, 2.8, 4, 5.6, 8, 11, 16, 22, 32, 45, 64, … Normally an optical system is dynamically configured to project the right amount of light by compensating with the exposure time.
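A short sketch tying together the quantities on slides 25–26: the f-number a = f/D and the fact that each √2 step on the aperture scale halves the collected light, which scales with (D/f)². Function names and the example lens are illustrative.

```python
import math

# f-number a = f/D; the light reaching the sensor scales with (D/f)^2 = 1/a^2,
# so each sqrt(2) step on the aperture scale halves the light.

def f_number(f, D):
    return f / D

def relative_light(a):
    return 1.0 / (a * a)        # relative to an f/1 aperture

print(f_number(f=50.0, D=25.0))     # 2.0: a 50 mm lens with a 25 mm aperture is f/2
for i in range(6):
    a = math.sqrt(2) ** i           # 1, 1.4, 2, 2.8, 4, 5.6
    print(f"f/{a:3.1f}  relative light = {relative_light(a):.4f}")
```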

27 Parameters of an optical system. A 35 mm lens set at f/11; its aperture can vary from f/2.0 to f/22.

28 Parameters of an optical system.

29 Lens field of view computation. Lens choice depends on the scene to be acquired. For cameras with a 1/4" CCD: focal length (mm) = target distance (m) × 3.6 / width (m). For all other cameras with a 1/3" CCD: focal length (mm) = target distance (m) × 4.8 / width (m).
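A direct transcription of the rule-of-thumb formulas on slide 29 into a small helper; the constants 3.6 and 4.8 come from the slide, while the function and argument names are illustrative.

```python
# Rule-of-thumb lens selection from slide 29:
#   1/4" CCD:  focal length (mm) = target distance (m) * 3.6 / scene width (m)
#   1/3" CCD:  focal length (mm) = target distance (m) * 4.8 / scene width (m)

def focal_length_mm(target_distance_m, scene_width_m, ccd_format='1/4"'):
    factor = {'1/4"': 3.6, '1/3"': 4.8}[ccd_format]
    return target_distance_m * factor / scene_width_m

# Example: frame a 4 m wide scene from 10 m away.
print(focal_length_mm(10.0, 4.0, ccd_format='1/4"'))   # 9.0 mm
print(focal_length_mm(10.0, 4.0, ccd_format='1/3"'))   # 12.0 mm
```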

30 Focus and depth of field. Changing the aperture size affects the depth of field: a smaller aperture increases the range in which the object is approximately in focus (compare f/5.6 and f/32). Flower images from Wikipedia: http://en.wikipedia.org/wiki/Depth_of_field
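A small sketch, using the thin-lens geometry above, of why a smaller aperture increases the depth of field. The blur-disk formula follows from similar triangles and is supplied here as an assumption, not quoted from the slide; the numbers are illustrative.

```python
# An object at d_object focuses at v_o = 1/(1/f - 1/d_object). If the sensor
# sits at the in-focus plane for d_focus, the object smears into a disk of
# diameter ~ A * |v_o - v_s| / v_o, with A the aperture diameter. A smaller
# aperture (larger f-number) therefore gives a smaller blur and more depth of field.

def blur_diameter(f, aperture_diam, d_focus, d_object):
    v_s = 1.0 / (1.0 / f - 1.0 / d_focus)    # sensor (in-focus) plane position
    v_o = 1.0 / (1.0 / f - 1.0 / d_object)   # where the object actually focuses
    return aperture_diam * abs(v_o - v_s) / v_o

f = 0.05                                      # 50 mm lens focused at 2 m
for n in (5.6, 32):                           # the two f-numbers from the slide
    A = f / n
    blur_um = blur_diameter(f, A, d_focus=2.0, d_object=3.0) * 1e6
    print(f"f/{n}: blur of a point at 3 m is about {blur_um:.1f} micrometres")
```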

31 Depth from focus [figs from H. Jin and P. Favaro, 2002]: images taken from the same point of view with different camera parameters yield 3D shape / depth estimates.

32 Field of view: the angular measure of the portion of 3D space seen by the camera. Images from http://en.wikipedia.org/wiki/Angle_of_view. K. Grauman

33 Field of view depends on focal length. As f gets smaller, the image becomes more wide-angle: more world points project onto the finite image plane. As f gets larger, the image becomes more telescopic: a smaller part of the world projects onto the finite image plane. From R. Duraiswami

34 Field of view depends on focal length Smaller FOV = larger Focal Length Slide by A. Efros
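A sketch of the standard relation behind slides 32–34 (it is not stated explicitly in the slide text): FOV = 2·atan(d / 2f) for a sensor of extent d, so the field of view shrinks as the focal length grows. The 36 mm sensor width is an illustrative assumption.

```python
import math

# Angular field of view of a pinhole/thin-lens camera: FOV = 2 * atan(d / (2*f)),
# where d is the sensor extent and f the focal length (same units).

def field_of_view_deg(sensor_size_mm, focal_length_mm):
    return math.degrees(2.0 * math.atan(sensor_size_mm / (2.0 * focal_length_mm)))

# Horizontal FOV for a 36 mm wide (full-frame) sensor: smaller f -> wider angle.
for f in (20, 35, 50, 100, 200):
    print(f"f = {f:3d} mm  ->  FOV = {field_of_view_deg(36.0, f):5.1f} deg")
```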

35 Vignetting http://www.ptgui.com/examples/vigntutorial.html http://www.tlucretius.net/Photo/eHolga.html

36 Vignetting. Two kinds: "natural" (gradual light falloff toward the edges of the image) and "mechanical" (a physical intrusion on the optical path blocks part of the light).

37 Chromatic aberration

38

39 Image Processing Deviations from the lens model. Three assumptions: 1. all rays from a point are focused onto one image point; 2. all image points lie in a single plane; 3. magnification is constant. Deviations from this ideal are called aberrations.

40 Image Processing Aberrations. Two types: 1. geometrical, 2. chromatic. Geometrical aberrations are small for paraxial rays and are studied through 3rd-order optics. Chromatic aberration arises because the refractive index is a function of wavelength.

41 Image Processing Geometrical aberrations: spherical aberration, astigmatism, distortion, coma. Aberrations are reduced by combining lenses.

42 Image Processing Spherical aberration: rays parallel to the axis do not converge; the outer portions of the lens yield smaller focal lengths.

43 Image Processing Astigmatism Different focal length for inclined rays

44 Image Processing Distortion: the magnification/focal length is different for different angles of inclination. It can be corrected (if the parameters are known). Pincushion distortion (tele-photo lenses), barrel distortion (wide-angle lenses).
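A sketch of how radial distortion is commonly modelled; the polynomial model and the coefficients below are illustrative assumptions, not taken from the slides.

```python
# A common radial distortion model in normalized image coordinates:
#   x_d = x * (1 + k1*r^2 + k2*r^4),  y_d = y * (1 + k1*r^2 + k2*r^4),
# with r^2 = x^2 + y^2. Negative k1 gives barrel distortion (wide-angle),
# positive k1 gives pincushion distortion (tele-photo). Once k1, k2 are known
# (e.g. from calibration) the mapping can be inverted numerically, which is
# why the slide says distortion can be corrected.

def distort(x, y, k1, k2):
    r2 = x * x + y * y
    factor = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * factor, y * factor

print(distort(0.5, 0.2, k1=-0.10, k2=0.01))   # barrel: point pulled toward the center
print(distort(0.5, 0.2, k1=+0.10, k2=0.01))   # pincushion: point pushed outward
```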

45 Image Processing Coma: a point off the axis is depicted as a comet-shaped blob.

46 Image Processing Chromatic aberration: rays of different wavelengths are focused in different planes. It cannot be removed completely; sometimes achromatization is achieved for more than 2 wavelengths.

47 Digital cameras. Film → sensor array, often an array of charge-coupled devices (CCDs). Each CCD element is a light-sensitive diode that converts photons (light energy) to electrons. Figure: optics and CCD array (camera) → frame grabber → computer. K. Grauman

48 Historical context. Pinhole model: Mozi (470-390 BCE), Aristotle (384-322 BCE). Principles of optics (including lenses): Alhacen (965-1039 CE). Camera obscura: Leonardo da Vinci (1452-1519), Johann Zahn (1631-1707). First photo: Joseph Nicéphore Niépce (1822). Daguerréotypes (1839). Photographic film (Eastman, 1889). Cinema (Lumière Brothers, 1895). Color photography (Lumière Brothers, 1908). Television (Baird, Farnsworth, Zworykin, 1920s). First consumer camera with CCD: Sony Mavica (1981). First fully digital camera: Kodak DCS100 (1990). (Figures: Niépce, "La Table Servie," 1822; a CCD chip; Alhacen's notes.) Slide credit: L. Lazebnik, K. Grauman

49

50 Digital Sensors

51 Image Processing CCD vs. CMOS. CCD: mature technology; dedicated fabrication technology; high production cost; high power consumption; higher fill rate; blooming; sequential readout. CMOS: recent technology; standard IC technology; cheap; low power; less sensitive; per-pixel amplification; random pixel access; smart pixels; on-chip integration with other components.

52 Resolution. Sensor resolution: the size of the real-world scene element that images to a single pixel. Image resolution: the number of pixels. Resolution influences what analysis is feasible and affects the best choice of representation. [fig from Mori et al]

53 Digital images Think of images as matrices taken from CCD array. K. Grauman

54 Digital images. A grayscale image is a matrix of intensity values in [0, 255]; here it is 500 pixels high (rows i = 1…500) and 520 pixels wide (columns j = 1…520). For example, im[176][201] has value 164 and im[194][203] has value 37. K. Grauman
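A minimal sketch of slide 54's picture of an image as a matrix. The array sizes and pixel values are the ones quoted in the slide; using NumPy is an assumption, since the slide names no library.

```python
import numpy as np

# A grayscale image as a height x width matrix of 8-bit intensities in [0, 255].
im = np.zeros((500, 520), dtype=np.uint8)   # 500 rows (i), 520 columns (j)
im[176, 201] = 164                          # the two example pixels from the slide
im[194, 203] = 37

print(im.shape)                       # (500, 520)
print(im[176, 201])                   # 164
print(int(im.min()), int(im.max()))   # values stay within [0, 255] for uint8
```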

55 Color sensing in digital cameras. A Bayer grid of color filters covers the sensor; the missing color components at each pixel are estimated from neighboring values (demosaicing). Source: Steve Seitz

56 Filter mosaic. The color filter is coated directly on the sensor; demosaicing then recovers a full-colour, full-resolution image.
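A crude sketch of demosaicing for an RGGB Bayer layout: each missing colour sample is filled in by averaging the neighbouring samples of the same colour. The RGGB layout, the 3×3 averaging and the use of SciPy's convolve2d are all assumptions for illustration; real camera pipelines use much more sophisticated interpolation.

```python
import numpy as np
from scipy.signal import convolve2d

def demosaic_bilinear(mosaic):
    """Fill missing colour samples of an RGGB Bayer mosaic by neighbour averaging."""
    h, w = mosaic.shape
    r_mask = np.zeros((h, w)); r_mask[0::2, 0::2] = 1
    g_mask = np.zeros((h, w)); g_mask[0::2, 1::2] = 1; g_mask[1::2, 0::2] = 1
    b_mask = np.zeros((h, w)); b_mask[1::2, 1::2] = 1

    kernel = np.ones((3, 3))
    out = np.zeros((h, w, 3))
    for c, mask in enumerate((r_mask, g_mask, b_mask)):
        total = convolve2d(mosaic * mask, kernel, mode="same")  # sum of known samples
        count = convolve2d(mask, kernel, mode="same")           # how many were known
        out[..., c] = np.where(mask == 1, mosaic, total / count)
    return out

mosaic = np.random.randint(0, 256, (8, 8)).astype(float)
rgb = demosaic_bilinear(mosaic)
print(rgb.shape)    # (8, 8, 3): full colour at full resolution
```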

57 New color CMOS sensor: Foveon's X3. Better image quality, smarter pixels.

58 Color images, RGB color space (channels R, G, B). K. Grauman. Much more on color in the next lecture…

59 Issues with digital cameras. Noise: big difference between consumer and SLR-style cameras; low light is where you most notice noise. Compression: creates artifacts except in uncompressed formats (TIFF, RAW). Color: color-fringing artifacts from Bayer patterns. Blooming: charge overflowing into neighboring pixels. In-camera processing: oversharpening can produce halos. Interlaced vs. progressive scan video: even/odd rows come from different exposures. Are more megapixels better? They require a higher-quality lens and raise noise issues. Stabilization: compensates for camera shake (mechanical vs. electronic). More info online, e.g., http://electronics.howstuffworks.com/digital-camera.htm and http://www.dpreview.com/

60 Image Processing Other Cameras: Line Scan Cameras. In a line scanner the active element is 1-dimensional. They are usually employed for inspection. They require very intense light because of the small integration time (from 100 msec to 1 msec).

61 Image Processing The Human Eye. Helmholtz's Schematic Eye. Reproduced by permission, the American Society of Photogrammetry and Remote Sensing. A.L. Nowicki, "Stereoscopy." Manual of Photogrammetry, Thompson, Radlinski, and Speert (eds.), third edition, 1966.

62 Image Processing The distribution of rods and cones across the retina: cones in the fovea, rods and cones in the periphery. Reprinted from Foundations of Vision, by B. Wandell, Sinauer Associates, Inc., (1995). © 1995 Sinauer Associates, Inc.
