Modeling the imaging system

Modeling the imaging system Why? If a customer gives you a specification of what they wish to see and of the environment in which the system should perform, you as the designer should be able to specify the space of observables, that is, the observable reachable set. We can then specify the range of parameters within which we can guarantee a certain performance of the system.

What are the components of an imaging system? Optics: field of view (FOV), focal length f, center of the optical axis, and spherical aberration, which leads to a blur circle, where b is the blur-circle diameter, d is the diameter of the lens, z' is the imaged distance from the lens, and z is the real distance that we wish to have in focus.
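The blur-circle formula itself did not survive the transcript. One common thin-lens form, consistent with the variables named above and hedged as a reconstruction rather than the slide's exact equation, combines the thin-lens law with similar triangles through the aperture:

```latex
\frac{1}{z} + \frac{1}{z'} = \frac{1}{f}, \qquad
b = \frac{d\,\lvert z' - z\rvert}{z'}
```

Here the blur b vanishes when the sensor plane coincides with the in-focus image plane and grows linearly with the aperture diameter d.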

The effects of light on our imaging system Denote the radiant flux, measured in watts. The amount of that flux falling on a surface patch A gives the irradiance E. The radiant intensity I is the flux per unit solid angle. Finally, the radiance L is the power per unit projected area per unit solid angle.
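The formulas on this slide were lost in the transcript; the standard radiometric definitions they refer to are:

```latex
\Phi \;[\mathrm{W}] \ \text{(radiant flux)}, \qquad
E = \frac{d\Phi}{dA} \;\Bigl[\tfrac{\mathrm{W}}{\mathrm{m^2}}\Bigr] \ \text{(irradiance)}, \qquad
I = \frac{d\Phi}{d\omega} \;\Bigl[\tfrac{\mathrm{W}}{\mathrm{sr}}\Bigr] \ \text{(radiant intensity)}, \qquad
L = \frac{d^2\Phi}{dA\cos\theta\, d\omega} \;\Bigl[\tfrac{\mathrm{W}}{\mathrm{m^2\,sr}}\Bigr] \ \text{(radiance)},
```

where θ is the angle between the surface normal and the direction of propagation (the cos θ term is the projected-area foreshortening the slide mentions).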

Light Sources We consider a point source (isotropic) and an area source (hemispherical). If the source is a point but isotropic, how much light reaches the surface patch? Its radiant intensity is the total flux spread over the full sphere.

Point isotropic source The amount of flux collected through a solid angle is proportional to the intensity over the full sphere and to that solid angle. Hence the irradiance on the patch A falls off as the inverse square of the distance to the source.
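The slide's equations are missing; for an isotropic point source of total flux Φ, the standard results being described are:

```latex
I = \frac{\Phi}{4\pi}, \qquad
E = \frac{I\cos\theta}{r^2} = \frac{\Phi\cos\theta}{4\pi r^2},
```

where r is the distance from the source to the patch and θ is the angle between the surface normal and the direction toward the source.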

Reflectance If the source is hemispherical, then the irradiance E is proportional to the radiance multiplied by π. Reflectance f is defined as the ratio between the amount of light that gets reflected (radiance L) and the amount of light collected on the surface patch (irradiance E). In this case the surface itself becomes a light source! The subscript i stands for incident and r for reflected.
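A hedged reconstruction of the missing formulas: the reflectance (BRDF) and the uniform hemispherical-source irradiance are conventionally written as

```latex
f(\theta_i,\phi_i;\,\theta_r,\phi_r) = \frac{L_r(\theta_r,\phi_r)}{E_i(\theta_i,\phi_i)},
\qquad
E = \pi L \quad \text{(uniform hemispherical source)},
```

the factor π arising from integrating cos θ over the hemisphere of incoming directions.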

Diffuse Reflectance This names the assumption that light from the surface appears equally bright from all viewing directions; hence the radiance L is constant. In the resulting decomposition, the subscript B stands for body reflection and s stands for surface reflection.

Lambertian reflection The same assumption as in ideal diffuse reflection, except that the surface absorbs some of the light.
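The absorbed fraction is usually expressed through an albedo ρ; the standard Lambertian relation the slide refers to (reconstructed, not verbatim from the slide) is

```latex
L = \frac{\rho}{\pi}\, E, \qquad 0 \le \rho \le 1,
```

with ρ = 1 recovering the ideal diffuse surface and ρ < 1 modeling partial absorption.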

Know your sensor: Sensor Errors These errors are called intrinsic. Any undesired feature causing discrepancies in the digital image is considered noise. Systematic noise (errors) affects the ACCURACY of vision algorithms. Random errors primarily affect the PRECISION, i.e. the variability in the results due to random noise in the digital images. In order to establish the accuracy, the results must be compared with ground-truth models, which are difficult to obtain.

Precision A complete characterization of the precision consists of a probability distribution for the output noise. This is very difficult to obtain, since there are usually too many factors influencing the results. A methodology for performance evaluation should characterize not only the errors that depend on the environmental conditions (extrinsic errors) but also those that depend on sensor characteristics.

The Video Sensor The video sensor consists of a lens, a CCD camera and a frame-grabber. The image is formed on the CCD array of identical sensing elements (cells) and then transferred by the frame-grabber in a linear order (line by line) to the computer memory. The geometric and radiometric uncertainties and discrepancies in the digital image are due to the optics, the CCD camera, and the joint operation of the camera, the frame-grabber and other electronic components.

Camera-related noise The total random noise of the CCD has three major components: photon (shot) noise, read noise, and fixed-pattern noise. The source of the photon noise is external, due to fluctuations of the photon flux, and it is always present in the data.
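Photon arrivals follow Poisson statistics, so the shot-noise variance equals the mean photon count; this standard result (not spelled out on the slide) explains why shot noise can never be calibrated away, only averaged down:

```latex
\sigma_{\text{shot}}^2 = \mu, \qquad
\mathrm{SNR} = \frac{\mu}{\sigma_{\text{shot}}} = \sqrt{\mu}.
```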

Camera-related noise The read noise is related to the physics of the camera and the measurement process (the background noise and the output-amplifier noise). Noticeable components of the background noise are the dark current (thermally generated charges) and the internal luminescence. Dark current doubles with every 8-degree increase in temperature, but it is also due to irregularities in the crystal structure, which contribute to a fixed-pattern noise. There is also non-uniformity in the photo response of individual photocells, which is observed in flat fields.

Radiometric correction Even given exactly the same scene and illumination, physically different sensors see differently. Radiometric correction is also called flat fielding. A corrected image with zero offset and constant gain is obtained as follows: we subtract from the original image the offset observed in the averaged dark image, and scale the resulting image inversely proportionally to the photo response observed in the averaged flat field.
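The procedure above can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' exact pipeline; the function and array names, the dead-pixel guard, and the rescaling by the mean gain are assumptions made for the sketch:

```python
import numpy as np

def flat_field_correct(raw, dark_avg, flat_avg):
    """Flat-fielding: subtract the averaged dark image (removing the
    offset), then divide by the dark-subtracted averaged flat field so
    every pixel ends up with the same effective gain."""
    gain = flat_avg - dark_avg                     # per-pixel photo response
    gain = np.where(gain <= 0, gain.mean(), gain)  # guard against dead pixels
    return (raw - dark_avg) / gain * gain.mean()   # rescale to sensor units
```

With a uniform scene, the corrected image comes out flat even when the per-pixel gains vary, which is exactly the "constant gain" property the slide asks for.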

Intrinsic parameters: geometric image formation

Intrinsic parameters (contd.) Pixel skew and the overall intrinsic parameter matrix.
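The matrix itself is missing from the transcript; the standard form, with focal lengths in pixel units (α, β), skew s, and principal point (u0, v0), is:

```latex
K = \begin{bmatrix} \alpha & s & u_0 \\ 0 & \beta & v_0 \\ 0 & 0 & 1 \end{bmatrix}
```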

CAMERA PARAMETERS – Radial Distortion A nonlinear transformation along the radial direction. Distortion correction: make lines straight.
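A minimal sketch of the polynomial radial-distortion model usually meant here; the coefficient names k1, k2 follow common convention and this is an illustration, not necessarily the slide's exact model:

```python
def radial_distort(x, y, k1, k2=0.0):
    """Apply radial distortion to normalized image coordinates (x, y):
    each point is pushed along its radius by the factor
    1 + k1*r^2 + k2*r^4, where r^2 = x^2 + y^2."""
    r2 = x * x + y * y
    factor = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * factor, y * factor
```

Correction inverts this mapping (typically by iteration); straight scene lines map to straight image lines only after the inversion, which is what "make lines straight" refers to.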

Frame-grabber related noise Geometric discrepancies due to the digitization in the frame-grabber are aliasing, line jitter, and systematic fall-off of the intensity along a line. An effect of radiometric distortion due to interlacing is the shift in gray level between the odd and even fields of a frame.

Example of a multi-camera setup Calibration in a multi-camera setup is more important than in the single-camera case, because the cameras, acting as one imaging system, must be normalized and coordinated.

Dark Image Analysis The method: we took 3 black-and-white cameras and capped their lenses in a dark room. For each camera we recorded 100 dark images with a delay of 20 ms between two consecutive images. The stability of the background noise was examined by capturing 100 dark images over 100 seconds. For each set of dark images, the average per-pixel intensity was calculated and the max and min of the pixel values were determined.
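The per-pixel averaging and min/max extraction described above can be sketched as follows (a NumPy sketch under the assumption that the frames are stacked into one array; names are illustrative, not from the original experiment code):

```python
import numpy as np

def dark_image_stats(frames):
    """Per-pixel statistics over a stack of dark frames,
    shaped (n_frames, rows, cols)."""
    stack = np.asarray(frames, dtype=np.float64)
    return {
        "mean": stack.mean(axis=0),  # average per-pixel intensity
        "min": stack.min(axis=0),
        "max": stack.max(axis=0),
        "std": stack.std(axis=0),    # temporal noise estimate per pixel
    }
```

The mean map estimates the fixed dark offset per pixel, while the min-max spread (or the std map) indicates the temporal stability the experiment set out to check.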

Example of modeling dark noise

One “bad” camera

Table of results of the mean intensity of 1000 dark images

Conclusion on dark-image noise By and large, this effect and the fixed-pattern noise are negligible for reconstruction purposes. What remains to be examined is the flat field, that is, the response of the sensor to uniform illumination.

Radial distortion We will assume it has been compensated (Tsai '86); see Intel OpenCV in the lab assignment.

Conclusion What is the observable reachable set? It is a multidimensional space delineated by the range limits of all the image-acquisition parameters, plus the assumptions that one makes about the illumination, the environment and the task/context. What is the benefit? One can make performance guarantees within those bounds.

Summary of parameters Optics: FOV, focal length, diameter of the lens. CCD: light/spectral sensitivity range, the dark current, the saturation light level, the homogeneity of the array of light-sensitive cells. Assumptions about the illumination source: its intensity, its extent (point source vs. hemispherical source), its distance from the scene and, if possible, the angle of the incident light.

Summary of parameters, cont. Assumptions about the scene materials (Lambertian, metallic, plastic, surface texture smooth vs. rough, and so on). Geometric intrinsic parameters of the camera: scale of the pixels, center of the imaging device and radial distortion. Spatial, temporal and signal resolution; this includes the speed of analog-to-digital conversion. Finally, if the task is to run the system in varied atmospheric conditions, such as fog, rain and night, different spectral bands will be required.

How to obtain these parameters? Some come from the manufacturer, and it is the designer who selects them. The others have to be measured during setup, at calibration time. The parameters of the environment can only be estimated, but their range can be given.