Digital Image Fundamentals


4TN3 Image Processing: Digital Image Fundamentals

Digital Image Fundamentals Elements of visual perception Image sensing and acquisition Sampling and quantization Relationship between pixels

The Human Visual System (HVS) A true measure of image processing quality is how well the image appears to the observer. The HVS is very complex and is not yet completely understood. However, many of its properties can be identified and used to our advantage.

Structure of Human Eye

Image formation in the eye The lens is flexible, and its refractive power is controlled by its thickness, which in turn is controlled by the tension of the muscles. To focus on distant objects, the lens is relatively flattened and its refractive power is minimal; to focus on near objects, the lens is thicker and its refractive power is maximal. Perception takes place by excitation of receptors, which transform radiant energy into electrical impulses that are decoded by the brain.

Brightness & Intensity The dynamic range of light intensity to which the eye can adapt is enormous. Subjective brightness is a logarithmic function of light intensity. The HVS cannot operate over this entire range simultaneously; it handles such large variations through brightness adaptation.

Brightness & Intensity Relationship between brightness and intensity is not a simple function!

Brightness & Intensity Mach Band Effect

Image sensing and acquisition If a sensor can be developed that is capable of detecting energy radiated in a band of the EM spectrum, we can image events in that band. An image is generated by the energy of an illumination source that is reflected by objects (natural scenes) or transmitted through them (X-ray imaging). A sensor detects this energy and converts it to an electrical signal; the sensor must be made of a material that is responsive to the particular type of energy being detected.

A simple image formation model The image intensity f(x,y) depends on: the amount of source illumination incident on the scene, i(x,y), and the reflectance of the objects in the scene, r(x,y), so that f(x,y) = i(x,y) r(x,y) with 0 <= r(x,y) <= 1. Total absorption: r(x,y) = 0. Total reflection: r(x,y) = 1.
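As a concrete illustration of this model, a minimal NumPy sketch (the illumination and reflectance values below are made up for the example):

    import numpy as np

    # Illumination i(x, y): how much light falls on each point of the scene.
    i = np.full((4, 4), 100.0)                 # uniform illumination (hypothetical values)

    # Reflectance r(x, y) in [0, 1]: 0 = total absorption, 1 = total reflection.
    r = np.array([[0.0, 0.2, 0.5, 1.0]] * 4)

    # Image formation model: f(x, y) = i(x, y) * r(x, y)
    f = i * r
    print(f)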

Sampling & Quantization For computer processing, an image f(x,y) must be digitized both spatially and in amplitude. Digitization of the spatial coordinates is called sampling; digitization of the amplitude is called quantization. The result is an N x M array [f(i,j)]. What should the values of N, M and the number of gray levels G be? Normally N = 2^n, M = 2^m, G = 2^k.

Sampling & Quantization

Sampling & Quantization The number of bits required to store the image is N x M x k. The larger N, M and G are, the better the digital image approximates the continuous one, but storage and processing requirements increase as well.
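A small sketch of quantization to k bits and the N x M x k storage estimate (the quantize helper and the test values are illustrative, not part of the lecture):

    import numpy as np

    def quantize(f, k):
        """Quantize a [0, 1] image to G = 2**k gray levels (values 0 .. G-1)."""
        G = 2 ** k
        return np.clip((f * G).astype(int), 0, G - 1)

    N, M, k = 512, 512, 8
    f = np.random.rand(N, M)        # stand-in for a continuous-valued image
    g = quantize(f, k)

    bits = N * M * k                # storage requirement in bits
    print(g.min(), g.max(), bits / 8 / 1024, "KiB")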

Effects of Reducing Spatial Resolution

Effects of Reducing Spatial Resolution

Effects of Reducing Gray Levels

Effects of Reducing Gray Levels

Zooming and Shrinkage Zooming: increasing the resolution (size) of an image. Shrinkage: decreasing the resolution of an image.

Zooming Pixel replication: each interpolated pixel in the enlarged image is set to the gray level of its nearest pixel in the original image. Bilinear interpolation: each interpolated pixel in the enlarged image is set to a (distance-weighted) average of its four nearest pixels in the original image.
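Both zooming rules can be sketched as below (a toy NumPy implementation; the function names are made up, and the bilinear version uses a distance-weighted combination of the four nearest original pixels):

    import numpy as np

    def zoom_replicate(img, factor):
        """Pixel replication: each new pixel copies its nearest original pixel."""
        return np.repeat(np.repeat(img, factor, axis=0), factor, axis=1)

    def zoom_bilinear(img, factor):
        """Bilinear zoom: each new pixel is interpolated from its 4 nearest originals."""
        h, w = img.shape
        ys = np.linspace(0, h - 1, h * factor)
        xs = np.linspace(0, w - 1, w * factor)
        y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, h - 1)
        x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, w - 1)
        wy = (ys - y0)[:, None]; wx = (xs - x0)[None, :]
        a = img[np.ix_(y0, x0)]; b = img[np.ix_(y0, x1)]
        c = img[np.ix_(y1, x0)]; d = img[np.ix_(y1, x1)]
        return (a * (1 - wy) * (1 - wx) + b * (1 - wy) * wx
                + c * wy * (1 - wx) + d * wy * wx)

    img = np.array([[0.0, 100.0], [100.0, 200.0]])
    print(zoom_replicate(img, 2))
    print(zoom_bilinear(img, 2))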

Zooming: example

Shrinkage Shrinkage by an integer factor can be done by deleting some of the rows and columns of the image. Shrinkage by a non-integer factor can be done as the inverse of zooming.
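Integer-factor shrinkage as described, sketched with array slicing (the shrink helper is illustrative):

    import numpy as np

    def shrink(img, n):
        """Shrink by an integer factor n by keeping every n-th row and column."""
        return img[::n, ::n]

    img = np.arange(36).reshape(6, 6)
    print(shrink(img, 2))   # 3x3 result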

Relationship between pixels [Example binary image used to motivate the pixel-relationship concepts below.]

Relationship between pixels: neighbors, adjacency, paths, connectivity, regions, boundaries, distance.

Basic relationships between pixels A pixel p at coordinates (x,y) has four horizontal and vertical neighbors: N4(p) = {(x+1,y), (x-1,y), (x,y+1), (x,y-1)}. The four diagonal neighbors of p: ND(p) = {(x+1,y+1), (x-1,y-1), (x-1,y+1), (x+1,y-1)}. The eight neighbors of p: N8(p) = N4(p) ∪ ND(p).
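These neighborhoods translate directly into code (a minimal sketch; boundary checking is omitted for brevity):

    def N4(x, y):
        """4-neighbors: horizontal and vertical."""
        return {(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)}

    def ND(x, y):
        """Diagonal neighbors."""
        return {(x + 1, y + 1), (x - 1, y - 1), (x - 1, y + 1), (x + 1, y - 1)}

    def N8(x, y):
        """8-neighbors: union of N4 and ND."""
        return N4(x, y) | ND(x, y)

    print(sorted(N8(2, 2)))   # the eight pixels surrounding (2, 2)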

Adjacency Two pixels are adjacent if they are neighbors and their gray levels are similar. V: the set of gray-level values used to define similarity; two pixels are similar if both of their gray levels belong to V. Examples: binary images: V = {1}; gray-level images: V = {32, 33, ..., 63, 64}.

Adjacency 4-adjacency: two pixels p and q with values from V are 4-adjacent if q is in N4(p). 8-adjacency: two pixels p and q with values from V are 8-adjacent if q is in N8(p). With 4-adjacency paths can be broken; with 8-adjacency multiple (ambiguous) paths can arise. Example image (V = {1}):
0 1 1
0 1 0
0 0 1

Adjacency m-adjacency (mixed adjacency): two pixels p and q with values from V are m-adjacent if q is in N4(p), or q is in ND(p) and the set N4(p) ∩ N4(q) has no pixels with values from V. For the same 3x3 example image as above, m-adjacency removes the ambiguous multiple paths that 8-adjacency allows.
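A sketch of the m-adjacency test for a binary image with V = {1}, following the definition above (the m_adjacent helper is illustrative and assumes (row, column) coordinates):

    def m_adjacent(img, p, q, V=frozenset({1})):
        """p and q are m-adjacent if q is in N4(p), or q is in ND(p) and
        N4(p) and N4(q) share no pixel whose value is in V."""
        def n4(x, y):
            return {(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)}
        def nd(x, y):
            return {(x + 1, y + 1), (x - 1, y - 1), (x - 1, y + 1), (x + 1, y - 1)}
        def in_V(pt):
            x, y = pt
            return 0 <= x < len(img) and 0 <= y < len(img[0]) and img[x][y] in V
        if not (in_V(p) and in_V(q)):
            return False
        if q in n4(*p):
            return True
        if q in nd(*p):
            return not any(in_V(t) for t in n4(*p) & n4(*q))
        return False

    img = [[0, 1, 1],
           [0, 1, 0],
           [0, 0, 1]]
    print(m_adjacent(img, (0, 1), (1, 1)))   # True: they are 4-adjacent
    print(m_adjacent(img, (1, 1), (2, 2)))   # True: their shared 4-neighbors have value 0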

Region R: a subset of pixels in an image. R is called a region if every pixel in R is connected to every other pixel in R. Boundary (border or contour) of a region R: the set of pixels in R that have one or more neighbors that are not in R. [Example binary images illustrating a region and its boundary.]

Connected Components

Distance measures For pixels p, q and z with coordinates (x,y), (s,t) and (v,w), respectively, D is a distance function (metric) if: D(p,q) >= 0, with D(p,q) = 0 if and only if p = q; D(p,q) = D(q,p); and D(p,z) <= D(p,q) + D(q,z).

Distance measures D4 (city-block) distance: D4(p,q) = |x - s| + |y - t|. D8 (chessboard) distance: D8(p,q) = max(|x - s|, |y - t|). [Example figure: a grid of pixel values and the corresponding D4 distances.]
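The city-block, chessboard and Euclidean distances written out as code (a minimal sketch; the Euclidean distance is included for comparison):

    def D_euclidean(p, q):
        """Euclidean distance."""
        return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5

    def D4(p, q):
        """City-block distance: |x - s| + |y - t|."""
        return abs(p[0] - q[0]) + abs(p[1] - q[1])

    def D8(p, q):
        """Chessboard distance: max(|x - s|, |y - t|)."""
        return max(abs(p[0] - q[0]), abs(p[1] - q[1]))

    p, q = (0, 0), (3, 2)
    print(D_euclidean(p, q), D4(p, q), D8(p, q))   # 3.605..., 5, 3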

Distance measures Dm distance: the length of the shortest m-path between two pixels. The D4 and D8 distances between p and q are independent of the pixels along the path, whereas Dm depends on the values of the pixels between p and q. Example (V = {1}): in the left arrangement the shortest m-path between the lower-left 1 and the upper-right 1 has length Dm = 3; in the right arrangement it has length Dm = 2.
0 0 1    0 0 1
1 1 0    0 1 0
1 0 0    1 0 0
Dm = 3   Dm = 2

Linear & Non-linear operations H: an operator whose inputs and outputs are images. H is linear if, for any two images f and g and any two scalars a and b, H(af + bg) = aH(f) + bH(g).
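A quick numerical check of this definition on two familiar operators, summation (linear) and maximum (non-linear); this is an illustrative sketch, and a single test case does not prove linearity in general:

    import numpy as np

    def is_linear(H, f, g, a=2.0, b=-3.0):
        """Check H(a*f + b*g) == a*H(f) + b*H(g) for one sample input."""
        return np.allclose(H(a * f + b * g), a * H(f) + b * H(g))

    f = np.random.rand(4, 4)
    g = np.random.rand(4, 4)

    print(is_linear(np.sum, f, g))   # True: summation is a linear operator
    print(is_linear(np.max, f, g))   # False (in general): max is non-linear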

End of Lecture