Digital Image Processing Lecture 2

Presentation transcript:

Digital Image Processing Lecture 2 Tariq Mahmood Khan

Image Processing (Computer Vision) - Recap “Inverse Photography”

Stages in Computer Vision
- Physics: image formation (light, reflectance)
- Physics: cameras: optics (lens), sensors (CCD, CMOS)
- Image processing: coding (transmission, compression)
- Image processing: enhancement (noise cleaning, colors)
- IP-CV: feature detection (objects, actions, motion)
- Computer vision: scene recovery (3D, reflectance)
- Computer vision: object recognition
- Human and machine vision: visual perception
- Robotics: control action (autonomous driving)

DIP Systems
All digital image processing systems consist of some means to (1) digitise / acquire the images, (2) process the images (computing capability), (3) save the images, (4) produce human-readable hardcopy, and (5) communicate the images to other systems.

Image Acquisition
Light is emitted by a light source. Light is reflected from objects. The reflected light is sensed (captured) by the eye or by a camera. In general, any sensor that can produce spatially distributed intensity values of electromagnetic radiation is suited to image capture.

Types of Image Capturing Systems
In everyday life a number of image capturing systems are used, depending on the application field. They differ in:
- acquisition principle
- acquisition speed
- spatial resolution
- sensor system

Classification of Sensors
Sensors can be categorised into the following classes according to their sensitivity ranges:
Electromagnetic sensors, sensitive to a certain range of electromagnetic radiation:
- gamma radiation
- X-ray radiation
- the visible spectrum
- the infrared spectrum
- the radio wave range
Non-electromagnetic sensors:
- ultrasonic sensors

Image Acquisition

Image Description
f(x, y) is the intensity/brightness of the image at spatial coordinates (x, y), with 0 < f(x, y) < ∞, and is determined by two factors:
- the illumination component i(x, y): the amount of source light incident on the scene
- the reflectance component r(x, y): the amount of light reflected by the objects
f(x, y) = i(x, y) r(x, y), where
0 < i(x, y) < ∞ is determined by the light source, and
0 < r(x, y) < 1 is determined by the characteristics of the objects.
In the case of X-rays, we would deal with a transmissivity instead of a reflectivity.
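
As a concrete (hypothetical) illustration of this model, the Python/NumPy sketch below multiplies a made-up illumination field i(x, y) by a made-up reflectance map r(x, y) to form f(x, y); all array values are invented for illustration, not taken from the lecture.

```python
import numpy as np

# Hypothetical illumination field i(x, y) in arbitrary units (0 < i < infinity)
i = np.array([[900.0, 900.0, 800.0, 800.0],
              [900.0, 850.0, 800.0, 750.0],
              [850.0, 800.0, 750.0, 700.0],
              [800.0, 750.0, 700.0, 650.0]])

# Hypothetical reflectance map r(x, y), with 0 < r < 1
# (0 would mean total absorption, 1 total reflection)
r = np.array([[0.10, 0.10, 0.80, 0.80],
              [0.10, 0.10, 0.80, 0.80],
              [0.65, 0.65, 0.30, 0.30],
              [0.65, 0.65, 0.30, 0.30]])

# Image formation model from the slide: f(x, y) = i(x, y) * r(x, y)
f = i * r
print(f)
```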

Digital Image Formation
The digital image is a numerical computer representation of the physical image. The physical image is divided into small regions called picture elements, or pixels. The number stored for each pixel represents the brightness of the scene in that region.

The Digital Image The conversion process from physical to digital image is called digitisation. At each pixel location, the brightness of the physical image is quantized and converted into an integer number, called the grey level.

Sampling and Quantization (figure: digital line scan)

The Digital Image
The displayed image is stored as an array of numbers in computer memory. Colour images are sampled three times, giving three digital images, one for each component of the colour model (e.g. RGB, CMY, or HSI).

Digital Image
Each pixel has an address in the digital image, i.e. a row (line) number and a column (sample) number. Typically, the origin (x, y) = (0, 0) is at the top-left corner of the image. A digital image of 640 horizontal pixels and 400 vertical pixels has address values x = 0-639 and y = 0-399.
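
A small note on addressing in practice: if the image is held in a NumPy array (an assumption about the implementation, not something the slide specifies), the array is indexed row-first, so the pixel at column x and row y is read as img[y, x].

```python
import numpy as np

# A 640 (horizontal) x 400 (vertical) 8-bit image is stored as a
# (rows, columns) = (400, 640) array.
img = np.zeros((400, 640), dtype=np.uint8)

x, y = 639, 399        # address of the bottom-right pixel
value = img[y, x]      # row index (y) comes first, column index (x) second
print(img.shape, value)
```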

The Digital Image
The digital image should adequately resolve all spatial and intensity details of the original continuous-tone image. The Nyquist (sampling) theorem requires that the pixel size be less than half the size of the finest detail in the original image. Likewise, the grey-level increments should be less than half the smallest tonal variation in the original image.

The Digital Image Undersampling occurs when the number of pixels in a digital image is too low to accurately represent the fine details present in the original image.

The Digital Image
Undersampling results in spatial aliasing. The example shows this effect as Moiré patterns.

Digital Image: Spatial and Intensity Resolution
Spatial resolution refers to the number of pixels in the digital image; typically, 256 x 256 is the minimum acceptable spatial resolution. Intensity resolution refers to the number of grey levels available in the digital image. The number of pixels is typically 2^N for computing convenience.

Spatial Resolution

Spatial Resolution

The Digital Image - Zooming
Although a digital image may appear smooth to the human eye, when it is zoomed in far enough the individual pixels always become visible.

Intensity Resolution / Grey-level Resolution
Intensity resolution refers to the number of grey levels available in the digital image.
(a) 256 grey levels, (b) 128 grey levels, (c) 64 grey levels, (d) 32 grey levels.

Intensity Resolution / Grey-level Resolution
For convenient computer storage, the number of grey levels is almost always 2^N, where N is the number of bits.
(e) 16 grey levels, (f) 8 grey levels, (g) 4 grey levels, (h) 2 grey levels. Image (h) is a binary image.

Intensity Resolution / Grey-level Resolution
Typically, the minimum acceptable number of grey levels is 16. Note the false contouring introduced when the intensity resolution is too low.
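
A minimal sketch of reducing intensity resolution by uniform requantization, which is one way to reproduce the false-contouring effect; the function name and the ramp test image are illustrative, not taken from the lecture.

```python
import numpy as np

def requantize(img, bits):
    """Reduce an 8-bit greyscale image to 2**bits grey levels (uniform quantization)."""
    levels = 2 ** bits
    step = 256 // levels                              # width of each quantization bin
    return ((img // step) * step).astype(np.uint8)    # snap each pixel to its bin

# Toy test image: a smooth horizontal ramp from 0 to 255
ramp = np.tile(np.arange(256, dtype=np.uint8), (64, 1))
coarse = requantize(ramp, 4)              # only 16 grey levels -> visible false contours
print(np.unique(coarse).size)             # 16
```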

The Digital Image
For a square digital image, let N^2 be the number of pixels and 2^k the number of grey levels; the image then requires b = N^2 x k bits of storage. The memory requirements for digital images are therefore large: one typical high-resolution image requires 1 megabyte of memory, and colour images require three times the memory of monochrome images.
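
A quick numeric check of the storage figures, assuming the standard b = N^2 * k formula stated above; the 1024 x 1024, 8-bit example is an assumption consistent with the 1-megabyte figure quoted on the slide.

```python
def storage_bits(N, k):
    """Bits needed to store an N x N image with 2**k grey levels: b = N * N * k."""
    return N * N * k

# Assumed example: a 1024 x 1024 image with 256 grey levels (k = 8 bits per pixel)
b = storage_bits(1024, 8)
print(b // 8, "bytes")       # 1048576 bytes = 1 megabyte
print(3 * b // 8, "bytes")   # a three-channel colour image needs three times as much
```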

Image Interpolation
Interpolation is a basic tool used extensively in tasks such as zooming, shrinking, rotating, and geometric corrections. Fundamentally, interpolation is a process of using known data to estimate values at unknown locations.
(Figures: original image; resampling; shrinking; zooming.)

Image Interpolation

Image Interpolation
Many interpolation methods exist in the literature, such as:
- Pixel replication / nearest neighbor
- Bilinear interpolation
- Bicubic interpolation

Image Interpolation: Nearest Neighbor
An unknown pixel is assigned the value of its nearest known neighbor.
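
A minimal nearest-neighbor (pixel replication) resize sketch; the index-mapping convention is one common choice, not necessarily the one used in the lecture figures. Applied to the 4 x 4 matrix from the assignment slide at the end, it replicates each pixel into a 2 x 2 block.

```python
import numpy as np

def nearest_neighbor_resize(img, new_h, new_w):
    """Resize a 2-D greyscale image by nearest-neighbor (pixel replication) interpolation."""
    old_h, old_w = img.shape
    rows = (np.arange(new_h) * old_h) // new_h   # nearest source row for each output row
    cols = (np.arange(new_w) * old_w) // new_w   # nearest source column for each output column
    return img[rows[:, None], cols[None, :]]

# The 4 x 4 image from the assignment slide, zoomed to 8 x 8:
img = np.array([[3, 1, 2, 1],
                [2, 2, 0, 2],
                [1, 2, 1, 1],
                [1, 0, 1, 2]], dtype=np.uint8)
print(nearest_neighbor_resize(img, 8, 8))   # each pixel becomes a 2 x 2 block
```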

Image Interpolation: Bilinear Interpolation
An unknown pixel is estimated using the values of its four nearest known neighbors, via
f(x, y) = a*x + b*y + c*x*y + d,
where a, b, c, d are coefficients that need to be estimated. Writing this equation at the four known neighbors (x1, y1), (x2, y2), (x3, y3), (x4, y4) gives the system
a*x1 + b*y1 + c*x1*y1 + d = f(x1, y1)
a*x2 + b*y2 + c*x2*y2 + d = f(x2, y2)
a*x3 + b*y3 + c*x3*y3 + d = f(x3, y3)
a*x4 + b*y4 + c*x4*y4 + d = f(x4, y4)
Solving these four equations for a, b, c, d then allows f to be evaluated at the unknown location.
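
The sketch below implements bilinear interpolation in its equivalent weighted-average form rather than by explicitly solving the 4 x 4 system at every pixel; the coordinate-mapping convention (aligning the image corners) is an assumption, not something the slide fixes.

```python
import numpy as np

def bilinear_resize(img, new_h, new_w):
    """Resize a 2-D greyscale image by bilinear interpolation (4-neighbor weighted average)."""
    old_h, old_w = img.shape
    out = np.zeros((new_h, new_w), dtype=float)
    for i in range(new_h):
        for j in range(new_w):
            # Map the output pixel back to a (generally non-integer) input location,
            # aligning the corners of the input and output grids.
            y = i * (old_h - 1) / (new_h - 1) if new_h > 1 else 0.0
            x = j * (old_w - 1) / (new_w - 1) if new_w > 1 else 0.0
            y0, x0 = int(y), int(x)
            y1, x1 = min(y0 + 1, old_h - 1), min(x0 + 1, old_w - 1)
            dy, dx = y - y0, x - x0
            # Weighted average of the four surrounding known pixels.
            out[i, j] = (img[y0, x0] * (1 - dy) * (1 - dx) +
                         img[y0, x1] * (1 - dy) * dx +
                         img[y1, x0] * dy * (1 - dx) +
                         img[y1, x1] * dy * dx)
    return out

print(bilinear_resize(np.arange(16, dtype=float).reshape(4, 4), 8, 8).shape)   # (8, 8)
```

Solving the 4 x 4 linear system for a, b, c, d at each output location gives the same values; the weighted-average form is simply that solution written out.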

Image Interpolation: Bicubic Interpolation
An unknown pixel is estimated using the values of its sixteen nearest neighbors (a 4 x 4 neighborhood).
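
Bicubic interpolation fits a surface through the 4 x 4 neighborhood (16 coefficients). Rather than writing out that system, a common shortcut, assuming SciPy is available, is to call scipy.ndimage.zoom with order=3, which performs cubic spline interpolation (closely related to, though not identical with, classical bicubic convolution).

```python
import numpy as np
from scipy import ndimage   # assumes SciPy is installed

img = np.arange(16, dtype=float).reshape(4, 4)
zoomed = ndimage.zoom(img, 2, order=3)   # order=3 -> cubic (spline) interpolation
print(zoomed.shape)                      # (8, 8)
```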

Image Interpolation

Basic relationships between pixels Neighbours of a pixel – 4-neighbors A pixel p at coordinates (x, y) has four horizontal and vertical neighbors whose coordinates are given by (x+1,y), (x-1,y), (x,y+1), (x,y-1) This set of pixels, called the 4-neighbors of p, is denoted by N4(p). Each pixel is a unit distance from (x, y), and some of the neighbors of p lie outside the digital image if (x, y) is on the border of the image.

Neighbours of a pixel – 8-neighbors The four diagonal neighbors of p have coordinates (x+1,y+1),(x+1,y-1),(x-1,y+1),(x-1,y-1) and are denoted by ND(p). These points, together with the 4-neighbors, are called the 8-neighbors of p, denoted by N8(p). As before, some of the points in ND(p) and N8(p) fall outside the image if (x, y) is on the border of the image.
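
A small helper, purely illustrative, that enumerates N4(p), ND(p), and N8(p) and optionally drops neighbors that fall outside the image boundary; the function names and the width/height parameters are assumptions for this sketch.

```python
def n4(x, y):
    """4-neighbors of pixel (x, y): horizontal and vertical neighbors."""
    return [(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)]

def nd(x, y):
    """Diagonal neighbors of pixel (x, y)."""
    return [(x + 1, y + 1), (x + 1, y - 1), (x - 1, y + 1), (x - 1, y - 1)]

def n8(x, y, width=None, height=None):
    """8-neighbors of (x, y); if the image size is given, drop neighbors outside the image."""
    pts = n4(x, y) + nd(x, y)
    if width is not None and height is not None:
        pts = [(u, v) for (u, v) in pts if 0 <= u < width and 0 <= v < height]
    return pts

print(n8(0, 0, width=640, height=400))   # a corner pixel keeps only 3 of its 8 neighbors
```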

Some Definitions
Two pixels are said to be connected if they are neighbors and their grey levels satisfy a specified criterion of similarity (for example, if their grey levels are equal). Let V be the set of intensity values used to define adjacency.
4-adjacency: two pixels p and q with values from V are 4-adjacent if q is in the set N4(p).
8-adjacency: two pixels p and q with values from V are 8-adjacent if q is in the set N8(p).
m-adjacency (mixed adjacency): two pixels p and q with values from V are m-adjacent if (i) q is in N4(p), or (ii) q is in ND(p) and the set N4(p) ∩ N4(q) has no pixels whose values are from V.
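
A hedged sketch of the three adjacency tests on an image stored as a NumPy array indexed [row, column]; the default similarity set V = (1,), the coordinate convention p = (x, y), and the 3 x 3 test image are assumptions chosen for illustration.

```python
import numpy as np

def _n4(p):
    x, y = p
    return {(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)}

def _nd(p):
    x, y = p
    return {(x + 1, y + 1), (x + 1, y - 1), (x - 1, y + 1), (x - 1, y - 1)}

def _in_V(img, p, V):
    """True if p lies inside the image and its value belongs to the similarity set V."""
    x, y = p
    h, w = img.shape
    return 0 <= y < h and 0 <= x < w and img[y, x] in V

def adjacent_4(img, p, q, V=(1,)):
    return _in_V(img, p, V) and _in_V(img, q, V) and q in _n4(p)

def adjacent_8(img, p, q, V=(1,)):
    return _in_V(img, p, V) and _in_V(img, q, V) and q in (_n4(p) | _nd(p))

def adjacent_m(img, p, q, V=(1,)):
    """m-adjacency: q in N4(p), or q in ND(p) with no common 4-neighbor whose value is in V."""
    if not (_in_V(img, p, V) and _in_V(img, q, V)):
        return False
    if q in _n4(p):
        return True
    common = _n4(p) & _n4(q)
    return q in _nd(p) and not any(_in_V(img, r, V) for r in common)

img = np.array([[0, 1, 1],
                [0, 1, 0],
                [0, 0, 1]])
p, q = (1, 1), (2, 2)          # centre pixel and bottom-right pixel, as (x, y)
print(adjacent_8(img, p, q))   # True: they are diagonal neighbors with values in V
print(adjacent_m(img, p, q))   # True: no shared 4-neighbor has a value in V
```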

Basic relationships between pixels
Example arrangement of pixels (3 x 3 neighborhood, centre pixel p):
0 1 1
0 1 0
0 0 1
4-neighbors N4(p): the pixels directly above, below, left, and right of the centre.
Diagonal neighbors ND(p): the four corner pixels.
8-neighbors: N8(p) = N4(p) ∪ ND(p), i.e. all eight surrounding pixels.

Basic relationships between pixels
Mixed connectivity: note that mixed (m-) connectivity eliminates the multiple-path connections that often occur with 8-connectivity.
(Figures: pixel arrangement; pixels 8-adjacent to the centre pixel; m-adjacency.)

Basic relationships between pixels
Path: let pixel p have coordinates (x, y) and pixel q have coordinates (s, t). A path from p to q is a sequence of distinct pixels with coordinates (x0, y0), (x1, y1), ..., (xn, yn), where (x0, y0) = (x, y), (xn, yn) = (s, t), and (xi, yi) is adjacent to (xi-1, yi-1) for 1 ≤ i ≤ n.
Region: a set of pixels in an image in which all the component pixels are connected.
Boundary of a region: the set of pixels of a region R that have one or more neighbors that are not in R.
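
A minimal breadth-first search that tests whether a 4-path of pixels with values in V exists between p and q; the function name, the choice of 4-adjacency, and the 3 x 3 test image (the arrangement shown a few slides earlier) are illustrative assumptions.

```python
from collections import deque
import numpy as np

def path_exists_4(img, p, q, V=(1,)):
    """True if a 4-path exists from p to q through pixels whose values are in V."""
    h, w = img.shape
    def ok(x, y):
        return 0 <= x < w and 0 <= y < h and img[y, x] in V
    if not (ok(*p) and ok(*q)):
        return False
    seen, frontier = {p}, deque([p])
    while frontier:
        x, y = frontier.popleft()
        if (x, y) == q:
            return True
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if nxt not in seen and ok(*nxt):
                seen.add(nxt)
                frontier.append(nxt)
    return False

img = np.array([[0, 1, 1],
                [0, 1, 0],
                [0, 0, 1]])
print(path_exists_4(img, (1, 0), (1, 1)))   # True: connected by a vertical step
print(path_exists_4(img, (1, 0), (2, 2)))   # False: (2, 2) is only diagonally connected
```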

Distance Measures
Given pixels p, q, and z with coordinates (x, y), (s, t), and (u, v) respectively:
Euclidean distance between p and q: De(p, q) = [(x - s)^2 + (y - t)^2]^(1/2)
City-block distance between p and q: D4(p, q) = |x - s| + |y - t|
Chessboard distance between p and q: D8(p, q) = max(|x - s|, |y - t|)
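
The three distance measures transcribed directly as Python functions (De, D4, D8 follow the usual textbook notation; the sample points are arbitrary).

```python
import math

def d_euclidean(p, q):
    (x, y), (s, t) = p, q
    return math.sqrt((x - s) ** 2 + (y - t) ** 2)

def d_cityblock(p, q):       # D4 distance
    (x, y), (s, t) = p, q
    return abs(x - s) + abs(y - t)

def d_chessboard(p, q):      # D8 distance
    (x, y), (s, t) = p, q
    return max(abs(x - s), abs(y - t))

p, q = (0, 0), (3, 4)
print(d_euclidean(p, q), d_cityblock(p, q), d_chessboard(p, q))   # 5.0 7 4
```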

Image Operations on a Pixel Basis
When we refer to an operation such as "dividing one image by another," we mean specifically that the division is carried out between corresponding pixels in the two images. Other arithmetic and logic operations are defined similarly, between corresponding pixels of the images involved.
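
With images held as NumPy arrays (an assumption about the implementation, not something the slide prescribes), the standard arithmetic operators already act pixel-wise on corresponding elements; the toy arrays below are invented for illustration.

```python
import numpy as np

a = np.array([[10.0, 20.0], [30.0, 40.0]])
b = np.array([[ 2.0,  4.0], [ 5.0,  8.0]])

print(a + b)   # pixel-wise sum
print(a * b)   # pixel-wise product
print(a / b)   # "dividing one image by another": each pixel of a divided by the
               # corresponding pixel of b
```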

Linear and Nonlinear Operations
Let H be an operator whose input and output are images. H is said to be a linear operator if, for any two images f and g and any two scalars a and b, H(af + bg) = aH(f) + bH(g). An operator that fails this test is, by definition, nonlinear.
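
A quick numerical check of this definition, using the sum of all pixel values as an example of a linear operator and the maximum pixel value as an example of a nonlinear one; the test images and scalars are arbitrary choices for this sketch.

```python
import numpy as np

def check_linearity(H, f, g, a=2.0, b=-3.0):
    """Numerically compare H(a*f + b*g) with a*H(f) + b*H(g) for one pair of scalars."""
    return np.isclose(H(a * f + b * g), a * H(f) + b * H(g))

f = np.array([[0.0, 2.0], [2.0, 3.0]])
g = np.array([[6.0, 5.0], [4.0, 7.0]])

print(check_linearity(np.sum, f, g))   # True:  summing all pixel values is linear
print(check_linearity(np.max, f, g))   # False: taking the maximum pixel value is not
```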

Reading Assignment Chapter 2 (2.3-2.6) of “Digital Image Processing” by Gonzalez.

Assignment
Interpolate the following 4 x 4 image to size 8 x 8 using:
- Nearest neighbor interpolation
- Bilinear interpolation

3 1 2 1
2 2 0 2
1 2 1 1
1 0 1 2

Also solve Prob. 2.11 and 2.15 of the textbook.
Due date: 17/09/2012