Digital Image Processing CSC331


Digital Image Processing CSC331 Image Enhancement

Summary of previous lecture: sharpening, 1st and 2nd order derivatives, the Laplacian filter, unsharp masking and high-boost filtering.

Today's lecture: first order derivatives using the gradient operator, the Sobel operator using first order derivatives, what edges in an image are, modeling intensity changes, and the steps of edge detection.

Sharpening The term sharpening refers to techniques for enhancing intensity transitions. In images, the borders between objects are perceived because of intensity changes: the crisper the intensity transitions, the sharper the image appears. The intensity transitions between adjacent pixels are related to the derivatives of the image. Hence, operators (possibly expressed as linear filters) that can compute the derivatives of a digital image are of great interest.

Sharpening spatial filter Averaging over an image blurs it, i.e. removes fine detail; this averaging operation is equivalent to integration. The opposite operation, differentiation, makes the image sharper. We therefore need derivative operators: the first derivative and the second derivative.

Laplacian operator Sharpening filters usually make use of second-order derivative operators.
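The Laplacian itself was shown as an image on the slide; for reference, the standard definition and its common four-neighbour discrete approximation (following Gonzalez & Woods) are:

```latex
\nabla^2 f = \frac{\partial^2 f}{\partial x^2} + \frac{\partial^2 f}{\partial y^2}
\approx f(x+1,y) + f(x-1,y) + f(x,y+1) + f(x,y-1) - 4\,f(x,y)
```

The sharpened image is then $g(x,y) = f(x,y) + c\,\nabla^2 f(x,y)$, with $c = -1$ when the kernel's centre coefficient is negative, as above.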

Laplacian filter

(a) and (c): isotropic results for rotation increments of 90°; (b) and (d): isotropic results for increments of 45°.

Unsharp masking and high-boost filtering The technique known as unsharp masking is commonly used in graphics to make images sharper. It consists of: 1. defocusing the original image; 2. obtaining the mask as the difference between the original image and its defocused copy; 3. adding the mask to the original image.
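A minimal sketch of these three steps in Python, assuming NumPy and SciPy are available; the function name, the Gaussian blur, and the parameter values are illustrative choices, not from the slides. Setting k = 1 gives plain unsharp masking, while k > 1 gives high-boost filtering:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def unsharp_mask(image, sigma=2.0, k=1.0):
    """Sharpen `image` by adding k times the unsharp mask.

    k = 1.0 -> classic unsharp masking; k > 1.0 -> high-boost filtering.
    """
    f = image.astype(np.float64)
    blurred = gaussian_filter(f, sigma=sigma)   # 1. defocus the original image
    mask = f - blurred                          # 2. mask = original - defocused copy
    g = f + k * mask                            # 3. add the (weighted) mask back
    return np.clip(g, 0, 255).astype(np.uint8)
```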

High-boost filtering mask: the unsharp mask is weighted by a factor k ≥ 1 before being added to the original; k = 1 reduces to standard unsharp masking.

First order derivatives using the gradient operator
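The equations on this slide were shown as an image; the standard definitions (following Gonzalez & Woods) of the gradient, its magnitude and its direction are:

```latex
\nabla f = \begin{bmatrix} G_x \\ G_y \end{bmatrix}
         = \begin{bmatrix} \partial f/\partial x \\ \partial f/\partial y \end{bmatrix},
\qquad
|\nabla f| = \sqrt{G_x^2 + G_y^2} \approx |G_x| + |G_y|,
\qquad
\theta = \tan^{-1}\!\left(\frac{G_y}{G_x}\right)
```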

Properties of the gradient The magnitude of the gradient provides information about the strength of the edge. The direction of the gradient is always perpendicular to the direction of the edge (the edge direction is rotated with respect to the gradient direction by -90 degrees).

Sobel operator using first order derivatives

Sobel operator using first order derivatives (continued)
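A minimal sketch of Sobel-based gradient computation, assuming NumPy and SciPy; the kernel layout is the standard one, but the function and variable names are illustrative:

```python
import numpy as np
from scipy.ndimage import convolve

# Standard 3x3 Sobel kernels (horizontal and vertical first derivatives).
SOBEL_X = np.array([[-1, 0, 1],
                    [-2, 0, 2],
                    [-1, 0, 1]], dtype=np.float64)
SOBEL_Y = SOBEL_X.T

def sobel_gradient(image):
    """Return the gradient magnitude and direction (radians) of `image`."""
    f = image.astype(np.float64)
    gx = convolve(f, SOBEL_X)
    gy = convolve(f, SOBEL_Y)
    magnitude = np.hypot(gx, gy)     # sqrt(gx^2 + gy^2)
    direction = np.arctan2(gy, gx)   # perpendicular to the edge direction
    return magnitude, direction
```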

The combination of different spatial enhancement methods leads to "better quality" images. For instance: use the Laplacian to highlight fine detail and the gradient to enhance prominent edges; a smoothed version of the gradient image can be used to mask the Laplacian image; finally, increase the dynamic range of the gray levels by using a gray-level transformation.
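A hedged sketch of one possible combination along these lines, loosely modeled on the worked example in Gonzalez & Woods; NumPy and SciPy are assumed, and the 5x5 smoothing window and gamma value are illustrative choices, not values from the slides:

```python
import numpy as np
from scipy.ndimage import convolve, uniform_filter

LAPLACIAN = np.array([[0,  1, 0],
                      [1, -4, 1],
                      [0,  1, 0]], dtype=np.float64)
SOBEL_X = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=np.float64)
SOBEL_Y = SOBEL_X.T

def combined_enhancement(image, gamma=0.5):
    f = image.astype(np.float64)
    lap = convolve(f, LAPLACIAN)                 # fine detail (Laplacian)
    gx = convolve(f, SOBEL_X)
    gy = convolve(f, SOBEL_Y)
    grad = np.abs(gx) + np.abs(gy)               # prominent edges (gradient)
    grad_smooth = uniform_filter(grad, size=5)   # smoothed gradient as a mask
    mask = grad_smooth / (grad_smooth.max() + 1e-12)
    sharpened = np.clip(f - lap * mask, 0, None) # add masked Laplacian detail
    # power-law (gamma) transformation to expand the dynamic range
    out = 255.0 * (sharpened / (sharpened.max() + 1e-12)) ** gamma
    return out.astype(np.uint8)
```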

What are edges in an image? Edges are significant local changes of intensity in an image. Edges typically occur on the boundary between two different regions in an image. Intuitively, an edge corresponds to a singularity in the image (i.e. a place where the pixel value changes abruptly). Edge detection finds large intensity transitions between pixels, for example in the following image patch (zeros on the left, larger values toward the right, suggesting a vertical transition):
0  0  0 33
0  0 45 78
0 45 23 33
0  0 42 76
0  0  0 38

Goal of edge detection Produce a line drawing of a scene from an image of that scene. Important features can be extracted from the edges of an image (e.g., corners, lines, curves). These features are used by higher-level computer vision algorithms (e.g., recognition).

Where is the edge? Here the edge is easy to find.

Where is the edge? Is the edge a single pixel wide, or does it span multiple pixels?

What causes intensity changes? Various physical events cause intensity changes. Geometric events: object boundaries (discontinuity in depth and/or surface color and texture); surface boundaries (discontinuity in surface orientation and/or surface color and texture). Non-geometric events: specularities (direct reflection of light, as from a mirror); shadows (from other objects or from the same object); inter-reflections.

Edge descriptors Edge normal: unit vector in the direction of maximum intensity change. Edge direction: unit vector perpendicular to the edge normal. Edge position or center: the image position at which the edge is located. Edge strength: related to the local image contrast along the normal.

Modeling intensity changes Edges can be modeled according to their intensity profiles. Step edge: the image intensity abruptly changes from one value on one side of the discontinuity to a different value on the opposite side. Ramp edge: a step edge where the intensity change is not instantaneous but occurs over a finite distance.

Modeling intensity changes Ridge edge: the image intensity abruptly changes value but then returns to the starting value within some short distance (usually generated by lines).

Modeling intensity changes Roof edge: a ridge edge where the intensity change is not instantaneous but occurs over a finite distance (usually generated by the intersection of surfaces).

The four steps of edge detection Smoothing: suppress as much noise as possible, without destroying the true edges. Enhancement: apply a filter to enhance the quality of the edges in the image (sharpening). Detection: determine which edge pixels should be discarded as noise and which should be retained (usually, thresholding provides the criterion used for detection). Localization: determine the exact location of an edge (sub-pixel resolution might be required for some applications, that is, estimate the location of an edge to better than the spacing between pixels). Edge thinning and linking are usually required in this step.
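A minimal sketch of these four steps, assuming NumPy and SciPy; the sigma and threshold values are illustrative, and the localization step here is a crude local-maximum test rather than true sub-pixel localization or proper edge thinning and linking:

```python
import numpy as np
from scipy.ndimage import gaussian_filter, sobel, maximum_filter

def simple_edge_detector(image, sigma=1.5, threshold=50.0):
    f = image.astype(np.float64)
    smoothed = gaussian_filter(f, sigma)    # 1. smoothing: suppress noise
    gx = sobel(smoothed, axis=1)            # 2. enhancement: gradient (sharpening)
    gy = sobel(smoothed, axis=0)
    magnitude = np.hypot(gx, gy)
    edges = magnitude > threshold           # 3. detection: threshold the gradient
    # 4. localization (crude): keep only pixels that are local maxima of the magnitude
    local_max = magnitude == maximum_filter(magnitude, size=3)
    return edges & local_max
```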

Edge detection using derivatives Calculus describes changes of continuous functions using derivatives. An image is a 2D function, so operators describing edges are expressed using partial derivatives. Points that lie on an edge can be detected by: (1) detecting local maxima or minima of the first derivative, or (2) detecting the zero-crossings of the second derivative.
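A tiny 1-D illustration of both criteria on a ramp edge (the pixel values are made up for illustration):

```python
import numpy as np

profile = np.array([10, 10, 10, 40, 70, 70, 70], dtype=float)  # ramp edge
d1 = np.diff(profile)        # [ 0,  0, 30, 30,  0,  0] -> first derivative peaks at the edge
d2 = np.diff(profile, n=2)   # [ 0, 30,  0, -30, 0]     -> second derivative crosses zero at the edge
```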

Gradient operators Motivation: a change in pixel value produces a large gradient, so edges can be detected by thresholding the gradient. Block diagram: image I(m,n) → gradient operator → g(m,n) → thresholding → edge map x(m,n).

Common operators Gradient operator examples: 1. Roberts operator.

Common operators (cont'd) 2. Prewitt operator; 3. Sobel operator (vertical and horizontal masks).
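The masks themselves appeared as images on the slides; for reference, the standard forms are shown below. Roberts uses a pair of 2x2 diagonal differences, while Prewitt and Sobel approximate the horizontal and vertical derivatives with 3x3 masks:

```latex
\text{Roberts:}\;
\begin{bmatrix} 1 & 0 \\ 0 & -1 \end{bmatrix},\;
\begin{bmatrix} 0 & 1 \\ -1 & 0 \end{bmatrix}
\qquad
\text{Prewitt:}\;
\begin{bmatrix} -1 & 0 & 1 \\ -1 & 0 & 1 \\ -1 & 0 & 1 \end{bmatrix},\;
\begin{bmatrix} -1 & -1 & -1 \\ 0 & 0 & 0 \\ 1 & 1 & 1 \end{bmatrix}
\qquad
\text{Sobel:}\;
\begin{bmatrix} -1 & 0 & 1 \\ -2 & 0 & 2 \\ -1 & 0 & 1 \end{bmatrix},\;
\begin{bmatrix} -1 & -2 & -1 \\ 0 & 0 & 0 \\ 1 & 2 & 1 \end{bmatrix}
```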

We already know the Sobel operator, and now we know one of its applications.

Summary of the lecture: first order derivatives using the gradient operator, the Sobel operator using first order derivatives, what edges in an image are, modeling intensity changes, and the steps of edge detection.

References
Prof. P. K. Biswas, Department of Electronics and Electrical Communication Engineering, Indian Institute of Technology, Kharagpur.
Gonzalez, R. C. & Woods, R. E. (2008). Digital Image Processing. Prentice Hall.
Forsyth, D. A. & Ponce, J. (2011). Computer Vision: A Modern Approach. Pearson Education.