Sliding Window Filters Longin Jan Latecki October 9, 2002

Linear Image Filters. A linear operation computes the value of the output image pixel f(i,j) as a linear combination of the brightness values in a local neighborhood of the pixel h(i,j) in the input image: f(i,j) = sum_{m=-a..a} sum_{n=-b..b} w(m,n) h(i-m, j-n). This operation is called a discrete convolution. The function w is called a convolution kernel or a filter mask; in our case it is a rectangle of size (2a+1) x (2b+1).
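A minimal Matlab sketch of this formula is given below; the nested loops mirror the convolution sum directly, and the built-in conv2(h, w, 'same') computes the same result much faster. The image file name and the 3 x 3 mask are placeholders, not values from the slides.

```matlab
% Minimal sketch: the discrete convolution sum written out explicitly.
h = double(imread('input.gif'));     % input image h(i,j); file name is a placeholder
a = 1;  b = 1;                       % mask half-sizes
w = ones(2*a+1, 2*b+1) / 9;          % placeholder (2a+1)x(2b+1) mask
[rows, cols] = size(h);
f = zeros(rows, cols);               % output image f(i,j)
for i = a+1 : rows-a
    for j = b+1 : cols-b
        for m = -a : a
            for n = -b : b
                f(i,j) = f(i,j) + w(m+a+1, n+b+1) * h(i-m, j-n);
            end
        end
    end
end
% Equivalent (and much faster): f = conv2(h, w, 'same');
```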

Exercise: Compute the 2-D linear convolution of the signal X with the mask w. Extend the signal X with 0's where needed.
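The signal X and the mask w themselves are not reproduced in this transcript, so the values below are placeholders; the sketch only shows how such a hand computation can be checked in Matlab, since conv2 with the 'full' option extends the signal with zeros in the same way.

```matlab
% Placeholder values -- the actual X and w are given on the original slide.
X = [1 2 3; 4 5 6; 7 8 9];
w = [1 0; 0 -1];
Y = conv2(X, w, 'full');   % 'full' zero-pads X, as the exercise requires
```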

Averaging of brightness values is a special case of discrete convolution. For a 3 x 3 neighborhood the convolution mask is w = (1/9) [1 1 1; 1 1 1; 1 1 1]. Applying this mask to an image results in smoothing; the Matlab example program is filterEx1.m. Local image smoothing can effectively eliminate impulsive noise or degradations appearing as thin stripes, but it does not work if the degradations are large blobs or thick stripes. Image smoothing = image blurring.
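A minimal Matlab sketch of this smoothing step (filterEx1.m itself is not reproduced here; the input file name is taken from the Homework 2 images):

```matlab
% 3x3 averaging (box) filter: image smoothing = image blurring.
g = double(imread('noise_1.gif'));   % noisy input image (file name from Homework 2)
w = ones(3,3) / 9;                   % averaging mask
s = conv2(g, w, 'same');             % smoothed (blurred) image
imagesc(s); colormap gray; axis image;
```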

The significance of the central pixel may be increased to better reflect the properties of Gaussian noise.
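The exact mask values from the slide are not in the transcript; a commonly used Gaussian-like 3 x 3 mask that stresses the central pixel is sketched below (g is the grayscale image from the previous sketch).

```matlab
% Assumed Gaussian-like mask with a stressed central pixel.
w16 = [1 2 1; 2 4 2; 1 2 1] / 16;
s16 = conv2(g, w16, 'same');         % compare with the plain 3x3 averaging result
```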

Edge detectors locate sharp changes in the intensity function; edges are pixels where brightness changes abruptly. Calculus describes changes of continuous functions using derivatives; since an image function depends on two variables, partial derivatives are used. A change of the image function can be described by the gradient, which points in the direction of the largest growth of the image function. An edge is a property attached to an individual pixel and is calculated from the behavior of the image function in a neighborhood of that pixel. It is a vector variable with two components: the magnitude of the gradient and the direction.

The gradient direction gives the direction of maximal growth of the function, e.g., from black (f(i,j) = 0) to white (f(i,j) = 255). In the slide's illustration, closed curves are lines of the same brightness, and the orientation 0° points East.

Edges are often used in image analysis for finding region boundaries. A boundary and its parts (edges) are perpendicular to the direction of the gradient.

The gradient magnitude |grad g(x,y)| = sqrt((∂g/∂x)^2 + (∂g/∂y)^2) and the gradient direction ψ = arg(∂g/∂x, ∂g/∂y) are continuous image functions, where arg(x,y) is the angle (in radians) from the x-axis to the point (x,y).
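A minimal Matlab sketch of these two quantities, using the built-in gradient function to approximate the partial derivatives of a grayscale image g:

```matlab
% Gradient magnitude and direction of a grayscale double image g.
[gx, gy] = gradient(g);              % approximations of dg/dx and dg/dy
mag = sqrt(gx.^2 + gy.^2);           % gradient magnitude
psi = atan2(gy, gx);                 % gradient direction in radians, measured from the x-axis
```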

Sometimes we are interested only in edge magnitudes, without regard to their orientations; in this case the Laplacian may be used. The Laplacian ∇²g = ∂²g/∂x² + ∂²g/∂y² has the same properties in all directions and is therefore invariant to rotation in the image. The Laplace operator is a very popular operator approximating the second derivative, which gives the gradient magnitude only.

The Laplacian is approximated in digital images by a convolution sum. A 3 x 3 mask for the 4-neighborhood is [0 1 0; 1 -4 1; 0 1 0], and for the 8-neighborhood it is [1 1 1; 1 -8 1; 1 1 1]. A Laplacian operator with stressed significance of the central pixel or of its neighborhood is sometimes used; in that approximation it loses its invariance to rotation.
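A Matlab sketch applying these masks by convolution (the coefficients above and below are the commonly used ones, since the slide images are not in the transcript):

```matlab
% Commonly used discrete Laplacian masks, applied to a grayscale image g.
h4 = [0 1 0; 1 -4 1; 0 1 0];         % 4-neighborhood
h8 = [1 1 1; 1 -8 1; 1 1 1];         % 8-neighborhood
lap = conv2(g, h4, 'same');          % second-derivative (Laplacian) response
```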

A digital image is discrete in nature, so derivatives must be approximated by differences. The first differences of the image g in the vertical direction and in the horizontal direction are Δi g(i,j) = g(i,j) - g(i-n, j) and Δj g(i,j) = g(i,j) - g(i, j-n), where n is a small integer, usually 1. The value n should be chosen small enough to provide a good approximation to the derivative, but large enough to neglect unimportant changes in the image function.
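A Matlab sketch of the first differences for n = 1, computed by shifting and subtracting (the border rows and columns are left at zero):

```matlab
% First differences of a grayscale image g with step n = 1.
n  = 1;
di = zeros(size(g));                 % difference along the row index i
dj = zeros(size(g));                 % difference along the column index j
di(n+1:end, :) = g(n+1:end, :) - g(1:end-n, :);
dj(:, n+1:end) = g(:, n+1:end) - g(:, 1:end-n);
```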

Gradient operators can be divided into three categories. I. Operators approximating derivatives of the image function using differences. Individual gradient operators that examine small local neighborhoods are in fact convolutions and can be expressed by convolution masks. Some of these operators are rotationally invariant (e.g., the Laplacian) and therefore need one convolution mask only. Others approximate the first derivatives and use several masks; the orientation is then estimated on the basis of the best matching of several simple patterns. Operators that are able to detect edge direction use a set of masks, each of which corresponds to a certain direction.

II. Operators based on the zero crossings of the second derivative of the image function (e.g., the Marr-Hildreth or Canny edge detector). III. Operators which attempt to match an image function to a parametric model of edges. Parametric models describe edges more precisely than simple edge magnitude and direction, but they are much more computationally intensive. Categories II and III will not be covered here.

Roberts operator. The operator uses the 2 x 2 convolution masks h1 = [1 0; 0 -1] and h2 = [0 1; -1 0], and the magnitude of the edge is computed as |g(i,j) - g(i+1,j+1)| + |g(i,j+1) - g(i+1,j)|. The primary disadvantage of the Roberts operator is its high sensitivity to noise, because very few pixels are used to approximate the gradient.
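A Matlab sketch of the Roberts operator using its two 2 x 2 masks:

```matlab
% Roberts cross operator applied to a grayscale image g.
h1 = [1 0; 0 -1];
h2 = [0 1; -1 0];
e1 = conv2(g, h1, 'same');
e2 = conv2(g, h2, 'same');
mag = abs(e1) + abs(e2);             % edge magnitude; sign flips from the convolution cancel under abs
```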

Prewitt operator. The Prewitt operator approximates the first derivative, similarly to the Sobel, Kirsch, Robinson and some other operators that follow. Operators approximating the first derivative of an image function are sometimes called compass operators because of their ability to determine the gradient direction. The gradient is estimated in eight possible directions (for a 3 x 3 convolution mask); larger masks are possible. The direction of the gradient is given by the mask giving the maximal response. This is valid for all of the following operators approximating the first derivative.

Sobel operator. It is often used as a simple detector of horizontality and verticality of edges, in which case only the masks h1 and h3 are used. If the h1 response is y and the h3 response is x, we might then derive the edge strength (magnitude) as sqrt(x^2 + y^2) (or |x| + |y|) and the direction as arctan(y / x). The Matlab example program is filterEx1.m.
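A Matlab sketch of this use of the Sobel operator; the coefficients are the standard ones (the slide images are not in the transcript), and replacing the weights 2 by 1 gives the corresponding Prewitt masks:

```matlab
% Sobel masks h1 (response y) and h3 (response x), standard coefficients.
h1 = [ 1  2  1;  0  0  0; -1 -2 -1];
h3 = [-1  0  1; -2  0  2; -1  0  1];
y   = conv2(g, h1, 'same');
x   = conv2(g, h3, 'same');
mag = sqrt(x.^2 + y.^2);             % edge strength
dir = atan2(y, x);                   % direction, arctan(y/x) with the proper quadrant
```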

Robinson operator; Kirsch operator. These are two further compass operators that approximate the first derivative and differ from the Prewitt and Sobel operators only in their mask coefficients.

Nonlinear Image Filters. The median is an order filter: it uses order statistics. Given an N x N window W(x,y) with the pixel (x,y) as the midpoint of W, the intensity values of the pixels in W are ordered from smallest to largest; the median filter selects the middle value of this ordering as the new value of (x,y).
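A Matlab sketch of a 3 x 3 median filter written out explicitly to show the order-statistics idea (the Image Processing Toolbox function medfilt2 does the same job):

```matlab
% Explicit 3x3 median filter; g is a grayscale image stored as a double matrix.
[rows, cols] = size(g);
f = g;                               % output image (border pixels copied unchanged)
for i = 2 : rows-1
    for j = 2 : cols-1
        W = g(i-1:i+1, j-1:j+1);     % 3x3 window centered at (i,j)
        v = sort(W(:));              % order the 9 intensities from smallest to largest
        f(i,j) = v(5);               % the middle (5th) value is the median
    end
end
```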

Morphological Filter
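The body of this slide is not in the transcript; as a minimal sketch (assuming the Image Processing Toolbox), the opening and closing filters used in Homework 2 can be computed as follows:

```matlab
% Grayscale opening and closing of image g with a 3x3 structuring element
% (requires the Image Processing Toolbox).
se     = strel('square', 3);
opened = imdilate(imerode(g, se), se);   % opening: erosion followed by dilation
closed = imerode(imdilate(g, se), se);   % closing: dilation followed by erosion
% Equivalently: imopen(g, se) and imclose(g, se).
```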

For comparison, see Order Filters. Homework 2: Implement in Matlab a linear filter for image smoothing (blurring) and the nonlinear filters median, opening, and closing. Apply them to noise_1.gif and noise_2.gif, and to one example image of your choice. Compare the results.