Digital Image Processing in Life Sciences. March 14th, 2012. Lecture 1: Digital Image Fundamentals.


Lecture's outline:
- What Is Digital Image Processing? (The Origins of Digital Image Processing)
- Fundamental Steps in Digital Image Processing
- Image Sampling and Quantization
- Spatial and Gray-Level Resolution
- Some Basic Relationships Between Pixels
- Zooming and Shrinking Digital Images
- Lookup tables
- Color spaces

Terms to be conveyed: pixel, gray level, bit depth, dynamic range, connectivity types/neighborhood, interpolation types, look-up tables.

Book: Digital Image Processing, Rafael C. Gonzalez and Richard E. Woods. Web resources: (very thorough and informative); (beautiful examples, excellent tutorials).

Next topics:
2. Image enhancement in the spatial domain
3. Segmentation
4. Image enhancement in the frequency domain
5. Multi-dimensional image processing
6-7. Guest lectures (TBD)

What Is A Digital Image? An image is "a two-dimensional function, f(x, y), where x and y are spatial coordinates, and the amplitude of f at any pair of coordinates (x, y) is called the intensity (gray level) of the image at that point. When x, y, and the amplitude values of f are all finite, discrete quantities, we call the image a digital image." (Gonzalez and Woods)

These sets of numbers can be depicted in terms of frequencies

We can define three types of computerized processes: low-, mid-, and high-level.
- Low-level: image preprocessing, e.g. noise reduction and contrast enhancement.
- Mid-level: segmentation, sorting, and classification.
- High-level: assembly of all components into a meaningful, coherent form.

Digital image processing: points to consider.
- Why process?
- Are both the input and the output of a process images?
- Where does image processing stop and image analysis start?
- Are the processing results intended for human perception or for machine perception? (Compare character recognition and fingerprint comparison with intelligence photos.)

Digital image origins: the digital image dates back to the 1920s and the Bartlane cable picture transmission system between New York and London. An image took about three hours to transmit, instead of more than a week. The system started with 5 tone levels and was later increased to 15 levels. (Taken from Gonzalez and Woods.)

 What Is Digital Image Processing? (The Origins of Digital Image Processing)  Fundamental Steps in Digital Image Processing  Image Sampling and Quantization  Spatial and Gray-Level Resolution  Some Basic Relationships Between Pixels  Zooming and Shrinking Digital Images  Lookup tables  Color spaces Lecture’s outline

Essential steps when processing digital images:
- Acquisition
- Enhancement
- Restoration
- Color image processing
- Wavelets
- Morphological processing
- Segmentation
- Representation
- Recognition
The outputs of the earlier steps are digital images; the outputs of the later steps are attributes of the image.

Image acquisition: acquire or receive an image for further processing. This step has a major impact on the entire procedure of processing and analysis.
Image enhancement: improving quality subjectively (e.g. by changing the contrast).
Image restoration: improving quality objectively (e.g. by removing the blur introduced by the point spread function, PSF).

(Figure source: microscopy.fsu.edu)

Morphological processing: extracting components for the purpose of representing shapes.
Segmentation: deconstructing the image into its constituent objects; a crucial step for successful recognition of the image contents.
Representation: feature selection and classification/grouping of objects.

 What Is Digital Image Processing? (The Origins of Digital Image Processing)  Fundamental Steps in Digital Image Processing  Image Sampling and Quantization  Spatial and Gray-Level Resolution  Some Basic Relationships Between Pixels  Zooming and Shrinking Digital Images  Lookup tables  Color spaces Lecture’s outline

Sampling and quantization. Keep in mind: the sensor used to create the image has a continuous output, but the transition from a continuum to a digital image requires two processes: sampling and quantization. Sampling is the process of digitizing the spatial coordinates; quantization is the process of digitizing the amplitude values at those coordinates. The arrangement of the sensor used to create the image determines the sampling method and its output, and different limits determine the performance of optical versus mechanical sensors.
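
The two processes can be sketched in a few lines of Python. This is a minimal illustration (not from the lecture): a hypothetical continuous intensity function `f` stands in for the sensor output, sampling places it on a discrete grid, and quantization maps each amplitude to one of L = 2^k integer levels.

```python
def sample_and_quantize(f, width, height, k):
    """Sample a continuous intensity function f(x, y), with x, y in [0, 1)
    and f in [0, 1], on a width x height grid, then quantize each sampled
    amplitude to L = 2**k integer gray levels in [0, L-1]."""
    L = 2 ** k
    image = []
    for y in range(height):
        row = []
        for x in range(width):
            value = f(x / width, y / height)       # sampling: discrete grid
            level = min(int(value * L), L - 1)     # quantization: discrete amplitude
            row.append(level)
        image.append(row)
    return image

# A smooth hypothetical "scene": a horizontal intensity ramp.
ramp = sample_and_quantize(lambda x, y: x, 8, 4, 8)
print(ramp[0])  # [0, 32, 64, 96, 128, 160, 192, 224]
```

Note how one process discretizes the coordinates and the other discretizes the amplitudes, exactly the two steps named above.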

Sampling and quantization result in arrays of discrete quantities. By convention, the coordinate (x, y) = (0, 0) is located at the upper-left corner of the image. Picture elements = image elements = pels = pixels (Gonzalez and Woods). Sampling results in typical image sizes that vary from 128 x 128 to 4096 x 4096, or any combination of dimensions in between.

An Image Formation Model. Let l be the gray level (gl) value at (x0, y0): l = f(x0, y0). l is bounded, Lmin <= l <= Lmax, and the interval [Lmin, Lmax] is the gray scale. This interval is usually shifted to [0, L-1], where 0 represents black and L-1 represents white.

Quantization results in discrete gray-level values, typically an integer power of 2: L = 2^k. If k = 8, the result is 256 gray levels, from 0 to 255. Dynamic range: the portion of the entire gray scale that the image's gray levels actually occupy. Think about high vs. low dynamic range images: how does the dynamic range affect the contrast of the image? Next lecture…

Gray level (bit-depth) resolution. How many bits are required to save a digital image? b = M x N x k (or b = N^2 * k for square N x N images). For example, k = 8 gives 256 levels, k = 12 gives 4096, and k = 16 gives 65,536.
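
The formula b = M x N x k can be checked directly; a small illustrative sketch:

```python
def storage_kilobytes(M, N, k):
    """Storage for an M x N image at k bits per pixel:
    b = M * N * k bits, converted to kilobytes (1 kB = 1024 bytes)."""
    bits = M * N * k
    return bits / 8 / 1024

# A 1024 x 1024 image at 8 bits/pixel needs exactly 1024 kB (1 MB):
print(storage_kilobytes(1024, 1024, 8))  # 1024.0
# Doubling k to 16 doubles the storage:
print(storage_kilobytes(1024, 1024, 16))  # 2048.0
```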

8-bit images: values are unsigned integers. 16-bit images: values are integers; some software packages allow signed values. 32-bit images: floating-point, signed.

 What Is Digital Image Processing? (The Origins of Digital Image Processing)  Fundamental Steps in Digital Image Processing  Image Sampling and Quantization  Spatial and Gray-Level Resolution  Some Basic Relationships Between Pixels  Zooming and Shrinking Digital Images  Lookup tables  Color spaces Lecture’s outline

Spatial and gray-level resolution. Spatial resolution is rather intuitive, and is determined by the quality and "density" of the sampling. Sampling theorems (e.g. Nyquist-Shannon) state that sampling should be performed at a rate at least twice the highest frequency present (equivalently, at least two samples across the smallest object of interest). Based on this, over-sampling and under-sampling (= spatial aliasing) can occur. Gray-level resolution describes the binning of the signal, rather than the actual difference obtained when the signal was quantized. 8-bit and 16-bit images are the most common, but 10- and 12-bit images can also be found.
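
Under-sampling can be illustrated in one dimension (a sketch, not from the lecture): a 5 Hz cosine sampled at only 8 Hz, below its 10 Hz Nyquist rate, produces exactly the same samples as a 3 Hz cosine; the two signals become indistinguishable, which is aliasing.

```python
import math

fs = 8.0  # sampling rate in Hz: below 2 * 5 Hz, so 5 Hz is under-sampled
samples_5hz = [math.cos(2 * math.pi * 5 * n / fs) for n in range(16)]
samples_3hz = [math.cos(2 * math.pi * 3 * n / fs) for n in range(16)]

# The under-sampled 5 Hz signal aliases onto 3 Hz: identical samples.
same = all(abs(a - b) < 1e-9 for a, b in zip(samples_5hz, samples_3hz))
print(same)  # True
```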

(Figure: the same image sampled at decreasing resolutions, e.g. from 128 x 128 down to 64 x 64 and below, without changing the bit depth; at low resolutions, checkerboard patterns appear.)

(Figure: the same image at bit depths of 1, 2, 3, 4, and 8 bits, without changing the resolution; at low bit depths, false contouring appears.)

 What Is Digital Image Processing? (The Origins of Digital Image Processing)  Fundamental Steps in Digital Image Processing  Image Sampling and Quantization  Spatial and Gray-Level Resolution  Some Basic Relationships Between Pixels  Zooming and Shrinking Digital Images  Lookup tables  Color spaces Lecture’s outline

Neighbors of a pixel p at (x, y):
- (x+1, y), (x-1, y), (x, y+1), (x, y-1): the 4-neighbors of p, or N4(p).
- (x+1, y+1), (x+1, y-1), (x-1, y+1), (x-1, y-1): the four diagonal neighbors, or Nd(p).
- N4(p) together with Nd(p) are N8(p).
Consider the special case of pixels at the image borders.
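
The neighborhoods above, including the border case, can be sketched in Python (function names n4/nd/n8 are my own):

```python
def n4(x, y, width, height):
    """4-neighbors of (x, y), dropping coordinates outside the image."""
    candidates = [(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)]
    return [(i, j) for i, j in candidates if 0 <= i < width and 0 <= j < height]

def nd(x, y, width, height):
    """The four diagonal neighbors of (x, y), with the same border handling."""
    candidates = [(x + 1, y + 1), (x + 1, y - 1), (x - 1, y + 1), (x - 1, y - 1)]
    return [(i, j) for i, j in candidates if 0 <= i < width and 0 <= j < height]

def n8(x, y, width, height):
    """N8(p) = N4(p) together with Nd(p)."""
    return n4(x, y, width, height) + nd(x, y, width, height)

# Border case: a corner pixel of a 5 x 5 image keeps only 3 of 8 neighbors.
print(len(n8(0, 0, 5, 5)))  # 3
print(len(n8(2, 2, 5, 5)))  # 8: interior pixels keep all neighbors
```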

Adjacency/Connectivity, Regions, and Boundaries. Pixels are said to be connected if they are neighbors and if their gray levels satisfy a specified criterion of similarity. Let V be the set of gray-level values used to define adjacency; in a binary example, V = {0} defines adjacency between pixels with the value 0. In non-binary images, the values in V can have a wider range.

A region R of an image: a subset of pixels which is a connected set, meaning that a path of adjacent pixels connects any two pixels in R. The boundary (= border = contour) of R is the set of pixels in R that have one or more neighbors that are not in R. What happens when R is the entire image? Do not confuse boundary with edge: an edge is formed by a discontinuity of gray levels at a certain point. In binary images, edges and boundaries correspond.
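
A connected region can be extracted with a simple flood fill over 4-neighbors. The sketch below assumes a binary image with V = {1}; the function name and details are illustrative, not from the lecture:

```python
from collections import deque

def region_of(image, start, V=frozenset({1})):
    """Return the 4-connected region (a set of (x, y) pixels) containing
    `start`, where adjacency requires pixel values in V."""
    height, width = len(image), len(image[0])
    x0, y0 = start
    if image[y0][x0] not in V:
        return set()
    region, queue = {start}, deque([start])
    while queue:
        x, y = queue.popleft()
        for i, j in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if (0 <= i < width and 0 <= j < height
                    and image[j][i] in V and (i, j) not in region):
                region.add((i, j))
                queue.append((i, j))
    return region

img = [[1, 1, 0],
       [0, 1, 0],
       [0, 0, 1]]
# The lone 1 at (2, 2) touches the top-left region only diagonally,
# so under 4-connectivity it is a separate region.
print(sorted(region_of(img, (0, 0))))  # [(0, 0), (1, 0), (1, 1)]
```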

Distances between pixels p = (x, y) and q = (s, t):
- Euclidean distance (Pythagoras): De(p, q) = sqrt((x - s)^2 + (y - t)^2).
- D4 distance (= city-block distance): D4(p, q) = |x - s| + |y - t|. The pixels within a given D4 distance of a center pixel form a diamond pattern.

- D8 distance (= chessboard distance): D8(p, q) = max(|x - s|, |y - t|). The pixels within a given D8 distance form a square pattern around the center pixel.
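
The three distance measures side by side, as a small illustrative sketch:

```python
import math

def d_e(p, q):
    """Euclidean distance between pixels p = (x, y) and q = (s, t)."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def d4(p, q):
    """City-block distance |x - s| + |y - t| (diamond-shaped balls)."""
    return abs(p[0] - q[0]) + abs(p[1] - q[1])

def d8(p, q):
    """Chessboard distance max(|x - s|, |y - t|) (square-shaped balls)."""
    return max(abs(p[0] - q[0]), abs(p[1] - q[1]))

p, q = (0, 0), (3, 4)
print(d_e(p, q), d4(p, q), d8(p, q))  # 5.0 7 4
```

Note that D8 is never larger than D4, and the Euclidean distance lies between them for the same pair of pixels.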

 What Is Digital Image Processing? (The Origins of Digital Image Processing)  Fundamental Steps in Digital Image Processing  Image Sampling and Quantization  Spatial and Gray-Level Resolution  Some Basic Relationships Between Pixels  Zooming and Shrinking Digital Images  Lookup tables  Color spaces Lecture’s outline

Zooming and shrinking digital images. Zooming requires two steps: 1. create new pixel locations; 2. assign gray-level values to those locations.

To increase the size of an image by an integer factor, the method of "pixel replication" can be used: for example, when enlarging a 512 x 512 image to 1024 x 1024, every column and every row in the original image is duplicated. At high magnification factors, checkerboard patterns appear. Examples of non-adaptive interpolation:
- Nearest-neighbor interpolation
- Bilinear interpolation (2 x 2 neighborhood)
- Bicubic interpolation (4 x 4 neighborhood)
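
Nearest-neighbor and bilinear zooming can be sketched as follows. This is illustrative code (not the lecture's implementation), with simple edge handling by clamping; bicubic is omitted for brevity:

```python
def zoom_nearest(image, factor):
    """Nearest-neighbor zoom: each new pixel copies the closest original
    pixel (equivalent to pixel replication for integer factors)."""
    h, w = len(image), len(image[0])
    return [[image[int(y / factor)][int(x / factor)]
             for x in range(int(w * factor))]
            for y in range(int(h * factor))]

def zoom_bilinear(image, factor):
    """Bilinear zoom: each new pixel is a weighted average of the 2 x 2
    original pixels around its back-projected location (edges clamped)."""
    h, w = len(image), len(image[0])
    out = []
    for y in range(int(h * factor)):
        row = []
        for x in range(int(w * factor)):
            fx, fy = x / factor, y / factor
            x0, y0 = min(int(fx), w - 2), min(int(fy), h - 2)
            dx, dy = min(fx - x0, 1.0), min(fy - y0, 1.0)
            row.append((1 - dx) * (1 - dy) * image[y0][x0]
                       + dx * (1 - dy) * image[y0][x0 + 1]
                       + (1 - dx) * dy * image[y0 + 1][x0]
                       + dx * dy * image[y0 + 1][x0 + 1])
        out.append(row)
    return out

tiny = [[0, 100], [100, 200]]
print(zoom_nearest(tiny, 2)[0])   # [0, 0, 100, 100]: replicated columns
print(zoom_bilinear(tiny, 2)[0])  # [0.0, 50.0, 100.0, 100.0]: smoothed ramp
```

The hard steps between replicated pixels are what produce the checkerboard/blocky look at high magnification; the bilinear average smooths them out.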

(Figure: scaling up the same image using pixel replication, bilinear interpolation, and bicubic interpolation.)

 What Is Digital Image Processing? (The Origins of Digital Image Processing)  Fundamental Steps in Digital Image Processing  Image Sampling and Quantization  Spatial and Gray-Level Resolution  Some Basic Relationships Between Pixels  Zooming and Shrinking Digital Images  Lookup tables  Color spaces Lecture’s outline

Look-up tables (LUTs):
- Save computational time (LUTs can be found early in history…).
- Require a mapping or transformation function: an equation that converts the brightness value of the input pixel to another value in the output pixel.
- Need not alter the stored pixel values.
Image transformations that involve look-up tables can be implemented by either of two mechanisms: at the input, so that the original image data are transformed; or at the output, so that a transformed image is displayed but the original image remains unmodified.
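
A minimal LUT sketch (illustrative; the 8-bit negative, t(v) = 255 - v, is just an example transformation function). The function is evaluated once per possible gray level, after which each pixel costs only a single table lookup:

```python
def apply_lut(image, lut):
    """Map every pixel through a precomputed look-up table: one indexing
    operation per pixel instead of re-evaluating the transformation."""
    return [[lut[p] for p in row] for row in image]

# Transformation function evaluated once for all 256 possible levels.
negative_lut = [255 - v for v in range(256)]

img = [[0, 64], [128, 255]]
print(apply_lut(img, negative_lut))  # [[255, 191], [127, 0]]
```

Applied at display time (the "output" mechanism above), `img` itself would stay untouched and only the displayed copy would be mapped.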

 What Is Digital Image Processing? (The Origins of Digital Image Processing)  Fundamental Steps in Digital Image Processing  Image Sampling and Quantization  Spatial and Gray-Level Resolution  Some Basic Relationships Between Pixels  Zooming and Shrinking Digital Images  Lookup tables  Color spaces Lecture’s outline

There are ways to describe color images other than the RGB space. Color space = color gamut. RGB = 3 x 8-bit channels = 24-bit = "true color". The histograms of RGB images can be viewed either as separate channels or as a weighted average of the channels. Some grayscale representations of color images calculate a weighted average of the red, green, and blue channels.
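
One common choice of weights for such an average is the ITU-R BT.601 luma convention (0.299, 0.587, 0.114); the lecture does not state which weights it uses, so treat these as an example:

```python
def rgb_to_gray(r, g, b):
    """Weighted average of R, G, B using the ITU-R BT.601 luma weights
    (one common convention; others exist, e.g. ITU-R BT.709)."""
    return round(0.299 * r + 0.587 * g + 0.114 * b)

print(rgb_to_gray(255, 255, 255))  # 255: white stays white (weights sum to 1)
print(rgb_to_gray(0, 255, 0))      # 150: green dominates perceived brightness
print(rgb_to_gray(0, 0, 255))      # 29: blue contributes least
```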

Hue-Saturation-Intensity (HSI) is more intuitive, matching how we perceive the world: Hue = position on the color spectrum; Saturation = color purity; Intensity = brightness. Related spaces: Hue-Saturation-Lightness (HSL) and Hue-Saturation-Brightness (HSB).
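
Python's standard library implements the closely related Hue-Saturation-Value (HSV) space in the `colorsys` module (all components in [0, 1]); a quick illustration of how hue, saturation, and brightness separate:

```python
import colorsys

# Pure red: hue 0 (start of the spectrum), fully saturated, full brightness.
h, s, v = colorsys.rgb_to_hsv(1.0, 0.0, 0.0)
print(h, s, v)  # 0.0 1.0 1.0

# A washed-out red keeps the same hue but lower saturation (less purity).
h2, s2, v2 = colorsys.rgb_to_hsv(1.0, 0.5, 0.5)
print(h2, s2, v2)  # 0.0 0.5 1.0
```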

End of Lecture 1 Thank you!