Digital Image Processing: Introduction

1 Digital Image Processing: Introduction

2 Introduction
“One picture is worth more than ten thousand words” – Anonymous

3 References
“Digital Image Processing”, Rafael C. Gonzalez & Richard E. Woods, Addison-Wesley, 2002. Much of the material that follows is taken from this book.
“Machine Vision: Automated Visual Inspection and Robot Vision”, David Vernon, Prentice Hall, 1991. Available online at: homepages.inf.ed.ac.uk/rbf/BOOKS/VERNON/

4 UNIT 1

5 Contents This lecture will cover:
What is a digital image?
What is digital image processing?
History of digital image processing
State of the art examples of digital image processing
Key stages in digital image processing

6 What is a Digital Image? A digital image is a representation of a two-dimensional image as a finite set of digital values, called picture elements or pixels. The real world is continuous – an image is simply a digital approximation of it.

7 Pixel values typically represent gray levels, colours, heights, opacities, etc.
Remember that digitization implies a digital image is an approximation of a real scene. 1 pixel

8 Common image formats include:
1 sample per point (B&W or Grayscale)
3 samples per point (Red, Green, and Blue)
4 samples per point (Red, Green, Blue, and “Alpha”, a.k.a. Opacity)
For most of this course we will focus on grey-scale images.

9 The figure is an example of a digital image like the one you are now viewing on your computer screen. Actually, this image is nothing but a two-dimensional array of numbers ranging between 0 and 255. Each number represents the value of the function f(x,y) at a point; in this case the values 128, 230 and 123 each represent an individual pixel value. The dimensions of the picture are the dimensions of this two-dimensional array.
[Figure: a 3×3 grid of sample pixel values, e.g. 128 230 123 / 232 221 77 / 89 80 255]
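The idea of an image as a two-dimensional array of numbers can be sketched directly in code. This is an illustrative example using NumPy; the array values are arbitrary sample pixels, not taken from any real photograph:

```python
import numpy as np

# A tiny 3x3 grayscale "image": each entry is f(x, y), a value in [0, 255].
# These sample values are illustrative only.
img = np.array([[128, 230, 123],
                [232, 221,  77],
                [ 89,  80, 255]], dtype=np.uint8)

print(img.shape)             # dimensions of the 2D array: (3, 3)
print(img[0, 1])             # pixel value at row 0, column 1: 230
print(img.min(), img.max())  # values stay within the 0-255 range
```

The `uint8` dtype matches the usual 8-bit gray-level range described above.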

10 What is Digital Image Processing?
Digital image processing focuses on two major tasks: improvement of pictorial information for human interpretation, and processing of image data for storage, transmission and representation for autonomous machine perception.

11 How it works In the figure above, an image has been captured by a camera and sent to a digital system, which removes all the other detail and focuses on the water drop by zooming in on it in such a way that the quality of the image is preserved.

12 Introduction Signal processing is a discipline in electrical engineering and mathematics that deals with the analysis and processing of analog and digital signals, including storing, filtering and other operations on them. These signals include transmission signals, sound or voice signals, image signals and others. Of all these, the field that deals with signals for which both the input and the output are images is image processing. As its name suggests, it deals with processing images. It can be further divided into analog image processing and digital image processing.

13 Analog image processing
Analog image processing is performed on analog signals: two-dimensional analog signals are manipulated by electrical means, by varying the electrical signal. A common example is the television image.
Digital image processing
Digital image processing deals with developing a digital system that performs operations on a digital image. Over time, digital image processing has come to dominate analog image processing because of its wider range of applications.

14 The continuum from image processing to computer vision can be broken up into low-, mid- and high-level processes.
Low Level Process: Input: Image. Output: Image. Examples: noise removal, image sharpening.
Mid Level Process: Input: Image. Output: Attributes. Examples: object recognition, segmentation.
High Level Process: Input: Attributes. Output: Understanding. Examples: scene understanding, autonomous navigation.
Consider the analogy of a character recognition system. Low level: cleaning up the image of some text. Mid level: segmenting the text from the background and recognising individual characters. High level: understanding what the text says.
In this course we will stop here.

15 History of Digital Image Processing
Early 1920s: One of the first applications of digital imaging was in the newspaper industry: the Bartlane cable picture transmission service. Images were transferred by submarine cable between London and New York. Pictures were coded for cable transfer and reconstructed at the receiving end on a telegraph printer. Early digital image

16 History of DIP (cont…) Mid to late 1920s: Improvements to the Bartlane system resulted in higher quality images New reproduction processes based on photographic techniques Increased number of tones in reproduced images Early 15 tone digital image Improved digital image

17 History of DIP (cont…) 1960s: Improvements in computing technology and the onset of the space race led to a surge of work in digital image processing 1964: Computers used to improve the quality of images of the moon taken by the Ranger 7 probe Such techniques were used in other space missions including the Apollo landings A picture of the moon taken by the Ranger 7 probe minutes before landing

18 History of DIP (cont…) 1970s: Digital image processing begins to be used in medical applications. 1979: Sir Godfrey N. Hounsfield & Prof. Allan M. Cormack share the Nobel Prize in medicine for the invention of tomography, the technology behind Computerised Axial Tomography (CAT) scans. Typical head slice CAT image

19 History of DIP (cont…) 1980s - Today: The use of digital image processing techniques has exploded and they are now used for all kinds of tasks in all kinds of areas Image enhancement/restoration Artistic effects Medical visualisation Industrial inspection Law enforcement Human computer interfaces

20 Examples: Image Enhancement
One of the most common uses of DIP techniques: improve quality, remove noise etc

21 Examples: The Hubble Telescope
Launched in 1990 the Hubble telescope can take images of very distant objects However, an incorrect mirror made many of Hubble’s images useless Image processing techniques were used to fix this

22 Examples: Artistic Effects
Artistic effects are used to make images more visually appealing, to add special effects and to make composite images

23 Examples: Medicine Take a slice from an MRI scan of a canine heart, and find boundaries between types of tissue: start from an image with gray levels representing tissue density, then use a suitable filter to highlight edges. Original MRI Image of a Dog Heart / Edge Detection Image

24 Examples: GIS Geographic Information Systems
Digital image processing techniques are used extensively to manipulate satellite imagery Terrain classification Meteorology

25 Examples: GIS (cont…) Night-Time Lights of the World data set
Global inventory of human settlement Not hard to imagine the kind of analysis that might be done using this data

26 Examples: Industrial Inspection
Human operators are expensive, slow and unreliable Make machines do the job instead Industrial vision systems are used in all kinds of industries Can we trust them?

27 Examples: PCB Inspection
Printed Circuit Board (PCB) inspection Machine inspection is used to determine that all components are present and that all solder joints are acceptable Both conventional imaging and x-ray imaging are used

28 Examples: Law Enforcement
Image processing techniques are used extensively by law enforcers Number plate recognition for speed cameras/automated toll systems Fingerprint recognition Enhancement of CCTV images

29 Examples: HCI Try to make human computer interfaces more natural
Face recognition Gesture recognition Does anyone remember the user interface from “Minority Report”? These tasks can be extremely difficult

30 Applications of Digital Image Processing
Image sharpening and restoration Medical field Remote sensing Transmission and encoding Machine/Robot vision Color processing Pattern recognition Video processing Microscopic Imaging Others

31 Image sharpening and restoration
Image sharpening and restoration here refers to processing images captured by a modern camera to make them better, or manipulating them to achieve a desired result – essentially what Photoshop does. This includes zooming, blurring, sharpening, gray scale to colour conversion and vice versa, edge detection, image retrieval and image recognition. Common examples: Original / Zoomed / Blurred
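Blurring and sharpening of the kind listed above are typically done by convolving the image with a small kernel. The following is a minimal, unoptimized sketch using NumPy; the box-blur and sharpening kernels shown are common textbook choices, not a specific Photoshop implementation:

```python
import numpy as np

def convolve3x3(image, kernel):
    """Naive 3x3 filtering with zero padding (illustrative, not optimized)."""
    h, w = image.shape
    padded = np.pad(image.astype(float), 1)
    out = np.zeros((h, w), dtype=float)
    for y in range(h):
        for x in range(w):
            out[y, x] = np.sum(padded[y:y + 3, x:x + 3] * kernel)
    return out

# Box blur: average of the 3x3 neighbourhood.
blur = np.ones((3, 3)) / 9.0
# A common sharpening kernel: centre boosted, 4-neighbours subtracted.
sharpen = np.array([[ 0, -1,  0],
                    [-1,  5, -1],
                    [ 0, -1,  0]], dtype=float)

img = np.array([[10, 10, 10],
                [10, 90, 10],
                [10, 10, 10]], dtype=float)

print(convolve3x3(img, blur)[1, 1])     # bright centre smoothed toward its neighbours
print(convolve3x3(img, sharpen)[1, 1])  # bright centre pushed further from neighbours
```

For the centre pixel, the blur gives (8·10 + 90)/9 ≈ 18.9 and the sharpen gives 5·90 − 4·10 = 410, illustrating the two opposite effects.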

32 Sharp image Edges

33 UV imaging In the field of remote sensing, an area of the earth is scanned by a satellite or from very high ground, and the scan is then analyzed to obtain information about it. One particular application of digital image processing in remote sensing is detecting infrastructure damage caused by an earthquake.

34 Hurdle detection Hurdle (obstacle) detection is one of the common tasks performed through image processing: different types of objects in the image are identified, and the distance between the robot and the obstacles is calculated.

35 Line follower robot Many robots today work by following a line and are thus called line follower robots. This helps a robot move along its path and perform some tasks, and it too is achieved through image processing.

36 Fundamental Steps in Digital Image Processing:
Starting from the problem domain, the key stages are: Image Acquisition, Image Enhancement, Image Restoration, Colour Image Processing, Wavelets & Multiresolution Processing, Image Compression, Morphological Processing, Segmentation, Representation & Description, and Object Recognition, all supported by a common Knowledge Base. Outputs of the earlier processes generally are images; outputs of the later ones are image attributes.

37 Step 1: Image Acquisition
The image is captured by a sensor (e.g. a camera) and, if the output of the sensor is not already in digital form, digitized using an analogue-to-digital converter.

38 Step 2: Image Enhancement
The process of manipulating an image so that the result is more suitable than the original for a specific application. The idea behind enhancement techniques is to bring out details that are hidden, or simply to highlight certain features of interest in an image.

39 Step 3: Image Restoration
Improving the appearance of an image using techniques that tend to be based on mathematical or probabilistic models. Enhancement, on the other hand, is based on human subjective preferences regarding what constitutes a “good” enhancement result.

40 Step 4: Colour Image Processing
Use the colour of the image to extract features of interest; colour modelling and processing in a digital domain, etc.

41 Step 5: Wavelets Wavelets are the foundation for representing images at various degrees of resolution. They are used in image data compression, where images are subdivided into smaller regions.

42 Step 6: Compression Techniques for reducing the storage required to save an image or the bandwidth required to transmit it.

43 Step 7: Morphological Processing
Tools for extracting image components that are useful in the representation and description of shape. This step marks a transition from processes that output images to processes that output image attributes.

44 Step 8: Image Segmentation Segmentation procedures partition an image into its constituent parts or objects. Important Tip: The more accurate the segmentation, the more likely recognition is to succeed.
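A minimal sketch of the simplest segmentation procedure, global thresholding, illustrates the idea. The image values and the threshold of 128 are arbitrary choices for illustration; real systems often pick the threshold automatically (e.g. with Otsu's method):

```python
import numpy as np

# Toy grayscale image: a bright "object" on a dark background (values 0-255).
img = np.array([[ 20,  30, 200, 210],
                [ 25,  35, 220, 230],
                [ 15,  40, 215, 205]], dtype=np.uint8)

# Global thresholding: every pixel above the threshold is labelled "object".
threshold = 128
mask = img > threshold   # boolean mask partitioning the image into two classes
print(mask.sum())        # number of pixels labelled as object: 6
```

The boolean mask is the segmented result; later stages (representation, description, recognition) would operate on it rather than on the raw pixels.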

45 Step 9: Representation and Description
Representation: transforming raw data into a form suitable for subsequent computer processing; it almost always follows the output of a segmentation stage. A decision must be made whether the data should be represented as a boundary or as a complete region. Boundary representation focuses on external shape characteristics, such as corners and inflections; region representation focuses on internal properties, such as texture or skeletal shape. Description deals with extracting attributes that result in some quantitative information of interest or are basic for differentiating one class of objects from another.

46

47 Step 10: Object Recognition Recognition is the process that assigns a label, such as “vehicle”, to an object based on the information provided by its descriptors.

48 Components of an Image Processing System
A typical general-purpose DIP system, applied to a problem domain, comprises: image sensors, specialized image processing hardware, a computer, image processing software, mass storage, image displays, hardcopy devices and a network.

49 Components of an Image Processing System
Image Sensors Two elements are required to acquire digital images. The first is the physical device that is sensitive to the energy radiated by the object we wish to image (the sensor). The second, called a digitizer, is a device for converting the output of the physical sensing device into digital form.

50 Components of an Image Processing System
2. Specialized Image Processing Hardware Usually consists of the digitizer, mentioned before, plus hardware that performs other primitive operations, such as an arithmetic logic unit (ALU), which performs arithmetic and logical operations in parallel on entire images. This type of hardware sometimes is called a front-end subsystem, and its most distinguishing characteristic is speed. In other words, this unit performs functions that require fast data throughputs that the typical main computer cannot handle.

51 Components of an Image Processing System
4. Image Processing Software Software for image processing consists of specialized modules that perform specific tasks. A well-designed package also includes the capability for the user to write code that, as a minimum, utilizes the specialized modules.

52 Components of an Image Processing System
5. Mass Storage Capability Mass storage capability is a must in image processing applications. An image of size 1024 × 1024 pixels requires one megabyte of storage space if the image is not compressed. Digital storage for image processing applications falls into three principal categories: 1. Short-term storage for use during processing. 2. On-line storage for relatively fast recall. 3. Archival storage, characterized by infrequent access.
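The one-megabyte figure follows directly from the image dimensions, as a quick check shows (assuming 8 bits, i.e. one byte, per pixel):

```python
# Storage needed for an uncompressed 1024 x 1024 grayscale image
# at 1 byte per pixel, as stated above.
width, height, bytes_per_pixel = 1024, 1024, 1
size_bytes = width * height * bytes_per_pixel
print(size_bytes)                  # 1048576 bytes
print(size_bytes / (1024 * 1024))  # exactly 1.0 megabyte
```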

53 Components of an Image Processing System
5. Mass Storage Capability One method of providing short-term storage is computer memory. Another is specialized boards, called frame buffers, that store one or more images and can be accessed rapidly; these allow virtually instantaneous image zoom, as well as scroll (vertical shifts) and pan (horizontal shifts). On-line storage generally takes the form of magnetic disks and optical-media storage; the key factor characterizing on-line storage is frequent access to the stored data. Finally, archival storage is characterized by massive storage requirements but infrequent need for access.

54 Components of an Image Processing System
6. Image Displays The displays in use today are mainly color (preferably flat screen) TV monitors. Monitors are driven by the outputs of the image and graphics display cards that are an integral part of a computer system.

55 Components of an Image Processing System
7. Hardcopy devices Used for recording images; they include laser printers, film cameras, heat-sensitive devices, inkjet units and digital units such as optical and CD-ROM disks.

56 Components of an Image Processing System
8. Networking Networking is almost a default function in any computer system in use today. Because of the large amount of data inherent in image processing applications, the key consideration in image transmission is bandwidth. In dedicated networks this is typically not a problem, but communications with remote sites via the internet are not always as efficient.

57 Summary We have looked at:
What is a digital image?
What is digital image processing?
History of digital image processing
State of the art examples of digital image processing
Key stages in digital image processing
Next time we will start to see how it all works…

58 A digital image is a multidimensional array of numbers (such as an intensity image) or vectors (such as a colour image). Each component of the image is called a pixel and is associated with a pixel value (a single number in the case of intensity images, or a vector in the case of colour images).

59 Visual Perception: Human Eye
(Picture from Microsoft Encarta 2000)

60 Cross Section of the Human Eye
(Images from Rafael C. Gonzalez and Richard E. Woods, Digital Image Processing, 2nd Edition.)

61 Visual Perception: Human Eye (cont.)
1. The lens contains 60-70% water and about 6% fat.
2. The iris diaphragm controls the amount of light that enters the eye.
3. Light receptors in the retina:
- About 6-7 million cones for bright-light vision, called photopic vision. The density of cones is about 150,000 elements/mm². Cones are involved in colour vision and are concentrated in the fovea, an area of about 1.5 × 1.5 mm².
- About 75 to 150 million rods for dim-light vision, called scotopic vision. Rods are sensitive to low levels of light and are not involved in colour vision.
4. The blind spot is the region where the optic nerve emerges from the eye.

62 Blind-Spot Experiment
Draw an image similar to that below on a piece of paper (the dot and cross are about 6 inches apart) Close your right eye and focus on the cross with your left eye Hold the image about 20 inches away from your face and move it slowly towards you The dot should disappear!

63

64 Image Formation In The Eye
Muscles within the eye can be used to change the shape of the lens, allowing us to focus on objects that are near or far away. An image is focused onto the retina, causing rods and cones to become excited; these ultimately send signals to the brain.

65 Brightness Adaptation of Human Eye : Mach Band Effect
[Plot: perceived intensity versus position]

66 Mach Band Effect The intensities of surrounding points affect the perceived brightness at each point. In this image, edges between bars appear brighter on the right side and darker on the left side. (Images from Rafael C. Gonzalez and Richard E. Woods, Digital Image Processing, 2nd Edition.)

67 Mach Band Effect (cont.) In area A the perceived brightness is darker, while in area B it is brighter. This phenomenon is called the Mach Band Effect.

68 Brightness Adaptation of Human Eye : Simultaneous Contrast
All the small squares have exactly the same intensity, but they appear progressively darker as the background becomes lighter.

69 Simultaneous Contrast
(Images from Rafael C. Gonzalez and Richard E. Woods, Digital Image Processing, 2nd Edition.)

70 Optical illusion (Images from Rafael C. Gonzalez and Richard E. Woods, Digital Image Processing, 2nd Edition.)

71 Visible Spectrum (Images from Rafael C. Gonzalez and Richard E. Woods, Digital Image Processing, 2nd Edition.)

72 Image Sensors Single sensor Line sensor Array sensor
(Images from Rafael C. Gonzalez and Richard E. Woods, Digital Image Processing, 2nd Edition.)

73 Image Sensors : Single Sensor
(Images from Rafael C. Gonzalez and Richard E. Woods, Digital Image Processing, 2nd Edition.)

74 Image Sensors : Line Sensor
Fingerprint sweep sensor; Computerized Axial Tomography (Images from Rafael C. Gonzalez and Richard E. Woods, Digital Image Processing, 2nd Edition.)

75 Fundamentals of Digital Images
An image “After snow storm”, with the origin at the top-left corner and axes x and y, is a function f(x,y):
- An image is a multidimensional function of spatial coordinates.
- Spatial coordinate: (x,y) for the 2D case such as a photograph; (x,y,z) for the 3D case such as CT scan images; (x,y,t) for movies.
- The function f may represent intensity (for monochrome images), colour (for colour images) or other associated values.

76 Conventional Coordinate for Image Representation
(Images from Rafael C. Gonzalez and Richard E. Woods, Digital Image Processing, 2nd Edition.)

77 Digital Image Types : Intensity Image
In an intensity image or monochrome image, each pixel corresponds to a light intensity, normally represented in gray scale (gray levels). Gray scale values

78 Digital Image Types : RGB Image
Color image or RGB image: each pixel contains a vector representing red, green and blue components. RGB components

79 Image Types : Binary Image
In a binary image or black-and-white image, each pixel contains one bit: 1 represents white, 0 represents black. Binary data

80 Image Types : Index Image
Each pixel contains an index number pointing to a colour in a colour table of Red, Green and Blue components (e.g. index 1 → R 0.1, G 0.5, B 0.3). Index value

81 Digital Image Acquisition Process
(Images from Rafael C. Gonzalez and Richard E. Woods, Digital Image Processing, 2nd Edition.)

82 Generating a Digital Image

83 Basic Relationship of Pixels
Conventional indexing method: the origin (0,0) is at the top-left corner, x increases to the right and y increases downward. Pixel (x,y) has horizontal neighbours (x±1,y), vertical neighbours (x,y±1) and diagonal neighbours (x±1,y±1).

84 Neighbors of a Pixel
The neighborhood relation is used to tell which pixels are adjacent; it is useful for analyzing regions.
4-neighbors of p at (x,y): N4(p) = { (x-1,y), (x+1,y), (x,y-1), (x,y+1) }
The 4-neighborhood relation considers only vertical and horizontal neighbors. Note: q ∈ N4(p) implies p ∈ N4(q).

85 Neighbors of a Pixel (cont.)
8-neighbors of p at (x,y): N8(p) = { (x±1,y), (x,y±1), (x±1,y±1) }
The 8-neighborhood relation considers all neighboring pixels.

86 Neighbors of a Pixel (cont.)
Diagonal neighbors of p at (x,y): ND(p) = { (x-1,y-1), (x+1,y-1), (x-1,y+1), (x+1,y+1) }
The diagonal-neighborhood relation considers only the diagonal neighbor pixels.
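The three neighborhood sets N4, ND and N8 defined above can be sketched as small helper functions (the function names n4, nd and n8 are our own, chosen for illustration):

```python
def n4(x, y):
    """4-neighbours of pixel (x, y): horizontal and vertical only."""
    return {(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)}

def nd(x, y):
    """Diagonal neighbours of pixel (x, y)."""
    return {(x + 1, y + 1), (x + 1, y - 1), (x - 1, y + 1), (x - 1, y - 1)}

def n8(x, y):
    """8-neighbours: union of the 4-neighbours and the diagonal neighbours."""
    return n4(x, y) | nd(x, y)

p = (5, 5)
print(len(n4(*p)), len(nd(*p)), len(n8(*p)))  # 4 4 8
# The relation is symmetric: q in N4(p) implies p in N4(q).
q = (6, 5)
print(q in n4(*p) and p in n4(*q))            # True
```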

87 Connectivity Connectivity is adapted from the neighborhood relation: two pixels are connected if they are in the same class (i.e. the same color or the same range of intensity) and they are neighbors of one another. For p and q from the same class:
- 4-connectivity: p and q are 4-connected if q ∈ N4(p)
- 8-connectivity: p and q are 8-connected if q ∈ N8(p)
- mixed-connectivity (m-connectivity): p and q are m-connected if q ∈ N4(p), or q ∈ ND(p) and N4(p) ∩ N4(q) = ∅

88 Adjacency A pixel p is adjacent to a pixel q if they are connected. Two image subsets S1 and S2 are adjacent if some pixel in S1 is adjacent to some pixel in S2. We can define the type of adjacency (4-adjacency, 8-adjacency or m-adjacency) depending on the type of connectivity.

89 Types of Adjacency 4-adjacency: two pixels p and q with values from V are 4-adjacent if q is in the set N4(p). 8-adjacency: two pixels p and q with values from V are 8-adjacent if q is in the set N8(p). m-adjacency (mixed adjacency): defined on the next slide.

90 Types of Adjacency m-adjacency:
Two pixels p and q with values from V are m-adjacent if: q is in N4(p), or q is in ND(p) and the set N4(p) ∩ N4(q) has no pixel whose values are from V (no intersection). Important note: the type of adjacency used must be specified.

91 Types of Adjacency Mixed adjacency is a modification of 8-adjacency. It is introduced to eliminate the ambiguities that often arise when 8-adjacency is used. For example:

92 Types of Adjacency In this example, note what happens when we connect two pixels (find a path between them): with 8-adjacency you can find multiple paths between the two pixels, while with m-adjacency there is only one. So m-adjacency eliminates the multiple-path connections that 8-adjacency generates. Two subsets S1 and S2 are adjacent if some pixel in S1 is adjacent to some pixel in S2, where adjacent means 4-, 8- or m-adjacency.

93 Path A path from pixel p at (x,y) to pixel q at (s,t) is a sequence of distinct pixels (x0,y0), (x1,y1), (x2,y2), …, (xn,yn) such that (x0,y0) = (x,y), (xn,yn) = (s,t), and (xi,yi) is adjacent to (xi-1,yi-1) for i = 1,…,n. We can define the type of path (4-path, 8-path or m-path) depending on the type of adjacency.

94 Path (cont.) An 8-path from p to q results in some ambiguity; the m-path from p to q resolves this ambiguity.

95 Distance For pixels p, q and z, with coordinates (x,y), (s,t) and (v,w) respectively, D is a distance function if: (a) D(p,q) ≥ 0, with D(p,q) = 0 iff p = q; (b) D(p,q) = D(q,p); and (c) D(p,z) ≤ D(p,q) + D(q,z).

96 Distance (cont.) The D4 distance (city-block distance) is defined as D4(p,q) = |x − s| + |y − t|. Pixels with D4 = 1 from p are the 4-neighbors of p.

97 Distance (cont.) The D8 distance (chessboard distance) is defined as D8(p,q) = max(|x − s|, |y − t|). Pixels with D8 = 1 from p are the 8-neighbors of p.

98 Distance Measures The Euclidean distance between p and q is defined as De(p,q) = [(x − s)² + (y − t)²]^(1/2). Pixels having a distance less than or equal to some value r from (x,y) are the points contained in a disk of radius r centered at (x,y).

99 Distance Measures The D4 distance (also called city-block distance) between p and q is defined as D4(p,q) = |x − s| + |y − t|. Pixels having a D4 distance from (x,y) less than or equal to some value r form a diamond centered at (x,y).

100 Distance Measures Example: The pixels with distance D4 ≤ 2 from (x,y) form the following contours of constant distance. The pixels with D4 = 1 are the 4-neighbors of (x,y)

101 Distance Measures The D8 distance (also called chessboard distance) between p and q is defined as D8(p,q) = max(|x − s|, |y − t|). Pixels having a D8 distance from (x,y) less than or equal to some value r form a square centered at (x,y). In the figure, D8 = max(D8(a), D8(b)).

102 Distance Measures Example: the pixels with D8 distance ≤ 2 from (x,y) form the following contours of constant distance.
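The three distance functions can be sketched directly from their definitions (the function names here are illustrative):

```python
def d_e(p, q):
    """Euclidean distance De(p,q) = [(x - s)^2 + (y - t)^2]^(1/2)."""
    return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5

def d4(p, q):
    """City-block distance D4(p,q) = |x - s| + |y - t|."""
    return abs(p[0] - q[0]) + abs(p[1] - q[1])

def d8(p, q):
    """Chessboard distance D8(p,q) = max(|x - s|, |y - t|)."""
    return max(abs(p[0] - q[0]), abs(p[1] - q[1]))

p, q = (0, 0), (3, 4)
print(d_e(p, q))  # 5.0
print(d4(p, q))   # 7
print(d8(p, q))   # 4
```

Note that D8 ≤ De ≤ D4 always holds, which matches the square, disk and diamond contours described above.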

103 Distance Measures Dm distance:
is defined as the shortest m-path between the points. In this case, the distance between two pixels will depend on the values of the pixels along the path, as well as the values of their neighbors.

104 Distance Measures Example:
Consider the following arrangement of pixels and assume that p, p2 and p4 have value 1, and that p1 and p3 can have a value of 0 or 1. Suppose that we consider adjacency of pixels with value 1 (i.e. V = {1}).

105 Distance Measures Cont. Example:
Now, to compute the Dm distance between points p and p4, we have 4 cases. Case 1: if p1 = 0 and p3 = 0, the length of the shortest m-path (the Dm distance) is 2: (p, p2, p4).

106 Distance Measures (cont.) Example: Case 2: if p1 = 1 and p3 = 0, then p2 and p are no longer m-adjacent (see the m-adjacency definition), and the length of the shortest m-path becomes 3: (p, p1, p2, p4).

107 Distance Measures (cont.) Example: Case 3: if p1 = 0 and p3 = 1, the same reasoning applies, and the shortest m-path has length 3: (p, p2, p3, p4).

108 Distance Measures (cont.) Example: Case 4: if p1 = 1 and p3 = 1, the length of the shortest m-path is 4: (p, p1, p2, p3, p4).

109 Image Sampling and Quantization
Image sampling: discretizing an image in the spatial domain. Spatial resolution / image resolution: pixel size or number of pixels. (Images from Rafael C. Gonzalez and Richard E. Woods, Digital Image Processing, 2nd Edition.)

110 How to choose the spatial resolution
Spatial resolution = sampling locations. With undersampling, we lose some image details!

111 How to choose the spatial resolution : Nyquist Rate
No detail is lost provided the Nyquist rate is respected: the sampling period (spatial resolution) must be less than or equal to half of the minimum period in the image, or equivalently the sampling frequency must be greater than or equal to twice the maximum frequency. (In the example, the minimum period of the original image is 2 mm and the sampling period is 1 mm.)

112 Aliased Frequency Sampling rate: 5 samples/sec
Two different frequencies, but the same sampled values!
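This aliasing can be sketched numerically. At a sampling rate of 5 samples/sec, a 7 Hz sinusoid produces exactly the same samples as a 2 Hz one, because the two frequencies differ by exactly the sampling rate (the specific frequencies are illustrative choices):

```python
import numpy as np

fs = 5.0                   # sampling rate: 5 samples/sec, as above
n = np.arange(10)
t = n / fs                 # sample instants

# 2 Hz and 7 Hz differ by exactly fs, so their samples coincide:
s2 = np.sin(2 * np.pi * 2 * t)
s7 = np.sin(2 * np.pi * 7 * t)
print(np.allclose(s2, s7))  # True: 7 Hz "aliases" to 2 Hz
```

The 7 Hz signal violates the Nyquist criterion (fs ≥ 2 × 7 Hz would be needed), so it cannot be distinguished from its 2 Hz alias after sampling.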

113 Effect of Spatial Resolution
256x256 pixels 64x64 pixels 128x128 pixels 32x32 pixels

114 Effect of Spatial Resolution
(Images from Rafael C. Gonzalez and Richard E. Woods, Digital Image Processing, 2nd Edition.)

115 Moire Pattern Effect : Special Case of Sampling
Moire patterns occur when the frequencies of two superimposed periodic patterns are close to each other. (Images from Rafael C. Gonzalez and Richard E. Woods, Digital Image Processing, 2nd Edition.)

116 Effect of Spatial Resolution
(Images from Rafael C. Gonzalez and Richard E. Woods, Digital Image Processing, 2nd Edition.)

117 Can we increase spatial resolution by interpolation?
No: downsampling is an irreversible process. (Images from Rafael C. Gonzalez and Richard E. Woods, Digital Image Processing, 2nd Edition.)

118 Image Quantization Image quantization: discretizing continuous pixel values into discrete numbers. Colour resolution / colour depth / levels: the number of colours or gray levels, or the number of bits representing each pixel value. The number of colours or gray levels Nc is given by Nc = 2^b, where b = number of bits.
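The relation Nc = 2^b, and the effect of requantizing an image to fewer levels, can be sketched as follows (the bit-depth choices and pixel values are illustrative):

```python
import numpy as np

# Number of gray levels for b bits: Nc = 2**b.
for b in (1, 4, 8):
    print(b, 2 ** b)             # 1 bit -> 2, 4 bits -> 16, 8 bits -> 256 levels

# Requantize an 8-bit image down to 16 levels (4 bits) by mapping each
# pixel to the bottom of its quantization bin.
img = np.array([[0, 15, 16, 255]], dtype=np.uint8)
levels = 16
step = 256 // levels             # width of each quantization bin: 16
quantized = (img // step) * step
print(quantized.tolist())        # [[0, 0, 16, 240]]
```

Coarse quantization like this is what produces the false contouring seen in the low-level images on the following slides.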

119 Quantization function
[Plot: quantizer output level versus light intensity, from darkest to brightest]

120 Effect of Quantization Levels

121 Effect of Quantization Levels (cont.) 16 levels / 8 levels / 4 levels / 2 levels. In the 16-level image, it is already easy to see false contouring.

122 How to select the suitable size and pixel depth of images
The word “suitable” is subjective, depending on the “subject”. Low detail image, medium detail image, high detail image (e.g. the Lena and Cameraman images). To satisfy the human mind: 1. For images of the same size, a low-detail image may need more pixel depth. 2. As image size increases, fewer gray levels may be needed. (Images from Rafael C. Gonzalez and Richard E. Woods, Digital Image Processing, 2nd Edition.)

123 Human vision: Spatial Frequency vs Contrast

124 Human vision: Distinguish ability for Difference in brightness
Regions with 5% brightness difference

