1 Graphics III Image Processing II

2 Acknowledgement Most of this lecture note has been taken from the lecture notes for the Multimedia Technology course at the University of Louisiana at Lafayette. I would like to thank Assoc. Prof. Dr. William Bares, who created such excellent work in these lecture notes.

3 Image Processing Applications
Improve contrast, sharpen, remove noise, detect edges of features
Detect motion in consecutive frames for motion detection in security systems
Retouch scanned photographs
Creative effects: warping, emboss, compositing

4 Contrast and Dynamic Range
Contrast: distinction between light and dark shades
Dynamic Range: span from minimum to maximum color intensity values
Using histograms to analyze contrast and dynamic range

5 Histogram
Graph of the number of pixels in an image having each possible pixel value.
For example, assume monochrome images with 8 bits per pixel, and allocate a histogram array of 256 integer values, all initially zero.
Loop over all pixels p
  c = monochrome intensity value of pixel p
  Histogram[c] = Histogram[c] + 1
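A minimal Python sketch of this histogram loop, assuming the image is stored as a 2-D list of 8-bit intensity values (the function and variable names are illustrative):

def compute_histogram(image):
    # 256 bins, one per possible 8-bit intensity, all initially zero
    histogram = [0] * 256
    for row in image:
        for c in row:          # c = monochrome intensity value of one pixel
            histogram[c] += 1
    return histogram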

6 Grayscale image and its histogram

7 Histograms for RGB color images
For RGB color images, a separate histogram is generated for the red, green, and blue components.
The horizontal axis is labeled with the RGB pixel values, and the vertical axis measures the number of pixels having a given pixel value.

8 RGB histogram for a color image

9 Contrast Enhancement
Improve the contrast and dynamic range of a dull, washed-out image.
Low-contrast grayscale image and its histogram

10 Contrast Enhancement Process
Step 1: Shift histogram
Shift the histogram so more pixels have values near zero, where L is the lowest intensity value present in the image.
Loop over all pixels (x,y)
  ResultPixel1(x,y) = InputPixel(x,y) - L
Step 2: Scale histogram to expand dynamic range
Scale the highest intensity H into a value equal to or close to 255.
Scale = 255 / H
Loop over all pixels (x,y)
  ResultPixel2(x,y) = Scale * ResultPixel1(x,y)
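A sketch of both steps in Python, assuming an 8-bit grayscale image stored as a 2-D list; here L and H are taken as the minimum and maximum intensities actually present (the function name is illustrative):

def stretch_contrast(image):
    # Step 1: shift so the lowest intensity present (L) becomes zero
    flat = [p for row in image for p in row]
    low = min(flat)
    shifted = [[p - low for p in row] for row in image]
    # Step 2: scale so the highest shifted intensity (H) maps to 255
    high = max(p for row in shifted for p in row)
    scale = 255 / high if high > 0 else 1.0
    return [[min(255, round(p * scale)) for p in row] for row in shifted]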

11 Image Processing Filters
Convolve pixels of the input image using an HxV filter kernel.
Changing the filter kernel produces a variety of effects such as low-pass filtering (blur), emboss, and edge detection.

12 Filter kernel applied to 3x3 block of pixels
For example, a 3x3 filter kernel is applied at a pixel (x,y) of an image. The following convolution step is applied for each pixel of the image.
Loop over all pixels (x,y)
  ResultPixel(x,y) = K(1) * P(x-1,y+1) + K(2) * P(x,y+1) + K(3) * P(x+1,y+1) +
                     K(4) * P(x-1,y)   + K(5) * P(x,y)   + K(6) * P(x+1,y)   +
                     K(7) * P(x-1,y-1) + K(8) * P(x,y-1) + K(9) * P(x+1,y-1)
  Clamp ResultPixel(x,y) to the range 0..255

13 Image Processing Filter (cont.)
Note: Assign pixel values of zero when the filter extends past the edge of the image.
RGB color images: each pixel is represented by 3 values (r,g,b), where 0 <= r,g,b <= 255.
Consequently, the above generic 3x3 filter algorithm must be performed separately for the red, green, and blue components.
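A Python sketch of the full filter loop under these assumptions (grayscale image as a 2-D list, out-of-image pixels treated as zero, kernel as a 3x3 list of lists; the function and parameter names are illustrative). For an RGB image the same routine would simply be called once per channel.

def apply_filter(image, kernel):
    # kernel[0] corresponds to the K(1)..K(3) row of the slide (the y+1 neighbors)
    height, width = len(image), len(image[0])
    def pixel(x, y):
        # pixels outside the image are treated as zero
        if 0 <= x < width and 0 <= y < height:
            return image[y][x]
        return 0
    result = [[0] * width for _ in range(height)]
    for y in range(height):
        for x in range(width):
            total = 0
            for ky in range(3):
                for kx in range(3):
                    total += kernel[ky][kx] * pixel(x + kx - 1, y + 1 - ky)
            # clamp the result to the 0..255 range
            result[y][x] = max(0, min(255, total))
    return result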

14 Edge Detection

15 Vertical Edge Detection
Original row of pixels; shifted by one; subtract to get the result.
A vertical edge appears at the boundary between 0 and 1 pixel values.
Result(x,y) = Pixel(x,y) - Pixel(x-1,y)
Vertical Edge Detection Filter Kernel:
 0  0  0
-1  1  0
 0  0  0

16 Horizontal Edge Detection
Result(x,y) = Pixel(x,y) - Pixel(x,y-1)
Horizontal Edge Detection Filter Kernel:
 0  0  0
 0  1  0
 0 -1  0

17 Combined Vertical and Horizontal Edge Detection
Result(x,y) = Pixel(x,y) - Pixel(x-1,y) - Pixel(x,y-1)
Combined V&H Edge Detection Filter Kernel:
 0  0  0
-1  1  0
 0 -1  0
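Assuming the apply_filter sketch shown earlier (kernel rows listed in the slide's K(1)..K(9) order), the three edge-detection kernels could be written and applied like this; gray_image is an illustrative 2-D list of intensities:

vertical_edge   = [[ 0,  0, 0],
                   [-1,  1, 0],
                   [ 0,  0, 0]]
horizontal_edge = [[ 0,  0, 0],
                   [ 0,  1, 0],
                   [ 0, -1, 0]]
combined_edge   = [[ 0,  0, 0],
                   [-1,  1, 0],
                   [ 0, -1, 0]]
edges = apply_filter(gray_image, combined_edge)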

18 Emboss Creates the effect of an image punched out of gray metal.
Similar to edge detection, with the addition of a constant gray intensity to fill in the flat areas.
Result(x,y) = Source(x,y) - Source(x-dx, y-dy) + 128
where the integer values dx, dy determine the direction of the emboss.
Typical values are dx = dy = 1 or dx = dy = -1.
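A small Python sketch of this formula, assuming a grayscale 2-D list and treating out-of-image neighbors as zero, as in the filter note above (the function name is illustrative):

def emboss(image, dx=1, dy=1):
    height, width = len(image), len(image[0])
    def pixel(x, y):
        if 0 <= x < width and 0 <= y < height:
            return image[y][x]
        return 0
    # difference against the shifted neighbor, plus a constant gray of 128
    return [[max(0, min(255, image[y][x] - pixel(x - dx, y - dy) + 128))
             for x in range(width)]
            for y in range(height)]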

19 Emboss Example

20 Mosaic (Pixelate) Replace the RGB values of a rectangular block of pixels by the average of their RGB values

21 Mosaic Example
For example, the 2x2 block of pixels below is assigned the single RGB color value (50,30,70), found by taking the average of the RGB color values of the four pixels.
Compute 2x2 mosaic of pixels
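A Python sketch of the mosaic operation, assuming the image is a 2-D list of (r,g,b) tuples and the block size is a parameter (names are illustrative):

def mosaic(image, block=2):
    height, width = len(image), len(image[0])
    result = [row[:] for row in image]
    for by in range(0, height, block):
        for bx in range(0, width, block):
            # pixels in this block (blocks at the right/bottom edge may be smaller)
            cells = [(x, y)
                     for y in range(by, min(by + block, height))
                     for x in range(bx, min(bx + block, width))]
            n = len(cells)
            # average each channel over the block and assign it to every block pixel
            avg = tuple(sum(image[y][x][c] for x, y in cells) // n for c in range(3))
            for x, y in cells:
                result[y][x] = avg
    return result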

22 Other Useful Filter Kernels
Shadow mask:
-1  0  1
 0  1  2
Enhancement mask, 3x3 smooth:
 1  2  1
 2  4  2
 1  2  1

23 Other Useful Filter Kernels (cont.)
Enhancement mask: 5x5 smooth

24 Other Useful Filter Kernels (cont.)
Enhancement mask: 3x3 sharpen

25 Other Useful Filter Kernels (cont.)
Enhancement mask: 5x5 sharpen

26 Morphological Image Processing Operations
Morphological operations deal with the shape or structure of sets of pixels.
Applications:
Optical Character Recognition (OCR): identify characters from scanned text pages
Segmentation: identify the pixels forming the boundary of an object in an image

27 Cleaning up Digitized Images
Useful to eliminate errors introduced by noise and limited sampling resolution before applying OCR or segmentation procedures.
Erroneous cases:
Extra noise pixels can join two separate objects or characters that are spaced closely together
Extra noise pixels may be scattered about the image
Missing pixels cause breaks and gaps in objects, causing one object to be mistakenly identified as two separate objects

28 Typical Errors From Digitized Images
Original Document Error Type (a) Error Type (b) Error Type (c)

29 Dilate and Erode Operators
The dilate and erode operators can be employed to eliminate the typical errors found in digitized images.
These operators would be applied prior to OCR or segmentation.
In this example, consider only bilevel (black or white, 1 bit per pixel) images.

30 Binary Dilation
Set all white pixels that are adjacent to a black pixel to black.
Before / After: example of binary dilation (original pixels are marked in gray)
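A Python sketch of binary dilation, assuming a bilevel image stored as a 2-D list with 1 for black and 0 for white, and assuming 4-connected adjacency (an 8-connected neighborhood would work the same way); out-of-image positions count as white:

def neighbors(image, x, y):
    # 4-connected neighbor values; positions outside the image count as white (0)
    height, width = len(image), len(image[0])
    return [image[ny][nx] if 0 <= nx < width and 0 <= ny < height else 0
            for nx, ny in ((x - 1, y), (x + 1, y), (x, y - 1), (x, y + 1))]

def dilate(image):
    # a pixel is black in the result if it was black, or if any neighbor is black
    height, width = len(image), len(image[0])
    return [[1 if image[y][x] == 1 or 1 in neighbors(image, x, y) else 0
             for x in range(width)]
            for y in range(height)]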

31

32 Binary Erosion
Identify all black pixels that have at least one neighboring white pixel, and then set all such pixels to white.
Before / After: example of binary erosion (eroded pixels are marked in light gray)
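The matching erosion sketch, under the same bilevel and 4-connected assumptions and reusing the neighbors helper from the dilation sketch above:

def erode(image):
    # a pixel stays black only if it is black and none of its neighbors is white
    height, width = len(image), len(image[0])
    return [[1 if image[y][x] == 1 and 0 not in neighbors(image, x, y) else 0
             for x in range(width)]
            for y in range(height)]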

33

34 Opening and Closing
Opening: application of an erosion followed by a dilation. This combination of operators is useful to eliminate extraneous pixels from a digitized image.

35 Examples of opening to remove erroneous pixels

36 Opening and Closing
Closing: application of a dilation followed by an erosion. This combination of operators is useful to fill in missing pixels in a digitized image.
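With the dilate and erode sketches shown earlier, opening and closing are simply compositions of the two operators:

def opening(image):
    # erosion removes stray extra pixels, then dilation restores object size
    return dilate(erode(image))

def closing(image):
    # dilation fills small gaps, then erosion restores object size
    return erode(dilate(image))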

37 Example of closing to fill in missing pixels

38 Digital Image Compositing
Compositing creates a new image by combining multiple source images Television and motion picture special effects represent the major application of compositing

39 Optical Compositing Far before computers and digital image processing were available, early photographers and filmmakers developed optical compositing methods. In 1857 the swedish photgraphers Oscar Rejlander combined the images from 32 glass negatives to produce one massive print. This allowed the photographer to film several small groups of models in separate easier to manage shoots rather than assembling a large cast for a single shoot.

40 Optical Compositing (cont.)
Many early movie special-effects shots were filmed using the rear-projection technique, in which the special-effects footage was projected onto a screen behind the live action.
In the frame from King Kong, stop-motion footage of the miniature ape was projected onto a screen behind actress Fay Wray.

41 Digital Compositing
Digital compositing methods compute the pixels of the resulting composite image by combining the pixels of multiple source images.
The pseudo-code to blend two equal-sized source images is as follows:
Loop over all pixels (x,y)
  ResultPixel(x,y) = SourcePixelOne(x,y) * Scale1 + SourcePixelTwo(x,y) * Scale2

42 Digital Compositing (cont.)
For RGB color images, the blending equation must be applied separately to each of the three color channels for each pixel.
ResultPixelRed   = SourcePixelOneRed   * Scale1 + SourcePixelTwoRed   * Scale2
ResultPixelGreen = SourcePixelOneGreen * Scale1 + SourcePixelTwoGreen * Scale2
ResultPixelBlue  = SourcePixelOneBlue  * Scale1 + SourcePixelTwoBlue  * Scale2

43 Digital Compositing (cont.)
For example, blend 50% of image one with 50% of image two. Since the two scale factors typically sum to 1.0, it is only necessary to specify the scale factor for the first image and derive the second by subtracting the first scale factor from 1.0.
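A Python sketch of this cross-dissolve, assuming two equal-sized RGB images stored as 2-D lists of (r,g,b) tuples and a single scale factor for the first image (names are illustrative):

def blend(image_one, image_two, scale1=0.5):
    # the two scale factors sum to 1.0
    scale2 = 1.0 - scale1
    return [[tuple(min(255, round(p1[c] * scale1 + p2[c] * scale2)) for c in range(3))
             for p1, p2 in zip(row1, row2)]
            for row1, row2 in zip(image_one, image_two)]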

44 Alpha Channel Compositing
More intricate compositing effects can be achieved by using an alpha channel mask or matte to specify the scale factors on a pixel-by-pixel basis.

45 Alpha Channel Compositing (cont.)
The pseudo-code for the alpha channel compositing operation is as follows:
Scale1 = AlphaChannel(x,y)   // Assume values are normalized to the range 0.0 to 1.0
Scale2 = 1.0 - Scale1
ResultPixel(x,y) = SourceImageOne(x,y) * Scale1 + SourceImageTwo(x,y) * Scale2
Another example blends two images using a gradient alpha channel matte.
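The same idea with a per-pixel matte, sketched in Python under the assumption that the alpha channel is a 2-D list of values already normalized to 0.0..1.0 (names are illustrative):

def alpha_composite(image_one, image_two, alpha):
    result = []
    for row1, row2, alpha_row in zip(image_one, image_two, alpha):
        out_row = []
        for p1, p2, a in zip(row1, row2, alpha_row):
            scale1, scale2 = a, 1.0 - a   # per-pixel scale factors taken from the matte
            out_row.append(tuple(min(255, round(p1[c] * scale1 + p2[c] * scale2))
                                 for c in range(3)))
        result.append(out_row)
    return result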

46 Blue Screen Compositing
Keying on a designated color can automatically create alpha channel mattes.
For example, film the action against a solid blue background and create a matte by assigning matte values of 0.0 for each blue pixel and 1.0 otherwise.
The background color must be selected so that it does not appear on the people or objects being filmed. Green is also often used as the solid background color.
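A simple sketch of matte extraction by keying on blue, assuming an RGB image as a 2-D list of (r,g,b) tuples; the "predominantly blue" test and its margin are illustrative assumptions, not a standard keying rule:

def extract_matte(image, margin=60):
    # matte value 0.0 where the pixel is predominantly blue (background), 1.0 otherwise
    return [[0.0 if (b > r + margin and b > g + margin) else 1.0
             for (r, g, b) in row]
            for row in image]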

47 Blue Screen Compositing (cont.)
Chroma key systems can replace areas of the background key color with a live video source. This technique is commonly used to composite weathercasters over computer-generated weather maps and radar displays.

48 Blue Screen Compositing (cont.)
For example, lizard puppets are filmed against a blue background and composited over a swamp background in a popular television commercial.

49 Automatic matte extraction from blue screen image

50 Compositing and Computer-Generated (CG) Imagery

51 Live-action crowd. CG filled in the distant crowds.

