
Image Display & Enhancement

Presentation on theme: "Image Display & Enhancement"— Presentation transcript:

1 Image Display & Enhancement
Lecture 2. Prepared by R. Lathrop 10/99, updated 1/03.
Readings: ERDAS Field Guide 5th ed., Chap 4; Ch 5: ; App A Math Topics.

2 Where in the World?

3 Learning objectives
Remote sensing science concepts: role of the additive color process in computer display; spectral enhancement through image stretching; image fusion; image segmentation
Math concepts: summarizing image data sets; measures of central location and dispersion
Skills: calculating the disk space of a digital image; image spectral enhancement (LUT and stretching methods); image fusion and basic segmentation

4 Analog-to-digital conversion process
A-to-D conversion transforms a continuous analog signal into a discrete numerical (digital) representation by sampling that signal at a specified frequency. Most of the traditional remote sensing archive is in analog format, so it must be converted to digital number form. (Figure: continuous analog radiance signal L sampled at interval dt into discrete values. Adapted from Lillesand & Kiefer.)

5 Digital Images
Measurement vector of a pixel - the set of data file values for one pixel in all n bands
Digital Number (DN) or Brightness Value (BV) - the tonal gray scale expressed as a number, typically an 8-bit number (0-255): BV = 0 is black, BV = 255 is white
Dimensionality - the number of data layers (bands)

6 Digital Image
Multiple spatially co-registered bands (e.g., Bands 1-3, each with 8-bit DNs of 0-255) can be displayed singly in B&W or combined in a color composite. The bands are co-registered pixel by pixel, and their values overlay to produce the composite color.

7 Image Notation
i = row (or line) in the image; j = column; k, l = bands of imagery
BVijk = BV in row i, column j of band k
n = total # of pixels in an array
Example grid: columns = j = 5, rows = i = 4
For a RS image we should know the number of rows, columns, total pixels, and bands (spectral resolution), and the BV range (radiometric resolution).

8 Calculating disk space
[ (x * y * b) * n ] x 1.4 = output file size in bytes
where: y = rows; x = columns; b = number of bytes per pixel per band; n = number of bands. The factor 1.4 adds 30% for pyramid layers and 10% for other info.
e.g. (5 * 4 * 1) * 4 * 1.4 = 112 bytes
4-bit data: b = 0.5; 8-bit data: b = 1; 16-bit data: b = 2; 32-bit data: b = 4
How much disk space does a data set take in Erdas Imagine? The number of bytes per pixel per band depends on the radiometric resolution of your data (i.e., 4-bit, 8-bit or 16-bit).
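The rule of thumb above can be sketched as a small function (a sketch only; the function name is mine, not part of ERDAS):

```python
def imagine_file_size(cols, rows, bytes_per_pixel, n_bands):
    """Estimate output file size in bytes for an ERDAS Imagine image.

    The 1.4 factor follows the slide's rule of thumb: ~30% extra for
    pyramid layers plus ~10% for header and statistics information.
    """
    return round(cols * rows * bytes_per_pixel * n_bands * 1.4)

# The 5-column x 4-row, 8-bit (b = 1), 4-band example from the slides:
print(imagine_file_size(5, 4, 1, 4))  # -> 112
```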

9 Digital Image Storage Formats
Band sequential (BSQ) - each band is contained in a separate file
Band interleaved by line (BIL) - each record in the file contains a scan line (row) of data for one band, with successive bands recorded as successive lines
Band interleaved by pixel (BIP) - each record contains the values for all bands for one pixel, with successive pixels recorded in sequence
These are the three common digital image storage formats.
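A minimal NumPy sketch showing how the same tiny 2-band image is laid out on disk under each ordering (the pixel values are hypothetical):

```python
import numpy as np

# Tiny image: 2 bands, 2 rows, 3 columns, stored as (bands, rows, cols)
cube = np.arange(12).reshape(2, 2, 3)

# BSQ: all of band 1, then all of band 2
bsq = cube.reshape(-1)
# BIL: row 1 of band 1, row 1 of band 2, row 2 of band 1, ...
bil = cube.transpose(1, 0, 2).reshape(-1)
# BIP: both band values for pixel (0,0), then for pixel (0,1), ...
bip = cube.transpose(1, 2, 0).reshape(-1)

print(bil.tolist())  # [0, 1, 2, 6, 7, 8, 3, 4, 5, 9, 10, 11]
```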

10 Summarizing data distributions
Frequency distributions - a method of describing or summarizing large volumes of data by grouping them into a limited number of classes or categories
Histograms - graphical representation of a frequency distribution in the form of a bar chart
Frequency distributions and histograms are the two usual ways of summarizing a data distribution.

11 Summarizing Data Distributions: Histograms
(Figure: histogram of a remote sensing band; x-axis Digital Number 0-255, y-axis # of pixels.)

12 Measures of Central Location
Mean - simple arithmetic average, the sum of all observations divided by the number of observations
Median - the middle number in a data set, midway in the frequency distribution
Mode - the value that occurs with the greatest frequency, the peak in a histogram
These are the three measures of the central location of the data. Why do we use both the mean and the median? Think about the difference between the two.

13 Measures of Central Location
(Figure: histogram marking the mode, median and mean; x-axis Digital Number 0-255, y-axis # of pixels.)

14 Measures of Dispersion
Range - the difference between the largest and smallest value
Variance - the average of the squared deviations between the data values and the mean
Standard Deviation - the square root of the variance, in the units of data measurement
These three parameters are usually used to measure the dispersion of univariate data.
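The central-location and dispersion measures above can be checked with Python's statistics module on a hypothetical DN sample:

```python
from statistics import median, mode

dns = [60, 60, 60, 90, 90, 120, 150, 200]  # hypothetical band sample

mean = sum(dns) / len(dns)                 # arithmetic average
med = median(dns)                          # middle of the sorted values
mod = mode(dns)                            # most frequent value
rng = max(dns) - min(dns)                  # range
var = sum((x - mean) ** 2 for x in dns) / len(dns)  # population variance
std = var ** 0.5                           # standard deviation, in DN units

print(mean, med, mod, rng)  # 103.75 90.0 60 140
```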

15 Measures of Dispersion: Range
Example: Range = (max - min) = 200 - 60 = 140
(Figure: histogram with Min = 60 and Max = 200; x-axis Digital Number 0-255, y-axis # of pixels.)

16 Covariance & Correlation Matrices
Provide a useful summary of data relationships. High variance suggests a higher information content for that band. High correlation suggests a substantial amount of redundancy; low correlation suggests that each band provides information not found in the other. The covariance matrix measures how the bands of a multivariate data set vary together.

17 Covariance Matrix
(Figure: covariance matrix for bands 1-7.)
Diagonals represent band variances (example: the variance for Band 3 = ). Off-diagonals represent covariances: for example, the covariance of Band 1 and Band 4 = -35.3, the same as the covariance of Band 4 and Band 1. A negative covariance means that as one band increases, the other decreases.
The covariance matrix is symmetric (A = AT): the covariances below the diagonal equal those above it.
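NumPy can compute and verify these properties on a small hypothetical 4-band sample (the values are invented for illustration):

```python
import numpy as np

# Hypothetical DN samples: one row per band, one column per pixel
bands = np.array([
    [60.0, 70.0, 80.0, 90.0],     # Band 1 increases ...
    [55.0, 65.0, 85.0, 95.0],
    [30.0, 40.0, 35.0, 50.0],
    [200.0, 180.0, 170.0, 150.0]  # ... while Band 4 decreases
])

cov = np.cov(bands)        # 4x4 covariance matrix, one row/column per band
corr = np.corrcoef(bands)  # correlation matrix, values in [-1, 1]

print(np.allclose(cov, cov.T))  # True: the covariance matrix is symmetric
print(cov[0, 3] < 0)            # True: Band 1 and Band 4 covary negatively
```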

18 Calculating the covariance matrix in ArcTools
We use ArcTools to calculate the covariance matrix: open ArcTools, go to the spatial statistics tools, then go to Band Collection Statistics.

19 Image Display
A computer display monitor has 3 color planes - R, G and B - each of which can display DNs or BVs with values between 0-255. Three layers of data can be viewed simultaneously: one layer in the red plane, one in the green plane and one in the blue plane. The monitor uses the three primary colors (red, green, blue) to display a color composite from the brightness values.

20 Image Display: RGB color compositing
Red band, e.g. DN = 0; green band, e.g. DN = 90; blue band, e.g. DN = 200. The result is a blue-green pixel (0, 90, 200 RGB): for a given three-band display, the composite color of each pixel is built from its BV triplet.

21 Landsat MSS bands 4 and 5 (GREEN, RED)

22 Landsat MSS bands 6 and 7 (INFRARED 1, INFRARED 2)
Note: water absorbs IR energy, so there is no return and water appears black.

23 MSS color composite
Combining bands creates a false color composite: red = vegetation, light blue = urban, black = water, pink = agriculture.
(Figure: false color composite showing Manhattan, Rutgers, Philadelphia, the Pine Barrens, Chesapeake Bay and the Delaware River.)

24 Primary Colors Red Green Blue

25 Subtractive Primary Colors
Yellow (R+G) - absence of blue; Cyan (G+B) - absence of red; Magenta (R+B) - absence of green.
In other words: Yellow = white - blue; Cyan = white - red; Magenta = white - green.

26 Additive Color Process
(Table of colors and their R, G, B components:) white = R + G + B; black = no R, G or B; grey = equal parts R, G and B; red = R only; yellow = R + G; cyan = G + B; magenta = R + B; orange = high R with some G; dark blue = low B only.

27 Color Additive Process
White = red + green + blue; yellow = red + green; cyan = green + blue; magenta = blue + red.
(Figure: three overlapping R, G, B circles on a black background, forming Y, C, M in the overlaps and white W at the center.)

28 Color Subtractive Process
Yellow - green = red; yellow - red = green; magenta - blue = red; magenta - red = blue; cyan - blue = green; cyan - green = blue.
Black = white - red - green - blue.
(Figure: overlapping C, M, Y circles on a white background.)

29 Image Spectral Enhancement

30 Image spectral enhancement
Image display devices typically operate over a range of 256 gray levels. Ideally the image data ranges over this full extent.
(Figure: histogram spanning the full display range, Min = 0 to Max = 255.)

31 Image spectral enhancement
However, sensor data in a single band rarely extend over this entire range, resulting in a loss of contrast. The objective of spectral enhancement is to determine a transformation function that improves the brightness, contrast and color balance, and thereby enhances image interpretability.
(Figure: histogram occupying only Min = 50 to Max = 200, with no data in the tails of the 0-255 range.)

32 Image spectral enhancement: lookup tables
Image file values are read into the image processor display memory. These values are then manipulated (stretched) for display by specifying the contents of the 256-element color look-up table (LUT). By changing the LUT, the user can easily change the output display without changing the original file DN values.
The steps: 1) create the lookup table; 2) stretch the image values into the lookup table range (0-255); 3) display the lookup table values.
Example: a data file green band DN of 100 maps through the LUT to an enhanced green pixel with a display DN of 190.
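A NumPy sketch of the LUT idea (the 50-200 input range is an assumed example, and the stretch used to fill the table is a simple min-max one):

```python
import numpy as np

# Build a 256-entry LUT implementing a min-max linear stretch
lo, hi = 50, 200  # assumed data range for this band
lut = np.clip((np.arange(256) - lo) * 255.0 / (hi - lo), 0, 255).astype(np.uint8)

# Applying the LUT changes only the displayed values ...
file_dns = np.array([50, 100, 200], dtype=np.uint8)
display_dns = lut[file_dns]
print(display_dns.tolist())  # [0, 85, 255]
# ... while the original file DNs are untouched
```

Because the table has only 256 entries, it is computed once and every pixel is enhanced by a simple array lookup.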

33 Image spectral enhancement: stretch value
(Figure: the difference between the file pixel values of the original image data and the stretched LUT values.)

34 Image spectral enhancement: Lookup tables
Pros of the LUT method: all possible values are computed only once, which is computationally efficient, and the original data are not changed. The transformation function may be linear or non-linear.

35 LUT Input-Output relationship: ideal
(Figure: Output DN vs Input DN, both 0-255, with a 1-to-1 transformation function: Input = 127 maps to Output = 127.) A 1-to-1 transformation does not change anything: input = output. From ERDAS Imagine Field Guide, 5th Ed.

36 Transformation function
(Figure: Output DN vs Input DN, both 0-255.) The steeper the transformation line, the greater the contrast stretch.

37 LUT Breakpoint Editor for ERDAS Imagine
We may use the LUT Breakpoint Editor to stretch the minimum and maximum of the original data to 0 and 255, respectively. In the lab, we will show how to use it.

38 Image spectral enhancement: Min-max linear contrast stretch
(Figure: histogram with input values 60, 108 and 158; 60 is stretched to 0 and 158 to 255, respectively, so 108 maps to 125.)

39 Linear transformation function
(Figure: linear transformation function with Input min = 60 mapping to Output min = 0 and Input max = 158 mapping to Output max = 255.) The steeper the transformation line, the greater the contrast stretch.

40 Image spectral enhancement: Min-max linear contrast stretching
Linear stretch: a uniform expansion in which all values, including rarely occurring ones, are weighted equally.
DN' = [(DN - MIN) / (MAX - MIN)] x 255
Example: DN = 108, MIN = 60, MAX = 158. DN' = [(108 - 60) / (158 - 60)] x 255 = [48 / 98] x 255 = 0.49 x 255 = 125
Pro: easy. Con: can overstretch, because it ignores the frequency distribution.
Example from Lillesand & Kiefer, 2nd ed.
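The slide's formula can be written as a function that reproduces the worked example:

```python
import numpy as np

def minmax_stretch(dn, dn_min, dn_max):
    """Min-max linear contrast stretch of DN values to the 0-255 range."""
    out = (np.asarray(dn, dtype=float) - dn_min) / (dn_max - dn_min) * 255.0
    return np.clip(np.round(out), 0, 255).astype(np.uint8)

# Slide example: DN = 108 with image min 60 and max 158
print(minmax_stretch([60, 108, 158], 60, 158).tolist())  # [0, 125, 255]
```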

41 Image spectral enhancement: Std. Dev. linear contrast stretching
If the data histogram is near normal, then 95% of the data lie within +-2 standard deviations of the mean, with 2.5% in each tail. A standard deviation stretch maps mean - 2 std dev to 0 and mean + 2 std dev to 255, clipping the tails.

42 Why is this image SO magenta colored?
TM 4-5-3 in R-G-B. The brightness values assigned to the red and blue bands are too high, so the image appears magenta.

43 Overstretching: too much of a good thing

44 Image spectral enhancement: Histogram stretching
Histogram stretch: image values are assigned to the display LUT on the basis of their frequency of occurrence - greatest contrast near the mode, least contrast in the histogram tails.
(Figure: histogram with values 38, 60, 108 and 158 mapped across the 0-255 display range. Example from Lillesand & Kiefer, 2nd ed.)
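This frequency-based assignment is essentially histogram equalization; a sketch on an assumed sample, using the cumulative histogram as the transformation function:

```python
import numpy as np

def histogram_equalize(dn):
    """Frequency-based stretch (histogram equalization): display values are
    assigned from the cumulative histogram, so contrast is greatest where
    pixels are most frequent and least in the sparse tails."""
    hist, _ = np.histogram(dn, bins=256, range=(0, 256))
    cdf = hist.cumsum().astype(float)
    cdf = cdf * 255.0 / cdf[-1]          # normalize the CDF to 0-255
    return cdf[dn.astype(np.int64)].astype(np.uint8)

dn = np.array([60, 60, 60, 108, 158])    # hypothetical band values
out = histogram_equalize(dn)
print(out.tolist())  # [153, 153, 153, 204, 255]
```

Note how the frequent value 60 is pushed well up the display range, while the rare tail values share what remains.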

45 Histogram stretching
(Figure: transformation function with Input min = 60 mapping to Output min = 0 and Input max = 158 mapping to Output max = 255; the function is nonlinear in the tails of the distribution.)

46 Image spectral enhancement: Contrast stretching
Special stretch: the display range can be assigned to any particular user-defined range of image values, focusing the contrast on that range.
(Figure: input range 60-158 with the user-defined range around 92 stretched across 0-255. Example from Lillesand & Kiefer, 2nd ed.)

47 Special piecewise stretching
Different sections of the input data are stretched to different extents; i.e., different pieces of the transformation function line have different slopes. (Figure: piecewise transformation function, Output DN vs Input DN, both 0-255.)
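np.interp gives a compact piecewise-linear stretch; the breakpoints below are hypothetical:

```python
import numpy as np

# Hypothetical breakpoints: expand the 60-110 input range across most of
# the display range, compressing values above 110
in_bp = [0, 60, 110, 255]
out_bp = [0, 10, 210, 255]

dn = np.array([60, 85, 110])
print(np.interp(dn, in_bp, out_bp).tolist())  # [10.0, 110.0, 210.0]
```

Each (input, output) breakpoint pair defines one segment of the transformation line, so the middle segment here has a much steeper slope than the last.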

48 Adaptive Filtering
Image stretching as described so far is a global operator - it applies the stretch equally across the entire scene and doesn't take into account local differences in image brightness or other characteristics, which is not always the best approach. Adaptive filters instead adapt the stretch to a smaller region of interest, usually the area within a moving window. Details will be introduced in the lab class.

49 Mini-Quiz
Can we stretch an image with DN > 255 (say a 10-bit image) to an 8-bit image (DN values 0-255)?
(Figures: Image 1 has a DN range of 255-510; Image 2 is 10-bit, 0-1023, with a DN range of 130-900.)
Answers: Yes when RANGE (= max - min) <= 255 (e.g., Image 1), because no information will be lost. Be cautious when RANGE > 255 (e.g., Image 2), since some information will be lost during the stretch.

50 Multisensor fusion
Various techniques have been developed to merge low spatial resolution (but high spectral resolution) imagery with high spatial resolution (but low spectral resolution, e.g., panchromatic) imagery - for example, TM multispectral with ETM+ PAN. Multisensor fusion will become more common as new high spatial resolution PAN imagery becomes more widely available. Given two data sets, one with lower spatial resolution (e.g., TM multispectral) and one with higher resolution (e.g., ETM+ panchromatic), can we merge the two images together? Multisensor fusion does exactly this.

51 One meter Pan-sharpened Multispectral IKONOS imagery (simulated)
Tennis courts in Washington Park, Denver, CO

52 Quickbird image example: Barnegat Bay, NJ 10/18/2004
Panchromatic: m Multispectral (color): m Pixel size for this merged Pan-Multi image is 0.7 m

53 Example: IHS Color-space transform
RGB to IHS: transform from Red-Green-Blue color space to Intensity-Hue-Saturation. The low and high resolution images are co-registered and resampled to the same GRC. Three bands of the multispectral image are converted to IHS space, the PAN band is substituted for the Intensity component, and the result is back-transformed into RGB color space. A disadvantage is that only 3 bands may be transformed at a time. This method is similar to a coordinate transformation in mathematics. Details will be introduced in the lab course.

54 Intensity, Hue & Saturation color coordinate system
(Figure: IHS color coordinate system.) Intensity ranges from 0 to 255; saturation ranges from 0 to 255; hue ranges from 0 to 360.

55 Example: PCA Spectral domain fusion
The low and high resolution images are co-registered and resampled to the same GRC. A principal component analysis (PCA) of the multispectral image is computed, the PAN image is substituted for the 1st PC (often the "brightness" component), and the result is back-transformed to image space. This technique can be used for any number of bands, and is generally a good compromise between limited spectral distortion and visual attractiveness. Details will be provided in the lab.

56 Example: High Pass Filter (HPF) method
Capture the high frequency information from the high spatial resolution panchromatic image (a single band) using some form of high pass filter, then add this high frequency information into the low spatial resolution multispectral imagery. This often produces less distortion of the original spectral characteristics of the imagery, but the result is also less visually attractive.

57 Example: Brovey Transform fusion
For each spectral band i: [DNBi / (DNB1 + DNB2 + DNB3)] x (DN of the high resolution image).
The Brovey transform was developed to increase contrast in the low and high tails of the image histogram for visual interpretation; it doesn't preserve the original scene radiometry.
Other methods: multiplicative, spherical coordinates, wavelets.
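A NumPy sketch of the Brovey band ratio (array shapes and pixel values are assumed for illustration):

```python
import numpy as np

def brovey_fuse(ms, pan):
    """Brovey transform fusion: each band's share of the total multispectral
    intensity is rescaled by the high resolution pan band.

    ms: (3, rows, cols) multispectral bands, already resampled to the pan
    grid; pan: (rows, cols) panchromatic band.
    """
    total = ms.sum(axis=0)
    return ms / total * pan  # broadcasting applies pan to every band

ms = np.array([[[32.0]], [[64.0]], [[32.0]]])  # one pixel, three bands
pan = np.array([[256.0]])
print(brovey_fuse(ms, pan).ravel().tolist())  # [64.0, 128.0, 64.0]
```

The band ratios are preserved while the total brightness is replaced by the pan value, which is why the original radiometry is not preserved.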

58 Simple Image Segmentation
Simplify the image into 2 classes by thresholding a single image band, so that additional processing can be applied to each class independently:
DN < threshold = Class 1; DN >= threshold = Class 2
Example: gray level thresholding of the NIR band is used to segment the image into a binary land vs. water mask.
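The gray-level thresholding step as a NumPy sketch (band values and the threshold are hypothetical):

```python
import numpy as np

# Hypothetical NIR band: water absorbs NIR (low DN), land reflects it (high DN)
nir = np.array([[5, 12, 180],
                [8, 200, 220]])

threshold = 40                  # assumed gray-level threshold
water = nir < threshold         # Class 1: water mask
land = nir >= threshold         # Class 2: land mask

print(int(water.sum()), int(land.sum()))  # 3 3
```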

59 Summary
1. The method of calculating the disk space of a digital image;
2. Statistical parameters and color composites;
3. Image spectral enhancement: the LUT and linear and non-linear stretching methods.

60 Homework: Image Statistics (Due Next Monday);
Reading: lecture 2 supplement online; textbook Ch. 4, 5: , 8: ; ERDAS Ch. 4, ; ERDAS App A Math Topics.

