
Multispectral Format from Perspective of Remote Sensing


1 Multispectral Format from Perspective of Remote Sensing
Rulon E. Simmons (585) These charts show the similarities and differences between the requirements for a multispectral format for Remote Sensing and for other applications. They suggest that considering requirements for other applications to be a subset of requirements for Remote Sensing is a reasonable approach. Doing so seems to be in line with the most recent traffic that suggested that non-visible wavelengths not be excluded from the format, and as Danny Rich stated: “Adopting a file format that is a clear superset of what is required for basic color imaging does not cost us anything.”

2 Outline
Color Issues
  Steps to color processing
  Color accuracy in Remote Sensing
  Handling color display of non-visible bands
Format Issues
  Similarities and differences with other multispectral applications
  XML for metadata
  Compression
From the point of view of the CIE, color representation is clearly the focus. This is true of the disciplines that look to the CIE for guidance, including Remote Sensing. While Remote Sensing folks may also choose to evaluate their imagery with computer algorithms, visual inspection is still very important and common. Handling the data in a way that optimizes the visual display is very desirable. The Remote Sensing community currently has some accepted standard formats that are used for handling imagery. While these have been extended to handle multispectral imagery, there is currently a lack of standardization relative to color processing and display.

3 Color Issues
Contributing Persons: Rulon Simmons, Tim Elder, Scott Bennett, Michael Vaughn
The following charts show a four-step process for handling color issues. This process actually comes from the digital camera world. There are current efforts to introduce this process into the Remote Sensing community.

4 Color Processing Approach*
RGB raw → Pre-Processing (DRA, Channel Balance) → Color Correction (RGB' → XYZ) → Rendering (XYZ → LAB, TTC, MTFC, LAB → XYZ') → Output Preparation (XYZ' → RGB'', Gamma) → Display
The four steps to color processing are: (1) Pre-Processing; (2) Color Correction; (3) Rendering; and (4) Output Preparation. From a format point of view, recording the transformations used in the first three steps will help others obtain similar results when viewing the same imagery. The transformations of the last step are display-specific and are therefore not ones that would normally be stored with the imagery.
*Note: This is a standard approach used with digital photography.
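The four-step chain above can be sketched as function composition. Every stage below is a hypothetical placeholder; the names and the operations inside them are illustrative only, not part of any standard:

```python
import numpy as np

# Minimal sketch of the four-step chain. Each stage is a stand-in for
# the operations named on this slide.
def pre_process(rgb):
    # DRA / channel balance: shift each channel so its minimum is zero
    return rgb - rgb.min(axis=(0, 1), keepdims=True)

def color_correct(rgb):
    # RGB' -> XYZ would apply a fitted 3x3 matrix; identity placeholder here
    return rgb

def render(xyz):
    # XYZ -> LAB, TTC, MTFC, LAB -> XYZ'; stand-in contrast boost
    return np.clip(xyz * 1.1, 0.0, None)

def prepare_output(xyz):
    # XYZ' -> RGB'' plus display gamma (display-specific, not stored)
    return np.power(xyz / max(xyz.max(), 1e-9), 1.0 / 2.2)

img = np.random.default_rng(1).random((8, 8, 3))  # synthetic "RGB raw"
out = prepare_output(render(color_correct(pre_process(img))))
```

Only the first three stages would be recorded with the imagery; the output-preparation stage belongs to the display.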

5 Pre-Processing
[Figure: image histogram with Black Point and Calibration Point marked]
In this example, a 12-bit image is acquired. If the red, green, and blue channels were displayed as acquired, the color balance would generally not be good. So the first thing that is done is to select a black reference point (sometimes chosen at a 1% penetration of the low end of the image histogram). This black point is readjusted in each channel to a count level of zero. Then one or more calibration points are chosen in the mid to upper range of the histogram. Each point must have a measured or estimated reflectance in each color channel. This can be done by imaging a gray scale or color patches, or by using some other object whose reflectance is more or less known. A regression using many points is preferable when there is much uncertainty in the reflectance knowledge of the calibration points.
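A minimal sketch of the black-point step, assuming the image is held as a NumPy array; the 1% penetration figure is the example value from the text, and the image itself is synthetic:

```python
import numpy as np

def black_point_adjust(band, penetration=0.01):
    """Shift one band so the chosen black point maps to a count of zero.

    The black point is taken at a small percentile penetration of the low
    end of the band's histogram (1% here, as in the slide's example).
    """
    black = np.percentile(band, penetration * 100)
    return np.clip(band - black, 0, None)

# Hypothetical 12-bit three-band image (rows x cols x 3)
rng = np.random.default_rng(0)
img = rng.integers(200, 4096, size=(64, 64, 3)).astype(float)
balanced = np.dstack([black_point_adjust(img[..., c]) for c in range(3)])
```

The calibration points would then be handled by a separate gain fit, as described on the next slide.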

6 Pre-Processing (Calibration from Digital Counts to Reflectance)
Empirical Line Method* Using one or more calibration patches Using best guess of reflectance of one or more objects Atmospheric modeling (e.g., MODTRAN) *Note: The ELM method of calibration is applicable to any discipline, not just Remote Sensing. The method described on the previous page is used in calibrating digital cameras. It is also used in Remote Sensing where it is called the Empirical Line Method (ELM), used to simultaneously calibrate out sensor and atmospheric effects. The Empirical Line Method works well in situations where there are objects of known reflectance in an image. In other cases, atmospheric modeling is sometimes done. Since the ELM method of calibration is virtually the same as what is done for calibration by other disciplines, it would not be a stretch to include it in the first release of a format standard. Accommodating other atmospheric correction isn’t really too difficult and is something with which I can help.
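The ELM gain and offset can be sketched as an ordinary least-squares line through the calibration points; the counts and reflectances below are made-up illustrative values for a single band:

```python
import numpy as np

def empirical_line(counts, reflectances):
    """Fit a gain/offset line mapping digital counts to reflectance (ELM).

    counts: digital counts measured over calibration targets (1-D)
    reflectances: known reflectances of those targets (1-D)
    Fitting over many points reduces the impact of uncertainty in any
    single target's reflectance, as the previous slide notes.
    """
    gain, offset = np.polyfit(counts, reflectances, 1)
    return gain, offset

# Hypothetical calibration targets in one band
counts = np.array([120.0, 900.0, 2100.0, 3500.0])
refl = np.array([0.02, 0.20, 0.48, 0.80])
gain, offset = empirical_line(counts, refl)
calibrated = gain * counts + offset
```

The same fit, applied per band, simultaneously removes sensor and atmospheric effects in the linear approximation the ELM assumes.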

7 Color Correction (Causes of Color Inaccuracies in Remote Sensing)
Multispectral band passes in the visible region not equivalent to HVS response functions IKONOS, QuickBird Skylight illumination Common phenomenon: Materials in shadow appear bluish Dynamic Range Adjustment (DRA) Varying approaches produce different color presentations Commercial Remote Sensing got a big boost with the launch of IKONOS and QuickBird high-resolution satellites. It was soon noticed that some of the colors in the imagery were different than expected. Subsequent evaluation showed that several factors were responsible for this, the primary one being that the spectral bandpasses of the satellites were quite different from those of the human visual system.

8 Color Correction (Problem 1: Different Spectral Sensitivities)
E = Eye Sensitivity I = IKONOS Sensitivity This chart shows the spectral sensitivities of the eye and of the four spectral bands of IKONOS. Note that the sensor bandpasses are not sensitive to the entire visible range of colors. There is very little sensitivity in an area between the red and green sensors; the blue bandpass is translated about 25 nm away from the blue HVS response; and the red sensor is sensitive to wavelengths beyond the HVS.

9 Color Correction (Ikonos Vs. HVS Color Example)
Here, a simulation tool developed by Kodak attempts to correct IKONOS colors to those seen by the human visual system. You can see that they are quite different. IKONOS Image Simulated HVS Image

10 Color Correction (GretagMacbeth Color Checker: Truth & IKONOS Simulation)
Notice how the colors of the Macbeth Color Checker change when viewed by IKONOS vs. the HVS.

11 Color Correction (Eckerd’s Blue Roof Turns Purple)
The distinctive blue roofs of Eckerd drug stores show up as purple on IKONOS imagery. HVS IKONOS QuickBird

12 Color Correction (Image Example and Simulation of IKONOS Color)
Ground Truth: Digital Camera Colors as seen by IKONOS don’t match the colors as seen by the digital camera. Simulated IKONOS Color

13 Color Correction (Problem 2: Shadow Illumination)
White reference chip in direct sunlight Used to calibrate the ASD spectrometer White reference chip in shadow Illuminated by light scattered by the atmosphere This white reference, shown in sunlight and shade, shows why shaded sides of roofs take on a bluish cast due to the color of scattered skylight.

14 Rendering
Convert from XYZ to LAB (to get to a color appearance space)
Apply a Tone Transfer Curve (TTC) to the "L" channel to stretch midtones while compressing highlights and shadows
Apply Modulation Transfer Function Compensation (MTFC) to the "L" channel to sharpen the image
Convert back to XYZ
[Figure: piece-wise Tone Transfer Curve, Luminance (in) vs. Luminance (out)]
Often, imagery is displayed with just a linear DRA stretch. However, film companies discovered years ago that a characteristic curve that maps log exposure to density with an S-shaped curve is preferable. This stretches the important midtones without clipping shadows and highlights. A similar Tone Transfer Curve, often constructed in a piece-wise fashion as shown, is also useful for digital photography.
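A piece-wise TTC of the kind sketched on this slide can be applied with simple linear interpolation; the knot values below are invented for illustration, on a 0-100 lightness scale:

```python
import numpy as np

def tone_transfer_curve(L, knots_in, knots_out):
    """Apply a piece-wise linear, S-shaped tone transfer curve to the
    L (lightness) channel: midtones are stretched while shadows and
    highlights are compressed."""
    return np.interp(L, knots_in, knots_out)

# Hypothetical knot points defining a gentle S-shape
knots_in = [0.0, 20.0, 50.0, 80.0, 100.0]
knots_out = [0.0, 12.0, 50.0, 88.0, 100.0]
L = np.linspace(0.0, 100.0, 11)
L_ttc = tone_transfer_curve(L, knots_in, knots_out)
```

Storing just the knot pairs in the metadata (as the XML examples later do) is enough for another viewer to reproduce the rendering.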

15 Output Preparation
Phosphor gamut will allow reproduction of most colors seen by IKONOS, but many colors seen by the eye cannot be reproduced by IKONOS. SO HOW CAN YOU DO COLOR CORRECTION? The eye can see a broader range of colors than IKONOS can image or phosphors can display on a monitor.

16 Color Correction Approach
There is no perfect solution; however, all of the problems are addressed by creating a transform that minimizes the Delta E between a set of color patches as seen by the eye and by the sensor. Use real patches where possible. Use synthesized data based on system specs. A transformation that approximates HVS color from IKONOS color can be made by minimizing the Delta E differences of a set of color patches. The transformation is particularly challenged by vegetation, which has a strong reflectance in the near-IR, where IKONOS has sensitivity beyond the range of the human eye.
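As a simplification of the Delta E minimization described above, a first-cut transform can be fit by linear least squares in XYZ (minimizing Delta E proper, in LAB, requires a nonlinear optimizer). All patch values below are synthetic, generated from an arbitrary matrix purely so the example is self-checking:

```python
import numpy as np

def fit_color_matrix(sensor_rgb, target_xyz):
    """Least-squares 3x3 matrix taking sensor RGB to aim XYZ.

    The slides minimize Delta E in LAB; as a linear simplification this
    sketch minimizes squared error in XYZ instead.
    """
    M, *_ = np.linalg.lstsq(sensor_rgb, target_xyz, rcond=None)
    return M.T  # so that xyz = M @ rgb for a single pixel

# Synthetic patch measurements (one row per patch)
sensor = np.array([[0.90, 0.10, 0.05],
                   [0.20, 0.80, 0.10],
                   [0.05, 0.15, 0.85],
                   [0.50, 0.50, 0.50]])
A = np.array([[0.41, 0.36, 0.18],
              [0.21, 0.72, 0.07],
              [0.02, 0.12, 0.95]])
aim = sensor @ A.T  # fabricated "as seen by the eye" values
M = fit_color_matrix(sensor, aim)
```

With real patches the fit is overdetermined and only approximate, which is exactly why more patches (or synthesized spectra from system specs) improve the transform.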

17 Color Correction (Beyond Visible)
Near IR and other non-visual spectral regions can be handled in the same way as shown on the previous slide. (Note: UV is rarely used in Remote Sensing because it is heavily absorbed by the atmosphere.) The transformation seeks to minimize the Delta E between an acquired or synthetically generated image of a set of reference panels and some aim colors. For IKONOS, a “false-color IR” image is produced by mapping the IR channel to the red display, red to the green display, and green to the blue display. Just because a non-visible region of the spectrum is imaged does not mean that it can’t be mapped in a standard way back to visible colors.
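The false-color IR mapping described above is just a band permutation; a sketch, with made-up band values for a four-band sensor:

```python
import numpy as np

def false_color_ir(nir, red, green):
    """Standard false-color IR display mapping for a four-band sensor
    such as IKONOS: NIR -> red gun, red -> green gun, green -> blue gun."""
    return np.dstack([nir, red, green])

# Hypothetical reflectance planes; vegetation-like values (strong NIR)
nir = np.full((2, 2), 0.8)
red = np.full((2, 2), 0.3)
green = np.full((2, 2), 0.2)
display = false_color_ir(nir, red, green)
```

Recording this mapping in metadata, rather than leaving it to each viewer, is what makes the display predictable across tools.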

18 Color Correction (for “hyper”-visible imagery)
Currently in Remote Sensing there is no standard method of mapping color for systems that have more than three spectral channels covering the visible range of the spectrum. Current visualization packages just display three bands (either user-selected or chosen by the program). A useful option would be to smartly combine the bands to make an image that is as close as possible to what the human visual system would see. Systems that have more than three spectral bands in the visible region are good candidates for combining the bands through a weighted combination to produce a close match to the human visual system. The more such bands, the better the match can be made to the eye’s sensitivity. Such a choice for visual representation has the potential to greatly aid imagery interpreters. It would seem to be relevant to all disciplines.
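The weighted combination can be fit by least squares against a sampled HVS response. The Gaussian band shapes and the Gaussian stand-in for the luminance curve below are invented for illustration; a real fit would use the sensor's measured spectral responses and the CIE color-matching functions:

```python
import numpy as np

def fit_band_weights(band_responses, cmf):
    """Least-squares weights combining sensor bands to approximate one
    human color-matching function sampled at the same wavelengths.

    band_responses: (n_wavelengths, n_bands) sensor spectral responses
    cmf: (n_wavelengths,) target response to approximate
    """
    w, *_ = np.linalg.lstsq(band_responses, cmf, rcond=None)
    return w

# Hypothetical: six Gaussian bands across 400-700 nm; the target is a
# Gaussian stand-in for the luminance response peaking near 555 nm
wl = np.linspace(400.0, 700.0, 61)
centers = np.linspace(410.0, 690.0, 6)
bands = np.exp(-0.5 * ((wl[:, None] - centers[None, :]) / 25.0) ** 2)
target = np.exp(-0.5 * ((wl - 555.0) / 45.0) ** 2)
w = fit_band_weights(bands, target)
approx = bands @ w
```

Repeating the fit for each of the three color-matching functions yields a bands-to-tristimulus matrix, and more (narrower) bands make the approximation tighter, as the slide notes.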

19 Format Issues
Contributing Persons: Rulon Simmons, Bernard Brower
One of the challenges of CIE TC 8-07 is to come up with a format standard that is acceptable to a diverse community. Many organizations already have some favorite approaches, whether “home-brewed” or widely accepted standards. Even currently accepted standards are being revised or replaced with exciting new technology.

20 Similarities and Differences between Remote Sensing and other Applications of Multispectral Imaging
All of the requirements for other applications apply to Remote Sensing as well. In addition, Remote Sensing requires geolocation information. Atmospheric effects, most severe for Remote Sensing applications, can be handled in ways similar to other imagery or through more sophisticated modeling techniques (shown on a previous slide). The other main requirement imposed by Remote Sensing (but not entirely unique to it) is compression, in order to handle large data sets efficiently. Note that BIIF, used by NATO, is nearly identical to NITF (used by the U.S. military). The U.S. has plans to adopt the more universal BIIF standard.

21 Reconciling Differences
Consider requirements for all disciplines to be a subset of the requirements for Remote Sensing. Note 1: This approach does not require that all Remote Sensing issues be addressed in the first release of the standard, as long as it is extensible at a later date. Note 2: A basic set of Remote Sensing requirements may not be too difficult to accommodate. Build upon a standard that will be acceptable to all parties in the future. Standardize an approach to recording metadata that can be used within any file format. If format requirements for other disciplines are considered to be a subset of Remote Sensing requirements, everybody wins and can use the format. Taking such a position does not necessarily mean that the more stressing requirements of Remote Sensing would have to be addressed immediately. However, I recommend implementing the relatively simple aspects in the first release.

22 Standard Approach to Metadata (Why Use XML?)
Standard method of coding metadata. Can be used regardless of format. Is human readable as well as machine readable. All file formats under consideration have fields for metadata. This metadata can be used to store information useful for color reproduction as well as information relevant to image size, location, etc. Extensible Markup Language (XML) is a standard, easy-to-use method of recording metadata in a form that is both machine and human readable. Its syntax will be familiar to anyone who has used HTML. Adopting a standard such as XML for metadata storage will simplify the development of software to read the metadata, and the same encoding can be used within more than one format.
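Reading such metadata takes only a few lines with a standard XML parser; the fragment below follows the PARAM pattern used on the next slides, with hypothetical values:

```python
import xml.etree.ElementTree as ET

# Hypothetical metadata fragment in the slides' PARAM style
doc = """
<PRE-PROCESSING>
  <DRA>
    <PARAM NAME="GAIN" VALUE="1.5"/>
    <PARAM NAME="OFFSET" VALUE="10"/>
  </DRA>
</PRE-PROCESSING>
"""

root = ET.fromstring(doc)
# Collect every PARAM, wherever it is nested, into a name -> value map
params = {p.get("NAME"): float(p.get("VALUE")) for p in root.iter("PARAM")}
```

Because the parser ships with essentially every language's standard library, software to read the metadata stays simple regardless of which image format wraps it.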

23 XML for Pre-Processing
<PRE-PROCESSING>
<DRA>
<PARAM NAME="GAIN" VALUE="1.5"/>
<PARAM NAME="OFFSET" VALUE="10"/>
</DRA>
</PRE-PROCESSING>
Here is an example of XML code to handle the DRA pre-processing step. Each function starts with a named opening tag and ends with a matching closing tag marked by an added slash (/). Substeps can be nested within each function using the same format.

24 XML for Image Size
<SIZE>
<PARAM NAME="NROWS" VALUE="1000"/>
<PARAM NAME="NCOLUMNS" VALUE="1000"/>
<PARAM NAME="NBITS" VALUE="12"/>
<PARAM NAME="SEQUENCE" VALUE="BIP"/>
</SIZE>
This XML code would handle all of the image size information.

25 XML for Location
<LOCATION>
<PARAM NAME="COORD PROJECTION" VALUE="UTM"/>
<PARAM NAME="LAT" VALUE="XX.XX"/>
<PARAM NAME="LONG" VALUE="YY.YY"/>
</LOCATION>
This simple XML code would give the geolocation information of perhaps the upper-left pixel of an image.

26 XML for Color Correction
<COLOR_MATRIX>
<PARAM NAME="RGB TO XYZ" VALUE="X1, X2, X3, X4, X5, X6, X7, X8, X9"/>
</COLOR_MATRIX>
This XML code would handle color transformation matrices.

27 XML for Rendering
<RENDERING>
<PARAM NAME="XYZ TO LAB" VALUE="X1, X2, X3, X4, X5, X6, X7, X8, X9"/>
<PARAM NAME="TTC" VALUE="X1, Y1, X2, Y2, X3, Y3, …"/>
<PARAM NAME="MTFC" VALUE="X1, X2, X3, X4, X5, X6, X7, X8, X9"/>
<PARAM NAME="LAB TO XYZ" VALUE="X1, X2, X3, X4, X5, X6, X7, X8, X9"/>
</RENDERING>
In the rendering step, data is converted to LAB, processed to apply a TTC and MTFC, and then converted back to XYZ.

28 XML for Display (In general would not be included in metadata.)
<DISPLAY>
<COLOR_MATRIX>
<PARAM NAME="XYZ TO RGB" VALUE="X1, X2, X3, X4, X5, X6, X7, X8, X9"/>
<PARAM NAME="LAB TO RGB" VALUE="X1, X2, X3, X4, X5, X6, X7, X8, X9"/>
</COLOR_MATRIX>
<PARAM NAME="GAMMA" VALUE="2"/>
</DISPLAY>
This XML would do the color management necessary to display an image on a particular output device. As already indicated, this would not normally be included in an image file.

29 Compression
Compression is required for some, but not all, multispectral applications. JPEG 2000 is an emerging international standard that is gaining wide acceptance (it can be used in lossy or lossless mode). JPEG 2000 can stream very large data sets in real time to small computing devices such as PDAs, because the whole image does not have to be decompressed before it can be sent. JPEG 2000 can be accommodated within the current NITF/BIIF standard, making it attractive to many currently doing Remote Sensing. For those needing compression, JPEG 2000 is an attractive option, having much more capability than standard JPEG or other alternatives. JPEG 2000 serves not only as an emerging standard for compression but also as a file format. Data can be stored in either lossy or lossless mode. JPEG 2000 is also compatible with the currently used NITF and BIIF formats. In other words, it can be used as a stand-alone format or wrapped within NITF and BIIF. This makes JPEG 2000 a particularly attractive format option because it can be used within existing Remote Sensing formats and has application across many other disciplines.

30 Conclusions / Recommendations
Use the four-step approach to color management currently used with digital photography. Non-visible channels can be displayed in predictable ways using standard color science. “Hyper”-visible channels can be combined to give a better representation of true color (from an HVS perspective). Multispectral format requirements for Remote Sensing are more similar to than different from those of other applications. Select a standard that will be acceptable to all disciplines, taking into account both widely used existing formats and those likely to be used in the future. Use XML as a standard for recording metadata. The color science used in other disciplines is applicable to Remote Sensing. A standard application of this science is needed for both Remote Sensing and other applications. A standard that stretches across all disciplines should be achievable.
