Presentation on theme: "Multispectral Format from Perspective of Remote Sensing"— Presentation transcript:
1. Multispectral Format from Perspective of Remote Sensing
Rulon E. Simmons (585)

These charts show the similarities and differences between the requirements for a multispectral format for Remote Sensing and for other applications. They suggest that considering requirements for other applications to be a subset of requirements for Remote Sensing is a reasonable approach. Doing so seems to be in line with the most recent traffic that suggested that non-visible wavelengths not be excluded from the format, and as Danny Rich stated: “Adopting a file format that is a clear superset of what is required for basic color imaging does not cost us anything.”
2. Outline
- Color Issues
  - Steps to color processing
  - Color accuracy in Remote Sensing
  - Handling color display of non-visible bands
- Format Issues
  - Similarities and differences with other multispectral applications
  - XML for metadata
  - Compression

From the point of view of the CIE, color representation is clearly the focus. This is true of the disciplines that look to the CIE for guidance, including Remote Sensing. While Remote Sensing folks may also choose to evaluate their imagery with computer algorithms, visual inspection is still very important and common. Handling the data in a way that optimizes the visual display is very desirable. The Remote Sensing community currently has some accepted standard formats that are used for handling imagery. While these have been extended to handle multispectral imagery, there is currently a lack of standardization relative to color processing and display.
3. Color Issues
Contributing persons: Rulon Simmons, Tim Elder, Scott Bennett, Michael Vaughn

The following charts show a four-step process for handling color issues. This process actually comes from the digital camera world. There are current efforts to introduce this process into the Remote Sensing community.
4. Color Processing Approach*
[Flow diagram: raw RGB → Pre-Processing (DRA, channel balance) → Color Correction (RGB′ → XYZ) → Rendering (XYZ → LAB; TTC, MTFC; LAB → XYZ′) → Output Preparation (XYZ′ → RGB″, gamma) → Display]

The four steps to color processing are: (1) Pre-Processing; (2) Color Correction; (3) Rendering; and (4) Output Preparation. From a format point of view, recording the transformations used in the first three steps will help others obtain similar results when viewing the same imagery. The transformations of the last step are display-specific and are therefore not ones that would normally be stored with the imagery.

*Note: This is a standard approach used with digital photography.
5. Pre-Processing
[Histogram diagram marking a black point and a calibration point]

In this example, a 12-bit image is acquired. If the red, green, and blue channels were displayed as acquired, the color balance would generally not be good. So the first thing that is done is to select a black reference point (sometimes chosen at a 1% penetration of the low end of the image histogram). This black pixel is readjusted in each channel to a count level of zero. Then one or more calibration points are chosen in the mid to upper range of the histogram. Each point must have a measured or estimated reflectance in each color channel. This can be done by imaging a gray scale or color patches, or by using some other object whose reflectance is more or less known. A regression using many points is preferable when there is much uncertainty in the reflectance knowledge of the calibration points.
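The black-point step above can be sketched in a few lines. This is a minimal illustration, not production code: the 1% penetration choice comes from the slide, while the sample counts are made up.

```python
import numpy as np

def set_black_point(band, penetration=1.0):
    """Shift one channel so its black reference point maps to count zero.

    The black point is taken at a given percentile penetration of the
    low end of the histogram (1% here, per the slide); counts below it
    are clipped to zero.
    """
    black = np.percentile(band, penetration)
    return np.clip(band - black, 0, None)

# Toy 12-bit counts for a single channel
band = np.array([5.0, 5.0, 100.0, 200.0, 4000.0])
balanced = set_black_point(band)   # the black point (~5 counts) now maps to 0
```

In practice the same operation is applied to each of the red, green, and blue channels independently, which is what rebalances the color.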
6. Pre-Processing (Calibration from Digital Counts to Reflectance)
- Empirical Line Method*
  - Using one or more calibration patches
  - Using a best guess of the reflectance of one or more objects
- Atmospheric modeling (e.g., MODTRAN)

*Note: The ELM method of calibration is applicable to any discipline, not just Remote Sensing.

The method described on the previous page is used in calibrating digital cameras. It is also used in Remote Sensing, where it is called the Empirical Line Method (ELM) and is used to simultaneously calibrate out sensor and atmospheric effects. The Empirical Line Method works well in situations where there are objects of known reflectance in an image. In other cases, atmospheric modeling is sometimes done. Since the ELM method of calibration is virtually the same as what is done for calibration by other disciplines, it would not be a stretch to include it in the first release of a format standard. Accommodating other atmospheric correction isn't really too difficult and is something with which I can help.
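A minimal sketch of the Empirical Line Method for one band follows; the counts and reflectances are invented for illustration (a black point plus one gray patch of assumed known reflectance).

```python
import numpy as np

def empirical_line(counts, reflectances):
    """Fit reflectance = gain * count + offset for one spectral band.

    counts       -- digital counts of the calibration targets
    reflectances -- measured or estimated reflectances of those targets
    With only two targets this is the exact line through both points;
    with many uncertain targets it becomes the regression the slide
    recommends.
    """
    gain, offset = np.polyfit(np.asarray(counts, dtype=float),
                              np.asarray(reflectances, dtype=float), 1)
    return gain, offset

# Black point at count 40; a 50%-reflectance gray patch at count 2100
gain, offset = empirical_line([40, 2100], [0.0, 0.5])
reflectance = gain * 1070 + offset   # calibrate any pixel's count
```

The same fit is repeated per band, which is how ELM folds sensor and atmospheric effects into one per-band line.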
7. Color Correction (Causes of Color Inaccuracies in Remote Sensing)
- Multispectral bandpasses in the visible region not equivalent to HVS response functions (IKONOS, QuickBird)
- Skylight illumination (common phenomenon: materials in shadow appear bluish)
- Dynamic Range Adjustment (DRA): varying approaches produce different color presentations

Commercial Remote Sensing got a big boost with the launch of the IKONOS and QuickBird high-resolution satellites. It was soon noticed that some of the colors in the imagery were different than expected. Subsequent evaluation showed that several factors were responsible for this, the primary one being that the spectral bandpasses of the satellites were quite different from those of the human visual system.
8. Color Correction (Problem 1: Different Spectral Sensitivities)
[Chart of spectral sensitivities: E = eye sensitivity; I = IKONOS sensitivity]

This chart shows the spectral sensitivities of the eye and of the four spectral bands of IKONOS. Note that the sensor bandpasses are not sensitive to the entire visible range of colors. There is very little sensitivity in an area between the red and green sensors; the blue bandpass is translated about 25 nm away from the blue HVS response; and the red sensor is sensitive to wavelengths beyond the HVS.
9. Color Correction (IKONOS vs. HVS Color Example)
[Images: IKONOS image; simulated HVS image]

Here, a simulation tool developed by Kodak attempts to correct IKONOS colors to those seen by the human visual system. You can see that they are quite different.
10. Color Correction (GretagMacbeth Color Checker: Truth & IKONOS Simulation)

Notice how the colors of the Macbeth Color Checker change when viewed by IKONOS vs. the HVS.
11. Color Correction (Eckerd's Blue Roof Turns Purple)
[Images: HVS; IKONOS; QuickBird]

The distinctive blue roofs of Eckerd drug stores show up as purple on IKONOS imagery.
12. Color Correction (Image Example and Simulation of IKONOS Color)
[Images: ground truth (digital camera); simulated IKONOS color]

Colors as seen by IKONOS don't match the colors as seen by the digital camera.
13. Color Correction (Problem 2: Shadow Illumination)
- White reference chip in direct sunlight: used to calibrate the ASD spectrometer
- White reference chip in shadow: illuminated by light scattered by the atmosphere

This white reference, shown in sunlight and shade, shows why shaded sides of roofs take on a bluish cast due to the color of scattered skylight.
14. Rendering
[Graph: Tone Transfer Curve, luminance (in) vs. luminance (out)]
- Convert from XYZ to LAB (to get to a color appearance space)
- Apply a Tone Transfer Curve (TTC) to the "L" channel to stretch midtones while compressing highlights and shadows
- Apply Modulation Transfer Function Compensation (MTFC) to the "L" channel to sharpen the image
- Convert back to XYZ

Often, imagery is displayed with just a linear DRA stretch. However, film companies discovered years ago that a characteristic curve that maps log exposure to density with an S-shaped curve is preferable. This stretches the important midtones without clipping shadows and highlights. A similar Tone Transfer Curve, often constructed in a piece-wise fashion as shown, is also useful for digital photography.
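The TTC step can be sketched as a piece-wise linear curve applied to the L* channel. The knot values below are illustrative only (an S-shape that compresses shadows and highlights while stretching midtones), not values from the presentation.

```python
import numpy as np

# Illustrative S-shaped knots; L* runs from 0 to 100.
TTC_IN  = np.array([0.0, 20.0, 50.0, 80.0, 100.0])
TTC_OUT = np.array([0.0, 12.0, 55.0, 90.0, 100.0])

def apply_ttc(L, knots_in=TTC_IN, knots_out=TTC_OUT):
    """Map L* values through the piece-wise tone transfer curve."""
    return np.interp(L, knots_in, knots_out)

L = np.array([10.0, 50.0, 95.0])
L_rendered = apply_ttc(L)
```

Storing the (X, Y) knot pairs in the metadata, as the XML rendering example later in this deck suggests, is enough for another reader of the file to reproduce the same curve.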
15. Output Preparation
The phosphor gamut will allow reproduction of most colors seen by IKONOS, but many colors seen by the eye cannot be reproduced by IKONOS. The eye can see a broader range of colors than IKONOS can image or than phosphors can display on a monitor. SO HOW CAN YOU DO COLOR CORRECTION?
16. Color Correction Approach
There is no perfect solution; however, all of the problems are addressed by creating a transform that minimizes the Delta E between a set of color patches as seen by the eye and by the sensor:
- Use real patches where possible
- Use synthesized data based on system specs

A transformation that approximates HVS color from IKONOS color can be made by minimizing the Delta E differences of a set of color patches. The transformation is particularly challenged by vegetation, which has a strong reflectance in the near-IR, where IKONOS has sensitivity beyond the range of the human eye.
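The shape of such a computation can be sketched as follows, with one caveat: minimizing Delta E proper means comparing patches in LAB and requires a nonlinear optimizer, so this sketch substitutes a plain least-squares fit in XYZ. The sensor matrix and aim colors are synthetic, chosen only to show the structure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Aim colors for 24 hypothetical patches (XYZ, synthetic stand-ins)
xyz_aim = rng.uniform(0.05, 0.95, size=(24, 3))

# Hypothetical linear sensor model: what the instrument records per patch
M_sensor = np.array([[0.9, 0.2, 0.0],
                     [0.1, 0.8, 0.1],
                     [0.0, 0.1, 1.0]])
rgb_sensor = xyz_aim @ M_sensor.T

# Least-squares 3x3 correction matrix: rgb_sensor @ M.T ~ xyz_aim
M, _, _, _ = np.linalg.lstsq(rgb_sensor, xyz_aim, rcond=None)
M = M.T
xyz_corrected = rgb_sensor @ M.T
residual = np.abs(xyz_corrected - xyz_aim).max()
```

With real sensor data the fit is not exact (that is the "no perfect solution" point above); the residual colors, vegetation especially, are where the compromise lands.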
17. Color Correction (Beyond Visible)
- Near-IR and other non-visual spectral regions can be handled in the same way as shown on the previous slide. (Note: UV is rarely used in Remote Sensing because it is heavily absorbed by the atmosphere.)
- The transformation seeks to minimize the Delta E between an acquired or synthetically generated image of a set of reference panels and some aim colors
- For IKONOS, a "false-color IR" image is produced by mapping the IR channel to the red display, red to the green display, and green to the blue display

Just because a non-visible region of the spectrum is imaged does not mean that it can't be mapped in a standard way back to visible colors.
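The false-color IR mapping described above is just a channel permutation. A sketch follows, assuming a band cube ordered blue, green, red, NIR; that ordering is an assumption, so the indices would be adjusted for the actual sensor.

```python
import numpy as np

def false_color_ir(cube):
    """Map NIR -> red display, red -> green display, green -> blue display.

    cube: (rows, cols, 4) array with bands assumed ordered B, G, R, NIR.
    """
    nir, red, green = cube[..., 3], cube[..., 2], cube[..., 1]
    return np.stack([nir, red, green], axis=-1)

# Toy scene that reflects only in the NIR band (e.g., healthy vegetation)
cube = np.zeros((2, 2, 4))
cube[..., 3] = 1.0
display = false_color_ir(cube)   # renders as pure red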
18. Color Correction (for "Hyper"-Visible Imagery)
- Currently in Remote Sensing there is no standard method of mapping color for systems that have more than three spectral channels covering the visible range of the spectrum
- Current visualization packages just display three bands (either user selected or chosen by the program)
- A useful option would be to smartly combine the bands to make an image that is as close as possible to what the human visual system would see

Systems that have more than three spectral bands in the visible region are good candidates for combining the bands through a weighted combination to produce a close match to the human visual system. The more such bands, the better the match can be made to the eye's sensitivity. Such a choice for visual representation has the potential of greatly aiding imagery interpreters. It would seem to be relevant to all disciplines.
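The weighted combination can be sketched as a least-squares fit of band responses to a target eye-response curve. Everything below is illustrative: Gaussian stand-ins for eight narrow visible bands and for a single color-matching function, not real sensor or CIE data.

```python
import numpy as np

wl = np.linspace(400.0, 700.0, 301)              # wavelength grid, nm

# Eight hypothetical narrow visible bands (Gaussian responses)
centers = np.linspace(420.0, 680.0, 8)
bands = np.stack([np.exp(-0.5 * ((wl - c) / 20.0) ** 2) for c in centers])

# Stand-in for one HVS response curve (roughly the shape of y-bar)
target = np.exp(-0.5 * ((wl - 555.0) / 40.0) ** 2)

# Weights minimizing || bands.T @ w - target ||
w, _, _, _ = np.linalg.lstsq(bands.T, target, rcond=None)
approx = bands.T @ w
rms = np.sqrt(np.mean((approx - target) ** 2))   # small residual = good match
```

Repeating the fit for each of the three HVS response curves yields three weight vectors, and hence a bands-to-display matrix; as the slide notes, the more bands, the smaller the residual.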
19. Format Issues
Contributing persons: Rulon Simmons, Bernard Brower

One of the challenges of CIE TC 8-07 is to come up with a format standard that is acceptable to a diverse community. Many organizations already have some favorite approaches, whether "home-brewed" or widely accepted standards. Even currently accepted standards are being revised or replaced with exciting new technology.
20. Similarities and Differences between Remote Sensing and Other Applications of Multispectral Imaging

All of the requirements for other applications apply to Remote Sensing as well. In addition, Remote Sensing requires geolocation information. Atmospheric considerations, most severe for Remote Sensing applications, can be handled in ways similar to other imagery or through more sophisticated modeling techniques (shown on a previous slide). The other main requirement imposed by Remote Sensing (but not entirely unique to it) is that of compression, in order to efficiently handle large data sets. Note that BIIF, used by NATO, is nearly identical to NITF (used by the U.S. military). The U.S. has plans to adopt the more universal BIIF standard.
21. Reconciling Differences
- Consider requirements for all disciplines to be a subset of requirements for Remote Sensing
  - Note 1: This approach does not require that all Remote Sensing issues be addressed in the first release of the standard, as long as it is extensible at a later date
  - Note 2: A basic set of Remote Sensing requirements may not be too difficult to accommodate
- Build upon a standard that will be acceptable to all parties in the future
- Standardize an approach to recording metadata that can be used within any file format

If format requirements for other disciplines are considered to be a subset of Remote Sensing requirements, everybody wins and can use the format. Taking such a position does not necessarily mean that the more stressing requirements of Remote Sensing would have to be addressed immediately. However, I recommend implementing the relatively simple aspects in the first release.
22. Standard Approach to Metadata (Why Use XML?)
- Standard method of coding metadata
- Can be used regardless of format
- Is human readable as well as machine readable

All file formats under consideration have fields for metadata. This metadata can be used to store information useful to color reproduction as well as information relevant to image size, location, etc. Extensible Markup Language (XML) is a standard, easy-to-use method of recording metadata in a form that is both machine and human readable. It is compatible with HTML. Adopting a standard such as XML for metadata storage will simplify the development of software to read the metadata, and the approach can be used within more than one format.
23. XML for Pre-Processing

  <PRE-PROCESSING>
    <DRA>
      <PARAM NAME="GAIN" VALUE="1.5"/>
      <PARAM NAME="OFFSET" VALUE="10"/>
    </DRA>
  </PRE-PROCESSING>

Here is an example of XML code to handle the DRA pre-processing step. Each element starts with a named opening tag and ends with a similar delimiter with an added slash (/) mark. Substeps can be nested within each element using a similar format.
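A reader for such a metadata block is only a few lines in any XML library; here is a sketch using Python's standard library, with tag and attribute names following the slide's example.

```python
import xml.etree.ElementTree as ET

xml_text = """
<PRE-PROCESSING>
  <DRA>
    <PARAM NAME="GAIN" VALUE="1.5"/>
    <PARAM NAME="OFFSET" VALUE="10"/>
  </DRA>
</PRE-PROCESSING>
"""

root = ET.fromstring(xml_text)
# Collect the DRA parameters into a name -> value dictionary
dra = {p.get("NAME"): float(p.get("VALUE")) for p in root.find("DRA")}
```

This is the practical payoff of a standard metadata encoding: any consumer, in any language, can recover the processing parameters without format-specific parsing code.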
24. XML for Size

  <SIZE>
    <PARAM NAME="NBANDS" VALUE="4"/>
    <PARAM NAME="NROWS" VALUE="1000"/>
    <PARAM NAME="NCOLUMNS" VALUE="1000"/>
    <PARAM NAME="NBITS" VALUE="12"/>
    <PARAM NAME="SEQUENCE" VALUE="BIP"/>
  </SIZE>

This XML code would handle all of the image size information.
25. XML for Location

  <LOCATION>
    <PARAM NAME="COORD PROJECTION" VALUE="UTM"/>
    <PARAM NAME="LAT" VALUE="XX.XX"/>
    <PARAM NAME="LONG" VALUE="YY.YY"/>
  </LOCATION>

This simple XML code would give the geolocation information of, perhaps, the upper-left pixel of an image.
26. XML for Color Correction

  <COLOR-MATRIX>
    <PARAM NAME="RGB TO XYZ" VALUE="X1, X2, X3, X4, X5, X6, X7, X8, X9"/>
  </COLOR-MATRIX>

This XML code would handle color transformation matrices.
27. XML for Rendering

  <RENDERING>
    <PARAM NAME="XYZ TO LAB" VALUE="X1, X2, X3, X4, X5, X6, X7, X8, X9"/>
    <PARAM NAME="TTC" VALUE="X1, Y1, X2, Y2, X3, Y3, …"/>
    <PARAM NAME="MTFC" VALUE="X1, X2, X3, X4, X5, X6, X7, X8, X9"/>
    <PARAM NAME="LAB TO XYZ" VALUE="X1, X2, X3, X4, X5, X6, X7, X8, X9"/>
  </RENDERING>

In the rendering step, data is converted to LAB, processed to apply a TTC and MTFC, and then converted back to XYZ.
28. XML for Display (In general, would not be included in metadata.)

  <DISPLAY>
    <COLOR-MATRIX>
      <PARAM NAME="XYZ TO RGB" VALUE="X1, X2, X3, X4, X5, X6, X7, X8, X9"/>
      <PARAM NAME="LAB TO RGB" VALUE="X1, X2, X3, X4, X5, X6, X7, X8, X9"/>
    </COLOR-MATRIX>
    <PARAM NAME="GAMMA" VALUE="2"/>
  </DISPLAY>

This XML would do the color management necessary to display an image on a particular output device. As already indicated, this would not normally be included in an image file.
29. Compression
- Compression is required for some, but not all, multispectral applications
- JPEG 2000 is an emerging international standard that is gaining wide acceptance (can be used in lossy or lossless mode)
- JPEG 2000 can be used to stream very large data sets in real time to small computing devices such as PDAs (using the feature that the whole image does not have to be decompressed before it can be sent)
- JPEG 2000 can be accommodated within the current NITF/BIIF standard, making it attractive to many currently doing Remote Sensing

For those needing compression, JPEG 2000 is an attractive option, having much more capability than standard JPEG or other alternatives. JPEG 2000 serves not only as an emerging standard for compression but also as a file format. Data can be stored in either lossy or lossless modes. JPEG 2000 is also compatible with the currently used NITF and BIIF formats. In other words, it can be used as a stand-alone format or wrapped within the NITF and BIIF formats. This makes JPEG 2000 a particularly attractive format option because it can be used within existing Remote Sensing formats and has application across many other disciplines.
30. Conclusions / Recommendations
- Use the four-step approach to color management currently used with digital photography
- Non-visible channels can be displayed in predictable ways using standard color science
- "Hyper"-visible channels can be combined to give a better representation of true color (from an HVS perspective)
- Multispectral format requirements for Remote Sensing are more similar to than different from those of other applications
- Select a standard that will be acceptable to all disciplines, taking into account widely used current formats and those likely to be used in the future
- Use XML as a standard for recording metadata

The color science used in other disciplines is applicable to Remote Sensing. A standard application of this science is needed for both Remote Sensing and other applications. A standard that stretches across all disciplines should be achievable.