Using GOES Imagery as Pointing Truth for TEMPO Image Navigation and Registration

Kerrie Allen 1, James L. Carr 1, Brad Pierce 2, Joseph Fox-Rabinovitz 1, Norman Lo 1, David Zakar 1
1. Carr Astronautics, Greenbelt, MD
2. NOAA/NESDIS Center for Satellite Applications and Research (STAR), Madison, WI
Poster #2.54

The Instrument

The Tropospheric Emissions: Monitoring of Pollution (TEMPO) instrument will observe tropospheric O3, NO2, and other trace gases as part of NASA's Earth Venture Instrument (EV-I) program. TEMPO will be hosted onboard a geostationary communications satellite, allowing continuous observation of CONUS and parts of Canada and Mexico. As an EV-I hosted payload, TEMPO will not have the same Image Navigation and Registration (INR) capabilities available to an observatory-class mission such as GOES-R.

The Approach

Satellite observations are linked to specific locations on the Earth through the process of Image Navigation and Registration (INR). Navigation determines the latitude and longitude of each pixel; registration maintains the pointing of each pixel to the same Earth location and/or the registration between spectral channels. The GOES-NOP satellites implement INR using star senses and landmarks observed by the imaging instruments to characterize their pointing; GOES-R will use instrument star sensing and rely on GPS. Another method is to use tie points: identify a point in the satellite imagery and find the same point in another, well-navigated image that serves as "pointing truth." The offset between the two images, which can be determined through image-processing techniques such as normalized cross correlation (NCC), can then be translated into satellite roll, pitch, and yaw misalignments and other geometric model parameters.

A difficulty of using tie points is that, if two satellites at different longitudes are looking at an object at some unknown altitude above the Earth, there will be a parallax error. Often, particularly when the pointing truth is generated from aerial photography, tie points are selected only in clear-sky areas so that each tie point lies on the ground. For TEMPO, however, the image will often be dominated by clouds.

The Prototype

Feasibility of this concept has been shown through a Matlab validation prototype. In September 2012, GOES-14 was taken out of storage due to an anomaly with GOES-13. After GOES-13 returned to normal operations, GOES-14 continued to operate for increased coverage during Hurricane Sandy. The data from this period, with three satellites taking simultaneous imagery, provide an excellent proxy dataset: GOES-14 at 89.5° W, binned to a lower resolution, acts as TEMPO, while GOES-13 and GOES-15 provide the pointing truth.

Next Steps

TEMPO's INR algorithm is currently being implemented in C++. It is designed to work with imagery from the GOES-NOP series, the GOES-R series, or a combination of the two. A great deal of existing GOES-NOP imagery is available for development and testing, but the technique is intended to take advantage of the increased refresh rate and navigation accuracy of GOES-R imagery. The tie-point offsets are used as inputs to TEMPO's Kalman Filter, which continually provides and updates estimates of the instrument state. The state includes various sources of system error, from roll, pitch, and yaw to slit rotational misalignment in the grating spectrometer. These states can then be used to calculate the geographic position of each pixel. Once an entire scan's worth of tie points is available, the filter calculates a smoothed state, refining the accuracy of the calculations. The state estimates will also be used to determine scan parameters that keep the imagery centered over the CONUS. Accurate geolocations are key to TEMPO's performance, as they allow researchers to link observations back to potential sources of pollution.
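As a rough illustration of how a tie-point measurement can feed such a filter, the sketch below implements a single-axis attitude Kalman filter in C++: the state is propagated with a gyro rate between measurements and corrected whenever a tie-point-derived attitude angle arrives. This is a deliberately minimal stand-in, not TEMPO's filter; the real state vector carries many more parameters (all three attitude axes, slit rotation, and other geometric terms), and the AxisFilter name and the noise values q and r are illustrative assumptions.

```cpp
// Minimal single-axis attitude Kalman filter: gyro propagation between
// tie-point updates.  Illustrative only; TEMPO's filter carries many more
// states, and the noise values below are made up for the example.
#include <cstdio>

struct AxisFilter {
    double x = 0.0;    // attitude estimate about one axis [rad]
    double P = 1e-6;   // estimate variance [rad^2]
    double q = 1e-12;  // process noise added per propagation step [rad^2] (assumed)
    double r = 1e-10;  // tie-point measurement variance [rad^2] (assumed)

    // Propagate the attitude with a gyro rate over a time step dt [s].
    void propagate(double gyro_rate, double dt) {
        x += gyro_rate * dt;   // integrate the sensed rate
        P += q;                // uncertainty grows between measurements
    }

    // Incorporate an attitude angle inferred from a tie-point offset.
    void update(double z) {
        const double K = P / (P + r);   // Kalman gain
        x += K * (z - x);               // blend prediction and measurement
        P *= (1.0 - K);                 // variance shrinks after the update
    }
};

int main() {
    AxisFilter roll;
    // Propagate at 1 Hz for 10 s with a small invented gyro rate, then apply
    // one tie-point measurement of 50 microradians.
    for (int i = 0; i < 10; ++i) roll.propagate(1.0e-7, 1.0);
    roll.update(50.0e-6);
    std::printf("roll estimate = %.3e rad, variance = %.3e rad^2\n", roll.x, roll.P);
    return 0;
}
```

A full implementation would estimate all axes and misalignment terms jointly in one multivariate filter and would add the retrospective smoothing pass described above; the scalar version is only meant to show the propagate/update cycle.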
The Solution

When available, TEMPO's ground system will use near-simultaneous imagery from GOES-East and GOES-West to enable binocular vision and compensate for the parallax of objects at unknown altitudes. Each TEMPO data cube is associated with the most recent CONUS image from each GOES satellite. The data cube is collapsed with a spectral response function and resampled spatially to create an image that resembles one taken by the GOES Imager. Chips are extracted from this image and compared to neighborhoods in the GOES imagery, and NCC is used to determine the offset that aligns the two images most accurately. If near-simultaneous imagery is available from only one GOES satellite, the match can still be used, but the parallax must be compensated using a cloud-top height, or the measurement must be appropriately de-weighted in the direction of the parallax when the feature being matched is a cloud. Each tie-point measurement is used to update a Kalman Filter state, which includes roll, pitch, and yaw attitude states. Between updates, the attitude states are propagated using gyroscope telemetry. Conceptual sketches of the chip matching and of the binocular altitude solution follow the results below.

Imagery from the three satellites used in the prototype is nearly simultaneous and comes from identical imagers, so the tie-point offsets should be determined by the relative pointing misalignments of the satellites. The offset values should therefore be closely clustered across all tie points and relatively small after compensating for parallax. This is in fact what we see: image matching without parallax mitigation finds offsets in the range of -8 to +25 GOES pixels, while binocular tie points reduce that range to -2 to +2 GOES pixels, which translates to a pixel or less at TEMPO's coarser resolution. Calculated object altitudes were generally in the range of 0 to 7 km, which are realistic cloud heights. The parallax-compensated tie points follow roughly Gaussian distributions in each axis.
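The chip matching described above can be sketched as a brute-force normalized cross correlation: a chip cut from the collapsed TEMPO-like image is slid across a search neighborhood of the GOES image, and the offset with the highest correlation becomes the tie-point offset. The C++ sketch below assumes simple row-major double-precision images and made-up names (Image, ncc, best_offset); an operational matcher would add sub-pixel interpolation of the correlation peak, quality screening on the NCC value, and the parallax handling discussed above.

```cpp
// Brute-force normalized cross correlation (NCC) of a chip against a search
// neighborhood.  Conceptual sketch only: images are row-major arrays of
// doubles and no sub-pixel refinement of the correlation peak is attempted.
#include <cmath>
#include <cstdio>
#include <vector>

struct Image {
    int rows, cols;
    std::vector<double> px;                       // row-major pixel values
    double at(int r, int c) const { return px[r * cols + c]; }
};

// NCC between the chip and the equally sized window of 'img' whose upper-left
// corner is (r0, c0).  Result lies in [-1, 1]; 1 is a perfect match up to an
// offset and scale in brightness.
double ncc(const Image& chip, const Image& img, int r0, int c0) {
    const int n = chip.rows * chip.cols;
    double ma = 0.0, mb = 0.0;
    for (int r = 0; r < chip.rows; ++r)
        for (int c = 0; c < chip.cols; ++c) {
            ma += chip.at(r, c);
            mb += img.at(r0 + r, c0 + c);
        }
    ma /= n;
    mb /= n;
    double sab = 0.0, saa = 0.0, sbb = 0.0;
    for (int r = 0; r < chip.rows; ++r)
        for (int c = 0; c < chip.cols; ++c) {
            const double a = chip.at(r, c) - ma;
            const double b = img.at(r0 + r, c0 + c) - mb;
            sab += a * b;
            saa += a * a;
            sbb += b * b;
        }
    const double denom = std::sqrt(saa * sbb);
    return denom > 0.0 ? sab / denom : 0.0;       // guard against flat patches
}

// Slide the chip over every legal placement in the search image and return
// the placement with the highest correlation.
void best_offset(const Image& chip, const Image& search,
                 int& best_r, int& best_c, double& best_score) {
    best_r = 0; best_c = 0; best_score = -2.0;    // any NCC value beats -2
    for (int r = 0; r + chip.rows <= search.rows; ++r)
        for (int c = 0; c + chip.cols <= search.cols; ++c) {
            const double s = ncc(chip, search, r, c);
            if (s > best_score) { best_score = s; best_r = r; best_c = c; }
        }
}

int main() {
    // Synthetic 8x8 "GOES" neighborhood and a 3x3 chip cut from it at (2, 3);
    // the matcher should recover that location with NCC = 1.
    Image search{8, 8, std::vector<double>(64)};
    for (int i = 0; i < 64; ++i) search.px[i] = std::sin(double(i * i) + 1.0);
    Image chip{3, 3, std::vector<double>(9)};
    for (int r = 0; r < 3; ++r)
        for (int c = 0; c < 3; ++c) chip.px[r * 3 + c] = search.at(2 + r, 3 + c);
    int br, bc;
    double score;
    best_offset(chip, search, br, bc, score);
    std::printf("best offset = (%d, %d), NCC = %.3f\n", br, bc, score);
    return 0;
}
```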
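To show how binocular tie points solve for altitude, the sketch below triangulates the point of closest approach between the two lines of sight from a pair of geostationary satellites and reports its height above the Earth's surface. It assumes a spherical Earth, ideal geostationary orbits at nominal GOES-East/West longitudes, and rays aimed directly at a simulated cloud purely to keep the geometry short; the operational system works from the apparent ground locations navigated in each image, an ellipsoidal Earth, and the satellites' actual ephemerides.

```cpp
// Triangulate a feature's altitude from two geostationary lines of sight by
// finding the midpoint of the common perpendicular between the two rays.
// Spherical-Earth, ideal-orbit sketch only; the operational system uses the
// Earth ellipsoid and the satellites' real ephemerides.
#include <cmath>
#include <cstdio>

struct Vec3 { double x, y, z; };

static Vec3 add(Vec3 a, Vec3 b)     { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3 sub(Vec3 a, Vec3 b)     { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3 scale(Vec3 a, double s) { return {a.x * s, a.y * s, a.z * s}; }
static double dot(Vec3 a, Vec3 b)   { return a.x * b.x + a.y * b.y + a.z * b.z; }
static double norm(Vec3 a)          { return std::sqrt(dot(a, a)); }
static Vec3 unit(Vec3 a)            { return scale(a, 1.0 / norm(a)); }

const double PI   = 3.14159265358979323846;
const double RE   = 6378.137;   // spherical Earth radius [km] (simplification)
const double RGEO = 42164.0;    // ideal geostationary orbit radius [km]

// ECEF position of an ideal geostationary satellite at the given longitude.
static Vec3 geo_sat(double lon_deg) {
    const double lon = lon_deg * PI / 180.0;
    return {RGEO * std::cos(lon), RGEO * std::sin(lon), 0.0};
}

// ECEF position of a point at spherical latitude/longitude and altitude.
static Vec3 lla_to_ecef(double lat_deg, double lon_deg, double alt_km) {
    const double lat = lat_deg * PI / 180.0, lon = lon_deg * PI / 180.0;
    const double r = RE + alt_km;
    return {r * std::cos(lat) * std::cos(lon),
            r * std::cos(lat) * std::sin(lon),
            r * std::sin(lat)};
}

// Closest-approach midpoint of rays (s1 + t*d1) and (s2 + u*d2); its height
// above the sphere is the triangulated feature altitude.
static double triangulated_altitude(Vec3 s1, Vec3 d1, Vec3 s2, Vec3 d2) {
    const Vec3 w0 = sub(s1, s2);
    const double a = dot(d1, d1), b = dot(d1, d2), c = dot(d2, d2);
    const double d = dot(d1, w0), e = dot(d2, w0);
    const double den = a * c - b * b;          // > 0 unless the rays are parallel
    const double t = (b * e - c * d) / den;    // parameter along ray 1
    const double u = (a * e - b * d) / den;    // parameter along ray 2
    const Vec3 mid = scale(add(add(s1, scale(d1, t)), add(s2, scale(d2, u))), 0.5);
    return norm(mid) - RE;
}

int main() {
    // Simulated cloud 5 km above 35 N, 95 W, viewed by GOES-East-like (75 W)
    // and GOES-West-like (135 W) satellites.  In practice each ray is built
    // through the apparent ground location navigated in that satellite's
    // image; here we aim straight at the simulated cloud as a self-check.
    const Vec3 cloud = lla_to_ecef(35.0, -95.0, 5.0);
    const Vec3 east = geo_sat(-75.0), west = geo_sat(-135.0);
    const double h = triangulated_altitude(east, unit(sub(cloud, east)),
                                           west, unit(sub(cloud, west)));
    std::printf("recovered cloud altitude = %.2f km\n", h);   // ~5.00 km
    return 0;
}
```

Once an altitude like this is in hand, the apparent offset of a cloud feature can be referred back to a consistent ground location, which is the mitigation that narrows the tie-point spread to the -2 to +2 GOES-pixel range reported above.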
Acknowledgements

TEMPO has been selected as an Earth Venture Instrument under NASA's Earth System Science Pathfinder program. Its development team is led by the Smithsonian Astrophysical Observatory and includes Ball Aerospace, NASA Langley, and a number of other partners. Carr Astronautics is the INR lead. TEMPO is scheduled for delivery in 2017. The work described in this poster is the subject of Provisional Patent 61/933,574, "Image Navigation and Registration (INR) Transfer from Exquisite Systems to Hosted Space Payloads," filed 30 January 2014.

Figure captions:
Parallax can cause large errors when comparing images gathered by satellites at different longitudes (diagram labels: TEMPO, GOES).
Binocular tie points use two sources of imagery to solve for altitude and mitigate parallax error (diagram labels: GOES East, TEMPO, GOES West).
Chips from two images are compared using normalized cross correlation.
Tie point offsets, colored by NCC value; the scatter is much higher in the east-west direction, where parallax is more important than in the north-south direction.
Offsets for binocular tie points show much lower variance.
Tie points shown at their locations in images from GOES-15 (left), GOES-14 (center), and GOES-13 (right).
Histogram of tie point offsets shows a roughly Gaussian distribution.

