1
BASIC PRINCIPLES OF REMOTE SENSING
ATMOSPHERE OCEAN LAND METEOROLOGY
3
WHY MAKE MEASUREMENTS FROM SPACE?
REPEAT OBSERVATIONS and SYNOPTIC COVERAGE
[Figure: orbit geometry from an ~800 km altitude, showing successive numbered orbits across latitude and a 2,820 km swath for synoptic coverage; inset shows a 70 km swath with ground resolutions of 100 m, 10 m, and 1.6 m.]
4
Need for a Meteorological Satellite Observation System
Surface-based (in situ) meteorological observations are the most reliable, but:
- They are confined to land
- In situ observations are scanty over oceans, which occupy two-thirds of the earth's surface
- It is difficult to set up observatories in inhospitable regions
- Automatic weather stations cannot observe all parameters
5
Need for a Meteorological Satellite Observation System
- Meteorological parameters are inferred indirectly
- Satellites can provide extensive coverage
- Satellites can provide continuous monitoring
- The problem is that a single satellite cannot perform both tasks simultaneously
- Satellites are expensive
6
History of Remote Sensing
The history of remote sensing began with the invention of photography. The term "photography" is derived from two Greek words meaning "light" (phos) and "writing" (graphein).
7
First photograph in the world by Niepce
Niépce took the first photograph of nature, a window view of the French countryside, using a camera obscura and an emulsion of bitumen of Judea (a resinous substance) and oil of lavender; it took 8 hours in bright sunlight to produce the image. Niépce was a French inventor, most noted as the inventor of photography.
8
History of Remote Sensing
Gaspard-Félix Tournachon, known as "Nadar", took the first aerial photograph from a captive balloon at an altitude of 1,200 feet over Paris. Balloons gave us the first real opportunity to view the earth from above, but even before we could physically do this, our artistic imaginations were capable of envisioning this perspective; "San Francisco, looking to the southwest", published 1878 by C. R. Parsons, is one example. With the development of a camera and image-capturing mechanism it became possible to acquire remotely sensed imagery of urban areas for the first time. "What Nadar had really done was to change the level of art to the level of science and utility, from the artistic drawing to an instrument of work." - Daumier. Nadar captured the first aerial views of European cities, starting with Paris in 1858; his first works were lost. The oldest conserved aerial photograph is "Boston from a captive balloon at 1,200 feet", taken October 13, 1860, by James Wallace Black; it is an albumen print housed at the Boston Public Library.
9
History of Remote Sensing
The Bavarian Pigeon Corps uses pigeons to transmit messages and take aerial photos.
10
History of Remote Sensing
1914 – WW I provided a boost in the use of aerial photography, but after the war, enthusiasm waned
11
History of Remote Sensing
First space photographs from V-2 rockets. U-2 takes first flight.
12
Photograph from V-2 Rocket
Typical example of an oblique photograph, looking across Arizona and the Gulf of California
13
History of Remote Sensing
EXPLORER-7, launched in 1959, carried the Suomi radiometer measuring solar & terrestrial radiation (an Earth Radiation Budget, ERB, study)
14
History of Remote Sensing
TIROS-1, launched 01 Apr 1960, carried just an ordinary TV camera. It was the beginning of satellite meteorology.
15
Remote Sensing: Definitions
"Photogrammetry and Remote Sensing are the art, science and technology of obtaining reliable information about physical objects and the environment, through a process of recording, measuring and interpreting imagery and digital representations of energy patterns derived from non-contact sensor systems" (Colwell, 1997). "Remote sensing may be broadly defined as the collection of information about an object without being in physical contact with the object. Aircraft and satellites are the common platforms from which remote sensing observations are made. The term remote sensing is restricted to methods that employ electromagnetic energy as the means of detecting and measuring target characteristics" (Sabins, 1978). "Remote sensing is the art and science of obtaining information from a distance, i.e. obtaining information about objects or phenomena without being in physical contact with them. The science of remote sensing provides the instruments and theory to understand how objects and phenomena can be detected. The art of remote sensing is in the development and use of analysis techniques to generate useful information" (Aronoff, 1995).
16
Remote Sensing: Definitions
"Remote sensing may be broadly defined as the collection of information about an object without being in physical contact with the object. The term remote sensing is restricted to methods that employ electromagnetic energy as the means of detecting and measuring target characteristics" (Sabins, 1978).
17
Remote Sensing: Definitions
"Remote sensing is the art and science of obtaining information from a distance, i.e. obtaining information about objects or phenomena without being in physical contact with them. (Aronoff, 1995). The science of remote sensing provides the instruments and theory to understand how objects and phenomena can be detected. The art of remote sensing is in the development and use analysis techniques to generate useful information" techniques involve amassing knowledge pertinent to the sensed scene (target) by utilizing electromagnetic radiation, force fields, or acoustic energy sensed by recording cameras, radiometers and scanners, lasers, radio frequency receivers, radar systems, sonar, thermal devices, sound detectors, seismographs, magnetometers, gravimeters, scintillometers, and other instruments. Remote Sensing in the most generally accepted meaning refers to instrument-based techniques employed in the acquisition and measurement of spatially organized data/information on some property(ies) (spectral; spatial; physical) of an array of target points (pixels) within the sensed scene that correspond to features, objects, and materials, doing this by applying one or more recording devices not in physical, intimate contact with the item(s) under surveillance); techniques involve amassing knowledge pertinent to the sensed scene (target) by utilizing electromagnetic radiation, force fields, or acoustic energy sensed by recording cameras, radiometers and scanners, lasers, radio frequency receivers, radar systems, sonar, thermal devices, sound detectors, seismographs, magnetometers, gravimeters, scintillometers, and other instruments.
18
Basic Principles of EM Wave Propagation
Black body radiation at different temperatures (300, 950, and 2500 K).
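The curves in such a figure follow Planck's law. As an illustrative sketch (not part of the original slides), a few lines of Python reproduce where each curve peaks:

```python
import numpy as np

# Physical constants (SI units)
H = 6.626e-34   # Planck constant, J s
C = 2.998e8     # speed of light, m/s
K = 1.381e-23   # Boltzmann constant, J/K

def planck(wavelength_m, temp_k):
    """Spectral radiance B(lambda, T) of a black body, W m^-2 sr^-1 per metre."""
    with np.errstate(over="ignore"):   # exp overflows harmlessly at short wavelengths
        return (2 * H * C**2 / wavelength_m**5) / np.expm1(H * C / (wavelength_m * K * temp_k))

wavelengths = np.logspace(-7, -4, 2000)     # 0.1 um to 100 um
for temp in (300, 950, 2500):               # the temperatures in the figure
    peak_um = wavelengths[np.argmax(planck(wavelengths, temp))] * 1e6
    print(f"T = {temp:4d} K -> peak emission near {peak_um:.2f} um")
```

The peaks land near 9.7 µm, 3.0 µm, and 1.2 µm respectively, which is why the hotter curves shift toward shorter wavelengths.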
19
Electromagnetic Waves
20
Period, Amplitude and Wavelength
21
Electromagnetic Spectrum
[Figure: the electromagnetic spectrum, with the visible band expanded from 0.4 to 0.7 µm.]
22
Infinite Possibilities
The electromagnetic spectrum stretches from X-rays to radio waves. Theoretically speaking, it can be broken into an infinite number of parts.
23
Infinite Possibilities
Depending on which spectral region is scanned, we will get different information
24
Infinite Possibilities
Depending upon where we place a satellite, we will get different information
25
Infinite Possibilities
What you get depends on:
- Sensor: spectral window used; response of the sensor; field of view
- Orbit: satellite height; inclination of the satellite orbit; scan swath; time of scan; repeat cycle of the satellite visit
26
LOSS OF EM ENERGY: Spherical Spreading, Absorption, Scattering
(a) Spherical Spreading. The amplitude dies off with distance travelled: the further you are from the source, the weaker the signal. This obvious physical effect is no different from observing that as one gets farther and farther from a source of sound, a weaker and weaker sound is heard. Technically, as the wave travels, the wave-front is a sphere whose radius gets larger and larger. Since the area of a sphere is 4πR², at some radius (distance travelled) the energy per unit area on the wave-front equals the original energy divided by 4πR². Since the energy is proportional to the square of the amplitude, the amplitude is proportional to 1/R. Strictly speaking, the energy is not diminished by this effect; it is just spread over the surface of larger and larger spheres.
(b) Absorption. It is appropriate to think of absorption as the tendency for materials to simply soak up electromagnetic energy and convert it to heat. We can measure some of this energy as radiated (emitted) heat at a longer wavelength than the original energy.
(c) Scattering. In the interaction of electromagnetic waves with objects that are small relative to a wavelength of the incident radiation, a dipole is induced within the scatterer. The induced dipole is in the same direction as the incident electric vector, its moment proportional to the field and its phase the same as that of the incident field.
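The argument in (a) can be written out as a short worked equation (a restatement of the text above, with nothing new assumed):

```latex
% Energy flux on a spherical wavefront of radius R from a source of power P:
\[
  S(R) = \frac{P}{4\pi R^{2}} \propto \frac{1}{R^{2}},
  \qquad
  S \propto A^{2} \;\Longrightarrow\; A(R) \propto \frac{1}{R}.
\]
% Doubling the distance quarters the flux and halves the amplitude.
```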
27
What Happens When EMR Strikes Matter?
Transmission Reflection Absorption
28
Transmission: a process by which incident radiation passes through matter without measurable attenuation.
[Figure: refraction at a Medium 1 / Medium 2 interface; the ray bends so that θ1 > θ2, and the emerging ray satisfies θ1 = θ3.]
29
Reflection and Scattering
Reflection: a process whereby incident radiation "bounces off" the surface of a substance in a single, predictable direction. It is caused by surfaces that are smooth relative to the wavelength of the incident radiation; there is no change in velocity or wavelength.
[Figure: specular reflection at a Medium 1 / Medium 2 interface; the angle of incidence equals the angle of reflection (θ1 = θ2).]
30
Reflection and Scattering
Scattering (diffuse reflection): occurs when incident radiation is dispersed or spread out unpredictably in many different directions. It occurs when surfaces are rough relative to the wavelength of the incident radiation; there is no change in velocity or wavelength.
[Figure: diffuse scattering at a Medium 1 / Medium 2 interface.]
31
Absorption: a process by which incident radiation is taken in by the medium (e.g., surface, atmospheric particulates, an atmospheric layer); the medium is opaque to the incident radiation. It is the tendency for materials to simply soak up electromagnetic energy and convert it to heat. Some of this energy can be measured as radiated (emitted) heat at a longer wavelength than the original energy.
[Figure: absorption followed by emission.]
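Why the re-emitted heat appears at a longer wavelength can be made concrete with Wien's displacement law; a worked example using round, standard temperatures (the numbers are illustrative, not from the slides):

```latex
% Wien's displacement law: \lambda_{\max} = b/T with b \approx 2898~\mu\mathrm{m\,K}.
\[
  \lambda_{\max}^{\mathrm{Sun}} \approx \frac{2898~\mu\mathrm{m\,K}}{6000~\mathrm{K}}
      \approx 0.48~\mu\mathrm{m}\ (\text{visible}),
  \qquad
  \lambda_{\max}^{\mathrm{Earth}} \approx \frac{2898~\mu\mathrm{m\,K}}{300~\mathrm{K}}
      \approx 9.7~\mu\mathrm{m}\ (\text{thermal IR}).
\]
```

A body absorbing sunlight at visible wavelengths therefore re-radiates in the thermal infrared.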
32
EMR - Atmosphere Interactions
EMR travels through space without modification. Diversion and depletion occur as solar and terrestrial radiation interact with the earth's atmosphere. The interference is wavelength selective: at certain wavelengths EMR passes freely through the atmosphere, whereas it is restricted at other wavelengths.
33
Windows and Absorption Bands
Atmospheric Windows (transmission bands): areas of the EM spectrum where specific wavelengths pass relatively unimpeded through the atmosphere. Absorption Bands: areas where specific wavelengths are totally or partially blocked.
34
Windows and Absorption Bands
35
Important Atmospheric Windows
0.3 – 1.3 µm: UV, visible, near infrared
1.5 – 1.8 µm: SWIR
2.0 – 2.4 µm: SWIR
3.0 – 5.0 µm: Mid infrared
8.0 – 14.0 µm: Thermal infrared
> 0.6 cm: Microwave
36
Atmospheric Gases - Selective Absorbers with reference to wavelength
Gamma rays and X-rays: completely absorbed in the upper atmosphere by oxygen and nitrogen.
Ultraviolet (< 0.2 µm): absorbed by molecules of oxygen (O and O2 combine to form ozone); ozone absorbs UV with wavelengths of about 0.2 – 0.3 µm in the stratosphere.
Near infrared: water vapor and carbon dioxide absorb in narrow bands.
Thermal infrared: strong absorption by water vapor between 5 – 8 µm and 20 – 1,000 µm (1 mm); carbon dioxide absorbs at 14 – 20 µm and ozone at 9 – 10 µm. Absorbed radiation heats the lower atmosphere.
Microwave region: three relatively narrow absorption bands (oxygen and water vapor) occur between about 0.1 and 0.6 cm; beyond 0.6 cm, atmospheric gases generally do not impede the passage of microwave radiation.
37
Atmospheric Windows
38
Infrared Windows in the Atmosphere
Wavelength range, band, sky transparency, sky brightness:
1.1 – 1.4 microns, J band: transparency high; brightness low at night
1.5 – 1.8 microns, H band: transparency high; brightness very low
2.0 – 2.4 microns, K band: transparency high; brightness very low
3.0 – 4.0 microns, L band: 3.0 – 3.5 microns fair, 3.5 – 4.0 microns high; brightness low
4.6 – 5.0 microns, M band: transparency low; brightness high
7.5 – 14.5 microns, N band: 8 – 9 and 10 – 12 microns fair, others low; brightness very high
17 – 25 microns, Q band: transparency very low; brightness very high
28 – 40 microns, Z band: transparency very low; brightness very high
39
Atmospheric Windows and Absorption Bands in MW
Atmospheric windows: 0.3 – 1.3 µm, 1.5 – 1.8 µm, 2.0 – 2.4 µm, 3.0 – 5.0 µm, 8.0 – 14.0 µm, > 0.6 cm
Absorption bands in the microwave region:
Water vapour: 22.2 GHz and 183 GHz
Oxygen: 60 GHz (band of lines) and 118.75 GHz (single line)
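Since the microwave absorption lines are quoted in frequency while the windows are given in wavelength, a quick conversion with λ = c/f (a minimal sketch; the line list is the one above) shows where each line falls relative to the > 0.6 cm window:

```python
C = 2.998e8  # speed of light, m/s

lines_ghz = {
    "H2O  22.2   GHz": 22.2,
    "H2O  183    GHz": 183.0,
    "O2   60     GHz (band)": 60.0,
    "O2   118.75 GHz (line)": 118.75,
}

for name, f_ghz in lines_ghz.items():
    wavelength_cm = C / (f_ghz * 1e9) * 100.0  # metres -> centimetres
    print(f"{name:24s} -> lambda = {wavelength_cm:.3f} cm")
```

Only the 22.2 GHz water-vapour line (1.35 cm) sits inside the window; the others fall below 0.6 cm.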
40
Important Atmospheric Windows
0.3 – 1.3 µm: UV, visible, near infrared
1.5 – 1.8 µm: SWIR
2.0 – 2.4 µm: SWIR
3.0 – 5.0 µm: Mid infrared
8.0 – 14.0 µm: Thermal infrared
> 0.6 cm: Microwave
41
IMAGING AND SOUNDING
When the objective is to study the earth's surface, remote sensing instruments are designed to operate in windows where a cloudless atmosphere will transmit sufficient radiation for detection. When the objective is to study atmospheric constituents, instruments operate in atmospheric absorption bands.
42
SATELLITE OBSERVATIONAL SYSTEM FOR METEOROLOGY
SURFACE SENSING (IMAGING) (VHRR, SSMI, SCATTEROMETERS, MADRAS):
- LAND COVER
- SEA SURFACE TEMPERATURE
- CLOUD MOTION VECTOR
- OCEAN SURFACE WIND VECTOR
- SNOW COVER
- CLOUD STRUCTURE
- CYCLONE MOVEMENT
SOUNDERS (TOVS, TRMM RADAR, SAPHIR):
- T, P, RH PROFILES
- MINOR CONSTITUENTS
- PRECIPITATION PROFILE
43
Atmospheric Absorption and Transmission
Most significant absorbers of EMR:
- Ozone
- Carbon dioxide
- Water vapor
- Oxygen
- Nitrogen
44
Spectral Signatures
A primary use of remote sensing data is in classifying the myriad features in a scene into meaningful categories. The image then becomes a thematic map (the theme is selectable, e.g., land use, geology, vegetation types, rainfall). A farmer may use the images to monitor the health of his crops without going out to the field; a geologist may use them to study the types of minerals or rock structure; a biologist may want to study the variety of plants in a certain location.
45
Spectral Signatures
At certain wavelengths, sand reflects more energy than green vegetation, while at other wavelengths it absorbs more (reflects less) energy. Therefore, in principle, various kinds of surface materials can be distinguished from each other by these differences in reflectance. When more than two wavelengths are used, the resulting images tend to show more separation among the objects. This improved ability of multispectral sensors provides a basic remote sensing data resource for quantitative thematic information, such as the type of land cover. These data provide unique identification characteristics leading to a quantitative assessment of the Earth's features.
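One standard way to exploit these reflectance differences is a normalized band ratio such as NDVI; a minimal sketch with illustrative (not measured) reflectance values:

```python
def ndvi(red, nir):
    """Normalized Difference Vegetation Index from red and NIR reflectance."""
    return (nir - red) / (nir + red)

# Illustrative reflectance fractions: vegetation is dark in red, bright in NIR;
# sand is moderately bright in both.
samples = {"green vegetation": (0.05, 0.50), "dry sand": (0.30, 0.35)}
for name, (red, nir) in samples.items():
    print(f"{name:16s}: NDVI = {ndvi(red, nir):+.2f}")
# vegetation -> strongly positive; sand -> near zero
```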
46
Spectral Signatures
47
Spectral Signatures
[Figure: scatter plot of percent reflectance at 0.85 µm (vertical axis, 0 – 100) versus percent reflectance at 0.55 µm (horizontal axis, 0 – 100) for four surface classes: ♦ GL, ■ PW, ♥ RS, ● SW.]
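Scatter plots like this one are the basis of simple statistical classifiers; a sketch of nearest-centroid classification in the two-band space, with hypothetical centroids standing in for the GL, PW, RS and SW clusters:

```python
import math

# Hypothetical class centroids: (% reflectance at 0.55 um, % reflectance at 0.85 um).
# These are stand-in values for illustration, not the positions in the slide's plot.
centroids = {"GL": (15.0, 55.0), "PW": (10.0, 5.0), "RS": (35.0, 45.0), "SW": (25.0, 10.0)}

def classify(r055, r085):
    """Assign a pixel to the class with the nearest centroid (Euclidean distance)."""
    return min(centroids, key=lambda c: math.dist((r055, r085), centroids[c]))

print(classify(14.0, 50.0))  # -> 'GL' with these example centroids
```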
49
PRINCIPLE OF LAND COVER DISCRIMINATION
- FRESH SNOW
- GREEN VEGETATION
- DARK TONED SOIL
- LIGHT TONED SOIL
- CLEAR WATER
- TURBID WATER
50
TYPICAL SPECTRAL REFLECTANCE CURVE OF SNOW.
Snow reflectance depends on:
- Wavelength
- Grain size (hence age)
- Snow pack thickness
- Liquid water content
- Contaminants present
- Solar zenith angle
[Figure: typical spectral reflectance curve of snow; reflectance relative to BaSO4 versus wavelength in micrometres. Snow condition: cold, sifted, 'sugar' consistency; snow density given in g/cm³.]
51
SNOW: IRS LISS-3 image over part of the Himalayas. (a) is in band-2 (green) and (b) in band-5 (SWIR).
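The band-2/band-5 contrast shown here (snow is bright in the green band and dark in the SWIR) is exactly what the normalized difference snow index exploits; a minimal sketch with illustrative reflectances:

```python
def ndsi(green, swir):
    """Normalized Difference Snow Index; snow is bright in green, dark in SWIR."""
    return (green - swir) / (green + swir)

# Illustrative reflectance fractions; a threshold near 0.4 is commonly used for snow.
print(f"snow : {ndsi(0.80, 0.10):+.2f}")   # strongly positive -> snow
print(f"cloud: {ndsi(0.70, 0.60):+.2f}")   # near zero -> not snow
```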
52
Pixels and Bits
Using radio waves, data from Earth-orbiting satellites are transmitted on a regular basis to ground stations, where they are translated into a digital image that can be displayed on a computer screen. Just like the pictures on your television set, satellite imagery is made up of tiny squares, each of a different gray shade or color. These squares are called pixels—short for picture elements—and represent the relative reflected light energy recorded for that part of the image.
53
Satellite Image of Hurricane Floyd
Pixels
This weather satellite image of hurricane Floyd from September 15, 1999, has been magnified to show the individual picture elements (pixels) that form most remote sensing images (image derived from NOAA GOES data).
Satellite Image of Hurricane Floyd, 15 Sep 99
54
Pixels
Each pixel represents a square area on an image and is a measure of the sensor's ability to resolve (see) objects of different sizes. For example, the Enhanced Thematic Mapper (ETM+) on the Landsat 7 satellite has a maximum resolution of 15 meters; therefore, each pixel represents an area 15 m x 15 m, or 225 m². Higher resolution (smaller pixel area) means that the sensor is able to discern smaller objects. By adding up the number of pixels in an image, you can calculate the area of a scene. For example, if you count the number of green pixels in a false color image, you can calculate the total area covered with vegetation.
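The pixel-counting arithmetic above is easy to automate; a minimal sketch using the 15 m ETM+ pixel from the text and a hypothetical count of green pixels:

```python
PIXEL_SIZE_M = 15.0                   # ETM+ pixel size, from the text
pixel_area_m2 = PIXEL_SIZE_M ** 2     # 225 m^2 per pixel

green_pixels = 12_000                 # hypothetical count of 'green' pixels
vegetated_area_km2 = green_pixels * pixel_area_m2 / 1e6
print(f"vegetated area ~ {vegetated_area_km2:.2f} km^2")   # 2.70 km^2
```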
55
Bits
How does the computer know which parts of the image should be dark and which ones should be bright? Computers understand the numeric language of binary numbers, which are sets of numbers consisting of 0s and 1s that act as an "on-off" switch. Converting from our decimal system to binary numbers, 00 = 0, 01 = 1, 10 = 2, 11 = 3. Note that we cannot use decimal numbers directly, since computers are fussy—they only like "on" and "off." For example, consider an image made up of 8 columns by 5 rows of pixels in which four shades are present: black, dark gray, light gray and white. The darkest points are assigned the binary number 00, dark gray 01, light gray 10, and the brightest parts the binary number 11. We therefore have four pixels (B5, C4, D7 and E2) that the spacecraft says are 00, four dark gray pixels (B3, C2, C6 and E6) assigned the binary number 01, three light gray pixels (D3, D6 and E5) that are binary number 10, and 29 white pixels assigned the binary number 11.
56
Bits
Four shades between white and black would produce images with too much contrast. So instead of using binary numbers between 00 and 11, spacecraft use a string of 8 binary digits (called "8-bit data"), which can range from 00000000 to 11111111. These numbers correspond to 0 through 255 in the decimal system. With 8-bit data, we can assign the darkest point in an image to the number 00000000 (0) and the brightest point in the image to 11111111 (255). This produces 256 shades of gray between black and white. It is these binary numbers between 0 and 255 that the spacecraft sends back for each pixel in every row and column—and it takes a computer to keep track of every number for every pixel!
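The 8-bit scheme amounts to linearly quantizing measured brightness into 256 digital numbers; a minimal sketch (the scaling function and its bounds are illustrative assumptions, not a specific instrument's calibration):

```python
def quantize_8bit(value, v_min, v_max):
    """Map a measured brightness linearly onto the 0..255 digital-number range."""
    dn = round(255 * (value - v_min) / (v_max - v_min))
    return max(0, min(255, dn))       # clamp; the DN is then sent as 8 bits

print(quantize_8bit(0.0, 0.0, 1.0))                  # darkest   -> 0
print(quantize_8bit(1.0, 0.0, 1.0))                  # brightest -> 255
print(format(quantize_8bit(0.5, 0.0, 1.0), '08b'))   # mid gray  -> '10000000'
```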
57
Color Images
Another essential ingredient in most remote sensing images is color. Variations in black and white imagery can be very informative, but the number of different gray tones that the eye can separate is limited to about 20 to 30 steps (out of a maximum of about 200) on a contrast scale. On the other hand, the eye can distinguish 20,000 or more color tints, enabling small but often important variations within the target materials or classes to be discerned.
58
Color Images
Since different bands have different contrast, computers can be used to produce a color image from a black and white remote sensing data set. Similar to the screen on a color television set, computer screens can display three different images using blue light, green light and red light; the combination of these three wavelengths of light generates the color image that our eyes can see. This is accomplished by displaying the black and white satellite images corresponding to various bands in either blue, green, or red light to achieve the relative contrast between the bands. Finally, when these three colors are combined, a color image—called a "false color image"—is produced.
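As a minimal sketch of the band-to-channel assignment just described, a false colour composite is simply three single-band images stacked into the red, green and blue display channels; synthetic arrays stand in for real calibrated bands, and the mapping follows the Thermal IR / NIR / VIS figure on the next slide:

```python
import numpy as np

# Synthetic stand-ins for three co-registered single-band images (0..255);
# real code would load calibrated satellite bands instead.
rng = np.random.default_rng(0)
thermal_ir = rng.integers(0, 256, (100, 100), dtype=np.uint8)
nir        = rng.integers(0, 256, (100, 100), dtype=np.uint8)
vis        = rng.integers(0, 256, (100, 100), dtype=np.uint8)

# Band-to-channel assignment: Thermal IR -> red, NIR -> green, VIS -> blue
false_colour = np.dstack([thermal_ir, nir, vis])
print(false_colour.shape)   # (100, 100, 3): displayable with e.g. matplotlib imshow
```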
59
Colour Images
[Figure: single-band images (Thermal IR, NIR, VIS) displayed in red, green and blue and combined into a false colour composite.]
Fig.12. Images at different bands and the false colour image. Satellites record the reflected and emitted brightness in different parts of the spectrum, as demonstrated in the figure. The result is called a "false color image" because colors are assigned that we can see and easily interpret with our eyes. In order to understand what the colors mean in the satellite image, we must know which band (or wavelength) is used for each of the blue, green and red parts of the computer display; without detailed knowledge of how each band has been adjusted for contrast and brightness, we cannot be sure why the colors are what they are.
60
Color Images
MSG-1, 24 Feb 2003, 11:00 UTC, SEVIRI VIS 0.6: close-up view over the Alps
61
Color Images: MSG-1, 24 Feb 2003, 11:00 UTC, Channel 03 (NIR 1.6)
1 = low-level fog or stratus
2 = snow
3 = thin mid-level cloud with water droplets (supercooled cloud)
4 = lakes
62
Recommended RGBs for Monitoring of Day-time Fog
MSG-1, 24 Feb 2003, 11:00 UTC. RGB composite: R = NIR 1.6, G = VIS 0.8, B = VIS 0.6.
1 = low-level fog or stratus
2 = snow
3 = thin mid-level cloud with water droplets (supercooled cloud)
63
RGB composite: VIS 0.6, IR 10.8 – IR 8.7, IR 12.0 – IR 8.7. Annotated features: fire, sulphur plant.
Dust storms
64
Remote Sensors
Remote sensors are classified as passive or active, and by spectral region: optical-IR (OIR) or microwave.
- Passive OIR: photographic camera, opto-mechanical scanner (MSS), push-broom scanner (IRS LISS)
- Passive microwave: scanning microwave radiometer (MSMR)
- Active OIR: lidar
- Active microwave: scatterometer, SAR
Active instruments provide their own energy (electromagnetic radiation) to illuminate the object or scene they observe. They send a pulse of energy from the sensor to the object and then receive the radiation that is reflected or backscattered from that object. Scientists use many different types of active remote sensors:
(a) Radar (Radio Detection and Ranging). A radar uses a transmitter operating at either radio or microwave frequencies to emit electromagnetic radiation, and a directional antenna or receiver to measure the time of arrival of reflected or backscattered pulses of radiation from distant objects. Distance to the object can be determined since electromagnetic radiation propagates at the speed of light.
(b) Scatterometer. A scatterometer is a high-frequency microwave radar designed specifically to measure backscattered radiation. Over ocean surfaces, measurements of backscattered radiation in the microwave spectral region can be used to derive maps of surface wind speed and direction.
(c) Lidar (Light Detection and Ranging). A lidar uses a laser (light amplification by stimulated emission of radiation) to transmit a light pulse, and a receiver with sensitive detectors to measure the backscattered or reflected light. Distance to the object is determined by recording the time between the transmitted and backscattered pulses and using the speed of light to calculate the distance travelled. Lidars can determine atmospheric profiles of aerosols, clouds, and other constituents of the atmosphere.
(d) Laser Altimeter. A laser altimeter uses a lidar (see above) to measure the height of the instrument platform above the surface. By independently knowing the height of the platform with respect to the mean Earth surface, the topography of the underlying surface can be determined.
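The ranging principle shared by radar, lidar and laser altimeters in (a), (c) and (d) reduces to time-of-flight arithmetic: the pulse travels out and back, so range is half the round-trip time multiplied by the speed of light. A minimal sketch (the 800 km altimeter example is hypothetical):

```python
C = 2.998e8  # speed of light, m/s

def range_from_echo(round_trip_s):
    """Range to target from the round-trip time of a radar/lidar pulse."""
    return C * round_trip_s / 2.0

# An altimeter ~800 km above the surface sees its echo after ~5.34 ms:
print(f"{range_from_echo(5.34e-3) / 1e3:.1f} km")  # ~800.5 km
```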
65
DIFFERENT STAGES OF A REMOTE SENSING SYSTEM
66
Questions, if any?
67
Thank you