
1 LWR 407: Principles of Remote Sensing. N. L. Mufute, Midlands State University, Faculty of Natural Resources Management and Agriculture, Department of Land and Water Resources Management, 2011. mufutenl@msu.ac.zw

2 Remote Sensing. Definition of remote sensing: remote sensing (RS), also called earth observation, refers to obtaining information about objects or areas on the Earth's surface without being in direct contact with the object or area. The detection and recording instruments for this technology are known as remote sensors. The object being monitored is called the target.

3 What is remote sensing? The International Society for Photogrammetry and Remote Sensing (ISPRS) defines remote sensing (RS) as: "the art, science, and technology of obtaining reliable information about physical objects and the environment, through the process of recording, measuring, and interpreting imagery and digital representations of energy patterns derived from non-contact sensor systems." The imagery is acquired either via cameras recording on film, which may then be scanned (aerial photos), or via sensors that directly output digital data (satellite imagery).

4 History of remote sensing.
1783: The Marquis d'Arlandes and Pilâtre de Rozier made a balloon voyage near Paris. Early photography was later carried out from balloons and pigeons.
1860: Aerial photographs taken in Russia and the USA.
1914-18 and 1939-45: The First and Second World Wars saw tremendous development in photography.
1927: Robert Goddard launched the first liquid-fueled rocket.
1955: Work began on the Baikonur launch site in central Asia.
1957: Sputnik 1, the first satellite, launched from Baikonur.
1961: Yuri Gagarin, launched in the Vostok 1 capsule, became the first human in space.
1969: Neil Armstrong and Buzz Aldrin became the first humans to walk on the Moon.
1971: The first space station, the Russian Salyut 1, was launched.
1972: Landsat 1 (US) introduced the concept of imaging from satellites.
1986: France launched the first stereo-imaging satellite (SPOT 1).
1992: The Space Year, marking the maturity of remote sensing after 20 years of operation.
1995: The Shuttle-Mir Program, the first phase of the International Space Station (ISS).
2000: The first three astronauts (two Russian, one American) began living aboard the ISS.

5 Remote sensing basic processes

6 Energy Source or Illumination (A): the first requirement for remote sensing is an energy source which illuminates or provides electromagnetic energy to the target of interest.
Radiation and the Atmosphere (B): as the energy travels from its source to the target, it comes into contact and interacts with the atmosphere it passes through. This interaction may take place a second time as the energy travels from the target to the sensor.
Interaction with the Target (C): once the energy makes its way to the target through the atmosphere, it interacts with the target depending on the properties of both the target and the radiation.
Recording of Energy by the Sensor (D): after the energy has been scattered by, or emitted from, the target, a sensor (remote, i.e. not in contact with the target) is required to collect and record the electromagnetic radiation.

7 Remote sensing basic processes (continued).
Transmission, Reception, and Processing (E): the energy recorded by the sensor is transmitted, often in electronic form, to a receiving and processing station where the data (in the form of an energy pattern) are processed into an image (hardcopy and/or digital).
Interpretation and Analysis (F): the processed image is interpreted, visually and/or digitally, to extract information about the target which was illuminated. Accuracy assessment (radiometric and geometric correction) is also done at this stage.
Application (G): the final element of the remote sensing process is achieved when we apply the information extracted from the imagery about the target in order to better understand it, reveal new information, or assist in solving a particular problem.

8 Advantages of remote sensing.
Provides a regional view (large areas).
Provides repetitive looks at the same area.
Remote sensors "see" over a broader portion of the spectrum than the human eye.
Sensors can focus on a very specific bandwidth in an image, or on a number of bandwidths simultaneously.
Provides geo-referenced digital data.
Some remote sensors operate in all seasons, at night, and in bad weather.

9 Remote sensing applications.
Land-use mapping.
Forest and agriculture applications.
Telecommunication planning.
Environmental applications.
Hydrology and coastal mapping.
Urban planning.
Emergencies and hazards.
Global change and meteorology.

10 Remote Sensing Platforms. A platform is the vehicle or carrier for remote sensors. Typical platforms are satellites and aircraft, but they can also include radio-controlled aeroplanes, balloons and kites for low-altitude remote sensing, as well as vehicles for ground investigations. The platforms with the highest altitude are geosynchronous satellites such as the Geosynchronous Meteorological Satellite (GMS).

11 Table showing various platforms, altitudes and objects being sensed.

Platform | Altitude | Observation / Remarks
Geostationary satellite | 36,000 km | Fixed-point observation (e.g. GMS)
Circular-orbit satellite | 500-1,000 km | Regular observation (e.g. Landsat, SPOT)
Space shuttle | 240-350 km | Irregular observation; space experiments
Radio-sonde | 100 m-100 km | Various investigations (e.g. meteorological)
High-altitude jet plane | 10-12 km | Reconnaissance; wide-area investigations
Low/middle-altitude plane | 500-8,000 m | Various investigation surveys
Helicopter | 100-2,000 m | Various investigation surveys
Radio-controlled plane | below 500 m | Various investigation surveys
Glider | 50-500 m | Various investigation surveys
Balloon | 800 m- | Various investigation surveys
Crane car | 5-50 m | Close-range surveys
Ground measurement car | 0-30 m | Ground truthing

12 Active and passive remote sensing. In active remote sensing, the sensor emits a signal (electromagnetic, sonar, laser, etc.) and measures the signal returned by the target. In passive remote sensing, the sensor does not emit any signal; it simply measures the ambient signal in the surrounding medium (air, water, etc.).

13 Sensor Characteristics.
Sensed energy: active vs. passive.
Angle: nadir vs. off-nadir.
Motion: scanning vs. non-scanning.
Type: imaging vs. non-imaging.
Nadir vs. off-nadir: off-nadir viewing results in a lower spatial resolution. θ = sensor view angle; θ = 0 for nadir (looking directly down at the surface).
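To make the off-nadir effect concrete, here is a minimal sketch (not from the slides) of the standard first-order, flat-Earth approximation for a cross-track scanner: the ground cell grows by sec θ along-track and sec² θ in the scan direction. The altitude and IFOV values below are illustrative only.

```python
import math

def ground_cell(h_m, ifov_rad, theta_deg):
    """Approximate ground-cell dimensions for a cross-track scanner.

    At nadir the cell is roughly H * IFOV on a side; off-nadir, the
    longer slant range and the viewing geometry stretch it by
    sec(theta) along-track and sec^2(theta) in the scan direction
    (standard first-order, flat-Earth approximation).
    """
    theta = math.radians(theta_deg)
    nadir = h_m * ifov_rad
    return nadir / math.cos(theta), nadir / math.cos(theta) ** 2

# Illustrative numbers only: 800 km altitude, 0.0125 mrad IFOV.
print(ground_cell(800_000, 1.25e-5, 0))   # (10.0, 10.0) m at nadir
print(ground_cell(800_000, 1.25e-5, 30))  # ~ (11.5, 13.3) m off-nadir
```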

14 Active vs. Passive Sensors.
Passive sensors detect naturally produced radiation (from the sun, earth, or atmosphere):
– emitted radiation: thermal infrared, passive microwave;
– reflected (solar) radiation: visible, near infrared.
Active sensors produce their own radiation and detect the reflection or backscatter:
– lasers (visible);
– radars (microwave);
– altimeters (visible or microwave).

15 Sensor Resolutions.
Spatial resolution: the size of the instantaneous field of view, e.g. 10 x 10 m.
Spectral resolution: the number and size of spectral regions in which the sensor records data, e.g. blue, green, red, near-infrared, thermal infrared, microwave (radar).
Radiometric resolution: the sensitivity of the detectors to small differences in electromagnetic energy.
Temporal resolution: how often the sensor acquires data from the same position on the earth, e.g. every 30 days.
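As a compact way of keeping the four resolutions straight, here is a small illustrative sketch (not part of the original slides); the example values are roughly those quoted for Landsat TM and should be checked against the mission specifications.

```python
from dataclasses import dataclass

@dataclass
class SensorResolution:
    spatial_m: float       # ground size of the instantaneous field of view
    spectral_bands: int    # number of spectral regions recorded
    radiometric_bits: int  # sensitivity: 2**bits quantization levels
    temporal_days: float   # revisit interval for the same spot on earth

# Illustrative values in the spirit of Landsat TM (the thermal band
# has a coarser spatial resolution; verify against mission specs):
landsat_tm = SensorResolution(30.0, 7, 8, 16.0)
print(landsat_tm)
```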

16 Spatial Resolution

17 Spectral Resolution

18 Radiometric Resolution Radiometric resolution in remotely sensed data is defined as the amount of energy required to increase a pixel value by one quantization level or 'count'. In image processing, quantization levels are usually referred to as Digital Numbers (DN).
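A quick sketch of the relationship between detector bit depth and the available DN range: a detector recording with n bits distinguishes 2^n quantization levels. The relation is standard; the bit depths below are merely examples.

```python
def dn_levels(bits):
    """Number of quantization levels (Digital Numbers) available to a
    detector that records with the given bit depth."""
    return 2 ** bits

for bits in (6, 8, 11, 12):
    print(f"{bits}-bit sensor: DN values 0 to {dn_levels(bits) - 1}")
# e.g. an 8-bit sensor distinguishes 256 levels (DN 0-255),
# while an 11-bit sensor distinguishes 2048 levels (DN 0-2047).
```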


20 Temporal Resolution

21 Remote Sensing Data. The choice of RS data is determined according to the particular needs of the project. The trade-off between the possible spatial resolution, spectral resolution and temporal coverage configurations must be considered before choosing a sensor/RS platform. There are many different sensor designs as far as spatial and spectral resolution, coverage area and orbit are concerned.

22 Fundamental Premise of Remote Sensing We can identify and learn about objects and features on the Earth's surface by studying the spectral characteristics of the radiation reflected (and/or emitted) by these features.

23 Electromagnetic radiation. Electromagnetic radiation (EMR) describes the way in which high-frequency energy (visible light, radio waves, heat, ultraviolet rays and X-rays) is transferred from one object to another through space. The sun is the main source of EMR, and electromagnetic radiation is normally used as the information carrier in remote sensing.

24 Electromagnetic radiation: reflectance. Reflectance is a key quantity for remote sensing. It is the ratio of the amount of light reflected by a target to the amount of light incident on the target. Reflectance is a unitless quantity; it varies strongly with wavelength, and the reflectance of an object usually also varies strongly with the viewing angle.

25 The Electromagnetic Spectrum. It consists of the whole range of different waves contained in electromagnetic radiation, from shorter wavelengths (including gamma and X-rays) to longer wavelengths (including microwaves and broadcast radio waves). Several regions of the EM spectrum are useful for RS:
– visible,
– ultraviolet,
– infrared (divided into reflected and thermal radiation),
– microwave.
The physical principles of the interaction of EMR with targets are different over each spectral range.

26 The Electromagnetic Spectrum


28 INTERACTIONS WITH THE ATMOSPHERE. EMR from the sun that is reflected by the earth and detected by a satellite- or aircraft-borne sensor must pass through the atmosphere twice: once on its journey from the sun to the earth, and a second time after being reflected by the surface of the earth back to the sensor. Interactions of the direct solar radiation and the reflected radiation from the target with the atmospheric constituents interfere with the process of remote sensing and are referred to as "atmospheric effects". The interaction of EMR with the atmosphere is important to remote sensing for two main reasons: (i) information carried by EMR reflected or emitted by the earth's surface is modified while traversing the atmosphere; (ii) the interaction of EMR with the atmosphere can be used to obtain useful information about the atmosphere itself.

29 INTERACTIONS WITH THE ATMOSPHERE (continued). The atmospheric constituents scatter and absorb the radiation, modulating the radiation reflected from the target by attenuating it, changing its spatial distribution, and introducing into the field of view radiation from sunlight scattered in the atmosphere and some of the energy reflected from nearby ground areas. Both scattering and absorption vary in their effect from one part of the spectrum to another. Solar energy is subjected to modification by several physical processes as it passes through the atmosphere, viz. scattering, absorption and refraction.

30 Atmospheric Scattering. Scattering is the redirection of EMR by particles suspended in the atmosphere or by large molecules of atmospheric gases. Scattering not only reduces image contrast but also changes the spectral signature of ground objects as seen by the sensor. The amount of scattering depends upon the size of the particles, their abundance, the wavelength of the radiation, the depth of the atmosphere through which the energy is travelling, and the concentration of the particles. The concentration of particulate matter varies both in time and with season, so the effects of scattering are spatially uneven and vary from time to time. Theoretically, scattering can be divided into three categories depending upon the wavelength of the radiation being scattered and the size of the particles causing the scattering. The three types of scattering are summarized below:

31 Types of Scattering

32 Rayleigh Scattering. Rayleigh scattering predominates where EMR interacts with particles that are smaller than the wavelength of the incoming light; shorter wavelengths are scattered more than longer wavelengths. In the absence of these particles and scattering, the sky would appear black. In the context of remote sensing, Rayleigh scattering is the most important type of scattering: it causes a distortion of the spectral characteristics of the reflected light when compared with measurements taken on the ground.
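The standard Rayleigh result behind "shorter wavelengths are scattered more" is that scattered intensity varies as 1/λ⁴. A small illustrative calculation (the wavelengths are approximate band centres, not measured values):

```python
def rayleigh_relative(lam_um, ref_um=0.70):
    """Relative Rayleigh scattering intensity, which varies as
    1 / wavelength^4, normalized to a red reference wavelength."""
    return (ref_um / lam_um) ** 4

for name, lam in [("blue", 0.45), ("green", 0.55), ("red", 0.70)]:
    print(f"{name} ({lam} um): {rayleigh_relative(lam):.1f}x red")
# Blue light at 0.45 um is scattered roughly (0.70/0.45)^4 ~ 5.9 times
# more strongly than red at 0.70 um -- why the clear sky looks blue.
```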

33 Mie Scattering. Mie scattering occurs when the wavelength of the incoming radiation is similar in size to the atmospheric particles. It is caused by aerosols: a mixture of gases, water vapour and dust. It is generally restricted to the lower atmosphere, where the larger particles are abundant, and dominates under overcast conditions. It influences the entire spectral region from the ultraviolet to the near-infrared.

34 Non-selective Scattering. Non-selective scattering occurs when the particle size is much larger than the wavelength of the incoming radiation. The particles responsible for this effect are water droplets and larger dust particles. The scattering is independent of wavelength: all wavelengths are scattered equally. The most common example is the appearance of clouds as white; because clouds consist of water droplets that scatter all wavelengths in equal amounts, they appear white. The effects of haze are less pronounced in the thermal infrared region, and microwave radiation is completely immune to haze and can even penetrate clouds.

35 Atmospheric Absorption. The gas molecules present in the atmosphere strongly absorb EMR passing through the atmosphere in certain spectral bands. Three gases are responsible for most of the absorption of solar radiation: ozone, carbon dioxide and water vapour. Ozone absorbs the high-energy, short-wavelength portions of the ultraviolet spectrum (λ < 0.24 μm), thereby preventing the transmission of this radiation to the lower atmosphere. Carbon dioxide is important in remote sensing because it effectively absorbs radiation in the mid- and far-infrared regions of the spectrum (13-17.5 μm), whereas the two most important regions of water vapour absorption are the band 5.5-7.0 μm and wavelengths above 27 μm. Absorption reduces the amount of light that reaches our eye, making the scene look duller.

36 Atmospheric Windows.
– The general atmospheric transmittance across the whole spectrum of wavelengths is shown in the figure below.
– The atmosphere selectively transmits energy of certain wavelengths. The spectral bands for which the atmosphere is relatively transparent are known as atmospheric windows.
– Atmospheric windows are present in the visible part (0.4-0.76 μm) and the infrared regions of the EM spectrum.
– In the visible part, transmission is mainly affected by ozone absorption and by molecular scattering.
– The atmosphere is transparent again beyond about λ = 1 mm, the region used for microwave remote sensing.

37 Refraction. Refraction, the bending of light at the contact between two media, also occurs in the atmosphere as light passes through atmospheric layers of varied clarity, humidity and temperature. These variations influence the density of the atmospheric layers, which in turn causes the bending of light rays as they pass from one layer to another. The most common phenomena are the mirage-like apparitions sometimes visible in the distance on hot summer days.

38 Electromagnetic Radiation: interaction of the EMR with the target. The EMR reaching the target enters into physical interactions with it. The most common processes are absorption, scattering, reflection, transmission, refraction and re-emission; many other less common processes also exist. The occurrence of these events depends on the type of the target, the physical geometry and the wavelengths involved.

39 Electromagnetic radiation: reflection. Reflected light does not enter the medium. For a perfect reflector, the intensity of the reflected light is equal to the intensity of the incident beam.

40 Electromagnetic radiation: refraction and transmission. A portion of a light beam striking a new medium is reflected away while the rest enters the medium. The amount of light entering the medium is described by the quantity transmittance. The direction of the light beam travelling in the medium differs from the incident direction of propagation; this process is called refraction.

41 Spectral Signature. Spectral reflectance is the ratio of reflected energy to incident energy as a function of wavelength. Various materials of the earth's surface have different spectral reflectance characteristics. Spectral reflectance is responsible for the colour or tone of an object in a photographic image: trees appear green because they reflect more of the green wavelengths. The values of the spectral reflectance of objects, averaged over different well-defined wavelength intervals, comprise the spectral signature of the objects or features, by which they can be distinguished. To obtain the ground truth necessary for the interpretation of multispectral imagery, the spectral characteristics of various natural objects have been extensively measured and recorded. Spectral reflectance depends on wavelength: it has different values at different wavelengths for a given terrain feature.

42 Spectral Signature. The reflectance characteristics of the earth's surface features are expressed by spectral reflectance, which is given by:
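A standard way of writing this relation (the notation here follows common textbook usage, e.g. Lillesand and Kiefer, listed in the references):

```latex
\rho_\lambda = \frac{E_R(\lambda)}{E_I(\lambda)} \times 100
```

where $E_R(\lambda)$ is the energy of wavelength $\lambda$ reflected by the feature, $E_I(\lambda)$ is the energy of wavelength $\lambda$ incident upon it, and $\rho_\lambda$ is expressed as a percentage.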

43 Spectral Signature. A graph of the spectral reflectance of an object as a function of wavelength is called a spectral reflectance curve. It varies with the chemical composition and physical condition of the feature, which results in a range of values. The spectral response patterns are averaged to give a generalized form, called the generalized spectral response pattern for the object concerned. Spectral signature is the term used for a unique spectral response pattern that is characteristic of a terrain feature. The figure below shows typical reflectance curves for three basic types of earth surface features: healthy vegetation, dry bare soil (grey-brown and loamy) and clear lake water.

44 Typical Reflectance Curves for 3 Basic Types of Earth Surface Features

45 Reflectance Characteristics of Earth's Cover Types: Vegetation. The spectral characteristics of vegetation vary with wavelength. The plant pigment in leaves, chlorophyll, strongly absorbs radiation in the red and blue wavelengths but reflects green wavelengths. The internal structure of healthy leaves acts as a diffuse reflector of near-infrared wavelengths. Measuring and monitoring near-infrared reflectance is one way that scientists determine how healthy particular vegetation may be.

46 Reflectance Characteristics of Earth's Cover Types: Water. The majority of the radiation incident upon water is not reflected but is either absorbed or transmitted. Longer visible wavelengths and near-infrared radiation are absorbed more by water than the shorter visible wavelengths. Thus water looks blue or blue-green due to stronger reflectance at these shorter wavelengths, and darker if viewed at red or near-infrared wavelengths. The factors that affect the variability in the reflectance of a water body are the depth of the water, the materials within the water and the surface roughness of the water.

47 Reflectance Characteristics of Earth's Cover Types: Soil. The majority of radiation incident on a soil surface is either reflected or absorbed; little is transmitted. The characteristics of a soil that determine its reflectance properties are its moisture content, organic matter content, texture, structure and iron oxide content. The soil curve shows fewer peak-and-valley variations, and the presence of moisture in soil decreases its reflectance. By measuring the energy that is reflected by targets on the earth's surface over a variety of different wavelengths, we can build up a spectral signature for each object, and by comparing the response patterns of different features we may be able to distinguish between them where a comparison at a single wavelength could not. For example, water and vegetation reflect somewhat similarly in the visible wavelengths but not in the infrared.
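The visible-versus-infrared contrast just described is exactly what vegetation indices exploit. Here is a minimal sketch of the widely used Normalized Difference Vegetation Index, NDVI = (NIR − Red)/(NIR + Red); the reflectance values below are illustrative, not measured data.

```python
def ndvi(red, nir):
    """Normalized Difference Vegetation Index: exploits the strong
    near-infrared reflectance of healthy vegetation versus its red
    (chlorophyll) absorption."""
    return (nir - red) / (nir + red)

# Illustrative reflectances (fractions of incident energy):
print(ndvi(red=0.05, nir=0.50))  # healthy vegetation -> ~0.82
print(ndvi(red=0.06, nir=0.02))  # clear water -> negative (~ -0.5)
```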

48 RS Applications for Different Spectral Bands.
Blue (0.45-0.50 μm) | Water penetration, vegetation characteristics, sediment
Green (0.50-0.60 μm) | Green reflectance, healthy vegetation
Red (0.60-0.70 μm) | Vegetation discrimination, red chlorophyll absorption
Panchromatic (0.50-0.75 μm) | Mapping, land use, stereo pairs
Reflective IR (0.75-0.90 μm) | Biomass, crop identification, soil-crop and land-water boundaries
Mid IR (1.5-1.75 μm) | Plant turgidity, drought, clouds, snow and ice
Mid IR (2.0-2.35 μm) | Geology, rock formations
Thermal IR (10-12.5 μm) | Temperature, thermal inertia, moisture studies
Microwave (0.1-5 cm) | Snow cover and depth, vegetation water content
Microwave (5-24 cm) | Melting snow, water-land boundaries, soil moisture

49 Output of a Remote Sensing System. The output of a remote sensing system is usually an image representing the scene being observed. Many further steps of digital image processing and modelling are required in order to extract useful information from the image, and suitable techniques are adopted for a given theme depending on the requirements of the specific problem. Since remote sensing may not provide all the information needed for a full-fledged assessment, many other spatial attributes from various sources need to be integrated with the remote sensing data. This integration of spatial data and their combined analysis is performed through a set of computer software and hardware known as a Geographical Information System (GIS).

50 Image Interpretation Elements. The following eight elements are most used in image interpretation:
1. Size. A proper photo scale should be selected depending on the purpose of the interpretation. The approximate size of an object can be measured by multiplying its length on the image by the inverse of the photo scale.
2. Shape. The specific shape of an object as viewed from above is what is imaged on a vertical photograph, so the shape as seen from a vertical view should be known. For example, the crown of a conifer looks like a circle, while that of a deciduous tree has an irregular shape. Airports, harbours, factories, etc. can also be identified by their shape.
3. Shadow. Shadow is usually a visual obstacle for image interpretation. However, it can also give height information about towers, tall buildings, etc., as well as shape information from a non-vertical perspective, such as the shape of a bridge.
4. Tone. The continuous grey scale varying from white to black is called tone. In panchromatic photographs, an object shows its unique tone according to its reflectance: for example, dry sand appears white, while wet sand appears black. In black-and-white near-infrared photographs, water is black and healthy vegetation white to grey.

51 Image Interpretation Elements (continued).
5. Colour. Colour is more convenient for the identification of object details. For example, vegetation types and species can be more easily interpreted by less experienced interpreters using colour information. Sometimes colour-infrared photographs or false-colour images give more specific information, depending on the emulsion of the film or the filter used and the object being imaged.
6. Texture. Texture is a group of repeated small patterns. For example, homogeneous grassland exhibits a smooth texture, while coniferous forest usually shows a coarse texture; this depends, however, on the scale of the photograph or image.
7. Pattern. Pattern is a regular, usually repeated arrangement of objects. For example, rows of houses or apartments, regularly spaced rice fields, highway interchanges, orchards, etc. can be identified from their unique patterns.
8. Associated Relationship or Context. A specific combination of elements, geographic characteristics, configuration of the surroundings or the context of an object can provide the interpreter with specific information.

52 Digital Image Processing. Digital image processing involves the manipulation and interpretation of digital images with the aid of a computer. It comprises the following four basic steps:
(a) Image correction/restoration: image data recorded by sensors on a satellite or aircraft contain errors related to the geometry and the brightness values of the pixels. These errors are corrected using suitable mathematical models, which are either deterministic or statistical.
(b) Image enhancement: image enhancement is the modification of an image, by changing the pixel brightness values, to improve its visual impact. Enhancement techniques derive the new brightness value for a pixel either from its existing value or from the brightness values of a set of surrounding pixels (a minimal example follows this list).
(c) Image transformation: the multi-spectral character of image data allows it to be spectrally transformed to a new set of image components or bands, either to make some information more evident or to preserve the essential information content of the image (for a given application) with a reduced number of transformed dimensions. The pixel values of the new components are related to the original set of spectral bands via a linear operation.
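As a concrete instance of step (b), here is a minimal sketch of a linear contrast stretch, one of the simplest enhancement techniques that derives each new brightness value from the pixel's existing value alone. The array values are synthetic.

```python
import numpy as np

def linear_stretch(band, out_max=255):
    """Linear contrast stretch: remap the band's actual min-max range
    onto the full 0..out_max display range. Assumes the band is not
    constant (min != max)."""
    band = band.astype(np.float64)
    lo, hi = band.min(), band.max()
    return ((band - lo) / (hi - lo) * out_max).astype(np.uint8)

# A dull 8-bit band occupying only DN 40..90 fills 0..255 when stretched:
dull = np.random.randint(40, 91, size=(100, 100))
stretched = linear_stretch(dull)
print(stretched.min(), stretched.max())  # 0 255
```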

53 Digital Image Processing Cont. (d) Image classification: The overall objective of image classification procedures is to automatically categorize all pixels in an image into land cover classes or themes. A pixel is characterized by its spectral signature, which is determined by the relative reflectance in different wavelength bands. Multi-spectral classification is an information extraction process that analyses these spectral signatures and assigns the pixels to classes based on similar signatures.

54 Multi-spectral Classification. There are two major approaches to multi-spectral classification: unsupervised and supervised. Unsupervised classification is the identification of natural groups, or structures, within multi-spectral data using only the available imagery; the analyst has no control over the nature of the classes, so the final classes will be relatively homogeneous but may not correspond to any useful cover classes. Supervised classification is the process of using samples of known identity (ground-truth sites) to classify pixels of unknown identity, i.e. to assign unclassified pixels to one of several informational classes. Many classifiers are used for supervised classification; one of the most common is the maximum likelihood classifier. It relies on the assumption that the populations from which the training samples are drawn are multivariate-normal in their distribution, which is not always the case.
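A minimal sketch of the maximum likelihood idea just described: fit one multivariate-normal model per class from training samples, then assign each pixel to the class under which it is most likely. The two-band training samples are invented purely for illustration.

```python
import numpy as np
from scipy.stats import multivariate_normal

def max_likelihood_classify(pixels, training):
    """Assign each pixel (a vector of band values) to the class whose
    fitted multivariate-normal model gives it the highest likelihood."""
    models = {
        name: multivariate_normal(samples.mean(axis=0),
                                  np.cov(samples, rowvar=False))
        for name, samples in training.items()
    }
    names = list(models)
    scores = np.column_stack([models[n].logpdf(pixels) for n in names])
    return [names[i] for i in scores.argmax(axis=1)]

# Illustrative 2-band training data (not real spectra):
training = {
    "water":      np.array([[20, 10], [22, 11], [19, 12], [21, 9]]),
    "vegetation": np.array([[30, 80], [32, 84], [29, 79], [31, 83]]),
}
print(max_likelihood_classify(np.array([[21, 10], [30, 81]]), training))
# ['water', 'vegetation']
```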

55 Multi-spectral Classification (continued). To overcome the problem of normality, the currently favoured alternatives are non-parametric classifiers. A non-parametric classifier uses a set of non-parametric signatures to assign pixels to a class based on their location, either inside or outside the class's area in feature space. To overcome difficulties in conventional digital classification, new approaches such as context classifiers, decision-tree classifiers and neural network algorithms are being developed. Another technique is fuzzy classification, in which each pixel is assigned a number between 0 and 1 for each class, indicating the proportion in which the different classes have contributed to the observed spectral signature. These classifiers are mainly used when the spectral reflectance of different features does not follow a normal distribution. A schematic diagram of the digital image processing procedures commonly followed by resource scientists is presented in the figure below.

56 A schematic diagram of general image processing procedures.

57 Multi-source data fusion. Remote sensing satellites carry sensors of varied characteristics, and the data are often complementary in nature: for example, panchromatic data with high spatial resolution and multispectral data with lower spatial resolution. Fine spatial resolution is necessary for an accurate description of shapes, features and structures, whereas fine spectral resolution allows better discrimination between attributes (e.g. to classify land cover). Hence merging these two types of data to form multi-spectral images with high spatial resolution is beneficial for applications such as vegetation mapping, land cover classification, precision farming and urban management. Image fusion can be performed at three different processing levels, according to the stage at which the fusion takes place: pixel, feature and decision level. Various techniques are available for merging multi-sensor image data at the pixel level.

58 Multi-source data fusion (continued). Methods of merging can be divided into two categories. The first consists of methods which simultaneously take into account all bands in the merging process, whereas the second groups together methods which deal separately with the spatial information and each spectral band. The most commonly used methods, IHS (Intensity-Hue-Saturation) and PCS (Principal Component Substitution), belong to the first category; methods like the Brovey transform and HPF (High-Pass Filter) belong to the second. While classification of multi-source data (e.g. microwave and optical together) is feature-level merging, the integration of various thematic layers in a GIS to arrive at a common plan is a decision-level merging technique.
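A minimal sketch of the Brovey transform just mentioned: each multispectral band is rescaled by the ratio of the panchromatic band to the sum of the multispectral bands, injecting the pan band's spatial detail. It assumes the bands have already been co-registered and resampled to the panchromatic pixel size (a preprocessing step not shown here); the arrays below are tiny synthetic examples.

```python
import numpy as np

def brovey_fuse(red, green, blue, pan, eps=1e-6):
    """Brovey transform pan-sharpening: scale each multispectral band
    by pan / (red + green + blue). Inputs are float arrays already
    co-registered and resampled to the panchromatic pixel size."""
    ratio = pan / (red + green + blue + eps)
    return red * ratio, green * ratio, blue * ratio

# Example with tiny synthetic arrays (values illustrative only):
r, g, b = (np.full((2, 2), v, float) for v in (0.2, 0.3, 0.1))
pan = np.array([[0.6, 0.9], [0.3, 0.6]])
print(brovey_fuse(r, g, b, pan)[0])  # red band rescaled by pan detail
```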

59 Flow chart for studying ecosystem changes using satellite data (Hussin et al., 1999).

60 Figure 2. Mangrove forest area change detection procedure (Hussin et al., 1999).

61 Applications of GIS and Remote sensing: Further Examples

62 Land use and land cover mapping. Knowledge of land use and land cover is important for planning and management activities and is considered an essential element for modelling and understanding the earth as a system. Aerial photographs have been used for land use/cover mapping since the 1940s; satellite images have been used more recently. Land cover refers to the type of feature on the earth's surface, e.g. maize fields, lakes, forests, roads. Land use refers to the human activity or economic function associated with a specific piece of land, e.g. farmland, residential area, plantation.

63 Geologic and Soil Mapping. The use of aerial photographs for geologic mapping started around 1913. Geologic mapping involves the identification of landforms, rock types and rock structures (folds, faults and fractures) and the portrayal of geologic units and structures on a map or other display in their correct spatial relationships with one another. Satellite images and aerial photographs provide much information about potential areas for mineral exploitation. Detailed soil surveys form a primary source of information about an area and hence are used in comprehensive land-use planning. Airphoto interpretation has been used since the 1930s to facilitate the mapping process, and beginning in the mid-1980s soil survey map information for many counties has been made available by the USDA as maps and in digital files.

64 Agricultural Applications.
Crop type classification: requires knowledge of the developmental stages of each crop in the assessment area.
Precision farming: information- and technology-based agricultural management systems to identify, analyse and manage site, soil, and spatial and temporal variability within fields for optimum profitability, sustainability and protection of the environment.
Crop management: large-scale images have proven useful in many forms of crop management, e.g. for documenting crop diseases, insect damage, plant stress and disaster management; the photos utilised are taken on different dates.

65 Forestry Applications. Forestry is mostly concerned with the management of forests for wood, forage, wildlife and recreation. Visual image interpretation provides a feasible means of monitoring many of the world's forest conditions: it can be used for tree species identification, studying harvested or deforested areas, and the assessment of disease and insect infestations.

66 Water Resources Applications.
Water pollution detection.
Lake eutrophication assessment.
Flood damage assessment.
Groundwater location.
Watershed assessment.
Riparian vegetation mapping.
Reservoir site selection.
Shoreline erosion studies.
Floodplain and shoreland zoning compliance.
Surveys of recreational use of lakes and rivers.
Wetlands mapping.

67 Wildlife and ecology applications. Wildlife ecology is concerned with the interactions between wildlife conservation and wildlife management. Wildlife habitat mapping and wildlife censusing are aspects of wildlife ecology for which visual image interpretation can most readily provide useful information.

68 Remote sensing literature: journals and conferences.
Photogrammetric Engineering and Remote Sensing (PE&RS).
The Photogrammetric Record.
International Journal of Remote Sensing.
ISPRS Journal of Photogrammetry and Remote Sensing.
ISPRS conference proceedings.
IGARSS conference proceedings.

69 Remote sensing literature: books.
Askne, J. (1995). Sensors and Environmental Applications of Remote Sensing. Balkema, Rotterdam.
Campbell, J. B. (1996). Introduction to Remote Sensing. 2nd ed., Taylor and Francis, London.
Denègre, J. (1994). Thematic Mapping from Satellite Imagery: Guide Book. Elsevier.
Lillesand, T. M. and Kiefer, R. W. (2000). Remote Sensing and Image Interpretation. 4th ed., John Wiley and Sons, New York.
Simonett, D. S. (ed.) (1983). Manual of Remote Sensing. The Sheridan Press, Falls Church.
WWF (2008). Training Manual: Principles and Applications of Conservation GIS. GIS Laboratory, WWF-Pakistan.
Navalgund, R. R., Jayaraman, V. and Roy, P. S. (2007). Remote sensing applications: an overview. Current Science, Vol. 93, No. 12, 25 December 2007.
Ramachandran, S. Application of Remote Sensing and GIS. Madras University.
Engman, T. NASA Water Science and Applications: Basic Principles for Satellite Remote Sensing. Science Applications International Corp., NASA/GSFC.
Satellite Remote Sensing and GIS Applications in Agricultural Meteorology, pp. 23-38.

