Multispectral Imager Design

Presentation transcript:
1 Multispectral Imager Design
for Nanosatellites
V.H.R.I. Doedee, R. Deerenberg, E. Dokter – Faculties 3mE, AE & EEMCS

2 1. Introduction

3 Nanosatellite missions so far
Education
Technology demonstration
No serious remote sensing
Are nanosatellites capable of serious remote sensing tasks?

4 Major Constraints
volume less than 10 cm x 10 cm x 15 cm
power consumption less than 3.0 W
mass less than 1.5 kg
imaging of preferably R, G, B, NIR, MIR and TIR bands
system should survive the space environment, including ‘accidental’ sun exposures
operational lifetime should be at least five years

5 How can we achieve this?
sensors with better quantum efficiencies and lower noise
deployable instead of rigid optical systems
in-orbit calibration instead of on-ground calibration

6 Potential Applications
When using a constellation of nanosatellites, the temporal resolution of remote sensing can be increased for:
Google Earth-like applications
Precision agriculture
Climate monitoring
Disaster prevention & monitoring
Military intelligence

7 Contents
Basics of Remote Imaging
CCD & CMOS
Orbit
Noise
Motion Compensation
Questions?

8 2. Basics of Remote Imaging

9 Basics of Remote Imaging
Sensing in the (optical) EM spectrum
Multispectral: multiple bandwidths
UV: ultraviolet, 0.01–0.4 μm
VIS: the visible range, 0.4–0.7 μm
NIR: the near infrared range, 0.7–1.4 μm
SWIR: the short wave infrared range, 1.4–3 μm
MWIR: the midwave infrared range, 3–8 μm
LWIR: the long wave infrared range, 8–15 μm

10 Resolution Spatial Spectral Radiometric Temporal

11 Spatial Resolution
Smallest measure of separation between two objects that can be resolved by the system [T.A. Warner et al, 2009]
Rayleigh Criterion
Nominal Spatial Resolution
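The Rayleigh criterion above can be turned into a quick back-of-the-envelope check. A minimal sketch (the 8 cm aperture and 400 km altitude are illustrative values in line with the constraints earlier in the deck, not the actual design):

```python
def rayleigh_resolution(wavelength_m, aperture_m, altitude_m):
    """Diffraction-limited ground resolution from the Rayleigh criterion:
    theta = 1.22 * lambda / D, projected to the ground from orbit."""
    theta = 1.22 * wavelength_m / aperture_m   # angular resolution, radians
    return altitude_m * theta                  # metres on the ground

# Green light (550 nm), 8 cm aperture, 400 km altitude:
gsd = rayleigh_resolution(550e-9, 0.08, 400e3)
print(gsd)  # roughly 3.4 m ground resolution limit
```

Note how quickly the small aperture of a nanosatellite becomes the limiting factor: halving the aperture doubles the best achievable ground resolution.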

12 Minimal Spatial Resolution

13 Spectral Resolution
Unitless ratio: λ/Δλ (spectral resolving power)

14 Radiometric Resolution
How fine a difference in incident spectral radiance can be measured by the sensor [T.A. Warner, M. Duane Nellis, G.M. Foody, 2009] Quantization of incoming radiation
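The quantization step can be sketched numerically. A minimal illustration (the linear full-scale normalisation is an assumption for the example, not a statement about the actual sensor electronics):

```python
def quantize(radiance, max_radiance, bits):
    """Quantize incoming spectral radiance into 2**bits digital levels,
    assuming a linear response normalised to full scale."""
    levels = 2 ** bits
    dn = int(radiance / max_radiance * (levels - 1))  # digital number
    return min(max(dn, 0), levels - 1)                # clamp to valid range

# An 8-bit sensor distinguishes 256 levels; a 12-bit sensor, 4096:
print(quantize(0.5, 1.0, 8))   # mid-scale radiance maps to DN 127
```

More bits means finer radiometric resolution, at the cost of more data to store and downlink.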

15 Temporal Resolution
Time to refresh images of the same area (revisit time)

16 EM Propagation and Sensors
Optical sensors act like photon detectors
Energy needed to excite an electron across the band gap: E = hc/λ ≥ E_gap
The band gap therefore sets an upper limit on the detectable wavelength
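The band-gap cutoff follows directly from Planck's relation, as a quick check with standard physical constants:

```python
# A photon is detected only if E = h*c/lambda >= E_gap, so the band gap
# sets an upper cutoff wavelength lambda_max = h*c/E_gap.
H = 6.626e-34    # Planck constant, J*s
C = 2.998e8      # speed of light, m/s
EV = 1.602e-19   # joules per electronvolt

def cutoff_wavelength_um(bandgap_ev):
    return H * C / (bandgap_ev * EV) * 1e6  # metres -> micrometres

print(cutoff_wavelength_um(1.12))  # silicon (1.12 eV): about 1.1 um
```

This reproduces why silicon works up to roughly 1 μm, and why narrower-gap materials are needed for the infrared bands listed on the next slide.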

17 Materials
Silicon (Si), 0.4–1 μm
Indium Gallium Arsenide (InGaAs), 0.9–1.7 μm
Indium Antimonide (InSb), 1–5.5 μm
Mercury Cadmium Telluride (HgCdTe), roughly 2–16 μm (tunable by alloy composition)
Wavelength absorption of different InGaAs alloys.

18 Beam splitters
Split a light beam into two separate beams
Types: Cube, Plate, Pellicle
Cube and Plate: only monochromatic light, heavy
Pellicle: average transmission 50% (375–2400 nm), light, little ghosting, sensitive to vibrations

19 3. CCD & CMOS

20 CCD (Charge-coupled device)
‘Traditional leader’
Great fill factor (~95%)
High quantum efficiency
QE: the percentage of photons hitting the photoreactive surface that produce an electron–hole pair.
Fill factor: the fraction of the pixel that is sensitive to light. With a CCD this may be as high as 95%; with a CMOS sensor it can be much lower, as small as 50–60%, and a micro-lens is needed to make best use of the light falling onto it.

21 CMOS (Complementary metal-oxide-semiconductor)
Less circuitry required
Low power consumption
Pixels can be read out individually
Cheap

22 CCD vs CMOS
CCD:
+ creates high-quality, low-noise images
+ greater light sensitivity (fill factor, QE)
- consumes up to 100 times more power
- complex to manufacture
- expensive
CMOS:
- more susceptible to noise
- lower light sensitivity
+ consumes little power
+ easy to manufacture
+ cheap

23 4. Orbit

24 Orbit
Dusk-dawn orbit: no eclipse, so no power storage needed. Same illumination conditions on the Earth's surface. Easy data collection.
Altitude = 400 km: typical nanosatellite perigee height. Lower altitude means better resolution, but also more drag. Design lifetime of 5 years achievable.
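As a side note to the 400 km figure, the orbital period follows directly from Kepler's third law; a quick check with standard constants:

```python
import math

MU = 3.986e14      # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6.371e6  # mean Earth radius, m

def orbital_period_min(altitude_m):
    """Period of a circular orbit: T = 2*pi*sqrt(r^3 / mu)."""
    r = R_EARTH + altitude_m
    return 2 * math.pi * math.sqrt(r ** 3 / MU) / 60.0

print(f"{orbital_period_min(400e3):.1f} min")  # ≈ 92.4 min at 400 km
```

Roughly 15.5 orbits per day, which is what makes constellation-based revisit times attractive.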

25 In-Orbit Calibration
Normally the lens subsystem is calibrated on the ground and designed such that it can withstand the launch without losing focus. This has major disadvantages:
Loss of focus will always be present.
Increased risk.
More mass.

26 In-Orbit Calibration
Why not perform the calibration in orbit?
Advantages: Less risk. Less mass. Can use the same mechanism as a possible deployable lens.
Disadvantages: Less precise calibration. Not a simple solution!

27 In-Orbit Calibration
How it works: the camera takes an image, adjusts the focal length slightly, and takes another image. The two images are then compared; if the adjustment improves image quality, the process is repeated.
Images are compared either on:
The satellite itself – more dedicated electronics. or
The ground – issues with communication.
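The compare-adjust-repeat loop described above is essentially a hill-climbing autofocus. A minimal sketch (the gradient-based sharpness metric, the `capture` interface, and the step size are illustrative assumptions, not the actual flight algorithm):

```python
def sharpness(image):
    """Illustrative focus metric: sum of squared differences between
    horizontally neighbouring pixels (sharper image -> larger gradients)."""
    return sum((a - b) ** 2
               for row in image for a, b in zip(row, row[1:]))

def autofocus(capture, focal_length, step=0.1, max_iters=20):
    """Hill-climb: nudge the focal length, keep the change while it
    improves the sharpness metric; reverse direction once it stops."""
    best = sharpness(capture(focal_length))
    direction = 1
    for _ in range(max_iters):
        trial = focal_length + direction * step
        score = sharpness(capture(trial))
        if score > best:
            focal_length, best = trial, score
        elif direction == 1:
            direction = -1   # improvement stalled: try the other direction
        else:
            break            # neither direction helps: converged
    return focal_length
```

On the real system, each `capture` costs an exposure, so the step size trades calibration time against focus precision.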

28 5. Noise

29 Signal to Noise Ratio - SNR
The SNR is a measure of the quality of the captured image. There are two ways to increase the SNR:
Decrease noise as much as possible
Increase the integration time, which reduces the relative impact of noise sources such as shot noise, read-out noise, etc.

30 Thermal Noise
Thermal noise is one of the easiest noise sources to reduce. Materials emit a certain number of electrons according to their temperature. Decreasing the temperature of the sensor decreases thermal noise significantly (roughly a factor of 2 for every 6 degrees of cooling).

31 Thermal Control
In order to decrease thermal noise, the sensor must be cooled. But thermal control of a nanosatellite is difficult due to its limited size:
No active control can be applied
Only passive control is an option: the satellite must be coated with a coating with high emissivity and a low absorption factor.

32 Further increasing the SNR
Other noise contributions depend on the sensor and electronics, such as:
Amplifier noise
1/f noise
Etc.
Quantum efficiency (QE) – a measure of the efficiency of the sensor; this value should be as high as possible within the required spectrum.
Noise should be as low as possible, but lowering it increases the cost of the S/C.

33 Integration Time
Some noise contributions are single-event values; these do not increase with measurement time. When the time in which we view an object (the integration time) is increased, the SNR goes up. This is, however, not a simple task, since the S/C is moving with respect to the ground: blurring effect.

34 Blurring Effect?

35 6. Motion Compensation

36 Motion Compensation
Two ways of doing so:
Mechanical movement of the lens subsystem to track a point on the ground.
Time Delay Integration - TDI.

37 Mechanical movement method
In order to view the same point on the ground, a tilting mirror can be placed in front of the lens subsystem to reflect the rays of light. The motion of this mirror has to be synchronized with the movement of the S/C w.r.t. the ground.
This increases complexity, since moving parts are necessary.
Increase in power consumption.
Increase in development cost.
But a large integration time is achievable.

38 Time Delay Integration - TDI
Time Delay Integration is a method which uses no moving parts. Instead, it uses an array of pixels. When the satellite passes a point on the ground, the first pixel takes a measurement and is read out. Due to the motion of the spacecraft, the same point is seen by the next pixel in the direction of flight after a small time step, and the read-outs of all pixels that saw the point are summed. This holds for the entire pixel array. This method is closely related to push-broom scanning.
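A toy simulation makes the benefit visible: summing n aligned reads of the same ground point multiplies the signal by n while uncorrelated noise grows only with sqrt(n). This is a sketch of the digital-TDI idea, with a 1-D scene and Gaussian read noise as illustrative assumptions:

```python
import random

def tdi_scan(ground, n_stages, noise_sigma=1.0, seed=0):
    """Digital TDI over a 1-D ground track: each of n_stages pixels
    takes one noisy read of the same ground point (one time step
    apart, aligned here by indexing), and the reads are averaged."""
    rng = random.Random(seed)
    out = []
    for i in range(len(ground) - n_stages + 1):
        total = 0.0
        for _ in range(n_stages):
            total += ground[i] + rng.gauss(0.0, noise_sigma)  # one stage's read
        out.append(total / n_stages)  # average of the aligned reads
    return out
```

With 16 stages the residual noise drops to roughly a quarter of the single-read case, without any moving parts.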

39 Time Delay Integration - TDI
Some advantages/disadvantages:
Smaller integration time achievable
No CCD sensor can be used, only CMOS
Much lower mass
Tested and proven method

40 Questions?

