
Presentation on theme: "HOW A SILICON CHIP CAPTURES AN IMAGE"— Presentation transcript:


2 IMAGE SENSORS This chapter will help you understand how the image sensor in your camera works, but first you must understand what the digitizing process is and how it works. We live in an analog world, meaning that light and sound come to us as continuous analog waves that our senses interpret. It is very difficult to invent a technology that can accurately record a continuous analog wave. For example, you can cut a continuous wave into a vinyl record, but because of the limitations of this storage process, the resulting recording is often noisy, scratchy, and unable to capture the full range of sound.

3 IMAGE SENSORS All cameras have certain things in common: a lightproof body, a lens, and a recording medium. Digital photography marks the first time that a nonchemical recording medium has been deployed. The charge-coupled device (CCD), invented in 1969 by George Smith and Willard Boyle, was a new type of semiconductor originally intended for computer memory. Shortly thereafter, a solid-state video camera using the CCD was invented. Since then, CCDs have been used in cameras and fax machines all over the world.

4 COUNTING PHOTONS The image sensor in your camera is covered with a grid of small electrodes called photosites (one for each pixel). Each photosite contains a photodiode and a capacitor.

5 COUNTING PHOTONS When you turn on your camera, it places a uniform charge, or voltage, onto the capacitor at each photosite of its image sensor. When light strikes a particular photosite, it causes the photodiode to drain some of the charge from the capacitor. The amount by which the charge is lowered is directly proportional to the number of photons that strike the photodiode. By measuring the reduction in voltage at a particular photosite, your camera can determine how much light hit that site during the exposure.
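This charge-draining idea can be sketched as a toy model. The starting charge and the drain-per-photon figure below are invented for illustration, not taken from any real sensor:

```python
# Toy model of a photosite: the capacitor starts at a full charge, and
# arriving photons drain it proportionally. All constants are invented.

FULL_CHARGE = 1.0
DRAIN_PER_PHOTON = 0.001  # hypothetical charge drained per photon

def remaining_charge(photon_count):
    """Charge left on the capacitor after the exposure (never below zero)."""
    return max(FULL_CHARGE - photon_count * DRAIN_PER_PHOTON, 0.0)

def light_measured(photon_count):
    """The camera infers light from the *reduction* in voltage."""
    return FULL_CHARGE - remaining_charge(photon_count)

print(light_measured(250))
```

Note that once the capacitor is fully drained, extra photons go unrecorded; that clamp at zero is one way to picture highlight clipping.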

6 COUNTING PHOTONS Most cameras use either a 12-bit or 14-bit analog-to-digital converter, which means that the value from each photosite is converted into a 12- or 14-bit number. With a 12-bit converter, you are looking at a number between 0 and 4,095; with a 14-bit converter, a number between 0 and 16,383.
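A minimal sketch of that quantization step, assuming the photosite voltage has already been normalized to the 0.0–1.0 range:

```python
# Sketch: quantizing a normalized photosite voltage with an N-bit ADC.

def adc_quantize(voltage_fraction, bits):
    """Map a normalized voltage (0.0-1.0) to an integer code 0..2**bits - 1."""
    levels = 2 ** bits
    # Clamp so a full-scale input maps to the top code, not one past it.
    return min(int(voltage_fraction * levels), levels - 1)

print(adc_quantize(1.0, 12))   # 4095, the top of the 12-bit range
print(adc_quantize(1.0, 14))   # 16383, the top of the 14-bit range
```

The extra two bits of a 14-bit converter mean each stop of light is divided into four times as many tonal steps.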

7 COUNTING PHOTONS The term CCD is derived from the way the camera reads the charges of the individual photosites. After exposing the CCD, the charges on the first row of photosites are transferred to a read-out register, where they are amplified and then sent to an analog-to-digital converter. Each row of charges is electrically coupled to the next row, so that after one row has been read and deleted, all of the other rows move down to fill the now-empty space.
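The row-by-row "bucket brigade" readout described above can be sketched like this (the 3×2 grid of charges is invented sample data):

```python
# Sketch of CCD readout: transfer the row nearest the read-out register,
# then let every remaining row shift down one step; repeat until empty.

def ccd_readout(rows):
    """rows: list of rows of charges; returns the rows in read-out order."""
    rows = [list(r) for r in rows]   # copy, since readout empties the sensor
    read_order = []
    while rows:
        read_order.append(rows.pop())  # last row goes to the read-out register
        # the remaining rows implicitly "move down" toward the register
    return read_order

sensor = [[1, 2], [3, 4], [5, 6]]
print(ccd_readout(sensor))  # [[5, 6], [3, 4], [1, 2]]
```

The rows nearest the register come out first; the coupling between rows is what gives the charge-coupled device its name.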

8 A LITTLE COLOR THEORY In 1861, James Clerk Maxwell asked photographer Thomas Sutton to take three black-and-white photographs of a tartan ribbon. Maxwell was working on his theory of color vision. Sutton placed a different filter over the camera for each shot: 1) red, 2) green, 3) blue. Maxwell then projected all three pictures onto a screen using three projectors fitted with the same color filters. Superimposing them created the first-ever color picture!
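Maxwell's trick is exactly how digital color still works: three grayscale exposures, one per filter, stacked into one color image. The 2×2 "plates" below are invented sample data:

```python
# Three tiny grayscale exposures (0-255), one taken through each filter.
red_plate   = [[255, 0], [128, 0]]
green_plate = [[255, 0], [128, 0]]
blue_plate  = [[255, 0], [128, 0]]

def combine(r, g, b):
    """Stack three grayscale images into one image of (R, G, B) tuples."""
    return [[(r[y][x], g[y][x], b[y][x]) for x in range(len(r[0]))]
            for y in range(len(r))]

color = combine(red_plate, green_plate, blue_plate)
print(color[0][0])  # (255, 255, 255): equal full R, G, B projects as white
```

Where all three plates are bright, the superimposed result is white; where all are dark, it is black.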

9 A LITTLE COLOR THEORY In 1903, the Lumière brothers used red, green, and blue dyes to color grains of starch that could be applied to glass plates to create color images. This process was known as Autochrome. Red, green, and blue are the three colors that can be mixed together to create all other colors. A digital image is likewise composed of three different black-and-white images, which are combined to create a full-color image.

10 INTERPOLATING COLOR By measuring photons, an image sensor can determine how much light has struck each part of its surface, but the data that comes off the sensor doesn't contain any information about color. At its heart, an image sensor is a grayscale device; to produce color, the camera must perform an interpolation (an educated guess).

11 INTERPOLATING COLOR Each photosite on your camera's image sensor is covered by a filter: red, green, or blue. This combination of filters is called a "COLOR FILTER ARRAY," and most image sensors use a filter pattern called a "BAYER PATTERN."
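The Bayer layout can be generated with a couple of lines. The GR/BG row ordering below is one common variant (an assumption here; real sensors differ in which corner starts the pattern):

```python
# One common Bayer color-filter-array layout: rows alternate G-R and B-G,
# so half of all photosites are green, roughly matching the eye's
# heightened sensitivity to green light.

def bayer_filter(x, y):
    """Return the filter color ('R', 'G', or 'B') over photosite (x, y)."""
    if y % 2 == 0:
        return "G" if x % 2 == 0 else "R"
    return "B" if x % 2 == 0 else "G"

for y in range(4):
    print(" ".join(bayer_filter(x, y) for x in range(4)))
```

Counting the output confirms the ratio: in any 2×2 cell there are two green filters, one red, and one blue.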

12 INTERPOLATING COLOR The camera can calculate the color of any given pixel by analyzing all of the adjacent pixels. EXAMPLE: If you look at a particular pixel and see that the pixel to its immediate left is bright red, the pixel to its right is bright blue, and the pixels above and below are bright green, then that pixel is most likely white. Do not forget: a perfect combination of red, green, and blue creates the color white.
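The slide's example can be worked through numerically. Averaging each color channel over the neighborhood (one simple interpolation scheme, not necessarily what any specific camera does) lands on white:

```python
# Neighbors of the pixel in question, per the example: bright red to the
# left, bright blue to the right, bright green above and below (0-255).
neighbors = {
    "red":   [255],        # pixel to the left
    "green": [255, 255],   # pixels above and below
    "blue":  [255],        # pixel to the right
}

def estimate_rgb(n):
    """Average each channel's neighboring samples into one (R, G, B) guess."""
    return tuple(sum(vals) // len(vals)
                 for vals in (n["red"], n["green"], n["blue"]))

print(estimate_rgb(neighbors))  # (255, 255, 255): full R + G + B reads as white
```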

13 INTERPOLATING COLOR The process of interpolating is known as demosaicing. Some cameras analyze a region of up to 9 × 9 pixels. Some cameras use a different type of color filter array, such as cyan, yellow, and green.

14 Before the light from your lens ever strikes the sensor, it passes through several filters: an infrared filter and a low-pass filter. After you take a picture, the data is read off the image sensor, amplified, passed through an analog-to-digital converter, and then handed to your camera's on-board processor.
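That read/amplify/digitize chain can be strung together from the pieces above. The gain and bit depth here are invented placeholders:

```python
# Miniature version of the capture pipeline: raw photosite values are
# amplified, digitized by an ADC, then handed onward. Gain and bit depth
# are illustrative assumptions, not real camera parameters.

def capture_pipeline(raw_values, gain=2.0, bits=12):
    levels = 2 ** bits
    digital = []
    for v in raw_values:                    # read off the sensor (0.0-1.0)
        amplified = min(v * gain, 1.0)      # analog amplification, clipped
        code = min(int(amplified * levels), levels - 1)  # A/D conversion
        digital.append(code)                # sent on to the processor
    return digital

print(capture_pipeline([0.1, 0.4, 0.9]))
```

Note how the brightest input clips: once amplification pushes a value past full scale, the ADC can only report its top code.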

15 First the image goes through a "colorimetric adjustment." During demosaicing, the camera's processor knows that a particular pixel was red, green, or blue, but it does not know which precise shades of those primary colors were used in the filters. So the color needs to be skewed slightly to compensate for the specific color qualities of the color filter array.
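One common way to picture such an adjustment is a 3×3 matrix multiply that nudges each raw RGB value toward true color. The matrix below is purely illustrative, not a real camera's calibration:

```python
# Hypothetical colorimetric-adjustment matrix: each row sums to 1.0, so
# neutral grays pass through unchanged while colors are slightly skewed.
MATRIX = [
    [1.10, -0.05, -0.05],
    [-0.05, 1.10, -0.05],
    [-0.05, -0.05, 1.10],
]

def adjust(rgb):
    """Apply the 3x3 correction matrix to one (R, G, B) value (0.0-1.0)."""
    return tuple(sum(MATRIX[i][j] * rgb[j] for j in range(3))
                 for i in range(3))

print(adjust((1.0, 1.0, 1.0)))  # neutral white stays (nearly) neutral
```

Because the rows sum to one, white balance is preserved while the slight negative off-diagonal terms boost saturation lost to imperfect filter dyes.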

16 A color space is a mathematical model that can be used to represent colors. EXAMPLE: 100% red in one color space might be a different shade from 100% red in another color space. Without a color space, your image would be a meaningless array of numbers.

17 GAMMA CORRECTION In an imaging chip, when twice as much light hits a pixel, twice as much change in voltage is produced. In other words, the response of the pixels to light is linear: INCREASE IN LIGHT = LINEAR CHANGE IN VOLTAGE.

