
Fundamental concepts in video


1 Fundamental concepts in video

2 Outline Types of Video Signals Analog Video Digital Video

3 Introduction This chapter introduces the principal notions needed to understand video. Digital video compression will be explored later.

4 Introduction Since video is created from a variety of sources, we begin with the signals themselves. Analog video is represented as a continuous (time- varying) signal. Digital video is represented as a sequence of digital images.

5 Video is the technology of electronically capturing, recording, processing, storing, transmitting, and reconstructing a sequence of still images representing scenes in motion.

6 Basic Concepts (Video Representation)
The human eye views video; immanent properties of the eye determine essential conditions for video systems. Video signal representation consists of three aspects: Visual representation: the objective is to offer the viewer a sense of presence in the scene and of participation in the events portrayed. Transmission: video signals are transmitted to the receiver through a single television channel. Digitization: analog-to-digital conversion, sampling of gray (color) levels, quantization.

7 Aspect Ratio Aspect ratio describes the dimensions of video screens and video picture elements. All popular video formats are rectilinear, and so can be described by a ratio between width and height. The screen aspect ratio of a traditional television screen is 4:3. High-definition televisions use an aspect ratio of 16:9.
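The width:height ratios above can be recovered from pixel resolutions by reducing the fraction; a minimal sketch (the resolutions 640×480 and 1920×1080 are common examples, not taken from the slide):

```python
from math import gcd

def aspect_ratio(width, height):
    """Reduce a pixel resolution to its simplest width:height ratio."""
    d = gcd(width, height)
    return (width // d, height // d)

# A standard-definition and a high-definition frame size:
print(aspect_ratio(640, 480))    # -> (4, 3)
print(aspect_ratio(1920, 1080))  # -> (16, 9)
```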

8 Chrominance Chrominance (chroma for short) is the signal used in video systems to convey the color information of the picture, separately from the accompanying luma signal. Chrominance is usually represented as two color-difference components: U = B'−Y' (blue minus luma) and V = R'−Y' (red minus luma). Each of these difference components may have scale factors and offsets applied to it, as specified by the applicable video standard. Luma represents the brightness in an image.
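The color-difference definitions above can be sketched directly; the luma weights used here are the BT.601 values (an assumption, since the slide does not name a standard), and the scale factors and offsets the slide mentions are omitted:

```python
def rgb_to_yuv(r, g, b):
    """Convert gamma-corrected R'G'B' values (each 0..1) to luma plus
    two color-difference components. Luma weights are BT.601's;
    real standards also scale and offset U and V, omitted here."""
    y = 0.299 * r + 0.587 * g + 0.114 * b  # luma Y'
    u = b - y                              # U = B' - Y'
    v = r - y                              # V = R' - Y'
    return y, u, v

# A pure gray pixel carries no chroma: U and V come out (near) zero.
print(rgb_to_yuv(0.5, 0.5, 0.5))  # close to (0.5, 0.0, 0.0)
```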

9 Types of Video Signals Video signals can be organized in three different ways: component video, composite video, and S-video. Component video In popular use, it refers to a type of analog video information that is transmitted or stored as three separate signals for the red, green, and blue image planes. Each color channel is sent as a separate video signal. This kind of system uses three wires (and connectors) connecting the camera or other devices to a TV or monitor. Most computer systems use component video, with separate signals for R, G, and B. For any color separation scheme, component video gives the best color reproduction, since there is no "crosstalk" between the three channels. This is not the case for S-video or composite video, discussed next. Component video, however, requires more bandwidth and good synchronization of the three components.

10 Component video

11 Composite Video — 1 Signal
Composite video: color ("chrominance") and intensity ("luminance") signals are mixed into a single carrier wave. This type of signal is used by broadcast color TVs; it is downward compatible with black-and-white TV. When connecting to TVs, composite video uses only one wire, and the video color signals are mixed, not sent separately. The audio and sync signals are additions to this one signal. Since color and intensity are wrapped into the same signal, some interference between the luminance and chrominance signals is inevitable.

12 S-Video (separate video) — 2 Signals
S-video, as a compromise, uses two wires: one for luminance and another for a composite chrominance signal. As a result, there is less crosstalk between the color information and the crucial gray-scale information. The reason for placing luminance into its own part of the signal is that black-and-white information is most crucial for visual perception. In fact, humans are able to differentiate spatial resolution in grayscale images with much higher acuity than in the color part of color images. As a result, we can send less accurate color information than must be sent for intensity information: we can only see fairly large blobs of color, so it makes sense to send less color detail.

13 Analog Video Most TV is still sent and received as analog signal.
An analog signal f(t) samples a time-varying image. So-called "progressive" scanning traces through a complete picture (a frame) row-wise for each time interval. A high-resolution computer monitor typically uses a time interval of 1/72 second. In TV, and in some monitors and multimedia standards as well, another system, called "interlaced" scanning, is used: the odd-numbered lines are traced first, and then the even-numbered lines are traced. This results in "odd" and "even" fields; two fields make up one frame. In fact, the odd lines (starting from 1) end up at the middle of a line at the end of the odd field, and the even scan starts at a half-way point.

14 Analog Video Figure 5.1 shows the scheme used. First the solid (odd) lines are traced, P to Q, then R to S, etc., ending at T; then the even field starts at U and ends at V. The jump from Q to R, etc. in Figure 5.1 is called the horizontal retrace, during which the electron beam in the CRT is blanked. The jump from T to U or V to P is called the vertical retrace. The scan lines are not horizontal because a small voltage is applied, moving the electron beam down over time.

15 Analog Video Interlacing was invented because, when standards were being defined, it was difficult to transmit the amount of information in a full frame quickly enough to avoid flicker; doubling the number of fields presented to the eye reduces perceived flicker. Because of interlacing, the odd and even lines are displaced in time from each other, which is generally not noticeable except when very fast action is taking place on screen, when blurring may occur. Since it is sometimes necessary to change the frame rate, resize, or even produce stills from an interlaced source video, various schemes are used to "de-interlace" it.

16 Analog Video The simplest de-interlacing method consists of discarding one field and duplicating the scan lines of the other field. The information in one field is lost completely using this simple technique. Other more complicated methods that retain information from both fields are also possible.
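The simplest method described above, discarding one field and duplicating the other's scan lines, can be sketched as follows (the frame is modeled as a plain list of scan lines; the choice to keep the even-indexed lines is arbitrary):

```python
def deinterlace_line_double(frame):
    """De-interlace by keeping one field (the even-indexed scan lines
    here) and duplicating each kept line to fill the positions of the
    discarded field. The discarded field's information is lost."""
    kept = frame[0::2]          # keep one field, discard the other
    out = []
    for line in kept:
        out.append(line)        # kept scan line
        out.append(line)        # duplicate replaces the discarded line
    return out[:len(frame)]     # trim in case of an odd line count

frame = ["odd1", "even1", "odd2", "even2"]
print(deinterlace_line_double(frame))  # -> ['odd1', 'odd1', 'odd2', 'odd2']
```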

17 NTSC Video The NTSC (National Television System Committee) TV standard is mostly used in North America and Japan. It uses the familiar 4:3 aspect ratio (i.e., the ratio of picture width to its height) and uses 525 scan lines per frame at 30 frames per second (fps). The problem is that NTSC is an analog system. In computer video, colors and brightness are represented by numbers (digital). But with analog television, everything is just voltages, and voltages are affected by wire length, connectors, heat, cold, video tape, and so on. NTSC follows the interlaced scanning system, and each frame is divided into two fields, with 262.5 lines/field. Thus the horizontal sweep frequency is 525 × 29.97 ≈ 15,734 lines/sec.
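The sweep-frequency figure above is just lines per frame times frames per second; a quick check (NTSC's exact color frame rate is 30000/1001 ≈ 29.97 fps):

```python
# Horizontal sweep frequency = scan lines per frame x frames per second.
lines_per_frame = 525
fps = 30000 / 1001           # NTSC's exact rate, ~29.97 fps
sweep = lines_per_frame * fps
print(round(sweep))          # -> 15734 lines/sec, matching the slide
```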

18 PAL Video PAL (Phase Alternating Line) is a TV standard widely used in Western Europe, China, India, and many other parts of the world. PAL uses 625 scan lines per frame, at 25 frames/second, with a 4:3 aspect ratio and interlaced fields.

19 Digital video The advantages of digital representation for video are many. For example: Video can be stored on digital devices or in memory, ready to be processed (noise removal, cut and paste, etc.) and integrated into various multimedia applications; Direct access is possible, which makes nonlinear video editing achievable as a simple, rather than a complex, task; Repeated recording does not degrade image quality; Ease of encryption and better tolerance to channel noise.

20 Chroma Subsampling Chroma subsampling is the practice of encoding images with less resolution for chroma information than for luma information. It is used in many video encoding schemes, both analog and digital. Because of storage and transmission limitations, there is always a desire to reduce (or compress) the signal. Since the human visual system is much more sensitive to variations in brightness than in color, a video system can be optimized by devoting more bandwidth to the luma component (usually denoted Y') than to the color-difference components Cb and Cr. The signal is divided into a luma (Y') component and two color-difference components (chroma).
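One common way to spend less resolution on chroma is 4:2:0-style subsampling: each chroma plane is halved in both directions while the luma plane is untouched. A minimal sketch using plain 2×2 block averaging (real codecs use specific filters and sample siting, which vary by standard):

```python
def subsample_420(chroma):
    """Average each 2x2 block of a chroma plane, halving its resolution
    horizontally and vertically (4:2:0-style). `chroma` is a list of
    rows of numbers with even width and height."""
    h, w = len(chroma), len(chroma[0])
    return [
        [
            (chroma[y][x] + chroma[y][x + 1]
             + chroma[y + 1][x] + chroma[y + 1][x + 1]) / 4
            for x in range(0, w, 2)
        ]
        for y in range(0, h, 2)
    ]

cb = [[10, 20, 30, 40],
      [10, 20, 30, 40]]
print(subsample_420(cb))  # -> [[15.0, 35.0]]
```

Since only the two chroma planes shrink, total samples drop from 3 × W × H to 1.5 × W × H, half the data, with little perceived loss.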

21 Chroma Subsampling

22 CCIR Standards for Digital Video
CCIR is the Consultative Committee for International Radio; one of the most important standards it has produced is CCIR-601, for component digital video. Table 5.3 shows some of the digital video specifications, all with an aspect ratio of 4:3. The CCIR 601 standard uses an interlaced scan, so each field has only half as much vertical resolution as the full frame.

23 CCIR Standards for Digital Video

24 CCIR Standards for Digital Video
CIF stands for Common Intermediate Format, specified by the CCITT (International Telegraph and Telephone Consultative Committee). The idea of CIF is to specify a format for a lower bitrate. QCIF stands for "Quarter-CIF".
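The "Quarter" in Quarter-CIF refers to sample count: the luma resolutions commonly given for these formats (352×288 for CIF and 176×144 for QCIF, values not shown on this slide) differ by a factor of two in each dimension, so QCIF carries a quarter of CIF's samples:

```python
# CIF and QCIF luma resolutions as commonly specified (H.261 family).
cif = (352, 288)
qcif = (176, 144)   # half the width, half the height

cif_samples = cif[0] * cif[1]     # 101376
qcif_samples = qcif[0] * qcif[1]  # 25344
print(qcif_samples / cif_samples) # -> 0.25, i.e. a quarter of CIF
```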

25 High definition TV (HDTV)
HDTV refers to video having resolution substantially higher than that of traditional television systems. HD has one to two million pixels per frame. The first generation of HDTV was based on an analog technology developed by Sony in Japan in the late 1970s.
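The "one to two million pixels" figure matches the common HD frame sizes 1280×720 and 1920×1080 (these resolutions are assumed here; the slide does not list them):

```python
# Pixels per frame for the two common HD frame sizes.
for w, h in [(1280, 720), (1920, 1080)]:
    print(f"{w}x{h}: {w * h / 1e6:.2f} million pixels")
# 1280x720  -> 0.92 million pixels
# 1920x1080 -> 2.07 million pixels
```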

