Video Fundamentals Signal Processing for Digital TV


Video Fundamentals: Signal Processing for Digital TV
Presented to the IEEE OC Computer Society by Dwight Borses, MTS FAE, National Semiconductor, Irvine
Original presentation materials developed by Dr. Nikhil Balram, CTO, and Dr. Gwyn Edwards, TME, National Semiconductor Displays Group
Fall 2004 FAE Meeting. Developed and delivered by: Gwyn Edwards, Nikhil Balram

Tonight’s Presentation
Digital Television architecture and functionality
NTSC background (with a touch of PAL and SECAM)
Major video processing building blocks
Application examples

Definition of Digital Television
Any display system that has digital processing of one or more of its inputs and includes the function of showing video
Divide into 4 segments:
Monitor/TV – digital monitor with video function, e.g. a 15” XGA LCD Monitor/TV
TV/Monitor – HD-Ready TV / EDTV / SDTV / 100 Hz TV; digital displays often include monitor as an additional function
MPEG TV – integrated HDTV (USA); iDTV [Integrated Digital TV] (Europe); BS Digital (Japan)
Smart (IP) TV – Internet connectivity, built-in hard drive (PVR), interactivity, etc.

Digital Television segments (block diagram)
Monitor/TV: baseband front-end (ADC, DVI-HDCP), 2D video decoder, display processor (2D deinterlacer, scaling, color management, OSD), RF front end (type A) (NTSC/PAL/SECAM)
TV/Monitor: SDTV/EDTV/HDTV-Ready; adds 3D deinterlacer, dual scalers, intelligent color management, FRC, 3D decoder
MPEG TV: HDTV plus ATSC tuner, VSB/QAM/QPSK receiver, MPEG processor, transport demux, multiple-stream decoder
Smart TV: iPTV; HDTV plus media communication processor

The Modern “HD Ready” TV Set
[Signal chain diagram: RF source(s) → tuner & demodulator → video decoder → deinterlacer, scaler & CSC → display electronics (display dependent) → display, e.g. CRT, LCD, DLP, LCOS, plasma; sound IF (SIF, 4.5 to 6.5 MHz) → audio decoder & amplifiers; other video sources arrive as CVBS, YCbCr or RGB]
The tuner extracts one TV channel at a time from many, then downconverts and demodulates the signal to “baseband”
The video (or “color”) decoder separates the colors from the “composite” (CVBS) signal
The deinterlacer and scaler convert the format of the picture to match that of the display (optional for CRT TVs)
The display electronics convert the signal format to match the display type: e.g. analog for a CRT, LVDS for an LCD panel

Functional System Architecture for MPEG TV
[Block diagram:
Front ends: ATSC/NTSC/PAL tuners → VSB/QAM receiver → MPEG decoder (audio out); IR/keypad and system CPU; VBI/CC and teletext extraction
Analog inputs: CVBS and Y/C into 3D decoders; YUV, YPrPb (HD) and RGB (VGA)/YUV through ADC/sync; DVI-HDCP receiver for digital input
Processing: input selector → 3D deinterlacer & NR → scaler → OSD blending & color management → output format (TTL, LVDS, analog) via display I/F; frame buffer in SDRAM]

System Interfaces
RF – NTSC; RF – ATSC
Baseband analog NTSC: Composite (CVBS), S-Video (Y/C), Component (YUV)
Analog HD component (YPbPr)
Analog PC graphics (VGA)
Digital PC graphics (DVI-HDCP)
Digital HD: DVI-HDCP [HDCP: High-bandwidth Digital Content Protection] from the PC space, used by STBs and the current generation of HD-Ready TVs
HDMI – new CE version of DVI; adds audio, video formats, control functions

High Definition Multimedia Interface (HDMI)
HDMI is DVI plus: audio, support for YCbCr video, a CE control bus, additional control and configuration capabilities, and a small CE-friendly connector
HDMI enables device communication: to the source (supported video and audio formats) and to the display (video and audio stream information)
Developed by the HDMI Working Group: Hitachi, Panasonic, Philips, Silicon Image, Sony, Thomson, Toshiba
1.0 specification released Dec. 2002
Information courtesy of Silicon Image Inc.

Human Visual System (HVS)
HVS properties influence the design and tradeoffs of imaging/video systems; a basic understanding of the “early vision” model as a signal processing (input/output) system is required to understand the major video tradeoffs
Basic properties of the HVS “front end”: 4 types of photoreceptors in the retina – rods and 3 types of cones
Rods: achromatic (no concept of color), used for scotopic vision (low light levels), concentrated in the periphery
Cones: 3 types – S (short), M (medium), L (long wavelength), with blue, green, and red peaks – used for photopic vision (daylight levels), concentrated in the fovea (center of the retina)

Major HVS Properties
Tradeoff in resolution between space and time: low resolution for high spatial AND high temporal frequencies; however, eye tracking can convert a fast-moving object into a low retinal frequency
Achromatic versus chromatic channels: the achromatic channel has the highest spatial resolution; the Yellow/Blue channel has lower spatial resolution than the Red/Green channel

Color Television Systems Color TV systems developed in ’50s (NTSC) and ’60s (PAL) Backward compatibility with monochrome TVs more important than color quality! Basic parameters of signal (carrier frequencies, bandwidths, modulation format, etc.) had to remain unchanged NTSC and PAL systems added chrominance (color) information to luminance (brightness) signal in a manner transparent to monochrome TVs

NTSC Fundamentals
Approved in the US by the FCC in 1953 as a color system compatible with the existing 525-line, 60 fields/sec, 2:1 interlaced monochrome system
Color added to the existing luminance structure by interleaving luma and chroma in the frequency domain
Basic properties:
525 lines/frame, 2:1 interlace → 2 fields/frame with 262.5 lines/field
Field rate 59.94 Hz
Line frequency fH = 15.734 kHz
Chroma subcarrier frequency fSC = 3.58 MHz = 227.5 fH = 119437.5 fV, chosen so that consecutive lines and frames have opposite (180°) phase
Luma: Y = 0.299 R' + 0.587 G' + 0.114 B', where R', G', B' are gamma-corrected R, G, B
Chroma: I (in-phase) and Q (quadrature) used instead of the color difference signals U, V
U = 0.492 (B' − Y), V = 0.877 (R' − Y)
I = V cos 33° − U sin 33°, Q = V sin 33° + U cos 33°
Composite = Y + Q sin(ωt) + I cos(ωt) + sync + blanking + color burst, where ω = 2π fSC
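As a concrete check of these equations, a small sketch (Python; R'G'B' values assumed in the 0..1 range, with the 33° I/Q rotation as given on the slide):

```python
import math

def rgb_to_yiq(r, g, b):
    """Convert gamma-corrected R'G'B' (0..1) to NTSC Y, I, Q.

    Follows the slide's equations: luma weights, U/V color
    differences, then I/Q as V/U rotated by 33 degrees.
    """
    y = 0.299 * r + 0.587 * g + 0.114 * b
    u = 0.492 * (b - y)
    v = 0.877 * (r - y)
    c, s = math.cos(math.radians(33)), math.sin(math.radians(33))
    i = v * c - u * s   # in-phase component
    q = v * s + u * c   # quadrature component
    return y, i, q
```

For pure gray (R' = G' = B') both color differences vanish, so I = Q = 0 and the composite carries no chroma, which is exactly the monochrome-compatibility property the slide describes.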

Monochrome TV Signals (NTSC)
[Spectrum diagram: video signal with the audio sub-carrier 4.5 MHz above the video carrier]
In the NTSC monochrome system the luminance signal is AM/VSB (Amplitude Modulation / Vestigial Sideband) modulated onto the video carrier
The sound signal is FM modulated onto the audio sub-carrier located 4.5 MHz from the video carrier

Spectrum of Monochrome TV Signal (NTSC)
[Spectrum detail: harmonics spaced at the line frequency (15.734 kHz), with fine structure spaced at the frame frequency (30 Hz)]
The spectrum of the video extends from just below the video carrier frequency to just below the sound carrier
The repetitive nature of the signal from line to line and frame to frame results in a “picket-fence”, or comb, spectrum

Color in Video
In the PC space, R+G+B signals generate color images
In the video space, color signals were developed for backward compatibility with monochrome TVs
Image brightness is represented by the luma signal (Y), the equivalent of the monochrome TV signal
Color is added with “color difference” signals: Cb and Cr
A matrix equation translates between the color spaces:
Y (luma) = 0.299 R' + 0.587 G' + 0.114 B'
Cb (blue chroma) = 0.492 (B' − Y)
Cr (red chroma) = 0.877 (R' − Y)

Principles of NTSC Color System
Takes advantage of the spectral nature of the luminance signal
Recognizes that the human eye is less sensitive to color changes than to luma changes
Low-bandwidth chrominance information is modulated onto a color sub-carrier and added to the luma signal
The chroma signal also has a picket-fence spectrum; the sub-carrier frequency is very carefully chosen so that the pickets of the chroma signal are interleaved between those of the luma signal
fSC = 227.5 × fH = 3.579545 MHz
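The half-integer multiple (227.5) is the whole trick, and it is easy to verify numerically (the line frequency below is the nominal NTSC value):

```python
f_h = 15_734.2657   # NTSC line frequency, Hz (nominal)
f_sc = 227.5 * f_h  # chroma sub-carrier: a half-integer multiple of f_h

# Luma energy clusters at integer multiples of f_h; because 227.5 is a
# half-integer, every chroma "picket" (f_sc + n*f_h) falls midway
# between two luma pickets.
print(round(f_sc))       # ≈ 3579545 Hz
offset = (f_sc % f_h) / f_h
print(round(offset, 3))  # 0.5 -> chroma pickets interleaved with luma
```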

Why a weird number like 59.94 Hz?
Early TV systems used the local power line frequency as the field rate reference: Europe used 50 Hz, the USA 60 Hz
With the introduction of color, the audio sub-carrier frequency required an integer relationship to the line (and hence color sub-carrier) frequency to prevent interference
The nearest such value to the original 4.500 MHz was 4.5045 MHz, too large a difference for backward compatibility
Reducing the field rate from 60 to 59.94 Hz made an audio sub-carrier value of 4.49999 MHz possible
This is close enough to 4.5 MHz, solving the problem
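A quick check of this arithmetic, using the standard NTSC inter-carrier relationship (audio offset = 286 × line frequency):

```python
lines_per_frame = 525

# Original monochrome timing: 60 Hz fields, 2:1 interlace
f_h_60 = lines_per_frame * 60 / 2        # 15750 Hz
audio_60 = 286 * f_h_60                  # 4.5045 MHz: too far from 4.5 MHz

# Reduced field rate: 60 / 1.001 = 59.94 Hz
f_h_color = lines_per_frame * (60 / 1.001) / 2   # ~15734.27 Hz
audio_color = 286 * f_h_color            # ~4.49999 MHz: close enough
print(audio_60, round(audio_color))
```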

The Result
[Spectrum diagram: chroma component pickets interleaved with luma pickets, with low-level mixing where they overlap]
The chroma components can be mostly separated from the luma with a comb filter
Note the mixing of lower-level luma and chroma components, resulting in residual cross-luma and cross-color artifacts

Implementation of NTSC Color
Gamma correction is applied to adjust for CRT non-linearity
Component color signals (R', G' and B') are converted to luma and color-difference signals with a matrix circuit
Cb and Cr are lowpass filtered, then quadrature modulated (QAM) onto the chroma sub-carrier
Amplitude of the modulated signal represents the color saturation of the video; phase represents the hue
Chroma levels are chosen such that the peak level of the composite signal does not exceed 100 IRE with 75% color bars
[Block diagram: R, G, B → gamma correction → R', G', B' → matrix → Y, Cb, Cr; Cb/Cr → LP filter → modulation by sin/cos from the sub-carrier generator → composite]
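The QAM step can be sketched as follows; the function names are illustrative, and sync, blanking and burst are omitted:

```python
import math

F_SC = 3_579_545.0  # NTSC chroma sub-carrier, Hz

def modulate_chroma(cb, cr, t):
    """Quadrature (QAM) modulation of the two color-difference signals
    onto the sub-carrier, as in the slide's block diagram."""
    w = 2 * math.pi * F_SC
    return cb * math.sin(w * t) + cr * math.cos(w * t)

def saturation_and_hue(cb, cr):
    """Polar view of the same signal: the amplitude of the modulated
    chroma is the saturation, its phase is the hue."""
    return math.hypot(cb, cr), math.degrees(math.atan2(cr, cb)) % 360.0
```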

Spectrum of the NTSC Color Signal
[Spectrum: video carrier; color sub-carrier at 3.58 MHz; audio sub-carrier at 4.5 MHz; Cb BW = 0.6 MHz; Cr BW = 0.6 MHz or 1.3 MHz]
The full chroma signal bandwidth, ±1.3 MHz around the sub-carrier, is too wide for transmission within the channel allocation
Usually both Cb and Cr bandwidths are reduced to 600 kHz, which also reduces cross-color and cross-luma in the TV
Alternatively, compliant with the true NTSC specification, the Cb component (only) can be band-limited to 600 kHz and the phase of the sub-carrier rotated by 33°, putting flesh tones at around 0°; this results in an asymmetrical signal (shown)
The rotation aligns flesh tones to the I axis and is transparent to the demodulator since the color burst is rotated by the same amount

The NTSC Color Video Signal (EIA 75% color bar signal)
[Waveform: bar compositions are White = R+G+B, Yellow = R+G, Cyan = B+G, Green = G, Magenta = R+B, Red = R, Blue = B, Black = 0]

NTSC Color Video Signal, EIA 75% Color Bar Signal
[Waveform, −40 to 100 IRE: sync, blanking interval with color burst (9 cycles on the backporch, phase = 0°), then the visible line interval with bars White (white level), Yellow (phase = 167°), Cyan (283°), Green (241°), Magenta (61°), Red (103°), Blue (347°), Black, Blank]

PAL Fundamentals
European standard with many flavors; broadcasting began in 1967 in Germany and the UK
Similar in concept to NTSC, except that line and field timings are different, and the phase of the V (chroma) component is reversed every line to allow color phase errors to be averaged out
Basic properties (except for PAL-M, which has NTSC-like rates):
625 lines/frame, 2:1 interlace → 2 fields/frame with 312.5 lines/field
Field rate 50 Hz
Line frequency fH = 15.625 kHz
Chroma subcarrier frequency fSC = 4.43 MHz = (1135/4 + 1/625) fH; consecutive lines and frames have a 90° phase shift, so 2 lines or 2 frames are required for opposite phase
Luma: Y = 0.299 R' + 0.587 G' + 0.114 B', where R', G', B' are gamma-corrected R, G, B
Chroma: the usual color difference signals U, V: U = 0.492 (B' − Y), V = 0.877 (R' − Y)
Composite = Y + U sin(ωt) ± V cos(ωt) + sync + blanking + color burst, where ω = 2π fSC

SECAM Fundamentals
Developed in France; broadcasting began in 1967
Basic timing is identical to PAL, but chroma is handled differently from NTSC/PAL: only one chroma component per line, and FM modulation is used to transmit chroma
Basic properties:
625 lines/frame, 2:1 interlace → 2 fields/frame with 312.5 lines/field
Field rate 50 Hz
Line frequency fH = 15.625 kHz
Luma: Y = 0.299 R' + 0.587 G' + 0.114 B', where R', G', B' are gamma-corrected R, G, B
Chroma: scaled color difference signals Db = 1.505 (B' − Y), Dr = −1.902 (R' − Y)
Only one chroma component per line, alternating between Dr and Db, with separate subcarriers for Dr and Db

Composite Video: NTSC / PAL / SECAM
Long-time world television standards
Basic properties:
Analog interlaced scanning: 3D (H, V, T) information expressed as a 1D (temporal) raster-scanned signal
Each picture (frame) is displayed as 2 interleaved fields, odd + even
Luminance (Y) and chrominance (R−Y, B−Y), sync, blanking, and color reference information are all combined into one “composite” signal

Luma/Chroma Separation
Many approaches, trading off complexity, cost and performance
Basic approaches (historical): low pass luma / high pass chroma; notch luma / bandpass chroma
Advanced approaches (commonly used in most systems today): 2D passive line comb filter; 2D adaptive line comb filter; 3D (spatio-temporal) comb filter
Decoding artifacts: loss of resolution, dot crawl, cross-color
Good decoding requires some black magic (art) because the luma and chroma spectra overlap in real motion video

S-Video Signals
S-Video was developed in conjunction with the S-VHS VCR standard, where the luma and chroma signals are kept separate after the initial Y/C separation
Keeping the signals separate, i.e. never adding the luma and chroma back together, eliminates the NTSC artifacts
Since video sources are generally composite (NTSC), the full benefit is not realized
Keeping the signals separate after playback from the VCR does help, especially because of the timing jitter

Time-Base Correction (Jitter Removal)
Removes line-length variations produced by video devices like VCRs
[Images: original video with time-base error; after time-base correction]

Luma and Chroma Signal Separation (Y/C Separation)
The chroma signal (C) is separated from the composite signal by filtering; adaptive comb filtering is required for high quality
Low-cost TVs use a bandpass filter, resulting in incomplete separation and severe cross-luma and cross-chroma artifacts
Non-adaptive comb filters introduce problems at edges
The luma signal (Y) may be derived by subtracting the chroma from the composite signal; this only works well if the chroma was separated well
Low-cost TVs use a bandstop filter to eliminate the chroma, resulting in poor luma bandwidth

NTSC Color Video Signal after Y/C Separation (EIA 75% color bar signal)
[Waveforms: luma staircase with white level 700 mV, blank level 0 mV, sync level −300 mV; separated chroma bursts with peak levels 0.961 V, 0.793 V, 0.507 V and 0.339 V about the blank level]

The NTSC Color Video Signal after Chroma Demodulation (EIA 75% color bar signal)
[Waveforms: demodulated Y, Cb and Cr for the color bars]

Chroma Demodulation
The Cb and Cr color difference signals are recovered by coherent demodulation of the QAM chroma signal
An absolute phase reference is provided to facilitate the process: a color burst, 9 cycles of unmodulated color sub-carrier, is added between the horizontal sync pulse and the start of the active video (on the “backporch”)
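A minimal sketch of coherent demodulation, assuming a burst-locked oscillator and a 4×fSC sample rate (a plain running average stands in for the real lowpass filter):

```python
import math

F_SC = 3_579_545.0  # NTSC chroma sub-carrier, Hz
F_S = 4 * F_SC      # sampling at 4x the sub-carrier, a common decoder choice

def demodulate(chroma):
    """Recover (Cb, Cr) from a QAM chroma sample stream: mix with
    locked sin/cos of the sub-carrier, then lowpass (here, average).
    Exact when given a whole number of sub-carrier cycles."""
    w = 2 * math.pi * F_SC
    cb_mixed = [2 * c * math.sin(w * n / F_S) for n, c in enumerate(chroma)]
    cr_mixed = [2 * c * math.cos(w * n / F_S) for n, c in enumerate(chroma)]
    average = lambda xs: sum(xs) / len(xs)
    return average(cb_mixed), average(cr_mixed)
```

Without the burst there would be no way to lock the sin/cos phase, and the recovered Cb/Cr axes (and hence the hue) would rotate arbitrarily.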

Notch/LPF versus Comb Filtering
Comb filtering allows full-bandwidth decoding
Notch filter: loss of information in the 2–4 MHz region
Comb filter: full horizontal resolution

Comb Filtering (cont.)
Use 1 or more lines for each delay element; e.g., for NTSC, D = 1 line = 910 samples (z⁻⁹¹⁰ at a 4×fSC sample rate)
Apply the “cos” version with positive coefficients to extract chroma, or the “sin” version with a negative center coefficient to extract luma
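A minimal non-adaptive 1-line comb in this spirit (the 910-sample delay assumes 4×fSC sampling; in this two-tap form the sum extracts luma and the difference extracts chroma):

```python
LINE = 910  # samples per NTSC line when sampling at 4x the sub-carrier

def comb_filter(composite):
    """1-line (two-tap) comb sketch: the sub-carrier runs 227.5 cycles
    per line, so its phase flips 180 degrees between vertically adjacent
    samples. Summing two lines therefore cancels chroma (leaving luma);
    differencing cancels luma (leaving chroma). Assumes picture content
    that does not change from line to line; real combs adapt at edges."""
    luma, chroma = [], []
    for n in range(LINE, len(composite)):
        a, b = composite[n], composite[n - LINE]
        luma.append((a + b) / 2)
        chroma.append((a - b) / 2)
    return luma, chroma
```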

HDTV Technical Overview
Video: MPEG-2 Main Profile @ High Level (MP@HL); 18 formats: 6 HD, 12 SD
Audio: Dolby AC-3
Transport: subset of MPEG-2; fixed-length 188-byte packets
RF/Transmission:
Terrestrial: 8-VSB (Vestigial Side Band) with trellis coding; effective payload of ~19.3 Mb/s (18.9 Mb/s used for video)
Cable: uses QAM instead of VSB; effective payload of ~38.6 Mb/s

ATSC Formats
18 formats: 6 HD, 12 SD; 720 vertical lines and above are considered High Definition
The choice of supported formats was left voluntary due to disagreement between broadcasters and the computer industry
The computer industry, led by Microsoft, wanted exclusion of interlace and initially the use of only those formats which leave bandwidth for data services – the “HD0” subset
Different picture rates depending on the motion content of the application: 24 frames/sec for film; 30 frames/sec for news and live coverage; 60 fields/sec or 60 frames/sec for sports and other fast-action content
1920 × 1080 @ 60 frames/sec was not included because it requires ~100:1 compression to fit in a 19.3 Mb/s terrestrial channel, which cannot be done at high quality with MPEG-2

HDTV/DTV System Layers
A layered system with headers/descriptors:
Picture layer: multiple picture formats and frame rates, e.g. 1920 × 1080 @ 60I at 996 Mb/s uncompressed
Compression layer: MPEG-2 video and Dolby AC-3 compression syntax (data headers, motion vectors, chroma and luma DCT coefficients, variable-length codes)
Transport layer: MPEG-2 packets with packet headers (video, audio and auxiliary data packets) for flexible delivery of data and future extensibility; 19.3 Mb/s
Transmission layer: 8-VSB in a 6 MHz channel
Source: Sarnoff Corporation

HDTV/DTV MPEG2 Transport
Packets with headers/descriptors enable flexibility and features: many services (video, audio 1, audio 2, text, program guide data) can be dynamically multiplexed and delivered to the viewer
188-byte packet = 4-byte packet header + 184-byte payload (incl. optional adaptation header of variable length)
Packet header provides: packet sync, type of data the packet carries, packet loss/misordering protection, encryption control, priority (optional)
Adaptation header provides: time synchronization, media synchronization, random access flag, bit-stream splice point flag
Source: Sarnoff Corporation
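The 4-byte packet header fields listed above can be unpacked directly; this sketch follows the ISO/IEC 13818-1 bit layout:

```python
def parse_ts_header(packet: bytes):
    """Parse the 4-byte MPEG-2 transport packet header (ISO/IEC 13818-1).

    Layout: sync byte (0x47), transport error indicator, payload unit
    start, priority, 13-bit PID, scrambling control, adaptation field
    control, 4-bit continuity counter (the loss/misordering check).
    """
    if len(packet) < 188 or packet[0] != 0x47:
        raise ValueError("not a sync-aligned 188-byte TS packet")
    b1, b2, b3 = packet[1], packet[2], packet[3]
    return {
        "transport_error": bool(b1 & 0x80),
        "payload_unit_start": bool(b1 & 0x40),
        "priority": bool(b1 & 0x20),
        "pid": ((b1 & 0x1F) << 8) | b2,        # which service/stream
        "scrambling": (b3 >> 6) & 0x3,         # encryption control
        "adaptation_field_control": (b3 >> 4) & 0x3,
        "continuity_counter": b3 & 0x0F,
    }
```

The PID is what lets the demux pull one service's packets out of the multiplex; a jump in the continuity counter for a PID signals packet loss.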

MPEG2 Video Basics
Hierarchy: Sequence (display order) → GOP (display order, N=12, M=3: B B I B B P B B P B B P) → Picture → Slice → Macroblock → Blocks (4 Y blocks, 1 Cb block, 1 Cr block)
Note: Y = luma, Cr = red − Y, Cb = blue − Y
Source: Sarnoff Corporation

Aspect Ratios
Two options: 16:9 and 4:3
4:3 is the standard aspect ratio for US TV and computer monitors
HD formats are 16:9: a better match with the cinema aspect ratio, a better match for the aspect ratio of the human visual system, and better for some text/graphics tasks (allows side-by-side viewing of 2 pages)
[Diagram: 800 × 600 at 4:3 versus 800 × 450 at 16:9]

Additive White Gaussian Noise
Ubiquitous in any electronic system where analog is present; the Central Limit Theorem explains the underlying cause
Noise can be dramatically reduced by motion-adaptive recursive filtering (“3D NR”)
Basic equation: Yᵢ = X + Wᵢ, where Yᵢ = measurement at time i, X = original data, Wᵢ = noise at time i (Gaussian white noise with zero mean)
MMSE estimate for N measurements = Σ Yᵢ / N
Compute the average over the same pixel location in each frame: the noise averages to zero over a period of time
Since averaging pixels that are in motion produces trails, we need to reduce or stop averaging when there is motion
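A per-pixel sketch of this motion-adaptive recursive averaging; the gain k and the motion threshold are illustrative values, not from the slide:

```python
def temporal_nr(prev_out, curr, k=0.125, motion_threshold=0.1):
    """Motion-adaptive recursive ("3D") noise reduction, per pixel.

    Each pixel is blended with its value in the previous output frame,
    so zero-mean noise cancels over time; when the frame difference
    looks like real motion rather than noise, averaging stops to avoid
    leaving motion trails.
    """
    out = []
    for p, c in zip(prev_out, curr):
        if abs(c - p) > motion_threshold:   # likely motion: pass through
            out.append(c)
        else:                               # likely noise: recursive average
            out.append(p + k * (c - p))
    return out
```

Real implementations make the blend factor a smooth function of the measured motion rather than a hard threshold.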

AWGN – Example
[Images: original; with Gaussian noise; after noise removal]

Impulse Noise Reduction
Use nonlinear spatial filtering to remove impulsive noise without reducing resolution
[Images: original; with impulse noise; after noise removal]
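A classic example of such nonlinear spatial filtering is the 3×3 median, sketched below (the slide does not name the exact filter used):

```python
def median3x3(img):
    """Replace each interior pixel with the median of its 3x3
    neighborhood. Unlike linear smoothing, the median rejects impulse
    ("salt and pepper") outliers while keeping edges sharp, which is
    why it removes impulsive noise without reducing resolution.
    img is a list of equal-length rows; borders are left untouched."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            window = sorted(img[y + dy][x + dx]
                            for dy in (-1, 0, 1) for dx in (-1, 0, 1))
            out[y][x] = window[4]  # median of 9 samples
    return out
```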

Digital (MPEG) Noise
Block noise: tiling effect caused by having different DC coefficients for neighboring 8×8 blocks of pixels
Mosquito noise: ringing around sharp edges caused by removal of high-frequency coefficients
Noise reduction is achieved by using adaptive filtering: a different choice of filters across block boundaries versus within blocks

Deinterlacing (Line Doubling)
Conversion of interlaced (alternate-line) fields into progressive (every-line) frames
Required to present interlaced TV material on a progressive display
CRT TV uses interlaced scanning, with the odd lines first followed by the even lines
PC monitors and all digital displays are progressive, scanning all lines in consecutive order

Vertical-Temporal Spectrum of Interlaced Video
[V-T spectrum diagram: spatial frequency in cycles/picture height (262.5 and 525 marked) versus temporal frequency in Hz (30 and 60 marked); I is the original content, II, III, IV are replicas caused by V-T sampling]
The deinterlacing goal is to extract I and reject IV

Vertical-Temporal Progression
[V-T sampling diagram: lines 1–8 versus time (fields t−1, t, t+1), marking the missing lines, the original lines of the current field, and the original lines of the adjacent fields]

Interlacing and Deinterlacing Artifacts
Interlacing artifacts: twitter, wide-area flicker, temporal aliasing, line crawl
Deinterlacing artifacts: feathering/ghosting, jaggies/stepping, loss of vertical detail, motion judder, motion blur, specialized artifacts

Methods of Deinterlacing
Spatial interpolation (“Bob”)
Temporal interpolation (“Weave”)
Spatio-temporal interpolation
Median filtering
Motion-adaptive interpolation
Motion-compensated interpolation
Inverse 3-2 and 2-2 pulldown (for film)
Other (statistical estimation, model-based, etc.)
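The first two methods are simple enough to sketch directly (a field is assumed to be a list of pixel rows; this line-repetition Bob is the crudest variant, line averaging is also common):

```python
def bob(field):
    """Spatial deinterlacing ("Bob"): build a full frame from a single
    field by repeating each field line. Full temporal rate, but only
    half the vertical resolution."""
    frame = []
    for line in field:
        frame.append(line[:])
        frame.append(line[:])
    return frame

def weave(top_field, bottom_field):
    """Temporal deinterlacing ("Weave"): interleave the lines of two
    fields into one frame. Full vertical resolution on still content,
    but feathering wherever there is motion between the fields."""
    frame = []
    for t, b in zip(top_field, bottom_field):
        frame.append(t[:])
        frame.append(b[:])
    return frame
```

The methods further down the list exist precisely to get Weave's resolution on stills and Bob's artifact-free motion at the same time.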

Film vs. Video
The nature of the content is the most important factor; fundamentally there are two types, progressive and interlaced
Progressive content was originally acquired in progressive form but converted to fit into an interlaced standard
The most common form of such content is film (24 frames/sec or 30 frames/sec); other forms include computer graphics/animation
The film-to-video (telecine) process is used to convert film to the desired interlaced video format:
24 frames/sec → 50 fields/sec PAL by running the film at 25 fps and doing “2:2 pulldown”
24 frames/sec → 60 fields/sec NTSC by doing “3:2 pulldown”

Film-to-Video Transfer (NTSC)
[Diagram: 4 movie frames (1/24 second each) mapped onto 5 video frames of odd and even fields, spanning 1/6 second of real time]
Conversion of 24 frames/sec into 60 fields/sec: 4 movie frames are mapped to 5 video frames
In this process, one movie frame is mapped into 3 video fields, the next into 2, etc., referred to as “3:2 pulldown”
A similar process (“2:2 pulldown”) is used to convert 25 frames/sec to 50 fields/sec and 30 frames/sec to 60 fields/sec
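The 3:2 cadence can be generated programmatically; a sketch (field parity handling simplified):

```python
def pulldown_32(film_frames):
    """Map film frames to interlaced fields with a 3:2 cadence:
    alternate frames contribute 3 fields and 2 fields, so 4 film
    frames become 10 fields (5 video frames), i.e. 24 fps becomes
    60 fields/s. Field parity simply alternates odd/even."""
    fields = []
    for i, frame in enumerate(film_frames):
        for _ in range(3 if i % 2 == 0 else 2):
            parity = "odd" if len(fields) % 2 == 0 else "even"
            fields.append((frame, parity))
    return fields
```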

De-Interlacing of Film-Originated Material
[Images: incorrect field pairing; correct field pairing]

De-Interlacing of Film-Originated Material
[Images: without film mode; with film mode]

Video
Odd and even lines are in different places when there is motion

Video Deinterlacing Artifact – Feathering
Feathering – caused by improper handling of motion

Moving Edges in Video
The hardest problem in de-interlacing, because the odd and even lines are in different places
Combining odd and even lines causes feathering
Using spatial (vertical) interpolation causes jaggies/staircasing
[Images: angled line; the same line doubled using vertical interpolation]

Video Deinterlacing Artifact – Jaggies / Staircasing
Caused by vertical interpolation across lines in the same field
[Image: typical example]

Optimal Deinterlacing Content-adaptive: Film vs. Video – detect film and use inverse 3:2 (NTSC) or inverse 2:2 (PAL) pulldown; bad edit detection/compensation – need to detect and compensate for incorrect cadence caused by editing. Motion-adaptive: detect the amount of motion and use the appropriate mix of spatial and temporal processing – highest resolution for still areas with no motion artifacts in moving areas. Edge-adaptive: interpolate along the edge to get the smoothest/most natural image.

Film-Mode: Inverse Pulldown [Diagram: odd/even fields from movie frames 1–4 re-paired as Odd 1 + Even 1, Odd 2 + Even 2, Odd 3 + Even 3, Odd 4 + Even 4] Odd and even fields generated from the same original movie frame can be combined with no motion artifacts. "3:2 pulldown" sequence detection is necessary, done by analysis of motion content.
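A minimal sketch of the re-pairing step, assuming the cadence has already been detected by motion analysis (the (frame, parity) tuples are an illustrative representation, not the slides' data model):

```python
# Inverse-pulldown sketch: group consecutive fields that came from the same
# film frame and weave one odd/even pair from each group back into a
# progressive frame. Real detectors infer the grouping from motion content.
def inverse_pulldown_32(fields):
    frames = []
    i = 0
    while i < len(fields):
        source = fields[i][0]
        j = i
        while j < len(fields) and fields[j][0] == source:
            j += 1                     # consume the 2- or 3-field run
        frames.append(source)          # weave odd + even from this run
        i = j
    return frames

# A 3:2 cadence: fields A,A,A,B,B,C,C,C,D,D with alternating parity.
fields = [(f, "odd" if k % 2 == 0 else "even")
          for k, f in enumerate("AAABBCCCDD")]
inverse_pulldown_32(fields)  # -> ['A', 'B', 'C', 'D']
```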

Bad Edit Detection and Correction There are 25 potential edit junctions: 2 are good edits, while the other 23 are distinct disruptions of the film chain that cause visible bad edits. The sequence has to be continuously monitored. [Diagram: an edit jumps from movie frame 1 to frames 3–4; the mis-pairing Odd 3 + Even 4 produces an error] Film-to-video transitions also occur at commercial insertion or news flashes.

Motion-Adaptive Deinterlacing Each missing line is computed as (m)(S) + (1 − m)(T), where m = motion, S = spatial interpolation, T = temporal interpolation. [Diagram: lines 1–8 across times t−1, t, t+1; missing lines in the current field are filled from original lines in the current and adjacent fields]

Motion-Adaptive Deinterlacing Estimate motion at each pixel. Use the motion value to cross-fade spatial and temporal interpolation at each pixel: low motion means use more temporal interpolation; high motion means use more spatial interpolation. The quality of motion detection is the differentiator – motion window size, vertical detail, noise.
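The cross-fade out = m·S + (1 − m)·T can be written directly for one missing pixel. Scalar pixel values and a motion measure m already normalized to [0, 1] are assumptions for illustration:

```python
# Motion-adaptive mix for one missing pixel.
# S = spatial interpolation: average of the lines above/below in the
#     current field.
# T = temporal interpolation: average of the co-sited pixels in the
#     previous and next fields.
def deinterlace_pixel(above, below, prev_pix, next_pix, m):
    spatial = 0.5 * (above + below)
    temporal = 0.5 * (prev_pix + next_pix)
    return m * spatial + (1.0 - m) * temporal

deinterlace_pixel(100.0, 120.0, 80.0, 80.0, 0.0)  # still pixel -> 80.0
deinterlace_pixel(100.0, 120.0, 80.0, 80.0, 1.0)  # moving pixel -> 110.0
```

With m = 0 the original temporal detail is kept (full vertical resolution over time); with m = 1 the moving pixel is rebuilt purely from the current field, avoiding feathering.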

Edge-Adaptive Deinterlacing Moving edges are interpolated cleanly by adjusting the direction of interpolation at each pixel to best match the predominant local edge. [Figures: an angled line from one field, line-doubled using vertical interpolation vs. edge-adaptive interpolation]
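One simple edge-adaptive scheme is edge-based line averaging, shown here as an illustrative sketch (not necessarily the slides' exact method): test a few interpolation directions and average along the one where the lines above and below agree best.

```python
# Edge-based line averaging for one missing pixel: compare the bracketing
# lines along three directions (diagonal-left, vertical, diagonal-right)
# and interpolate along the best-matching direction.
def ela_pixel(above, below, x):
    candidates = [
        (abs(above[x - 1] - below[x + 1]), (above[x - 1] + below[x + 1]) / 2),
        (abs(above[x] - below[x]),         (above[x] + below[x]) / 2),
        (abs(above[x + 1] - below[x - 1]), (above[x + 1] + below[x - 1]) / 2),
    ]
    return min(candidates)[1]   # value along the lowest-difference direction

# A bright edge running diagonally: only the left-diagonal direction
# matches, so the edge is continued instead of being staircased.
ela_pixel([200, 50, 10], [0, 60, 200], 1)  # -> 200.0
```

Vertical interpolation here would produce 55, breaking the diagonal edge into the jaggies shown on the earlier slides.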

Scaling Linear scaling – resolution conversion, PIP/PAP/POP. Nonlinear scaling – aspect ratio conversion. Variable scaling – keystone correction. Warping – resampling based on a mapping function.

Upscaling [Diagram: input signal F(nT) with samples 1–4; zeros inserted to form the intermediate signal F(nT/2); an interpolating low-pass filter produces the output signal with samples 1–8] The input is raised to the higher sample rate by inserting zeros, then passed through an interpolating low-pass filter to fill in the new samples.
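A sketch of this textbook process, with a simple linear-interpolation kernel standing in for the interpolating low-pass filter (an illustrative choice, not the slides' filter):

```python
# Textbook 2x upsampling: zero-stuff to the higher rate, then apply an
# interpolating low-pass filter. The kernel [0.5, 1, 0.5] passes original
# samples through unchanged and fills each inserted zero with the average
# of its neighbours.
def upsample_2x(signal):
    up = []
    for s in signal:
        up.extend([s, 0.0])          # insert a zero after every sample
    out = []
    n = len(up)
    for i in range(n):
        acc = up[i]
        if i > 0:
            acc += 0.5 * up[i - 1]
        if i < n - 1:
            acc += 0.5 * up[i + 1]
        out.append(acc)
    return out

upsample_2x([1.0, 3.0])  # -> [1.0, 2.0, 3.0, 1.5]
```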

Downscaling [Diagram: input signal F(nT) with samples 1–4 passes through a decimating low-pass filter to give the output signal F(2nT) with samples 1–2] The decimating low-pass filter prevents aliasing at the lower rate.
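A corresponding sketch for decimation, with a 3-tap moving average standing in for the anti-alias filter (again an illustrative choice):

```python
# Downsample-by-2 sketch: low-pass filter first (moving average as an
# illustrative anti-alias filter), then keep every 2nd sample. Skipping
# the filter and just taking every 2nd sample would alias.
def downsample_2x(signal):
    n = len(signal)
    filtered = []
    for i in range(n):
        window = signal[max(0, i - 1):min(n, i + 2)]
        filtered.append(sum(window) / len(window))
    return filtered[::2]

downsample_2x([4.0, 4.0, 4.0, 4.0])  # -> [4.0, 4.0]
```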

Practical Scaling Textbook scaling implies you need a very large filter when dealing with the expanded signal. In practice you only need a small number of filter coefficients ("taps") at any particular interpolation point because of all the zero values. The interpolation points are called "phases" – e.g., scaling by 4/3 requires 4 interpolation locations (phases) that repeat: 0, 0.25, 0.5, 0.75. Practical scalers use polyphase interpolation: pre-compute and store one set of filter coefficients for each phase; use a DDA to step across the input space with step size = (input size / output size), Xi = Xi−1 + Step; the fractional portion of Xi is the filter phase for the current location; for each location, use the filter coefficients corresponding to the current phase and compute the interpolated value.
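The DDA loop can be sketched as follows. The 2-tap linear "filter bank" (coefficients (1 − p, p) for phase p) is an illustrative stand-in for the longer pre-computed per-phase tap sets a real scaler stores:

```python
# Polyphase DDA scaler sketch: step across the input with
# step = in_size / out_size; the fractional part of the position selects
# the filter phase. Here the per-phase coefficients are simply (1-p, p),
# i.e. 2-tap linear interpolation.
def scale_1d(src, out_size):
    step = len(src) / out_size
    out = []
    x = 0.0
    for _ in range(out_size):
        i = int(x)
        phase = x - i                       # fractional position -> phase
        a = src[min(i, len(src) - 1)]
        b = src[min(i + 1, len(src) - 1)]   # clamp at the right border
        out.append((1.0 - phase) * a + phase * b)
        x += step                           # DDA: Xi = Xi-1 + Step
    return out
```

For a fixed rational ratio such as 4/3 the phase values repeat, which is why the coefficient sets can be pre-computed and stored.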

Upscaling Comments Theoretically simpler than downscaling: a fixed-length filter can be used since there is no concern about aliasing. However, a poor reconstruction filter can introduce jaggies and Moiré, often mistakenly referred to as aliasing. [Figures: quarter zone plate upscaled using replication shows jaggies; upscaled using interpolation is smooth]

Upscaling Comments (cont.) Moiré is introduced by beating of high-frequency content with the first repeat spectrum (image) that is inadequately suppressed by the reconstruction filter. [Figures: original sampled image, a 1-D sine-wave grating cos(2·pi·(0.45)·x), shows visible Moiré; upscaled 2X horizontally using linear interpolation – visible Moiré; upscaled 2X horizontally using a 16-tap reconstruction filter – negligible Moiré]

Downscaling Comments More difficult than upscaling. Each new scaling factor requires the cutoff frequency of the reconstruction filter to be altered, and the inverse relationship between time (space) and frequency requires the filter length to grow in proportion to the shrink factor. Aliasing and lost information can be very visible when a fixed low-order filter is used. [Figures: grid downscaled using a fixed 2-tap filter vs. a filter with a dynamic number of taps]

Scaling for PIP/PAP/POP [Diagrams of display layouts: PIP Mode (Main + PIP), PAP Mode, POP Mode, Live PAT Mode (Main + TeleText), Mosaic Mode]

Linear Scaling – State of the Art Polyphase interpolation with separate H and V scaling. Typical number of phases from 8 to 64; typical number of taps from 2 to 8 (H and V can be different), usually more than 2 (linear). Keep in mind the fundamental differences between graphics and video: graphics is non-Nyquist. Watch out for marketing gimmicks – the total number of effective filter taps is NOT #taps × #phases, it is just #taps. The correct definition of "#taps" is how many input samples are used to compute an output sample.

Nonlinear Scaling for Aspect Ratio Conversion Aspect ratio conversion is required for going between 4:3 and widescreen: 4:3 material on a 16:9 monitor, and 16:9 material on a 4:3 monitor. Several options: Full, Zoom, Squeeze, Variable Expand, Shrink. [Diagram: video transmission formats mapped to 16×9 and 4×3 display modes, options (a)–(j)]

Nonlinear Scaling Example – "Panoramic" Mode Horizontal nonlinear 3-zone scaling: the input aspect ratio is preserved in the middle zone of the output image, while the aspect ratio changes slowly in the tail zones to accommodate the rest of the input picture. [Diagram: input and output of the 3-zone scaling]
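A possible shape for such a 3-zone map, as a sketch: normalized output position u maps to a normalized input position, with constant slope in the middle half and a quadratic ease in the tails. The zone widths and slope values (1.2 falling to 0.4) are illustrative assumptions, not parameters from the slides:

```python
# Panoramic 3-zone map sketch. In normalized coordinates the middle zone
# has constant slope 1.2 (uniform magnification, aspect preserved); the
# tail zones use a quadratic ease whose slope falls from 1.2 to 0.4, so
# the picture is stretched progressively toward the edges.
def panoramic_map(u):
    sign = 1.0 if u >= 0.5 else -1.0
    t = abs(u - 0.5)
    if t <= 0.25:
        d = 1.2 * t                       # middle zone: slope 1.2
    else:
        s = t - 0.25
        d = 0.3 + 1.2 * s - 1.6 * s * s   # tail zone: slope 1.2 -> 0.4
    return 0.5 + sign * d                 # input position in [0, 1]
```

The map is continuous, symmetric about the center, and covers the full input range; a scaler would sample the input at panoramic_map(u) for each output column.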

Non-linear 3 Zone Scaling – Example [Images: linear scaling to 16:9, original 4:3 image, nonlinear scaling to 16:9]

Non-linear 3 Zone Scaling – Example 2 [Images: linear scaling to 16:9, original 4:3 image, nonlinear scaling to 16:9]

Vertical Keystone Correction [Images: projection of the image; image with vertical keystone correction and aspect ratio correction]

Edge Enhancement Adaptive peaking: extract a high-pass-filtered version of the signal, apply gain, add it back to the original. Transient improvement: compute the derivative of the signal and use a shaped version of it to sharpen the transient without introducing undershoot or overshoot.
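The peaking path can be sketched in one dimension; the 3-tap high-pass kernel and the gain of 0.5 are illustrative choices, not product settings:

```python
# Adaptive-peaking sketch: high-pass the signal, scale it, add it back.
# The high-pass here is sample minus the average of its two neighbours.
def peak(signal, gain=0.5):
    out = []
    n = len(signal)
    for i in range(n):
        left = signal[max(0, i - 1)]
        right = signal[min(n - 1, i + 1)]
        highpass = signal[i] - 0.5 * (left + right)
        out.append(signal[i] + gain * highpass)
    return out

peak([0.0, 0.0, 10.0, 10.0])  # -> [0.0, -2.5, 12.5, 10.0]
```

Note the result at the step: the sample before it undershoots to −2.5 and the one after overshoots to 12.5 – exactly the ringing that the transient-improvement approach is designed to avoid.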

Temporal Rate Conversion Conversion from one picture rate (input) to another (display). LCD monitor example: situations where the host system does not accept EDID information telling it that the display wants 60 Hz refresh; conversion is usually required from a higher input rate (e.g. 75 Hz) to a lower one (usually 60 Hz), and simple schemes are sufficient because the displayed image is usually static. 100 Hz CRT-TV: larger, widescreen CRT-TVs in PAL countries produce unacceptable wide-area flicker at 50 Hz, requiring upconversion to a higher rate (100 Hz). Standards conversion: 50 Hz to 60 Hz and 60 Hz to 50 Hz.

Techniques for Temporal Rate Conversion Frame repetition/dropping – okay for PC graphics, causes judder for video. Linear interpolation – used for video, causes blurring/double images. Motion-compensated (motion-vector-steered) interpolation – used in high-quality standards conversion and 100 Hz TVs. Object-motion-based interpolation – new approach in the R&D phase; rate conversion becomes a simple application of a very powerful new framework.
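The first two techniques can be sketched together. Frames are plain numbers standing in for whole pictures, and the function and its parameters are illustrative:

```python
# Rate-conversion sketch: produce output frames at a new rate either by
# repeating the nearest-previous input frame (judder) or by linearly
# blending the two bracketing frames (blur/double images).
def convert_rate(frames, in_hz, out_hz, blend=True):
    out = []
    n_out = int(len(frames) * out_hz / in_hz)
    for k in range(n_out):
        pos = k * in_hz / out_hz          # position on the input time axis
        i = int(pos)
        frac = pos - i
        a = frames[min(i, len(frames) - 1)]
        b = frames[min(i + 1, len(frames) - 1)]
        out.append((1 - frac) * a + frac * b if blend else a)
    return out

convert_rate([0.0, 10.0], 50, 100, blend=True)   # -> [0.0, 5.0, 10.0, 10.0]
convert_rate([0.0, 10.0], 50, 100, blend=False)  # -> [0.0, 0.0, 10.0, 10.0]
```

Both outputs are flawed for moving content – the repeated frame judders, the blended frame doubles the image – which is what motivates the motion-compensated approach on the following slides.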

Upconversion: 50 to 100 Hz [Diagram: 50 Hz input pictures at times t−T, t, t+T, t+2T; the 100 Hz output interleaves original and interpolated pictures]

Judder Results from Repeats [Diagram: spatial position vs. field number (1–5) for 2:1 upconversion using repetition]

Rate Conversion Artifacts – Judder and Blur Temporal aliasing causes jerky motion ("judder"). Upconversion by repetition causes judder and blur because of eye tracking. [Images: blurred/double image vs. clean image]

Motion-Vector-Steered Interpolation Upsample each moving object along its line of motion (optical flow axis) Only way to get genuine "new" snapshots in time Two main approaches to computing the motion vectors Block Matching Phase Correlation
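A minimal block-matching sketch – full search minimizing the sum of absolute differences (SAD). The block size, search range, and list-of-lists frame format are illustrative assumptions:

```python
# Full-search block matching: find the displacement (dy, dx) of one block
# between two frames by minimizing the sum of absolute differences.
def block_motion(prev, curr, by, bx, bsize=2, search=2):
    h, w = len(prev), len(prev[0])
    best = None
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            if not (0 <= by + dy and by + dy + bsize <= h
                    and 0 <= bx + dx and bx + dx + bsize <= w):
                continue  # candidate falls outside the previous frame
            sad = sum(abs(curr[by + y][bx + x] - prev[by + y + dy][bx + x + dx])
                      for y in range(bsize) for x in range(bsize))
            if best is None or sad < best[0]:
                best = (sad, (dy, dx))
    return best[1]
```

With the vector in hand, an interpolated picture at time t + T/2 can place the block halfway along (dy, dx) – the "new snapshot in time" that repetition and blending cannot produce.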

Motion Vector Steered Interpolation [Diagram: spatial position vs. field number (1–5)]

Smooth Motion Using Motion Vectors

Object-Based Video Current video processing operates at the pixel level and computes scalar quantities ("values"). This puts severe limitations on what you can achieve with the processing, and it does not match how the human visual system parses a scene.

Object-Based Video What do you see? The scene can be viewed as 300K pixels with gray levels between 16 and 235, OR as "players", "ball", "spectators", "benches", … Objects can be micro-level ("player", "ball", spectators, …) OR macro-level ("foreground" vs. "background").

Foreground vs. Background

Foreground vs. Background [Videos: original sequence vs. sequence with compensated background]

Picture Enhancement and Controls Standard picture controls: brightness, contrast, saturation, hue or tint. Advanced picture controls: new 6-point controls (R, G, B, Cy, Mag, Yellow); automatic contrast and colour enhancements – Intelligent Colour Remapping (ICRTM) produces more pleasing, vivid images, and Locally Adaptive Contrast Enhancement (ACETM) expands the dynamic range of the scene to provide more detail. Color management: sRGB color space for internet content.

Standard Global Picture Controls Typically comprises a fully programmable [(3×3) matrix + (3×1) vector] Color-Space-Converter (CSC) and a Look-Up-Table (LUT). Can be used to do linear color-space transformations, standard picture controls (hue, saturation, brightness, contrast), and gamma correction. [Images: original plus hue, saturation, brightness, and contrast adjustments]
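A sketch of how such a [(3×3) matrix + (3×1) vector] stage realizes picture controls on YCbCr pixels. The pivot points (16 for black level, 128 for neutral chroma) are assumptions from common video practice, and hue control – which would add Cb/Cr cross-terms – is omitted for brevity:

```python
# Programmable CSC sketch: out = M * pixel + offset.
# Contrast scales Y about black level 16; saturation scales Cb/Cr about
# the neutral value 128; brightness adds to Y.
def make_csc(contrast=1.0, brightness=0.0, saturation=1.0):
    matrix = [[contrast, 0.0, 0.0],
              [0.0, saturation, 0.0],
              [0.0, 0.0, saturation]]
    offset = [brightness + 16.0 * (1.0 - contrast),
              128.0 * (1.0 - saturation),
              128.0 * (1.0 - saturation)]
    def apply(y, cb, cr):
        pix = (y, cb, cr)
        return tuple(sum(matrix[r][c] * pix[c] for c in range(3)) + offset[r]
                     for r in range(3))
    return apply

make_csc(saturation=0.0)(120.0, 90.0, 180.0)
# fully desaturated -> (120.0, 128.0, 128.0)
```

Folding the pivot terms into the offset vector is what lets a single fixed matrix-plus-vector datapath implement all of these controls at once.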

New 6-Point Control Separate controls for 6 chroma channels – R, G, B, Cyan, Magenta and Yellow. [Diagram: hue wheel showing Red, Yellow, Green, Cyan, Blue, Magenta]

Intelligent Color Remapping (ICRTM) Example of automatic setting to enhance specific colour regions – green grass [Images: before/after, greener grass]

Intelligent Color Remapping (ICRTM) Example of automatic setting to enhance specific colour regions – blue sky [Images: before/after, bluer sky]

Locally Adaptive Contrast Enhancement (ACETM) [Images: original vs. contrast enhanced]

Application Examples Basic LCD-TV/Monitor, Fully Featured LCD-TV/Monitor, Fully Featured MPEG-TV

Application Example 1: LCD TV/Monitor without PIP (Basic LCD-TV/Monitor) Inputs: Standard TV, HDTV (480p/720p/1080i). Output: VGA – WXGA, 4:3 & 16:9, progressive. Key features: Motion Adaptive I/P, Film Mode (3:2 & 2:2), Noise Reduction, CC/V-Chip/Teletext, Multi-Language UI, IR Remote. [Block diagram: tuner FI12X6 (TV-In), video decoder VPC3230 (AV composite, S-Video, YUV), V-chip/CC Z86129, ADC AD9883 with mux (HD-Y/HD-Pb/HD-Pr, VGA-In), PW1230 with SDRAM and Flash (PromJet), PW113, LVDS 90C383 and TTL panel outputs, audio decoder MSP3450 (AV/HD/PC audio), audio amp TDA1517, keypad and IR] Reference design courtesy of Pixelworks Inc.

Application Example 2: LCD TV/Monitor with PIP (Fully Featured LCD-TV/Monitor) Inputs: Standard TV, HDTV (480p/720p/1080i). Output: VGA – WXGA, 4:3 & 16:9, progressive. Key features: Motion Adaptive I/P, Film Mode (3:2 & 2:2), Noise Reduction, Multi-regional scaling, PIP/split screen/POP, CC/V-Chip/Teletext, Multi-Language UI, IR Remote. [Block diagram: two tuners FI12X6, two video decoders SAA7118 (AV composite, S-Video, YUV), two V-chip/CC/Teletext SAA5264, PW1230 with SDRAM and Flash (PromJet), PW181, LVDS 90C383, TMDS Sil164, ADC AD9888 (HD-Y/HD-Pb/HD-Pr, VGA-In), DVI receiver SiI161, audio decoder MSP3450, audio amp TDA8944J, keypad and IR] Reference design courtesy of Pixelworks Inc.

Application Example 3: Reference Design for MPEG-TV (Fully Featured MPEG-TV) [Block diagram: MPEG decoder, audio decoder, video switch, 3D Y/C video decoders, 2D video decoder, video decoder + ADC, muxes, deinterlacer, dual-channel scaler] Reference design courtesy of Pixelworks Inc.

Acknowledgements Speaker gratefully acknowledges material and information provided by: Dr. Nikhil Balram, Chief Technical Officer; Dr. Gwyn Edwards, Technical Marketing Engineer; National Semiconductor Displays Group

References Image/Video/Television: "Fundamentals of Video", N. Balram, Short Course S-4, SID International Symposium, 2000. "Video Demystified: A Handbook for the Digital Engineer", K. Jack, HighText Publications, 1993. "The Art of Digital Video", J. Watkinson, Focal Press, 1994. "Digital Television", C. P. Sandbank (editor), John Wiley & Sons, 1990. "Video Processing for Pixellized Displays", Y. Faroudja, N. Balram, Proceedings of SID International Symposium, May 1999. "Principles of Digital Image Synthesis", Vols. 1 & 2, A. Glassner, Morgan Kaufmann Publishers, 1995. "Digital Image Warping", G. Wolberg, IEEE Computer Society Press, 1994. "Fundamentals of Digital Image Processing", A. Jain, Prentice Hall, 1989. "Sampling-Rate Conversion of Video Signals", Luthra, Rajan, SMPTE Journal, Nov. 1991.

References Temporal Rate Conversion: "IC for Motion Compensated 100 Hz TV with a Smooth Motion Movie-Mode", G. de Haan, IEEE Transactions on Consumer Electronics, vol. 42, no. 2, May 1996. HDTV/DTV: "HDTV Status and Prospects", B. Lechner, SID 1997 Seminar M-10 (detailed history of the development of HDTV); www.atsc.org – web site for the Advanced Television Systems Committee; www.teralogic-inc.com – white papers on set-top box and PC implementations of DTV; www.fcc.gov/mmb/vsd – FCC web site with up-to-date information on TV stations' DTV transition. Modeling Display Systems: "Multi-valued Modulation Transfer Function", Proceedings of SID International Symposium, May 1996; "Vertical Resolution of Monochrome CRT Displays", Proceedings of SID International Symposium, May 1996.

References Human Visual Systems: "Visual Perception", Cornsweet, 1970. Linear Systems: "Signals and Systems", Oppenheim, Willsky, Young, Prentice Hall. HDMI: www.hdmi.com. MPEG2: "An Introduction to MPEG-2", B. Haskell, A. Puri, A. Netravali, Chapman & Hall, 1997. Video2000 Benchmark: www.madonion.com.

Thank You!