
Why is this hard to read?

Unrelated vs. Related Color Unrelated color: color perceived to belong to an area in isolation (CIE 17.4) Related color: color perceived to belong to an area seen in relation to other colors (CIE 17.4)

Illusory contour Shape, as well as color, depends on surround Most neural processing is about differences

Illusory contour

CS 768 Color Science Perceiving color Describing color Modeling color Measuring color Reproducing color

Spectral measurement Measurement p(λ) of the power (or energy, which is power x time) of a light source as a function of wavelength λ. Usually relative to p(560 nm). Visible light is roughly 400–700 nm.

Retinal line spread function (figure: relative intensity vs. retinal position).

Linearity additivity of response (superposition): r(m1 + m2) = r(m1) + r(m2) scaling (homogeneity): r(α·m) = α·r(m) For images: r(m1(x,y) + m2(x,y)) = r(m1)(x,y) + r(m2)(x,y) = (r(m1) + r(m2))(x,y) and r(α·m(x,y)) = α·r(m)(x,y) (Figure: retinal intensity vs. monitor intensity.)

Non-linearity

Retinal cross section (figure; labels: light, ganglion, bipolar, amacrine, and horizontal cells, rod, cone, epithelium, optic nerve).

Visual pathways Three major stages –Retina –LGN –Visual cortex –Visual cortex is further subdivided

Optic nerve 130 million photoreceptors feed 1 million ganglion cells whose output is the optic nerve. The optic nerve feeds the Lateral Geniculate Nucleus approximately one-to-one. The LGN feeds area V1 of visual cortex in complex ways.

Photoreceptors Cones –respond in high (photopic) light –differing wavelength responses (3 types) –single cones feed retinal ganglion cells, so give high spatial resolution but low sensitivity –highest sampling rate at fovea

Photoreceptors Rods –respond in low (scotopic) light –none in fovea (try to foveate a dim star; it will disappear) –one type of spectral response –several hundred feed each ganglion cell, so give high sensitivity but low spatial resolution

Luminance Light intensity per unit area at the eye. Measured in candelas/m^2 (cd/m^2). Typical ambient luminance levels (in cd/m^2): –starlight 10^-3 –moonlight 10^-1 –indoor lighting 10^2 –sunlight 10^5 –max intensity of common CRT monitors 10^2 From Wandell, Useful Numbers in Vision Science

Rods and cones Rods saturate at 100 cd/m^2, so only cones work at high (photopic) light levels. All rods have the same spectral sensitivity. The low light condition is called scotopic. The three cone types differ in spectral sensitivity and somewhat in spatial distribution.

Cones L (long wave), M (medium), S (short) –the names describe the sensitivity curves. "Red", "Green", "Blue" are misnomers. See the spectral sensitivity curves.

Receptive fields Each neuron in the visual pathway sees a specific part of visual space, called its receptive field. Retinal and LGN rf's are circular, with opponency; cortical rf's are oriented and sometimes shape specific. (Figure: on-center rf, red-green LGN rf, oriented cortical rf.)

Channels: Visual Pathways subdivided. Magno –Color-blind –Fast time response –High contrast sensitivity –Low spatial resolution Parvo –Color selective –Slow time response –Low contrast sensitivity –High spatial resolution Video coding implications: Magno –Separate color from b&w –Need fast contrast changes (60 Hz) –Keep fine shading in big areas Parvo –Separate color from b&w –Slow color changes OK (40 Hz) –Omit fine shading in small areas (Not obvious yet) pattern detail can be all in the b&w channel

Trichromacy Helmholtz thought three separate images went forward, R, G, B. Wrong because retinal processing combines them in opponent channels. Hering proposed opponent models, close to right.

Opponent Models Three channels leave the retina: –Red-Green (L - M + S = L - (M - S)) –Yellow-Blue (L + M - S) –Achromatic (L + M + S) Note that chromatic channels can have negative response (inhibition). This is difficult to model with light.
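
A minimal sketch in Python of the channel arithmetic listed above, assuming hypothetical L, M, S cone excitations as plain numbers (real opponent pathways weight the cone signals in more complicated ways):

# Sketch: opponent channels from cone responses, using the slide's formulas.
def opponent_channels(L, M, S):
    red_green = L - M + S      # Red-Green channel, L - (M - S)
    yellow_blue = L + M - S    # Yellow-Blue channel
    achromatic = L + M + S     # Achromatic channel
    return red_green, yellow_blue, achromatic

print(opponent_channels(0.6, 0.5, 0.2))   # (0.3, 0.9, 1.3) for these made-up values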


(Figure: contrast sensitivity vs. log spatial frequency (cpd) for the luminance, red-green, and blue-yellow channels.)

Color matching Grassmann laws of linearity: (α·c1 + β·c2)(λ) = α·c1(λ) + β·c2(λ). Hence for any stimulus s(λ) and response r(λ), the total response is the integral of s(λ)·r(λ) over all λ, or approximately Σ s(λ)·r(λ).

(Figure: color matching setup. Primary lights and a test light illuminate the two halves of a bipartite white screen inside a surround field, viewed by the subject.)

Color Matching Spectra of primary lights s1(λ), s2(λ), s3(λ). Subject's task: find c1, c2, c3 such that c1·s1(λ) + c2·s2(λ) + c3·s3(λ) matches the test light. Problems (depending on the si(λ)) –[c1, c2, c3] is not unique ("metamer") –may require some ci < 0 ("negative power")

Color Matching Suppose three monochromatic primaries r, g, b at fixed wavelengths and a 10° field (Stiles and Burch 1959). For any monochromatic light t(λ) at wavelength λ, find scalars R = R(λ), G = G(λ), B = B(λ) such that t(λ) = R(λ)·r + G(λ)·g + B(λ)·b. R(λ), G(λ), B(λ) are the color matching functions based on r, g, b.

Color matching Grassmann laws of linearity: (α·c1 + β·c2)(λ) = α·c1(λ) + β·c2(λ). Hence for any stimulus s(λ) and response r(λ), the total response is the integral of s(λ)·r(λ) over all λ, or approximately Σ s(λ)·r(λ).

Color matching What about three monochromatic lights? M(λ) = R*·R(λ) + G*·G(λ) + B*·B(λ) Metamers possible. Good: the RGB functions are like the cone responses. Bad: can't match all visible lights with any triple of monochromatic lights; need to add some of the primaries to the light being matched.
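
As a numerical illustration of the "sum of s(λ)·r(λ)" idea from the Grassmann slide, matching scalars for a sampled test spectrum can be approximated by summing the spectrum against sampled color matching functions. The arrays below are made-up placeholders, not real CMF data:

import numpy as np

# Hypothetical 5-sample spectra; real CMFs are tabulated at 1-5 nm steps.
wavelengths = np.array([450.0, 500.0, 550.0, 600.0, 650.0])   # nm
test_spd    = np.array([0.10, 0.30, 0.80, 0.50, 0.20])        # test light s(lambda)
cmf_r       = np.array([0.01, 0.05, 0.30, 0.90, 0.60])        # R(lambda), placeholder
cmf_g       = np.array([0.02, 0.40, 0.95, 0.40, 0.05])        # G(lambda), placeholder
cmf_b       = np.array([0.90, 0.30, 0.02, 0.00, 0.00])        # B(lambda), placeholder

dlam = wavelengths[1] - wavelengths[0]   # uniform sample spacing
# Approximate the integral of s(lambda)*r(lambda) by a sum.
R = np.sum(test_spd * cmf_r) * dlam
G = np.sum(test_spd * cmf_g) * dlam
B = np.sum(test_spd * cmf_b) * dlam
print(R, G, B)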

(Figure: color matching setup. Primary lights and a test light illuminate the two halves of a bipartite white screen inside a surround field, viewed by the subject.)

Color matching Solution: CIE XYZ basis functions

Color matching Note the Y matching function is V(λ), the luminous efficiency function. None of these are lights. Euclidean distance in RGB and in XYZ is not perceptually useful. Nothing about color appearance.

XYZ problems No correlation to perceptual chromatic differences X-Z not related to color names or daylight spectral colors One solution: chromaticity

Chromaticity Diagrams x = X/(X+Y+Z), y = Y/(X+Y+Z), z = Z/(X+Y+Z). Perspective projection onto the X-Y plane; z = 1 - (x + y), so really 2-d. Can recover X, Y, Z given x, y and one of X, Y, Z, usually Y since it is luminance.
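
A small Python sketch of the xy / XYZ bookkeeping described above (the numeric values are only illustrative):

def xyz_to_xy(X, Y, Z):
    s = X + Y + Z
    return X / s, Y / s          # z = 1 - x - y is implied

def xy_plus_Y_to_xyz(x, y, Y):
    # Recover XYZ from chromaticity (x, y) plus luminance Y.
    X = x * Y / y
    Z = (1.0 - x - y) * Y / y
    return X, Y, Z

x, y = xyz_to_xy(41.24, 21.26, 1.93)     # roughly an sRGB-red-like color, for illustration
print(x, y)
print(xy_plus_Y_to_xyz(x, y, 21.26))     # returns approximately the original XYZ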

Chromaticity Diagrams No color appearance info since no luminance info. No accounting for chromatic adaptation. Widely misused, including for color gamuts.

Some gamuts (figure: chromaticity plot comparing the SWOP and ENCAD GA ink gamuts).

MacAdam Ellipses JND of chromaticity Bipartite equiluminant color matching to a given stimulus. Depends on chromaticity both in magnitude and direction.

MacAdam Ellipses For each observer, high correlation to the variance of repeated color matches in direction, shape and size –constant-probability contours of 2-d normal distributions are ellipses –neural noise? See Wyszecki and Stiles, Fig 1(5.4.1) p. 307

MacAdam Ellipses JND of chromaticity –Weak inter-observer correlation in size, shape, orientation. No explanation in Wyszecki and Stiles 1982. More modern models that can normalize to the observer?

MacAdam Ellipses JND of chromaticity –Extension to varying luminance: ellipsoids in XYZ space which project appropriately for fixed luminance

MacAdam Ellipses JND of chromaticity –Technology applications: Bit stealing: points inside the chromatic JND ellipsoid are not distinguishable chromatically but may be above the luminance JND. Using those points in RGB space can thus increase the luminance resolution. In turn, this has the appearance of increased spatial resolution ("anti-aliasing"). Microsoft ClearType.

CIELab L* = 116·f(Y/Yn) - 16, a* = 500[f(X/Xn) - f(Y/Yn)], b* = 200[f(Y/Yn) - f(Z/Zn)], where Xn, Yn, Zn are the CIE XYZ coordinates of the reference white point. f(z) = z^(1/3) if z > 0.008856, f(z) = 7.787z + 16/116 otherwise. L* is relative achromatic value, i.e. lightness; a* is relative greenness-redness; b* is relative blueness-yellowness.

CIELab L* = 116·f(Y/Yn) - 16, a* = 500[f(X/Xn) - f(Y/Yn)], b* = 200[f(Y/Yn) - f(Z/Zn)], where Xn, Yn, Zn are the CIE XYZ coordinates of the reference white point. f(z) = z^(1/3) if z > 0.008856, f(z) = 7.787z + 16/116 otherwise.

CIELab L* = 116·f(Y/Yn) - 16, a* = 500[f(X/Xn) - f(Y/Yn)], b* = 200[f(Y/Yn) - f(Z/Zn)], where Xn, Yn, Zn are the CIE XYZ coordinates of the reference white point. f(z) = z^(1/3) if z > 0.008856, f(z) = 7.787z + 16/116 otherwise. C*ab = sqrt(a*^2 + b*^2) corresponds to the perception of chroma (colorfulness). Hue angle hab = tan^-1(b*/a*) corresponds to hue perception. L* corresponds to lightness perception. Euclidean distance in Lab space is fairly well correlated with color matching and color distance judgements under many conditions. Good correspondence to Munsell distances.
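
A direct transcription of the formulas above into Python, as a sketch; the whitepoint defaults are the commonly used D65 values, an assumption rather than something fixed by the slide:

import math

def xyz_to_lab(X, Y, Z, Xn=95.047, Yn=100.0, Zn=108.883):   # D65 white assumed
    def f(z):
        return z ** (1.0 / 3.0) if z > 0.008856 else 7.787 * z + 16.0 / 116.0
    L = 116.0 * f(Y / Yn) - 16.0
    a = 500.0 * (f(X / Xn) - f(Y / Yn))
    b = 200.0 * (f(Y / Yn) - f(Z / Zn))
    C = math.hypot(a, b)                       # chroma C*ab
    h = math.degrees(math.atan2(b, a)) % 360   # hue angle hab
    return L, a, b, C, h

print(xyz_to_lab(41.24, 21.26, 1.93))   # a saturated reddish color, for illustration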

(Figure: L*a*b* axes. a*>0 redder, a*<0 greener; b*>0 yellower, b*<0 bluer; chroma, hue, lightness.)

Complementary Colors c1 and c2 are complementary hues if they sum to the whitepoint. Not all spectral (i.e. monochromatic) colors have complements. See chromaticity diagram. See Photoshop Lab interface.

CIELab defects Perceptual lines of constant hue are curved in the a*-b* plane, especially for red and blue hues (Fairchild Fig 10.5). Doesn't predict chromatic adaptation well without modification. Axes are not exactly the perceptually unique r, y, g, b hues. Under D65, these are approx 24°, 90°, 162°, 246° rather than 0°, 90°, 180°, 270° (Fairchild).

CIELab color difference model ΔE* = sqrt(ΔL*^2 + Δa*^2 + Δb*^2) –May be in the same L*a*b* space or to different white points (but both whitepoints normalized to the same max Y, usually Y = 100). –Typical observer reports a match for ΔE* in the range 2.5 – 20, but for simple patches 2.5 is a perceptible difference (Fairchild)
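
The color difference formula above, as a short sketch (made-up L*a*b* triples):

def delta_e_ab(lab1, lab2):
    # CIELAB Euclidean color difference Delta E*ab between two (L*, a*, b*) triples.
    return sum((p - q) ** 2 for p, q in zip(lab1, lab2)) ** 0.5

print(delta_e_ab((52.0, 40.0, 25.0), (50.0, 42.0, 28.0)))   # about 4.1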

Viewing Conditions Illuminant matters. Fairchild Table 7-1 shows ΔE* using two different illuminants. Consider a source under an illuminant with SPD T(λ). If the color at a pixel p has spectral distribution p(λ) and the reflectance factor of the screen is r(λ), then the SPD at the retina is r(λ)T(λ) + p(λ). Typically r(λ) is constant, near 1, and diffuse.

Color ordering systems Want a system in which a finite set of colors varies along several (usually three) axes in a perceptually uniform way. Several candidates, with varying success –Munsell (spectra available at a Finnish site) –NCS –OSA Uniform Color Scales System –…

Color ordering systems CIE L*a*b* is still not a faithful model, e.g. contours of constant Munsell chroma are not perfect circles in L*a*b* space. See Fairchild Fig 10-4, Berns p. 69.

Effect of viewing conditions Impact of measurement geometry on Lab –Need illumination and viewing angle standards –Need reflection descriptions for opaque material, transmission descriptions for translucent

Reflection geometry (figure: diffuse vs. specular reflection).

Reflection geometry (figure: semi-glossy vs. glossy reflection).

Reflection geometry (figure: semi-glossy vs. glossy reflection).

Some standard measurement geometries –d/8:i diffuse illumination, 8° view, specular component included –d/8:e as above, specular component excluded –d/d:i diffuse illumination and viewing, specular component included –45/0 45° illumination, 0° view

Viewing comparison (table: L*, C*, h, and ΔE for a semi-gloss tile measured under d/8:i and d/8:e geometries; Berns, p. 86). ΔE is vs. d/8:i. Data are in Lab.

L*u*v* CIE u'v' chromaticity coordinates: u' = 4X/(X + 15Y + 3Z) = 4x/(-2x + 12y + 3), v' = 9Y/(X + 15Y + 3Z) = 9y/(-2x + 12y + 3). Gives straighter lines of constant Munsell chroma (see figures on p. 64 of Berns). L* = 116(Y/Yn)^(1/3) - 16, u* = 13L*(u' - un'), v* = 13L*(v' - vn')

L*u*v* L* = 116(Y/Yn)^(1/3) - 16, u* = 13L*(u' - un'), v* = 13L*(v' - vn'), where un', vn' are the u', v' values of the whitepoint.
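
A sketch of the L*u*v* formulas above in Python. The D65 whitepoint is assumed for un', vn', and only the cube-root branch of L* is shown (the low-luminance linear segment is omitted):

def xyz_to_luv(X, Y, Z, Xn=95.047, Yn=100.0, Zn=108.883):   # D65 white assumed
    def uv_prime(X, Y, Z):
        d = X + 15.0 * Y + 3.0 * Z
        return 4.0 * X / d, 9.0 * Y / d
    u, v = uv_prime(X, Y, Z)
    un, vn = uv_prime(Xn, Yn, Zn)
    L = 116.0 * (Y / Yn) ** (1.0 / 3.0) - 16.0   # valid above the dark-color cutoff
    return L, 13.0 * L * (u - un), 13.0 * L * (v - vn)

print(xyz_to_luv(41.24, 21.26, 1.93))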

Models for color differences Euclidean metric in CIELab (or CIELuv) space is not very predictive. Need some weighting: ΔV = (1/kE)·[(ΔL*/(kL·SL))^2 + (ΔC*α/(kC·SC))^2 + (ΔH*α/(kH·SH))^2]^(1/2), where α = ab or uv according to whether L*a*b* or L*u*v* is used. The k's are parameters fit to the data. The S's are functions of the underlying variable, estimated from data.

Models for color differences ΔE*94: kL = kC = kH = 1, SL = 1, SC = 1 + 0.045·C*ab, SH = 1 + 0.015·C*ab. Fitting with one more parameter for scaling gives good predictions. Berns p. 125.
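
A sketch of the weighted difference with the ΔE*94 choices listed above, taking the first argument as the reference color and the kL = kC = kH = 1 parameters (the example triples are made up):

import math

def delta_e_94(lab1, lab2, kL=1.0, kC=1.0, kH=1.0):
    L1, a1, b1 = lab1
    L2, a2, b2 = lab2
    C1 = math.hypot(a1, b1)
    C2 = math.hypot(a2, b2)
    dL, dC = L1 - L2, C1 - C2
    dH2 = max((a1 - a2) ** 2 + (b1 - b2) ** 2 - dC ** 2, 0.0)   # Delta H*^2
    SL, SC, SH = 1.0, 1.0 + 0.045 * C1, 1.0 + 0.015 * C1
    return math.sqrt((dL / (kL * SL)) ** 2 +
                     (dC / (kC * SC)) ** 2 +
                     dH2 / (kH * SH) ** 2)

print(delta_e_94((52.0, 40.0, 25.0), (50.0, 42.0, 28.0)))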

Color constancy Color difference models such as the preceding have been used to predict color inconstancy under change of illumination. Berns p. 214.

Other color appearance phenomena Models still under investigation to account for: –Colorfulness (perceptual attribute of chroma) increases with luminance ("Hunt effect") –Brightness contrast (perceptual attribute of lightness difference) increases with luminance –Chromatic adaptation

Color Gamuts Gamut: the range of colors that are viewable under stated conditions Usually given on chromaticity diagram –This is bad because it normalizes for lightness, but the gamut may depend on lightness. –Should really be given in a 3d color space –L*a*b* is usual, but has some defects to be discussed later

Color Gamut Limitations 1. CIE XYZ underlies everything –this permits unrealizable colors, but usually "gamut" means restricted to the visible spectrum locus in the chromaticity diagram 2. Gamut can depend on luminance –usually on illuminant relative luminance, i.e. Y/Yn

Color Gamut Limitations Surface colors –reflectance varies with gloss. Generally high gloss increases lightness and generally lightness reduces gamut (see figures in Berns, p. 145 ff) Stricter performance requirements often reduce gamut –e.g. require long term fade resistance

Color Gamut Limitations Physical limitations of colorants and illuminants –Specific set of colorants and illuminants are available. For surface coloring we can not realize arbitrary XYZ values even within the chromaticity spectral locus Economic factors –Color may be available but expense not justified

Color mixing Suppose a system of colorants (lights, inks, …). Given two colors with spectra c1(λ) and c2(λ). These may be reflectance spectra, transmittance spectra, emission spectra, … Let d be a mix of c1 and c2. The system is additive if d(λ) = c1(λ) + c2(λ) no matter what c1 and c2 are.

Scalability Suppose the system has some way of scaling the intensity of the color by a scalar k. Examples: –CRT: increase intensity by k –halftone printing: make dots k times bigger –colored translucent materials: make k times as thick If c is a color, denote the scaled color as d. If the spectrum d(λ) is k·c(λ) for each λ, the system is scalable.

Scalability Consider a color production system and colors c1, c2 with c2 = k·c1. Let mi = max(ci(λ)) and di = (1/mi)·ci. High-school algebra shows that the system is scalable if and only if d1(λ) = d2(λ) for all λ, no matter what c1 and k are.

Control in color mixing systems Normally we control some variable to control intensity: –CRT: voltage on the electron gun (set as an integer) –Translucent materials (liquids, plastics, …): thickness –Halftone printing: dot size

Linearity A color production system is linear if it is additive and scalable. Linearity is good: it means that model computations involving only linear algebra make good predictions. Interesting systems are typically additive over some range, but rarely scalable. A simple compensation can often restore linearity by considering a related mixing system.

Scalability in subtractive systems (figure: light L0 passes through n layers each of thickness d; after one layer the intensity is k·L0, after n layers k^n·L0, with 0 <= k <= 1).

Scalability in subtractive systems (figure: L0 attenuated to k^n·L0 after n layers, 0 <= k <= 1). L(nd) = k^n·L0 for integer n; L(bd) = k^b·L0 for arbitrary b. With d = 1, L(b) = k^b·L0, so L(b)/L0 = k^b. Tλ = tλ^b, where Tλ is the total transmittance at wavelength λ, tλ the transmittance of unit thickness, and b the thickness.

Linearity in subtractive systems Absorbance Aλ = -log(Tλ) by definition = -log(tλ^b) = -b·log(tλ) = b·aλ, where aλ = -log(tλ) is the absorbance of unit thickness, so absorbance is scalable when thickness b is the control variable. By the same argument as for scalability, the transmittance of the "sum" of colors Tλ and Sλ will be their product, and so the absorbance of the sum will be the sum of the absorbances. Thus absorbance as a function of thickness is a linear mixture system.
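
A small numerical check of the argument above, assuming a made-up unit-thickness transmittance spectrum (base-10 logs, as in the usual definition of absorbance):

import numpy as np

t_unit = np.array([0.9, 0.7, 0.4, 0.6])     # hypothetical transmittance of unit thickness
b = 2.5                                      # thickness (the control variable)

T_total = t_unit ** b                        # total transmittance T = t**b
A_total = -np.log10(T_total)                 # absorbance A = -log10(T)
A_unit  = -np.log10(t_unit)                  # absorbance of unit thickness

# Scalability: A(b) = b * A(1) holds exactly.
print(np.allclose(A_total, b * A_unit))      # True
# Additivity: transmittances multiply, so absorbances add.
T2 = np.array([0.8, 0.5, 0.9, 0.3])
print(np.allclose(-np.log10(T_total * T2), A_total + (-np.log10(T2))))   # True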

Tristimulus Linearity [Xmix Ymix Zmix] = [X1 Y1 Z1] + [X2 Y2 Z2], and c·[X Y Z] = [cX cY cZ]. This is true because –r(λ), g(λ), b(λ) are the basis of a 3-d linear space (of functions on wavelength) describing lights –Grassmann's laws are precisely the linearity of light when described in that space –[X Y Z] is a linear transformation from this space to R^3

Monitor (non)Linearity (diagram: inputs A, B, C pass through a linear stage giving L1(A,B,C), L2(A,B,C), L3(A,B,C), then a non-linear stage giving f1(L1,L2,L3), f2(L1,L2,L3), f3(L1,L2,L3).)

Monitor (non)Linearity In = [A, B, C] --> L = [L1, L2, L3] --> Out = [O1 O2 O3] = [f1(L1, L2, L3) f2(L1, L2, L3) f3(L1, L2, L3)] Interesting monitor cases to consider: –In = [dr dg db] where dr, dg, db are integers 0…255 or numbers 0…1 describing the programming API for the red, green, blue channels –Out = [X Y Z] tristimulus coords or monitor intensities in each channel –Typically: fi depends only on Li; the fi are all the same; fi(u) = u^γ for some γ characteristic of the monitor

Monitor (non)Linearity Warning: LCD non-linearity is logistic, not a power law, but flat panel displays are usually built to mimic CRTs because much software is gamma-corrected (with typical γ around 2.2). Somewhat related: most LCD displays are built with analog instead of digital inputs, in order to function as SVGA monitors. This is changing.

Monitor (non)Linearity The linear stage is diagonal: [R G B] = a·[dr dg db] + [b b b], where a = 1.02/255 and b = -0.02 (CRT colorimetry example of Berns). The non-linearity is f(u) = u^γ, γ = 2.7, the same for all output channels.
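
The two-stage model from the Berns example above, as a Python sketch. Digital counts are assumed to run 0-255, and the outputs are relative per-channel intensities, not XYZ:

def crt_channel(d, a=1.02 / 255.0, b=-0.02, gamma=2.7):
    # Linear (diagonal) stage followed by the same power-law non-linearity per channel.
    L = a * d + b                 # digital count -> normalized drive
    L = max(L, 0.0)               # clip: very small counts fall below zero
    return L ** gamma

def crt_rgb(dr, dg, db):
    return tuple(crt_channel(d) for d in (dr, dg, db))

print(crt_rgb(255, 128, 0))       # full red drive, half green, no blue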

R+G+B vs. gray, LCD projector

More depth on Gamma Poynton, "Gamma and its disguises: The nonlinear mappings of intensity in perception, CRTs, film and video." SMPTE Journal, 1993.

Halftoning The problem with ink: it’s opaque Screening: luminance range is accomplished by printing with dots of varying size. Collections of big dots appear dark, small dots appear light. % of area covered gives darkness.
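
A tiny sketch of the "fraction of area covered gives darkness" idea: for opaque ink on white paper, the dot size needed in a square halftone cell follows directly from the target coverage. This assumes idealized circular dots with no overlap or dot gain:

import math

def dot_radius_for_gray(gray, cell=1.0):
    # gray: desired reflectance in [0, 1] (1 = paper white, 0 = solid ink).
    coverage = 1.0 - gray                 # fraction of the cell area covered by ink
    return math.sqrt(coverage * cell * cell / math.pi)

for g in (0.9, 0.5, 0.1):
    print(g, round(dot_radius_for_gray(g), 3))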

Halftoning references –A commercial but good set of tutorials –Digital Halftoning, by Robert Ulichney, MIT Press, 1987 –Stochastic halftoning

Color halftoning Needs screens at different angles to avoid moiré. Needs differential color weighting due to nonlinear visual color response and spatial frequency dependencies.

Halftone ink May not always be opaque. Three inks can give 2^3 = 8 distinct colors. The visual system gives more, since dot size and spacing yield intensity, giving a somewhat additive system. Highly nonlinear. See Berns et al., The Spectral Modeling of Large Format Ink Jet Printers.


(Figure: halftone screens at angles 108°, 162°, 90°, 45°.)

Quantization If there are too few levels of gray (e.g. after decreasing halftone spot size to increase spatial resolution), then boundaries between adjacent gray levels become apparent. This can happen in color halftoning also. See demo at se/dip/demos/Quat.html

Saturation Distance from the white point. Adding white desaturates but does not change hue or perceptual brightness. The HSB model is an approximate representation of this. See Photoshop.

Device Independence Calibration to standard space –typically CIE XYZ Coordinate transforms through standard space Gamut mapping

Device independence Stone et al., "Color Gamut Mapping and the Printing of Digital Color Images", ACM Transactions on Graphics, 7(4), October 1988. The following slides refer to their techniques.

Device to XYZ Sample the gamut in device space on an 8x8x8 mesh (7x7x7 = 343 cubes). Measure (or model) the device on the mesh. Interpolate with trilinear interpolation –for a small mesh and a reasonable function XYZ = f(device1, device2, device3) this approximates interpolating to the tangent.
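
A sketch of the forward model in Python: device coordinates in [0, 1] looked up in an 8x8x8 table by trilinear interpolation. The table here is synthetic (a linear ramp), standing in for real measurements:

import numpy as np

N = 8
# Hypothetical measurement table: XYZ at each of the 8x8x8 device grid points.
grid = np.linspace(0.0, 1.0, N)
table = np.stack(np.meshgrid(grid, grid, grid, indexing="ij"), axis=-1)  # shape (8, 8, 8, 3)

def device_to_xyz(d1, d2, d3, table=table):
    # Trilinear interpolation of the measured mesh at device coords in [0, 1].
    out = np.zeros(3)
    pos = np.array([d1, d2, d3]) * (N - 1)
    i0 = np.minimum(pos.astype(int), N - 2)          # lower corner of the enclosing cube
    f = pos - i0                                      # fractional position inside the cube
    for corner in range(8):                           # accumulate the 8 corner weights
        offs = np.array([(corner >> k) & 1 for k in range(3)])
        w = np.prod(np.where(offs == 1, f, 1.0 - f))
        idx = i0 + offs
        out += w * table[idx[0], idx[1], idx[2]]
    return out

print(device_to_xyz(0.25, 0.5, 0.75))   # ~[0.25, 0.5, 0.75] for the synthetic ramp table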

XYZ to Device Invert the function XYZ = f(device1, device2, device3) –hard to do in general if f is ill-behaved –At least make f monotonic by throwing out distinct points with the same XYZ, e.g. CMY device: –(continued)

XYZ to CMY Invert the function XYZ = f(c,m,y) –Given XYZ = [x,y,z], want to find CMY = [c,m,y] such that f(CMY) = XYZ –Consider X(c,m,y), Y(c,m,y), Z(c,m,y) –On each mesh cube the interpolant attains its max and min at the cube vertices. Also, if a continuous function has opposite signs at two points, it is zero somewhere in between.

XYZ to CMY –Given X0, find [c,m,y] such that X(c,m,y) = X0 –If [ci,mi,yi] and [cj,mj,yj] are vertices of a given cube and U = X(c,m,y) - X0 has opposite signs at them, then U is zero somewhere in the cube. Similarly for Y, Z. If such vertex pairs are found for all of X0, Y0, Z0, then the found cube may contain the desired point (then use interpolation). Doing this recursively will find the desired point if there is one.
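
A sketch of the sign-change test on one mesh cube, as described above. f_at_vertices is a hypothetical 8x3 array of the XYZ values measured (or modeled) at the cube's vertices:

import numpy as np

def cube_may_contain(target_xyz, f_at_vertices):
    # f_at_vertices: shape (8, 3), XYZ at the 8 vertices of one mesh cube.
    # For each of X, Y, Z the vertex values must bracket the target value;
    # otherwise the continuous interpolant cannot reach it inside the cube.
    diff = f_at_vertices - np.asarray(target_xyz)
    return all(diff[:, k].min() <= 0.0 <= diff[:, k].max() for k in range(3))

# Example: a cube whose vertex values straddle the target in every coordinate.
verts = np.array([[10, 10, 10], [30, 10, 10], [10, 30, 10], [30, 30, 10],
                  [10, 10, 30], [30, 10, 30], [10, 30, 30], [30, 30, 30]], float)
print(cube_may_contain([20, 20, 20], verts))   # True
print(cube_may_contain([40, 20, 20], verts))   # False: X never reaches 40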

Gamut Mapping Criteria: –preserve gray axis of original image –maximum luminance contrast –few colors map outside destination gamut –hue, saturation shifts minimized –increase, rather than decrease saturation –do not violate color knowledge, e.g. sky is blue, fruit colors, skin colors

Gamut Mapping Special colors and problems –Highlights: this is a luminance issue, so it is about the gray axis –Colors near black: the locus of these colors in the image gamut must map into a region of reasonably similar shape, else contrast and saturation are wrong

Gamut Mapping Special colors and problems –Highly saturated colors (far from the white point): printers are often incapable of reproducing them. –Colors on the image gamut boundary occupying large parts of the image: should map inside the target gamut, else have to project them all onto the target boundary.

CRT Printer Gamuts

Gamut Mapping First try: map black points and fill destination gamut.

(Figure: image gamut vs. device gamut.)

(Figure: translate Bi to Bd, the black shift bs; image gamut vs. device gamut.)

(Figure: translate Bi to Bd, then scale by csf; image gamut vs. device gamut.)

(Figure: translate Bi to Bd, scale by csf, then rotate; image gamut vs. device gamut.)

Gamut Mapping Xd = Bd + csf·R·(Xi - Bi), where Bi = image black, Bd = destination black, R = rotation matrix, csf = contrast scaling factor, Xi = image color, Xd = destination color. Problems: image colors near black that fall outside the destination are especially bad: loss of detail, hue shifts due to quantization error, ...

Shift and scale along the destination gray axis: Xd = Bd + csf·R·(Xi - Bi) + bs·(Wd - Bd)
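
The black-shift mapping above, written out as a Python sketch. R is whatever rotation aligns the image gray axis with the destination gray axis; here it defaults to the identity, and the numeric values are only illustrative:

import numpy as np

def map_color(Xi, Bi, Bd, Wd, csf=1.0, bs=0.0, R=np.eye(3)):
    # Xd = Bd + csf * R @ (Xi - Bi) + bs * (Wd - Bd)
    Xi, Bi, Bd, Wd = map(np.asarray, (Xi, Bi, Bd, Wd))
    return Bd + csf * (R @ (Xi - Bi)) + bs * (Wd - Bd)

# Illustrative image black, destination black/white in XYZ-like units.
print(map_color(Xi=[20, 21, 18], Bi=[1, 1, 1], Bd=[3, 3, 3], Wd=[90, 100, 95],
                csf=0.8, bs=0.05))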

Fig 14a: bs > 0, csf small; the image gamut maps entirely into the printer gamut, but contrast is low. Fig 14b: bs = 0, csf large; more contrast and more colors inside the printer gamut, but also more outside.

Saturation control "Umbrella transformation" [Rs Gs Bs] = monitor whitepoint; [Rn Gn Bn] = new RGB coordinates such that Rs + Gs + Bs = Rn + Gn + Bn and [Rn Gn Bn] maps inside the destination gamut. First map R·Rs + G·Gs + B·Bs to R·Rn + G·Gn + B·Bn, then map into printer coordinates. Makes minor hue changes, but "relative" colors are preserved. Achromatic colors remain achromatic.

Projective Clipping After all this, some colors remain outside the printer gamut. Project these onto the gamut surface: –Try a perpendicular projection to the nearest triangular face of the printer gamut surface. –If none, find a perpendicular projection to the nearest edge on the surface. –If none, use the closest vertex.

Projective Clipping This is the closest point on the surface to the given color. The result is a continuous projection if the gamut is convex, but not otherwise. –Bad: we want nearby image colors to be nearby in the destination gamut.

Projective Clipping Problems –Printer gamuts have their worst concavities near the black point, giving quantization errors. –Nearest point projection uses Euclidean distance in XYZ space, but that is not perceptually uniform. Try CIELAB? S-CIELAB? Keep out-of-gamut distances small at the cost of using less than the full printer gamut.