Ron Yanovich & Guy Peled
Contents
- Grayscale coloring background
- Luminance / luminance channel
- Segmentation
- Discrete Cosine Transform (DCT)
- k-nearest-neighbor (k-NN)
- Linear Discriminant Analysis (LDA)
- Colorization using optimization
- Colorization by Example: (i) training, (ii) classification, (iii) color transfer, (iv) optimization
Grayscale coloring background. Colorization: 'the process of adding color to a monochrome image.'
Grayscale coloring background. Colorization is a term introduced by Wilson Markle in 1970 to describe the computer-assisted process he invented for adding color to black-and-white movies or TV programs.
Grayscale coloring background. Prior work:
- Black Magic (PC tool): motion video and film colorization
- "Color transfer between images" (Reinhard et al.): transfers the color palette from one color image to another
- "Transferring color to greyscale images" (Welsh et al.): colorizes an image by matching small pixel neighborhoods in the image to those in a reference image
- "Unsupervised colorization of black-and-white cartoons" (Sykora et al.): colorization of segmented black-and-white cartoons using patch-based sampling and probabilistic reasoning
[Figure: example results from Reinhard et al. and the Black Magic tool]
[Figure: example results from Welsh et al. and Sykora et al.]
Luminance / luminance channel. Luminance: the amount of light that passes through, or is emitted from, a particular area. In the YUV color space: Y is a full-resolution plane that represents the luminance information only; U and V are full-resolution (or lower) planes that represent the chroma (color) information only.
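As a minimal illustration, the Y (luma) plane can be computed from RGB with the standard BT.601 weights; the exact coefficients used by any given YUV variant may differ, so treat this as a sketch:

```python
import numpy as np

# BT.601 luma weights: green contributes most to perceived brightness
Y_WEIGHTS = np.array([0.299, 0.587, 0.114])

def rgb_to_luminance(rgb):
    """Collapse an (..., 3) RGB array to its luminance (Y) plane."""
    return np.asarray(rgb, dtype=float) @ Y_WEIGHTS

pixel = [255, 255, 255]            # pure white
y = rgb_to_luminance(pixel)        # ≈ 255.0, since the weights sum to 1
```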
[Figures: luminance / luminance channel examples]
Segmentation: the process of partitioning a digital image into multiple segments (sets of pixels, also known as superpixels).
Segmentation makes the image more meaningful and easier to analyze:
- locate objects and their boundaries
- assign a label to every pixel in the image
Segmentation. 'Superpixel': a polygonal part of a digital image, larger than a normal pixel, that is rendered in the same color and brightness.
Segmentation. A possible implementation is mean-shift segmentation.
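To give a feel for the idea, here is a toy 1-D mean-shift sketch: a flat kernel over scalar "intensities". A real image segmenter would run this jointly over spatial and color coordinates; the data below is made up:

```python
import numpy as np

def mean_shift_modes(points, bandwidth, iters=30):
    """Shift each point toward the mean of the points within
    `bandwidth` of it, until it settles on a local density mode."""
    modes = points.astype(float).copy()
    for _ in range(iters):
        for i, m in enumerate(modes):
            neighbors = points[np.abs(points - m) <= bandwidth]
            modes[i] = neighbors.mean()
    return modes

data = np.array([1.0, 1.1, 0.9, 5.0, 5.2, 4.8])   # two intensity clumps
modes = mean_shift_modes(data, bandwidth=1.0)
# points collapse onto their cluster means (~1.0 and ~5.0);
# grouping points with equal modes yields the segments
```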
Discrete Cosine Transform. The DCT expresses a finite sequence of data points as a sum of cosine functions oscillating at different frequencies. It is a Fourier-related transform, similar to the discrete Fourier transform (DFT), but uses only real numbers.
[Figure: Discrete Cosine Transform]
Discrete Cosine Transform. The DCT can be used for compression (it underlies JPEG): most of a natural image patch's energy concentrates in a few low-frequency coefficients.
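A compact sketch of an orthonormal 2-D DCT-II built from its 1-D definition, applied separably in matrix form (production code would typically call a library routine such as SciPy's dct):

```python
import numpy as np

def dct2(block):
    """Orthonormal 2-D DCT-II of a square block, applied separably.
    1-D definition: X_k = sum_n x_n * cos(pi * (n + 0.5) * k / N)."""
    n = block.shape[0]
    idx = np.arange(n)
    d = np.cos(np.pi * (idx[None, :] + 0.5) * idx[:, None] / n)
    d *= np.sqrt(2.0 / n)
    d[0] = np.sqrt(1.0 / n)            # DC row gets its own scaling
    return d @ block @ d.T

coeffs = dct2(np.ones((8, 8)))         # a flat patch
# all of a constant patch's energy lands in the DC coefficient coeffs[0, 0]
```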
K-nearest-neighbor (k-NN). In pattern recognition, the k-nearest-neighbor algorithm is a non-parametric method for classifying objects based on the closest training examples in the feature space.
K-nearest-neighbor (k-NN):
- all instances are points in n-dimensional space
- "closeness" between points is determined by some distance measure
- classification is made by a majority vote among the neighbors
k-nearest-neighbor, 2-D example. [Figure: given n points of classes a and b, a new point is classified by majority vote among its k nearest neighbors; shown for k=2 and k=5]
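The 2-D example above can be sketched in a few lines (Euclidean distance and a plain majority vote; tie-breaking and distance weighting are left out, and the points are made up):

```python
import numpy as np
from collections import Counter

def knn_classify(train_pts, train_labels, query, k):
    """Label `query` by majority vote among its k nearest training points."""
    dists = np.linalg.norm(train_pts - query, axis=1)
    nearest = np.argsort(dists)[:k]
    votes = Counter(train_labels[i] for i in nearest)
    return votes.most_common(1)[0][0]

pts = np.array([[0, 0], [0, 1], [1, 0], [5, 5], [5, 6], [6, 5]])
labels = ["a", "a", "a", "b", "b", "b"]
knn_classify(pts, labels, np.array([0.5, 0.5]), k=3)   # → "a"
```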
Linear discriminant analysis (LDA), background. In machine learning, the goal of statistical classification is to use an object's characteristics to identify which class (or group) it belongs to.
LDA, background. A linear classifier achieves this by making a classification decision based on the value of a linear combination of the characteristics. An object's characteristics are also known as feature values and are typically presented to the machine in a vector called a feature vector.
LDA, background. There are two broad classes of methods for determining the parameters of a linear classifier:
- generative models (conditional density functions), e.g. LDA (Fisher's linear discriminant)
- discriminative models, e.g. the support vector machine (SVM)
LDA, background. Discriminative training often yields higher accuracy than modeling the conditional density functions; however, handling missing data is often easier with conditional density models.
Linear discriminant analysis (LDA). LDA seeks to reduce dimensionality while preserving as much of the class-discriminatory information as possible: it finds a linear subspace that maximizes class separability among the feature-vector projections in this space.
LDA, two classes. Given a set of D-dimensional samples {x1, ..., xN}, divided into two groups (N1 samples belong to class w1, N2 to class w2), we seek a scalar y by projecting the samples x onto a line: y = wᵀx. (Source: http://research.cs.tamu.edu)
LDA, two classes. Of all the possible lines, we would like to select the one that maximizes the separability of the projected scalars.
LDA, two classes. Try to separate the two classes by projecting them onto different lines. [Figure: an unsuccessful separation]
LDA, two classes. [Figure: a successful separation] Note that the problem's dimensionality is reduced from two features (x1, x2) to a single scalar value y.
To find a good projection vector, we first need to define a measure of separation; a natural candidate is the distance between the projected mean vectors. [Figure: the axis with the larger distance between means is not necessarily the axis that yields better class separability]
LDA, two classes: Fisher's solution. Fisher suggested maximizing the difference between the projected means, normalized by a measure of the within-class scatter. For each class we define the scatter s̃i², an equivalent of the variance of the projected samples. The Fisher linear discriminant is the linear function wᵀx that maximizes the criterion J(w) = |μ̃1 − μ̃2|² / (s̃1² + s̃2²).
LDA, two classes: Fisher's solution. We are therefore looking for a projection where samples from the same class are projected very close to each other while, at the same time, the projected means are as far apart as possible. [Figure: projection onto the line with normal vector w]
Two classes, example: two sample classes X1, X2.
Two classes, example. m1, m2 are the mean vectors of each class; S1, S2 are the covariance matrices of X1 and X2 (the scatter).
Two classes, example. Sb = (m1 − m2)(m1 − m2)ᵀ is the between-class scatter matrix; Sw = S1 + S2 is the within-class scatter matrix.
Two classes, example. The optimal projection is found from the eigenvalues and eigenvectors of Sw⁻¹Sb.
Two classes, example. [Figure: the LDA projection found by Fisher's linear discriminant] The projection vector with the highest eigenvalue provides the highest discrimination power between the classes.
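The two-class recipe above condenses to a few lines: for two classes, the top eigenvector of Sw⁻¹Sb is simply w ∝ Sw⁻¹(m1 − m2), so no eigen-solver is needed. The toy data below is made up for illustration:

```python
import numpy as np

def fisher_lda(x1, x2):
    """Two-class Fisher discriminant: w ∝ Sw^{-1} (m1 - m2),
    where Sw is the within-class scatter matrix."""
    m1, m2 = x1.mean(axis=0), x2.mean(axis=0)
    s1 = (x1 - m1).T @ (x1 - m1)       # scatter of class 1
    s2 = (x2 - m2).T @ (x2 - m2)       # scatter of class 2
    sw = s1 + s2                       # within-class scatter
    w = np.linalg.solve(sw, m1 - m2)
    return w / np.linalg.norm(w)

x1 = np.array([[1.0, 2.0], [2.0, 3.0], [3.0, 3.0], [4.0, 5.0]])
x2 = np.array([[4.0, 2.0], [5.0, 0.0], [5.0, 2.0], [6.0, 3.0]])
w = fisher_lda(x1, x2)
# projecting both classes onto w (x1 @ w, x2 @ w) separates them cleanly
```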
LDA limitation. LDA is a parametric method, since it assumes Gaussian conditional density models; therefore, if the sample distribution is non-Gaussian, LDA will have difficulty classifying complex structures.
Colorization using optimization (Levin et al.). The user scribbles the desired colors inside regions, and the colors are propagated to all pixels. The method works in YUV space. Remember: neighboring pixels with similar intensities should have similar colors.
Colorization using optimization (Levin et al.). Input: an intensity volume Y(x, y, t). Output: color volumes U(x, y, t) and V(x, y, t). The method minimizes the difference between the color at each pixel r and the weighted average of the colors of its neighbors s, where w(r, s) is a weighting function that sums to one and is large when Y(r) and Y(s) are similar; μr and σr are the mean and variance of the intensities in a window around the pixel r.
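A common form for such weights, used here as an illustrative sketch with σ² taken as the window's intensity variance, is w(r, s) ∝ exp(−(Y(r) − Y(s))² / 2σr²), normalized to sum to one:

```python
import numpy as np

def affinity_weights(y_center, y_neighbors, eps=1e-6):
    """Levin-style affinity weights: neighbors whose intensity is close
    to the center pixel's get large weight; the weights sum to one."""
    var = np.var(np.append(y_neighbors, y_center)) + eps
    w = np.exp(-(y_neighbors - y_center) ** 2 / (2 * var))
    return w / w.sum()

w = affinity_weights(0.5, np.array([0.48, 0.52, 0.9]))
# the similar-intensity neighbors (0.48, 0.52) dominate the average,
# so their colors dominate the propagated color
```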
[Figures: scribbled inputs and colorized results, colorization using optimization]
Colorization by Example (R. Irony, D. Cohen-Or and D. Lischinski). The main disadvantage of the Levin et al. process is the need to manually add the colored scribbles. If we could place the colored scribbles automatically, the Levin et al. optimization could do the rest: given a reference color image and a grayscale one, the new process should output a colored image.
Colorization by Example. [Figure: pipeline: input grayscale image + input reference colored image → automatically created scribbled image → output colored image]
Training stage. Segment the reference image.
Training stage. Use the reference image in the luminance space (the Y dimension).
Training stage. Randomly extract k × k pixel neighborhoods, each surrounding a single pixel (and match each to its given segment label).
Training stage. Apply the DCT to each k × k neighborhood; the DCT coefficients form the feature vector added to the training set.
Colorization by Example. Colorization by example has four stages. I. Training: the luminance channel of the reference image, along with the accompanying partial segmentation, is provided as a training set; from it we construct a feature space and a corresponding classifier. To classify pixels, the classifier must be able to distinguish between the different classes based mainly on texture.
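Putting the training steps together, the sampling loop might look like the sketch below; the patch size, sample count, and the stand-in luminance and label images are illustrative assumptions:

```python
import numpy as np

def dct_matrix(k):
    """Orthonormal 1-D DCT-II matrix, applied separably to 2-D patches."""
    n = np.arange(k)
    d = np.cos(np.pi * (n[None, :] + 0.5) * n[:, None] / k)
    d *= np.sqrt(2.0 / k)
    d[0] = np.sqrt(1.0 / k)
    return d

def sample_training_set(lum, labels, k, n_samples, rng):
    """Randomly sample k×k luminance neighborhoods, DCT each one, and
    pair the coefficient vector with the center pixel's segment label."""
    d = dct_matrix(k)
    h, w = lum.shape
    feats, labs = [], []
    for _ in range(n_samples):
        r = rng.integers(0, h - k + 1)
        c = rng.integers(0, w - k + 1)
        patch = lum[r:r + k, c:c + k]
        feats.append((d @ patch @ d.T).ravel())        # DCT feature vector
        labs.append(labels[r + k // 2, c + k // 2])    # center pixel's label
    return np.array(feats), np.array(labs)

rng = np.random.default_rng(0)
lum = rng.random((32, 32))            # stand-in luminance channel
seg = np.zeros((32, 32), dtype=int)
seg[:, 16:] = 1                       # stand-in two-segment labeling
X, y = sample_training_set(lum, seg, k=5, n_samples=100, rng=rng)
```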
Classification stage:
- extract the k × k pixel neighborhood surrounding each pixel of the grayscale image
- apply the DCT to the k × k neighborhood to obtain its feature vector
- feed the vector to the classifier
The classifier examines the K nearest neighbors of the feature vector and chooses the label by a majority vote.
Classification stage. [Figure: sometimes k-NN alone is not enough]
Classification stage. For better results, use a discriminating subspace found by LDA.
Classification stage. Apply k-NN in the discriminating subspace.
Classification stage. The result of this process is a transformation T which maps the vector of k² DCT coefficients to a point in the low-dimensional subspace. Let f(p) be the feature vector (DCT coefficients) of pixel p; the distance between pixels p and q is then taken as the distance between T f(p) and T f(q).
Classification stage. [Figure: grayscale image; naive nearest-neighbor labeling; voting in feature space]
Classification stage. Replace the label of p with the dominant label in its k × k neighborhood, where the dominant label is the label with the highest confidence conf(p, l).
Classification stage. The confidence conf(p, l) weighs the votes for label l among the neighbors of p, where: p is the middle pixel; N(p) are the pixels in its k × k neighborhood; N(p, l) are those neighbors carrying label l; Mq is the nearest neighbor of q in the feature space which has the same label as q; and Wq is a weight that depends on the distance between the pixel q and its best match Mq. Note: this improvement is done in image space.
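As a sketch, the confidence for a label can be read as the weighted fraction of neighbors supporting it; the labels and weights below are made-up stand-ins for N(p) and Wq:

```python
def confidence(neighbor_labels, weights, label):
    """Weighted fraction of p's neighbors voting for `label`."""
    total = sum(weights)
    votes = sum(w for l, w in zip(neighbor_labels, weights) if l == label)
    return votes / total

labels = ["sky", "sky", "sky", "grass"]
weights = [0.9, 0.8, 0.7, 0.2]        # Wq: quality of each neighbor's match
confidence(labels, weights, "sky")    # → 0.923..., the dominant label
```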
[Figures: classification stage results]
Colorization by Example. II. Classification: attempt to robustly determine, for each grayscale image pixel, which region should be used as a color reference for that pixel, using the pixel's nearest neighbors in the feature space.
Color transfer stage. Getting the color for each grayscale pixel: C(p) denotes the chrominance coordinates of a pixel p, and Mq(p) denotes the pixel in the colored reference image whose position with respect to Mq is the same as the position of p with respect to q.
Color transfer stage. Each neighbor of p (say q and r) has a matching neighborhood in the reference image (Mq and Mr respectively), each of which "predicts" a different color for p. The color of p is a weighted average of these predictions.
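That weighted average is straightforward; in this sketch each neighbor's prediction is a (U, V) chrominance pair and the weights reflect match quality (both made up for illustration):

```python
import numpy as np

def transfer_color(predicted_uv, weights):
    """Blend the chrominance (U, V) predictions made by each
    image-space neighbor's reference match, weighted by match quality."""
    w = np.asarray(weights, dtype=float)
    w /= w.sum()                       # normalize the weights
    return w @ np.asarray(predicted_uv, dtype=float)

# two neighbors predict slightly different chroma for pixel p
uv = [[0.10, 0.30], [0.20, 0.40]]
c = transfer_color(uv, weights=[3.0, 1.0])   # → [0.125, 0.325]
```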
Colorization by Example. III. Color transfer: the matches found for each pixel and its image-space neighbors determine the color that should be assigned to each pixel, along with a measure of confidence in that choice.
Optimization stage. Transferring color in this manner already produces a colorized result; however, since some areas might still be misclassified, the colorization will be wrong in those areas.
Optimization stage. To improve the colorization, color is transferred only to pixels whose label confidence is sufficiently large, conf(p, l) > 0.5; those pixels are treated as "micro-scribbles".
Colorization by Example. IV. Optimization: the pixels with a high level of confidence are given as "micro-scribbles" to the optimization-based colorization algorithm of Levin et al.
Results. [Figures: colorization results]
Let's review:
- Grayscale coloring background
- Luminance / luminance channel
- Segmentation
- Discrete Cosine Transform
- k-nearest-neighbor (k-NN)
- Linear Discriminant Analysis
- Colorization using optimization (Levin et al.)
- Colorization by Example: (i) training, (ii) classification, (iii) color transfer, (iv) optimization
References
- http://www.realtypin.com/news/story/1066-how-to-choose-the-perfect-paint-color
- http://www.gimp.org/tutorials/Selective_Color/
- http://www.inf.ufrgs.br/~eslgastal/DomainTransform/Colorization/index.html
- http://www.kyivpost.com/guide/people/ukrainian-puts-color-into-soviet-movies-320761.html?flavour=mobile
- http://www.blackmagic-color.com/?xcmpx=2557
- http://maginmp.blogspot.co.il/2012/11/rec601709-and-luminance-range-explained.html
- http://www.pixiq.com/article/luminance-vs-luminosity
- http://www.umiacs.umd.edu/~mishraka/activeSeg.html
- http://www.na-mic.org/Wiki/index.php/Projects:BayesianMRSegmentation
- http://ivrg.epfl.ch/research/superpixels
- http://en.wiktionary.org/wiki/superpixel
- http://en.wikipedia.org/wiki/YUV
- http://www.slidefinder.net/n/nearest_neighbor_locality_sensitive_hashing/nearestniegborlecture/15562362
- http://webee.technion.ac.il/~lihi/Teaching/048983/Colorization.pdf