Various Regularization Methods in Computer Vision Min-Gyu Park Computer Vision Lab. School of Information and Communications GIST

Vision Problems (intro)
Vision problems such as stereo matching, optical flow estimation, denoising, and segmentation are typically ill-posed.
–Because they are inverse problems.
Properties of a well-posed problem:
–Existence: a solution exists.
–Uniqueness: the solution is unique.
–Stability: the solution depends continuously on the input data.

Vision Problems (intro)
For vision problems it is difficult to compute the solution directly.
–Then how do we find a meaningful solution to such a hard problem?
Impose prior knowledge on the solution.
–That is, we restrict the space of possible solutions to physically meaningful ones.

Vision Problems (intro)
This seminar is about imposing our prior knowledge on the solution, or on the scene.
There are various kinds of approaches:
–Quadratic regularization,
–Total variation,
–Piecewise smooth models,
–Stochastic approaches,
–with either L1 or L2 data fidelity terms.
We will study the properties of these different priors.

Bayesian Inference & Probabilistic Modeling
We will look at the simple denoising problem.
–f is the noisy input image, u is the noise-free (denoised) image, and n is Gaussian noise.
Our objective is to find the posterior distribution p(u|f).
–The posterior can be estimated directly, or via Bayes' rule, as shown below.
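In symbols, with additive Gaussian noise, the model and Bayes' rule read:

  f = u + n, \qquad n \sim \mathcal{N}(0, \sigma^2)

  p(u \mid f) = \frac{p(f \mid u)\, p(u)}{p(f)} \propto p(f \mid u)\, p(u)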

Bayesian Inference & Probabilistic Modeling
Probabilistic modeling: depending on how we model p(u), the solution will be significantly different.
In p(u|f) = p(f|u) p(u) / p(f):
–p(f|u): likelihood term (data fidelity term).
–p(u): prior term.
–p(f): evidence (does not depend on u).

De-noising Problem
Critical issue:
–How to smooth the input image while preserving important features such as image edges.
[Figure: input (noisy) image vs. de-noised image via an L1 regularization term]

De-noising Problem
Formulation: quadratic smoothness of the first-order derivatives (see below).
–A first-order quadratic prior favors flat surfaces; a second-order one favors quadratic surfaces.
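A standard form of this first-order quadratic prior, consistent with the Gibbs-energy derivation on the next slide, is:

  p(u) \propto \exp\!\left( -\lambda \int_\Omega |\nabla u|^2 \, dx \right)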

De-noising Problem
By combining the likelihood and prior terms, the posterior takes exactly the form of a Gibbs distribution.
Thus, maximizing p(f|u)p(u) is equivalent to minimizing the free energy of the Gibbs distribution.
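Concretely, taking the negative logarithm of p(f|u)p(u) and dropping constants yields the energy:

  p(u \mid f) \propto \exp(-E(u)), \qquad
  E(u) = \frac{1}{2\sigma^2} \int_\Omega (u - f)^2 \, dx + \lambda \int_\Omega |\nabla u|^2 \, dx

so maximizing the posterior is the same as minimizing E(u).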

How to minimize the energy function?
Directly solve the Euler-Lagrange equation.
–Because the energy is convex, it has a globally unique minimizer.
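For the quadratic energy the Euler-Lagrange equation is linear,

  \frac{1}{\sigma^2}(u - f) - 2\lambda\, \Delta u = 0,

and gradient descent converges to its unique solution. A minimal NumPy sketch of this descent (function name, parameter values, and step size are illustrative choices, not from the slides):

    import numpy as np

    def denoise_quadratic(f, lam=1.0, tau=0.05, n_iter=500):
        """Gradient descent on E(u) = 0.5*sum((u - f)**2) + lam*sum(|grad u|**2).
        The gradient is Lipschitz with L = 1 + 16*lam, so tau must stay below 2/L."""
        u = f.copy()
        for _ in range(n_iter):
            # 5-point Laplacian with replicated (Neumann) boundaries
            up = np.pad(u, 1, mode='edge')
            lap = (up[:-2, 1:-1] + up[2:, 1:-1] +
                   up[1:-1, :-2] + up[1:-1, 2:] - 4.0 * u)
            # dE/du = (u - f) - 2*lam*lap; step downhill
            u -= tau * ((u - f) - 2.0 * lam * lap)
        return u

    # usage (assuming `noisy` is a float image scaled to [0, 1]):
    # clean = denoise_quadratic(noisy)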

The Result of a Quadratic Regularizer
[Figure: input (noisy) image vs. result]
Noise is removed (smoothed), but edges are also blurred. The result is not satisfactory.

Why? Because of the quadratic penalty's bias against discontinuities.
Discontinuities are penalized more heavily: for an intensity jump of height h, the quadratic penalty grows as h², so a single step of height 2 costs 4 while two steps of height 1 cost only 2. The L1 norm (total variation) charges h either way, treating both the same.

Pros & Cons
If there is no discontinuity in the desired result (e.g., a depth map, surface, or noise-free image), a quadratic regularizer is a good choice.
–The L2 regularizer is biased against discontinuities.
–Easy to solve: gradient descent will find the solution.
A quadratic problem has a unique global solution.
–Meaning it is a well-posed problem.
–But we cannot guarantee that the solution is truly correct.

Introduction to Total Variation
If we use the L1 norm for the smoothness prior, the prior becomes a Laplacian distribution on the image gradient.
If we further assume the noise variance is 1, the energy simplifies as shown below.
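In symbols (a standard reconstruction, matching the ROF model introduced below):

  p(u) \propto \exp\!\left( -\lambda \int_\Omega |\nabla u| \, dx \right), \qquad
  E(u) = \frac{1}{2} \int_\Omega (u - f)^2 \, dx + \lambda \int_\Omega |\nabla u| \, dx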

Introduction to Total Variation
The smoothness part of the free energy is then the total variation of the function u.
[Figure: a 1D function u(x); its total variation sums the absolute heights of all its ups and downs]
Definition of total variation (see below).
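For a differentiable u the total variation is simply the integral of the gradient magnitude; the general definition is the dual (distributional) form:

  TV(u) = \int_\Omega |\nabla u| \, dx
        = \sup\left\{ \int_\Omega u \, \operatorname{div} p \, dx \;:\; p \in C_c^1(\Omega, \mathbb{R}^2),\ \|p\|_\infty \le 1 \right\}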

Characteristics of Total Variation
Advantages:
–No bias against discontinuities.
–Contrast invariant, without explicitly modeling the lighting conditions.
–Robust under impulse noise.
Disadvantages:
–The objective is non-smooth (the TV term is not differentiable where the gradient vanishes), so it is harder to minimize; such problems lie between easy smooth convex problems and hard non-convex ones.

How to solve it?
With L1 or L2 data terms, we can use:
–Variational methods
 Explicit time marching
 Linearization of the Euler-Lagrange equation
 Nonlinear primal-dual methods
 Nonlinear multi-grid methods
–Graph cuts
–Convex optimization (first-order schemes)
–Second-order cone programming
Some of these can also be applied to solve the original non-convex problems.

Variational Methods
Definition:
–Informally speaking, they are based on solving the Euler-Lagrange equations.
Problem definition (constrained form, shown below): the first total-variation-based approach in computer vision, due to Rudin, Osher and Fatemi, is known as the ROF model (1992).
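The constrained ROF problem minimizes total variation subject to a constraint fixing the residual to the known noise level σ² (the original paper also constrains the mean of u - f to zero):

  \min_u \int_\Omega |\nabla u| \, dx \quad \text{s.t.} \quad \int_\Omega (u - f)^2 \, dx = \sigma^2 |\Omega|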

Variational Methods
Unconstrained (Lagrangian) model:

  \min_u \int_\Omega |\nabla u| \, dx + \frac{\lambda}{2} \int_\Omega (u - f)^2 \, dx

It can be solved by an explicit time marching scheme (a code sketch follows below):

  u^{t+1} = u^t + \tau \left( \operatorname{div}\!\left( \frac{\nabla u^t}{|\nabla u^t|} \right) - \lambda (u^t - f) \right)
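A minimal NumPy sketch of this update, regularizing |∇u| with a small ε to avoid division by zero (function name and parameter values are illustrative, not from the slides):

    import numpy as np

    def denoise_tv_time_marching(f, lam=8.0, tau=0.02, eps=0.1, n_iter=300):
        """Explicit time marching on du/dt = div(grad u / |grad u|) - lam*(u - f).
        eps regularizes |grad u| near zero; it also limits the stable step size
        (roughly tau <= eps/4), so a smaller eps needs a smaller tau."""
        u = f.copy()
        for _ in range(n_iter):
            # forward differences, zero at the right/bottom border (Neumann)
            ux = np.diff(u, axis=1, append=u[:, -1:])
            uy = np.diff(u, axis=0, append=u[-1:, :])
            mag = np.sqrt(ux**2 + uy**2 + eps**2)   # regularized |grad u|
            px, py = ux / mag, uy / mag
            # divergence via backward differences (negative adjoint of the gradient)
            div_x = px - np.concatenate([np.zeros_like(px[:, :1]), px[:, :-1]], axis=1)
            div_y = py - np.concatenate([np.zeros_like(py[:1, :]), py[:-1, :]], axis=0)
            u += tau * (div_x + div_y - lam * (u - f))
        return u

    # usage (assuming `noisy` is a float image scaled to [0, 1]):
    # clean = denoise_tv_time_marching(noisy)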

Variational Methods
What happens if we change the data fidelity term to the L1 norm?

  E(u) = \int_\Omega |\nabla u| \, dx + \lambda \int_\Omega |u - f| \, dx

This is more difficult to solve (the energy is non-smooth and not strictly convex), but it is robust against outliers such as occlusions.
This formulation is called the TV-L1 model.

Variational Methods
Comparison of the three models in terms of their explicit time marching (gradient flow) schemes, and where the degeneracy comes from:
–L2-L2: \partial_t u = \Delta u - \lambda (u - f)
–TV-L2: \partial_t u = \operatorname{div}(\nabla u / |\nabla u|) - \lambda (u - f)
–TV-L1: \partial_t u = \operatorname{div}(\nabla u / |\nabla u|) - \lambda \operatorname{sign}(u - f)
The degeneracy comes from the division by |\nabla u|, undefined where the gradient vanishes (and, for TV-L1, from sign(u - f), undefined where u = f).

Variational Methods
In the L2-L2 case the update is linear and free of degeneracy:

  u^{t+1} = u^t + \tau \left( \Delta u^t - \lambda (u^t - f) \right)

where \tau is the time step.

Duality-based Approach
Why do we use duality instead of the primal problem?
–The function becomes continuously differentiable.
–Not always, but it does in the case of total variation.
For example, we use the property below to introduce a dual variable p.
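The key identity rewrites the non-differentiable TV term as a maximization of a bilinear, hence smooth, expression over a constrained dual field p:

  \int_\Omega |\nabla u| \, dx = \max_{\|p\|_\infty \le 1} \int_\Omega \langle p, \nabla u \rangle \, dx

Minimizing over u while maximizing over p turns the problem into a saddle-point problem, which primal-dual algorithms exploit.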

Duality-based Approach
A deeper treatment of duality in variational methods will be given in the next seminar.

Applying to Other Problems
–Optical flow (Horn and Schunck: L2-L2; see the example below)
–Stereo matching (TV-L1)
–Segmentation (TV-L2)
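As one example, the Horn and Schunck model is the L2-L2 instance of this framework: a quadratic brightness-constancy data term plus quadratic smoothness on the flow components (here u and v denote the flow field, not an image):

  E(u, v) = \int_\Omega (I_x u + I_y v + I_t)^2 \, dx + \alpha \int_\Omega \left( |\nabla u|^2 + |\nabla v|^2 \right) dx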

Q&A / Discussion