Pixel Recovery via $\ell_1$ Minimization in the Wavelet Domain. Ivan W. Selesnick, Richard Van Slyke, and Onur G. Guleryuz.

Presentation transcript:

Pixel Recovery via $\ell_1$ Minimization in the Wavelet Domain
Ivan W. Selesnick*, Richard Van Slyke*, and Onur G. Guleryuz# (presenting author)
*: Polytechnic University, Brooklyn, NY
#: DoCoMo Communications Laboratories USA, Inc., San Jose, CA

Overview
Problem statement: estimation/recovery of missing data.
Formulation as a linear expansion over an overcomplete basis.
Expansions that minimize the $\ell_1$ norm. Why do this?
Connections to adaptive linear estimators and sparsity.
Connections to recent sparse recovery results and to the statistics literature.
Simulation results and comparisons to our earlier work.
Why not to do this: analysis of what is going on.
Conclusion and ways of modifying the solutions for better results.
(Presentation is much more detailed than the paper. Some software is available; please check the paper.)

Problem Statement
An image has a lost block of pixels; the surrounding pixels are available (assume zero mean).
[Figure: 1. the image with the lost block; 2. derive the predicted pixels; 3. the recovered image.]

Formulation
1. Take an N x M matrix H of overcomplete basis vectors (M > N).
2. Write y in terms of the basis: $y = Hc$.
3. Find the expansion coefficients c (two ways), subject to the available data constraint: the projection of y onto the available pixels must match the available data.
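As a concrete illustration of this setup, here is a minimal numpy sketch. It assumes a random unit-norm frame in place of the paper's dual-tree DWT, and the names (H_avail, b_avail, mask) are illustrative rather than the paper's notation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Overcomplete basis: N-dimensional signals, M > N basis vectors.
N, M = 64, 128
H = rng.standard_normal((N, M))
H /= np.linalg.norm(H, axis=0)   # unit-norm columns

# A signal that is genuinely sparse in H (5 active basis vectors).
x = H[:, :5] @ rng.standard_normal(5)

# Lost block: a contiguous run of missing pixels.
mask = np.ones(N, dtype=bool)
mask[20:36] = False

# Available data constraint: keep only the rows of H at the known pixels.
H_avail = H[mask, :]   # plays the role of P H
b_avail = x[mask]      # plays the role of P x
```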

Find the expansion coefficients to minimize the $\ell_1$ norm of the expansion coefficients (the regularization):

$\min_c \|c\|_1$ subject to $P(Hc) = P(x)$ (the available data constraint),

where $P$ denotes restriction to the available pixels.
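This constrained $\ell_1$ problem can be posed as a linear program. A hedged sketch using scipy and the standard $c = u - v$ splitting, reusing H_avail and b_avail from the sketch above (l1_min_equality is an illustrative name, not from the paper):

```python
import numpy as np
from scipy.optimize import linprog

def l1_min_equality(A, b):
    """Solve min ||c||_1 subject to A c = b as an LP (c = u - v, u, v >= 0)."""
    m = A.shape[1]
    cost = np.ones(2 * m)          # objective: sum(u) + sum(v) = ||c||_1
    A_eq = np.hstack([A, -A])      # A u - A v = b
    res = linprog(cost, A_eq=A_eq, b_eq=b, bounds=(0, None), method="highs")
    u, v = res.x[:m], res.x[m:]
    return u - v

c_hat = l1_min_equality(H_avail, b_avail)
y = H @ c_hat                      # recovered signal; agrees with x on the known pixels
```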

Why minimize the $\ell_1$ norm?
Bogus reason: "under an i.i.d. Laplacian model for the coefficient probabilities, $\ell_1$ norm minimization ($\min_c \|c\|_1$ subject to the data constraint) is the MAP estimate."
Real reason: sparse decompositions.
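For completeness, the "bogus reason" spelled out: with an i.i.d. Laplacian prior on the coefficients and the noiseless data constraint, MAP estimation reduces to $\ell_1$ minimization:

```latex
\hat{c} \;=\; \arg\max_{c\,:\,P(Hc)=P(x)} \prod_i \tfrac{\lambda}{2}\, e^{-\lambda |c_i|}
        \;=\; \arg\min_{c\,:\,P(Hc)=P(x)} \lambda \sum_i |c_i|
        \;=\; \arg\min_{c\,:\,P(Hc)=P(x)} \|c\|_1 .
```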

What does sparsity have to do with estimation/recovery?
1. Any such decomposition builds an adaptive linear estimate.
2. In fact, "any" estimate can be written in this form.
Onur G. Guleryuz, "Nonlinear Approximation Based Image Recovery Using Adaptive Sparse Reconstructions and Iterated Denoising: Part I - Theory," IEEE Tr. on IP, in review. (google: onur guleryuz)

The recovered signal must be sparse
3. The recovered signal becomes $y = Hc$. The available data constraint leaves a null space of dimension at least $M - N$, so many expansions agree with the data; minimizing the $\ell_1$ norm selects one in which most coefficients are zero, i.e., y has to be sparse.
Onur G. Guleryuz, "Nonlinear Approximation Based Image Recovery Using Adaptive Sparse Reconstructions and Iterated Denoising: Part I - Theory," IEEE Tr. on IP, in review. (google: onur guleryuz)

Who cares about y, what about the original x?
1. Predictable implies sparse: if successful prediction is possible, i.e., if $\|x - y\|$ is small, then x is also ~sparse.
2. Sparsity of x is therefore not a bad leap of faith to make in estimation: if x is not sparse, it cannot be estimated well anyway. (Caveat: the data may be sparse, but not in the given basis.)
Onur G. Guleryuz, "Nonlinear Approximation Based Image Recovery Using Adaptive Sparse Reconstructions and Iterated Denoising: Part I - Theory," IEEE Tr. on IP, in review. (google: onur guleryuz)

Why minimize the $\ell_1$ norm?
Under certain conditions, the convex problem

$\min_c \|c\|_1$ subject to $P(Hc) = P(x)$

gives the solution to the combinatorial problem

$\min_c \|c\|_0$ subject to $P(Hc) = P(x)$:

find the "most predictable"/sparsest expansion that agrees with the data, solving a convex problem rather than a combinatorial one.
D. Donoho, M. Elad, and V. Temlyakov, "Stable Recovery of Sparse Overcomplete Representations in the Presence of Noise."
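The equivalence is easy to see numerically on a toy problem. An illustrative sketch (not from the paper): brute-force the $\ell_0$ problem by searching supports, and compare with the l1_min_equality LP above; on a small decoherent matrix with a genuinely sparse ground truth, the two typically find the same support:

```python
import itertools
import numpy as np

def l0_min_equality(A, b, tol=1e-8):
    """Brute-force the sparsest c with A c = b (combinatorial; tiny problems only)."""
    m = A.shape[1]
    for k in range(1, m + 1):                       # try supports of growing size
        for support in itertools.combinations(range(m), k):
            cols = A[:, list(support)]
            c_s = np.linalg.lstsq(cols, b, rcond=None)[0]
            if np.linalg.norm(cols @ c_s - b) < tol:
                c = np.zeros(m)
                c[list(support)] = c_s
                return c
    return None

rng = np.random.default_rng(1)
A = rng.standard_normal((6, 12))
A /= np.linalg.norm(A, axis=0)
c0 = np.zeros(12)
c0[[2, 9]] = [1.5, -0.7]                            # 2-sparse ground truth
b = A @ c0

print(np.flatnonzero(np.abs(l0_min_equality(A, b)) > 1e-6))  # [2 9]
print(np.flatnonzero(np.abs(l1_min_equality(A, b)) > 1e-6))  # typically [2 9] as well
```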

Why minimize the $\ell_1$ norm?
Experience from the statistics literature: the "lasso" (least squares subject to an $\ell_1$ budget, $\|c\|_1 \le t$) is known to generate sparse expansions.
R. Tibshirani, "Regression shrinkage and selection via the lasso," J. Royal Statist. Soc. B, Vol. 58, No. 1, pp. 267-288, 1996.
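A quick illustration of that sparsity using scikit-learn, which solves the penalized (Lagrangian) form of the lasso, equivalent to the constrained form for a matching budget t; the sizes and alpha below are arbitrary:

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(2)
A = rng.standard_normal((50, 200))           # 200 regressors, only 4 truly active
b = A[:, :4] @ np.array([2.0, -1.0, 0.5, 1.5]) + 0.01 * rng.standard_normal(50)

lasso = Lasso(alpha=0.05, max_iter=10_000)   # alpha weights the l1 penalty
lasso.fit(A, b)
print("nonzero coefficients:", np.count_nonzero(lasso.coef_))  # a handful out of 200
```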

Simulation Results
$\min_c \|c\|_1$ subject to the available data constraint, vs. Iterated Denoising (ID) with no layering and no selective thresholding.
H: two-times expansive (M = 2N), real, dual-tree DWT; the real part of: N. G. Kingsbury, "Complex wavelets for shift invariant analysis and filtering of signals," Appl. Comput. Harmon. Anal., 10(3):234-253, May 2001.
Onur G. Guleryuz, "Nonlinear Approximation Based Image Recovery Using Adaptive Sparse Reconstructions and Iterated Denoising: Part II - Adaptive Algorithms," IEEE Tr. on IP, in review. (google: onur guleryuz)

Simulation Results

Sparse Modeling Generates Non-Convex Problems
[Figure: a "two pixel" image shown in pixel coordinates and in transform coordinates, with one available pixel and one missing pixel; the available pixel constraint is a line in these coordinates.]

"Sparse = non-convex", who cares. What about reality, natural images?
[Figure: a natural image expressed as a sum of components.]

Geometry
[Figure: the $\ell_1$ ball and the available pixel constraint in transform coordinates, in three cases (Case 1, Case 2, Case 3). In Case 3 the minimum $\ell_1$ solution is not sparse; the Laplacian "bogus reason" offers no protection there.]

Why not to minimize the $\ell_1$ norm
What about all the optimality/sparsest results? Results such as D. Donoho, M. Elad, and V. Temlyakov, "Stable Recovery of Sparse Overcomplete Representations in the Presence of Noise," are very impressive, but they are tied closely to H providing the sparsest decomposition for x. Here the "noise" is overwhelming: modeling error plus the error due to missing data.

Why not to minimize the $\ell_1$ norm
The minimization $\min_c \|c\|_1$ subject to $P(Hc) = P(x)$ only ever sees the cropped basis $PH$. H may be a "nice", "decoherent" basis, but the "not nice" cropped basis (due to cropping away the missing pixels) may become very "coherent" (the problem is due to $P$).

Examples
H orthonormal: coherency = 0. After cropping: unnormalized coherency = ..., normalized coherency = 1 (the worst possible).
1. The optimal $\ell_1$ solution sometimes tries to make the coefficients of scaling functions zero.
2. The $\ell_1$ solution never sees the actual problem.
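The cropping effect is easy to check numerically. A small illustrative sketch (not the paper's wavelet example) that computes mutual coherence before and after deleting rows:

```python
import numpy as np

def coherence(A):
    """Mutual coherence: max |inner product| between distinct normalized columns."""
    G = A / np.linalg.norm(A, axis=0)
    gram = np.abs(G.T @ G)
    np.fill_diagonal(gram, 0.0)
    return gram.max()

rng = np.random.default_rng(3)
Q = np.linalg.qr(rng.standard_normal((64, 64)))[0]   # orthonormal basis
mask = np.ones(64, dtype=bool)
mask[16:48] = False                                  # crop away a block of rows

print(coherence(Q))        # ~0: orthonormal, fully decoherent
print(coherence(Q[mask]))  # much larger: cropping makes the basis coherent
```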

What does ID do?
Progression 1: decomposes the big problem into many progressions; arrives at the final complex problem by solving much simpler problems.
Progression 2: uses the correctly modeled components to reduce the overwhelming errors/"noise".
$\ell_1$ minimization is conceptually a single-step, greedy version of ID.
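A minimal sketch of the ID loop, under stated assumptions: an orthonormal DCT stands in for the paper's overcomplete dual-tree DWT, a 1-D signal stands in for an image, and the threshold schedule is arbitrary. What matters is the structure: transform, hard-threshold, inverse, re-impose the available data, with a decreasing threshold progression:

```python
import numpy as np
from scipy.fft import dct, idct

def iterated_denoising(x_obs, mask, thresholds, iters_per_threshold=20):
    """Fill missing samples: transform -> hard-threshold -> inverse ->
    re-impose the known samples, over a decreasing threshold progression."""
    y = np.where(mask, x_obs, 0.0)          # crude initial fill of the lost block
    for T in thresholds:                    # progressions: easy problems first
        for _ in range(iters_per_threshold):
            c = dct(y, norm="ortho")
            c[np.abs(c) < T] = 0.0          # hard thresholding (not soft / l1)
            y = idct(c, norm="ortho")
            y[mask] = x_obs[mask]           # available data constraint
    return y

t = np.linspace(0, 1, 128)
x = np.cos(2 * np.pi * 3 * t) + 0.5 * np.cos(2 * np.pi * 7 * t)
mask = np.ones(128, dtype=bool)
mask[50:70] = False                         # the lost block
x_hat = iterated_denoising(x, mask, thresholds=np.linspace(1.0, 0.05, 10))
print(np.abs(x_hat[~mask] - x[~mask]).max())  # error on the recovered block
```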

ID is all about robustly selecting sparsity
Tries to be sparse, not the sparsest.
Robust to model failures.
Other constraints are easy to incorporate.

Conclusion
1. Have to be more agnostic than smoothest, sharpest, smallest, sparsest, *est: minimum MSE is not necessarily the sparsest.
2. Have to be more robust to modeling errors. When a convex approximation to the underlying non-convex problem is possible, great; but have to make sure the assumptions are not violated.
3. Is it still possible to use $\ell_1$, but with ID principles? Yes. For lasso/$\ell_1$ fans:

Solve $\min_c \|c\|_1$ subject to the available data constraint progressively, but make sure there are no Case 3 problems (ID stays away from those).
1. It's not about the lasso or how you tighten the lasso; it's about what (plural) you tighten the lasso to. At each step ask: do you think you reduced the MSE? No: you shouldn't have done this. Yes: do it again.
2. This is not "LASSO", "LARS", ... This is Iterated Denoising (use hard thresholding!).