
"A fast method for Underdetermined Sparse Component Analysis (SCA) based on Iterative Detection-Estimation (IDE)"
Arash Ali-Amini (1), Massoud Babaie-Zadeh (1,2), Christian Jutten (2)
(1) Sharif University of Technology, Tehran, IRAN
(2) Laboratory of Images and Signals (LIS), CNRS, INPG, UJF, Grenoble, FRANCE

MaxEnt2006, 8-13 July, Paris, France

Outline
- Introduction to Blind Source Separation
- Geometrical Interpretation
- Sparse Component Analysis (SCA), the underdetermined case
  - Identifying the mixing matrix
  - Source restoration
- Finding sparse solutions of an Underdetermined System of Linear Equations (USLE):
  - Minimum L0 norm
  - Matching Pursuit
  - Minimum L1 norm, or Basis Pursuit (→ Linear Programming)
  - Iterative Detection-Estimation (IDE), our method
- Simulation results
- Conclusions and Perspectives

Blind Source Separation (BSS)
- Source signals: s1, s2, …, sM
- Source vector: s = (s1, s2, …, sM)^T
- Observation vector: x = (x1, x2, …, xN)^T
- Mixing system: x = As
- Goal: find a separating matrix B such that y = Bx
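As a toy illustration of the model above (the 2-by-2 mixing matrix and source values below are made up, and B is simply taken as the inverse of A, which is only available in this determined, non-blind setting):

```python
# Toy instance of the mixing model x = A s; A and s are made up for
# illustration, and B is taken as the exact inverse of A (only possible
# in this determined, non-blind setting).
A = [[1.0, 0.5],
     [0.3, 1.0]]
s = [2.0, -1.0]

def matvec(M, v):
    return [sum(m_ij * v_j for m_ij, v_j in zip(row, v)) for row in M]

x = matvec(A, s)                         # observations: x = A s

det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
B = [[ A[1][1] / det, -A[0][1] / det],
     [-A[1][0] / det,  A[0][0] / det]]   # B = A^{-1} (2x2 inverse)
y = matvec(B, x)                         # y = B x recovers the sources
print(y)
```

In the blind setting A is unknown, so B must be found from x alone; this is what the independence (ICA) and sparsity (SCA) assumptions make possible.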

Blind Source Separation (cont.)
- Assumptions: N >= M (#sensors >= #sources); A is full-rank (invertible)
- Prior information: statistical "independence" of the sources
- Main idea: find B that makes the outputs "independent" (→ Independent Component Analysis, ICA)

Blind Source Separation (cont.)
- Separability theorem [Comon 1994, Darmois 1953]: if at most one source is Gaussian, statistical independence of the outputs implies source separation (→ ICA is a method for BSS)
- Indeterminacies: permutation and scale
- A = [a1, a2, …, aM], x = As → x = s1 a1 + s2 a2 + … + sM aM

Geometrical Interpretation
- x = As
- Statistical independence of s1 and s2 with bounded PDFs → rectangular support for the joint PDF of (s1, s2) → rectangular scatter plot for (s1, s2)

Sparse Sources
- 2 sources, 2 sensors (figure: scatter plots in the s-plane and the x-plane)

Importance of Sparse Source Separation
- The sources may not be sparse in time, yet sparse in another domain (frequency, time-frequency, time-scale, etc.)
- x = As → T{x} = A T{s}, where T is a 'sparsifying' transform
- Example: speech signals are usually sparse in the time-frequency domain.

Sparse Sources (cont.)
- 3 sparse sources, 2 sensors
- Sparsity → source separation even with more sources than sensors?

Estimating the Mixing Matrix
- A = [a1, a2, a3] → x = s1 a1 + s2 a2 + s3 a3
- The mixing matrix is easily identified for sparse sources
- Scale and permutation indeterminacy → normalize: ||ai|| = 1

Restoration of the Sources
- How to find the sources after having found the mixing matrix A?
- 2 equations, 3 unknowns → infinitely many solutions!
- Underdetermined SCA → an underdetermined system of equations

Identification vs. Separation
- #Sources <= #Sensors: identifying A → source separation
- #Sources > #Sensors (underdetermined): identifying A alone does not give source separation → two different problems:
  - Identifying the mixing matrix (relatively easy)
  - Restoring the sources (difficult)

Is It Possible at All?
- A is known; at each instant n0, we must solve an underdetermined linear system of equations
- Infinitely many solutions s(n0) → is it possible to recover the sources?

The 'Sparse' Solution
- si(n) is sparse in time → the vector s(n0) is most likely a 'sparse vector'
- A s(n0) = x(n0) has infinitely many solutions, but not all of them are sparse!
- Idea: to restore the sources, take the sparsest solution (the most likely one)

Example (2 equations, 4 unknowns)
- Some of the solutions are listed on the slide; the sparsest one is singled out.

The Idea of Solving Underdetermined SCA
- A s(n) = x(n), n = 0, 1, …, T
- Step 1 (identification): estimate A (relatively easy)
- Step 2 (source restoration): at each instant n0, find the sparsest solution of A s(n0) = x(n0), n0 = 0, …, T
- Main question: HOW to find the sparsest solution of an Underdetermined System of Linear Equations (USLE)?

Another Application of USLE: Atomic Decomposition over an Overcomplete Dictionary
- Decompose a signal x as a linear combination of a set of fixed signals (atoms)
- Terminology: atoms φi, i = 1, …, M; dictionary: {φ1, φ2, …, φM}

Atomic Decomposition (cont.)
- M = N → complete dictionary → unique set of coefficients
- Examples: the Dirac dictionary and the Fourier dictionary
- Dirac dictionary (atoms shown on the slide)

Atomic Decomposition (cont.)
- M = N → complete dictionary → unique set of coefficients
- Examples: the Dirac dictionary and the Fourier dictionary
- Fourier dictionary (atoms shown on the slide)

Atomic Decomposition (cont.)
- Matrix form: x = Φα, with Φ = [φ1, …, φM]
- If only a few coefficients are non-zero → the underlying structure is revealed very well
- Example: a signal with just a few non-zero samples in time → its decomposition over the Dirac dictionary reveals it
- A signal composed of a few pure frequencies → its decomposition over the Fourier dictionary reveals it
- What about a signal that is the sum of a pure frequency and a Dirac?

Atomic Decomposition (cont.)
- Solution: consider a larger dictionary containing both the Dirac and the Fourier atoms
- M > N → overcomplete dictionary
- Problem: non-uniqueness of α (non-unique representation) (→ USLE)
- However, we are looking for the sparse solution

Sparse Solutions of USLE
- Underdetermined SCA
- Atomic decomposition over overcomplete dictionaries
- Both reduce to finding the sparsest solution of a USLE

Uniqueness of the Sparse Solution
- x = As, n equations, m unknowns, m > n
- Question: is the sparse solution unique?
- Theorem (Donoho 2004): if there is a solution s with fewer than n/2 non-zero components, then it is unique with probability 1 (that is, for almost all A's).

How to Find the Sparsest Solution
- A s = x, n equations, m unknowns, m > n
- Goal: find the sparsest solution
- Note: at least m - n sources are zero.
- Direct method:
  - Set m - n (arbitrary) sources equal to zero
  - Solve the remaining system of n equations in n unknowns
  - Repeat for all possible choices, and keep the sparsest answer
- Another name: the minimum L0-norm method, where the L0 norm of s = number of non-zero components = Σ |si|^0
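The direct method can be sketched in a few lines; the mixing matrix A and observation x below are made-up illustrations, not the example from the talk:

```python
from itertools import combinations

# Hypothetical 2x4 mixing matrix and observation, for illustration only.
A = [[1.0, 0.0, 1.0, 2.0],
     [0.0, 1.0, 1.0, 1.0]]
x = [2.0, 2.0]

def solve2x2(a, b, c, d, e, f):
    """Solve [[a,b],[c,d]] @ [u,v] = [e,f] by Cramer's rule; None if singular."""
    det = a * d - b * c
    if abs(det) < 1e-12:
        return None
    return ((e * d - b * f) / det, (a * f - e * c) / det)

best = None
# Set m - n = 2 of the 4 sources to zero and solve the remaining 2x2 system,
# for every choice of the two positions allowed to be nonzero.
for i, j in combinations(range(4), 2):
    sol = solve2x2(A[0][i], A[0][j], A[1][i], A[1][j], x[0], x[1])
    if sol is None:
        continue
    s = [0.0] * 4
    s[i], s[j] = sol
    l0 = sum(abs(v) > 1e-9 for v in s)   # L0 norm: count of nonzeros
    if best is None or l0 < best[0]:
        best = (l0, s)

print("L0 =", best[0], "s =", best[1])
```

For this A and x the search returns the 1-sparse solution s = (0, 0, 2, 0), since x equals twice the third column.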

Example
- s1 = s2 = 0 → s = (0, 0, 1.5, 2.5)^T → L0 = 2
- s1 = s3 = 0 → s = (0, 2, 0, 0)^T → L0 = 1
- s1 = s4 = 0 → s = (0, 2, 0, 0)^T → L0 = 1
- s2 = s3 = 0 → s = (2, 0, 0, 2)^T → L0 = 2
- s2 = s4 = 0 → s = (10, 0, -6, 0)^T → L0 = 2
- s3 = s4 = 0 → s = (0, 2, 0, 0)^T → L0 = 1
- Minimum L0-norm solution: s = (0, 2, 0, 0)^T

Drawbacks of the Minimum L0 Norm
- Highly (unacceptably) sensitive to noise
- Needs a combinatorial search. Example: m = 50, n = 30 → about 5×10^13 choices of support. On our computer, solving one 30-by-30 system of equations takes 2×10^-4 s, so the total time is about (5×10^13)(2×10^-4) s ≈ 300 years → intractable
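The count behind this estimate is easy to check:

```python
import math

# Number of ways to choose the n = 30 nonzero positions out of m = 50,
# i.e. the number of 30x30 systems the direct search must solve:
n_systems = math.comb(50, 30)
print(n_systems)                                   # about 5 x 10^13

# At 2e-4 s per 30x30 solve, the total time in years:
years = n_systems * 2e-4 / (3600 * 24 * 365)
print(years)                                       # about 300 years
```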

Faster Methods?
- Matching Pursuit [Mallat & Zhang, 1993]
- Basis Pursuit (minimum L1 norm → Linear Programming) [Chen, Donoho, Saunders, 1995]
- Our method (IDE)

Matching Pursuit (MP) [Mallat & Zhang, 1993]
- (The algorithm is given on the slide)

Properties of MP
- Advantage: very fast
- Drawbacks:
  - A very 'greedy' algorithm: a mistake made at one stage can never be corrected
  - The solution is not necessarily sparse
- (Figure: x = s1 a1 + s2 a2, with atoms a1, …, a5)
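A minimal sketch of the MP iteration (the three-atom dictionary and the signal are made up for illustration; real dictionaries are much larger):

```python
# Matching Pursuit: repeatedly pick the atom most correlated with the
# residual, record its coefficient, and subtract its contribution.
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def matching_pursuit(atoms, x, n_iter=10, tol=1e-9):
    """atoms: list of unit-norm atoms; returns the coefficient list."""
    coeffs = [0.0] * len(atoms)
    r = list(x)                                   # residual, starts at x
    for _ in range(n_iter):
        # greedy step: atom with the largest |correlation| with the residual
        k = max(range(len(atoms)), key=lambda i: abs(dot(r, atoms[i])))
        c = dot(r, atoms[k])
        if abs(c) < tol:
            break
        coeffs[k] += c
        r = [ri - c * ai for ri, ai in zip(r, atoms[k])]
    return coeffs

atoms = [[1.0, 0.0], [0.0, 1.0], [0.6, 0.8]]      # unit-norm atoms
x = [1.2, 1.6]                                    # = 2 * atoms[2]
coeffs_result = matching_pursuit(atoms, x)
print(coeffs_result)
```

Here the signal is exactly twice the third atom, so MP succeeds in one step; the greediness problem appears when the signal is a mixture of correlated atoms, because an early wrong pick is never undone.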

Minimum L1 Norm, or Basis Pursuit [Chen, Donoho, Saunders, 1995]
- The minimum L1-norm solution is the MAP estimator under a Laplacian prior
- Recent theoretical support (Donoho, 2004): for 'most' 'large' underdetermined systems of linear equations, the minimum L1-norm solution is also the sparsest solution

Minimum L1 Norm (cont.)
- The minimum L1-norm solution may be found by Linear Programming (LP)
- Fast algorithms for LP: the Simplex and Interior Point methods
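The reduction to LP can be sketched with SciPy's general-purpose LP solver. The split s = u - v with u, v >= 0 is the standard reformulation; A and x below are made up so that the minimum-L1 solution is known to be (0, 0, 2):

```python
import numpy as np
from scipy.optimize import linprog

# Basis Pursuit as a linear program: min ||s||_1 subject to A s = x.
# Write s = u - v with u, v >= 0 and minimize sum(u) + sum(v).
# Illustrative A: third column is (1,1)/sqrt(2), and x is twice that column.
A = np.array([[1.0, 0.0, 1.0 / np.sqrt(2)],
              [0.0, 1.0, 1.0 / np.sqrt(2)]])
x = 2.0 * A[:, 2]
m = A.shape[1]

c = np.ones(2 * m)                     # objective: sum(u) + sum(v) = ||s||_1
A_eq = np.hstack([A, -A])              # equality constraint: A u - A v = x
res = linprog(c, A_eq=A_eq, b_eq=x, bounds=[(0, None)] * (2 * m))
s = res.x[:m] - res.x[m:]
print(np.round(s, 6))
```

The LP returns s ≈ (0, 0, 2), which is also the sparsest solution here, in line with the equivalence result quoted above.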

Minimum L1 Norm (cont.)
- Advantages: very good practical results; theoretical support
- Drawback: tractable, but still very time-consuming

Iterative Detection-Estimation (IDE), Our Method
- Main idea:
  - Step 1 (detection): detect which sources are 'active' and which are 'inactive'
  - Step 2 (estimation): knowing the active sources, estimate their values
- Problem: detecting the activity status of one source requires the values of all the other sources!
- Our proposition: iterate between activity detection and value estimation

IDE, Detection Step
- Assumptions:
  - Mixture-of-Gaussians model for the sparse sources: si active → si ~ N(0, σ1²); si inactive → si ~ N(0, σ0²), with σ1 >> σ0
  - The probability of inactivity is close to 1
  - The columns of A are normalized
- Proposition: t = a1^T x = s1 + a residual term depending on the other sources
- Activity detection test: subtract an estimate of that residual (built from the previous estimates of the other sources) and declare s1 active when the corrected |t| exceeds a threshold

IDE, Estimation Step
- The estimation equation is solved using Lagrange multipliers
- A closed-form solution exists → see the paper

IDE, Summary
- Detection step (derived from a binary hypothesis test, with a mixture-of-Gaussians source model)
- Estimation step
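A loose sketch of the detect/estimate alternation. The fixed threshold and the plain least-squares estimation step below are simplifications of the paper's hypothesis test and Lagrange-multiplier solution, and A, x, and the true source vector are made up for illustration:

```python
import numpy as np

# Loose Iterative Detection-Estimation sketch: alternate between detecting
# which sources look active and re-estimating the active values.
def ide_sketch(A, x, n_iter=20, thresh=0.5):
    n, m = A.shape
    s = np.zeros(m)
    for _ in range(n_iter):
        # Detection: correlate each (unit-norm) column with x and subtract the
        # estimated interference from the other sources: t_i ~ s_i + residual.
        t = A.T @ x - (A.T @ A - np.eye(m)) @ s
        active = np.abs(t) > thresh
        # Estimation: least-squares fit of x on the detected active columns.
        s = np.zeros(m)
        if active.any():
            s[active] = np.linalg.lstsq(A[:, active], x, rcond=None)[0]
    return s

A = np.array([[1.0, 0.0, 0.6, 0.8],
              [0.0, 1.0, 0.8, 0.6]])     # unit-norm columns
x = A @ np.array([0.0, 0.0, 2.0, 0.0])   # true sources: s = (0, 0, 2, 0)
s_hat = ide_sketch(A, x)
print(s_hat)
```

On this toy problem the loop prunes the active set over a few iterations and settles on s ≈ (0, 0, 2, 0); the paper's adaptive test replaces the hand-picked threshold used here.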

IDE, Simulation Results
- m = 1024, n = 0.4m = 409

IDE, Simulation Results (cont.)
- m = 1024, n = 0.4m = 409
- IDE is about two orders of magnitude faster than the LP method (with interior-point optimization).

Speed/Complexity Comparison

IDE, Simulation Results: Number of Possible Active Sources
- m = 500, n = 0.4m = 200, averaged over 10 simulations

Conclusions and Perspectives
- Two problems of underdetermined SCA: identifying the mixing matrix; restoring the sources
- Two applications of finding sparse solutions of USLEs: source restoration in underdetermined SCA; atomic decomposition over overcomplete dictionaries
- Methods:
  - Minimum L0 norm (→ combinatorial search)
  - Minimum L1 norm, or Basis Pursuit (→ Linear Programming)
  - Matching Pursuit
  - Iterative Detection-Estimation (IDE)
- Perspectives: better activity detection (removing the thresholds?); applications in other domains

Thank you very much for your attention