
The Power of Convex Relaxation: Near-Optimal Matrix Completion. Emmanuel J. Candès and Terence Tao, March 2009. Presenter: Shujie Hou.


1 The Power of Convex Relaxation: Near-Optimal Matrix Completion. Emmanuel J. Candès and Terence Tao, March 2009. Presenter: Shujie Hou, February 4th, 2011. Department of Electrical and Computer Engineering, Cognitive Radio Institute, Tennessee Technological University.

2 The notation and basic background on matrix completion are assumed to be known in this set of slides.

3 CONTRIBUTION The paper concerns the theoretical underpinnings of matrix completion, specifically quantifying the minimum number of entries needed to recover a low-rank matrix exactly. It develops a simple set of hypotheses about the matrix under which the matrix is recoverable by semidefinite programming from a nearly minimal number of sampled entries.

4 INFORMATION-THEORETIC LIMIT Assume the matrix has a singular value decomposition (SVD). The degrees of freedom of the matrix are obtained by counting the number of free parameters in the SVD. However, because of the coupon collector's effect, the minimum number of samples must be on the order of nr log n rather than on the order of nr; this is the information-theoretic limit.
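A worked version of this counting argument (my own reconstruction of the standard heuristic, assuming a square n by n matrix of rank r with r much smaller than n):

\mathrm{dof} = r(2n - r) \approx 2nr, \qquad m \gtrsim n \log n \ \ \text{(coupon collector: every row and column must be sampled at least once)},

and the paper sharpens these two requirements into a lower bound on the order of n r \log n (up to the incoherence parameter), which any recovery method must pay.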

5 MODEL REVIEW This paper develops three useful matrix models for which nuclear-norm minimization is guaranteed to succeed from a nearly minimal number of sampled entries. Nuclear-norm minimization is the convex relaxation of rank minimization.
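The relaxation mentioned above can be written out explicitly (standard formulation; \Omega denotes the set of observed entries):

\min_X \ \mathrm{rank}(X) \ \ \text{s.t.} \ \ X_{ij} = M_{ij}, \ (i,j) \in \Omega \qquad \leadsto \qquad \min_X \ \|X\|_* \ \ \text{s.t.} \ \ X_{ij} = M_{ij}, \ (i,j) \in \Omega,

where \|X\|_* is the nuclear norm (the sum of the singular values), the convex surrogate of rank; the relaxed problem can be solved by semidefinite programming.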

6 STRONG INCOHERENCE PROPERTY The strong incoherence property requires that there exists a small parameter mu bounding how much the singular vectors correlate with the standard basis vectors of Euclidean space. A small parameter implies that the singular vectors are spread out rather than concentrated on a few coordinates.
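For reference, the definition behind this slide has roughly the following form (paraphrased from the paper from memory, so the exact normalizations should be checked against the original). For M = \sum_k \sigma_k u_k v_k^* of size n_1 \times n_2, with P_U, P_V the projections onto the column and row spaces, the matrix is strongly incoherent with parameter \mu if

|\langle e_a, P_U e_{a'} \rangle - \tfrac{r}{n_1} 1_{a=a'}| \le \mu \tfrac{\sqrt{r}}{n_1} \ \ \text{for all } a, a' \quad (\text{and similarly for } P_V \text{ with } n_2),

|E_{ab}| \le \mu \tfrac{\sqrt{r}}{\sqrt{n_1 n_2}}, \qquad E = \sum_k u_k v_k^*.

A small \mu means that no standard basis vector e_a is noticeably aligned with the singular subspaces, i.e. the singular vectors are spread out.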

7 THE FIRST THEOREM Matrices obeying the strong incoherence property with a small value of the parameter can be recovered from few entries. Practical question: do many matrices of interest obey the strong incoherence property?
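Schematically, the theorem states that nuclear-norm minimization recovers M exactly, with high probability, once the number m of uniformly sampled entries satisfies a bound of the form

m \ge C \, \mu^2 \, n \, r \, \mathrm{polylog}(n)

(the precise power of the logarithm is given in the paper). This matches the information-theoretic limit of order n r \log n up to the incoherence parameter and logarithmic factors, hence "near-optimal."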

8 FIRST MODEL: UNIFORMLY BOUNDED MODEL (1) In all, the only assumption in this model is that the singular vectors have small, uniformly bounded components.

9 FIRST MODEL: UNIFORMLY BOUNDED MODEL (2) This theorem shows that the uniformly bounded model satisfies the strong incoherence property with high probability. This means that Theorem 1.1 applies to this model with high probability.

10 SECOND MODEL: LOW-RANK LOW-COHERENCE MODEL This model is a special case of the first model, so every theorem and lemma above also holds for it. Namely, it obeys the strong incoherence property, and Theorem 1.1 can be specialized to this model.

11 THIRD MODEL: RANDOM ORTHOGONAL MODEL This model is also a special case of the first model. For all three models above, it can be shown that the matrices obey the strong incoherence property with high probability.
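A minimal numerical sketch of the random orthogonal model (my own illustration rather than anything from the slides; the sizes and singular values are arbitrary): draw the singular subspaces uniformly at random via a QR factorization of Gaussian matrices and check that the column space has low coherence, i.e. that its basis vectors are spread out.

```python
import numpy as np

# Random orthogonal model sketch (illustrative sizes; not the paper's code).
n, r = 500, 5
rng = np.random.default_rng(0)

# Singular subspaces drawn uniformly at random (QR of Gaussian matrices).
U, _ = np.linalg.qr(rng.standard_normal((n, r)))
V, _ = np.linalg.qr(rng.standard_normal((n, r)))
sigma = rng.uniform(1.0, 2.0, size=r)      # arbitrary singular values
M = (U * sigma) @ V.T                      # rank-r matrix from the model

# Coherence of the column space: (n/r) * max_i ||P_U e_i||^2.
# For spread-out singular vectors this stays close to 1, its minimum value.
coherence_U = (n / r) * np.max(np.sum(U**2, axis=1))
print(f"coherence of U: {coherence_U:.2f} (flat/incoherent ideal is 1.0)")
```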

12 PROOF OF THEOREM Revisiting Theorem 1.1: instead of proving it directly, a more general theorem is stated, and the following slides outline the proof of that theorem.

13 A NECESSARY CONDITION (1)

14 A NECESSARY CONDITION (2) This gives a lower bound on the number of samples required for matrix completion.

15 PROOF OF THEOREM 1.2 (1) Y is a dual certificate of M. Note: in order to prove Theorem 1.2, what needs to be done is to show that a dual certificate Y of M obeying this lemma exists.
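The dual-certificate condition behind this step can be written in its standard form (paraphrased; T denotes the tangent space at M, i.e. the span of matrices sharing M's column space or row space): if the sampling operator is injective on T and there exists a matrix Y supported on the observed set \Omega with

P_T(Y) = U V^* \qquad \text{and} \qquad \|P_{T^\perp}(Y)\| < 1,

then M is the unique solution of the nuclear-norm problem. Proving Theorem 1.2 therefore reduces to constructing such a certificate Y.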

16 PROOF OF THEOREM 1.2 (2) A new operator is introduced. The crux of the proof is to bound the spectral norm of this operator.

17 THE PROOF OF THEOREM 1.2 (3) The earlier paper "Exact matrix recovery via convex optimization" uses tools from asymptotic geometric analysis to prove the key result. This paper instead employs moment methods from random matrix theory to estimate the spectral norm of the operator. The detailed proofs of every lemma and theorem occupy pages 13-45 of the paper.

18 DISCUSSIONS Many algorithms with reasonable running time have been proposed for matrix completion, which points to its practical usefulness. Can matrix completion be made robust to noise?

19 Matrix Completion with Noise. Emmanuel J. Candès and Yaniv Plan. Presenter: Shujie Hou, February 4th, 2011. Department of Electrical and Computer Engineering, Cognitive Radio Institute, Tennessee Technological University.

20 CONTRIBUTIONS The paper mainly reviews previous theoretical results (the two papers presented earlier). It then proves that matrix completion is robust when the observed entries are corrupted with a small amount of noise.

21 NOISE MODEL If the observations are noisy entries of M, the optimization model for recovery relaxes the equality constraints to an inequality constraint so that a small amount of noise can be tolerated.
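Written out (following the paper's formulation; Z denotes the noise and \delta bounds its total size on the observed set \Omega):

Y_{ij} = M_{ij} + Z_{ij}, \ (i,j) \in \Omega, \qquad \|P_\Omega(Z)\|_F \le \delta,

and the recovery program relaxes the equality constraints of the noiseless problem to an inequality:

\min_X \ \|X\|_* \qquad \text{s.t.} \qquad \|P_\Omega(X - Y)\|_F \le \delta.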

22 KEY THEOREM Roughly speaking, the theorem states that whenever perfect recovery occurs in the noiseless case, matrix completion is stable vis-à-vis small perturbations.
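In schematic form (order of magnitude only; the exact constants are in the paper), with p = m/(n_1 n_2) the fraction of observed entries, the solution \hat{M} of the noisy program obeys

\|\hat{M} - M\|_F \lesssim \sqrt{\min(n_1, n_2)/p}\ \delta,

so the recovery error grows only proportionally to the noise level \delta.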

23 ORACLE MODEL Assume the singular-vector space is known through an oracle. The solution of this oracle model, compared with the solution of the noiseless model, serves as a benchmark for the error that any recovery method can hope to achieve.
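A sketch of the oracle benchmark (my paraphrase of the standard least-squares argument, not a quotation of the slide's formulas): with T the subspace determined by the known singular-vector spaces, the oracle solves

\hat{M}^{\mathrm{oracle}} = \arg\min_{X \in T} \ \|P_\Omega(X - Y)\|_F,

and for i.i.d. noise of standard deviation \sigma the usual least-squares heuristic gives a per-entry RMS error of roughly \sigma \sqrt{\mathrm{df}/m}, where \mathrm{df} \approx r(n_1 + n_2 - r) counts the degrees of freedom; no method can be expected to beat this benchmark by much.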

24 NUMERICAL EXPERIMENTS (1) The low-rank matrix is generated as the product of two random factor matrices, and the noisy component is i.i.d. Gaussian.
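A small sketch of this experimental setup (my reconstruction; the sizes, noise level, and sampling rate below are illustrative assumptions rather than the paper's exact values):

```python
import numpy as np

n, r, sigma = 100, 2, 0.1
m = 5 * n * r                              # number of observed entries (assumed)
rng = np.random.default_rng(1)

ML = rng.standard_normal((n, r))
MR = rng.standard_normal((n, r))
M = ML @ MR.T                              # low-rank matrix as a product of factors
Z = sigma * rng.standard_normal((n, n))    # i.i.d. Gaussian noisy component

# Observe m entries chosen uniformly at random.
idx = rng.choice(n * n, size=m, replace=False)
mask = np.zeros(n * n, dtype=bool)
mask[idx] = True
mask = mask.reshape(n, n)
Y = np.where(mask, M + Z, 0.0)             # observed noisy entries, zeros elsewhere
```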

25 NUMERICAL EXPERIMENTS (2) For a large matrix, for example 1000 by 1000, the RMS error is 0.24, a big improvement over doing nothing at all.

26 NUMERICAL EXPERIMENTS (3) The regularized model is used for the noisy observations. The regularization parameter is set to the smallest value for which, if the data consisted of noise alone, the solution of the above model would be zero.
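A minimal sketch of this regularized model using cvxpy (an assumed dependency; the problem size, noise level, sampling rate, and the lambda heuristic below are my illustrative choices, not the paper's):

```python
import cvxpy as cp
import numpy as np

n, r, sigma, p = 60, 2, 0.1, 0.4
rng = np.random.default_rng(2)
M = rng.standard_normal((n, r)) @ rng.standard_normal((r, n))   # rank-r matrix
mask = rng.random((n, n)) < p                                    # observed positions
Y = np.where(mask, M + sigma * rng.standard_normal((n, n)), 0.0)

# Heuristic lambda: roughly the expected spectral norm of the sampled noise,
# so that data consisting of noise alone would be shrunk to the zero matrix.
lam = 2.0 * sigma * np.sqrt(n * p)

X = cp.Variable((n, n))
residual = cp.multiply(mask.astype(float), X - Y)
objective = cp.Minimize(0.5 * cp.sum_squares(residual) + lam * cp.normNuc(X))
cp.Problem(objective).solve()

rms = np.sqrt(np.mean((X.value - M) ** 2))
print(f"RMS error of the regularized estimate: {rms:.3f}")
```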

27 EXPERIMENTAL RESULTS (4) [Figure: RMS error of the regularized model compared with the theoretically estimated error and the RMS error achievable by the oracle model.]

28 EXPERIMENTAL RESULTS (5)

29 EXPERIMENTAL RESULTS (6)

30 DISCUSSIONS Tremendous growth is expected for matrix completion in the next few years. Open directions include whether one can recover low-rank matrices from a few general linear functionals, and the development of computational algorithms.

31 Questions? Thank you!

