
CIS 700: "Algorithms for Big Data"


1 CIS 700: "Algorithms for Big Data"
Lecture 9: Compressed Sensing
Grigory Yaroslavtsev

2 Compressed Sensing
Given a sparse signal x ∈ ℝ^n, can we recover it from a small number of measurements?
Goal: design A ∈ ℝ^{d×n} that allows us to recover any s-sparse x ∈ ℝ^n from Ax.
Construction: A = matrix of i.i.d. Gaussians N(0,1).
Application: signals are usually sparse in some Fourier domain.
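The measurement model above can be written out in a few lines. A minimal sketch (the dimensions n, d, s below are illustrative, not from the lecture):

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, s = 1000, 100, 5          # ambient dimension, measurements, sparsity

# An s-sparse signal: s random coordinates carry Gaussian values
x = np.zeros(n)
support = rng.choice(n, size=s, replace=False)
x[support] = rng.normal(size=s)

# Measurement matrix of i.i.d. N(0, 1) entries
A = rng.normal(size=(d, n))
b = A @ x                       # d linear measurements instead of n samples
print(b.shape)                  # (100,)
```

The point is that b has only d = 100 entries, yet (as the following slides show) it determines the 1000-dimensional x exactly because x is sparse.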

3 Reconstruction
Reconstruction: min ||x||_0, subject to: Ax = b.
Uniqueness: if there are two s-sparse solutions x_1, x_2, then A(x_1 - x_2) = 0, so A has 2s linearly dependent columns.
If d = Ω(s^2) and A is Gaussian, then A is unlikely to have 2s linearly dependent columns.
||x||_0 is not convex, and reconstruction under it is NP-hard. Relax ||x||_0 → ||x||_1: min ||x||_1, subject to: Ax = b.
When does this give sparse solutions?
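The ℓ1 relaxation is a linear program: split x = u - v with u, v ≥ 0, so that ||x||_1 = Σ(u_i + v_i). A sketch using SciPy's LP solver (the helper name basis_pursuit and all dimensions are illustrative assumptions, not from the lecture):

```python
import numpy as np
from scipy.optimize import linprog

def basis_pursuit(A, b):
    """min ||x||_1 s.t. Ax = b, via the LP split x = u - v, u, v >= 0."""
    d, n = A.shape
    c = np.ones(2 * n)                 # objective: sum(u) + sum(v) = ||x||_1
    A_eq = np.hstack([A, -A])          # constraint: A u - A v = b
    res = linprog(c, A_eq=A_eq, b_eq=b, bounds=(0, None))
    assert res.success
    u, v = res.x[:n], res.x[n:]
    return u - v

rng = np.random.default_rng(0)
n, d, s = 60, 30, 3
x0 = np.zeros(n)
x0[rng.choice(n, size=s, replace=False)] = rng.normal(size=s)
A = rng.normal(size=(d, n))
x_hat = basis_pursuit(A, A @ x0)
print(np.max(np.abs(x_hat - x0)))      # typically ~0: exact recovery
```

With a Gaussian A and d comfortably larger than the sparsity, the LP typically recovers x0 exactly, which is the phenomenon the next slides explain.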

4 Subgradient
min ||x||_1, subject to: Ax = b
||x||_1 is convex but not differentiable. A subgradient ∇f is:
equal to the gradient where f is differentiable;
any linear lower bound where f is not differentiable: ∀x_0, Δx: f(x_0 + Δx) ≥ f(x_0) + ∇f^T Δx.
Subgradient for ||x||_1: ∇_i = sign(x_i) if x_i ≠ 0; ∇_i ∈ [-1, 1] if x_i = 0.
x_0 is a minimum if some subgradient ∇ at x_0 satisfies ∇^T Δx ≥ 0 for all Δx such that AΔx = 0.
Sufficient: ∃w such that ∇ = A^T w, since then ∇^T Δx = w^T AΔx = 0.

5 Exact Reconstruction Property
Subgradient Thm. If Ax_0 = b, there exists a subgradient ∇ for ||·||_1 at x_0 such that ∇ = A^T w, and the columns of A corresponding to the support of x_0 are linearly independent, then x_0 minimizes ||x||_1 subject to Ax = b, and the minimizer is unique.
(Minimum): Assume Ay = b. We show ||y||_1 ≥ ||x_0||_1.
z = y - x_0 ⇒ Az = Ay - Ax_0 = 0
∇^T z = w^T Az = 0 ⇒ ||y||_1 = ||x_0 + z||_1 ≥ ||x_0||_1 + ∇^T z = ||x_0||_1

6 Exact Reconstruction Property
(Uniqueness): assume x_0′ is another minimum, so ||x_0′||_1 = ||x_0||_1. The subgradient ∇ at x_0 is also a subgradient at x_0′:
∀Δx with AΔx = 0:
||x_0′ + Δx||_1 = ||x_0 + (x_0′ - x_0 + Δx)||_1 ≥ ||x_0||_1 + ∇^T (x_0′ - x_0 + Δx) = ||x_0||_1 + ∇^T (x_0′ - x_0) + ∇^T Δx
∇^T (x_0′ - x_0) = w^T A (x_0′ - x_0) = w^T (b - b) = 0
⇒ ||x_0′ + Δx||_1 ≥ ||x_0′||_1 + ∇^T Δx
∇_i = sign(x_{0i}) = sign(x′_{0i}) if either is non-zero, otherwise equal to 0 ⇒ x_0 and x_0′ have the same sparsity pattern.
By linear independence of the columns of A on this common support (and Ax_0 = Ax_0′ = b): x_0 = x_0′.

7 Restricted Isometry Property
Matrix A satisfies the restricted isometry property (RIP) if for any s-sparse x there exists δ_s:
(1 - δ_s) ||x||_2^2 ≤ ||Ax||_2^2 ≤ (1 + δ_s) ||x||_2^2
Exact isometry: all singular values equal 1; for orthogonal x, y: x^T A^T A y = 0.
Let A_S be the submatrix of the columns of A indexed by the set S.
Lem: If A satisfies RIP and δ_{s_1 + s_2} ≤ δ_{s_1} + δ_{s_2}:
For any S of size s, the singular values of A_S lie in [1 - δ_s, 1 + δ_s].
For any orthogonal x, y with supports of size s_1, s_2: |x^T A^T A y| ≤ (3/2) ||x||·||y||·(δ_{s_1} + δ_{s_2}).

8 Restricted Isometry Property
Lem: If A satisfies RIP and δ_{s_1 + s_2} ≤ δ_{s_1} + δ_{s_2}:
For any S of size s, the singular values of A_S lie in [1 - δ_s, 1 + δ_s].
For any orthogonal x, y with supports of size s_1, s_2: |x^T A^T A y| ≤ (3/2) ||x||·||y||·(δ_{s_1} + δ_{s_2}).
Proof: W.l.o.g. ||x|| = ||y|| = 1, so ||x + y||^2 = 2.
2(1 - δ_{s_1 + s_2}) ≤ ||A(x + y)||^2 ≤ 2(1 + δ_{s_1 + s_2})
2(1 - δ_{s_1} - δ_{s_2}) ≤ ||A(x + y)||^2 ≤ 2(1 + δ_{s_1} + δ_{s_2})
(1 - δ_{s_1}) ≤ ||Ax||^2 ≤ (1 + δ_{s_1})
(1 - δ_{s_2}) ≤ ||Ay||^2 ≤ (1 + δ_{s_2})

9 Restricted Isometry Property
2 x^T A^T A y = (x + y)^T A^T A (x + y) - x^T A^T A x - y^T A^T A y = ||A(x + y)||^2 - ||Ax||^2 - ||Ay||^2
2 x^T A^T A y ≤ 2(1 + δ_{s_1} + δ_{s_2}) - (1 - δ_{s_1}) - (1 - δ_{s_2}) = 3(δ_{s_1} + δ_{s_2})
|x^T A^T A y| ≤ (3/2) ||x||·||y||·(δ_{s_1} + δ_{s_2})

10 Reconstruction from RIP
Thm. Suppose A satisfies RIP with δ_{s+1} ≤ 1/(10√s), x_0 is s-sparse, and Ax_0 = b. Then a subgradient ∇(||·||_1) exists at x_0 which satisfies the conditions of the "subgradient theorem". This implies that x_0 is the unique minimum-1-norm solution to Ax = b.
Let S = {i : x_{0i} ≠ 0} and S̄ = {i : x_{0i} = 0}.
To find the subgradient u, search for w with u = A^T w such that:
for i ∈ S: u_i = sign(x_{0i});
the 2-norm of the coordinates of u in S̄ is minimized.

11 Reconstruction from RIP
Let 𝑧 be a vector with support 𝑆: 𝑧 𝑖 =sign (π‘₯ 0 𝑖 ) Let 𝑀= 𝐴 𝑆 𝐴 𝑆 𝑇 𝐴 𝑆 βˆ’1 𝑧 𝐴 𝑆 has independent columns by RIP For coordinates in 𝑆: 𝐴 𝑇 𝑀 𝑆 = 𝐴 𝑆 𝑇 𝐴 𝑆 𝐴 𝑆 𝑇 𝐴 𝑆 βˆ’1 𝑧=𝑧 For coordinates in 𝑆 : 𝐴 𝑇 𝑀 𝑆 = 𝐴 𝑆 𝑇 𝐴 𝑆 𝐴 𝑆 𝑇 𝐴 𝑆 βˆ’1 𝑧 Eigenvalues of 𝐴 𝑆 𝑇 𝐴 𝑆 are in 1βˆ’ 𝛿 𝑠 2 , 1+ 𝛿 𝑠 2 || 𝐴 𝑆 𝑇 𝐴 𝑆 βˆ’1 ||≀ 1 1βˆ’ 𝛿 𝑠 2 , let 𝑝= 𝐴 𝑆 𝑇 𝐴 𝑆 βˆ’1 𝑧, 𝑝 ≀ 𝑠 1βˆ’ 𝛿 𝑠 2 𝐴 𝑆 𝑝=π΄π‘ž where π‘ž has all coordinates in 𝑆 equal 0 For π‘—βˆˆ 𝑆 : 𝐴 𝑇 𝑀 𝑗 = 𝑒 𝑗 𝑇 𝐴 𝑇 π΄π‘ž so | 𝐴 𝑇 𝑀 𝑗 |≀ 𝛿 𝑠 + 𝛿 1 𝑠 1βˆ’ 𝛿 𝑠 2 ≀ 𝛿 𝑠 𝑠 1βˆ’ 𝛿 𝑠 2 ≀ 1 2
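The certificate construction above can be written out directly. The sketch below (dimensions and the 1/√d normalization are illustrative assumptions) builds w = A_S (A_S^T A_S)^{-1} z and checks that u = A^T w matches the sign pattern on S and stays strictly below 1 in magnitude on S̄:

```python
import numpy as np

rng = np.random.default_rng(0)
d, n, s = 200, 500, 4
A = rng.normal(size=(d, n)) / np.sqrt(d)

S = rng.choice(n, size=s, replace=False)
x0 = np.zeros(n)
x0[S] = rng.normal(size=s)

z = np.sign(x0[S])
AS = A[:, S]
w = AS @ np.linalg.solve(AS.T @ AS, z)   # w = A_S (A_S^T A_S)^{-1} z
u = A.T @ w                              # candidate subgradient

on_S = np.max(np.abs(u[S] - np.sign(x0[S])))   # ~0: equals signs on S
off_S = np.max(np.abs(np.delete(u, S)))        # < 1: valid certificate
print(on_S, off_S)
```

When off_S < 1, the subgradient theorem applies and x0 is the unique minimum-1-norm solution of Ax = A x0.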

