Rice/Duke | Compressive Optical Devices | August 2007 Richard Baraniuk Kevin Kelly Rice University Compressive Optical Imaging Systems – Theory, Devices, Implementation David Brady Rebecca Willett Duke University

Rice/Duke | Compressive Optical Devices | August 2007 Project Overview Richard Baraniuk

Digital Revolution

camera arrays, hyperspectral cameras, distributed camera networks

Sensing by Sampling
Long-established paradigm for digital data acquisition:
– sample data at Nyquist rate (2x bandwidth)
– compress data (signal-dependent, nonlinear)
– brick wall to resolution/performance
[Diagram: sample → compress → transmit/store → receive → decompress, with compression via a sparse wavelet transform]

Compressive Sensing (CS)
Directly acquire "compressed" data: replace samples by more general "measurements".
[Diagram: compressive sensing → transmit/store → receive → reconstruct]

Compressive Sensing
When data is sparse/compressible, can directly acquire a condensed representation with no/little information loss through dimensionality reduction.
[Diagram: measurements of a sparse signal (sparse in some basis)]

Compressive Sensing
When data is sparse/compressible, can directly acquire a condensed representation with no/little information loss.
A random projection will work for signal reconstruction [Candès-Romberg-Tao, Donoho, 2004].
[Diagram: random measurements of a sparse signal (sparse in some basis)]
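
The following is a minimal sketch (not from the slides) of the measurement model just described: a length-N signal that is K-sparse is acquired through M ≪ N random Gaussian measurements. The dimensions and variable names (x, Phi, y) are illustrative assumptions.

```python
# Minimal sketch of compressive acquisition: y = Phi @ x with M << N.
import numpy as np

rng = np.random.default_rng(0)
N, K, M = 1024, 10, 128                          # ambient dimension, sparsity, measurements

x = np.zeros(N)                                  # K-sparse signal (canonical basis here)
support = rng.choice(N, K, replace=False)
x[support] = rng.standard_normal(K)

Phi = rng.standard_normal((M, N)) / np.sqrt(M)   # random Gaussian measurement matrix
y = Phi @ x                                      # M compressive measurements
print(y.shape)                                   # (128,)
```

Reconstruction from y is deferred to the "CS Signal Recovery" slide later in the deck.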

Compressive Optical Imaging Systems – Theory, Devices, and Implementation
$400k budget for roughly April:
– administered by ONR
– Rice portion expended; Duke portion in NCE
Goals:
– forge collaboration between Rice and Duke teams
– demonstrate new Compressive Imaging technologies: hardware testbeds/demos at Rice and Duke; new theory/algorithms
– quantify performance
– articulate emerging directions
Collaborations: telecons, visits, joint projects, joint papers, artwork

Rice/Duke | Compressive Optical Devices | August 2007 Gerhard Richter, 4096 Farben / 4096 Colours. cm X 254 cm, lacquer on canvas. Catalogue Raisonné: 359. Museum Collection: Staatliche Kunstsammlungen Dresden (on loan). Sales history: 11 May 2004, Christie's New York, Post-War and Contemporary Art (Evening Sale), Lot 34, US$3,703,500

Rice/Duke | Compressive Optical Devices | August 2007 Gerhard Richter Dresden Cathedral Stained Glass

Agenda
– Rebecca Willett, Duke [theory/algorithms]
– Kevin Kelly, Rice [hardware]
– David Brady, Duke [hardware]
– Richard Baraniuk, Rice [theory/algorithms]
– Discussion and Conclusions

Rice/Duke | Compressive Optical Devices | August 2007 Compressive Image Processing Richard Baraniuk

Mike Wakin, Marco Duarte, Mark Davenport, Shri Sarvotham, Petros Boufounos, Matthew Moravec, Mona Sheikh, Jason Laska

Rice/Duke | Compressive Optical Devices | August 2007 Image Classification/Segmentation using Duke Hyperspectral System (with Rebecca Willett)

Information Scalability
If we can reconstruct a signal from compressive measurements, then we should be able to perform other kinds of statistical signal processing: detection, classification, estimation, …
Hyperspectral image classification/segmentation

Classification Example
[Figure: three reference signatures: spectrum 1, spectrum 2, spectrum 3]

Nearest Projected Neighbor
– normalize measurements
– compute nearest neighbor
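
A minimal sketch of how such a nearest-projected-neighbor rule could look, assuming per-block compressive measurements and a known library of reference spectra; the template library, the dimensions, and the function name classify_block are hypothetical.

```python
# Sketch, assuming the classifier works per spatial block: each block's compressive
# measurement vector is normalized and matched to the nearest projected template spectrum.
import numpy as np

rng = np.random.default_rng(1)
N, M, n_classes = 256, 32, 3                     # spectral bands, measurements, classes

templates = rng.random((n_classes, N))           # hypothetical reference spectra
Phi = rng.standard_normal((M, N)) / np.sqrt(M)   # shared random projection

def classify_block(y, Phi, templates):
    """Nearest projected neighbor: normalize, then pick the closest projected template."""
    y = y / np.linalg.norm(y)
    proj = templates @ Phi.T                     # project templates into measurement space
    proj = proj / np.linalg.norm(proj, axis=1, keepdims=True)
    return int(np.argmin(np.linalg.norm(proj - y, axis=1)))

true_class = 2
y = Phi @ (3.0 * templates[true_class])          # intensity-scaled block, then projected
print(classify_block(y, Phi, templates))         # -> 2
```

Normalizing both the measurement vector and the projected templates removes the unknown per-block intensity before the nearest-neighbor comparison.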

Naïve Results
[Figure: classification results as a function of block size]

Results
[Figure: naïve independent classification vs. tree-based classification]

Voting / Cycle Spinning
[Figure: results as a function of block radius in pixels]

Summary
Direct hyperspectral classification/segmentation without reconstructing the 3D data cube.
Future directions: replace nearest projected neighbor with more sophisticated methods:
– smashed filter
– projected SVM
– quad-tree based multiscale segmentation (HMTseg, …)
Joint paper in the works.

Rice/Duke | Compressive Optical Devices | August 2007 Performance Analysis of Multiplexed Cameras

Single-Pixel Camera Analysis
[Diagram: random pattern on DMD array, photon detector, image reconstruction or processing]
Analyze performance in terms of:
– dynamic range and # bits of A/D
– MSE due to photon counting noise
– number of measurements

Rice/Duke | Compressive Optical Devices | August 2007 Single Pixel Image Acquisition
For an N-pixel, K-sparse image under a T-second exposure:
– Raster Scan: acquire one pixel at a time; repeat N times
– Basis Scan: acquire one coefficient of the image in a fixed basis at a time; repeat N times
– CS Scan: acquire one incoherent/random projection of the image at a time; repeat M times (M ≪ N)
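
As a rough illustration of the CS-scan acquisition model (the image size, the number of exposures, and the 0/1 pattern choice are assumptions, not the project's hardware parameters), each exposure records a single inner product between the scene and one binary DMD pattern:

```python
# Sketch: a single photon detector measures the inner product of the scene with one
# binary DMD pattern per exposure; CS scan needs far fewer exposures than raster scan.
import numpy as np

rng = np.random.default_rng(2)
n = 32                                   # image side; N = n*n pixels
N, M = n * n, 400                        # raster scan: N exposures, CS scan: M exposures

image = np.zeros((n, n))
image[8:24, 8:24] = 1.0                  # toy piecewise-constant scene (compressible)
x = image.ravel()

# Raster scan: one pixel per exposure (identity patterns), N exposures total.
raster = x.copy()

# CS scan: one random 0/1 DMD pattern per exposure, M exposures total.
patterns = rng.integers(0, 2, size=(M, N)).astype(float)
y = patterns @ x                         # M single-pixel measurements
print(f"raster exposures: {N}, CS exposures: {M}, y shape: {y.shape}")
```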

Rice/Duke | Compressive Optical Devices | August 2007 Worst-Case Performance
N: number of pixels; P: number of photons per pixel; T: total capture time; M: number of measurements; C_N: CS noise amplification constant.
Sensor array shown as baseline; table shows the requirements to match worst-case performance.
CS beats Basis Scan if [condition given on slide].

Rice/Duke | Compressive Optical Devices | August 2007 Single Pixel Camera Experimental Performance
[Figure: experimental reconstructions with M = 1640 measurements; sparsifying basis = Daub-8 wavelets]

Multiplexed Camera Analysis
[Diagram: random pattern on DMD array, lens(es), S photon detectors, image reconstruction or processing]
Dude, you gotta multiplex!

Rice/Duke | Compressive Optical Devices | August 2007 S-Pixel Camera Performance
N: number of pixels; P: number of photons per pixel; T: total capture time; M: number of measurements; C_N: CS noise amplification constant.
Sensor array shown as baseline; M measurements split across S sensors. Single pixel camera: S = 1.

Rice/Duke | Compressive Optical Devices | August 2007 S-Pixel Camera Performance
N: number of pixels; P: number of photons per pixel; T: total capture time; M: number of measurements; C_N: CS noise amplification constant.
Sensor array shown as baseline; M measurements split across S sensors. Single pixel camera: S = 1.
CS beats Basis Scan if [condition given on slide].

Rice/Duke | Compressive Optical Devices | August 2007 Smashed Filter – Compressive Matched Filtering

Information Scalability
If we can reconstruct a signal from compressive measurements, then we should be able to perform other kinds of statistical signal processing: detection, classification, estimation, …
Smashed filter: compressive matched filter

Matched Filter
Signal classification in additive white Gaussian noise:
– LRT: classify the test signal as from Class i if it is closest to template signal i
– GLRT: when the test signal can be a transformed version of a template, use the matched filter
When signal transformations are well-behaved, transformed templates form low-dimensional manifolds, so GLRT = matched filter = nearest-manifold classification.
[Figure: class manifolds M1, M2, M3]
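
The sketch below illustrates the nearest-manifold view of the GLRT for one simple transformation family (circular shifts); the random templates, noise level, and brute-force search over shifts are illustrative assumptions, not the project's classifier.

```python
# Sketch (hypothetical templates): the GLRT over an unknown shift is a matched filter,
# i.e. nearest-manifold classification where each class manifold is the set of all
# shifted versions of that class's template.
import numpy as np

rng = np.random.default_rng(3)
N, n_classes = 128, 3
templates = rng.standard_normal((n_classes, N))        # one template per class

def nearest_manifold_class(test, templates):
    """Return the class whose shift manifold comes closest to the test signal."""
    best_class, best_dist = -1, np.inf
    for c, t in enumerate(templates):
        for shift in range(N):                          # sample the 1-D shift manifold
            d = np.linalg.norm(test - np.roll(t, shift))
            if d < best_dist:
                best_class, best_dist = c, d
    return best_class

test = np.roll(templates[1], 17) + 0.1 * rng.standard_normal(N)   # shifted + AWGN
print(nearest_manifold_class(test, templates))                     # -> 1
```

Sampling the transformation parameter densely like this is exactly what the smashed filter later does directly on compressive measurements.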

Compressive LRT
Compressive observations: by the Johnson-Lindenstrauss Lemma, random projection preserves pairwise distances with high probability.
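
A quick numerical check of the Johnson-Lindenstrauss behavior invoked here (the dimensions and the number of points are arbitrary choices):

```python
# Sketch: pairwise distances between a handful of signals are roughly preserved
# by a random projection from R^N to R^M.
import numpy as np

rng = np.random.default_rng(4)
N, M, Q = 4096, 200, 20                        # ambient dim, projected dim, number of points

X = rng.standard_normal((Q, N))
Phi = rng.standard_normal((M, N)) / np.sqrt(M)
Y = X @ Phi.T

ratios = []
for i in range(Q):
    for j in range(i + 1, Q):
        ratios.append(np.linalg.norm(Y[i] - Y[j]) / np.linalg.norm(X[i] - X[j]))
print(min(ratios), max(ratios))                # ratios concentrate near 1
```

All pairwise distance ratios cluster near 1, which is what makes the LRT nearly equivalent in the compressive domain.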

Smashed Filter
Compressive observations of a transformed signal.
Theorem: the structure of smooth manifolds (distances, geodesic distances, angles, volume, dimensionality, topology, local neighborhoods, …) is preserved by random projection w.h.p., provided M is large enough [Wakin et al. 2006; to appear in Foundations of Computational Mathematics].
[Figure: class manifolds M1, M2, M3 and their randomly projected images]

Stable Manifold Embedding
Theorem [Wakin et al. 2006]: Let F ⊂ R^N be a compact K-dimensional manifold with condition number 1/τ (curvature, self-avoiding) and volume V. Let Φ be a random M x N orthoprojector with M sufficiently large. Then, with probability at least 1-ρ, the following holds: for every pair x, y ∈ F, the projected distance ||Φx - Φy|| equals sqrt(M/N) ||x - y|| up to a multiplicative factor (1 ± ε).

Manifold Learning from Compressive Measurements
[Figure: embeddings learned by ISOMAP, HLLE, and Laplacian Eigenmaps from the original data in R^4096 and from random projections in R^M, M = 15 and M = 20]

Smashed Filter – Experiments
3 image classes: tank, school bus, SUV (N-pixel images)
Imaged using the single-pixel CS camera with:
– unknown shift
– unknown rotation

Smashed Filter – Unknown Position
Object shifted at random (K = 2 manifold); noise added to measurements.
Goals: identify the most likely position for each image class; identify the most likely class using a nearest-neighbor test.
[Figure: average shift-estimate error and classification rate (%) vs. number of measurements M, for increasing noise levels]

Smashed Filter – Unknown Rotation
Object rotated in 2-degree increments.
Goals: identify the most likely rotation for each image class; identify the most likely class using a nearest-neighbor test.
Perfect classification with as few as 6 measurements; good estimates of rotation with under 10 measurements.
[Figure: average rotation-estimate error vs. number of measurements M]

How Low Can M Go?
Empirical evidence that many fewer measurements are needed for effective classification than for reconstruction.
Late-breaking results (experimental + nascent theory).

Summary – Smashed Filter
Compressive measurements are information scalable: reconstruction > estimation > classification > detection.
Random projections preserve the structure of smooth manifolds (analogous to sparse signals).
Smashed filter: a dimension-reduced GLRT for parametrically transformed signals
– exploits compressive measurements and manifold structure
– broadly applicable: targets do not have to have a sparse representation in any basis
– effective for detection/classification
– number of measurements required appears to be independent of the ambient dimension

Rice/Duke | Compressive Optical Devices | August 2007 Compressive Phase Retrieval for Fourier Imagers

Coherent Diffraction Imaging
Image by sampling in the Fourier domain.
Challenge: we observe only the magnitude of the Fourier measurements.

Phase Retrieval
Given: Fourier magnitude + additional constraints (typically support). Goal: estimate the phase of the Fourier transform.
Compressive Phase Retrieval (CPR): replace the image support constraint with a sparsity/compressibility constraint (nonconvex reconstruction).
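
To make the idea concrete, here is a toy alternating-projection sketch, assuming a 1-D nonnegative K-sparse signal; it simply alternates between imposing the measured Fourier magnitudes and keeping the K largest entries. This is only an illustration of the sparsity-constrained concept, not the project's CPR algorithm, and like any phase retrieval it may converge to a shifted/flipped solution.

```python
# Toy alternating projections: enforce the measured Fourier magnitudes, then a
# nonnegativity + K-sparsity constraint. Success is measured by the magnitude misfit.
import numpy as np

rng = np.random.default_rng(5)
N, K = 256, 8
x = np.zeros(N)
x[rng.choice(N, K, replace=False)] = rng.random(K) + 0.5   # sparse nonnegative signal
mag = np.abs(np.fft.fft(x))                                # observed Fourier magnitudes

est = rng.random(N)                                        # random initialization
for _ in range(500):
    F = np.fft.fft(est)
    F = mag * np.exp(1j * np.angle(F))                     # impose measured magnitude
    est = np.real(np.fft.ifft(F))
    est[est < 0] = 0                                       # nonnegativity
    keep = np.argsort(np.abs(est))[-K:]                    # K-sparsity constraint
    mask = np.zeros(N, bool); mask[keep] = True
    est[~mask] = 0

misfit = np.linalg.norm(np.abs(np.fft.fft(est)) - mag) / np.linalg.norm(mag)
print(misfit)                                              # small when the iteration succeeds
```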

Rice/Duke | Compressive Optical Devices | August 2007 Conclusions and Future Directions

Rice/Duke | Compressive Optical Devices | August 2007 Project Outcomes
Forged collaboration between Rice and Duke teams:
– several joint papers in progress
Demonstrated new Compressive Imaging technologies:
– hardware testbeds/demos: hyperspectral, low-light, infrared DMD cameras; coded aperture spectral imagers
– new theory/algorithms: spectral image reconstruction/classification methods; smashed filter
Quantified performance:
– coded aperture tradeoffs
– multiplexing tradeoff
– number of measurements required for reconstruction/classification

Emerging Directions
Nonimaging cameras: exploit information scalability; attentive/adaptive cameras; meta-analysis; separating "imaging process" from "display"
Multiple cameras: image beamforming, 3D geometry imaging, …
Deeper links between physics and signal processing: significance of coherence and spectral projections
Links to analog-to-information program: nonidealities as challenges vs. opportunities
Other modalities: THz, LWIR/MWIR, UV, soft x-rays, …

Rice/Duke | Compressive Optical Devices | August 2007 N-Pixel Camera Performance
N: number of pixels; P: number of photons per pixel; T: total capture time; M: number of measurements; C_N: CS noise amplification constant.
Sensor array shown as baseline: 1 sensor per pixel, so CS is unnecessary.

Rice/Duke | Compressive Optical Devices | August 2007 Smashed Filter under Poisson Noise
Problem: vehicle image classification under a variable parameter (shift, rotation, etc.).
Image acquisition: M random projections under signal-dependent (Poisson) noise with the single-pixel camera; the limited capture time T is split among the M projections.
Solution: use the articulation manifold structure and generalized maximum likelihood classification (smashed filter).
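
A bare-bones sketch of the measurement model under an assumed Poisson regime: each of the M single-pixel measurements is a photon count with mean proportional to the pattern/scene inner product and the per-measurement exposure T/M, and classification maximizes the Poisson log-likelihood. For brevity this ignores the unknown articulation parameter; the templates and constants are made up.

```python
# Sketch: photon-count measurements with Poisson noise, classified by maximum likelihood
# over a small set of hypothetical class templates.
import numpy as np

rng = np.random.default_rng(6)
N, M, T = 256, 64, 1000.0
templates = rng.random((3, N))                        # hypothetical nonnegative templates
patterns = rng.integers(0, 2, (M, N)).astype(float)   # binary DMD patterns

true_class = 1
rates = patterns @ templates[true_class] * (T / M)    # mean photon count per measurement
y = rng.poisson(rates)                                # observed counts

def poisson_loglik(y, mean):
    return np.sum(y * np.log(mean + 1e-12) - mean)

scores = [poisson_loglik(y, patterns @ t * (T / M)) for t in templates]
print(int(np.argmax(scores)))                         # -> 1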

Rice/Duke | Compressive Optical Devices | August 2007 Smashed Filter Performance under Poisson Noise
[Figure: results for shift (2D manifold) and rotation (1D manifold)]
A small number of measurements M suffices for good performance; there is a "sweet spot" in M for shorter exposures T.

CS Hallmarks
CS changes the rules of the data acquisition game: it exploits a priori signal sparsity information.
Universal: the same random projections / hardware can be used for any compressible signal class (generic).
Democratic: each measurement carries the same amount of information; simple encoding; robust to measurement loss and quantization.
Asymmetrical: most processing is at the decoder.
Random projections are weakly encrypted.

Rice/Duke | Compressive Optical Devices | August 2007 Smashed Filter: How Low Can M Go?

Rice/Duke | Compressive Optical Devices | August 2007 Preservation of Manifold Structure
Manifold learning is used for classification, visualization of high-dimensional data, and robust parameter estimation.
A network of single-pixel cameras yields a randomly projected version of a low-dimensional image manifold.
New result: stable manifold learning is possible without ever reconstructing the original images.
The number of measurements sufficient for arbitrarily small learning error is linear in the information level K of the manifold.

Rice/Duke | Compressive Optical Devices | August 2007 Translating Disk Manifold
Learning algorithm: LTSA (Zhang, Zha). N = 64 x 64 = 4096, K = 2.
[Figure: embeddings learned from the original data (N = 4096) and from 25, 50, and 100 random projections]
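
A small reproduction sketch of this experiment (assuming scikit-learn is available; Isomap is used here as a convenient stand-in for LTSA, and the disk radius, number of images, and M = 50 are arbitrary choices):

```python
# Sketch: build a translating-disk image manifold, project it randomly,
# and learn a 2-D embedding from the projections alone.
import numpy as np
from sklearn.manifold import Isomap

rng = np.random.default_rng(7)
n = 64                                             # image side, N = 4096
yy, xx = np.mgrid[0:n, 0:n]

def disk_image(cx, cy, r=8):
    return ((xx - cx) ** 2 + (yy - cy) ** 2 < r ** 2).astype(float).ravel()

centers = rng.uniform(12, 52, size=(400, 2))       # K = 2 articulation parameters
X = np.array([disk_image(cx, cy) for cx, cy in centers])

M = 50                                             # random projections per image
Phi = rng.standard_normal((M, n * n)) / np.sqrt(M)
Y = X @ Phi.T

embedding = Isomap(n_neighbors=10, n_components=2).fit_transform(Y)
print(embedding.shape)                             # (400, 2): recovers the 2-D structure
```

The learned 2-D coordinates track the (x, y) disk positions even though the algorithm only ever sees the 50-dimensional random projections.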

Rice/Duke | Compressive Optical Devices | August 2007 Manifold Learning Using Random Projections
Demonstrates that random projections contain sufficient information about the manifold structure.
Two stages in manifold learning:
– intrinsic dimension estimation
– construction of a nonlinear map into a low-dimensional Euclidean space
New result: estimation errors in both stages due to the dimensionality-reducing projections can be controlled to arbitrary accuracy with a small number of measurements.
Ideal for distributed networks: sensors need to transmit very few pieces of information to the centralized learning algorithm.

Rice/Duke | Compressive Optical Devices | August 2007 Intrinsic Dimension Estimation
GP (Grassberger-Procaccia) algorithm used directly on random projections of hyperspheres.
Empirically compute the number of measurements required for the estimate to be within 10% of the original.
Observation: M is linear in the intrinsic dimension K.
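
For reference, a compact sketch of Grassberger-Procaccia-style correlation-dimension estimation applied to random projections of points drawn from a K-dimensional hypersphere (the quantile-based slope fit, sample sizes, and dimensions are assumptions; scipy is assumed available):

```python
# Sketch: correlation-dimension (GP-style) estimate computed directly on M random
# projections of points sampled from a K-dimensional hypersphere embedded in R^N.
import numpy as np
from scipy.spatial.distance import pdist

rng = np.random.default_rng(8)
K, N, M, n_pts = 3, 200, 25, 1500

sphere = rng.standard_normal((n_pts, K + 1))
sphere /= np.linalg.norm(sphere, axis=1, keepdims=True)   # points on S^K (intrinsic dim K)
X = sphere @ rng.standard_normal((K + 1, N))              # embed in R^N
Y = X @ (rng.standard_normal((N, M)) / np.sqrt(M))        # M random projections

d = pdist(Y)
r1, r2 = np.quantile(d, 0.02), np.quantile(d, 0.10)       # fit slope over a small-r range
C1, C2 = np.mean(d < r1), np.mean(d < r2)                 # correlation integral C(r)
dim_est = (np.log(C2) - np.log(C1)) / (np.log(r2) - np.log(r1))
print(dim_est)                                             # roughly K = 3
```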

Rice/Duke | Compressive Optical Devices | August 2007 Real data: Hand rotation database N = 64 x 60 = 3840, K = 2

Rice/Duke | Compressive Optical Devices | August 2007 New Bound for Classification?
Smashed filter with a nearest-neighbor classifier.
Indyk, Naor: preservation of approximate nearest neighbors requires merely O(K) random projections.
Minimum number of measurements required for classification in the noiseless case (where D is the minimum separation between signal classes): [bound given on slide].

Rice/Duke | Compressive Optical Devices | August 2007 Experiment: Hyperspherical Manifolds
1000 labeled training samples each from two unit 3-dimensional hyperspheres, separated by a distance D along an arbitrary direction in high-dimensional space.
Generate unlabeled samples and perform nearest-neighbor classification in the compressed ("smashed") domain.
Determine the minimum number of measurements M required to obtain a 99% classification rate.
Bound: M decreases as the square of the separation distance.
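
A sketch in the same spirit (the ambient dimension, the separation D, the test-set size, and the use of a fixed coordinate direction for the offset are all assumptions):

```python
# Sketch: two unit 3-D hyperspheres in a high-dimensional space, separated by D;
# nearest-neighbor classification is done after random projection, sweeping M.
import numpy as np

rng = np.random.default_rng(9)
N, K, D = 1000, 3, 1.0                      # ambient dim, sphere dim, class separation
offset = np.zeros(N); offset[0] = 2.0 + D   # unit spheres, centers 2 + D apart -> gap D

def sample_sphere(n):
    p = rng.standard_normal((n, K + 1))
    p /= np.linalg.norm(p, axis=1, keepdims=True)
    out = np.zeros((n, N)); out[:, :K + 1] = p
    return out

train = np.vstack([sample_sphere(1000), sample_sphere(1000) + offset])
labels = np.r_[np.zeros(1000), np.ones(1000)]
test = np.vstack([sample_sphere(200), sample_sphere(200) + offset])
truth = np.r_[np.zeros(200), np.ones(200)]

for M in (1, 2, 4, 8, 16):
    Phi = rng.standard_normal((M, N)) / np.sqrt(M)
    tr, te = train @ Phi.T, test @ Phi.T
    d2 = (te**2).sum(1)[:, None] + (tr**2).sum(1)[None, :] - 2 * te @ tr.T
    pred = labels[np.argmin(d2, axis=1)]
    print(M, np.mean(pred == truth))        # accuracy climbs toward 99%+ as M grows
```

Sweeping D and recording the smallest M that reaches 99% accuracy would reproduce the trend the slide refers to.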

Rice/Duke | Compressive Optical Devices | August 2007 Hyperspherical manifolds: empirical verification of bound

Why Does CS Work (1)?
A random projection is not full rank, but it stably embeds sparse/compressible signal models (CS) and point clouds (JL) into a lower-dimensional space with high probability.
Stable embedding: preserves structure (distances between points, angles between vectors, …).

Why Does CS Work (1)?
A random projection is not full rank, but it stably embeds sparse/compressible signal models (CS) and point clouds (JL) into a lower-dimensional space with high probability.
Stable embedding: preserves structure (distances between points, angles between vectors, …) provided M is large enough, on the order of K log(N/K) for K-sparse signals.
[Figure: Compressive Sensing: the K-sparse model as a union of K-dimensional planes]

CS Signal Recovery
Recover the sparse/compressible signal x from the CS measurements y via linear programming.
[Figure: recovery on the K-sparse model (union of K-dimensional planes)]
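
A hedged sketch of one standard way to do this step, assuming scipy is available: basis pursuit, min ||x||_1 subject to Phi x = y, written as a linear program via the split x = u - v with u, v >= 0. The problem sizes are arbitrary.

```python
# Sketch: basis pursuit as a linear program, solved with scipy's linprog (HiGHS backend).
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(10)
N, K, M = 128, 5, 40

x = np.zeros(N)
x[rng.choice(N, K, replace=False)] = rng.standard_normal(K)
Phi = rng.standard_normal((M, N)) / np.sqrt(M)
y = Phi @ x

c = np.ones(2 * N)                          # minimize sum(u) + sum(v) = ||x||_1
A_eq = np.hstack([Phi, -Phi])               # Phi (u - v) = y
res = linprog(c, A_eq=A_eq, b_eq=y, bounds=[(0, None)] * (2 * N), method="highs")
x_hat = res.x[:N] - res.x[N:]
print(np.linalg.norm(x_hat - x) / np.linalg.norm(x))   # ~0 when M is large enough
```

With M comfortably above the K log(N/K) regime, the recovered x_hat matches the true sparse signal to solver precision.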

Why Does CS Work (2)?
A random projection is not full rank, but it stably embeds sparse/compressible signal models (CS) and point clouds (JL) into a lower-dimensional space with high probability.
Stable embedding: preserves structure (distances between points, angles between vectors, …) provided M is large enough, on the order of log Q for a cloud of Q points (Johnson-Lindenstrauss).
[Figure: Johnson-Lindenstrauss embedding of Q points]

Tree-based classification Refine classification of blocks having neighbors from a different class
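
A minimal sketch of the refinement rule described here, assuming the coarse result is a 2-D grid of block labels; the helper name blocks_to_refine and the example label grid are made up.

```python
# Sketch: flag blocks whose 4-neighbors carry a different label; those blocks would be
# split and re-classified at a finer scale.
import numpy as np

labels = np.array([[0, 0, 1, 1],
                   [0, 0, 1, 1],
                   [0, 2, 2, 1],
                   [2, 2, 2, 2]])

def blocks_to_refine(labels):
    flag = np.zeros_like(labels, dtype=bool)
    flag[:-1, :] |= labels[:-1, :] != labels[1:, :]   # disagree with block below
    flag[1:, :]  |= labels[1:, :] != labels[:-1, :]   # disagree with block above
    flag[:, :-1] |= labels[:, :-1] != labels[:, 1:]   # disagree with block to the right
    flag[:, 1:]  |= labels[:, 1:] != labels[:, :-1]   # disagree with block to the left
    return flag

print(blocks_to_refine(labels).astype(int))
```

Flagged blocks would then be split (quad-tree style) and re-classified at the finer scale.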
