Recent Advances of Compact Hashing for Large-Scale Visual Search Shih-Fu Chang Columbia University October 2012 Joint work with Junfeng He (Facebook),

1 Recent Advances of Compact Hashing for Large-Scale Visual Search Shih-Fu Chang Columbia University October 2012 Joint work with Junfeng He (Facebook), Sanjiv Kumar (Google), Wei Liu (IBM Research), and Jun Wang (IBM Research)

2 Outline
– Lessons learned in designing hashing functions
– The importance of balancing hash bucket size
– How to incorporate supervised information
– Prediction of NN search difficulty & hashing performance
– Demo: bag of hash bits for mobile visual search

3 Fast Nearest Neighbor Search Applications: image search, texture synthesis, denoising, … Avoid exhaustive search (O(n) time complexity). Examples: dense matching / coherence-sensitive hashing (Korman & Avidan ’11), Photo Tourism patch search, image search.

4 Locality-Sensitive Hashing [Indyk & Motwani 1998] [Datar et al. 2004] Random hash functions index each point by a compact code; the hash-code collision probability is proportional to the original similarity. l: # hash tables, K: hash bits per table.
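The random-projection flavor of LSH described on this slide can be sketched in a few lines (a toy illustration with assumed dimensions, not any particular paper's code): K random hyperplanes produce a K-bit code, and more similar points agree on more bits.

```python
import numpy as np

rng = np.random.default_rng(0)

def lsh_code(X, W):
    """K-bit code per row of X: the sign pattern of K random projections."""
    return (X @ W > 0).astype(np.uint8)

d, K = 64, 16                      # feature dimension, bits per table (toy values)
W = rng.standard_normal((d, K))    # one random hyperplane per bit

X = rng.standard_normal((1000, d))
codes = lsh_code(X, W)

# Collision probability tracks similarity: a slightly perturbed copy of a
# point shares far more bits with it than an unrelated random point does.
x = X[0]
near = x + 0.05 * rng.standard_normal(d)
far = rng.standard_normal(d)
bits_near = int((lsh_code(near[None], W)[0] == codes[0]).sum())
bits_far = int((lsh_code(far[None], W)[0] == codes[0]).sum())
```

With l independent tables, a query is probed in each table and the candidate sets are merged, trading memory for recall.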

5 Hash Table based Search O(1) search time by table lookup: the hash code serves as the bucket address. Bucket size is important (it affects accuracy & post-processing cost).
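The table-lookup search above can be sketched as follows (toy data and parameters assumed): the packed bit string is the bucket address, the table is built once, and each query reads a single bucket.

```python
import numpy as np
from collections import defaultdict

rng = np.random.default_rng(1)
d, K = 32, 12
W = rng.standard_normal((d, K))          # K random hyperplanes
X = rng.standard_normal((5000, d))

def code_key(v):
    """K-bit hash code of a vector, packed into a hashable bucket address."""
    return (v @ W > 0).astype(np.uint8).tobytes()

# Build the table once in O(n); each lookup afterwards is O(1).
table = defaultdict(list)
for i, x in enumerate(X):
    table[code_key(x)].append(i)

# Query: read one bucket; its size determines the post-processing cost.
candidates = table[code_key(X[42])]
bucket_sizes = [len(v) for v in table.values()]
```

A very uneven `bucket_sizes` distribution is exactly the failure mode the slide warns about: one huge bucket makes the post-lookup reranking as slow as a linear scan.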

6 Different Approaches
– Unsupervised hashing: LSH ’98, SH ’08, KLSH ’09, AGH ’10, PCAH, ITQ ’11
– Semi-supervised hashing: SSH ’10, WeaklySH ’10
– Supervised hashing: RBM ’09, BRE ’10, MLH ’11, LDAH ’11, ITQ ’11, KSH ’12

7 PCA + Minimize Quantization Errors (ITQ, Gong & Lazebnik, CVPR ’11) PCA maximizes variance in each hash dimension; then find the optimal rotation in the subspace to minimize quantization error.
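The ITQ alternation can be sketched as a toy re-implementation (random data, assumed sizes; not the authors' code): fix the rotation R and binarize, then fix the binary codes B and solve an orthogonal Procrustes problem for R via SVD.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy data, zero-centered, projected onto the top-c PCA directions.
X = rng.standard_normal((500, 8))
X -= X.mean(0)
c = 4
_, _, Vt = np.linalg.svd(X, full_matrices=False)
V = X @ Vt[:c].T                   # top-c PCA projections

def quant_loss(V, R):
    B = np.sign(V @ R)
    return float(np.linalg.norm(B - V @ R) ** 2)

R = np.linalg.qr(rng.standard_normal((c, c)))[0]   # random rotation init
loss0 = quant_loss(V, R)
for _ in range(30):
    B = np.sign(V @ R)             # fix R: binarize
    U, _, Wt = np.linalg.svd(B.T @ V)
    R = (U @ Wt).T                 # fix B: rotation minimizing ||B - V R||_F
loss1 = quant_loss(V, R)
```

Each half-step is a closed-form minimizer, so the quantization error is non-increasing; the learned rotation aligns the PCA cloud with the binary hypercube corners.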

8 Effects of Minimizing Quantization Errors (PCA-ITQ, Gong & Lazebnik, CVPR ’11) 580K tiny images: PCA + random rotation vs. PCA-ITQ optimal alignment.

9 Utilize Supervised Labels Two forms of supervision: metric supervision (similar / dissimilar pairs) and semantic category supervision.

10 Design Hash Codes to Match Supervised Information Preferred hashing function: similar pairs map to the same bit value, dissimilar pairs to different bit values.

11 Adding Supervised Labels to PCA Hash (Wang, Kumar, Chang, CVPR ’10, ICML ’10) Relaxation: fit the pairwise labels (similar / dissimilar pairs) while keeping the PCA covariance term, giving an “adjusted” covariance matrix. Solution W: eigenvectors of the adjusted covariance matrix. If there is no supervision (S = 0), it reduces to PCA hash.
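A minimal sketch of the adjusted-covariance idea, under stated assumptions: toy random data, a sparse pairwise label matrix S (+1 similar, −1 dissimilar), and an assumed weight `eta` balancing the unsupervised variance term against the label-fitting term.

```python
import numpy as np

rng = np.random.default_rng(3)
n, d, bits = 200, 16, 4
X = rng.standard_normal((n, d))
X -= X.mean(0)

# Sparse pairwise label matrix S: +1 similar, -1 dissimilar, 0 unknown.
S = np.zeros((n, n))
for _ in range(50):
    i, j = rng.integers(0, n, 2)
    S[i, j] = S[j, i] = rng.choice([-1.0, 1.0])

eta = 1.0                                   # assumed weight on the PCA term
M = X.T @ S @ X + eta * X.T @ X             # "adjusted" covariance matrix
M = (M + M.T) / 2                           # symmetrize for numerical safety
vals, vecs = np.linalg.eigh(M)
W = vecs[:, np.argsort(vals)[::-1][:bits]]  # top eigenvectors = projections

codes = np.sign(X @ W)

# Sanity check of the slide's limit case: with S = 0 the adjustment
# vanishes and M is just the (scaled) PCA covariance.
M_pca = eta * X.T @ X
```

The label term rewards projections that separate dissimilar pairs and co-locate similar pairs, while the PCA term regularizes toward high-variance directions.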

12 Semi-Supervised Hashing (SSH) 1 million GIST images; 1% labeled, 99% unlabeled. Precision @ top 1K compared across supervised RBM, random LSH, unsupervised SH, and SSH.

13 Problem of Orthogonal Projections Many buckets become empty as # bits increases, so many neighboring buckets must be searched at query time (measured by precision @ Hamming radius 2).

14 SPICA Hash (He et al., CVPR ’11) Explicitly optimize two terms: preserving similarity (search accuracy; an ICA-type objective) and balanced bucket size (max entropy / min mutual information I; search time). Fast ICA finds non-orthogonal projections.

15 The Importance of Balanced Bucket Size Simulation over 1M tiny-image samples (bucket size vs. bucket index): the largest bucket of LSH contains 10% of all 1M samples, while SPICA Hash produces balanced bucket sizes.

16 Different Approaches
– Unsupervised hashing: LSH ’98, SH ’08, KLSH ’09, AGH ’10, PCAH, ITQ ’11
– Semi-supervised hashing: SSH ’10, WeaklySH ’10
– Supervised hashing: RBM ’09, BRE ’10, MLH ’11, LDAH ’11, ITQ ’11, KSH ’12

17 Better Ways to Handle Supervised Information? BRE [Kulis & Darrell ’10] and MLH [Norouzi & Fleet ’11] define losses (e.g., a hinge loss) on the Hamming distance between H(x_i) and H(x_j). But optimizing Hamming distance (D_H, built from XORs) directly is not easy!

18 A New Supervision Form: Code Inner Products (Liu, Wang, Ji, Jiang, Chang, CVPR ’12) Fit the inner products of the r-bit code matrix to the pairwise label matrix S (+1 for similar pairs, −1 for dissimilar). Proof: the code inner product is equivalent to Hamming distance (an affine transform of it).
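The equivalence this slide relies on can be checked numerically: for codes in {−1, +1}^r, the inner product and the Hamming distance are related by the exact identity ⟨h_i, h_j⟩ = r − 2·D_H(h_i, h_j).

```python
import numpy as np

rng = np.random.default_rng(4)
r, n = 32, 100
H = rng.choice([-1, 1], size=(n, r))        # n codes of r bits, in {-1, +1}

inner = H @ H.T                             # all pairwise code inner products
hamming = (H[:, None, :] != H[None, :, :]).sum(-1)

# <h_i, h_j> = (#agreeing bits) - (#disagreeing bits) = r - 2 * D_H.
# Fitting inner products to labels therefore fits an affine map of the
# Hamming distance, but inner products stay smooth in relaxed codes,
# which is what makes the optimization tractable.
```

This is why supervising the Gram matrix H Hᵀ (rather than the XOR-based distance) is the easier target.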

19 Code Inner Product Enables Efficient Optimization (Liu, Wang, Ji, Jiang, Chang, CVPR ’12) Design hash codes (a sample × hash-bit matrix) to match the supervised information; much easier/faster to optimize, and extends to kernels.

20 Extend Code Inner Product to Kernels Following KLSH, construct each hash function from a kernel function and m anchor samples over l training samples: h(x) = sgn(aᵀk̄(x)), where k̄(x) is the kernel vector against the anchors with zero-mean normalization applied, and a holds the hash coefficients.
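A sketch of this kernelized hash function under stated assumptions: an RBF kernel with an assumed bandwidth `gamma`, random anchors, and random hash coefficients `A` standing in for the coefficients that KSH would actually learn from labels.

```python
import numpy as np

rng = np.random.default_rng(5)
d, m, l = 16, 30, 200
anchors = rng.standard_normal((m, d))       # m anchor samples
Xtrain = rng.standard_normal((l, d))        # l training samples

def rbf(X, Z, gamma=0.1):
    """RBF kernel matrix between rows of X and rows of Z (gamma assumed)."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

mu = rbf(Xtrain, anchors).mean(0)           # zero-mean normalization of k(x)
A = rng.standard_normal((m, 8))             # hash coefficients (learned in KSH)

def kernel_hash(X):
    return np.sign((rbf(X, anchors) - mu) @ A)

codes = kernel_hash(Xtrain)
```

Zero-centering the kernel vector makes each bit roughly balanced over the training set, which keeps the sign function from collapsing a bit to a constant.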

21 Benefits of Code Inner Product CIFAR-10: 60K object images from 10 classes; 1K query images; 1K supervised labels. KSH₀ uses spectral relaxation; KSH uses a sigmoid hashing function. Open issue: empty buckets and bucket balance are not addressed.

22 Speedup by Code Inner Product (CVPR 2012)

Method | Train time | Test time (48 bits)
SSH    | 2.1        | 0.9×10⁻⁵
LDAH   | 0.7        | 0.9×10⁻⁵
BRE    | 494.7      | 2.9×10⁻⁵
MLH    | 3666.3     | 1.8×10⁻⁵
KSH₀   | 7.0        | 3.3×10⁻⁵
KSH    | 156.1      | 4.3×10⁻⁵

Significant speedup over BRE and MLH.

23 Effect of Training Data Size

24 Tiny-1M (CVPR 2012) 1M tiny images from the web; 2K query images. Pseudo-labels: top L2 nearest neighbors (5K NNs) used as supervised labels for the supervised methods.

25 Tiny-1M: Visual Search Results (CVPR 2012) More visually relevant results.

26 Comparison: KD-Tree O(log n) search time (e.g., tree depth ~20 for 1 million nodes); e.g., the VLFeat/FLANN tools with the best-bin-first search strategy. Curse of dimensionality: needs backtracking. Tree indexes may also be hard to store on small devices.

27 Comparison of Hashing vs. KD-Tree Photo Tourism patch set (Notre Dame subset, 103K samples), 512-D GIST. Methods: supervised hashing, Anchor Graph Hashing, KD-tree.

28 Understand Difficulty of Approximate Nearest Neighbor Search (He, Kumar, Chang, ICML 2012) How difficult is approximate NN search in a given dataset? Toy example: x is an ε-approximate NN of query q if d(q, x) ≤ (1 + ε)·d(q, x_NN). If nearly all points satisfy this, search is not meaningful. Can we define a concrete measure of search difficulty for a dataset?

29 Relative Contrast (He, Kumar, Chang, ICML 2012) A naïve search approach: randomly pick a point and compare its distance to the query against the NN distance. Relative contrast C_r = d_mean / d_NN. High relative contrast → easier search; if C_r ≈ 1, search is not meaningful.
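Relative contrast is easy to estimate empirically. A sketch (toy uniform data, assumed sizes) that also reproduces the dimensionality effect discussed on the following slides: in high dimensions distances concentrate, so C_r falls toward 1.

```python
import numpy as np

rng = np.random.default_rng(6)

def relative_contrast(X, queries, p=2):
    """C_r averaged over queries: mean distance to a random point
    divided by the distance to the nearest neighbor."""
    cs = []
    for q in queries:
        if p == 1:
            dist = np.abs(X - q).sum(1)
        else:
            dist = np.linalg.norm(X - q, axis=1)
        cs.append(dist.mean() / dist.min())
    return float(np.mean(cs))

n = 2000
X_lo = rng.random((n, 8))        # low-dimensional: distances spread out
X_hi = rng.random((n, 512))      # high-dimensional: distances concentrate
Q_lo = rng.random((20, 8))
Q_hi = rng.random((20, 512))

c_lo = relative_contrast(X_lo, Q_lo)
c_hi = relative_contrast(X_hi, Q_hi)
```

`c_hi` close to 1 signals a dataset where even exact NN search returns points barely closer than random ones.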

30 Estimation of Relative Contrast With the CLT and a binomial approximation, C_r can be estimated from data properties. φ: standard Gaussian CDF; σ′: a function of data properties (dimensionality and sparsity); n: data size; p: the L_p distance used.

31 Synthetic Data Data sampled randomly from U[0,1]. Relative contrast: higher dimensionality → bad; sparser vectors → good. s: probability of a non-zero element in each dimension; d: feature dimension.

32 Synthetic Data (continued) Data sampled randomly from U[0,1]. Relative contrast: lower p → good; larger database → good.

33 Predict Hashing Performance of Real-World Data (16-bit and 28-bit LSH)

Dataset      | Dimensionality (d) | Sparsity (s) | Relative contrast (C_r), p = 1
SIFT         | 128                | 0.89         | 4.78
Gist         | 384                | 1.00         | 1.83
Color Hist   | 1382               | 0.027        | 3.19
ImageNet BoW | 10000              | 0.024        | 1.90

34 Mobile Search System by Hashing Light computing, low bit rate, big-data indexing. He, Feng, Liu, Cheng, Lin, Chung, Chang. Mobile Product Search with Bag of Hash Bits and Boundary Reranking, CVPR 2012.

35 Estimate the Complexity ~500 local features per image; feature size ~128 KB per image, i.e., more than 10 seconds for transmission over 3G. Database indexing: 1 million images yield 0.5 billion local features, so finding matched features becomes challenging. Idea: directly compute compact hash codes on mobile devices.

36 Approach: Hashing Each local feature is coded as hash bits (locality-sensitive, efficient for high dimensions); each image is represented as a bag of hash bits.

37 Bit Reuse for Multi-Table Hashing To reduce transmission size, reuse a single optimized hash-bit pool (e.g., 80 bits, from PCA Hash or SPICA Hash) by random subsampling: each of 12 tables draws a random 32-bit subset, and the results from all tables are unioned.
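The bit-reuse scheme can be sketched as follows (sizes taken from the slide, data random): transmitting one 80-bit pool per feature replaces sending 12 × 32 = 384 independent bits, yet still feeds 12 hash tables.

```python
import numpy as np

rng = np.random.default_rng(7)

pool_bits = 80                      # one optimized bit pool per feature
n_tables, bits_per_table = 12, 32

# Each table reuses a fixed random 32-bit subset of the same 80-bit pool.
subsets = [rng.choice(pool_bits, bits_per_table, replace=False)
           for _ in range(n_tables)]

def multi_table_keys(pool):
    """pool: 0/1 array of 80 bits -> one bucket key per table."""
    return [pool[s].tobytes() for s in subsets]

db = [rng.integers(0, 2, pool_bits, dtype=np.uint8) for _ in range(1000)]
tables = [dict() for _ in range(n_tables)]
for idx, pool in enumerate(db):
    for t, key in enumerate(multi_table_keys(pool)):
        tables[t].setdefault(key, []).append(idx)

# Query: union of candidate lists across all tables.
q = db[7]
cands = set()
for t, key in enumerate(multi_table_keys(q)):
    cands.update(tables[t].get(key, []))
```

The subsets overlap, so the tables are not independent, but the recall boost from multiple tables is largely retained at a quarter of the transmission cost.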

38 Rerank Results with Boundary Features Use automatic salient-object segmentation for every image in the DB [Cheng et al., CVPR 2011]. Compute boundary features: normalized central distance and its Fourier magnitude. Invariant to translation, scaling, and rotation.

39 Boundary Feature – Central Distance Distance to the shape center D(n), and its Fourier magnitude F(n).
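A minimal sketch of this boundary descriptor (sample counts and coefficient counts are assumed, not from the paper): distances to the centroid remove translation, dividing by their mean removes scale, and taking FFT magnitudes removes rotation / starting-point dependence.

```python
import numpy as np

def boundary_feature(contour, n_samples=64, n_coeffs=10):
    """Normalized central-distance signature -> Fourier magnitudes.

    contour: (N, 2) ordered boundary points of a segmented object.
    """
    c = contour.mean(0)                              # centroid
    idx = np.linspace(0, len(contour) - 1, n_samples).astype(int)
    D = np.linalg.norm(contour[idx] - c, axis=1)     # central distance D(n)
    D = D / D.mean()                                 # scale normalization
    F = np.abs(np.fft.fft(D))                        # |F(n)|: shift-invariant
    return F[1:n_coeffs + 1]                         # drop the DC term

# An ellipse, then the same ellipse scaled, rotated 90°, and translated.
t = np.linspace(0, 2 * np.pi, 200, endpoint=False)
ellipse = np.stack([2 * np.cos(t), np.sin(t)], 1)
moved = 3.0 * ellipse @ np.array([[0.0, -1.0], [1.0, 0.0]]) + np.array([5.0, -2.0])

f1 = boundary_feature(ellipse)
f2 = boundary_feature(moved)
```

Because the two contours differ only by a similarity transform, their descriptors should match, which is exactly the invariance the reranking step depends on.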

40 Reranking with Boundary Features

41 Mobile Product Search System: Bags of Hash Bits and Boundary Features (video demo, 52″) Server: 1 million product images crawled from Amazon, eBay, and Zappos; hundreds of categories (shoes, clothes, electrical devices, groceries, kitchen supplies, movies, etc.). Speed: feature extraction ~1 s; transmission 80 bits/feature, ~1 KB/image; server search ~0.4 s; download/display 1–2 s. He, Feng, Liu, Cheng, Lin, Chung, Chang. Mobile Product Search with Bag of Hash Bits and Boundary Reranking, CVPR 2012.

42 Performance Baseline [Chandrasekhar et al., CVPR ’10]: client compresses local features with CHoG; server uses BoW with a vocabulary tree (1M codes). The hashing system achieves 30% higher recall and a 6×–30× search speedup.

43 Summary Ideas discussed:
– bucket balancing is important
– code inner product: an efficient form of supervised hashing
– insights on predicting search difficulty
– large-scale mobile search: a good test case for hashing
Open issues:
– supervised hashing vs. attribute discovery
– hashing beyond point-to-point search
– hashing that incorporates structured relations (spatio-temporal)

44 References
(Supervised Kernel Hash) W. Liu, J. Wang, R. Ji, Y. Jiang, S.-F. Chang. Supervised Hashing with Kernels. CVPR 2012.
(Difficulty of Nearest Neighbor Search) J. He, S. Kumar, S.-F. Chang. On the Difficulty of Nearest Neighbor Search. ICML 2012.
(Hash-Based Mobile Product Search) J. He, T. Lin, J. Feng, X. Liu, S.-F. Chang. Mobile Product Search with Bag of Hash Bits and Boundary Reranking. CVPR 2012.
(Hashing with Graphs) W. Liu, J. Wang, S. Kumar, S.-F. Chang. Hashing with Graphs. ICML 2011.
(Iterative Quantization) Y. Gong, S. Lazebnik. Iterative Quantization: A Procrustean Approach to Learning Binary Codes. CVPR 2011.
(Semi-Supervised Hash) J. Wang, S. Kumar, S.-F. Chang. Semi-Supervised Hashing for Scalable Image Retrieval. CVPR 2010.
(ICA Hashing) J. He, R. Radhakrishnan, S.-F. Chang, C. Bauer. Compact Hashing with Joint Optimization of Search Accuracy and Time. CVPR 2011.

