1 Dept. of Computer Science & Engineering, CUHK Pseudo Relevance Feedback with Biased Support Vector Machine in Multimedia Retrieval Steven C.H. Hoi 14-Oct, 2003

2 Outline Introduction Related Work and Motivation SVM and one-class SVM Biased Support Vector Machine (BSVM) Pseudo Relevance Feedback (PRF) with BSVM Experimental results Future Work and Conclusions

3 Introduction Relevance Feedback (RF) –Is an interactive technique for query reformulation. –Has been successfully applied in content-based image retrieval (CBIR). Drawbacks of RF –Requires the user's intervention to provide relevance information, which imposes a burden on users. –Most RF algorithms focus only on positive examples. Pseudo RF (PRF) alleviates the user's burden –Has been successfully used in text information retrieval. –Uses potential positive and negative examples to refine retrieval quality.

4 Related Work Classical RF Algorithms –From heuristic weight adjustment to optimal learning: transductive learning (EM, D-EM), DA, SVM, 1-SVM Motivation of BSVM –Classical RF algorithms are of limited help in the context of PRF. –Biased feature of PRF: potential positive examples are limited and clustered in some way, while much negative information is available and scattered in different ways. –In Ref. [1], Yan et al. propose using SVM ensembles to perform PRF. –SVMs have shown good generalization performance on classification tasks. –However, classical SVMs treat positive and negative examples symmetrically. –We propose the Biased SVM to address the biased feature of PRF.

5 SVM & 1-SVM SVM –Implements the principle of structural risk minimization and can provide good generalization performance. –In the linearly separable case, the goal of SVM is to find the pair of parallel hyperplanes that gives the maximum margin.
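The margin-maximization idea can be illustrated with a generic soft-margin SVM; a minimal sketch using scikit-learn on toy data (not the slides' own implementation):

```python
import numpy as np
from sklearn.svm import SVC

# Toy 2-D data: two linearly separable clusters
X = np.array([[0.0, 0.0], [0.2, 0.1], [0.1, 0.3],
              [2.0, 2.0], [2.2, 1.9], [1.8, 2.1]])
y = np.array([1, 1, 1, -1, -1, -1])

# Linear soft-margin SVM: finds the maximum-margin separating hyperplane
clf = SVC(kernel="linear", C=1.0)
clf.fit(X, y)
```

In the separable case the learned boundary sits midway between the two parallel supporting hyperplanes through the support vectors.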

6 1-SVM –1-SVM focuses only on estimating the support of the positive examples. –Two different formulations of 1-SVM appear in the literature. –The sphere-based approach
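The hyperplane-based 1-SVM formulation is available as `OneClassSVM` in scikit-learn; a minimal sketch fitted on positive examples only (for an RBF kernel the sphere- and hyperplane-based formulations are known to give equivalent solutions):

```python
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.RandomState(0)
# Positive examples only: one tight cluster around the origin
X_pos = 0.3 * rng.randn(100, 2)

# nu upper-bounds the fraction of training points treated as outliers
ocsvm = OneClassSVM(kernel="rbf", nu=0.1, gamma=0.5)
ocsvm.fit(X_pos)

# predict: +1 = inside the estimated support region, -1 = outside
inside = ocsvm.predict([[0.0, 0.0]])
outside = ocsvm.predict([[5.0, 5.0]])
```

Points far from the positive cluster fall outside the estimated support, which is exactly the behavior the slides contrast against BSVM below.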

7 BSVM –The strategy of our technique is to take two sphere-hyperplanes to describe the data. –We want the inner sphere-hyperplane to be as small as possible while including most of the positive data. –On the other hand, we want the outer sphere-hyperplane to be as large as possible while containing as little of the negative data as possible.

8 BSVM Let us consider the following training data: where n is the number of observations, m is the dimension of the input space, and y_i is the class label of the i-th training example. The objective function for finding the optimal sphere-hyperplane can be formulated below:
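The slide's formula was an image and is not preserved in this transcript; the following is a plausible LaTeX reconstruction modeled on the sphere-based (SVDD-style) formulation with labeled examples — the exact regularization and bias terms of the original BSVM may differ:

```latex
\min_{R,\,c,\,\xi}\; R^2 + \frac{1}{\nu n}\sum_{i=1}^{n}\xi_i
\quad \text{s.t.}\quad
y_i\left(R^2 - \|\phi(\mathbf{x}_i) - c\|^2\right) \ge -\xi_i,
\qquad \xi_i \ge 0,\; i = 1,\dots,n,
```

for training data $\{(\mathbf{x}_i, y_i)\}_{i=1}^{n}$ with $\mathbf{x}_i \in \mathbb{R}^m$ and $y_i \in \{+1,-1\}$: positive examples are pulled inside the sphere of radius $R$ centered at $c$ in feature space, and negative examples are pushed outside it.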

9 The optimization task can be solved by Lagrange multipliers:

10 Taking the partial derivative of L with respect to each parameter in turn and setting it to zero, we obtain the following equations,

11 From these equations, we obtain the dual form of the primal optimization problem: The decision function can be constructed as follows,

12 Differences between BSVM and SVM –Geometric interpretation Compared with the linear decision hyperplane of SVM, BSVM takes a sphere to cluster the class of positive examples, while the class of negative examples is pushed away from the sphere. –Mathematical perspective The constraint in the optimization of SVM takes one form, while it becomes another form in BSVM. This means the positive class C1 receives more weight than the negative class C2, which makes the final decision boundary biased toward the positive class.
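The two constraints contrasted on this slide were shown as images; in standard notation they are (a sketch using the usual dual formulations):

```latex
\text{SVM dual:}\quad \sum_{i=1}^{n} \alpha_i y_i = 0,
\qquad\qquad
\text{sphere-based dual:}\quad \sum_{i=1}^{n} \alpha_i y_i = 1.
```

In the sphere-based form the positive multipliers must outweigh the negative ones by a fixed amount, which is what biases the decision boundary toward the positive class.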

13 PRF with BSVM Compared with SVM –SVM will classify many unseen samples as positive even though some of them do not actually belong to the class. –BSVM yields a more reasonable boundary by biasing toward the compact positive class.

14 Compared with 1-SVM –Without incorporating negative information, one-class SVM will classify many negative samples as positive. –BSVM can model the data well with the negative constraints.

15 PRF with BSVM Implementation –Solving the dual optimization equation (Equation (*)) is a well-known quadratic programming (QP) problem. –After solving the QP problem, we obtain a set of optimal parameters α's. Then we can compute the evaluation function for each image as follows,
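With the sphere center expanded as a kernel sum over support vectors, each image can be ranked by its feature-space distance to the center. A sketch of that evaluation, with hypothetical, made-up multipliers standing in for an actual QP solution (`sv`, `ay`, and `dist2_to_center` are illustrative names, not from the slides):

```python
import numpy as np

def rbf(a, b, gamma=0.5):
    """RBF kernel k(a, b)."""
    return np.exp(-gamma * np.sum((a - b) ** 2))

# Hypothetical support vectors and signed multipliers alpha_i * y_i;
# in practice these come from solving the QP problem.
sv = np.array([[0.0, 0.0], [0.5, 0.0], [3.0, 3.0]])
ay = np.array([0.6, 0.6, -0.2])  # two positives, one negative

# Constant term: sum_ij (a_i y_i)(a_j y_j) k(x_i, x_j)
const = sum(ay[i] * ay[j] * rbf(sv[i], sv[j])
            for i in range(len(sv)) for j in range(len(sv)))

def dist2_to_center(x):
    """Squared feature-space distance from phi(x) to the sphere center."""
    cross = sum(ay[i] * rbf(sv[i], x) for i in range(len(sv)))
    return rbf(x, x) - 2 * cross + const

# Rank candidate images: smaller distance = more relevant
candidates = [np.array([0.2, 0.1]), np.array([2.8, 2.9])]
ranked = sorted(candidates, key=dist2_to_center)
```

The point near the positive cluster ranks ahead of the point near the negative example, which is the ordering the evaluation function is meant to produce.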

16 Experiment Results Application to CBIR –Using the Corel Image Feature Dataset –Around 68,000 features in the dataset –Selected features: color histogram (32 dimensions) and co-occurrence texture (16 dimensions). –Experiment settings For an initial query, the top N ranked images are selected as positive and the last M ranked images as negative; N << M, and their values are determined on a training set. –Three PRF algorithms are compared: SVM-, 1-SVM-, and BSVM-based PRF
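The pseudo-labeling step described above can be sketched in a few lines (`pseudo_label` is an illustrative name, not from the slides):

```python
def pseudo_label(ranked_ids, n_pos, n_neg):
    """Pseudo relevance feedback labeling: treat the top-N results of the
    initial query as positive and the bottom-M as negative (N << M)."""
    positives = ranked_ids[:n_pos]
    negatives = ranked_ids[-n_neg:]
    return positives, negatives

# Toy ranking of 100 image ids returned by an initial query
pos, neg = pseudo_label(list(range(100)), n_pos=5, n_neg=20)
```

No user intervention is needed: both label sets are derived purely from the initial ranking, which is the defining property of PRF.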

17 Experiments Evaluation Metrics –AP (Average Precision) –AR (Average Recall) Experimental Results –AP Curves
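The slides do not spell out the exact metric formulas; one common definition of precision and recall at a cutoff k, averaged over all queries to give AP and AR, is:

```python
def precision_recall_at_k(retrieved, relevant, k):
    """Precision@k and recall@k for a single query.

    retrieved: ranked list of returned item ids
    relevant:  set of ground-truth relevant ids
    """
    hits = len(set(retrieved[:k]) & set(relevant))
    return hits / k, hits / len(relevant)

# Example query: 2 of the top-4 results are relevant
p, r = precision_recall_at_k(retrieved=[3, 1, 7, 2], relevant={1, 2, 9}, k=4)
```

Averaging these per-query values over the whole query set yields the AP and AR curves plotted on the following slides.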

18 AR Curves

19 Discussions The BSVM-based PRF algorithm is effective in improving the retrieval performance of a CBIR system and achieves better performance than the other two classical approaches. Note that our improvement does not need any intervention from users. However, we may need more sophisticated algorithms for selecting suitable values of N and M in our PRF scheme.

20 Future Work Application to Video Retrieval (VR) –Our proposed scheme can be extended for use in VR. –RF in VR is more challenging. –Multimodal PRF algorithms will be investigated: text + visual + audio. Learning Algorithms –Partial labels and semi-supervised learning The labeled training set is usually a small subset of the multimedia database. Unlabeled working sets are important clues for improving retrieval performance in multimedia retrieval. In the literature, Bayesian SVC, Semi-SVM, and constrained EM are hot topics in machine learning. Can we combine supervised and unsupervised learning to attack the multimedia retrieval problem? For example, by combining the constrained EM algorithm with discriminative classifiers such as SVM, MPM, etc.?

21 Conclusions We investigate pseudo relevance feedback techniques in multimedia retrieval. We propose and formulate a novel PRF technique with Biased Support Vector Machine for image retrieval. Experimental results demonstrate that the BSVM-based PRF algorithm is effective and promising for improving retrieval performance in CBIR and multimedia retrieval.

22 FAQ Thanks!

23 References [1] R. Yan, et al., "Multimedia Search with Pseudo-Relevance Feedback", CIVR 2003, LNCS 2728, Springer, 2003. [2] T.S. Huang and X.S. Zhou, "Image Retrieval by Relevance Feedback: From Heuristic Weight Adjustment to Optimal Learning Methods", Proc. IEEE International Conference on Image Processing (ICIP'01), Thessaloniki, Greece, October 2001. [3] C.H. Hoi, K. Huang, M.R. Lyu, I. King, "Pseudo Relevance Feedback with Biased Support Vector Machine in Image Retrieval", submitted to ECCV 2004.

