
1 GAUSSIAN PROCESS FACTORIZATION MACHINES FOR CONTEXT-AWARE RECOMMENDATIONS Trung V. Nguyen, Alexandros Karatzoglou, Linas Baltrunas SIGIR 2014 Presentation: Vu Tien Duong

2 CONTENT: Introduction, Gaussian Processes, GPFM, GPPW, Evaluation, Conclusion

3 Introduction. Context: the environment in which a recommendation is provided. Multidimensional latent factor models: each variable is represented by latent features in a low-dimensional space. Context-aware recommendation (CAR): user-item-context interactions are modeled with factor models such as Tensor Factorization and Factorization Machines (see the sketch below).
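As a point of reference for the Factorization Machine baseline mentioned on this slide, here is a minimal sketch (not from the slides) of a standard second-order FM prediction, where interactions are inner products of latent factors; all parameter names and sizes are illustrative.

```python
# Illustrative sketch of a second-order Factorization Machine prediction:
# y(x) = w0 + sum_i w_i x_i + sum_{i<j} <v_i, v_j> x_i x_j
import numpy as np

def fm_predict(x, w0, w, V):
    """x: (d,) feature vector; w0: bias; w: (d,) weights; V: (d, k) latent factors."""
    linear = w0 + w @ x
    # Pairwise interactions via the standard O(d*k) identity.
    interactions = 0.5 * np.sum((V.T @ x) ** 2 - (V.T ** 2) @ (x ** 2))
    return linear + interactions

# Toy usage with random parameters (hypothetical sizes).
rng = np.random.default_rng(0)
d, k = 10, 3
x = rng.random(d)
print(fm_predict(x, 0.1, rng.normal(size=d), rng.normal(size=(d, k))))
```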

4 Introduction (1). Problem: given the many possible types of interactions between users, items, and contextual variables, it is unrealistic to restrict those interactions to be linear. Solution: Gaussian Process Factorization Machines (GPFM), a non-linear context-aware collaborative filtering method based on Gaussian Processes.

5 Introduction (2). Contributions: applicable to both explicit and implicit feedback; uses stochastic gradient descent (SGD) optimization to make the model scalable; the first GP-based approach to context-aware recommendations.

6 Steps of the method: 1. Convert observed data to a latent representation. 2. Specify the model's prior and likelihood. 3. Learn the parameters. 4. Predict the utility.

7 Gaussian Processes: widely used for modeling relational data; use flexible covariance functions; an important tool for modeling complex non-linear patterns.

8 Gaussian Processes (1)
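The equations on this slide were shown as images; as background, here is a minimal sketch (assumed, not taken from the slides) of drawing functions from a GP prior with an RBF covariance, the kind of flexible covariance function referred to above.

```python
# Sampling from a GP prior f ~ GP(0, k) with an RBF covariance
# k(x, x') = s^2 * exp(-|x - x'|^2 / (2 * l^2)).
import numpy as np

def rbf_kernel(X1, X2, lengthscale=1.0, variance=1.0):
    """Squared-exponential covariance between the rows of X1 and X2."""
    sq_dist = np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :] - 2 * X1 @ X2.T
    return variance * np.exp(-0.5 * sq_dist / lengthscale**2)

X = np.linspace(-3, 3, 50)[:, None]           # 50 one-dimensional inputs
K = rbf_kernel(X, X) + 1e-8 * np.eye(len(X))  # jitter for numerical stability
samples = np.random.default_rng(0).multivariate_normal(np.zeros(len(X)), K, size=3)
print(samples.shape)  # (3, 50): three functions sampled from the prior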

9 Gaussian Process Factorization Machines (GPFM). First, convert the data to a latent representation. Applying GPs to CAR, GPFM is specified by its prior and likelihood. Many types of covariance function k can be chosen (see the sketch below).
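A hedged sketch of this slide's idea as I read it: each user's utility is a GP over the latent representation of an (item, context) pair, so the covariance is computed on learned latent vectors rather than raw IDs. The RBF kernel and all sizes below are illustrative assumptions; the paper allows other choices of k.

```python
# Covariance of one user's utilities for two (item, context) pairs, computed
# on concatenated latent vectors z = [x_item, x_context].
import numpy as np

rng = np.random.default_rng(0)
n_items, n_contexts, latent_dim = 20, 4, 5

# Latent features (learned in the real model; random here for illustration).
item_factors = rng.normal(size=(n_items, latent_dim))
context_factors = rng.normal(size=(n_contexts, latent_dim))

def latent_input(item_id, context_id):
    """Concatenate item and context latent vectors into one GP input."""
    return np.concatenate([item_factors[item_id], context_factors[context_id]])

def k(z1, z2, lengthscale=1.0):
    """RBF covariance on the latent inputs (one admissible choice of k)."""
    return np.exp(-0.5 * np.sum((z1 - z2) ** 2) / lengthscale**2)

z_a = latent_input(3, 1)
z_b = latent_input(7, 2)
print(k(z_a, z_b))  # prior covariance of the user's utilities for the two pairs
```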

10 Pairwise Comparison for Implicit Feedback. A pairwise comparison (j1, c1) >_i (j2, c2) says that user i has a higher utility for item j1 in context c1 than for item j2 in context c2 (see the sketch below).
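An illustrative reading of how such comparisons can be built from implicit feedback (an assumption for illustration, not a quote of the paper's exact sampling scheme): observed (item, context) pairs are preferred to unobserved or negative ones for the same user.

```python
# Turning one user's implicit feedback into pairwise comparisons
# (j1, c1) >_i (j2, c2): positives are preferred to negatives.
def make_pairs(positives, negatives):
    """positives/negatives: lists of (item_id, context_id) pairs for one user."""
    return [(pos, neg) for pos in positives for neg in negatives]

user_positives = [(3, 1), (7, 2)]   # pairs the user interacted with
user_negatives = [(5, 1)]           # pairs with no interaction
print(make_pairs(user_positives, user_negatives))
# [((3, 1), (5, 1)), ((7, 2), (5, 1))]
```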

11 Pairwise Preference Model (GPPW): built on the same machinery as GPFM, but defined over pairwise comparisons rather than single observations.
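A hedged sketch of the pairwise idea: the model scores a comparison by the difference of the two utilities under the same user's GP, so the GPFM machinery carries over. The sigmoid link below is an illustrative choice, not necessarily the exact likelihood used in the paper.

```python
# Probability that user i prefers (j1, c1) to (j2, c2), from the two utilities.
import numpy as np

def preference_probability(f_pos, f_neg):
    """P[(j1, c1) >_i (j2, c2)] given the user's latent utilities for both pairs."""
    return 1.0 / (1.0 + np.exp(-(f_pos - f_neg)))

print(preference_probability(1.2, 0.3))  # ~0.71: mild preference for the first pair
```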

12 LEARNING
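This slide's equations were shown as images; as a minimal sketch of the learning loop suggested by the contributions slide, parameters are updated by SGD, iterating over users so that each step only touches one user's small covariance matrix. The helper `neg_log_marginal_likelihood_grad` is hypothetical and stands in for the per-user GP objective and its gradients.

```python
# Hedged SGD sketch: update latent features and hyperparameters per user.
def sgd(params, users, neg_log_marginal_likelihood_grad, lr=0.01, epochs=10):
    for _ in range(epochs):
        for user in users:
            grads = neg_log_marginal_likelihood_grad(params, user)  # hypothetical helper
            for name, g in grads.items():
                params[name] = params[name] - lr * g
    return params
```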

13 PREDICTIVE DISTRIBUTION
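The derivation on this slide was shown as images; assuming it follows the standard GP regression predictive equations, a minimal sketch is below, where K is the training covariance, K_star the covariance between training and test inputs, and noise a Gaussian noise variance.

```python
# Standard GP predictive mean and variance for a single test input.
import numpy as np

def gp_predict(K, K_star, k_star_star, y, noise=0.1):
    """K: (n, n) train covariance; K_star: (n,) cov(train, test);
    k_star_star: prior variance at the test input; y: (n,) targets."""
    A = K + noise * np.eye(len(y))
    alpha = np.linalg.solve(A, y)
    mean = K_star @ alpha
    var = k_star_star - K_star @ np.linalg.solve(A, K_star)
    return mean, var
```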

14 Evaluation
Implicit datasets: FRAPPE, converted FOOD, converted COMODA. Explicit datasets: ADOM, COMODA, FOOD, SUSHI.
Compared methods: fm, multiverse, mf, constant.
Metrics: overall quality (MAE, RMSE); top items in a list (NDCG, ERR).
Dataset   Detail
ADOM      Movies (ratings from students)
COMODA    Movies
FOOD      Food menus
SUSHI     Sushi types (from Japanese users)
FRAPPE    Android applications
Converted FOOD: a rating > 3 is treated as positive. Converted COMODA: a rating > 4 is treated as positive.
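For reference, a small sketch of the overall-quality metrics listed above (MAE and RMSE over predicted vs. observed ratings); the ranking metrics NDCG and ERR are instead computed on ranked item lists.

```python
# Mean absolute error and root mean squared error over rating predictions.
import numpy as np

def mae(y_true, y_pred):
    return np.mean(np.abs(np.asarray(y_true) - np.asarray(y_pred)))

def rmse(y_true, y_pred):
    return np.sqrt(np.mean((np.asarray(y_true) - np.asarray(y_pred)) ** 2))

print(mae([4, 3, 5], [3.5, 3.2, 4.0]), rmse([4, 3, 5], [3.5, 3.2, 4.0]))
```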

15 Evaluation protocol. Split each dataset into 5 folds and iterate 5 times: each time, one fold is used for testing and the other 4 for training. Hyperparameters are tuned empirically on one fold used as validation, then kept fixed when running the experiments on the other 4 folds. The reported performance is the average over the 5 folds (see the sketch below).
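An illustrative sketch of this 5-fold protocol (indices and seed are arbitrary): each fold serves once as the test set while the remaining four are used for training, and results are averaged over folds.

```python
# 5-fold split generator: yields (train indices, test indices) for each fold.
import numpy as np

def five_fold_splits(n_examples, seed=0):
    idx = np.random.default_rng(seed).permutation(n_examples)
    folds = np.array_split(idx, 5)
    for i in range(5):
        test = folds[i]
        train = np.concatenate([folds[j] for j in range(5) if j != i])
        yield train, test

for train_idx, test_idx in five_fold_splits(100):
    pass  # train on train_idx, evaluate on test_idx, then average over the 5 folds
```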

16 Evaluation of GPFM for Explicit Feedback. Context-aware vs. context-agnostic: gpfm and fm significantly outperform mf => benefit of contextual information. multiverse outperforms mf on ADOM and FOOD but performs poorly on COMODA and SUSHI => it handles high-dimensional context badly. gpfm significantly outperforms fm, the best competing context-aware method => the nonlinearity of GPFM leads to substantial performance gains in CAR.

17 GPFM - Explicit Feedback

18 Evaluation of GPPW for Implicit Feedback. gppw significantly outperforms both gpfm and fm on FOOD and COMODA, with comparable ERR@10 and MAP@10 on FRAPPE. Learning with paired comparisons can lead to substantial improvements in ranking (compared to optimizing item-based scores). GPPW is more effective than GPFM on implicit feedback, with little computational overhead.

19

20 CONCLUSION. The utility of an item under a context is modeled as a function over the latent feature space of the item and context. By introducing Gaussian processes as priors for these utility functions, GPFM captures complex, nonlinear user-item-context interactions, yielding powerful and flexible modeling capacity.

