Interaction-Rich Transfer Learning for Collaborative Filtering with Heterogeneous User Feedback

Weike Pan and Zhong Ming, Shenzhen University

A novel and efficient transfer learning algorithm called interaction-rich transfer by collective factorization extends the efficient collective matrix factorization algorithm by providing more interactions between the user-specific latent features.

Facing the information flood in our daily lives, search engines mainly respond to our submitted queries passively, while recommender systems aim to discover and meet our needs in a more active way. Collaborative filtering techniques1-4 have been applied in various recommendation-embedded applications. However, the lack of users' accurate preference data (for example, five-star numerical ratings) might limit this approach's applicability in real deployments. On the other hand, a real recommender system can usually make use of additional types of user feedback (for example, binary ratings of likes and dislikes).5 Hence, collaborative filtering with different types of user feedback provides a potential way to address the data sparsity problem of accurate graded ratings. Here, we focus on this new research problem of collaborative filtering with heterogeneous user feedback, which few prior works have addressed. A recent work proposed a transfer learning algorithm called transfer by collective factorization (TCF) that exploits such heterogeneous user feedback.5 TCF addresses the data sparsity problem by simultaneously sharing data-independent knowledge and modeling the data-dependent effect of the two types of feedback. However, TCF is a batch algorithm that updates model parameters only once after scanning the whole dataset, which might not be applicable for large datasets.
In contrast, stochastic methods such as regularized singular value decomposition (RSVD)3 and collective matrix factorization (CMF)6 are empirically much more efficient than batch-style algorithms like probabilistic matrix factorization (PMF)7 and TCF. However, the prediction accuracy of RSVD and CMF might not be adequate when compared with that of TCF, especially when the users' feedback is heterogeneous. There are also some efficient distributed or online collaborative filtering algorithms, such as distributed stochastic gradient descent8 and online multitask collaborative filtering,9 but they're designed for homogeneous user feedback instead of the heterogeneous feedback studied in this article.

IEEE Intelligent Systems. Published by the IEEE Computer Society.
Related Work in Transfer Learning in Collaborative Filtering

Transfer learning in collaborative filtering (TLCF)1,2 is an emerging interdisciplinary topic that aims to design transfer learning3 solutions to address the challenges in collaborative filtering,4 for example, rating sparsity. Parallel to transfer learning in text mining, TLCF has developed a family of new algorithms: model-, instance-, and feature-based transfer, which answer the question of what to transfer from the perspective of shared knowledge; and adaptive, collective, and integrative algorithms, which answer the question of how to transfer from the perspective of algorithm styles. We can categorize the proposed interaction-rich transfer by collective factorization (iTCF) algorithm as a feature-based (what to transfer), collective (how to transfer) transfer learning method. The most closely related works to our iTCF are transfer by collective factorization5 and collective matrix factorization,6 because they're also feature-based collective algorithms.

References
1. B. Li, Q. Yang, and X. Xue, "Transfer Learning for Collaborative Filtering via a Rating-Matrix Generative Model," Proc. 26th Ann. Int'l Conf. Machine Learning, 2009.
2. W. Pan, E.W. Xiang, and Q. Yang, "Transfer Learning in Collaborative Filtering via Uncertain Ratings," Proc. 26th AAAI Conf. Artificial Intelligence, 2012.
3. S.J. Pan and Q. Yang, "A Survey on Transfer Learning," IEEE Trans. Knowledge and Data Eng., vol. 22, no. 10, 2010.
4. D. Goldberg et al., "Using Collaborative Filtering to Weave an Information Tapestry," Comm. ACM, vol. 35, no. 12, 1992.
5. W. Pan and Q. Yang, "Transfer Learning in Heterogeneous Collaborative Filtering Domains," Artificial Intelligence, vol. 197, Apr. 2013.
6. A.P. Singh and G.J. Gordon, "Relational Learning via Collective Matrix Factorization," Proc. 14th ACM SIGKDD Int'l Conf. Knowledge Discovery and Data Mining, 2008.

In this work, we aim to achieve a good balance between the accuracy of TCF and the efficiency of CMF (see the related sidebar for information on others' efforts). We extend the CMF algorithm by introducing richer interactions between the user-specific latent features, and design a corresponding algorithm called interaction-rich transfer by collective factorization (iTCF). In particular, we assume that the predictability with regard to the same user's rating behaviors in the related numerical ratings and binary ratings is likely to be similar. With this assumption, we design update rules that share not only the item-specific latent features, as in CMF, but also the user-specific latent features in a smooth manner. The iTCF algorithm thus introduces more interactions between user-specific latent features. Experimental results on three real-world datasets show the effectiveness of our iTCF over RSVD and CMF.

Background
The studied problem setting is exactly the same as that of TCF. We have n users and m items in a target numerical rating matrix R = {r_ui}_{n×m} ∈ {1, 2, 3, 4, 5, ?}^{n×m} and an auxiliary binary rating matrix R̃ = {r̃_ui}_{n×m} ∈ {0, 1, ?}^{n×m}, where ? denotes a missing value. The users and items in the two data types are the same, and a one-to-one mapping is given. Our goal is to transfer knowledge from R̃ to help predict the missing values in R. We illustrate the problem setting in Figure 1a. Note that in this article, we aim to design an efficient transfer learning algorithm, because the lack of efficiency is a major limitation of TCF.

Figure 1. Problem setting and solutions. (a) Illustration of the studied problem setting: the target data R contains 5-star graded ratings, and the auxiliary data R̃ contains binary ratings of likes/dislikes. Two transfer learning solutions: (b) collective matrix factorization (CMF) and (c) interaction-rich transfer by collective factorization (iTCF).

PMF
PMF models the preference data via two latent feature matrices,

R ≈ UVᵀ, (1)

where the target numerical rating matrix R is factorized into a user-specific latent feature matrix U ∈ ℝ^{n×d} and an item-specific latent feature matrix V ∈ ℝ^{m×d}. Once we have obtained the latent feature matrices, we can predict the rating located at (u, i) via r̂_ui = U_u V_iᵀ, where U_u ∈ ℝ^{1×d} and V_i ∈ ℝ^{1×d} are user u's and item i's latent feature vectors, respectively.

CMF
CMF uses item-side auxiliary data by sharing the same item-specific latent features. We use matrix notation
to illustrate its idea of knowledge sharing,

R ≈ UVᵀ, R̃ ≈ WVᵀ, (2)

where the auxiliary binary rating matrix R̃ is decomposed into a user-specific latent feature matrix W ∈ ℝ^{n×d} and an item-specific latent feature matrix V ∈ ℝ^{m×d}. The knowledge encoded in the item-specific latent feature matrix V is shared between the two factorization systems, while the two user-specific latent feature matrices W and U aren't shared. For our problem with the same users and same items in the target and auxiliary data, we reach the following factorization systems:

R ≈ UVᵀ, R̃ ≈ WVᵀ, s.t. W = U, (3)

which means that CMF reduces to PMF on a pool of both target and auxiliary data R ∪ R̃. However, such a reduced approach will cause performance degradation, because it ignores the heterogeneity of the users' feedback in R and R̃. Obviously, the semantic meaning of likes and dislikes in the auxiliary data is different from that of graded ratings in the target data.

Our Solution
Now let's look at the solution we propose.

iTCF
We can see that PMF in Equation 1 doesn't make use of auxiliary data, CMF in Equation 2 only makes use of item-side auxiliary data, and CMF in Equation 3 reduces to PMF without distinguishing the heterogeneity of user feedback. The question we ask in this article is whether we can transfer more knowledge besides sharing the item-specific latent features in CMF, as shown in Equation 2 and Figure 1b. There's some potential that we can exploit because the users in both data types are the same. For a typical user, a model's prediction accuracy trained on the target data of numerical ratings or the auxiliary data of binary ratings is likely to be similar, because a user's preference variation and a model's ability to capture the user's preference usually don't change much across two related datasets. With this assumption, we reach the following factorization systems:

R ≈ UVᵀ, R̃ ≈ WVᵀ, s.t.
E = Ẽ, (4)

where E and Ẽ denote the corresponding errors of the prediction model on the two data types, representing the predictability of user preferences. We can see that the main difference between Equations 2 and 4 is the shared predictability in Equation 4, denoted as E = Ẽ. We expand the matrix formulation in Equation 4 as follows:

min_{W_u, U_u, V_i, b_u, b_i, μ} Σ_{u=1}^{n} Σ_{i=1}^{m} f_ui, s.t. e_ui = ẽ_ui, (5)

where f_ui = y_ui[(1/2)e_ui² + R_ui] + λ ỹ_ui[(1/2)ẽ_ui² + R̃_ui] is a balanced loss function on the two data types with λ > 0. Note that e_ui = r_ui − r̂_ui and ẽ_ui = r̃_ui − r̃̂_ui are the errors of the prediction model on the target data and auxiliary data, respectively, where r̂_ui = μ + b_u + b_i + U_u V_iᵀ and r̃̂_ui = W_u V_iᵀ are the estimated preferences, μ is the global average, b_u is the user bias of user u, and b_i is the item bias of item i. The variables y_ui and ỹ_ui indicate whether the entry located at (u, i) is observed in the target data and auxiliary data, respectively. The terms

R_ui = (α_u/2)‖U_u‖² + (α_v/2)‖V_i‖² + (β_u/2)b_u² + (β_v/2)b_i²,
R̃_ui = (α_w/2)‖W_u‖² + (α_v/2)‖V_i‖² + (β_u/2)b_u² + (β_v/2)b_i²

are regularization terms used to avoid overfitting when learning the latent variables.

Learning the iTCF
To solve the optimization problem in Equation 5, we start from the perspective of gradient descent, which will be used in the stochastic gradient descent framework.

Learning parameters using the target data. Given a rating r_ui from the target data with y_ui = 1 and ỹ_ui = 0, we have the gradients ∇U_u = −e_ui V_i + α_u U_u, ∇V_i = −e_ui U_u + α_v V_i, ∇b_u = −e_ui + β_u b_u, ∇b_i = −e_ui + β_v b_i, and ∇μ = −e_ui for U_u, V_i, b_u, b_i, and μ, respectively. Besides using these gradients to update the target parameters, we can also make use of the auxiliary variable W_u to update the target item-specific latent feature vector V_i, because the predictability is assumed to be similar and can be shared, that is, e_ui = ẽ_ui.
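To make the notation concrete, the estimated preferences and per-rating errors used in Equation 5 can be sketched in a few lines of Python. This is a minimal illustration with hypothetical toy values; the variable names simply mirror the symbols above:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 10                               # number of latent features

# Model parameters for one user u and one item i (toy initialization)
mu, b_u, b_i = 3.5, 0.1, -0.2        # global average and biases
U_u = rng.normal(0, 0.01, d)         # user u's latent features (target)
W_u = rng.normal(0, 0.01, d)         # user u's latent features (auxiliary)
V_i = rng.normal(0, 0.01, d)         # item i's latent features (shared)

r_ui, r_tilde_ui = 4.0, 5.0          # observed target and auxiliary ratings

# Estimated preferences: r̂_ui = μ + b_u + b_i + U_u·V_i and r̃̂_ui = W_u·V_i
r_hat = mu + b_u + b_i + U_u @ V_i
r_tilde_hat = W_u @ V_i

# Per-rating errors e_ui and ẽ_ui that appear throughout the gradients
e_ui = r_ui - r_hat
e_tilde_ui = r_tilde_ui - r_tilde_hat
```

With the small random initialization above, the latent-factor term is tiny, so e_ui is dominated by the bias terms; training then moves the parameters to shrink both errors.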
Given ẽ_ui, we have the gradient of V_i in the auxiliary data, ∇V_i = −ẽ_ui W_u + α_v V_i. We combine the two gradients for the item-specific latent feature vector V_i:

∇V_i = ρ(−e_ui U_u + α_v V_i) + (1 − ρ)(−ẽ_ui W_u + α_v V_i)
     = ρ(−e_ui U_u + α_v V_i) + (1 − ρ)(−e_ui W_u + α_v V_i)
     = −e_ui[ρU_u + (1 − ρ)W_u] + α_v V_i,

where 0 ≤ ρ ≤ 1 is a parameter used to linearly integrate the two gradients. Comparing ρU_u + (1 − ρ)W_u with the plain U_u in the gradient ∇V_i, we can see that more interactions between the user-specific latent features U_u and W_u are introduced, which is also illustrated via the graphical models in Figure 1c. For this reason, we call ρ an interaction parameter between the user-specific latent features. We can see that the shared predictability will introduce more interactions between the user-specific
Input: the target user-item numerical rating matrix R and the auxiliary user-item binary rating matrix R̃.
Output: the user-specific latent feature vector U_u and bias b_u, the user-specific latent feature vector W_u, the item-specific latent feature vector V_i and bias b_i, and the global average μ, where u = 1, ..., n, i = 1, ..., m.
For t = 1, ..., T
    For iter = 1, ..., q + q̃
        Step 1. Randomly pick a rating from R ∪ R̃;
        Step 2. Calculate the gradients as shown in Eqs. (6-10) if y_ui = 1, or Eqs. (10-11) if ỹ_ui = 1;
        Step 3. Update the parameters as shown in Eq. (12).
    End
End
Figure 2. The iTCF algorithm.

latent feature matrices U and W via ρU_u + (1 − ρ)W_u in ∇V_i. For this reason, we call the proposed approach iTCF, representing interaction-rich transfer by collective factorization.

Learning parameters using the auxiliary data. Similar to the target-data case, given a rating r̃_ui from the auxiliary data with ỹ_ui = 1 and y_ui = 0, we have the following gradients:

∇W_u = −λẽ_ui V_i + λα_w W_u,
∇V_i = −λẽ_ui[ρW_u + (1 − ρ)U_u] + λα_v V_i,

where 0 ≤ ρ ≤ 1 is again an interaction parameter used to combine the two gradients. Similarly, more interactions are introduced between the user-specific latent features W_u and U_u in ρW_u + (1 − ρ)U_u.

The algorithm. We thus have the gradients given a target numerical rating (y_ui = 1) or an auxiliary binary rating (ỹ_ui = 1) as follows:

∇b_u = −e_ui + β_u b_u, if y_ui = 1; (6)
∇b_i = −e_ui + β_v b_i, if y_ui = 1; (7)
∇μ = −e_ui, if y_ui = 1; (8)
∇U_u = −e_ui V_i + α_u U_u, if y_ui = 1; (9)
∇V_i = −e_ui Z_u + α_v V_i, if y_ui = 1; ∇V_i = −λẽ_ui Z̃_u + λα_v V_i, if ỹ_ui = 1; (10)
∇W_u = −λẽ_ui V_i + λα_w W_u, if ỹ_ui = 1; (11)

where Z_u = ρU_u + (1 − ρ)W_u and Z̃_u = ρW_u + (1 − ρ)U_u. We can see that when ρ = 1, we have Z_u = U_u and Z̃_u = W_u, which are exactly the same as in CMF. CMF is thus a special case of iTCF, which only shares the item-specific latent feature matrix V, obtained with ρ = 1.
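The update rules in Eqs. (6)-(12) and the loop structure of Figure 2 can be sketched compactly in Python. This is an illustrative sketch, not the authors' implementation: the dense-matrix representation with np.nan for missing entries, the function name itcf_sgd, and the single regularization weight alpha/beta per group are simplifying assumptions.

```python
import numpy as np

def itcf_sgd(R, R_tilde, d=10, T=50, rho=0.5, lam=1.0,
             alpha=0.01, beta=0.01, gamma=0.01, seed=0):
    """Sketch of iTCF training via SGD (Eqs. 6-12); np.nan marks missing ratings."""
    rng = np.random.default_rng(seed)
    n, m = R.shape
    U = rng.normal(0, 0.01, (n, d))   # target user-specific latent features
    W = rng.normal(0, 0.01, (n, d))   # auxiliary user-specific latent features
    V = rng.normal(0, 0.01, (m, d))   # shared item-specific latent features
    b_u, b_i = np.zeros(n), np.zeros(m)
    mu = float(np.nanmean(R))         # global average of observed target ratings

    # Pool of observed (u, i) pairs from the target and auxiliary data
    pool = [(u, i, True) for u, i in zip(*np.where(~np.isnan(R)))]
    pool += [(u, i, False) for u, i in zip(*np.where(~np.isnan(R_tilde)))]

    for t in range(T):
        for idx in rng.permutation(len(pool)):     # Step 1: pick a random rating
            u, i, is_target = pool[idx]
            if is_target:                          # y_ui = 1: Eqs. (6)-(10)
                e = R[u, i] - (mu + b_u[u] + b_i[i] + U[u] @ V[i])
                Z_u = rho * U[u] + (1 - rho) * W[u]
                grad_V = -e * Z_u + alpha * V[i]             # Eq. (10), target branch
                U[u] -= gamma * (-e * V[i] + alpha * U[u])   # Eq. (9)
                b_u[u] -= gamma * (-e + beta * b_u[u])       # Eq. (6)
                b_i[i] -= gamma * (-e + beta * b_i[i])       # Eq. (7)
                mu -= gamma * (-e)                           # Eq. (8)
                V[i] -= gamma * grad_V                       # Eq. (12)
            else:                                  # y~_ui = 1: Eqs. (10)-(11)
                e = R_tilde[u, i] - W[u] @ V[i]
                Z_t = rho * W[u] + (1 - rho) * U[u]
                W[u] -= gamma * lam * (-e * V[i] + alpha * W[u])  # Eq. (11)
                V[i] -= gamma * lam * (-e * Z_t + alpha * V[i])   # Eq. (10), aux branch
        gamma *= 0.9          # decay the learning rate after each full scan
    return mu, b_u, b_i, U, V, W
```

On real data one would iterate over sparse rating triples rather than dense matrices, but the control flow mirrors Figure 2: each scan touches all q + q̃ observed ratings once, in random order.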
When 0 < ρ < 1, the term Z_u = ρU_u + (1 − ρ)W_u in iTCF can be considered a smooth version of the plain U_u in CMF, which is likely to be more stable in the stochastic gradient descent (SGD) framework. Finally, we have the update rule

θ = θ − γ∇θ, (12)

where θ can be b_u, b_i, μ, U_u, or V_i when y_ui = 1, and V_i or W_u when ỹ_ui = 1. Note that γ > 0 is the learning rate. Figure 2 describes the complete algorithm with the previously discussed update rules; it goes over all of both the target and auxiliary data T times. The time complexity of iTCF and CMF is O(T(q + q̃)d) and that of RSVD is O(Tqd), where q and q̃ are the numbers of ratings in the target and auxiliary data, respectively. The learning algorithm of iTCF is much more efficient than that of TCF, because iTCF is a stochastic algorithm while TCF is a batch one. Note that TCF can't use similar stochastic update rules because of the orthonormal constraints on the user-specific and item-specific latent feature matrices in its matrix trifactorization model; its time complexity is O(K max(q, q̃)d³ + Kd⁶), with K as the iteration number.5

The difference between TCF and our iTCF can be identified from the two fundamental questions in transfer learning.10 To answer the question of what to transfer, TCF shares latent features, while our iTCF shares both latent features and the predictability; as for how to transfer, TCF adopts matrix trifactorization and a batch-style implementation, while our iTCF uses the more efficient matrix bifactorization and a stochastic-style implementation.

Experimental Results
Next, we tested the algorithm to determine its performance.
Table 1. Description of the Netflix subset (n = m = 5,000), MovieLens10M (n = 71,567, m = 10,681), and Flixter (n = 147,612, m = 48,794) data used in the experiments.

Dataset        Data               Form                  Sparsity (%)
Netflix        Target (training)  {1, ..., 5, ?}        0.8
               Target (test)      {1, ..., 5, ?}        11.3
               Auxiliary          {dislike, like, ?}    2
MovieLens10M   Target (training)  {0.5, ..., 5, ?}      0.52
               Target (test)      {0.5, ..., 5, ?}      0.26
               Auxiliary          {dislike, like, ?}    0.52
Flixter        Target (training)  {0.5, ..., 5, ?}
               Target (test)      {0.5, ..., 5, ?}
               Auxiliary          {dislike, like, ?}

Table 2. Prediction performance of iTCF and other methods on the Netflix subset data.*

Style        Algorithm                                         MAE    RMSE
Batch        Probabilistic matrix factorization (PMF)
Batch        CMF-link
Batch        TCF (CMTF)**
Batch        TCF (CSVD)**
Stochastic   Regularized singular value decomposition (RSVD)
Stochastic   CMF
Stochastic   iTCF

* For the stochastic algorithms, the interaction parameter ρ is fixed at 0.5, and the number of iterations is fixed at 50. Batch algorithm results are from other work.5
** CMTF = collective matrix trifactorization; CSVD = collective singular value decomposition.

Datasets and Evaluation Metrics
We extracted the first dataset from Netflix in the same way as that used in other work.5 The data contains three copies of numerical ratings and binary ratings assigned by 5,000 users on 5,000 items. Note that we used this small dataset for the empirical studies among iTCF, TCF, and the other methods, because TCF might not scale well to large datasets. We extracted the second dataset from MovieLens10M in the same way as that used in other work.11 The data contains five copies of target, auxiliary, and test data. For each copy of auxiliary data, we convert ratings smaller than four to dislike, and ratings larger than or equal to four to like, to simulate binary feedback. We extracted the third dataset from Flixter (see jamalim/datasets).12 This data contains ratings given by users on products.
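The rating-to-binary conversion described above (ratings below four become dislike, ratings of four or above become like) can be sketched as follows. The function name and the np.nan encoding of missing entries are assumptions of this sketch:

```python
import numpy as np

def to_binary_feedback(R, threshold=4.0):
    """Simulate binary auxiliary feedback from graded ratings:
    ratings < threshold -> dislike (0), ratings >= threshold -> like (1);
    missing entries (np.nan) stay missing."""
    return np.where(np.isnan(R), np.nan, (R >= threshold).astype(float))
```

For example, `to_binary_feedback(np.array([3.5, 4.0, np.nan]))` keeps the missing entry missing and maps the two observed ratings to dislike and like, respectively.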
We preprocess the Flixter rating data in the same way as the MovieLens10M data to generate five copies of target, auxiliary, and test data. Table 1 shows the detailed statistics of the datasets used in the experiments. For iTCF, RSVD, and CMF, dislike and like are replaced with the numerical values 1 and 5, respectively, so that both the target data and the auxiliary data are in the same rating range. We adopt two commonly used evaluation metrics in recommender systems, mean absolute error (MAE) and root mean square error (RMSE):

MAE = Σ_{(u,i,r_ui)∈T_E} |r_ui − r̂_ui| / |T_E|,
RMSE = √( Σ_{(u,i,r_ui)∈T_E} (r_ui − r̂_ui)² / |T_E| ),

where r_ui and r̂_ui are the true and predicted ratings, respectively, and |T_E| is the number of test ratings.

Baselines and Parameter Settings
We compare our iTCF algorithm with some batch algorithms5 on the small dataset, because the batch algorithms aren't very efficient. We also compare iTCF with two stochastic algorithms, RSVD and CMF, on the aforementioned two large datasets. For iTCF, RSVD, and CMF, the model parameters μ, b_u, b_i, U_uk, V_ik, and W_uk, k = 1, ..., d, are initialized exactly the same as in previous work.11 The tradeoff parameters are set similarly to those used by Yehuda Koren,3 with α_u = α_v = α_w = 0.01 and β_u = β_v. The learning rate is initialized as γ = 0.01 and decreased via γ ← γ × 0.9 after every scan of both the target data and the auxiliary data.3 For the Netflix subset data, we set the number of latent features to d = 10,5 and for the MovieLens10M data, we use d = 20.13 To study the effectiveness of interactions between user-specific latent features, we report the results of using different values of ρ ∈ {0, 0.2, 0.4, 0.6, 0.8, 1}. Note that when ρ = 1, iTCF reduces to CMF. The value of λ is fixed at 1 (the same weight on auxiliary and target data) for the MovieLens10M and Flixter data, and is fixed at 10 for the Netflix subset data.

Results
Now, let's study the results of the algorithm's performance.
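As a quick reference before turning to the results, the MAE and RMSE metrics defined above can be computed as follows (a minimal sketch; the function name mae_rmse and the example ratings are our own):

```python
import numpy as np

def mae_rmse(r_true, r_pred):
    """Mean absolute error and root mean square error over the test ratings T_E."""
    err = np.asarray(r_true, float) - np.asarray(r_pred, float)
    return float(np.mean(np.abs(err))), float(np.sqrt(np.mean(err ** 2)))

# Hypothetical example: the three errors are 0.5, 0, and 1
mae, rmse = mae_rmse([4, 3, 5], [3.5, 3.0, 4.0])
```

Both metrics average over the same test set; RMSE penalizes large errors more heavily because of the squaring.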
Table 3. Prediction performance of iTCF and other methods on the MovieLens10M and Flixter data.*

Data           Metric   RSVD   CMF   iTCF
MovieLens10M   MAE
               RMSE
Flixter        MAE
               RMSE

* The interaction parameter ρ is fixed at 0.5. The number of iterations is fixed at 50.

Comparison with batch algorithms. From Table 2, we can see that the batch algorithm TCF performs better than the proposed stochastic algorithm iTCF, because TCF is able to capture the data-dependent effect and transfer the data-independent knowledge simultaneously in a principled way. The iTCF algorithm aims for efficiency and large data, and it beats the batch algorithms PMF and CMF-link and the stochastic algorithms RSVD and CMF. The results of the batch algorithms PMF, CMF-link, and TCF shown in Table 2 are from other research.5

Comparison with stochastic algorithms. From Table 3, we can see that iTCF is again better than RSVD and CMF, which shows the effect of the introduced richer interactions between auxiliary and target data in the proposed transfer learning solution. We can also see that the transfer learning methods CMF and iTCF are both better than RSVD, which shows the usefulness of the auxiliary data and the effectiveness of the knowledge transfer mechanisms in CMF and iTCF.

Impact of the interaction parameter (ρ). From Figure 3, we can see that iTCF performs best when 0.2 ≤ ρ ≤ 0.4, which shows that a relatively strong interaction is useful. Note that when ρ = 1, iTCF reduces to CMF with no interactions between user-specific latent features.

Figure 3. Prediction performance (RMSE) of iTCF on (a) MovieLens10M data and (b) Flixter data with different ρ values. The number of iterations is fixed at 50.

In this article, we propose a novel and efficient transfer learning algorithm, iTCF, for collaborative filtering with heterogeneous user feedback. Our iTCF aims to transfer knowledge from auxiliary binary ratings of likes and dislikes to improve the target numerical rating prediction performance in an efficient way. Our iTCF algorithm achieves this by introducing richer interactions, sharing both the item-specific latent features and the predictability of the two heterogeneous data types in a smooth manner. Our iTCF is more efficient than a recent batch algorithm, that is, TCF, and performs better than two state-of-the-art stochastic algorithms, that is, RSVD and CMF. For future work, we're interested in generalizing the idea of introducing rich interactions in heterogeneous user feedback to the problem of collaborative filtering with auxiliary information of social context and implicit feedback.14

Acknowledgments
We thank the National Natural Science Foundation of China, NSF GD, GDS&T (no. 2012B), the S&T project of SZ (no. JCYJ), and the National Basic Research Program of China (973 Plan, no. 2010CB327903) for their support. Zhong Ming is the corresponding author for this work.

References
1. G. Adomavicius and A. Tuzhilin, "Toward the Next Generation of Recommender Systems: A Survey of the State-of-the-Art and Possible Extensions," IEEE Trans. Knowledge and Data Eng., vol. 17, no. 6, 2005.
2. D. Goldberg et al., "Using Collaborative Filtering to Weave an Information Tapestry," Comm. ACM, vol. 35, no. 12, 1992.
3. Y. Koren, "Factorization Meets the Neighborhood: A Multifaceted Collaborative Filtering Model," Proc. 14th ACM SIGKDD Int'l Conf. Knowledge Discovery and Data Mining, 2008.
4. S. Rendle, "Factorization Machines with libFM," ACM Trans. Intelligent Systems and Technology, vol. 3, no. 3, 2012.
5. W. Pan and Q. Yang, "Transfer Learning in Heterogeneous Collaborative Filtering Domains," Artificial Intelligence, vol. 197, Apr. 2013.
6. A.P. Singh and G.J. Gordon, "Relational Learning via Collective Matrix Factorization," Proc. 14th ACM SIGKDD Int'l Conf. Knowledge Discovery and Data Mining, 2008.
7. R. Salakhutdinov and A. Mnih, "Probabilistic Matrix Factorization," Proc. Ann. Conf. Neural Information Processing Systems, 2008.
8. R. Gemulla et al., "Large-Scale Matrix Factorization with Distributed Stochastic Gradient Descent," Proc. 17th ACM SIGKDD Int'l Conf. Knowledge Discovery and Data Mining, 2011.
9. J. Wang et al., "Online Multi-Task Collaborative Filtering for On-the-Fly Recommender Systems," Proc. 7th ACM Conf. Recommender Systems, 2013.
10. S.J. Pan and Q. Yang, "A Survey on Transfer Learning," IEEE Trans. Knowledge and Data Eng., vol. 22, no. 10, 2010.
11. W. Pan, E.W. Xiang, and Q. Yang, "Transfer Learning in Collaborative Filtering via Uncertain Ratings," Proc. 26th AAAI Conf. Artificial Intelligence, 2012.
12. M. Jamali and M. Ester, "A Matrix Factorization Technique with Trust Propagation for Recommendation in Social Networks," Proc. 4th ACM Conf. Recommender Systems, 2010.
13. C. Zhou et al., "TagRec: Leveraging Tagging Wisdom for Recommendation," Proc. Int'l Conf. Computational Science and Eng., 2009.
14. N.N. Liu, L. He, and M. Zhao, "Social Temporal Collaborative Ranking for Context Aware Movie Recommendation," ACM Trans. Intelligent Systems and Technology, vol. 4, no. 1, 2013.

The Authors
Weike Pan is a lecturer with the College of Computer Science and Software Engineering, Shenzhen University, Shenzhen, China. His research interests include transfer learning, recommender systems, and statistical machine learning. Pan has a PhD in computer science and engineering from the Hong Kong University of Science and Technology, Kowloon, Hong Kong. Contact him at panweike@szu.edu.cn.

Zhong Ming is a professor with the College of Computer Science and Software Engineering, Shenzhen University, Shenzhen, China. His research interests include software engineering and Web intelligence. Ming has a PhD in computer science and technology from Sun Yat-Sen University, Guangzhou, Guangdong, China. He is the corresponding author. Contact him at mingz@szu.edu.cn.
More informationLittle Is Much: Bridging Cross-Platform Behaviors through Overlapped Crowds
Proceedings of the Thirtieth AAAI Conference on Artificial Intelligence (AAAI-16) Little Is Much: Bridging Cross-Platform Behaviors through Overlapped Crowds Meng Jiang, Peng Cui Tsinghua University Nicholas
More informationCollaborative Filtering Applied to Educational Data Mining
Journal of Machine Learning Research (200) Submitted ; Published Collaborative Filtering Applied to Educational Data Mining Andreas Töscher commendo research 8580 Köflach, Austria andreas.toescher@commendo.at
More informationCollaborative topic models: motivations cont
Collaborative topic models: motivations cont Two topics: machine learning social network analysis Two people: " boy Two articles: article A! girl article B Preferences: The boy likes A and B --- no problem.
More informationLarge-scale Collaborative Ranking in Near-Linear Time
Large-scale Collaborative Ranking in Near-Linear Time Liwei Wu Depts of Statistics and Computer Science UC Davis KDD 17, Halifax, Canada August 13-17, 2017 Joint work with Cho-Jui Hsieh and James Sharpnack
More informationLarge-Scale Social Network Data Mining with Multi-View Information. Hao Wang
Large-Scale Social Network Data Mining with Multi-View Information Hao Wang Dept. of Computer Science and Engineering Shanghai Jiao Tong University Supervisor: Wu-Jun Li 2013.6.19 Hao Wang Multi-View Social
More informationA Randomized Approach for Crowdsourcing in the Presence of Multiple Views
A Randomized Approach for Crowdsourcing in the Presence of Multiple Views Presenter: Yao Zhou joint work with: Jingrui He - 1 - Roadmap Motivation Proposed framework: M2VW Experimental results Conclusion
More informationCollaborative Filtering
Collaborative Filtering Nicholas Ruozzi University of Texas at Dallas based on the slides of Alex Smola & Narges Razavian Collaborative Filtering Combining information among collaborating entities to make
More informationScalable Bayesian Matrix Factorization
Scalable Bayesian Matrix Factorization Avijit Saha,1, Rishabh Misra,2, and Balaraman Ravindran 1 1 Department of CSE, Indian Institute of Technology Madras, India {avijit, ravi}@cse.iitm.ac.in 2 Department
More informationTime-aware Collaborative Topic Regression: Towards Higher Relevance in Textual Item Recommendation
Time-aware Collaborative Topic Regression: Towards Higher Relevance in Textual Item Recommendation Anas Alzogbi Department of Computer Science, University of Freiburg 79110 Freiburg, Germany alzoghba@informatik.uni-freiburg.de
More informationLarge-Scale Matrix Factorization with Distributed Stochastic Gradient Descent
Large-Scale Matrix Factorization with Distributed Stochastic Gradient Descent KDD 2011 Rainer Gemulla, Peter J. Haas, Erik Nijkamp and Yannis Sismanis Presenter: Jiawen Yao Dept. CSE, UT Arlington 1 1
More informationMatrix Factorization and Collaborative Filtering
10-601 Introduction to Machine Learning Machine Learning Department School of Computer Science Carnegie Mellon University Matrix Factorization and Collaborative Filtering MF Readings: (Koren et al., 2009)
More informationRecommendation Systems
Recommendation Systems Popularity Recommendation Systems Predicting user responses to options Offering news articles based on users interests Offering suggestions on what the user might like to buy/consume
More informationarxiv: v1 [cs.lg] 26 Oct 2012
Selective Transfer Learning for Cross Domain Recommendation Zhongqi Lu Erheng Zhong Lili Zhao Evan Xiang Weike Pan Qiang Yang arxiv:1210.7056v1 [cs.lg] 26 Oct 2012 Abstract Collaborative filtering (CF)
More informationRecommender Systems EE448, Big Data Mining, Lecture 10. Weinan Zhang Shanghai Jiao Tong University
2018 EE448, Big Data Mining, Lecture 10 Recommender Systems Weinan Zhang Shanghai Jiao Tong University http://wnzhang.net http://wnzhang.net/teaching/ee448/index.html Content of This Course Overview of
More informationLocal Low-Rank Matrix Approximation with Preference Selection of Anchor Points
Local Low-Rank Matrix Approximation with Preference Selection of Anchor Points Menghao Zhang Beijing University of Posts and Telecommunications Beijing,China Jack@bupt.edu.cn Binbin Hu Beijing University
More informationMultiverse Recommendation: N-dimensional Tensor Factorization for Context-aware Collaborative Filtering
Multiverse Recommendation: N-dimensional Tensor Factorization for Context-aware Collaborative Filtering Alexandros Karatzoglou Telefonica Research Barcelona, Spain alexk@tid.es Xavier Amatriain Telefonica
More informationECS289: Scalable Machine Learning
ECS289: Scalable Machine Learning Cho-Jui Hsieh UC Davis Oct 11, 2016 Paper presentations and final project proposal Send me the names of your group member (2 or 3 students) before October 15 (this Friday)
More informationA Matrix Factorization Technique with Trust Propagation for Recommendation in Social Networks
A Matrix Factorization Technique with Trust Propagation for Recommendation in Social Networks ABSTRACT Mohsen Jamali School of Computing Science Simon Fraser University Burnaby, BC, Canada mohsen_jamali@cs.sfu.ca
More informationMatrix and Tensor Factorization from a Machine Learning Perspective
Matrix and Tensor Factorization from a Machine Learning Perspective Christoph Freudenthaler Information Systems and Machine Learning Lab, University of Hildesheim Research Seminar, Vienna University of
More informationCollaborative Filtering Matrix Completion Alternating Least Squares
Case Study 4: Collaborative Filtering Collaborative Filtering Matrix Completion Alternating Least Squares Machine Learning for Big Data CSE547/STAT548, University of Washington Sham Kakade May 19, 2016
More informationContent-based Recommendation
Content-based Recommendation Suthee Chaidaroon June 13, 2016 Contents 1 Introduction 1 1.1 Matrix Factorization......................... 2 2 slda 2 2.1 Model................................. 3 3 flda 3
More informationMixed Membership Matrix Factorization
Mixed Membership Matrix Factorization Lester Mackey 1 David Weiss 2 Michael I. Jordan 1 1 University of California, Berkeley 2 University of Pennsylvania International Conference on Machine Learning, 2010
More informationMatrix Factorization and Factorization Machines for Recommender Systems
Talk at SDM workshop on Machine Learning Methods on Recommender Systems, May 2, 215 Chih-Jen Lin (National Taiwan Univ.) 1 / 54 Matrix Factorization and Factorization Machines for Recommender Systems Chih-Jen
More informationCS 175: Project in Artificial Intelligence. Slides 4: Collaborative Filtering
CS 175: Project in Artificial Intelligence Slides 4: Collaborative Filtering 1 Topic 6: Collaborative Filtering Some slides taken from Prof. Smyth (with slight modifications) 2 Outline General aspects
More informationTransfer Learning for Collective Link Prediction in Multiple Heterogenous Domains
Transfer Learning for Collective Link Prediction in Multiple Heterogenous Domains Bin Cao caobin@cse.ust.hk Nathan Nan Liu nliu@cse.ust.hk Qiang Yang qyang@cse.ust.hk Hong Kong University of Science and
More informationSQL-Rank: A Listwise Approach to Collaborative Ranking
SQL-Rank: A Listwise Approach to Collaborative Ranking Liwei Wu Depts of Statistics and Computer Science UC Davis ICML 18, Stockholm, Sweden July 10-15, 2017 Joint work with Cho-Jui Hsieh and James Sharpnack
More informationPredicting the Performance of Collaborative Filtering Algorithms
Predicting the Performance of Collaborative Filtering Algorithms Pawel Matuszyk and Myra Spiliopoulou Knowledge Management and Discovery Otto-von-Guericke University Magdeburg, Germany 04. June 2014 Pawel
More informationRanking-Oriented Evaluation Metrics
Ranking-Oriented Evaluation Metrics Weike Pan College of Computer Science and Software Engineering Shenzhen University W.K. Pan (CSSE, SZU) Ranking-Oriented Evaluation Metrics 1 / 21 Outline 1 Introduction
More informationLearning in Probabilistic Graphs exploiting Language-Constrained Patterns
Learning in Probabilistic Graphs exploiting Language-Constrained Patterns Claudio Taranto, Nicola Di Mauro, and Floriana Esposito Department of Computer Science, University of Bari "Aldo Moro" via E. Orabona,
More informationA Gradient-based Adaptive Learning Framework for Efficient Personal Recommendation
A Gradient-based Adaptive Learning Framework for Efficient Personal Recommendation Yue Ning 1 Yue Shi 2 Liangjie Hong 2 Huzefa Rangwala 3 Naren Ramakrishnan 1 1 Virginia Tech 2 Yahoo Research. Yue Shi
More informationDepartment of Computer Science, Guiyang University, Guiyang , GuiZhou, China
doi:10.21311/002.31.12.01 A Hybrid Recommendation Algorithm with LDA and SVD++ Considering the News Timeliness Junsong Luo 1*, Can Jiang 2, Peng Tian 2 and Wei Huang 2, 3 1 College of Information Science
More informationMatrix Factorization with Content Relationships for Media Personalization
Association for Information Systems AIS Electronic Library (AISeL) Wirtschaftsinformatik Proceedings 013 Wirtschaftsinformatik 013 Matrix Factorization with Content Relationships for Media Personalization
More informationExploiting Local and Global Social Context for Recommendation
Proceedings of the Twenty-Third International Joint Conference on Artificial Intelligence Exploiting Local and Global Social Context for Recommendation Jiliang Tang, Xia Hu, Huiji Gao, Huan Liu Computer
More informationImpact of Data Characteristics on Recommender Systems Performance
Impact of Data Characteristics on Recommender Systems Performance Gediminas Adomavicius YoungOk Kwon Jingjing Zhang Department of Information and Decision Sciences Carlson School of Management, University
More informationModeling User Rating Profiles For Collaborative Filtering
Modeling User Rating Profiles For Collaborative Filtering Benjamin Marlin Department of Computer Science University of Toronto Toronto, ON, M5S 3H5, CANADA marlin@cs.toronto.edu Abstract In this paper
More informationarxiv: v2 [cs.ir] 4 Jun 2018
Metric Factorization: Recommendation beyond Matrix Factorization arxiv:1802.04606v2 [cs.ir] 4 Jun 2018 Shuai Zhang, Lina Yao, Yi Tay, Xiwei Xu, Xiang Zhang and Liming Zhu School of Computer Science and
More informationUsing SVD to Recommend Movies
Michael Percy University of California, Santa Cruz Last update: December 12, 2009 Last update: December 12, 2009 1 / Outline 1 Introduction 2 Singular Value Decomposition 3 Experiments 4 Conclusion Last
More informationActive Transfer Learning for Cross-System Recommendation
Proceedings of the Twenty-Seventh AAAI onference on Artificial Intelligence Active Transfer Learning for ross-system ecommendation Lili Zhao, Sinno Jialin Pan, Evan Wei Xiang, Erheng Zhong, Zhongqi Lu,
More informationItem Recommendation for Emerging Online Businesses
Item Recommendation for Emerging Online Businesses Chun-Ta Lu Sihong Xie Weixiang Shao Lifang He Philip S. Yu University of Illinois at Chicago Presenter: Chun-Ta Lu New Online Businesses Emerge Rapidly
More informationExploiting Emotion on Reviews for Recommender Systems
Exploiting Emotion on Reviews for Recommender Systems Xuying Meng 1,2, Suhang Wang 3, Huan Liu 3 and Yujun Zhang 1 1 Institute of Computing Technology, Chinese Academy of Sciences, Beijing 100190, China
More informationMixed Membership Matrix Factorization
Mixed Membership Matrix Factorization Lester Mackey University of California, Berkeley Collaborators: David Weiss, University of Pennsylvania Michael I. Jordan, University of California, Berkeley 2011
More informationCS425: Algorithms for Web Scale Data
CS: Algorithms for Web Scale Data Most of the slides are from the Mining of Massive Datasets book. These slides have been modified for CS. The original slides can be accessed at: www.mmds.org Customer
More informationCollaborative Filtering with Entity Similarity Regularization in Heterogeneous Information Networks
Collaborative Filtering with Entity Similarity Regularization in Heterogeneous Information Networks Xiao Yu Xiang Ren Quanquan Gu Yizhou Sun Jiawei Han University of Illinois at Urbana-Champaign, Urbana,
More informationAPPLICATIONS OF MINING HETEROGENEOUS INFORMATION NETWORKS
APPLICATIONS OF MINING HETEROGENEOUS INFORMATION NETWORKS Yizhou Sun College of Computer and Information Science Northeastern University yzsun@ccs.neu.edu July 25, 2015 Heterogeneous Information Networks
More informationCollaborative Topic Modeling for Recommending Scientific Articles
Collaborative Topic Modeling for Recommending Scientific Articles Chong Wang and David M. Blei Best student paper award at KDD 2011 Computer Science Department, Princeton University Presented by Tian Cao
More informationLinear Regression (9/11/13)
STA561: Probabilistic machine learning Linear Regression (9/11/13) Lecturer: Barbara Engelhardt Scribes: Zachary Abzug, Mike Gloudemans, Zhuosheng Gu, Zhao Song 1 Why use linear regression? Figure 1: Scatter
More informationProbabilistic Neighborhood Selection in Collaborative Filtering Systems
Probabilistic Neighborhood Selection in Collaborative Filtering Systems Panagiotis Adamopoulos and Alexander Tuzhilin Department of Information, Operations and Management Sciences Leonard N. Stern School
More informationData Science Mastery Program
Data Science Mastery Program Copyright Policy All content included on the Site or third-party platforms as part of the class, such as text, graphics, logos, button icons, images, audio clips, video clips,
More informationLearning to Learn and Collaborative Filtering
Appearing in NIPS 2005 workshop Inductive Transfer: Canada, December, 2005. 10 Years Later, Whistler, Learning to Learn and Collaborative Filtering Kai Yu, Volker Tresp Siemens AG, 81739 Munich, Germany
More informationBayesian Matrix Factorization with Side Information and Dirichlet Process Mixtures
Proceedings of the Twenty-Fourth AAAI Conference on Artificial Intelligence (AAAI-10) Bayesian Matrix Factorization with Side Information and Dirichlet Process Mixtures Ian Porteous and Arthur Asuncion
More informationMixture-Rank Matrix Approximation for Collaborative Filtering
Mixture-Rank Matrix Approximation for Collaborative Filtering Dongsheng Li 1 Chao Chen 1 Wei Liu 2 Tun Lu 3,4 Ning Gu 3,4 Stephen M. Chu 1 1 IBM Research - China 2 Tencent AI Lab, China 3 School of Computer
More informationCS249: ADVANCED DATA MINING
CS249: ADVANCED DATA MINING Recommender Systems Instructor: Yizhou Sun yzsun@cs.ucla.edu May 17, 2017 Methods Learnt: Last Lecture Classification Clustering Vector Data Text Data Recommender System Decision
More informationCOMP 551 Applied Machine Learning Lecture 14: Neural Networks
COMP 551 Applied Machine Learning Lecture 14: Neural Networks Instructor: Ryan Lowe (ryan.lowe@mail.mcgill.ca) Slides mostly by: Class web page: www.cs.mcgill.ca/~hvanho2/comp551 Unless otherwise noted,
More informationCross-Domain Recommendation via Cluster-Level Latent Factor Model
Cross-Domain Recommendation via Cluster-Level Latent Factor Model Sheng Gao 1, Hao Luo 1, Da Chen 1, and Shantao Li 1 Patrick Gallinari 2, Jun Guo 1 1 PRIS - Beijing University of Posts and Telecommunications,
More informationMining of Massive Datasets Jure Leskovec, AnandRajaraman, Jeff Ullman Stanford University
Note to other teachers and users of these slides: We would be delighted if you found this our material useful in giving your own lectures. Feel free to use these slides verbatim, or to modify them to fit
More informationAdditive Co-Clustering with Social Influence for Recommendation
Additive Co-Clustering with Social Influence for Recommendation Xixi Du Beijing Key Lab of Traffic Data Analysis and Mining Beijing Jiaotong University Beijing, China 100044 1510391@bjtu.edu.cn Huafeng
More informationImproving Quality of Crowdsourced Labels via Probabilistic Matrix Factorization
Human Computation AAAI Technical Report WS-12-08 Improving Quality of Crowdsourced Labels via Probabilistic Matrix Factorization Hyun Joon Jung School of Information University of Texas at Austin hyunjoon@utexas.edu
More informationRelational Stacked Denoising Autoencoder for Tag Recommendation. Hao Wang
Relational Stacked Denoising Autoencoder for Tag Recommendation Hao Wang Dept. of Computer Science and Engineering Hong Kong University of Science and Technology Joint work with Xingjian Shi and Dit-Yan
More informationContext-aware Ensemble of Multifaceted Factorization Models for Recommendation Prediction in Social Networks
Context-aware Ensemble of Multifaceted Factorization Models for Recommendation Prediction in Social Networks Yunwen Chen kddchen@gmail.com Yingwei Xin xinyingwei@gmail.com Lu Yao luyao.2013@gmail.com Zuotao
More informationCollaborative Filtering
Case Study 4: Collaborative Filtering Collaborative Filtering Matrix Completion Alternating Least Squares Machine Learning/Statistics for Big Data CSE599C1/STAT592, University of Washington Carlos Guestrin
More informationIterative Laplacian Score for Feature Selection
Iterative Laplacian Score for Feature Selection Linling Zhu, Linsong Miao, and Daoqiang Zhang College of Computer Science and echnology, Nanjing University of Aeronautics and Astronautics, Nanjing 2006,
More informationLarge-scale Ordinal Collaborative Filtering
Large-scale Ordinal Collaborative Filtering Ulrich Paquet, Blaise Thomson, and Ole Winther Microsoft Research Cambridge, University of Cambridge, Technical University of Denmark ulripa@microsoft.com,brmt2@cam.ac.uk,owi@imm.dtu.dk
More informationReview: Probabilistic Matrix Factorization. Probabilistic Matrix Factorization (PMF)
Case Study 4: Collaborative Filtering Review: Probabilistic Matrix Factorization Machine Learning for Big Data CSE547/STAT548, University of Washington Emily Fox February 2 th, 214 Emily Fox 214 1 Probabilistic
More informationLearning to Recommend Point-of-Interest with the Weighted Bayesian Personalized Ranking Method in LBSNs
information Article Learning to Recommend Point-of-Interest with the Weighted Bayesian Personalized Ranking Method in LBSNs Lei Guo 1, *, Haoran Jiang 2, Xinhua Wang 3 and Fangai Liu 3 1 School of Management
More informationSoCo: A Social Network Aided Context-Aware Recommender System
SoCo: A Social Network Aided Context-Aware Recommender System ABSTRACT Xin Liu École Polytechnique Fédérale de Lausanne Batiment BC, Station 14 1015 Lausanne, Switzerland x.liu@epfl.ch Contexts and social
More informationPersonalized Multi-relational Matrix Factorization Model for Predicting Student Performance
Personalized Multi-relational Matrix Factorization Model for Predicting Student Performance Prema Nedungadi and T.K. Smruthy Abstract Matrix factorization is the most popular approach to solving prediction
More informationA Bayesian Treatment of Social Links in Recommender Systems ; CU-CS
University of Colorado, Boulder CU Scholar Computer Science Technical Reports Computer Science Spring 5--0 A Bayesian Treatment of Social Links in Recommender Systems ; CU-CS-09- Mike Gartrell University
More informationMitigating Data Sparsity Using Similarity Reinforcement-Enhanced Collaborative Filtering
Mitigating Data Sparsity Using Similarity Reinforcement-Enhanced Collaborative Filtering YAN HU, University of Chinese Academy of Sciences and Wayne State University WEISONG SHI, Wayne State University
More informationINFO 4300 / CS4300 Information Retrieval. slides adapted from Hinrich Schütze s, linked from
INFO 4300 / CS4300 Information Retrieval slides adapted from Hinrich Schütze s, linked from http://informationretrieval.org/ IR 8: Evaluation & SVD Paul Ginsparg Cornell University, Ithaca, NY 20 Sep 2011
More informationNCDREC: A Decomposability Inspired Framework for Top-N Recommendation
NCDREC: A Decomposability Inspired Framework for Top-N Recommendation Athanasios N. Nikolakopoulos,2 John D. Garofalakis,2 Computer Engineering and Informatics Department, University of Patras, Greece
More informationCost-Aware Collaborative Filtering for Travel Tour Recommendations
Cost-Aware Collaborative Filtering for Travel Tour Recommendations YONG GE, University of North Carolina at Charlotte HUI XIONG, RutgersUniversity ALEXANDER TUZHILIN, New York University QI LIU, University
More information