DOI: 10.1145/1102351.1102441

Fast maximum margin matrix factorization for collaborative prediction

ABSTRACT

Maximum Margin Matrix Factorization (MMMF) was recently suggested (Srebro et al., 2005) as a convex, infinite-dimensional alternative to low-rank approximations and standard factor models. MMMF can be formulated as a semi-definite program (SDP) and learned using standard SDP solvers. However, current SDP solvers can only handle MMMF problems on matrices of dimensionality up to a few hundred. Here, we investigate a direct gradient-based optimization method for MMMF and demonstrate it on large collaborative prediction problems. We compare against results obtained by Marlin (2004) and find that MMMF substantially outperforms all nine methods he tested.
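To illustrate the gradient-based approach, the sketch below (not the authors' code) optimizes a factorized surrogate of the MMMF objective: writing X = UVᵀ, it minimizes a smooth hinge loss over the observed entries plus (λ/2)(‖U‖²_F + ‖V‖²_F), which upper-bounds the trace norm of X. It uses plain gradient descent on binary ±1 labels, whereas the paper uses conjugate gradients and per-user thresholds for ordinal ratings; all names (smooth_hinge, fit_mmmf, lambda_reg) are illustrative, not from the paper.

```python
import numpy as np

def smooth_hinge(z):
    """Smoothed hinge loss h(z) and its derivative h'(z)."""
    loss = np.where(z >= 1, 0.0,
             np.where(z <= 0, 0.5 - z, 0.5 * (1 - z) ** 2))
    grad = np.where(z >= 1, 0.0,
             np.where(z <= 0, -1.0, z - 1.0))
    return loss, grad

def mmmf_objective_grad(U, V, Y, mask, lambda_reg):
    """Objective and gradients for +/-1 labels Y on observed entries (mask)."""
    X = U @ V.T
    loss, dloss = smooth_hinge(Y * X)
    J = np.sum(loss[mask]) + 0.5 * lambda_reg * (np.sum(U**2) + np.sum(V**2))
    G = np.where(mask, dloss * Y, 0.0)   # d(loss)/dX, zero on unobserved entries
    gU = G @ V + lambda_reg * U
    gV = G.T @ U + lambda_reg * V
    return J, gU, gV

def fit_mmmf(Y, mask, rank=20, lambda_reg=0.1, lr=0.01, iters=500, seed=0):
    """Plain gradient descent on U, V (the paper uses conjugate gradients)."""
    rng = np.random.default_rng(seed)
    n, m = Y.shape
    U = 0.1 * rng.standard_normal((n, rank))
    V = 0.1 * rng.standard_normal((m, rank))
    for _ in range(iters):
        _, gU, gV = mmmf_objective_grad(U, V, Y, mask, lambda_reg)
        U -= lr * gU
        V -= lr * gV
    return U, V

if __name__ == "__main__":
    # Synthetic low-rank sign matrix with 30% of entries observed.
    rng = np.random.default_rng(1)
    A = rng.standard_normal((50, 3))
    B = rng.standard_normal((40, 3))
    true = np.sign(A @ B.T)
    mask = rng.random((50, 40)) < 0.3
    U, V = fit_mmmf(true, mask, rank=5)
    pred = np.sign(U @ V.T)
    acc = np.mean(pred[~mask] == true[~mask])   # agreement on held-out entries
    print(f"held-out sign agreement: {acc:.2f}")
```

Because the objective is optimized directly over the factors U and V rather than over the full matrix X with an SDP solver, the per-iteration cost scales with the number of observed entries and the chosen rank, which is what makes the method practical on large collaborative prediction problems.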

References

  1. Azar, Y., Fiat, A., Karlin, A. R., McSherry, F., & Saia, J. (2001). Spectral analysis of data. ACM Symposium on Theory of Computing (pp. 619--626).
  2. Billsus, D., & Pazzani, M. J. (1998). Learning collaborative information filters. Proc. 15th International Conf. on Machine Learning (pp. 46--54). Morgan Kaufmann, San Francisco, CA.
  3. Canny, J. (2004). GaP: a factor model for discrete data. SIGIR '04: Proceedings of the 27th Annual International Conference on Research and Development in Information Retrieval (pp. 122--129). Sheffield, United Kingdom: ACM Press.
  4. Collins, M., Dasgupta, S., & Schapire, R. (2002). A generalization of principal component analysis to the exponential family. Advances in Neural Information Processing Systems 14.
  5. Fazel, M., Hindi, H., & Boyd, S. P. (2001). A rank minimization heuristic with application to minimum order system approximation. Proceedings American Control Conference.
  6. Hofmann, T. (2004). Latent semantic models for collaborative filtering. ACM Trans. Inf. Syst., 22, 89--115.
  7. Lee, D., & Seung, H. (1999). Learning the parts of objects by non-negative matrix factorization. Nature, 401, 788--791.
  8. Marlin, B. (2004). Collaborative filtering: A machine learning perspective. Master's thesis, University of Toronto, Computer Science Department.
  9. Marlin, B., & Zemel, R. S. (2004). The multiple multiplicative factor model for collaborative filtering. Proceedings of the 21st International Conference on Machine Learning.
  10. Nocedal, J., & Wright, S. J. (1999). Numerical optimization. Springer-Verlag.
  11. Rennie, J. D. M., & Srebro, N. (2005). Loss functions for preference levels: Regression with discrete ordered labels. Proceedings of the IJCAI Multidisciplinary Workshop on Advances in Preference Handling.
  12. Shewchuk, J. R. (1994). An introduction to the conjugate gradient method without the agonizing pain. http://www.cs.cmu.edu/~jrs/jrspapers.html.
  13. Srebro, N., & Jaakkola, T. (2003). Weighted low rank approximation. 20th International Conference on Machine Learning.
  14. Srebro, N., Rennie, J. D. M., & Jaakkola, T. (2005). Maximum margin matrix factorization. Advances in Neural Information Processing Systems 17.
  15. Srebro, N., & Schraibman, A. (2005). Rank, trace-norm and max-norm. Proceedings of the 18th Annual Conference on Learning Theory.
  16. Zhang, T., & Oles, F. J. (2001). Text categorization based on regularized linear classification methods. Information Retrieval, 4, 5--31.
