Incremental General Non-negative Matrix Factorization without Dimension Matching Constraints
Computer Science and Engineering
  • Zigang Chen
  • Lixiang Li
  • Haipeng Peng
  • Yuhong Liu, Santa Clara University
  • Yixian Yang
Document Type
Article
Publication Date
10-15-2018
Publisher
Elsevier B.V.
Abstract

In this paper, we propose General Non-negative Matrix Factorization based on the left Semi-Tensor Product (lGNMF) and General Non-negative Matrix Factorization based on the right Semi-Tensor Product (rGNMF), which factorize an input non-negative matrix into two non-negative matrices of lower rank using a gradient method. In particular, the proposed models remove the dimension matching constraints required by conventional NMF models. Both theoretical derivation and experimental results show that conventional NMF is a special case of the proposed lGNMF and rGNMF. Through experiments on the baboon and Lenna images, we determine how to achieve the best image restoration performance with lGNMF and rGNMF. Moreover, inspired by Incremental Non-negative Matrix Factorization (INMF), we propose Incremental lGNMF (IlGNMF) and Incremental rGNMF (IrGNMF). Experiments on the JAFFE and ORL databases show that IlGNMF and IrGNMF save storage space and reduce computation time in incremental facial training.
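For readers unfamiliar with the left semi-tensor product that underlies lGNMF, the following minimal NumPy sketch (not the authors' implementation; the function name `left_stp` and the toy shapes are illustrative assumptions) shows how the product is defined via Kronecker expansion, why it removes the dimension matching constraint, and how it reduces to the ordinary matrix product when the inner dimensions agree, which is the sense in which conventional NMF is a special case.

```python
import numpy as np

def left_stp(A, B):
    """Left semi-tensor product A |x| B.

    For A (m x n) and B (p x q), let t = lcm(n, p); then
    A |x| B = (A kron I_{t/n}) @ (B kron I_{t/p}).
    When n == p this is just the ordinary matrix product,
    so no dimension matching between A's columns and B's rows is required.
    """
    n, p = A.shape[1], B.shape[0]
    t = int(np.lcm(n, p))
    return np.kron(A, np.eye(t // n)) @ np.kron(B, np.eye(t // p))

if __name__ == "__main__":
    rng = np.random.default_rng(0)

    # Mismatched inner dimensions (2 vs. 3) are allowed:
    W = rng.random((4, 2))           # non-negative factor, 4 x 2
    H = rng.random((3, 6))           # non-negative factor, 3 x 6
    V_hat = left_stp(W, H)           # shape (4*3, 6*2) = (12, 12)
    print(V_hat.shape)

    # Matching inner dimensions recover ordinary NMF-style products:
    W2 = rng.random((4, 3))
    print(np.allclose(left_stp(W2, H), W2 @ H))   # True
```

Under this definition, an lGNMF-style objective can be read as minimizing the reconstruction error between the input matrix and W |x| H subject to non-negativity, with gradient-based updates of W and H; the sketch above only illustrates the product itself, not the paper's update rules.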

Citation Information
Chen, Z., Li, L., Peng, H., Liu, Y., & Yang, Y. (2018). Incremental general non-negative matrix factorization without dimension matching constraints. Neurocomputing, 311, 344–352. https://doi.org/10.1016/j.neucom.2018.05.067