Survey on Matrix Completion Models and Algorithms
Author: CHEN Lei, CHEN Song-Can
Affiliation:

Fund Project:

National Natural Science Foundation of China (61472186, 61572263, 61403208); Natural Science Foundation of Jiangsu Province (BK20161516, BK20151511); China Postdoctoral Science Foundation (2015M581794); Project of Natural Science Research of Jiangsu University (15KJB520027); Postdoctoral Science Foundation of Jiangsu Province (1501023C); NUPTSF (NY214127, NY215097)

    Abstract:

    In recent years, following the great success of compressed sensing (CS) in signal processing, matrix completion (MC), which derives from CS, has become an increasingly active research topic in machine learning. Researchers have carried out a large body of fruitful work on modeling the matrix completion problem and optimizing the resulting models, and have built a relatively complete matrix completion theory. To better grasp the development of matrix completion and to facilitate combining its theory with engineering applications, this article surveys existing matrix completion models and their algorithms. First, it introduces the natural evolution from CS to MC, showing that the development of CS theory laid the foundation for MC theory. Second, it groups existing matrix completion models into four classes according to how they relax the non-convex, non-smooth rank function, aiming to provide reasonable model choices for specific matrix completion applications. Third, to expose the underlying optimization techniques and to ease the solution of new, problem-dependent matrix completion models, it studies representative optimization algorithms suitable for the various model classes. Finally, it analyzes the open problems in current matrix completion techniques, proposes possible solutions, and discusses future work.

    Reference
    [1] Donoho DL. Compressed sensing. IEEE Trans. on Information Theory, 2006,52(4):1289-1306.[doi:10.1109/TIT.2006.871582]
    [2] Candès EJ, Romberg J, Tao T. Robust uncertainty principles:Exact signal reconstruction from highly incomplete frequency information. IEEE Trans. on Information Theory, 2006,52(2):489-509.[doi:10.1109/TIT.2005.862083]
    [3] Candes EJ, Recht B. Exact matrix completion via convex optimization. Foundations of Computational Mathematics, 2009,9(6):717-772.[doi:10.1007/s10208-009-9045-5]
    [4] Li W, Zhao L, Lin ZJ, Xu DQ, Lu DM. Non-Local image inpainting using low-rank matrix completion. Computer Graphics Forum, 2015,34(6):111-122.[doi:10.1111/cgf.12521]
    [5] Cabral R, De la Torre F, Costeira JP, Bernardino A. Matrix completion for weakly-supervised multi-label image classification. IEEE Trans. on Pattern Analysis and Machine Intelligence, 2015,37(1):121-135.[doi:10.1109/TPAMI.2014.2343234]
    [6] Mardani M, Giannakis GB. Estimating traffic and anomaly maps via network tomography. IEEE/ACM Trans. on Networking, 2016, 24(5):1533-1547.[doi:10.1109/TNET.2015.2417809]
    [7] Yi KF, Wan JW, Yao L, Bao TY. Partial matrix completion algorithm for efficient data gathering in wireless sensor networks. IEEE Communications Letters, 2015,19(1):54-57.[doi:10.1109/LCOMM.2014.2371998]
    [8] Otazo R, Candès E, Sodickson DK. Low-Rank plus sparse matrix decomposition for accelerated dynamic MRI with separation of background and dynamic components. Magnetic Resonance in Medicine, 2015,73(3):1125-1136.[doi:10.1002/mrm.25240]
    [9] Taghizadeh MJ, Parhizkar R, Garner PN, Bourlard H, Asaei A. Ad hoc microphone array calibration:Euclidean distance matrix completion algorithm and theoretical guarantees. Signal Processing, 2015,107:123-140.[doi:10.1016/j.sigpro.2014.07.016]
    [10] Hsieh CJ, Natarajan N, Dhillon IS. PU learning for matrix completion. In:Proc. of the Int'l Conf. on Machine Learning. 2015.
    [11] Adeli-Mosabbeb E, Fathy M. Non-Negative matrix completion for action detection. Image and Vision Computing, 2015,39(1):38-51.[doi:10.1016/j.imavis.2015.04.006]
    [12] Yi J, Zhang L, Jin R, Qian Q, Jain AK. Semi-Supervised clustering by input pattern assisted pairwise similarity matrix completion. In:Proc. of the Int'l Conf. on Machine Learning. 2013.
    [13] Gogna A, Majumdar A. Matrix completion incorporating auxiliary information for recommender system design. Expert Systems with Applications, 2015,42(12):5789-5799.[doi:10.1016/j.eswa.2015.04.012]
    [14] Candes EJ, Eldar YC, Strohmer T, Vladislav H. Phase retrieval via matrix completion. SIAM Review, 2015,57(2):225-251.[doi:10.1137/151005099]
    [15] Kumar R, Silva CD, Akalin O, Aravkin AY, Mansour H, Recht B, Herrmann FJ. Efficient matrix completion for seismic data reconstruction. Geophysics, 2015,80(5):97-114.[doi:10.1190/geo2014-0369.1]
    [16] Zhao Z, Zhang LJ, He XF, Ng W. Expert finding for question answering via graph regularized matrix completion. IEEE Trans. on Knowledge and Data Engineering, 2015,27(4):993-1004.[doi:10.1109/TKDE.2014.2356461]
    [17] Xiao F, Sha CH, Chen L, Sun LJ, Wang RC. Noise-Tolerant localization from incomplete range measurements for wireless sensor networks. In:Proc. of the IEEE Int'l Conf. on Computer Communications. 2015. 2794-2802.[doi:10.1109/INFOCOM.2015.7218672]
    [18] Chen L, Yang G, Chen ZY, Xiao F, Xu J. Web services QoS prediction via matrix completion with structural noise. Journal on Communications, 2015,36(6):49-59(in Chinese with English abstract).
    [19] Chen L, Yang G, Chen ZY, Xiao F, Shi JY. Correlation consistency constrained matrix completion for Web service tag refinement. Neural Computing and Applications, 2015,26(1):101-110.[doi:10.1007/s00521-014-1704-z]
    [20] Bishop WE, Yu BM. Deterministic symmetric positive semidefinite matrix completion. In:Proc. of the Advances in Neural Information Processing Systems. 2014.
    [21] Li XD. Compressed sensing and matrix completion with constant proportion of corruptions. Constructive Approximation, 2013,37:73-99.[doi:10.1007/s00365-012-9176-9]
    [22] Candes EJ, Plan Y. Matrix completion with noise. Proc. of the IEEE, 2010,98(6):925-936.[doi:10.1109/JPROC.2009.2035722]
    [23] Mardani M, Mateos G, Giannakis GB. Subspace learning and imputation for streaming big data matrices and tensors. IEEE Trans. on Signal Processing, 2015,63(10):2663-2677.[doi:10.1109/TSP.2015.2417491]
    [24] Wang Z, Lai MJ, Lu ZS, Ye JP. Orthogonal rank-one matrix pursuit for low rank matrix completion. SIAM Journal on Scientific Computing, 2015,37(1):488-514.[doi:10.1137/130934271]
    [25] Ma S, Goldfarb D, Chen L. Fixed point and Bregman iterative methods for matrix rank minimization. Mathematical Programming, 2011,128(1):321-353.[doi:10.1007/s10107-009-0306-5]
    [26] Xu YY, Yin WT, Wen ZW, Zhang Y. An alternating direction algorithm for matrix completion with nonnegative factors. Frontiers of Mathematics in China, 2012,7(2):365-384.[doi:10.1007/s11464-012-0194-5]
    [27] Mohan K, Fazel M. Iterative reweighted algorithms for matrix rank minimization. Journal of Machine Learning Research, 2013,13:3253-3285.
    [28] Mackey L, Talwalkar A, Jordan M. Distributed matrix completion and robust factorization. Journal of Machine Learning Research, 2015,16(1):913-960.
    [29] Candes EJ, Tao T. The power of convex relaxation:Near-optimal matrix completion. IEEE Trans. on Information Theory, 2010, 56(5):2053-2080.[doi:10.1109/TIT.2010.2044061]
    [30] Makari F, Teflioudi C, Gemulla R, Haas P, Sismanis Y. Shared-Memory and shared-nothing stochastic gradient descent algorithms for matrix completion. Knowledge and Information Systems, 2015,42(3):493-523.[doi:10.1007/s10115-013-0718-7]
    [31] Nie FP, Wang H, Huang H, Ding C. Joint schatten p-norm and Lp-norm robust matrix completion for missing value recovery. Knowledge and Information Systems, 2015,42(3):525-544.[doi:10.1007/s10115-013-0713-z]
    [32] Lu CY, Tang JH, Yan SC, Lin Z. Generalized nonconvex non-smooth low-rank minimization. In:Proc. of the IEEE Conf. on Computer Vision and Pattern Recognition. 2014. 4130-4137.[doi:10.1109/CVPR.2014.526]
    [33] Zhang LJ, Yang T, Jin R, Zhou ZH. Stochastic proximal gradient descent for nuclear norm regularization. 2015. http://arxiv.org/abs/1511.01664
    [34] Boumal N, Absil P. RTRMC:A Riemannian trust region method for matrix completion. In:Proc. of the Annual Conf. on Neural Information Processing Systems. 2011.
    [35] Ghasemi H, Malek-Mohammadi M, Babaie-Zadeh M. SRF:Matrix completion based on smoothed rank function. In:Proc. of the IEEE Int'l Conf. on Acoustics, Speech and Signal Processing. 2011. 3672-3675.[doi:10.1109/ICASSP.2011.5947147]
    [36] Winlaw M, Hynes M, Caterini A, Sterck H. Algorithmic acceleration of parallel ALS for collaborative filtering:Speeding up distributed big data recommendation in Spark. In:Proc. of the IEEE Int'l Conf. on Parallel and Distributed Systems. 2015. 682-691.[doi:10.1109/ICPADS.2015.91]
    [37] Sun RY, Luo ZQ. Guaranteed matrix completion via non-convex factorization. In:Proc. of the Annual Symp. on Foundations of Computer Science. 2015.[doi:10.1109/FOCS.2015.25]
    [38] Lin ZC. Rank minimization:Theory, algorithms and applications. In:Zhang CS, Zhou ZH, eds. Proc. of the Machine Learning and Their Applications. Beijing:Tsinghua University Press, 2013. 143-164(in Chinese with English abstract).
    [39] Peng YG, Suo JL, Dai QH, Xu LW. From compressed sensing to low-rank matrix recovery:Theory and applications. Acta Automatica Sinica, 2013,39(7):981-994(in Chinese with English abstract).[doi:10.1016/S1874-1029(13)60063-4]
    [40] Chen CH, He BS, Yuan XM. Matrix completion via an alternating direction method. IMA Journal of Numerical Analysis, 2012, 32(1):227-245.[doi:10.1093/imanum/drq039]
    [41] Wen SW, Xu FF, Wen ZW, Chen L. Robust linear optimization under matrix completion. Science in China:Mathematics, 2014,57:699-710.[doi:10.1007/s11425-013-4697-7]
    [42] Jiao LC, Zhao J, Yang SY, Liu F, Xie W. Research advances on sparse cognitive learning, computing and recognition. Chinese Journal of Computers, 2016,39(4):835-852(in Chinese with English abstract).
    [43] Hu Y, Zhang DB, Ye JP, Li X, He XF. Fast and accurate matrix completion via truncated nuclear norm regularization. IEEE Trans. on Pattern Analysis and Machine Intelligence, 2013,35(9):2117-2130.[doi:10.1109/TPAMI.2012.271]
    [44] Chen L, Yang G, Chen ZY, Xiao F, Chen SC. Linearized bregman iteration algorithm for matrix completion with structural noise. Chinese Journal of Computers, 2015,38(7):1357-1371(in Chinese with English abstract).
    [45] Tibshirani R. Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society, 1996,58(1):267-288.
    [46] Fazel M. Matrix rank minimization with applications[Ph.D. Thesis]. Stanford University, 2002.
    [47] Toh KC, Yun SW. An accelerated proximal gradient algorithm for nuclear norm regularized least squares problems. Pacific Journal of Optimization, 2010,6(3):615-640.
    [48] Cai JF, Candes EJ, Shen Z. A singular value thresholding algorithm for matrix completion. SIAM Journal on Optimization, 2010, 20(4):1956-1982.[doi:10.1137/080738970]
    [49] Lin ZC, Chen MM, Wu LQ, Ma Y. The augmented Lagrange multiplier method for exact recovery of corrupted low-rank matrices. Technical Report, UILU-ENG-09-2214, UIUC University, 2010.
    [50] Wen ZW, Yin WT, Zhang Y. Solving a low rank factorization model for matrix completion by a nonlinear successive overrelaxation algorithm. Mathematical Programming Computation, 2012,4:333-361.[doi:10.1007/s12532-012-0044-1]
    [51] Liu YY, Jiao LC, Shang FH. A fast tri-factorization method for low-rank matrix recovery and completion. Pattern Recognition, 2013,46(1):163-173.[doi:10.1016/j.patcog.2012.07.003]
    [52] Srebro N, Shraibman A. Rank, trace-norm and max-norm. In:Proc. of the Annual Conf. on Learning Theory. 2005. 545-560.[doi:10.1007/11503415_37]
    [53] Mardani M, Mateos G, Giannakis GB. Rank minimization for subspace tracking from incomplete data. In:Proc. of the IEEE Int'l Conf. on Acoustics, Speech and Signal Processing. 2013. 5681-5685.[doi:10.1109/ICASSP.2013.6638752]
    [54] Geng J, Wang LS, Wang YF. A non-convex algorithm framework based on DC programming and DCA for matrix completion. Numerical Algorithms, 2015,68:903-921.[doi:10.1007/s11075-014-9876-2]
    [55] Kang Z, Peng C, Cheng J, Cheng Q. LogDet rank minimization with application to subspace clustering. Computational Intelligence and Neuroscience, 2015, Article ID 824289.[doi:10.1155/2015/824289]
    [56] Honerkamp J, Weese J. Tikhonov's regularization method for ill-posed problems. Continuum Mechanics and Thermodynamics, 1990,2(1):17-30.[doi:10.1007/BF01170953]
    [57] Chen Z, Haykin S. On different facets of regularization theory. Neural Computation, 2002,14(12):2791-2846.[doi:10.1162/089976602760805296]
    [58] Combettes PL, Wajs VR. Signal recovery by proximal forward-backward splitting. Multiscale Modeling and Simulation, 2005,4(4):1168-1200.[doi:10.1137/050626090]
    [59] Nemirovski A. Efficient methods in convex programming. Lecture Notes, 1995,23(3):24-39.
    [60] Beck A, Teboulle M. A fast iterative shrinkage-thresholding algorithm for linear inverse problems. SIAM Journal on Imaging Sciences, 2009,2(1):183-202.[doi:10.1137/080716542]
    [61] Nesterov Y. A method of solving a convex programming problem with convergence rate O(1/k^2). Soviet Mathematics Doklady, 1983, 27(2):372-376.
    [62] Nesterov Y. Efficiency of coordinate descent methods on huge-scale optimization problems. SIAM Journal on Optimization, 2012, 22:341-362.[doi:10.1137/100802001]
    [63] Beck A, Tetruashvili L. On the convergence of block coordinate descent methods. SIAM Journal on Optimization, 2013,23(4):2037-2060.[doi:10.1137/120887679]
    [64] Bregman L. The relaxation method of finding the common points of convex sets and its application to the solution of problems in convex programming. Computational Mathematics and Mathematical Physics, 1967,7(3):200-217.[doi:10.1016/0041-5553(67)90040-7]
    [65] Osher S, Burger M, Goldfarb M. An iterative regularization method for total variation-based image restoration. Multiscale Modeling and Simulation, 2005,4(2):460-489.[doi:10.1137/040605412]
    [66] Yin WT, Osher S, Goldfarb D. Bregman iterative algorithms for L1-minimization with applications to compressed sensing. SIAM Journal on Imaging Sciences, 2008,1(1):143-168.[doi:10.1137/070703983]
    [67] Cai JF, Osher S, Shen Z. Linearized Bregman iteration for frame-based image deblurring. SIAM Journal on Imaging Sciences, 2009, 2(2):226-252.[doi:10.1137/080733371]
    [68] Boyd S, Parikh N, Chu E. Distributed optimization and statistical learning via the alternating direction method of multipliers. Foundations and Trends® in Machine Learning, 2011,3(1):1-122.[doi:10.1561/2200000016]
    [69] Douglas J, Rachford HH. On the numerical solution of heat conduction problems in two and three space variables. Trans. of the American Mathematical Society, 1956,82:421-439.[doi:10.1090/S0002-9947-1956-0084194-4]
    [70] Gabay D, Mercier B. A dual algorithm for the solution of nonlinear variational problems via finite element approximations. Computers and Mathematics with Applications, 1976,2:17-40.[doi:10.1016/0898-1221(76)90003-1]
    [71] Zhang RL, Kwok J. Asynchronous distributed ADMM for consensus optimization. In:Proc. of the Int'l Conf. on Machine Learning. 2014.
    [72] Zhang WL, Kwok J. Fast stochastic alternating direction method of multipliers. In:Proc. of the Int'l Conf. on Machine Learning. 2014.
    [73] He BS, Yuan XM. On the O(1/n) convergence rate of the Douglas-Rachford alternating direction method. SIAM Journal on Numerical Analysis, 2012,50(2):700-709.[doi:10.1137/110836936]
    [74] He BS, Tao M, Yuan XM. Alternating direction method with Gaussian back substitution for separable convex programming. SIAM Journal on Optimization, 2012,22(2):313-340.[doi:10.1137/110822347]
    [75] Lin ZC, Liu RS, Li H. Linearized alternating direction method with parallel splitting and adaptive penalty for separable convex programs in machine learning. Machine Learning, 2015,99(2):287-325.[doi:10.1007/s10994-014-5469-5]
    [76] Nemirovski A, Juditsky A, Lan G, Shapiro A. Robust stochastic approximation approach to stochastic programming. SIAM Journal on Optimization, 2009,19(4):1574-1609.[doi:10.1137/070704277]
    [77] Recht B, Re C. Parallel stochastic gradient algorithms for large-scale matrix completion. Mathematical Programming Computation, 2013,5(2):201-226.[doi:10.1007/s12532-013-0053-8]
    [78] Liu J, Wright SJ, Re C, Bittorf V, Sridhar S. An asynchronous parallel stochastic coordinate descent algorithm. Journal of Machine Learning Research, 2015,16(1):285-322.
    [79] Azadi S, Sra S. Towards an optimal stochastic alternating direction method of multipliers. In:Proc. of the Int'l Conf. on Machine Learning. 2014.
    [80] Zhang LJ, Yang T, Yi J, Jin R, Zhou ZH. Stochastic optimization for kernel PCA. In:Proc. of the AAAI Conf. on Artificial Intelligence. 2016. 2316-2322.
    [81] Zhang ZH. The singular value decomposition, applications and beyond. arXiv:1510.08532, 2015. http://arxiv.org/abs/1510.08532
    [82] Tao M, He BS, Yuan XM. Recovering low-rank and sparse components of matrices from incomplete and noisy observations. SIAM Journal on Optimization, 2011,21(1):57-81.[doi:10.1137/100781894]
    [83] Yan M, Yang Y, Osher S. Exact low-rank matrix completion from sparsely corrupted entries via adaptive outlier pursuit. Journal of Scientific Computing, 2013,56(3):433-449.[doi:10.1007/s10915-013-9682-3]
    [84] Chen YD, Jalali A, Sanghavi S, Caramanis C. Low-Rank matrix recovery from errors and erasures. IEEE Trans. on Information Theory, 2013,59(7):4324-4337.[doi:10.1109/TIT.2013.2249572]
    [85] Chen YD, Xu H, Caramanis C, Sanghavi S. Robust matrix completion and corrupted columns. In:Proc. of the Int'l Conf. on Machine Learning. 2011.
    [86] Xu M, Jin R, Zhou ZH. Speedup matrix completion with side information:Application to multi-label learning. In:Proc. of the Advances in Neural Information Processing Systems. 2013.
    [87] Natarajan N, Dhillon IS. Matrix completion for predicting gene-disease associations. Bioinformatics, 2014,30(12):60-68.[doi:10.1093/bioinformatics/btu269]
    [88] Teflioudi C, Makari F, Gemulla R. Distributed matrix completion. In:Proc. of the IEEE Int'l Conf. on Data Mining. 2012. 655-664.[doi:10.1109/ICDM.2012.120]
    [89] Mosabbeb EA, Fathy M. Distributed matrix completion for large-scale multi-label classification. Intelligent Data Analysis, 2014,18:1137-1151.
    [90] Liu J, Musialski P, Wonka P, Ye JP. Tensor completion for estimating missing values in visual data. IEEE Trans. on Pattern Analysis and Machine Intelligence, 2013,35(1):208-220.[doi:10.1109/TPAMI.2012.39]
    [91] Alameda-Pineda X, Yan Y, Ricci E, Sebe N. Recognizing emotions from abstract paintings using non-linear matrix completion. In:Proc. of the IEEE Conf. on Computer Vision and Pattern Recognition. 2016. 5240-5248.[doi:10.1109/CVPR.2016.566]
    [92] Xu LB, Davenport MA. Dynamic matrix recovery from incomplete observations under an exact low-rank constraint. In:Proc. of the Advances in Neural Information Processing Systems. 2016.
    [93] Cao XR, Chen Y, Zhao Q, Meng DY, Wang Y, Wang D, Xu ZB. Low-Rank matrix factorization under general mixture noise distributions. In:Proc. of the Int'l Conf. on Computer Vision. 2015.[doi:10.1109/ICCV.2015.175]
Get Citation

Chen L, Chen SC. Survey on matrix completion models and algorithms. Ruan Jian Xue Bao/Journal of Software, 2017,28(6):1547-1564 (in Chinese with English abstract).

History
  • Received: May 26, 2016
  • Revised: November 03, 2016
  • Online: February 21, 2017
Copyright: Institute of Software, Chinese Academy of Sciences Beijing ICP No. 05046678-4