Local and Global Preserving Based Semi-Supervised Dimensionality Reduction Method
Authors: Wei Jia, Peng Hong
Funding: Supported by the Natural Science Foundation of Guangdong Province of China under Grant No. 07006474, and the Science and Technology Research Project of Guangdong Province of China under Grant No. 2007B010200044



    Abstract:

    In many machine learning and data mining tasks, using side-information alone does not yield the best semi-supervised learning results. This paper therefore proposes a local and global preserving based semi-supervised dimensionality reduction (LGSSDR) method. LGSSDR preserves not only the positive (must-link) and negative (cannot-link) constraints but also the local and global structure of the whole data manifold in the low-dimensional embedding subspace. In addition, the algorithm computes an explicit transformation matrix and can therefore handle unseen samples easily. Experimental results on several datasets demonstrate the effectiveness of the method.
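The abstract describes a linear method that balances three ingredients: pulling must-link pairs together, pushing cannot-link pairs apart, and preserving the neighborhood structure of all (labeled and unlabeled) samples, solved as a projection matrix so that unseen samples can be mapped linearly. The sketch below is NOT the paper's exact formulation; it is a generic illustration of this family of methods (constraint graphs plus a k-NN locality graph, combined through graph Laplacians into a generalized eigenproblem). The function name `lgssdr_sketch` and the weighting parameters `alpha` and `beta` are hypothetical.

```python
import numpy as np

def lgssdr_sketch(X, must_link, cannot_link, n_neighbors=5, d=2,
                  alpha=1.0, beta=1.0):
    """Illustrative sketch only: constraint + locality preserving projection.

    X           : (n, D) data matrix, rows are samples.
    must_link   : list of (i, j) index pairs that should stay close.
    cannot_link : list of (i, j) index pairs that should stay apart.
    Returns a (D, d) projection matrix W.
    """
    n = X.shape[0]

    # Locality graph: connect each sample to its k nearest neighbors
    # (this is the "local structure of the data manifold" term).
    dist = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    S = np.zeros((n, n))
    for i in range(n):
        nbrs = np.argsort(dist[i])[1:n_neighbors + 1]  # skip self at index 0
        S[i, nbrs] = 1.0
    S = np.maximum(S, S.T)  # symmetrize

    # Constraint graphs built from the side-information pairs.
    Sm = np.zeros((n, n))
    Sc = np.zeros((n, n))
    for i, j in must_link:
        Sm[i, j] = Sm[j, i] = 1.0
    for i, j in cannot_link:
        Sc[i, j] = Sc[j, i] = 1.0

    # Unnormalized graph Laplacian L = D - S.
    def lap(A):
        return np.diag(A.sum(axis=1)) - A

    # Minimize must-link + locality scatter while maximizing cannot-link
    # scatter: generalized eigenproblem  A w = lambda B w  with
    #   A = X^T (L_m + alpha * L_s) X,  B = X^T (beta * L_c + I) X.
    # (The identity term keeps B well conditioned; assumes D <= n.)
    A = X.T @ (lap(Sm) + alpha * lap(S)) @ X
    B = X.T @ (beta * lap(Sc) + np.eye(n)) @ X
    eigvals, eigvecs = np.linalg.eig(np.linalg.solve(B, A))
    order = np.argsort(eigvals.real)       # smallest ratios preserved best
    W = eigvecs[:, order[:d]].real         # (D, d) transformation matrix
    return W
```

Because the result is an explicit linear map, an unseen sample `x_new` is embedded simply as `x_new @ W`, which matches the abstract's claim that the method handles unseen samples easily.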

Cite this article

Wei J, Peng H. Local and global preserving based semi-supervised dimensionality reduction method. Journal of Software (软件学报), 2008,19(11):2833-2842

History
  • Received: 2008-02-24
  • Revised: 2008-08-26