Saliency Detection Using Self-Organizing Map and Manifold Learning

Author: 陈加忠, 曹华, 苏曙光, 伊斯刚

Funding: National Natural Science Foundation of China (61300140); Open Project of the Beijing Key Laboratory of Modern Information Science and Network Technology (XDXX1307)
    Abstract:

    Extracting nodes that reflect the image content and assigning initial labels to these nodes are two key problems when semi-supervised learning is applied to saliency detection. This work proposes a novel saliency detection method with two main parts: a self-organizing map (SOM) and manifold learning. The SOM partitions the image into hundreds of nodes that capture not only the color but also the contour of the image content. A weighted undirected graph is then constructed by embedding the two-dimensional node map into a higher-dimensional Euclidean space. Exploiting the symmetry of edges in the undirected graph, a manifold learning method that combines the undirected graph with semi-supervised learning is further proposed: the expected saliency values of the nodes along the image borders are preset, and the saliency values of all nodes are then computed from them. Experimental results demonstrate that the proposed model not only achieves high precision and recall, but also produces a pleasing visual effect.
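
The SOM stage can be pictured with a short, self-contained sketch. The code below is not the authors' implementation: it trains a plain 2-D self-organizing map on per-pixel feature vectors so that each node prototype summarizes both the color and the rough spatial layout of the image content. The choice of features (RGB plus normalized x/y coordinates), the 16x16 grid size, and the training schedule are assumptions made only for illustration.

```python
# Minimal 2-D SOM sketch (illustrative; not the paper's implementation).
# Each trained node prototype summarizes color + position of image pixels.
import numpy as np

def train_som(features, grid_h=16, grid_w=16, iters=20000,
              lr0=0.5, sigma0=4.0, seed=0):
    """Train a 2-D SOM; returns node prototypes of shape (grid_h, grid_w, dim)."""
    rng = np.random.default_rng(seed)
    n, dim = features.shape
    # initialize node prototypes from random data samples
    weights = features[rng.integers(0, n, grid_h * grid_w)]
    weights = weights.reshape(grid_h, grid_w, dim).astype(float)
    # grid coordinates of the nodes, used by the neighborhood function
    gy, gx = np.mgrid[0:grid_h, 0:grid_w]
    for t in range(iters):
        frac = t / iters
        lr = lr0 * (1.0 - frac)                # decaying learning rate
        sigma = sigma0 * (1.0 - frac) + 1e-3   # shrinking neighborhood radius
        x = features[rng.integers(0, n)]       # one random training sample
        # best matching unit: node whose prototype is closest to the sample
        dist = np.linalg.norm(weights - x, axis=2)
        by, bx = np.unravel_index(np.argmin(dist), dist.shape)
        # Gaussian neighborhood around the BMU on the 2-D node grid
        h = np.exp(-((gy - by) ** 2 + (gx - bx) ** 2) / (2.0 * sigma ** 2))
        weights += lr * h[..., None] * (x - weights)
    return weights

if __name__ == "__main__":
    # toy "image" made of random values; a real pipeline would load an image
    # (e.g. convert it to Lab color) before stacking (color, x, y) per pixel
    H, W = 120, 160
    img = np.random.rand(H, W, 3)
    ys, xs = np.mgrid[0:H, 0:W]
    feats = np.column_stack([img.reshape(-1, 3),
                             xs.ravel() / W, ys.ravel() / H])
    nodes = train_som(feats)                   # (16, 16, 5) node prototypes
    print(nodes.shape)
```

Because the nodes live on a 2-D grid and are trained with a spatial neighborhood, neighboring prototypes tend to describe neighboring image regions, which is what lets them trace object contours as well as colors.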

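For the second stage, the sketch below builds a weighted undirected graph over the node prototypes and propagates preset border-node saliency to every node. It follows the general recipe of graph-based manifold ranking (symmetric k-nearest-neighbor affinities and the closed form f = (D - alpha*W)^(-1) y); the paper's exact graph construction and propagation formula may differ, and the kNN size, sigma, alpha, and the "border nodes are background" prior are illustrative assumptions.

```python
# Sketch of graph construction + semi-supervised propagation over SOM nodes
# (in the spirit of graph-based manifold ranking; parameters are illustrative).
import numpy as np

def node_saliency(nodes, k=8, sigma=0.25, alpha=0.99):
    grid_h, grid_w, dim = nodes.shape
    feats = nodes.reshape(-1, dim)
    n = feats.shape[0]
    # pairwise distances between node prototypes
    d = np.linalg.norm(feats[:, None, :] - feats[None, :, :], axis=2)
    # symmetric kNN affinity matrix with Gaussian edge weights (undirected graph)
    W = np.zeros((n, n))
    for i in range(n):
        nbrs = np.argsort(d[i])[1:k + 1]       # skip the node itself
        W[i, nbrs] = np.exp(-d[i, nbrs] ** 2 / (2.0 * sigma ** 2))
    W = np.maximum(W, W.T)                     # enforce edge symmetry
    D = np.diag(W.sum(axis=1))
    # preset labels: nodes on the border of the 2-D node grid are taken as
    # background queries (an assumed prior); all other entries start at zero
    y = np.zeros(n)
    border = np.zeros((grid_h, grid_w), dtype=bool)
    border[0, :] = border[-1, :] = border[:, 0] = border[:, -1] = True
    y[border.ravel()] = 1.0
    # closed-form propagation: relevance of every node to the background
    f = np.linalg.solve(D - alpha * W, y)
    f = (f - f.min()) / (f.max() - f.min() + 1e-12)
    return (1.0 - f).reshape(grid_h, grid_w)   # high value = salient node

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    toy_nodes = rng.random((16, 16, 5))        # e.g. the output of the SOM sketch
    sal = node_saliency(toy_nodes)
    print(sal.shape, float(sal.min()), float(sal.max()))
```

Because the affinity matrix is explicitly symmetrized, the propagation respects the edge symmetry of the undirected graph, which is the property the abstract emphasizes when combining the graph with semi-supervised learning.
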
Cite this article

陈加忠, 曹华, 苏曙光, 伊斯刚. Saliency detection using self-organizing map and manifold learning. Ruan Jian Xue Bao/Journal of Software, 2015,26(S2):137-144 (in Chinese with English abstract).

History
  • Received: 2014-06-20
  • Revised: 2014-08-20
  • Published online: 2016-01-11