Saliency Detection Using Self Organizing Map and Manifold Learning
    Abstract:

    Extracting nodes that reflect image content and assigning initial labels to these nodes are two critical steps in saliency detection. This work proposes a novel saliency detection method with two main parts: a self-organizing map (SOM) and manifold learning (ML). The SOM produces hundreds of nodes that capture not only the colors but also the contours of the image content. By embedding the two-dimensional map into a higher-dimensional Euclidean space, a weighted undirected graph is constructed. Exploiting the edge symmetry of the undirected graph, a manifold learning method that combines the undirected graph with semi-supervised learning is further proposed. Given initial saliency values for the nodes along the image borders, saliency values are computed for all nodes. Experimental results demonstrate that the proposed model not only achieves high precision and recall, but also produces a pleasing visual effect.
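    The pipeline the abstract describes — train a 2-D SOM over pixel colors, build a weighted undirected graph over the resulting nodes, then propagate saliency labels semi-supervised from border seeds — can be sketched in NumPy. Everything concrete below (grid size, the Gaussian affinity, the closed-form propagation f = (I − αS)⁻¹y, and treating border nodes as background seeds) is an illustrative assumption; the abstract does not specify the paper's actual formulation.

```python
import numpy as np

def train_som(pixels, grid_h=8, grid_w=8, iters=2000, seed=0):
    """Train a small 2-D SOM on pixel color vectors (minimal sketch; the
    paper's SOM variant and parameters are not given in the abstract)."""
    rng = np.random.default_rng(seed)
    nodes = rng.random((grid_h * grid_w, pixels.shape[1]))
    # grid coordinates of each node, used by the neighbourhood function
    gy, gx = np.divmod(np.arange(grid_h * grid_w), grid_w)
    coords = np.stack([gy, gx], axis=1).astype(float)
    for t in range(iters):
        x = pixels[rng.integers(len(pixels))]
        bmu = np.argmin(((nodes - x) ** 2).sum(1))       # best-matching unit
        lr = 0.5 * (1 - t / iters)                       # decaying learning rate
        sigma = max(grid_w / 2 * (1 - t / iters), 0.5)   # shrinking neighbourhood
        d2 = ((coords - coords[bmu]) ** 2).sum(1)
        h = np.exp(-d2 / (2 * sigma ** 2))[:, None]
        nodes += lr * h * (x - nodes)
    return nodes, coords

def manifold_saliency(nodes, coords, alpha=0.99, sigma2=0.1):
    """Semi-supervised propagation on a weighted undirected graph over the
    SOM nodes; border nodes seed a background prior (assumed scheme)."""
    # Gaussian affinity between node features; symmetric => undirected graph
    d2 = ((nodes[:, None, :] - nodes[None, :, :]) ** 2).sum(-1)
    W = np.exp(-d2 / sigma2)
    np.fill_diagonal(W, 0.0)
    d_inv_sqrt = 1.0 / np.sqrt(W.sum(1))                 # degree^(-1/2)
    S = W * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]    # normalized affinity
    # seed: nodes on the border of the 2-D map are labeled background
    border = (coords[:, 0] == 0) | (coords[:, 1] == 0) | \
             (coords[:, 0] == coords[:, 0].max()) | (coords[:, 1] == coords[:, 1].max())
    y = border.astype(float)
    f = np.linalg.solve(np.eye(len(y)) - alpha * S, y)   # closed-form propagation
    # invert and rescale: strong background score -> low saliency
    return 1.0 - (f - f.min()) / (np.ptp(f) + 1e-12)
```

    A usage sketch: flatten an image to an (N, 3) array of colors, call `train_som`, then `manifold_saliency`, and assign each pixel the saliency of its best-matching SOM node. Function names and the background-prior seeding are hypothetical, not the authors' code.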

Get Citation

陈加忠, 曹华, 苏曙光, 伊斯刚. Saliency detection using self-organizing map and manifold learning. Ruan Jian Xue Bao/Journal of Software, 2015,26(S2):137-144 (in Chinese).

History
  • Received: June 20, 2014
  • Revised: August 20, 2014
  • Online: January 11, 2016
Copyright: Institute of Software, Chinese Academy of Sciences