Super-Resolution Based on Sparse Dictionary Coding
Abstract:

Learning-based super-resolution methods typically select several examples with features similar to the input low-resolution image from a training set, and then estimate the super-resolution result with an optimization algorithm. The result is therefore limited by the quality of the matched examples, and because only the geometric structure of the images is used as the matching feature, the matching accuracy is relatively low. This paper presents a sparse dictionary model for image super-resolution that jointly encodes the feature patches of high-resolution (HR) and low-resolution (LR) images through sparse coding. To overcome the above limitations, the method builds a sparse association between HR and LR images and performs matching and optimization simultaneously. An MCA method is used to improve the accuracy of feature extraction and to carry out super-resolution reconstruction and denoising at the same time. The sparse K-SVD algorithm is adopted as the optimization method to reduce the computation time of sparse coding. Experiments on real images show that the proposed method outperforms other learning-based super-resolution algorithms.
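As a rough illustration of the joint HR/LR sparse coding described above, the sketch below trains a single dictionary over concatenated LR-feature/HR-patch vectors, so both share the same sparse code, and then reuses the LR code to synthesize the HR patch. This is a minimal sketch, not the paper's implementation: scikit-learn's standard dictionary learning and OMP solver stand in for the sparse K-SVD training and the MCA-based feature extraction, and all function names, shapes, and parameters (n_atoms, sparsity) are illustrative assumptions.

```python
# Minimal sketch of coupled-dictionary sparse coding for super-resolution.
# Assumptions (not from the paper): vectorized patch features; plain
# DictionaryLearning + OMP replace sparse K-SVD and MCA feature extraction.
import numpy as np
from sklearn.decomposition import DictionaryLearning
from sklearn.linear_model import orthogonal_mp


def train_joint_dictionary(lr_feats, hr_patches, n_atoms=128, sparsity=5):
    """Learn one dictionary over concatenated [LR feature | HR patch] vectors
    so that LR and HR patches share the same sparse codes."""
    joint = np.hstack([lr_feats, hr_patches])      # (n_samples, d_lr + d_hr)
    learner = DictionaryLearning(n_components=n_atoms,
                                 transform_algorithm="omp",
                                 transform_n_nonzero_coefs=sparsity,
                                 max_iter=20)
    learner.fit(joint)
    D = learner.components_                        # (n_atoms, d_lr + d_hr)
    d_lr = lr_feats.shape[1]
    return D[:, :d_lr], D[:, d_lr:]                # split into D_lr, D_hr


def reconstruct_hr_patch(lr_feat, D_lr, D_hr, sparsity=5):
    """Sparse-code an LR feature against D_lr, then synthesize the HR patch
    from D_hr with the same coefficients (the HR/LR sparse association)."""
    alpha = orthogonal_mp(D_lr.T, lr_feat, n_nonzero_coefs=sparsity)
    return D_hr.T @ alpha


# Toy usage with random data standing in for real training patches.
rng = np.random.default_rng(0)
lr_feats = rng.standard_normal((500, 36))          # e.g. 6x6 LR feature vectors
hr_patches = rng.standard_normal((500, 81))        # e.g. 9x9 HR patch vectors
D_lr, D_hr = train_joint_dictionary(lr_feats, hr_patches)
hr_patch = reconstruct_hr_patch(lr_feats[0], D_lr, D_hr)
```

Note that splitting a jointly normalized dictionary leaves the LR sub-dictionary atoms without unit norm; a practical implementation would renormalize them or, as in the paper, train with sparse K-SVD directly.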

Get Citation

李民, 程建, 乐翔, 罗环敏. Super-Resolution Based on Sparse Dictionary Coding. Journal of Software, 2012, 23(5): 1315-1324

History
  • Received: June 06, 2010
  • Revised: January 31, 2011
  • Online: April 29, 2012