A Hybrid Algorithm Based on Attention Model

    Abstract:

    A hybrid algorithm based on an attention model (HAAM) is proposed to speed up the training of back-propagation (BP) neural networks and improve their performance. The algorithm combines a genetic algorithm (GA) with a BP algorithm based on a magnified error signal. Its key idea is to partition the BP training process into many chips, each trained by the BP algorithm: chips within the same iteration are optimized by GA operators, and chips across iterations constitute the whole training. These operations give HAAM the ability to search for the globally optimal solution, and they make the algorithm easy to parallelize. Simulation experiments show that the algorithm effectively avoids training failures caused by random initialization of the weights and thresholds, and resolves the slow convergence caused by flat spots, where the error signal becomes too small. Moreover, the algorithm improves the generalization of the BP network by raising training precision rather than adding hidden neurons.
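The chip scheme described above can be illustrated with a minimal sketch: a population of weight vectors is maintained, each individual is refined by a short BP run (one "chip"), and GA selection, crossover, and mutation are then applied across the population. All names (`bp_chip`), the toy XOR task, the network size, and every hyperparameter below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy task: learn XOR with a 2-2-1 sigmoid network, weights flattened into
# a 9-element vector so GA operators can act on whole individuals.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
y = np.array([[0], [1], [1], [0]], float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def unpack(w):
    W1 = w[:4].reshape(2, 2); b1 = w[4:6]
    W2 = w[6:8].reshape(2, 1); b2 = w[8:9]
    return W1, b1, W2, b2

def mse(w):
    W1, b1, W2, b2 = unpack(w)
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    return float(np.mean((out - y) ** 2))

def bp_chip(w, steps=20, lr=2.0):
    """One 'chip': a short run of plain BP (gradient descent)."""
    w = w.copy()
    for _ in range(steps):
        W1, b1, W2, b2 = unpack(w)
        h = sigmoid(X @ W1 + b1)
        out = sigmoid(h @ W2 + b2)
        d_out = (out - y) * out * (1 - out)   # output-layer delta
        d_h = (d_out @ W2.T) * h * (1 - h)    # hidden-layer delta
        grad = np.concatenate([
            (X.T @ d_h).ravel(), d_h.sum(0),
            (h.T @ d_out).ravel(), d_out.sum(0)])
        w -= lr * grad / len(X)
    return w

# GA over a population of weight vectors: each generation, every individual
# is refined by a BP chip, then elitist selection, uniform crossover, and
# sparse Gaussian mutation produce the next population.
pop = rng.normal(0, 1, size=(12, 9))
for gen in range(40):
    pop = np.array([bp_chip(w) for w in pop])   # BP chips (parallelizable)
    fit = np.array([mse(w) for w in pop])
    elite = pop[np.argsort(fit)[:4]]            # selection
    children = []
    while len(children) < len(pop) - len(elite):
        a, b = elite[rng.integers(4)], elite[rng.integers(4)]
        child = np.where(rng.random(9) < 0.5, a, b)          # crossover
        child = child + rng.normal(0, 0.1, 9) * (rng.random(9) < 0.2)
        children.append(child)
    pop = np.vstack([elite, children])

best = min(pop, key=mse)
print(round(mse(best), 4))
```

Because each chip touches only one individual, the inner loop over the population is trivially parallel, which matches the abstract's claim that the scheme is easy to process in parallel.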

Citation:

Yang B, Su XH, Wang YD. A hybrid learning algorithm based on attention model. Journal of Software, 2005,16(6):1073-1080

History
  • Received: November 04, 2003
  • Revised: January 06, 2005
Copyright: Institute of Software, Chinese Academy of Sciences Beijing ICP No. 05046678-4