Research on Stability of Discrete Time Hopfield Network

Authors and affiliations:
Ye Shiwei: Key Laboratory of Intelligent Information Processing, Institute of Computing Technology, Chinese Academy of Sciences, Beijing 100080; School of Information Science and Engineering, Graduate School of the Chinese Academy of Sciences, Beijing 100039
Zheng Hongwei: Department of Mathematics, Sichuan Normal University, Chengdu, Sichuan 610066
Wang Wenjie: School of Information Science and Engineering, Graduate School of the Chinese Academy of Sciences, Beijing 100039
Ma Lin: School of Information Science and Engineering, Graduate School of the Chinese Academy of Sciences, Beijing 100039
Shi Zhongzhi: Key Laboratory of Intelligent Information Processing, Institute of Computing Technology, Chinese Academy of Sciences, Beijing 100080

This paper discusses, for the discrete-time continuous-state Hopfield network model whose neuron activation function is monotone non-decreasing (not necessarily strictly increasing), sufficient conditions for convergence under serial and parallel update modes, as well as a sufficient condition for the network to have a unique global stable point. By defining a new energy function and studying the properties of monotone non-decreasing functions, sufficient conditions for serial and parallel convergence are obtained. By studying when the energy function is convex in the network state variables, the operation of the Hopfield network can be viewed as solving a constrained convex optimization problem, which yields a sufficient condition for the network to have only one global minimum point. When the self-feedback weight of each neuron is greater than the reciprocal of the derivative of that neuron's activation function, the network converges under serial updating. When the minimal eigenvalue of the connection weight matrix is greater than the reciprocal of the derivative of the activation function, the network converges under parallel updating. If the energy function of the network is convex, the network has only one global stable point. These results widen the choice of neuron activation functions when applying Hopfield networks to optimization problems and associative memory.
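As a rough illustration (not the paper's exact formulation), the serial and parallel update modes of a discrete-time, continuous-state Hopfield network can be sketched as follows. The weight matrix `W`, bias `b`, the saturating activation `g` (monotone non-decreasing but not strictly increasing, as in the abstract), and the quadratic energy are all illustrative assumptions:

```python
import numpy as np

def g(u):
    # Monotone non-decreasing (not strictly increasing) activation:
    # piecewise-linear saturation, flat outside [-1, 1].
    return np.clip(u, -1.0, 1.0)

def serial_step(W, b, x):
    # Serial update mode: neurons are updated one at a time, each
    # seeing the already-updated states of the earlier neurons.
    x = x.copy()
    for i in range(len(x)):
        x[i] = g(W[i] @ x + b[i])
    return x

def parallel_step(W, b, x):
    # Parallel update mode: all neurons are updated simultaneously.
    return g(W @ x + b)

def energy(W, b, x):
    # Quadratic part of a Hopfield-style energy (the integral term
    # involving the activation inverse is omitted in this sketch).
    return -0.5 * x @ (W @ x) - b @ x

# Tiny symmetric example. With the spectral norm of W below 1 and a
# 1-Lipschitz activation, both update maps are contractions, so the
# iteration converges to a unique fixed point from any initial state.
W = np.array([[0.10, 0.05],
              [0.05, 0.10]])
b = np.array([0.30, -0.20])
x = np.zeros(2)
for _ in range(200):
    x = parallel_step(W, b, x)
```

For this contractive choice of `W` the iteration settles to the same fixed point regardless of the initial state and of the update mode, which corresponds to the unique-global-stable-point regime the abstract describes; the paper's eigenvalue and self-feedback conditions cover weaker assumptions than contractivity.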
Published in Journal of Software (《软件学报》), sponsored by the Institute of Software, Chinese Academy of Sciences and the China Computer Federation. Editorial office: +86-10-62562563, jos@iscas.ac.cn.