In this paper, the dynamic behavior of continuous neural networks under structural perturbations during the learning process is studied.
This paper investigates the absolute exponential stability of generalized neural networks with a general class of partially Lipschitz continuous and monotonically increasing activation functions.
In this model, the continuous input-output mapping of the system is realized through the nonlinear mapping capability of process neural networks with respect to the time variable.
According to this theory, sparse or partially connected higher-order neural networks can approximate any continuous function, just as fully connected neural networks can.
The RBF neural network is a local approximation network; in theory, given enough neurons, it can approximate any continuous function to arbitrary accuracy.
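A minimal sketch of this approximation property, assuming Gaussian basis functions with hand-picked centers and width (all names and parameter values here are illustrative, not from the original): the output weights of an RBF network are fit by least squares, and the maximum error shrinks as the number of centers grows.

```python
import numpy as np

def rbf_design(x, centers, width):
    # Gaussian radial basis functions evaluated at each input point
    return np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2 * width ** 2))

def rbf_fit(x, y, centers, width):
    # Least-squares fit of the linear output weights
    w, *_ = np.linalg.lstsq(rbf_design(x, centers, width), y, rcond=None)
    return w

# Target: a continuous function on [0, 1]
x = np.linspace(0.0, 1.0, 200)
y = np.sin(2 * np.pi * x)

centers = np.linspace(0.0, 1.0, 20)   # "enough neurons"
width = 0.1
w = rbf_fit(x, y, centers, width)
err = np.max(np.abs(rbf_design(x, centers, width) @ w - y))
```

With 20 centers the fit is already close; adding centers (and shrinking the width accordingly) drives the error toward zero, illustrating the arbitrary-accuracy claim on this example.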
With the best polynomial approximation as a metric, the rate at which single-hidden-layer neural networks approximate a continuous function is estimated using a constructive approach.
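The constructive flavor of such results can be illustrated with a single-hidden-layer ReLU network built explicitly, rather than trained: one hidden unit per knot reproduces the piecewise-linear interpolant of the target function. This is a generic textbook construction, not the specific estimate of the cited paper.

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def build_interp_net(f, knots):
    # Constructively build a single-hidden-layer ReLU network whose output
    # is the piecewise-linear interpolant of f at the given knots.
    y = f(knots)
    slopes = np.diff(y) / np.diff(knots)
    # Output weights encode the slope change at each hinge location
    c = np.concatenate(([slopes[0]], np.diff(slopes)))
    b = knots[:-1]            # hinge positions (hidden-unit biases)
    bias = y[0]
    def net(x):
        return bias + relu(x[:, None] - b[None, :]) @ c
    return net

f = np.cos
knots = np.linspace(0.0, np.pi, 40)
net = build_interp_net(f, knots)

x = np.linspace(0.0, np.pi, 500)
err = np.max(np.abs(net(x) - f(x)))
```

Doubling the number of knots roughly quarters the error for smooth targets, which is the kind of explicit approximation-rate statement such constructive proofs deliver.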
These results can be used to evaluate the error-correction capability of continuous-time associative memory neural networks and in their synthesis procedures.