When the source term is a nonlinear function of the solution, the steady-state error is found to be a monotonically increasing function of the stiffness parameter.
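One way to read this statement is on a standard scalar stiff model problem; the specific setting of the original work is not reproduced here, so the model below is only an illustrative assumption. Take

    y'(t) = -\lambda\, y(t) + q(y(t)), \qquad \lambda \gg 1,

with exact steady state y^* defined by \lambda y^* = q(y^*). If a fixed-step scheme settles to a numerical steady state \tilde{y}(h,\lambda), its steady error is E(\lambda) = |\tilde{y}(h,\lambda) - y^*|, and the claim is that E(\lambda) grows monotonically with the stiffness parameter \lambda whenever q depends nonlinearly on y.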
This paper investigates the absolute exponential stability of generalized neural networks with a general class of partially Lipschitz continuous and monotonically increasing activation functions.
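For context, networks in this line of work are commonly written in the additive (Hopfield-type) form; the generalized model of the paper itself may be broader, so the sketch below only fixes the terminology:

    \dot{x}(t) = -D x(t) + A g(x(t)) + u,

where D = \mathrm{diag}(d_1,\dots,d_n) with d_i > 0, A is the connection weight matrix, u is a constant input, and each activation g_i is monotonically increasing and partially Lipschitz continuous, i.e. for every \rho \in \mathbb{R} there exists l(\rho) > 0 such that |g_i(x) - g_i(\rho)| \le l(\rho)\,|x - \rho| for all x. Absolute exponential stability then means the equilibrium is globally exponentially stable for every activation function in this class and every constant input u.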