According to this theory, sparsely or partially connected higher-order neural networks can approximate any continuous function, just as fully connected networks can.
Conventional back-propagation (BP) networks are fully connected, globalized networks, and they usually have difficulty approximating the ill-behaved (non-smooth) systems that arise in every application field.
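To make the notion of a higher-order unit concrete, here is a minimal sketch (not from the original text; all names are illustrative) of a sigma-pi unit, whose output is a weighted sum over products of selected inputs. A single second-order product term already lets one unit realize XOR exactly, something no first-order (linear) unit can do:

```python
def sigma_pi(x, weights):
    """Higher-order (sigma-pi) unit.

    weights maps tuples of input indices to coefficients; the empty
    tuple () serves as a bias term. Output = sum_k w_k * prod(x[i] for i in idx_k).
    """
    total = 0.0
    for idx, w in weights.items():
        prod = 1.0
        for i in idx:
            prod *= x[i]
        total += w * prod
    return total

# XOR via one second-order term: x1 + x2 - 2*x1*x2
xor_weights = {(0,): 1.0, (1,): 1.0, (0, 1): -2.0}
for a in (0, 1):
    for b in (0, 1):
        print(a, b, sigma_pi([a, b], xor_weights))
```

A partially connected higher-order network keeps only a sparse subset of such product terms rather than all of them, which is the connectivity pattern the approximation result above refers to.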