由于粗神经网络的误差传递函数不可微,所以采用遗传算法来训练粗神经网络。
Because the error transfer function of the rough neural network is not differentiable, a genetic algorithm is applied to train the network.
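The idea in the sentence above can be sketched in code: when the activation is a hard step, the loss is not differentiable and gradient descent does not apply, but a genetic algorithm only needs function values. The toy network, dataset, and GA parameters below are illustrative assumptions, not taken from the source.

```python
import random

# Toy setup (assumed for illustration): one perceptron with a step
# activation, trained on the AND truth table via a simple genetic algorithm.
DATA = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

def predict(w, x):
    # Step activation: non-differentiable, so backpropagation is unusable.
    s = w[0] * x[0] + w[1] * x[1] + w[2]
    return 1 if s > 0 else 0

def error(w):
    # Fitness = number of misclassified patterns (only values, no gradients).
    return sum(predict(w, x) != y for x, y in DATA)

def evolve(pop_size=30, generations=60, seed=0):
    rng = random.Random(seed)
    pop = [[rng.uniform(-1, 1) for _ in range(3)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=error)
        elite = pop[: pop_size // 2]          # selection: keep the fitter half
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            child = [(ai + bi) / 2 for ai, bi in zip(a, b)]   # crossover
            i = rng.randrange(3)
            child[i] += rng.gauss(0, 0.3)                     # mutation
            children.append(child)
        pop = elite + children
    return min(pop, key=error)

best = evolve()
print(error(best))
```

The same loop works for any network whose error can only be evaluated, not differentiated, which is exactly the situation described for rough neural networks.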
极大极小问题是一类不可微优化问题,熵函数法是求解这类问题的一种有效算法。
The minimax problem is a class of non-differentiable optimization problems, and the entropy function method is an effective algorithm for solving such problems.
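A minimal sketch of the entropy function method mentioned above: the non-smooth term max_i f_i(x) is replaced by the smooth surrogate F_p(x) = (1/p) ln Σ_i exp(p f_i(x)), which satisfies max_i f_i(x) ≤ F_p(x) ≤ max_i f_i(x) + ln(m)/p, so ordinary gradient descent becomes applicable. The two quadratics, the value p = 10, and the step size are assumptions for illustration only.

```python
import math

def fs(x):
    # Assumed test problem: min over x of max(f1, f2), optimum 1 at x = 0.
    return [(x - 1.0) ** 2, (x + 1.0) ** 2]

def grads(x):
    return [2.0 * (x - 1.0), 2.0 * (x + 1.0)]

def entropy_grad(x, p):
    # Gradient of F_p(x) = (1/p) * ln(sum_i exp(p * f_i(x))):
    # a softmax-weighted average of the individual gradients.
    vals, gs = fs(x), grads(x)
    m = max(vals)                                  # shift for numerical safety
    ws = [math.exp(p * (v - m)) for v in vals]
    total = sum(ws)
    return sum(w * g for w, g in zip(ws, gs)) / total

def minimize(p=10.0, x=2.0, lr=0.02, steps=500):
    # Plain gradient descent on the smooth surrogate F_p.
    for _ in range(steps):
        x -= lr * entropy_grad(x, p)
    return x

x_star = minimize()
print(round(x_star, 3), round(max(fs(x_star)), 3))
```

Larger p tightens the ln(m)/p approximation gap but also increases the curvature of F_p near the kink, so in practice p is increased gradually.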
由于优化问题可能是不连续的、不可微的甚或是没有函数解析式的,传统经典的无约束优化方法在应用时会受到限制。
Because the optimization problem may be discontinuous, non-differentiable, or even lack an analytic objective function, traditional classical unconstrained optimization methods are limited in their application.
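For such problems, derivative-free methods that compare only function values remain usable. Below is a minimal sketch of a compass (pattern) search on an assumed discontinuous, non-differentiable objective; the test function and parameters are illustrative, not from the source.

```python
def compass_search(f, x, step=1.0, tol=1e-6, max_iter=10000):
    # Derivative-free search: probe each coordinate in both directions,
    # accept any improving move, and shrink the step when none helps.
    fx = f(x)
    for _ in range(max_iter):
        if step < tol:
            break
        improved = False
        for i in range(len(x)):
            for d in (step, -step):
                y = list(x)
                y[i] += d
                fy = f(y)
                if fy < fx:       # only function values are compared
                    x, fx, improved = y, fy, True
                    break
            if improved:
                break
        if not improved:
            step /= 2.0           # refine the pattern
    return x, fx

def f(v):
    # Assumed test objective: non-differentiable (absolute values)
    # and discontinuous (jump at x = 0); minimum 0 at (1, -2).
    x, y = v
    return abs(x - 1.0) + abs(y + 2.0) + (1.0 if x < 0 else 0.0)

x_best, f_best = compass_search(f, [5.0, 5.0])
print([round(c, 3) for c in x_best], round(f_best, 4))  # → [1.0, -2.0] 0.0
```

No gradient, continuity, or closed-form expression is required of f; the objective could equally be a black-box simulation.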