一个既不是局部极大点又不是局部极小点的平稳点称为一个鞍点。
A stationary point which is neither a local maximum nor a local minimum point is called a saddle point.
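The definition in the pair above can be checked concretely. A minimal sketch (the function f(x, y) = x² − y² is my own illustrative choice, not from the source): the origin is a stationary point whose Hessian has eigenvalues of both signs, so it is neither a local maximum nor a local minimum, i.e. a saddle point.

```python
import numpy as np

def f(x, y):
    # Classic saddle surface: increases along x, decreases along y.
    return x**2 - y**2

def grad(x, y):
    return np.array([2.0 * x, -2.0 * y])

def hessian(x, y):
    # Constant Hessian for this quadratic function.
    return np.array([[2.0, 0.0], [0.0, -2.0]])

# The origin is stationary: the gradient vanishes there.
assert np.allclose(grad(0.0, 0.0), [0.0, 0.0])

# Eigenvalues of mixed sign => neither a local max nor a local min.
eigs = np.linalg.eigvalsh(hessian(0.0, 0.0))
print(eigs.min() < 0 < eigs.max())  # True
```

The second-order test used here (indefinite Hessian at a stationary point) is the standard way to recognize a saddle point.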
有效地克服了人工神经网络学习速度慢、存在局部极小点的固有缺陷。
It effectively overcomes the intrinsic defects of artificial neural networks, namely slow learning speed and the existence of local minimum points.
研究了改进人工势场法,发现该方法并不能完全解决局部极小点问题。
A study of the improved artificial potential field method shows that it cannot completely solve the local minimum problem.
后者需要初值足够接近结果,以避免陷入讨厌的局部极小点并节约宝贵的计算时间。
For the latter, the initial value should be sufficiently close to the solution, so as to avoid troublesome local minima and save valuable computation time.
用这种方法可以有效逼近解的局部极小点,但不一定能得到最佳解或最佳的分辨率。
This method can effectively approach a local minimum point of the solution, but it does not necessarily yield the optimal solution or the optimal resolution.
粒子群优化算法应用于多极值点函数优化时,存在陷入局部极小点和搜寻效率低的问题。
When applied to the optimization of multimodal functions, the particle swarm optimization (PSO) algorithm suffers from being trapped in local minima and from low search efficiency.
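The failure mode named in this pair, a search trapped in a local minimum of a multimodal function, can be sketched as follows (the toy function, step size, and starting points are my own assumptions, not from the source): plain gradient descent converges to whichever basin it starts in, so a bad start lands in the shallow local minimum rather than the global one.

```python
def f(x):
    # Toy multimodal function (assumption): two basins,
    # with the deeper (global) minimum near x = -1.
    return (x**2 - 1) ** 2 + 0.3 * x

def df(x):
    return 4 * x * (x**2 - 1) + 0.3

def gradient_descent(x, lr=0.05, steps=500):
    # Fixed-step gradient descent; converges to the
    # stationary point of the basin containing x.
    for _ in range(steps):
        x -= lr * df(x)
    return x

x_local = gradient_descent(0.8)    # starts in the shallow right basin
x_global = gradient_descent(-0.8)  # starts in the deep left basin

# Both endpoints are (near-)stationary, but only one reaches the lower value.
print(f(x_local) > f(x_global))  # True
```

Global methods such as PSO attack exactly this problem by maintaining a population of candidates across basins instead of a single descending point.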
由于目标函数中存在大量局部极小点,在处理合成数据和实际数据时,该方法效果很差。
Because the objective function contains many local minimum points, the method performs poorly when processing both synthetic data and real data.
如果罚问题的极小点是原约束规划问题的极小点,则称此罚问题中的罚函数为精确罚函数。
If each minimum point of the penalty problem is a minimum point of the original constrained programming problem, then the penalty function in that penalty problem is called an exact penalty function.
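The definition in this pair is commonly instantiated by the $\ell_1$ penalty; as an illustration in the standard textbook form (not taken from the source), for the constrained problem

```latex
\min_{x} \; f(x) \quad \text{s.t.} \quad g_i(x) \le 0, \; i = 1, \dots, m,
```

the penalty problem is

```latex
\min_{x} \; P_\rho(x) = f(x) + \rho \sum_{i=1}^{m} \max\bigl(0, \, g_i(x)\bigr),
```

and $P_\rho$ is an exact penalty function when, for all sufficiently large but finite $\rho$, every minimum point of $P_\rho$ is a minimum point of the original constrained problem (unlike the quadratic penalty, which is exact only in the limit $\rho \to \infty$).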
通过设定逃逸系数,算法在寻优过程中具有了能够跳出局部极小点到达全局最优点的能力。
By setting an escape coefficient, the algorithm gains the ability, during the optimization process, to jump out of local minimum points and reach the global optimum.
第三章给出一类新的拉伸函数,其从原函数的一个局部极小点出发应用拉伸函数法修正目标函数。
Chapter 3 presents a new class of stretching functions; starting from a local minimum point of the original function, the stretching function technique is applied to modify the objective function.
建模实践表明,改进后的BP算法可能使网络误差函数达到局部极小点,提高了算法的拟合精度。
Modeling practice shows that the improved BP algorithm can drive the network error function to a local minimum point, improving the fitting accuracy of the algorithm.
本文首先对箱子约束全局最优化问题提出了一个新的辅助函数,该函数能跳出当前的局部极小点。
In this paper, a new one-parameter auxiliary function is first proposed for box-constrained global optimization problems; it is able to escape from the current local minimum point.
目的对BP学习算法中存在的大量局部极小点以及收敛速度慢问题进行研究并提出相应的改进方案。
Aim: To study the problems of numerous local minimum points and slow convergence in the BP learning algorithm, and to propose corresponding improvement schemes.
于是,这些传统方法常常被模型选择与过学习问题、非线性和维数灾难问题、局部极小点问题等困扰。
Consequently, these traditional methods are often plagued by problems such as model selection and over-fitting, nonlinearity and the curse of dimensionality, and local minimum points.
由于该方法易陷入局部最优解,提出了一种基于混合遗传算法求解LOO上界极小点的核参数选择方法。
Since this method easily falls into local optima, a kernel parameter selection method based on a hybrid genetic algorithm is proposed to find the minimum point of the LOO upper bound.
作为一种新的机器学习方法,SV M能较好地解决小样本、非线性、高维数和局部极小点等实际问题。
As a new machine learning method, SVM can effectively solve practical problems such as small samples, nonlinearity, high dimensionality, and local minima.
在软测量建模过程中,基于支持向量机的算法能较好地解决小样本、非线性、高维数、局部极小点等问题。
In soft-sensor modeling, algorithms based on support vector machines can effectively handle problems such as small samples, nonlinearity, high dimensionality, and local minimum points.
针对前向神经网络BP算法由于初始权值选择不当而陷入局部极小点这一缺陷,提出新的全局优化训练算法。
To address the defect that the BP algorithm for feedforward neural networks falls into local minimum points when the initial weights are chosen improperly, a new global optimization training algorithm is proposed.
支持向量机方法较好地解决了许多学习方法面临的小样本、非线性和局部极小点等问题,具有很好的应用前景。
SVM effectively solves practical problems such as small samples, nonlinearity, and local minima that confront most learning methods, and has promising application prospects.
实验表明,该算法不仅能明显提高网络的学习速度,而且可较好地避免学习过程陷入局部极小点而导致学习失败。
Experiments show that the algorithm not only markedly speeds up network learning, but also largely avoids learning failures caused by the learning process falling into local minimum points.
分析了BP算法的基本原理,指出了BP算法具有收敛速度慢、易陷入局部极小点等缺陷以及这些缺陷产生的根源。
The basic principle of the BP algorithm is analyzed; its defects, such as slow convergence and a tendency to fall into local minimum points, are pointed out, together with the root causes of these defects.
通过实验研究,体现了RBF神经网络作为一种局部全连接网络,训练速度快,克服了BP网络的局部极小点问题。
Experimental study shows that, as a locally fully connected network, the RBF neural network can be trained very quickly and overcomes the local minimum problem of BP networks.
修正算法的中心是引入一个反射单纯形的算法步骤,以克服普通算法在某些情况下会造成的一种不收敛到极小点的困难。
The core of the modified algorithm is the introduction of a simplex-reflection step, which overcomes the difficulty that the ordinary algorithm may, in certain cases, fail to converge to a minimum point.
CHNN的设计应保证,无论从何种初值出发,网络总能通过运行收敛到一个稳定状态,它相应于能量函数的某个局部极小点。
The design of a CHNN should ensure that, regardless of the initial values, the network always converges to a steady state, which corresponds to a local minimum point of the energy function.
经验证(PSO)优化算法可以有效地克服BP神经网络存在的学习效率低,收敛速度慢以及容易陷入局部极小点等固有缺点。
It is confirmed that PSO can effectively overcome the intrinsic shortcomings of BP neural networks, including low learning efficiency, slow convergence, and a tendency to fall into local minimum points.
本研究通过引入单位风险的概念,并结合有效边界上的单位风险极小点,给出了各种情况下有风险投资和无风险投资的最优组合方案。
By introducing the concept of unit risk and using the unit-risk minimum point on the efficient frontier, this study gives the optimal combination of risky and risk-free investments under various circumstances.
这种方法对高斯噪声和星座图由于信号初始相位而引入的旋转具有良好的稳健性,并避免了神经网络中的过学习和局部极小点等缺陷。
This method is robust to Gaussian noise and to constellation rotation caused by the initial phase of the signal, and it avoids defects of neural networks such as over-fitting and local minima.
将混沌机制引入常规BP算法,利用混沌机制固有的全局游动,逃出权值优化过程中存在的局部极小点,解决了网络训练易陷入局部极小点的问题。
A chaotic mechanism is introduced into the conventional BP algorithm; its inherent global wandering is used to escape local minimum points during weight optimization, solving the problem that network training easily falls into local minima.
新算法选择很广一类的隐层神经元函数,可以直接求得全局最小点,不存在BP算法的局部极小、收敛速度慢等问题。
The new algorithm admits a very broad class of hidden-neuron functions and can obtain the global minimum directly, free of the local minima and slow convergence that afflict the BP algorithm.