• The gradient descent algorithm and the chaos optimization algorithm each have their own shortcomings for optimization problems.

  • Because the BP neural network is in essence a gradient descent method, it easily falls into local optima.
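The local-optimum trap mentioned above can be illustrated with a minimal sketch; the nonconvex function, starting points, and step size below are illustrative assumptions, not from the source:

```python
# Minimal sketch: plain gradient descent on a nonconvex function
# f(x) = x^4 - 3x^2 + x, which has two minima of different depths.
# Which minimum the method reaches depends entirely on the start point.

def f(x):
    return x**4 - 3 * x**2 + x

def grad(x):
    # derivative of f
    return 4 * x**3 - 6 * x + 1

def descend(x, lr=0.01, steps=2000):
    for _ in range(steps):
        x -= lr * grad(x)
    return x

x_local = descend(1.5)    # right-hand basin: shallower, local minimum
x_global = descend(-1.5)  # left-hand basin: deeper, global minimum
# The two runs stop in different minima; descent alone cannot escape.
```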

  • The model was a five-layer feedforward fuzzy neural network, with gradient descent adopted as the learning algorithm.

  • The second algorithm then calculates the optimal class centres of NLDA by the gradient descent method.

  • Two methods can be used to design the network: one is the direct energy descent method, and the other is the gradient descent method.

  • The learning of the weight matrix can then be reduced, by gradient descent, to solving a group of systems of linear equations; finally, the mathematical basis of the outer-product learning method is pointed out.

  • First, an off-line-trained sliding mode controller is studied; then an on-line learning algorithm using the gradient descent method is designed.

  • The three optimization problems above are solved by the conjugate gradient method, adaptive bivariate shrinking, and the gradient descent method, respectively.

  • The essence of back-propagation networks is that gradient descent always changes the weights in the direction of decreasing error, finally attaining the minimum error.
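The error-reducing weight update described above can be sketched minimally; the toy data, single-weight model, and learning rate are illustrative assumptions:

```python
# Minimal sketch of error-reducing weight updates by gradient descent:
# fit a single weight w in y = w * x to data generated with w_true = 2,
# minimizing the squared error E(w) = sum((w*x - y)^2).

xs = [1.0, 2.0, 3.0]
ys = [2.0, 4.0, 6.0]  # generated with w_true = 2

w = 0.0
lr = 0.01
for _ in range(500):
    # dE/dw = sum(2 * (w*x - y) * x)
    g = sum(2 * (w * x - y) * x for x, y in zip(xs, ys))
    w -= lr * g  # step opposite the gradient, so the error decreases

# w converges toward 2.0, the minimum-error weight
```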

  • This paper studies the BP network, implements the gradient descent training method, and obtains better results than the traditional approach.

  • Some or all of the parameters of the controller are iteratively adjusted by the gradient descent algorithm to minimize the error between the computed and expected outputs, establishing an iterative model.

  • In the wavelet neural network (WNN), the steepest gradient descent method is adopted to optimize the network parameters, and the learning rate is adjusted automatically by a self-adaptive learning-rate method.
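One common self-adaptive learning-rate scheme of the kind mentioned (grow the rate while the error keeps falling, cut it back when a step overshoots) can be sketched as follows; the quadratic objective and the adjustment factors are illustrative assumptions:

```python
# Sketch of gradient descent with a self-adaptive learning rate:
# accepted steps speed the rate up; rejected (overshooting) steps
# halve it, keeping the iteration stable.

def loss(w):
    return (w - 3.0) ** 2

def grad(w):
    return 2.0 * (w - 3.0)

w, lr = 0.0, 0.1
prev = loss(w)
for _ in range(100):
    step = w - lr * grad(w)
    if loss(step) < prev:       # step helped: accept it and speed up
        w, prev = step, loss(step)
        lr *= 1.05
    else:                       # step overshot: reject it and slow down
        lr *= 0.5

# w approaches the minimizer 3.0
```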

  • The RBF neural network adopts off-line training and on-line adaptation of the weights and thresholds; to speed up convergence, a gradient descent method with an inertia term is used.
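A gradient descent update with an inertia (momentum) term of the kind mentioned keeps a velocity that accumulates past gradients; the quadratic objective and the coefficients below are illustrative assumptions:

```python
# Sketch of gradient descent with an inertia (momentum) term:
# v accumulates past gradients, smoothing the trajectory and
# speeding up progress along consistently downhill directions.

def grad(w):
    return 2.0 * (w - 5.0)      # gradient of (w - 5)^2

w, v = 0.0, 0.0
lr, beta = 0.05, 0.9            # learning rate, inertia coefficient
for _ in range(300):
    v = beta * v - lr * grad(w)  # inertia term carries past updates
    w = w + v

# w converges to the minimizer 5.0
```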

  • Then the controller parameters are tuned by a hybrid learning algorithm combining gradient descent (GD) with least-squares estimation (LSE) so as to attain better control precision.

  • Operating on two images sampled a fixed interval apart, the algorithm uses the steepest descent method to iterate directly to an estimate of the displacement between the two frames.

  • The network weights are trained with the gradient descent method, and limited-memory recursive formulas for BVS growth and network training are derived.

  • For parameter learning, a new differentiable empirical risk function suited to classification is proposed, which can be efficiently minimized by gradient descent.

  • The characteristics of the BP neural network and the chaos optimization method are analyzed; by combining chaos optimization with gradient descent, a new combined-search optimization method is constructed.

  • By analyzing functional networks, a serial functional network model and its learning algorithm are proposed, in which the functional parameters of the network are learned by the gradient descent method.

  • An adaptive gradient descent algorithm for training simplified internally recurrent networks (SIRN) is studied, and a new method of reconciling nonlinear dynamic data based on SIRN is proposed.

  • Considering that neural networks easily fall into local minima when trained on large sample sets, gradient descent is combined with chaotic optimization so that the network trains rapidly while avoiding local minima.

  • By organically combining the steepest descent method with the conjugate gradient method, a hybrid optimization algorithm is constructed and its global convergence is proved.
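The conjugate gradient ingredient of such a hybrid can be sketched on a small quadratic (the 2×2 matrix and right-hand side below are illustrative assumptions); unlike steepest descent, conjugate gradient reaches the exact minimizer of an n-dimensional quadratic in at most n steps:

```python
# Sketch: conjugate gradient on f(x) = 0.5 * x^T A x - b^T x
# with a 2x2 symmetric positive definite A. Two steps suffice,
# whereas steepest descent would only converge asymptotically.

def mat_vec(A, x):
    return [A[0][0]*x[0] + A[0][1]*x[1], A[1][0]*x[0] + A[1][1]*x[1]]

def dot(u, v):
    return u[0]*v[0] + u[1]*v[1]

A = [[4.0, 1.0], [1.0, 3.0]]    # symmetric positive definite
b = [1.0, 2.0]

x = [0.0, 0.0]
r = [b[0], b[1]]                # residual r = b - A x
p = [r[0], r[1]]                # first search direction = residual
for _ in range(2):              # n = 2 steps for a 2-D quadratic
    Ap = mat_vec(A, p)
    alpha = dot(r, r) / dot(p, Ap)        # exact line search along p
    x = [x[0] + alpha*p[0], x[1] + alpha*p[1]]
    r_new = [r[0] - alpha*Ap[0], r[1] - alpha*Ap[1]]
    beta = dot(r_new, r_new) / dot(r, r)  # Fletcher-Reeves coefficient
    p = [r_new[0] + beta*p[0], r_new[1] + beta*p[1]]
    r = r_new

# x now solves A x = b (up to rounding), i.e. x = [1/11, 7/11]
```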

  • Using the gradient descent method, a curve evolution equation minimizing the energy function is derived and applied to image segmentation.

  • Based on the mathematical theory of parameter inverse problems for ordinary differential equations, the orthogonalization method is combined with finite differences to determine water quality model parameters and compared with the regularization, steepest descent, and conjugate gradient methods; the comparison shows that the orthogonalization method is fast, simple, and reliable.

