The essence of back-propagation networks is that they use gradient descent so that each weight update moves in the direction of decreasing error, until the error finally reaches its minimum.
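A minimal sketch of this idea in Python, assuming a toy two-layer sigmoid network on XOR-style data; the layer sizes, learning rate, and dataset are illustrative choices, not from the original text:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy data (assumption for illustration): 2 inputs -> 1 output, XOR pattern
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# One hidden layer with 4 units (hypothetical sizes)
W1 = rng.normal(scale=0.5, size=(2, 4))
W2 = rng.normal(scale=0.5, size=(4, 1))
lr = 0.5  # learning rate

for step in range(5000):
    # Forward pass
    h = sigmoid(X @ W1)    # hidden activations
    out = sigmoid(h @ W2)  # network output

    # Error and mean-squared loss
    err = out - y
    loss = float(np.mean(err ** 2))

    # Backward pass: propagate the error gradient through each layer
    # (constant factors from the mean are absorbed into the learning rate)
    d_out = err * out * (1 - out)       # gradient at the output pre-activation
    d_h = (d_out @ W2.T) * h * (1 - h)  # gradient at the hidden pre-activation

    # Gradient-descent update: move each weight against its gradient,
    # so the error shrinks step by step
    W2 -= lr * (h.T @ d_out)
    W1 -= lr * (X.T @ d_h)

print(f"final loss: {loss:.5f}")
print("predictions:", out.round(3).ravel())
```

Each update `W -= lr * grad` is exactly the "change weights in the direction of decreasing error" described above; repeating it drives the loss toward a minimum.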