The reduced training set is used to construct this kind of learning machine.
With this method, the sample set is effectively augmented to obtain a new, randomly expanded training set.
Clustering does not need a training set, but its accuracy is lower.
The reliability of the sample-size estimation could possibly be improved by using a training set with feedback.
The experimental data obtained were divided into a training set and a test set.
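As an aside, a minimal sketch of such a split with scikit-learn, assuming a synthetic placeholder data set (X, y) rather than the data from the original experiment:

# Minimal sketch: divide data into a training set and a test set.
# X and y are synthetic placeholders; the 80/20 ratio and random_state
# are illustrative choices, not taken from the original experiment.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=200, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0, stratify=y
)
print(len(X_train), "training samples,", len(X_test), "test samples")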
We've already talked a bit about the fact that algorithms may over-fit the training set.
The training set will be memorized, making the network useless on new data sets.
When training a web-page classifier, the classifier's performance may be improved by effectively filtering the samples in the training set.
These inputs, often called the "training set", are the examples from which the agent tries to learn.
Ensure that "Use training set" is selected so that we use the data set we just loaded to create our model.
The class information of the training documents is used to build the text-classification model and to extract the features that contribute most to classification.
Choose the file bmw-test.arff, which contains 1,500 records that were not in the training set we used to create the model.
The neural net uses these to modify its weights, aiming to match its classifications with the targets in the training set.
The trials of the orthogonal design were chosen as the training set of the BP-ANN, which was then used to forecast the trials over the entire experimental domain and to compare the predictions with the measured data.
If, for instance, you only have 20 samples, there's not much data to use for a training set while still leaving a significant test set.
In selecting feature vectors, the algorithm also uses the class labels of the training set and the discriminant information of the class mean vectors.
Moreover, as class density estimates cannot be derived for such a training set, class posterior probabilities cannot be estimated.
In 1960 a spirited animal lover with no scientific training set up camp in Tanganyika's Gombe Stream Game Reserve to observe chimpanzees.
Using the weather and sports training set and rule set, a small intelligent call-center prototype with a fuzzy expert sub-system module is realized.
To address the slow learning and classification speeds of SVMs on large training sets, a new iterative SVM training algorithm is proposed.
Given a test face image, the expressions in the training set can be "translated" to the input image using the factorization model, and vice versa.
Nevertheless, if you're in a bind for data, this can yield passable results with lower variance than simply using one test set and one training set.
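The approach being compared here appears to be a resampling scheme such as k-fold cross-validation; a minimal sketch with scikit-learn, assuming a tiny synthetic data set and an arbitrary classifier (neither comes from the original text):

# Minimal sketch: with only ~20 samples, k-fold cross-validation reuses
# every sample for both training and testing, giving a lower-variance
# accuracy estimate than a single train/test split. Data set and model
# are illustrative assumptions.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=20, n_features=5, random_state=0)
scores = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=5)
print("fold accuracies:", scores, "mean accuracy:", scores.mean())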
The decision-tree algorithm mines the classification knowledge in the training set by building a high-precision, small-scale decision tree.
We select training sets and test sets from the test history data of many different software releases and discuss the relation between the choice of training set and prediction accuracy.
The algorithm is quite simple and, under certain conditions, can fit all the input-output pairs in the training set to any given accuracy.
Based on the equal standing of the original training set and the newly added training set in incremental training, a new algorithm for SVM-based incremental learning is proposed.
First, the samples in the training set are roughly classified by an ART network to reduce the size of the training set, and then the resulting small training sets are trained by multiple BP networks in parallel.