Based on classical statistical learning theory, this paper discusses the bounds on the rate of uniform convergence of learning processes on possibility space.
Finally, the key theorem of statistical learning theory based on doubly random samples is proved, and the bounds on the rate of uniform convergence of the learning process are discussed.
In this paper, a censored nonparametric estimator of the failure rate is constructed from censored data, and its locally uniform rates of convergence in mean square and of strong consistency are given.