Entropy is a measure of the information content of a signal sequence. Based on the concept of entropy, the authors propose two new features, Shannon entropy and threshold entropy, for flow regime identification from differential pressure signals.
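To make the Shannon-entropy feature concrete, here is a minimal sketch (not the paper's implementation; the histogram discretization, bin count, and log base are assumptions) that estimates the Shannon entropy of a 1-D signal:

```python
import numpy as np

def shannon_entropy(signal, bins=16):
    """Estimate Shannon entropy (in bits) of a 1-D signal via a histogram."""
    counts, _ = np.histogram(signal, bins=bins)
    p = counts / counts.sum()          # empirical bin probabilities
    p = p[p > 0]                       # drop empty bins: 0 * log 0 -> 0
    return -np.sum(p * np.log2(p))

# Usage: a noisier signal spreads over more bins and yields higher entropy.
rng = np.random.default_rng(0)
print(shannon_entropy(rng.normal(size=1000)))
```

With 16 bins the value is bounded above by log2(16) = 4 bits, and a constant signal gives 0.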
This paper derives a principle of complementarity between entropy and information from the concepts of L. Boltzmann's entropy and C. Shannon's entropy.
Mutual information computed with the partial volume interpolation method and Shannon entropy inevitably exhibits local extrema, which can lead to inaccurate registration.
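To illustrate the quantity being optimized in such registration, here is a histogram-based mutual-information estimate (a simplified stand-in, not the partial volume interpolation method itself; the bin count and test data are assumptions):

```python
import numpy as np

def mutual_information(a, b, bins=32):
    """Estimate mutual information I(A;B) in bits from a joint histogram."""
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = joint / joint.sum()                  # joint probabilities
    px = pxy.sum(axis=1, keepdims=True)        # marginal of A
    py = pxy.sum(axis=0, keepdims=True)        # marginal of B
    mask = pxy > 0                             # skip empty cells
    return np.sum(pxy[mask] * np.log2(pxy[mask] / (px @ py)[mask]))
```

Because the estimate is piecewise over histogram bins, it is not smooth in the registration parameters, which is one source of the local extrema mentioned above.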