The Shannon entropy of a message is the number of binary digits, or bits, needed to encode it.
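The sentence above can be illustrated with a minimal sketch: the function below estimates the Shannon entropy of a message in bits per symbol from its empirical symbol frequencies (the function name and the sample message are illustrative choices, not from the source).

```python
import math
from collections import Counter

def shannon_entropy(message: str) -> float:
    """Shannon entropy in bits per symbol, H = -sum(p * log2(p)),
    computed from the empirical frequency of each symbol."""
    counts = Counter(message)
    n = len(message)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Two equiprobable symbols need 1 bit per symbol on average.
print(shannon_entropy("aabb"))
# A message of a single repeated symbol carries no information.
print(shannon_entropy("aaaa"))
```

Multiplying the per-symbol entropy by the message length gives a lower bound on the total number of bits an optimal code needs for that message.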
The various meanings of the concept "information" are discussed, and a general formulation of the maximum information (entropy) principle is applied.
The exactness with which a number is specified; the number of significant digits with which a number is expressed.