entropy
/ ˈentrəpi /
  • n. [thermodynamics] entropy (a thermodynamic function)
  • Web definitions
  • 1

    [thermodynamics] entropy

    ... Viewing an enterprise as a system through the rule that things evolve from the simple, singular, and uniform toward the complex, diverse, and non-uniform, you will find that any enterprise's entropy increases over time. Entropy refers to the degree of disorder of a system: it expresses how uniformly any form of energy is distributed in space, and the more uniformly the energy is distributed, the greater the entropy.

  • 2

    information entropy

    Information entropy (Entropy) is an effective way to measure a user's expertise: the more expert a user's answers, the lower that user's entropy value and the greater the user's focus on the professional domain.

  • 3

    [automation] average information content

    ... entrance age: age of school entry; entropy: average information content; entry: tabulated value ...

  • 4

    function

    ... Engineering Technology: engineering technology; Entropy: function; Environmental: environment ...
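The "information entropy" and "average information content" senses above both refer to Shannon entropy, the expected information (in bits) carried by one draw from a discrete distribution. A minimal sketch in Python; the function name and sample distributions are illustrative, not taken from the entry:

```python
# Shannon entropy H(X) = -sum(p * log2(p)): the average information
# content of a discrete distribution, measured in bits.
from math import log2

def shannon_entropy(probs):
    """Entropy in bits of a discrete probability distribution."""
    return -sum(p * log2(p) for p in probs if p > 0)

# A fair coin carries 1 bit per toss; a biased coin carries less,
# matching the idea above that more predictable output has lower entropy.
print(shannon_entropy([0.5, 0.5]))   # 1.0
print(shannon_entropy([0.9, 0.1]))   # ≈ 0.469
```

A uniform distribution maximizes this quantity, which is why entropy doubles as a measure of uncertainty or disorder in the information-theoretic senses listed above.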

  • Bilingual example sentences
  • 1
    If we calculate the change in entropy during gas mixing, we will find that this quantity is positive.
    如果我们计算气体混合过程中熵的变化,我们会发现这个量是正的。
  • 2
    The present study investigates the application of sample entropy (SampEn) measures.
    现在的研究采用了复杂性分析中的样品熵算法。
  • 3
    A score function for optimization based on maximum mutual information entropy with an additional restriction is proposed.
    提出了基于最大互信息熵且具有奇数约束的优化得分函数。
  • Encyclopedia
  • Entropy

    In thermodynamics, entropy (usual symbol S) is a measure of the number of specific ways in which a thermodynamic system may be arranged, commonly understood as a measure of disorder. According to the second law of thermodynamics, the entropy of an isolated system never decreases; such a system will spontaneously evolve toward thermodynamic equilibrium, the configuration with maximum entropy. Systems that are not isolated may decrease in entropy, provided they increase the entropy of their environment by at least that same amount. Since entropy is a state function, the change in the entropy of a system is the same for any process that goes from a given initial state to a given final state, whether the process is reversible or irreversible. However, irreversible processes increase the combined entropy of the system and its environment.

    The change in entropy (ΔS) of a system was originally defined for a thermodynamically reversible process as ΔS = ∫ dQ_rev / T, where T is the absolute temperature of the system, dividing an incremental reversible transfer of heat into that system (dQ). (If heat is transferred out, the sign is reversed, giving a decrease in the entropy of the system.) This is sometimes called the macroscopic definition of entropy because it can be used without regard to any microscopic description of the contents of a system. The concept of entropy has been found to be generally useful and has several other formulations. Entropy was discovered when it was noticed to be a quantity that behaves as a function of state, as a consequence of the second law of thermodynamics.

    Entropy is an extensive property. It has the dimension of energy divided by temperature, with the unit joules per kelvin (J K⁻¹) in the International System of Units (or kg m² s⁻² K⁻¹ in base units). But the entropy of a pure substance is usually given as an intensive property: either entropy per unit mass (SI unit: J K⁻¹ kg⁻¹) or entropy per unit amount of substance (SI unit: J K⁻¹ mol⁻¹). The absolute entropy (S rather than ΔS) was defined later, using either statistical mechanics or the third law of thermodynamics.

    In the modern microscopic interpretation of entropy in statistical mechanics, entropy is the amount of additional information needed to specify the exact physical state of a system, given its thermodynamic specification. Understanding the role of thermodynamic entropy in various processes requires an understanding of how and why that information changes as the system evolves from its initial to its final condition. It is often said that entropy is an expression of the disorder or randomness of a system, or of our lack of information about it. The second law is now often seen as an expression of the fundamental postulate of statistical mechanics through the modern definition of entropy.
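The macroscopic definition above can be illustrated numerically. This sketch (values and function names are assumptions for illustration, not from the entry) evaluates the Clausius form for an isothermal reversible heat transfer, ΔS = Q_rev / T, and the ideal entropy of mixing, the quantity that the gas-mixing example sentence above notes is always positive:

```python
# Two textbook entropy calculations (illustrative values):
#   1) ΔS = Q_rev / T for heat transferred reversibly at constant T
#   2) ΔS_mix = -R * Σ n_i ln(x_i) for ideal gases mixed at the same T and p
from math import log

R = 8.314  # gas constant, J K^-1 mol^-1

def delta_s_isothermal(q_rev_joules, temp_kelvin):
    """Entropy change for a reversible isothermal heat transfer."""
    return q_rev_joules / temp_kelvin

def delta_s_mixing(moles_by_species):
    """Ideal entropy of mixing; x_i = n_i / n_total are mole fractions."""
    n_total = sum(moles_by_species)
    return -R * sum(n * log(n / n_total) for n in moles_by_species)

# 1000 J absorbed reversibly at 300 K raises the entropy by Q/T.
print(delta_s_isothermal(1000.0, 300.0))  # ≈ 3.33 J/K

# Mixing 1 mol each of two ideal gases: every x_i < 1, so ΔS_mix > 0,
# consistent with the example sentence that the change is positive.
print(delta_s_mixing([1.0, 1.0]))  # ≈ 11.53 J/K
```

Because each mole fraction x_i is less than 1, every ln(x_i) term is negative and ΔS_mix is necessarily positive, which is the arithmetic behind the claim in the bilingual example.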
