Noun entropy has 2 senses
  1. information, selective information, entropy - (communication theory) a numerical measure of the uncertainty of an outcome; "the signal contained thousands of bits of information"
    --1 is a kind of information measure
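Sense 1, the information-theoretic one, is measured in bits, as the quotation suggests. As an illustration (not part of the dictionary entry itself), a minimal sketch of Shannon's formula H = -Σ p·log₂(p) for a discrete distribution; the function name is ours:

```python
import math

def shannon_entropy(probs):
    """Entropy in bits of a discrete probability distribution.

    Terms with zero probability contribute nothing and are skipped.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin toss carries exactly 1 bit of uncertainty.
print(shannon_entropy([0.5, 0.5]))  # → 1.0
```

A biased coin yields less than one bit, and a certain outcome yields zero, matching the sense of entropy as "a numerical measure of the uncertainty of an outcome".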
  2. randomness, entropy, S - (thermodynamics) a thermodynamic quantity representing the amount of energy in a system that is no longer available for doing mechanical work; "entropy increases as matter and energy in the universe degrade to an ultimate state of inert uniformity"
    --2 is a kind of physical property
    --2 has particulars: conformational entropy
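Sense 2, the thermodynamic one, is conventionally linked to microscopic disorder by Boltzmann's relation (an aside, not part of the entry):

```latex
S = k_B \ln W
```

where $k_B$ is Boltzmann's constant and $W$ is the number of microstates consistent with the macroscopic state; more microstates means higher entropy, in keeping with the drift toward "inert uniformity" in the quotation.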