entropy
== English ==
=== Etymology ===
First attested in 1867, as the translation of German Entropie, coined in 1865 by Rudolf Clausius in analogy to Energie (“energy”), replacing the root of Ancient Greek ἔργον (érgon, “work”) by Ancient Greek τροπή (tropḗ, “transformation”).
=== Pronunciation ===
IPA(key): /ˈɛntɹəpi/
=== Noun ===
entropy (countable and uncountable, plural entropies)
A measure of the disorder present in a system.
Alternative form: S (symbol)
(Boltzmann definition) A measure of the disorder directly proportional to the natural logarithm of the number of microstates yielding an equivalent thermodynamic macrostate.
(information theory) Shannon entropy
(thermodynamics, countable) A measure of the amount of energy in a physical system that cannot be used to do work.
The capacity factor for thermal energy that is hidden with respect to temperature.
The dispersal of energy; how much energy is spread out in a process, or how widely spread out it becomes, at a specific temperature.
(statistics, information theory, countable) A measure of the amount of information and noise present in a signal.
(uncountable) The tendency of a system that is left to itself to descend into chaos.
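The information-theoretic sense above (Shannon entropy) can be illustrated with a minimal sketch. The function name `shannon_entropy` is illustrative, not part of any standard library; it computes H = −Σ p·log₂(p) over a probability distribution, measured in bits.

```python
import math

def shannon_entropy(probabilities):
    # H = -sum(p * log2(p)), skipping zero-probability outcomes
    # (lim p→0 of p*log2(p) is 0, so they contribute nothing).
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin is maximally uncertain: one bit of entropy.
print(shannon_entropy([0.5, 0.5]))  # → 1.0

# A biased coin carries less information per toss.
print(shannon_entropy([0.9, 0.1]))
```

A certain outcome (probability 1) yields zero entropy, matching the intuition that a signal with no uncertainty carries no information.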
==== Synonyms ====
anergy
bound entropy
disgregation
==== Antonyms ====
aggregation
exergy
free entropy
negentropy
==== Derived terms ====
==== Translations ====
==== See also ====
chaos
=== References ===
=== Further reading ===
“entropy”, in Webster’s Revised Unabridged Dictionary, Springfield, Mass.: G. & C. Merriam, 1913, →OCLC.
William Dwight Whitney, Benjamin E[li] Smith, editors (1911), “entropy”, in The Century Dictionary […], New York, N.Y.: The Century Co., →OCLC.
“entropy”, in OneLook Dictionary Search.
=== Anagrams ===
Poynter, peryton