entropy
Meaning
- (countable, uncountable) A measure of the disorder present in a system.
- (countable) A measure of the amount of energy in a physical system that cannot be used to do work.
- (countable, uncountable) The capacity factor for thermal energy that is hidden with respect to temperature.
- (countable, uncountable) The dispersal of energy; how much energy is spread out in a process, or how widely spread out it becomes, at a specific temperature.
- (countable) A measure of the amount of information and noise present in a signal (see the sketch after this list).
- (uncountable) The tendency of a system that is left to itself to descend into chaos.
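The physics and information-theory senses above correspond to standard formulas: Clausius defined the thermodynamic entropy change of a reversible process as dS = δQ_rev / T, and Shannon defined the entropy of a discrete signal as H = -Σ p_i log2(p_i). Below is a minimal sketch of the Shannon form; the function name shannon_entropy and the toy signals are illustrative and not part of this entry.

```python
import math
from collections import Counter

def shannon_entropy(signal):
    """Return the Shannon entropy, in bits per symbol, of a sequence.

    Uses the empirical symbol frequencies as probabilities:
    H = -sum(p * log2(p)) over each distinct symbol's probability p.
    """
    counts = Counter(signal)
    total = len(signal)
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

# A uniform four-symbol signal carries the maximum log2(4) = 2 bits
# per symbol; a constant signal carries none.
print(shannon_entropy("abcd" * 10))  # 2.0
print(shannon_entropy("aaaa" * 10))  # -0.0 (i.e. zero)
```

A uniformly distributed alphabet maximizes entropy, while a perfectly predictable signal has none, which ties the information-theoretic sense back to the "disorder" sense above.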
Synonyms
selective information
mean amount of information
average information
average information content
Pronunciation (IPA)
/ˈɛntɹəpi/
Etymology
First attested in 1867, as a translation of German Entropie, coined in 1865 by Rudolf Clausius in analogy to Energie (“energy”), replacing the root of Ancient Greek ἔργον (érgon, “work”) with Ancient Greek τροπή (tropḗ, “transformation”).