entropy

Meaning (English)

  1. (countable, uncountable) A measure of the disorder present in a system.
  2. (countable) A measure of the amount of energy in a physical system that cannot be used to do work.
  3. (countable, uncountable) The capacity factor for thermal energy that is hidden with respect to temperature.
  4. (countable, uncountable) The dispersal of energy; how much energy is spread out in a process, or how widely spread out it becomes, at a specific temperature.
  5. (countable) A measure of the amount of information and noise present in a signal.
  6. (uncountable) The tendency of a system that is left to itself to descend into chaos.
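The information-theoretic sense (a measure of the information content of a signal) has a standard formula, Shannon entropy. As an illustrative sketch (the function name and example are my own, not from this entry), the average information content of a distribution in bits is:

```python
import math

def shannon_entropy(probabilities):
    """Average information content, in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin toss carries exactly 1 bit of entropy.
print(shannon_entropy([0.5, 0.5]))  # 1.0

# A certain outcome carries no information.
print(shannon_entropy([1.0]))  # 0.0
```

This is the quantity the synonyms below ("mean amount of information", "average information content") refer to.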

Synonyms

selective information

mean amount of information

average information

average information content

Translations

εντροπία

entropia

entropi

إنتروبيا

Maß für den Informationsgehalt

mittlerer Informationsgehalt

Frequency

31k
Pronounced as (IPA)
/ˈɛntɹəpi/
Etymology (English)

First attested in 1867, as the translation of German Entropie, coined in 1865 by Rudolf Clausius in analogy to Energie (“energy”), replacing the root of Ancient Greek ἔργον (érgon, “work”) with Ancient Greek τροπή (tropḗ, “transformation”).
