# Entropy


Entropy is a thermodynamic property that measures the energy unavailable for useful work in a thermodynamic process, such as in energy-conversion devices, engines, or machines. Such devices can be driven only by convertible energy and have a theoretical maximum efficiency when converting energy to work. During this work, entropy accumulates in the system and must then be removed by dissipation in the form of waste heat.
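That theoretical maximum is the Carnot efficiency, fixed by the temperatures of the hot and cold reservoirs. A short worked example (reservoir temperatures chosen purely for illustration):

```latex
\eta_{\max} = 1 - \frac{T_C}{T_H}
\qquad\text{e.g.}\quad
T_H = 600\,\mathrm{K},\; T_C = 300\,\mathrm{K}
\;\Rightarrow\;
\eta_{\max} = 1 - \frac{300}{600} = 0.5
```

No engine operating between these two temperatures can convert more than half of the absorbed heat into work; the remainder must be rejected as waste heat.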

The concept of entropy is defined by the second law of thermodynamics, which states that the entropy of a closed system always increases. Thus, entropy is also measure of the tendency of a process, such as a chemical reaction, to be entropically favored, or to proceed in a particular direction. It determines that thermal energy always flows spontaneously from regions of higher temperature to regions of lower temperature, in the form of heat. These processes reduce the state of order of the initial systems, and therefore entropy is an expression of disorder or randomness. This model is the basis of the microscopic interpretation of entropy in statistical mechanics describing the probability of the constituents of a thermodynamic system to be occupying accessible quantum mechanical states, a model directly related to the information entropy.
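The link between the statistical and information-theoretic views can be made concrete: the Gibbs entropy S = −k_B Σ pᵢ ln pᵢ over microstate probabilities has exactly the form of the Shannon entropy, scaled by the Boltzmann constant. A minimal sketch (the probability distributions below are invented for illustration):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in nats: H = -sum(p * ln p); terms with p = 0 contribute 0."""
    return -sum(p * math.log(p) for p in probs if p > 0)

BOLTZMANN_K = 1.380649e-23  # Boltzmann constant in J/K (exact in the 2019 SI)

def gibbs_entropy(probs):
    """Gibbs entropy in J/K: S = k_B * H, over microstate probabilities."""
    return BOLTZMANN_K * shannon_entropy(probs)

# Two hypothetical 4-microstate systems:
uniform = [0.25, 0.25, 0.25, 0.25]      # maximal disorder: all states equally likely
peaked = [0.97, 0.01, 0.01, 0.01]       # highly ordered: one state dominates

# The uniform distribution maximizes entropy: H = ln(4) ≈ 1.386 nats.
print(shannon_entropy(uniform))  # ≈ 1.3863
print(shannon_entropy(peaked))   # ≈ 0.1677, far smaller
print(gibbs_entropy(uniform))    # the same comparison, in J/K
```

For equiprobable microstates this reduces to Boltzmann's S = k_B ln W, with W the number of accessible states.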

Thermodynamic entropy has the dimension of energy divided by temperature, and a unit of joules per kelvin (J/K) in the International System of Units.
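These units follow from Clausius's definition, ΔS = Q_rev / T for heat transferred reversibly at temperature T. As a worked illustration, melting 1 kg of ice at 0 °C (latent heat of fusion ≈ 334 kJ/kg):

```latex
\Delta S = \frac{Q_{\mathrm{rev}}}{T}
= \frac{334{,}000\,\mathrm{J}}{273.15\,\mathrm{K}}
\approx 1.22 \times 10^{3}\,\mathrm{J/K}
```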

The term entropy was coined in 1865 by Rudolf Clausius based on the Greek εντροπία [entropía], a turning toward, from εν- [en-] (in) and τροπή [tropē] (turn, conversion).[2][note 2]