Entropy is a macroscopic property of thermodynamic systems introduced by Clausius. It is completely transferred from one system to another during reversible processes, while it always increases during irreversible processes in closed systems. The First Law of Thermodynamics states that energy is conserved: nonthermal energy lost by a system (for example, through friction) must reappear in the system or its surroundings in the form of thermal energy. The definition of entropy is obtained from the Carnot cycle of heat engines. The efficiency of a reversible heat engine working between two absolute temperatures, T and T − ΔT, is

η = W/Q_in = ΔT/T
where W is the work done and Q_in is the heat absorbed by the engine. The state function entropy, denoted by S, is then defined in terms of the heat exchanged reversibly:

dS = δQ_rev/T
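The Carnot efficiency relation η = W/Q_in = ΔT/T can be checked with a quick numerical sketch; the temperatures and heat input below are illustrative assumptions, not values from the text:

```python
# Reversible engine between T and T - dT: eta = W/Q_in = dT/T.
T_hot = 500.0      # K (assumed, for illustration)
T_cold = 400.0     # K (assumed)
dT = T_hot - T_cold

eta = dT / T_hot   # efficiency of the reversible engine
Q_in = 1000.0      # J of heat absorbed (assumed)
W = eta * Q_in     # work delivered per cycle

print(eta)  # 0.2
print(W)    # 200.0
```

Note that the efficiency depends only on the two absolute temperatures, not on the working substance.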
According to the Second Law of Thermodynamics, all processes occurring spontaneously in nature are irreversible. Thus, the total entropy change of a system and its surroundings is positive and tends to zero as the process approaches reversibility.
For a closed system of constant composition,

dU = T dS − p dV
This equation contains only properties of the system and is therefore a process-independent fundamental property relation of the system. It can also be derived that

dH = T dS + V dp
The entropy change of an ideal gas can be written as:

ΔS = N Δs = N [c_v ln(T_2/T_1) + R ln(V_2/V_1)]
with N as the number of moles present, s as the entropy per mole, and c_v as the specific heat capacity per mole. The value of ΔS is 0 if the system is merely separated into two identical parts (i.e., V/N is constant). However, entropy of mixing is generated if two different gases at the same temperature and pressure are mixed: for each component N is constant and V_2 = 2V_1. Before removing the separating wall, the molecules of each species occupy only half the available space and are more ordered than in the final state, where they are randomly distributed over the total volume. From a microscopic view, entropy is therefore a measure of the number of possible microstates within the same macrostate, i.e., of the disorder of the system. These ideas were expressed mathematically by L. Boltzmann and J. W. Gibbs in terms of a thermodynamic probability Ω, the number of ways microscopic particles can be distributed among the states accessible to them:

Ω = n!/(n_1! n_2! ⋯)
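The mixing argument can be verified numerically from the ideal-gas entropy expression: at constant temperature, each component's volume doubles while its amount N stays fixed, so each contributes N R ln 2. A minimal Python sketch, assuming 1 mol of each gas (the amounts are illustrative):

```python
import math

R = 8.314  # J/(mol*K), gas constant

# Two different ideal gases at equal T and p, separated by a wall.
# Removing the wall doubles each component's volume (V2 = 2*V1),
# so per component dS = N * R * ln(V2/V1) = N * R * ln(2).
N1 = N2 = 1.0  # mol of each gas (assumed)
dS_mix = N1 * R * math.log(2) + N2 * R * math.log(2)

# Separating a single gas into two identical parts keeps V/N constant,
# so the volume ratio per particle is 1 and the entropy change vanishes:
dS_identical = R * math.log(1)  # ln(1) = 0

print(round(dS_mix, 2))  # 11.53 J/K
print(dS_identical)      # 0.0
```

The mixing entropy is positive because the final state is less ordered; splitting a homogeneous gas generates none.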
with n as the total number of particles and n_1, n_2, etc. the numbers of particles in states 1, 2, etc. Because of the large number of particles contained in thermodynamic systems and the randomness of their positions at any time, statistical methods have to be applied, giving rise to the expression Statistical Thermodynamics. The connection postulated by Boltzmann between entropy and thermodynamic probability is

S = k ln Ω
with k as the Boltzmann constant, k = R/N_A (N_A being the Avogadro constant), and

Ω = Ω_1 · Ω_2
with Ω_1 and Ω_2 as the probabilities for states 1 and 2. Since probabilities multiply while entropies add, the logarithmic relation makes entropy an extensive quantity.
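Both relations can be illustrated with small occupation numbers. In this sketch (the occupation numbers are arbitrary illustrations), Ω is computed as the multinomial coefficient n!/(n_1! n_2! ⋯), and S = k ln Ω is checked to be additive when Ω multiplies:

```python
import math

k = 1.380649e-23  # J/K, Boltzmann constant

def omega(*occupations):
    """Ways to distribute n = sum(occupations) particles among states
    with the given occupation numbers: n!/(n1! n2! ...)."""
    w = math.factorial(sum(occupations))
    for ni in occupations:
        w //= math.factorial(ni)
    return w

def S(om):
    """Boltzmann entropy S = k ln(Omega)."""
    return k * math.log(om)

o1 = omega(3, 1)  # 4 particles: 4!/(3!*1!) = 4 arrangements
o2 = omega(2, 2)  # 4 particles: 4!/(2!*2!) = 6 arrangements
print(o1, o2)     # 4 6

# For two independent subsystems Omega = Omega1 * Omega2,
# so S = k ln(Omega1 * Omega2) = S1 + S2:
print(abs(S(o1 * o2) - (S(o1) + S(o2))) < 1e-30)  # True
```

The logarithm is what turns the multiplicative probability into an additive entropy.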
Applied to information theory, entropy measures the missing information as the number of yes/no statements (bits) still needed to provide a complete set of information, i.e., the base-2 logarithm of the number of equally likely alternatives.
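A minimal sketch of this information-theoretic reading, assuming M equally likely alternatives: identifying one of them takes log2(M) yes/no answers, since each answer halves the remaining possibilities.

```python
import math

def bits_needed(M):
    """Yes/no answers needed to single out one of M equally likely
    alternatives: the base-2 logarithm of M."""
    return math.log2(M)

print(bits_needed(8))  # 3.0 -- three questions identify one of 8 items
print(bits_needed(2))  # 1.0 -- one question resolves a coin flip
```

This is the information-theoretic counterpart of S = k ln Ω, with the base-2 logarithm counting binary questions instead of natural units.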