Entropy is a macroscopic property of thermodynamic systems introduced by Clausius. In a reversible process it is transferred completely from one system to another, while in an irreversible process in a closed system it always increases. The First Law of Thermodynamics states that energy is conserved: nonthermal energy lost by a system (for example, through friction) must reappear in the system or its surroundings in the form of thermal energy. The definition of entropy is obtained from the Carnot Cycle of heat engines. The efficiency of a reversible heat engine working between the absolute temperatures T and T − ΔT is

$$\eta = \frac{W}{Q_{\text{in}}} = \frac{T - (T - \Delta T)}{T} = \frac{\Delta T}{T} \qquad (1)$$

where W is the work done and $Q_{\text{in}}$ is the heat absorbed by the engine. The state function entropy, denoted by S, is then defined in terms of

$$\mathrm{d}S = \frac{\delta Q_{\text{rev}}}{T} \qquad (2)$$
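To make Eqs. (1) and (2) concrete, the following Python sketch computes the efficiency, work output, and entropy flows of a reversible engine. The reservoir temperatures and the heat input are illustrative assumptions, not values from the text.

```python
# A minimal numerical sketch of Eqs. (1) and (2); all values are assumed
# for illustration (temperatures in kelvin, heat in joules).

def carnot_efficiency(t_hot: float, t_cold: float) -> float:
    """Eq. (1): eta = W / Q_in = (T_hot - T_cold) / T_hot."""
    return (t_hot - t_cold) / t_hot

t_hot, t_cold = 500.0, 300.0   # reservoir temperatures (assumed)
q_in = 1000.0                  # heat absorbed at t_hot (assumed)

eta = carnot_efficiency(t_hot, t_cold)
work = eta * q_in              # W = eta * Q_in
s_in = q_in / t_hot            # Eq. (2): entropy absorbed with the heat
s_out = (q_in - work) / t_cold # entropy rejected with the waste heat

print(f"eta = {eta:.2f}, W = {work:.0f} J")               # eta = 0.40, W = 400 J
print(f"S_in = {s_in:.2f} J/K, S_out = {s_out:.2f} J/K")  # 2.00 J/K each
```

That the absorbed and rejected entropies agree is the reversible case described above: the engine transfers entropy completely without generating any.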

According to the Second Law of Thermodynamics, all processes occurring in nature are irreversible. The combined entropy change of a system and its surroundings is therefore positive, and it tends to zero as the process approaches reversibility:

$$\Delta S = \Delta S_{\text{system}} + \Delta S_{\text{surroundings}} \geq 0 \qquad (3)$$
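The sign and limit in Eq. (3) can be illustrated with the classic irreversible process of heat leaking directly from a hot to a cold reservoir; in this minimal Python sketch the heat Q and the temperatures are assumed values.

```python
# Sketch of Eq. (3): entropy generated when heat Q leaks directly from a hot
# to a cold reservoir.  Q and the temperatures are illustrative assumptions.

def entropy_generated(q: float, t_hot: float, t_cold: float) -> float:
    """Delta S_total = Q / T_cold - Q / T_hot, positive for T_hot > T_cold."""
    return q / t_cold - q / t_hot

q, t_hot = 1000.0, 500.0
for t_cold in (300.0, 450.0, 499.0):
    ds = entropy_generated(q, t_hot, t_cold)
    print(f"T_cold = {t_cold:.0f} K -> Delta S = {ds:.4f} J/K")
# Delta S shrinks toward zero as T_cold approaches T_hot, i.e., as the
# transfer approaches reversibility.
```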

For a closed system of constant composition,

$$\mathrm{d}U = T\,\mathrm{d}S - p\,\mathrm{d}V \qquad (4)$$

This equation contains only properties of the system and is therefore a process-independent, fundamental property relation of the system. Since the enthalpy is defined as H = U + pV, differentiating it and substituting Eq. (4) also gives

$$\mathrm{d}H = T\,\mathrm{d}S + V\,\mathrm{d}p \qquad (5)$$

and hence

$$\mathrm{d}S = \frac{\mathrm{d}U + p\,\mathrm{d}V}{T} \qquad (6)$$

$$\mathrm{d}S = \frac{\mathrm{d}H - V\,\mathrm{d}p}{T} \qquad (7)$$
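Eq. (6) lends itself to a quick numerical check. The sketch below integrates dS = (p/T) dV for one mole of an ideal gas expanding isothermally (so dU = 0 and p/T = R/V) and compares the sum with the closed-form logarithm; the volumes are assumed for illustration.

```python
import math

R = 8.314  # universal gas constant, J/(mol K)

# Numerical check of Eq. (6) for one mole of ideal gas expanding isothermally:
# dU = 0, so dS = (p/T) dV = R dV / V (using pV = RT).  The midpoint sum below
# should match the closed form R ln(V2/V1).  Volumes are assumed values (m^3).
v1, v2, steps = 0.010, 0.020, 100_000
dv = (v2 - v1) / steps
s_numeric = sum(R * dv / (v1 + (i + 0.5) * dv) for i in range(steps))

print(f"integrated dS : {s_numeric:.4f} J/(mol K)")
print(f"R ln(V2/V1)   : {R * math.log(v2 / v1):.4f} J/(mol K)")  # ~5.7628
```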

The entropy change of an ideal gas can be written as:

$$\tilde{S}_2 - \tilde{S}_1 = \tilde{c}_v \ln\frac{T_2}{T_1} + R \ln\frac{V_2/N_2}{V_1/N_1} \qquad (8)$$

with N as the number of moles present, $\tilde{S}$ the entropy per mole, and $\tilde{c}_v$ the molar heat capacity at constant volume. The value of $\Delta\tilde{S}$ is 0 if the system is separated into two identical parts (i.e., V/N is constant). However, entropy of mixing is generated if two different gases at the same temperature and pressure are mixed, i.e., N is constant and $V_2 = 2V_1$ for each component. Before the separating wall is removed, the molecules of each species occupy only half of the available space and are more ordered than in the final state, where they are randomly distributed over the total volume.
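Both cases can be evaluated directly from Eq. (8). In the sketch below, the temperature, the molar volumes, and the monatomic value $\tilde{c}_v = \tfrac{3}{2}R$ are illustrative assumptions.

```python
import math

R = 8.314  # universal gas constant, J/(mol K)

def molar_entropy_change(c_v: float, t1: float, t2: float,
                         v_per_n_1: float, v_per_n_2: float) -> float:
    """Eq. (8): molar entropy change of an ideal gas, with V/N as input."""
    return c_v * math.log(t2 / t1) + R * math.log(v_per_n_2 / v_per_n_1)

c_v = 1.5 * R  # molar heat capacity of a monatomic ideal gas (assumed)

# Splitting into two identical parts: T and V/N both unchanged -> 0.
print(molar_entropy_change(c_v, 300.0, 300.0, 0.025, 0.025))  # 0.0

# Mixing two different gases at equal T and p: each component keeps N but
# doubles its volume (V2 = 2 V1) -> R ln 2 per mole of each component.
print(molar_entropy_change(c_v, 300.0, 300.0, 0.025, 0.050))  # ~5.76 J/(mol K)
```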

From a microscopic point of view, entropy is therefore a measure of the number of microstates compatible with the same macrostate, i.e., of the disorder of the system. These ideas were expressed mathematically by L. Boltzmann and J. W. Gibbs in terms of a thermodynamic probability Ω, which is the number of ways microscopic particles can be distributed among the states accessible to them:

$$\Omega = \frac{n!}{n_1!\, n_2! \cdots} \qquad (9)$$

with n as the total number of particles and $n_1$, $n_2$, etc. the numbers of particles in states 1, 2, etc. Because of the large number of particles contained in thermodynamic systems and the randomness of their positions at any instant, statistical methods must be applied, giving rise to the term Statistical Thermodynamics. The connection postulated by Boltzmann between entropy and thermodynamic probability is

$$S = k \ln \Omega \qquad (10)$$

with k as the Boltzmann constant, $k = R/N_A$, where $N_A$ is Avogadro's number, and

$$S_2 - S_1 = k \ln\frac{\Omega_2}{\Omega_1} \qquad (11)$$

with $\Omega_1$ and $\Omega_2$ the thermodynamic probabilities of states 1 and 2.
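The following sketch ties Eqs. (9)-(11) together. The occupation numbers are a toy example, and the final line shows that the statistical route reproduces the mixing entropy R ln 2 per mole obtained macroscopically from Eq. (8).

```python
import math

k = 1.380649e-23     # Boltzmann constant, J/K
N_A = 6.02214076e23  # Avogadro's number, 1/mol

def omega(occupations: list[int]) -> int:
    """Eq. (9): number of ways n particles can be distributed with
    occupation numbers n1, n2, ... over the accessible states."""
    w = math.factorial(sum(occupations))
    for n_i in occupations:
        w //= math.factorial(n_i)
    return w

# Toy macrostates of 4 particles in two states (assumed example):
print(omega([2, 2]))  # 6 microstates -- the most probable, "disordered" split
print(omega([4, 0]))  # 1 microstate  -- the fully "ordered" distribution

# Eq. (11) recovers the mixing entropy of Eq. (8): when each molecule of one
# mole doubles its accessible volume, Omega2/Omega1 = 2**N_A, and hence
# Delta S = k * ln(2**N_A) = N_A * k * ln 2 = R * ln 2.
print(N_A * k * math.log(2.0))  # ~5.76 J/K per mole, i.e., R ln 2
```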

Applied to information theory, entropy measures missing information: it corresponds to the number of yes/no answers, i.e., the logarithm of the number of equally likely possibilities, that are still needed to specify the state of the system completely.
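As a toy illustration of this reading, the sketch below counts the yes/no answers needed to single out one of W equally likely possibilities; the values of W are assumed examples. Multiplying such a count (times ln 2) by the Boltzmann constant, as in Eq. (10), recovers entropy in thermodynamic units.

```python
import math

# Information-theoretic reading of entropy: with W equally likely
# possibilities, log2(W) yes/no answers are missing before the actual
# state is fully specified.  The counts below are illustrative assumptions.
def missing_yes_no_answers(possibilities: int) -> float:
    return math.log2(possibilities)

print(missing_yes_no_answers(8))     # 3.0 -- three answers single out 1 of 8
print(missing_yes_no_answers(1024))  # 10.0
```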
