### Entropy

Although entropy is the property most fundamentally responsible for the behavior of matter, and arguably the most important quantity in thermodynamics, we have no intuitive feel for it. The concept of entropy is abstract and, to some extent, philosophical in nature. There is no single straightforward definition of entropy; most generally, the entropy of a system is defined through a differential relation for an infinitesimal change in entropy:

dS = dQ / T

where dQ is the amount of heat absorbed in a reversible process (one in which the system passes from one state to another), and T is the absolute temperature.
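Because T is constant in an isothermal process, the relation above integrates to ΔS = Q_rev / T. The short sketch below illustrates this with assumed numbers (500 J absorbed reversibly at 300 K); the values are illustrative, not from the text.

```python
# Entropy change for a reversible isothermal process: dS = dQ_rev / T.
# With T constant, integrating gives Delta S = Q_rev / T.
Q_rev = 500.0   # heat absorbed reversibly, in joules (assumed value)
T = 300.0       # absolute temperature, in kelvin (assumed value)

delta_S = Q_rev / T
print(f"Delta S = {delta_S:.4f} J/K")  # -> Delta S = 1.6667 J/K
```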

For irreversible processes, the entropy change instead satisfies the inequality

dS > dQ / T

Entropy owes its existence to the second law of thermodynamics, according to which the total entropy of any isolated thermodynamic system (not at equilibrium) tends to increase over time, approaching a maximum value. Entropy is also very often described as a measure of randomness or disorder (the statistical interpretation of entropy, an approach first used by Ludwig Boltzmann) or, more precisely, of our lack of detailed knowledge of the exact microscopic state of the system. Because of the enormous number of particles contained in any macroscopic system, a microscopic description of matter must necessarily be statistical in nature.
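Boltzmann's statistical interpretation can be made concrete with his relation S = k_B ln W, where W is the number of microstates consistent with the macroscopic state. The formula and constant are standard; the microstate counts below are assumed toy values, chosen only to show that doubling W adds k_B ln 2 to the entropy.

```python
import math

# Boltzmann's statistical entropy: S = k_B * ln(W).
k_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

def boltzmann_entropy(W: float) -> float:
    """Entropy in J/K for W accessible microstates."""
    return k_B * math.log(W)

# Doubling the number of microstates raises the entropy by k_B * ln(2),
# regardless of the starting value of W:
increase = boltzmann_entropy(2e20) - boltzmann_entropy(1e20)
print(increase)  # ~ 9.57e-24 J/K
```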

If the Gibbs energy is known, entropy can be derived as follows:

S = −(∂G/∂T)_{P,n}
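This derivative relation can be checked numerically. The sketch below uses an assumed toy model for the Gibbs energy, G = H − T·S0 with constant H and S0 (pressure and composition held fixed), so the exact answer is known to be S0; the central finite difference is a generic numerical device, not a method from the text.

```python
# Numerical sketch of S = -(dG/dT) at fixed P and n.
H = 1000.0   # enthalpy, J (assumed constant)
S0 = 50.0    # entropy, J/K (assumed constant)

def G(T: float) -> float:
    """Toy Gibbs energy at fixed pressure and composition."""
    return H - T * S0

def entropy_from_G(T: float, dT: float = 1e-3) -> float:
    """Estimate S via a central finite difference of G in T."""
    return -(G(T + dT) - G(T - dT)) / (2.0 * dT)

print(entropy_from_G(300.0))  # recovers S0 = 50.0 J/K
```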