What is Statistical Mechanics?

Statistical mechanics is a mathematical framework in physics that applies probability theory and statistical methods to large assemblies of microscopic entities. Rather than assuming or postulating separate natural laws, it explains the macroscopic behavior of nature from the behavior of such ensembles.

This branch of study, also referred to as statistical physics or statistical thermodynamics, can address a wide range of problems in physics, biology, chemistry, and neuroscience.

Its fundamental goal is to clarify the properties of matter in aggregate in terms of the physical laws governing atomic motion.

Statistical mechanics arose out of the development of classical thermodynamics. It succeeded in explaining macroscopic physical properties, such as temperature, pressure, and heat capacity, in terms of microscopic parameters that fluctuate about average values and are characterized by probability distributions.

Three physicists are typically credited with creating the discipline of statistical mechanics:

  • Ludwig Boltzmann, who developed the fundamental interpretation of entropy in terms of a collection of microstates;
  • James Clerk Maxwell, who developed models of the probability distribution of such states;
  • Josiah Willard Gibbs, who coined the name of the field in 1884.

While classical thermodynamics is primarily concerned with thermodynamic equilibrium, non-equilibrium statistical mechanics has been applied to the problem of microscopically modelling the speed of irreversible processes that are driven by imbalances.

Examples of such processes include chemical reactions and flows of heat and particles. The fluctuation-dissipation theorem is the fundamental knowledge obtained from applying non-equilibrium statistical mechanics to study the simplest non-equilibrium situation: a steady-state current flow in a system of many particles.

History

The kinetic theory of gases was established by Daniel Bernoulli, a Swiss mathematician and physicist, in his 1738 publication Hydrodynamica.

In this work, Bernoulli advanced the theory, still in use today, that gases consist of great numbers of molecules moving in all directions, that their impact on a surface produces the gas pressure we feel, and that what we experience as heat is simply the kinetic energy of their motion.

In 1859, after reading a paper by Rudolf Clausius on the diffusion of molecules, the Scottish physicist James Clerk Maxwell formulated the Maxwell distribution of molecular velocities, which gave the proportion of molecules having a velocity in a specified range.
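Maxwell's formula can be illustrated numerically. The sketch below (all parameter values are assumed for illustration, not taken from the text) evaluates the Maxwell speed distribution for a nitrogen-like gas at room temperature, checks its normalization, and integrates it to find the fraction of molecules in a given speed range:

```python
import numpy as np

# Maxwell speed distribution for an ideal gas; numbers are illustrative
# (an assumed N2-like gas at room temperature).
k_B = 1.380649e-23   # Boltzmann constant, J/K
m = 4.65e-26         # molecular mass, kg
T = 300.0            # temperature, K

def maxwell_speed_pdf(v):
    # f(v) = 4*pi * (m / (2*pi*k_B*T))**1.5 * v**2 * exp(-m*v**2 / (2*k_B*T))
    a = m / (2.0 * np.pi * k_B * T)
    return 4.0 * np.pi * a**1.5 * v**2 * np.exp(-m * v**2 / (2.0 * k_B * T))

def trapezoid(y, x):
    # simple trapezoidal integration over a grid
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

v = np.linspace(0.0, 4000.0, 20001)       # speed grid, m/s
f = maxwell_speed_pdf(v)

total = trapezoid(f, v)                   # normalization: integrates to ~1
mean_speed = trapezoid(v * f, v)          # mean speed, analytically sqrt(8 k T / (pi m))
in_band = (v >= 300) & (v <= 600)
frac = trapezoid(f[in_band], v[in_band])  # fraction of molecules with speed 300-600 m/s
print(total, mean_speed, frac)
```

The mean speed recovered from the numerical integral (roughly 476 m/s for these parameters) agrees with the closed-form expression, which is a quick sanity check on the distribution.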

This established the first-ever statistical law in physics. Maxwell also gave the first mechanical argument that molecular collisions entail an equalization of temperatures and hence a tendency towards equilibrium. Five years later, in 1864, Ludwig Boltzmann, then a young student in Vienna, came across Maxwell's paper and spent much of his life developing the subject further.

Boltzmann's work, much of which was collectively published in his 1896 Lectures on Gas Theory, launched statistical mechanics in the 1870s.

In the proceedings of the Vienna Academy and other organizations, Boltzmann's original articles on the statistical interpretation of thermodynamics, the H-theorem, transport theory, thermal equilibrium, the equation of state of gases, and related topics take up over 2,000 pages.

Boltzmann introduced the concept of an equilibrium statistical ensemble and, with his H-theorem, was also the first to investigate non-equilibrium statistical mechanics.

The American mathematical physicist J. Willard Gibbs coined the term "statistical mechanics" in 1884. Although the term is standard, "probabilistic mechanics" might today seem a more apt name. Gibbs formalized statistical mechanics in his book Elementary Principles in Statistical Mechanics, published in 1902, shortly before his death.

This book addressed all mechanical systems, whether they were large or small, gaseous or not. Although Gibbs' methods were initially developed within the context of classical mechanics, their generality allowed for easy adaptation to the development of quantum physics, and they continue to serve as the cornerstone of statistical mechanics today.

Principles: ensembles and mechanics

In physics, two types of mechanics are usually examined: classical mechanics and quantum mechanics. For both types, the standard mathematical treatment involves two concepts:

  • The complete state of the mechanical system at a given time, mathematically encoded as a phase point (classical mechanics) or a pure quantum state vector (quantum mechanics).
  • An equation of motion that carries the state forward in time: Hamilton's equations (classical mechanics) or the Schrödinger equation (quantum mechanics).

Using these two concepts, the state at any other time, past or future, can in principle be calculated. However, there is a gap between these laws and everyday experience: when performing processes at the human scale (for example, a chemical reaction), we do not feel the need to know the simultaneous positions and velocities of each molecule at the microscopic level.

Statistical mechanics bridges the gap between the practical application of partial knowledge and the laws of mechanics by introducing some uncertainty about the system's current state.

Whereas ordinary mechanics only considers the behavior of a single state, statistical mechanics introduces the statistical ensemble, a large collection of virtual, independent copies of the system in various states. The statistical ensemble is a probability distribution over all possible states of the system.

In classical statistical mechanics, the ensemble is a probability distribution over phase points (as opposed to a single phase point in ordinary mechanics), usually represented as a distribution in a phase space with canonical coordinate axes. In quantum statistical mechanics, the ensemble is a probability distribution over pure states and can be compactly summarized as a density matrix.
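As a minimal illustration of the quantum case, the sketch below (a toy qubit mixture with assumed probabilities and states) builds a density matrix from a probability distribution over pure states and uses it to take an ensemble average of an observable:

```python
import numpy as np

# A toy qubit ensemble: 70% in |0>, 30% in |+> (an assumed mixture).
ket0 = np.array([1.0, 0.0])
ket_plus = np.array([1.0, 1.0]) / np.sqrt(2.0)
probs = [0.7, 0.3]
states = [ket0, ket_plus]

# Density matrix rho = sum_i p_i |psi_i><psi_i|
rho = sum(p * np.outer(psi, psi.conj()) for p, psi in zip(probs, states))

sigma_z = np.diag([1.0, -1.0])           # an observable (Pauli-Z)
expect_z = np.trace(rho @ sigma_z).real  # ensemble average <Z> = Tr(rho Z)

print(np.trace(rho).real)   # 1.0: the probabilities sum to one
print(expect_z)             # 0.7*1 + 0.3*0 = 0.7
```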

As is typical for probabilities, there are numerous ways to interpret the ensemble.

  • an ensemble can represent the various possible states that a single system could be in (epistemic probability, a form of knowledge), or
  • the members of the ensemble can be understood as the states of the systems in experiments repeated on independent systems that were prepared in a similar but imperfectly controlled manner (empirical probability), in the limit of an infinite number of trials.

These two meanings are equivalent for many purposes, and this article uses them interchangeably.

However the probability is interpreted, each state in the ensemble evolves over time according to the equation of motion. Thus, the ensemble itself (the probability distribution over states) also evolves, as the virtual systems continually leave one state and enter another.

The ensemble evolution is given by the Liouville equation (classical mechanics) or the von Neumann equation (quantum mechanics). These equations are derived simply by applying the mechanical equation of motion separately to each virtual system in the ensemble, with the probability of each virtual system being conserved over time as it evolves from state to state.
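A small numerical check of the quantum case: the sketch below (toy Hamiltonian and initial state, both assumed) evolves a density matrix under the von Neumann equation via unitary evolution, confirming that total probability is conserved even as the state itself changes:

```python
import numpy as np

# von Neumann evolution of a qubit density matrix under a toy Hamiltonian
# (all numbers illustrative; hbar set to 1).
H = np.array([[1.0, 0.5],
              [0.5, -1.0]])    # Hermitian Hamiltonian

rho0 = np.array([[0.8, 0.2],
                 [0.2, 0.2]])  # initial mixed state, Tr = 1

evals, V = np.linalg.eigh(H)   # diagonalize H once

def evolve(rho, t):
    # rho(t) = U rho U^dagger with U = exp(-i H t)
    U = V @ np.diag(np.exp(-1j * evals * t)) @ V.conj().T
    return U @ rho @ U.conj().T

for t in (0.0, 0.5, 2.0):
    rho_t = evolve(rho0, t)
    # populations shift between states, but total probability is conserved
    print(t, np.trace(rho_t).real)   # trace stays 1 at every time
```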

One special class of ensembles is those that do not evolve over time. These are known as equilibrium ensembles, and their condition is known as statistical equilibrium.

A statistical ensemble is in statistical equilibrium if, for each state in the ensemble, the ensemble also contains all of that state's future and past states, with probabilities equal to the probability of being in that state. Statistical thermodynamics is the study of equilibrium ensembles of isolated systems.

The broader case of ensembles that change over time or ensembles of non-isolated systems is addressed by non-equilibrium statistical mechanics.

Statistical thermodynamics

The primary goal of statistical thermodynamics, commonly referred to as equilibrium statistical mechanics, is to derive the classical thermodynamics of materials in terms of the properties of their constituent particles and the interactions between them.

In other words, statistical thermodynamics connects the microscopic behaviors and motions that take place inside a material to its macroscopic qualities when it is in thermodynamic equilibrium.

While statistical mechanics proper involves dynamics, here the attention is focused on statistical equilibrium (steady state). Statistical equilibrium means only that the ensemble is not evolving, not that the particles have stopped moving (mechanical equilibrium).

Fundamental postulate

A sufficient (but not necessary) condition for statistical equilibrium with an isolated system is that the probability distribution is a function only of conserved properties (total energy, total particle numbers, and so on).

Many different equilibrium ensembles can be considered, and only some of them correspond to thermodynamics. Additional postulates are needed to justify why the ensemble for a given system should take one form or another.

A common approach, found in many textbooks, is based on the equal a priori probability postulate. This postulate states:

For an isolated system with exactly known energy and composition, the system can be found with equal probability in any microstate consistent with that knowledge.

The equal a priori probability postulate therefore provides the motivation for the microcanonical ensemble described below. There are various arguments in its favour, including:

  • Ergodic hypothesis: an ergodic system is one that evolves over time to explore "all accessible" states, that is, all those with the same energy and composition. In an ergodic system, the microcanonical ensemble is the only possible equilibrium ensemble with fixed energy. This approach has limited applicability, since most systems are not ergodic.
  • Principle of indifference: in the absence of any further information, we can only assign equal probabilities to each compatible situation.
  • Maximum information entropy: a more elaborate version of the principle of indifference states that the correct ensemble is the one that is compatible with the known information and that has the largest Gibbs entropy (information entropy).
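The maximum-entropy argument can be checked numerically: for a fixed set of accessible microstates with no further constraints, the uniform (equal a priori) distribution is the one that maximizes the Gibbs entropy. A minimal sketch, with an assumed number of microstates:

```python
import numpy as np

rng = np.random.default_rng(0)

def gibbs_entropy(p):
    # S = -sum_i p_i ln p_i  (k_B set to 1)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

n = 8                                  # number of accessible microstates (assumed)
uniform = np.full(n, 1.0 / n)          # equal a priori probabilities
s_uniform = gibbs_entropy(uniform)     # equals ln(8), the Boltzmann entropy

# Any other normalized distribution over the same states has lower entropy.
for _ in range(1000):
    p = rng.random(n)
    p /= p.sum()
    assert gibbs_entropy(p) <= s_uniform + 1e-12

print(s_uniform, np.log(n))
```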

Other fundamental postulates for statistical mechanics have also been proposed. For example, recent studies show that the theory of statistical mechanics can be built without the equal a priori probability postulate.[13][14] One such formalism is based on the fundamental thermodynamic relation together with the following postulates:

  1. The probability density function is proportional to some function of the ensemble parameters and random variables.
  2. Thermodynamic state functions are described by ensemble averages of random variables.
  3. The entropy as defined by the Gibbs entropy formula matches the entropy as defined in classical thermodynamics.
    The third postulate can be replaced by the following:
  4. At infinite temperature, all microstates have the same probability.

Three thermodynamic ensembles

The main articles are: Canonical ensemble, Microcanonical ensemble, and Grand canonical ensemble.

Three equilibrium ensembles with simple forms can be defined for any isolated system bounded within a finite volume. These are the most often discussed ensembles in statistical thermodynamics. In the macroscopic limit (described below), they all correspond to classical thermodynamics.

Microcanonical ensemble

Describes a system with a precisely specified energy and fixed composition (an exact number of particles). The microcanonical ensemble contains, with equal probability, every possible state consistent with that energy and composition.
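This counting can be made concrete. The sketch below (a toy system of two-level "spins" with assumed unit energies) enumerates the microstates consistent with a fixed total energy, assigns each the same probability, and computes the Boltzmann entropy S = k ln Ω:

```python
import math
from itertools import product

# Microcanonical counting for N two-level systems (spins), each with
# energy 0 (down) or 1 (up); the total energy E fixes the number of up spins.
N, E = 10, 4

# Enumerate all 2^N microstates and keep those consistent with energy E.
microstates = [s for s in product((0, 1), repeat=N) if sum(s) == E]
omega = len(microstates)           # number of accessible microstates

assert omega == math.comb(N, E)    # C(10, 4) = 210
p = 1.0 / omega                    # every microstate equally likely
entropy = math.log(omega)          # Boltzmann entropy S = k ln(omega), with k = 1
print(omega, p, entropy)
```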

Canonical ensemble

Describes a system of fixed composition that is in thermal equilibrium with a heat bath of a precise temperature. The canonical ensemble contains states of varying energy but identical composition; the different states in the ensemble are accorded different probabilities depending on their total energy.
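A minimal sketch of these energy-dependent probabilities, for an assumed three-level toy system: each state i gets the Boltzmann weight exp(-E_i/kT), normalized by the canonical partition function Z.

```python
import numpy as np

# Canonical ensemble over a toy three-level system (energies illustrative).
E = np.array([0.0, 1.0, 2.0])  # state energies
kT = 1.0                       # temperature of the heat bath (k_B = 1)

weights = np.exp(-E / kT)      # Boltzmann factors exp(-E_i / kT)
Z = weights.sum()              # canonical partition function
p = weights / Z                # probability of each state

mean_energy = np.sum(p * E)    # ensemble average <E>
print(p, mean_energy)
# Lower-energy states get higher probability; the probabilities sum to one.
```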

Grand canonical ensemble

Describes a system with non-fixed composition (uncertain particle numbers) that is in thermal and chemical equilibrium with a thermodynamic reservoir. The reservoir has a precise temperature and precise chemical potentials for the various types of particles.

The grand canonical ensemble contains states of varying energy and varying numbers of particles; the different states in the ensemble are accorded different probabilities depending on their total energy and total particle number.
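A minimal worked example (energy and chemical potential values assumed): for a single orbital that can hold zero or one fermion, summing the grand canonical weights over particle number reproduces the Fermi-Dirac occupation.

```python
import math

# Grand canonical ensemble for a single orbital holding 0 or 1 fermion
# (orbital energy and chemical potential are illustrative; k_B = 1).
eps, mu, kT = 0.5, 0.2, 1.0

# Sum over particle number n = 0, 1 with weight exp(-(E_n - mu*n)/kT)
Xi = 1.0 + math.exp(-(eps - mu) / kT)  # grand partition function
p1 = math.exp(-(eps - mu) / kT) / Xi   # probability the orbital is occupied

# The mean occupation reproduces the Fermi-Dirac distribution.
fermi_dirac = 1.0 / (math.exp((eps - mu) / kT) + 1.0)
print(p1, fermi_dirac)                 # these two values agree
```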

For systems containing many particles (the thermodynamic limit), all three ensembles tend to give identical behavior. Which ensemble is used then becomes purely a matter of mathematical convenience.

The Gibbs theorem on the equivalence of ensembles was developed into the theory of the concentration of measure phenomenon, which has applications in many areas of science, from functional analysis to methods of artificial intelligence and big data technology.

Important cases where the thermodynamic ensembles do not give identical results include:

  • Microscopic systems.
  • Large systems at a phase transition.
  • Large systems with long-range interactions.

In these cases the correct thermodynamic ensemble must be chosen, as there are observable differences between these ensembles not just in the size of fluctuations but also in average quantities such as the distribution of particles.

The correct ensemble is the one that corresponds to the way the system has been prepared and characterized; in other words, the ensemble that reflects the knowledge about that system.

Computation techniques

Once the characteristic state function for an ensemble has been calculated for a given system, that system is "solved" (macroscopic observables can be extracted from the characteristic state function).

However, calculating the characteristic state function of a thermodynamic ensemble is not necessarily simple, because it involves considering every possible state of the system.

While some hypothetical systems have been exactly solved, the most general (and realistic) case is too complex for an exact solution. Various approaches exist to approximate the true ensemble and allow calculation of average quantities.

Exact

There are some cases that allow exact solutions.

  • For very small microscopic systems, the ensembles can be directly computed by simply enumerating all possible states of the system (using exact diagonalization in quantum mechanics, or an integral over all of phase space in classical mechanics).
  • Some large systems consist of many separable microscopic systems, each of which can be analysed independently. Notably, idealized gases of non-interacting particles have this property, allowing exact derivations of the Maxwell-Boltzmann, Fermi-Dirac, and Bose-Einstein statistics.
  • A few large systems with interactions have been solved exactly. Using subtle mathematical techniques, exact solutions have been found for a few toy models. Examples include the Bethe ansatz, the square-lattice Ising model in zero field, and the hard hexagon model.
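The first two routes can be sketched on toy models (all parameters below are assumed for illustration): exact diagonalization yields the canonical partition function of a small quantum system from the eigenvalues of its Hamiltonian, and separability lets a brute-force enumeration be checked against the factorized single-system result.

```python
import numpy as np
from itertools import product

# (1) Exact diagonalization: the canonical partition function of a small
#     quantum system is Z = sum_i exp(-beta * E_i) over the eigenvalues of H.
beta = 1.0
H = np.array([[0.0, 0.3, 0.0],
              [0.3, 1.0, 0.3],
              [0.0, 0.3, 2.0]])   # small Hermitian Hamiltonian (assumed)
energies = np.linalg.eigvalsh(H)
Z_quantum = np.sum(np.exp(-beta * energies))

# (2) Separability: for N independent two-level systems (energies 0 and 1),
#     Z factorizes, so brute-force enumeration must match z**N.
N = 6
z = 1.0 + np.exp(-beta * 1.0)    # single-system partition function
Z_brute = sum(np.exp(-beta * sum(s)) for s in product((0, 1), repeat=N))
print(Z_quantum, Z_brute, z**N)  # Z_brute equals z**N
```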




