# Statistical Physics Notes

1. Probability
Discrete distributions
A variable x takes n discrete values, {x_i, i = 1, …, n} (e.g. tossing of a coin or a die). After N events, we get a distribution {N_1, N_2, …, N_n}.
Probability: $P_i = N_i / N$ (in the limit of large N)
Normalization: $\sum_{i=1}^{n} P_i = 1$
Markovian assumption: successive events are independent.
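
These definitions can be checked with a quick simulation (a minimal Python sketch; the die, the seed, and the event count N are illustrative choices, not from the notes):

```python
import random
from collections import Counter

random.seed(0)
N = 100_000  # number of events (illustrative)

# Toss a fair six-sided die N times and count each outcome N_i.
counts = Counter(random.randint(1, 6) for _ in range(N))

# Probability of each value: P_i = N_i / N  (approaches 1/6 as N grows).
P = {face: counts[face] / N for face in range(1, 7)}

# Normalization: the P_i must sum to 1 by construction.
assert abs(sum(P.values()) - 1.0) < 1e-12
```

Each estimated $P_i$ fluctuates around 1/6 with a spread that shrinks as $1/\sqrt{N}$.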

Continuous distributions
A variable x can take any value in a continuous interval [a, b] (most physical variables: position, velocity, temperature, etc.). Partition the interval into small bins of width dx. If we measure dN(x) events in the interval [x, x+dx], the probability is
$P(x)\,dx = dN(x)/N$
Normalization: $\int_a^b P(x)\,dx = 1$

Examples: Uniform probability distribution
$P(x) = 1/a$ for $0 \le x \le a$, and zero otherwise.
Gaussian distribution
$P(x) = A\, e^{-(x - x_0)^2 / 2s^2}$
A: normalization constant; x_0: position of the maximum; s: width of the distribution.
(Figure: the uniform distribution is a box of height 1/a on [0, a]; the Gaussian is a bell curve of width ~2s centered at x_0.)

Normalization constant
$A \int_{-\infty}^{\infty} e^{-(x - x_0)^2 / 2s^2}\, dx = 1$
Substitute $u = (x - x_0)/s$ and use $\int_{-\infty}^{\infty} e^{-u^2/2}\, du = \sqrt{2\pi}$, which gives $A = 1/\sqrt{2\pi}\, s$.
Normalized Gaussian distribution:
$P(x) = \frac{1}{\sqrt{2\pi}\, s}\, e^{-(x - x_0)^2 / 2s^2}$
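
The normalization constant can be verified numerically (a minimal sketch; the values of s, x_0, and the integration grid are illustrative):

```python
import math

def gaussian(x, x0=0.0, s=1.0):
    """Normalized Gaussian with A = 1 / (s * sqrt(2*pi))."""
    A = 1.0 / (s * math.sqrt(2.0 * math.pi))
    return A * math.exp(-((x - x0) ** 2) / (2.0 * s ** 2))

# Riemann-sum integration over +-10 widths; tails beyond are negligible.
s, x0 = 1.5, 2.0
n, lo, hi = 20_000, x0 - 10 * s, x0 + 10 * s
dx = (hi - lo) / n
total = sum(gaussian(lo + i * dx, x0, s) for i in range(n + 1)) * dx
print(total)  # ~ 1.0
```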

Properties of distributions
Average (mean):
- discrete distribution: $\bar{x} = \sum_i x_i P_i$
- continuous distribution: $\bar{x} = \int x\, P(x)\, dx$
Median (or central) value: the 50% split point.
Most probable value: where P(x) is maximum.
For a symmetric distribution (e.g. Gaussian), all of the above are equal.
Mean value of an observable f(x):
- discrete: $\bar{f} = \sum_i f(x_i)\, P_i$
- continuous: $\bar{f} = \int f(x)\, P(x)\, dx$

Variance: measures the spread of a distribution
$\sigma^2 = \overline{(x - \bar{x})^2} = \overline{x^2} - \bar{x}^2$
Root mean square (RMS) or standard deviation: $\sigma = \sqrt{\overline{x^2} - \bar{x}^2}$
It has the same dimension as the mean and is used in error analysis: $x = \bar{x} \pm \sigma$.
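
Mean, variance, and RMS can be estimated from sampled data (a minimal sketch; the true mean 5.0, width 2.0, seed, and sample size are illustrative):

```python
import math
import random

random.seed(1)
s_true, x0 = 2.0, 5.0
data = [random.gauss(x0, s_true) for _ in range(200_000)]

mean = sum(data) / len(data)
# Variance: sigma^2 = <x^2> - <x>^2
var = sum(x * x for x in data) / len(data) - mean ** 2
rms = math.sqrt(var)  # standard deviation: same dimension as the mean

# Error-analysis style report: x = mean +- rms
print(f"x = {mean:.2f} +- {rms:.2f}")
```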

Examples: Uniform probability distribution on [0, a]
$\bar{x} = a/2$, $\overline{x^2} = a^2/3$, so $\sigma^2 = a^2/3 - a^2/4 = a^2/12$.
Gaussian distribution
$\bar{x} = x_0$. For the variance, assume $x_0 = 0$; then $\sigma^2 = \overline{x^2} = s^2$.
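
The uniform-distribution results can be checked by sampling (a minimal sketch; a = 3.0, the seed, and the sample size are illustrative):

```python
import random

random.seed(2)
a = 3.0
N = 400_000
xs = [random.uniform(0.0, a) for _ in range(N)]

mean = sum(xs) / N                            # expect a/2
var = sum(x * x for x in xs) / N - mean ** 2  # expect a^2/12
```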

Detour: Gaussian integrals
Fundamental integral: $\int_{-\infty}^{\infty} e^{-\lambda x^2}\, dx = \sqrt{\pi/\lambda}$
Introduce derivatives with respect to $\lambda$ to generate the even moments, e.g.
$\int_{-\infty}^{\infty} x^2\, e^{-\lambda x^2}\, dx = -\frac{d}{d\lambda}\sqrt{\pi/\lambda} = \frac{1}{2}\sqrt{\pi}\,\lambda^{-3/2}$

Addition and multiplication rules
Addition rule for exclusive events: $P(x_i \text{ or } x_j) = P_i + P_j$
Multiplication rule for independent variables x and y:
- discrete: $P(x_i \text{ and } y_j) = P(x_i)\, P(y_j)$
- continuous: $P(x, y) = P(x)\, P(y)$
Example: 2D Gaussian distribution. If both variables have the same width s,
$P(x, y) = \frac{1}{2\pi s^2}\, e^{-(x^2 + y^2)/2s^2}$

In cylindrical coordinates, this distribution becomes
$P(x, y)\, dx\, dy = \frac{1}{2\pi s^2}\, e^{-r^2/2s^2}\, r\, dr\, d\varphi$
Since it is independent of $\varphi$, we can integrate it out:
$P(r) = \frac{r}{s^2}\, e^{-r^2/2s^2}$
Mean and variance: $\bar{r} = \sqrt{\pi/2}\, s$, $\overline{r^2} = 2s^2$.
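
A sampling check of the radial distribution (a minimal sketch; s = 1 and the sample size are illustrative choices):

```python
import math
import random

random.seed(3)
s = 1.0
N = 200_000

# Sample (x, y) from independent Gaussians and form r = sqrt(x^2 + y^2).
rs = [math.hypot(random.gauss(0, s), random.gauss(0, s)) for _ in range(N)]

mean_r = sum(rs) / N                  # expect s * sqrt(pi/2) ~ 1.2533 s
mean_r2 = sum(r * r for r in rs) / N  # expect 2 s^2
```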

3D-Gaussian distribution with equal widths s
$P(x, y, z) = (2\pi s^2)^{-3/2}\, e^{-r^2/2s^2}, \qquad r^2 = x^2 + y^2 + z^2$
In spherical coordinates we can integrate out $\theta$ and $\varphi$:
$P(r) = 4\pi r^2\, (2\pi s^2)^{-3/2}\, e^{-r^2/2s^2}$
Here r refers to the magnitude of a vector physical variable, e.g. position, velocity, etc.

Mean and variance of the 3D-Gaussian distribution:
$\bar{r} = \sqrt{8/\pi}\, s, \qquad \overline{r^2} = 3s^2$

Most probable value for a 3D-Gaussian distribution
Set $dP/dr = 0$, which gives $r_{mp} = \sqrt{2}\, s$.
Summary of the properties of a 3D-Gaussian distribution:
$r_{mp} = \sqrt{2}\, s \;<\; \bar{r} = \sqrt{8/\pi}\, s \;<\; r_{rms} = \sqrt{3}\, s$
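
The most probable value can be located numerically by maximizing P(r) on a grid (a minimal sketch; s = 1 and the grid spacing are illustrative):

```python
import math

def P(r, s=1.0):
    """Radial distribution of a 3D Gaussian: 4*pi*r^2 * (2*pi*s^2)^(-3/2) * exp(-r^2/2s^2)."""
    A = (2.0 * math.pi * s * s) ** -1.5
    return 4.0 * math.pi * r * r * A * math.exp(-r * r / (2.0 * s * s))

# Scan r on a fine grid; dP/dr = 0 predicts r_mp = sqrt(2) * s.
grid = [i * 1e-4 for i in range(1, 50_000)]
r_mp = max(grid, key=P)
print(r_mp)  # ~ sqrt(2) ~ 1.4142
```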

Binomial distribution
If the probability of throwing a head is p and a tail is q (p + q = 1), then the probability of throwing n heads out of N trials is given by the binomial distribution:
$P(n) = \binom{N}{n}\, p^n\, q^{N-n}$
The powers of p and q in the above equation are self-evident; the prefactor can be found from combinatorics. An explicit construction for the 1D random walk (or tossing of coins) is shown on the next page.

Explicit construction of the binomial distribution
There is only 1 way to get all H or all T; N ways to get 1 T and (N-1) H (or vice versa); N(N-1)/2 ways to get 2 T and (N-2) H; and in general
$\frac{N(N-1)(N-2)\cdots(N-n+1)}{n!} = \binom{N}{n}$
ways to get n T and (N-n) H (the binomial coefficient).

Properties of the binomial distribution
Normalization follows from the binomial theorem:
$\sum_{n=0}^{N} P(n) = (p + q)^N = 1$
Average number of heads:
$\bar{n} = \sum_n n\, P(n) = Np$
For $p = q = 1/2$, $\bar{n} = N/2$.
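
Both properties can be verified directly (a minimal sketch; N = 20 and p = 0.3 are illustrative values):

```python
import math

def binom_P(n, N, p):
    """P(n) = C(N, n) p^n q^(N-n), with q = 1 - p."""
    q = 1.0 - p
    return math.comb(N, n) * p ** n * q ** (N - n)

N, p = 20, 0.3
probs = [binom_P(n, N, p) for n in range(N + 1)]

total = sum(probs)                                      # binomial theorem: (p+q)^N = 1
mean = sum(n * Pn for n, Pn in enumerate(probs))        # expect N*p = 6
```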

Average position in a 1D random walk after N steps
With n steps to the right and N - n to the left, the position is $m = n - (N - n) = 2n - N$, so
$\bar{m} = 2\bar{n} - N = N(p - q)$, which vanishes for $p = q = 1/2$.
For large N, the probability of landing exactly at the mean is actually quite small. To find the spread, calculate the variance.

Hence the variance is
$\sigma_n^2 = \overline{n^2} - \bar{n}^2 = Npq$
To find the spread in position, we use $m = 2n - N$:
$\sigma_m = 2\sigma_n = 2\sqrt{Npq}$
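
A direct simulation of the unbiased walk confirms the spread (a minimal sketch; the step count, number of trials, and seed are illustrative):

```python
import math
import random

random.seed(4)
N_steps, p = 400, 0.5  # unbiased walk: step +1 with probability p, else -1
trials = 4_000

def walk():
    return sum(1 if random.random() < p else -1 for _ in range(N_steps))

ends = [walk() for _ in range(trials)]
mean_m = sum(ends) / trials                            # expect N(p - q) = 0
var_m = sum(m * m for m in ends) / trials - mean_m ** 2
spread = math.sqrt(var_m)  # expect 2*sqrt(N*p*q) = sqrt(N) = 20 here
```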

Large N limit of the binomial distribution
Mean collision times of molecules in liquids are of the order of a picosecond, so in macroscopic observations N is a very large number. To find the limiting form of P(n), Taylor expand its logarithm around the mean:
$\ln P(n) = \ln P(\bar{n}) + \frac{1}{2}\, \frac{d^2 \ln P}{dn^2}\Big|_{\bar{n}}\, (n - \bar{n})^2 + \cdots$
Use Stirling's formula for ln(n!) at large n:
$\ln n! \approx n \ln n - n$

Substituting the derivatives in the expansion of ln P(n) gives
$\frac{d^2 \ln P}{dn^2}\Big|_{\bar{n}} = -\frac{1}{n} - \frac{1}{N - n}\Big|_{n = Np} = -\frac{1}{Npq}$

Thus the large N limit of the binomial distribution is the Gaussian distribution
$P(n) = \frac{1}{\sqrt{2\pi Npq}}\, e^{-(n - \bar{n})^2 / 2Npq}$
Here $\bar{n} = Np$ is the mean value, and the width and normalization are $\sigma = \sqrt{Npq}$ and $A = 1/\sqrt{2\pi Npq}$.
For the position variable $m = 2n - N$ we have $\bar{m} = N(p - q)$ and $\sigma_m = 2\sqrt{Npq}$.
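
The quality of this limit can be quantified by comparing the two distributions point by point (a minimal sketch; N = 100 and p = 0.5 are illustrative):

```python
import math

def binom_P(n, N, p):
    q = 1.0 - p
    return math.comb(N, n) * p ** n * q ** (N - n)

def gauss_P(n, N, p):
    """Large-N limit: mean Np, variance Npq."""
    q = 1.0 - p
    mu, var = N * p, N * p * q
    return math.exp(-((n - mu) ** 2) / (2 * var)) / math.sqrt(2 * math.pi * var)

N, p = 100, 0.5
worst = max(abs(binom_P(n, N, p) - gauss_P(n, N, p)) for n in range(N + 1))
print(worst)  # worst-case difference is of order 1e-4 at N = 100
```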

How good is the Gaussian approximation?
(Figure: bars show the binomial distribution with p = q for N = and N = 14; solid lines show the corresponding Gaussian distributions.)

2. Thermal motion
Ideal gas law: $PV = NkT$
Macroscopic observables:
P: pressure, V: volume, T: temperature, N: number of molecules
k = 1.38 × 10⁻²³ J/K (Boltzmann constant)
At room temperature (T_r = 298 K), kT_r = 4.1 × 10⁻²¹ J = 4.1 pN·nm (kT_r provides a convenient energy scale for biomolecular systems).
The combination NkT suggests that the kinetic energy of individual molecules is about kT. To link the macroscopic properties to molecular ones, we need an estimate of the pressure at the molecular level.

Derivation of the average kinetic energy from the ideal gas law
Consider a cubic box of side L filled with N gas molecules. The pressure on the walls arises from the collisions of the molecules.
Momentum transfer to the y-z wall in one collision: $\Delta p = 2m v_x$
Average time between collisions with that wall: $\Delta t = 2L / v_x$
Force on the wall due to a single collision: $F = \Delta p / \Delta t = m v_x^2 / L$

In general, velocities have a distribution, so we take an average.
Average force due to one molecule: $m \langle v_x^2 \rangle / L$
Average force due to N molecules: $N m \langle v_x^2 \rangle / L$
Pressure on the wall: $P = N m \langle v_x^2 \rangle / L^3 = N m \langle v_x^2 \rangle / V$
Generalizing to all walls: $\langle v_x^2 \rangle = \langle v^2 \rangle / 3$, so $PV = \tfrac{1}{3} N m \langle v^2 \rangle$.
Comparing with PV = NkT gives the average kinetic energy:
$\langle \tfrac{1}{2} m v^2 \rangle = \tfrac{3}{2} kT$
Equipartition theorem: the mean energy associated with each degree of freedom is kT/2.

Distribution of speeds
(Figure: experimental setup with a velocity filter, and measured speed distributions for Tl atoms at T = 944 K and T = 870 K, compared with the 3D-Gaussian distribution.)

Velocities in a gas have a Gaussian distribution (Maxwell):
$P(v_x) = \sqrt{\frac{m}{2\pi kT}}\; e^{-m v_x^2 / 2kT}$
The rms is $\sqrt{kT/m}$ per component, or $v_{rms} = \sqrt{3kT/m}$ in 3D.
Distribution of speeds (3D-Gaussian):
$P(v) = 4\pi v^2 \left(\frac{m}{2\pi kT}\right)^{3/2} e^{-m v^2 / 2kT}$
This is the probability of a molecule having speed v regardless of direction.

Example: the most common gas molecule, N₂
With m = 28 u at room temperature, $v_{rms} = \sqrt{3kT/m} \approx 515$ m/s.
Oxygen is 16/14 times heavier, so for the O₂ molecule scale the above results by $\sqrt{14/16} \approx 0.94$.
Hydrogen is 14 times lighter, so for H₂ scale the above results by $\sqrt{14} \approx 3.7$.
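
These speeds follow from equipartition, $\tfrac{1}{2} m \langle v^2 \rangle = \tfrac{3}{2} kT$ (a minimal numerical sketch; the function name and the choice T = 298 K are illustrative):

```python
import math

k = 1.38e-23   # Boltzmann constant, J/K
T = 298.0      # room temperature, K
u = 1.66e-27   # atomic mass unit, kg

def v_rms(mass_u):
    """Equipartition: (1/2) m <v^2> = (3/2) kT  ->  v_rms = sqrt(3kT/m)."""
    return math.sqrt(3.0 * k * T / (mass_u * u))

vN2 = v_rms(28)  # ~ 515 m/s
vO2 = v_rms(32)  # N2 value scaled by sqrt(14/16)
vH2 = v_rms(2)   # N2 value scaled by sqrt(14)
```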

Generalise the Maxwell distribution to N molecules
(use the multiplication rule, assuming they move independently):
$P(v_1, \dots, v_N) \propto e^{-\sum_i m v_i^2 / 2kT} = e^{-E_{kin}/kT}$
This is simply the Boltzmann distribution for N non-interacting particles. In general the particles interact, so there is also position dependence:
$P \propto e^{-E/kT}, \qquad E = \sum_i \frac{m v_i^2}{2} + U(r_1, \dots, r_N)$
The universality of the Gaussian distribution arises from the quadratic nature of E.

Activation barriers and relaxation to equilibrium
(Figure: speed distribution for boiling water at two different temperatures.)
When water is boiled, molecules with sufficient kinetic energy evaporate: those with kinetic energy above the barrier can escape. Removing the most energetic molecules creates a non-equilibrium state.
Arrhenius rate law: $\text{rate} \propto e^{-E_{barrier}/kT}$
E_barrier: activation barrier.

The equilibrium state (i.e. the Gaussian distribution) is restored via molecular collisions. Injecting very fast molecules into a box of molecules produces an initial spike on top of the Gaussian distribution. Gas molecules collide like billiard balls (energy and momentum are conserved), so in each collision the fast molecules lose energy to the slower ones (friction).

3. Entropy, Temperature and Free Energy
Entropy is a measure of disorder in a closed system. When a system goes from an ordered to a disordered state, entropy increases and information is lost. The two quantities are intimately linked, and sometimes it is easier to think in terms of information loss or gain. Consider any of the following 2-state systems:
- Tossing of N coins
- Random walk in 1D (N steps)
- Box of N gas molecules divided into 2 parts
- N spin-1/2 particles with magnetic moment m in a magnetic field B
Each of these systems can be described by a binomial distribution.

There are 2^N states in total, but only N+1 are distinct.
Introduce the number of states with a given (N₁, N₂), N₁ + N₂ = N:
$\Omega(N_1, N_2) = \frac{N!}{N_1!\, N_2!}$
We define the disorder (or information content) as
$I = K \ln \Omega$
For the 2-state system, assuming large N and using Stirling's formula, we obtain Shannon's formula:
$I = -NK\,(p_1 \ln p_1 + p_2 \ln p_2), \qquad p_i = N_i/N$

Thus the amount of disorder per event is
$I/N = -K\,(p_1 \ln p_1 + p_2 \ln p_2)$
I vanishes for either p₁ = 1 or p₂ = 1 (zero disorder, maximum information), and it is maximum for p₁ = p₂ = 1/2 (maximum disorder, minimum information).
Generalization to m levels:
$I/N = -K \sum_{i=1}^{m} p_i \ln p_i$
Maximum disorder when all p_i are equal; zero disorder when one of them is 1.
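
Shannon's formula is easy to evaluate directly (a minimal sketch; taking K = 1 and base-2 logarithms, so disorder is measured in bits, is an illustrative convention):

```python
import math

def disorder_per_event(ps):
    """I/N = -K * sum(p_i * log p_i); here K = 1 with log base 2 (bits)."""
    return -sum(p * math.log2(p) for p in ps if p > 0.0)

fair = disorder_per_event([0.5, 0.5])     # maximum for 2 states: 1 bit
certain = disorder_per_event([1.0, 0.0])  # zero disorder, full information
m_level = disorder_per_event([0.25] * 4)  # m equal levels: log2(m) = 2 bits
```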

Entropy
Statistical postulate: an isolated system evolves to thermal equilibrium. Equilibrium is attained when the probability distribution of microstates has the maximum disorder (i.e. entropy).
The entropy of a physical system is defined as
$S = k \ln \Omega$
Entropy of an ideal gas: the total energy is
$E = \sum_{i=1}^{N} \frac{p_i^2}{2m}$
so the allowed momenta lie on a sphere of radius $\sqrt{2mE}$ in 3N dimensions.

The area of such a sphere is proportional to $r^{3N-1} \approx r^{3N}$, hence the area of the 3N-dimensional sphere in momentum space scales as $(2mE)^{3N/2}$. The number of allowed states is given by the phase space integral
$\Omega \propto \frac{V^N (2mE)^{3N/2}}{h^{3N}}$
(h: Planck's constant; the prefactor involves the area of a unit sphere in 3N dimensions). Taking the logarithm yields the Sackur-Tetrode formula for the entropy of an ideal gas.

Temperature
Consider an isolated system of two parts, A and B, whose total energy is conserved. If the energies are not equal and we allow exchange of energy via a small membrane, how will the energy evolve? It evolves toward the partition of maximum disorder (maximum entropy).

Example: N_A = N_B = 10. Relative fluctuations in energy are proportional to $1/\sqrt{N}$.

Definition of temperature
At equilibrium: $\frac{dS_A}{dE_A} = \frac{dS_B}{dE_B}$
In general, temperature is defined by $\frac{1}{T} = \frac{dS}{dE}$ (zeroth law of thermodynamics: systems in equilibrium with the same bath share the same T).
Free energy of a microscopic system "a" in a thermal bath:
$F_a = \langle E_a \rangle - T S_a = -kT \ln Z$
Average energy: $\langle E_a \rangle = \frac{1}{Z} \sum_i E_i\, e^{-E_i/kT}$
Partition function: $Z = \sum_i e^{-E_i/kT}$

Example: 1D harmonic oscillator in a heat bath
$E = \frac{p^2}{2m} + \frac{1}{2} m \omega^2 x^2$
Both terms are quadratic, so each contributes kT/2 on average: $\langle E \rangle = kT$.
In 3D: $\langle E \rangle = 3kT$.
Equipartition of energy: each degree of freedom has kT/2 of energy on average.

Free energy of the harmonic oscillator
Classically, $Z = \frac{kT}{\hbar\omega}$, so $F = -kT \ln Z = -kT \ln\frac{kT}{\hbar\omega}$.
Entropy: $S = -\frac{\partial F}{\partial T}$

Harmonic oscillator in quantum mechanics
Energy levels: $E_n = \hbar\omega\,(n + \tfrac{1}{2}), \quad n = 0, 1, 2, \dots$
Partition function: $Z = \sum_n e^{-E_n/kT} = \frac{e^{-\hbar\omega/2kT}}{1 - e^{-\hbar\omega/kT}}$
Free energy: $F = -kT \ln Z = \frac{\hbar\omega}{2} + kT \ln\!\left(1 - e^{-\hbar\omega/kT}\right)$
Entropy: $S = -\frac{\partial F}{\partial T}$

To calculate the average energy, let $\beta = 1/kT$:
$\langle E \rangle = -\frac{\partial \ln Z}{\partial \beta} = \frac{\hbar\omega}{2} + \frac{\hbar\omega}{e^{\hbar\omega/kT} - 1}$
Using $\langle E_a \rangle$ yields the same entropy expression.
Classical limit ($kT \gg \hbar\omega$): $\langle E \rangle \to kT$.
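
Both limits of the quantum average energy can be checked numerically (a minimal sketch; working in units where the level spacing ħω = 1 is an illustrative choice):

```python
import math

hbar_omega = 1.0  # work in units of the level spacing

def avg_E(kT):
    """<E> = hbar*w/2 + hbar*w / (exp(hbar*w/kT) - 1)."""
    x = hbar_omega / kT
    return hbar_omega / 2 + hbar_omega / math.expm1(x)

# Classical limit kT >> hbar*w: <E> -> kT (equipartition).
ratios = [avg_E(kT) / kT for kT in (10.0, 100.0, 1000.0)]
print(ratios)  # each ratio approaches 1.0

# Low-temperature limit kT << hbar*w: <E> -> hbar*w/2 (zero-point energy).
low = avg_E(0.05)
```

`math.expm1(x)` computes exp(x) - 1 accurately for small x, which keeps the high-temperature limit numerically stable.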
