1
Statistical Physics Notes

1. Probability

Discrete distributions. A variable x takes n discrete values, \{x_i, i = 1, \ldots, n\} (e.g. tossing a coin or a die). After N events we obtain a distribution of counts, \{N_1, N_2, \ldots, N_n\}.

Probability: P_i = N_i / N (in the limit N \to \infty)
Normalization: \sum_{i=1}^{n} P_i = 1
Markovian assumption: successive events are independent.

2
Continuous distributions. A variable x can take any value in a continuous interval [a, b] (most physical variables: position, velocity, temperature, etc.). Partition the interval into small bins of width dx. If we measure dN(x) events in the interval [x, x + dx], the probability is

P(x)\,dx = dN(x) / N

Normalization: \int_a^b P(x)\,dx = 1

3
Examples:

Uniform probability distribution: P(x) = 1/a for x in [0, a], zero otherwise. (Figure: a flat distribution of height 1/a over an interval of width a.)

Gaussian distribution: P(x) = A\, e^{-(x - x_0)^2 / 2\sigma^2}
A: normalization constant, x_0: position of the maximum, \sigma: width of the distribution.

4
Normalization constant: require A \int_{-\infty}^{\infty} e^{-(x - x_0)^2 / 2\sigma^2}\,dx = 1. Substitute u = (x - x_0)/\sqrt{2}\sigma, giving A\sqrt{2}\sigma \int e^{-u^2}\,du = A\sqrt{2\pi}\,\sigma = 1, so A = 1/\sqrt{2\pi}\,\sigma.

Normalized Gaussian distribution: P(x) = \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-(x - x_0)^2 / 2\sigma^2}

5
Properties of distributions

Average (mean):
discrete distribution: \langle x \rangle = \sum_i x_i P_i
continuous distribution: \langle x \rangle = \int x\, P(x)\,dx

Median (or central) value: the 50% split point.
Most probable value: where P(x) is maximum.
For a symmetric distribution (e.g. Gaussian) all of the above are equal.

Mean value of an observable f(x):
discrete: \langle f \rangle = \sum_i f(x_i) P_i
continuous: \langle f \rangle = \int f(x)\, P(x)\,dx

6
Variance measures the spread of a distribution:
\sigma^2 = \langle (x - \langle x \rangle)^2 \rangle = \langle x^2 \rangle - \langle x \rangle^2

Root mean square (RMS) or standard deviation: \sigma = \sqrt{\langle x^2 \rangle - \langle x \rangle^2}
It has the same dimension as the mean and is used in error analysis: x = \langle x \rangle \pm \sigma.
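These definitions are easy to check numerically. The sketch below (plain Python, illustrative helper names of my choosing) computes the mean and standard deviation of points spread uniformly over [0, 1], where the exact values are 1/2 and variance 1/12:

```python
import math

def mean_and_std(xs):
    """Sample mean and RMS deviation, as defined above:
    variance = <x^2> - <x>^2, std = sqrt(variance)."""
    n = len(xs)
    mean = sum(xs) / n
    var = sum(x * x for x in xs) / n - mean ** 2
    return mean, math.sqrt(var)

# Midpoints of 1000 bins on [0, 1]: mean -> 1/2, variance -> 1/12
xs = [(i + 0.5) / 1000 for i in range(1000)]
m, s = mean_and_std(xs)
print(round(m, 4), round(s ** 2, 4))  # ≈ 0.5 and ≈ 0.0833
```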

7
Examples:

Uniform probability distribution on [0, a]: \langle x \rangle = a/2, \langle x^2 \rangle = a^2/3, so \sigma^2 = a^2/3 - a^2/4 = a^2/12.

Gaussian distribution: \langle x \rangle = x_0. For the variance, assume x_0 = 0; then \langle x^2 \rangle = \sigma^2, i.e. the width parameter \sigma is the standard deviation.

8
Detour: Gaussian integrals

Fundamental integral: \int_{-\infty}^{\infty} e^{-x^2}\,dx = \sqrt{\pi}

Introduce I(a) = \int_{-\infty}^{\infty} e^{-a x^2}\,dx = \sqrt{\pi/a}. Differentiating with respect to a generates the even moments, e.g. \int x^2 e^{-a x^2}\,dx = -dI/da = \frac{1}{2}\sqrt{\pi/a^3}.
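A quick numerical sanity check of I(a) = sqrt(pi/a) (a minimal midpoint-rule sketch; the truncation width and grid size are choices of mine, not from the notes):

```python
import math

def gauss_integral(a, half_width=50.0, n=200000):
    """Midpoint-rule estimate of the integral of exp(-a x^2) over
    (-inf, inf), truncated to [-half_width, half_width]."""
    h = 2 * half_width / n
    total = 0.0
    for i in range(n):
        x = -half_width + (i + 0.5) * h
        total += math.exp(-a * x * x)
    return total * h

# Compare against the closed form I(a) = sqrt(pi / a)
for a in (1.0, 2.0):
    print(gauss_integral(a), math.sqrt(math.pi / a))
```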

9
Addition and multiplication rules

Addition rule for exclusive events: P(x_i \text{ or } x_j) = P_i + P_j
Multiplication rule for independent variables x and y:
discrete: P(x_i, y_j) = P(x_i) P(y_j)
continuous: P(x, y) = P(x) P(y)

Example: 2D Gaussian distribution. If \sigma_x = \sigma_y = \sigma and x_0 = y_0 = 0,
P(x, y) = \frac{1}{2\pi\sigma^2}\, e^{-(x^2 + y^2)/2\sigma^2}

10
In cylindrical coordinates this distribution becomes P(r, \phi) = \frac{r}{2\pi\sigma^2}\, e^{-r^2/2\sigma^2}. Since it is independent of \phi, we can integrate it out:
P(r) = \frac{r}{\sigma^2}\, e^{-r^2/2\sigma^2}

Mean and variance: \langle r \rangle = \sqrt{\pi/2}\,\sigma, \langle r^2 \rangle = 2\sigma^2, so \sigma_r^2 = (2 - \pi/2)\sigma^2.

11
3D Gaussian distribution with equal \sigma's:
P(x, y, z) = (2\pi\sigma^2)^{-3/2}\, e^{-r^2/2\sigma^2}, \quad r^2 = x^2 + y^2 + z^2

In spherical coordinates we can integrate out \theta and \phi:
P(r) = 4\pi r^2\, (2\pi\sigma^2)^{-3/2}\, e^{-r^2/2\sigma^2}

Here r refers to the magnitude of a vector physical variable, e.g. position, velocity, etc.

12
Mean and variance of the 3D Gaussian distribution:
\langle r \rangle = \sqrt{8/\pi}\,\sigma, \quad \langle r^2 \rangle = 3\sigma^2, \quad \sigma_r^2 = (3 - 8/\pi)\sigma^2

13
Most probable value for a 3D Gaussian distribution: set dP(r)/dr = 0, which gives r_{mp} = \sqrt{2}\,\sigma.

Summary of the properties of a 3D Gaussian distribution:
r_{mp} = \sqrt{2}\,\sigma < \langle r \rangle = \sqrt{8/\pi}\,\sigma < r_{rms} = \sqrt{3}\,\sigma
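The radial moments of the 3D Gaussian can be verified by direct quadrature (a sketch with sigma = 1 and an integration cutoff chosen by me; with these units the expected values are normalization 1, mean sqrt(8/pi), and mean square 3):

```python
import math

def p3d(r, sigma=1.0):
    """Radial density of a 3D Gaussian with equal widths:
    P(r) = 4*pi*r^2 * (2*pi*sigma^2)^(-3/2) * exp(-r^2 / (2*sigma^2))."""
    norm = (2 * math.pi * sigma ** 2) ** -1.5
    return 4 * math.pi * r * r * norm * math.exp(-r * r / (2 * sigma ** 2))

# Midpoint-rule moments over [0, 12*sigma]
n, rmax = 200000, 12.0
h = rmax / n
m0 = m1 = m2 = 0.0
for i in range(n):
    r = (i + 0.5) * h
    w = p3d(r) * h
    m0 += w
    m1 += r * w
    m2 += r * r * w

print(m0)                          # normalization, should be ≈ 1
print(m1, math.sqrt(8 / math.pi)) # <r> vs sqrt(8/pi)*sigma
print(m2, 3.0)                     # <r^2> vs 3*sigma^2
```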

14
Binomial distribution

If the probability of throwing a head is p and a tail is q (p + q = 1), then the probability of throwing n heads out of N trials is given by the binomial distribution:
P(n) = \frac{N!}{n!(N-n)!}\, p^n q^{N-n}

The powers of p and q in this equation are self-evident. The prefactor can be found from combinatorics; an explicit construction for a 1D random walk or tossing of coins is shown on the next slide.

15
Explicit construction of the binomial distribution. There is only 1 way to get all H or all T; N ways to get 1 T and (N-1) H (or vice versa); N(N-1)/2 ways to get 2 T and (N-2) H; and in general N(N-1)(N-2)\cdots(N-n+1)/n! = \binom{N}{n} ways to get n T and (N-n) H (the binomial coefficient).

16
Properties of the binomial distribution:

Normalization follows from the binomial theorem:
\sum_{n=0}^{N} P(n) = \sum_{n=0}^{N} \binom{N}{n} p^n q^{N-n} = (p + q)^N = 1

Average value of heads: \langle n \rangle = \sum_n n\, P(n) = Np (obtained, e.g., by applying p\,\partial/\partial p to the binomial theorem). For p = q = 1/2, \langle n \rangle = N/2.
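Both properties can be confirmed directly (a short sketch; N and p are arbitrary values I picked for illustration):

```python
import math

def binom_pmf(n, N, p):
    """P(n) = C(N, n) p^n q^(N-n), the binomial distribution above."""
    return math.comb(N, n) * p ** n * (1 - p) ** (N - n)

N, p = 20, 0.3
probs = [binom_pmf(n, N, p) for n in range(N + 1)]
mean_n = sum(n * pr for n, pr in zip(range(N + 1), probs))
print(sum(probs))   # normalization: (p + q)^N = 1
print(mean_n, N * p)  # <n> = Np
```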

17
Average position in a 1D random walk after N steps: with n steps right and N - n steps left, the position is m = n - (N - n) = 2n - N, so \langle m \rangle = 2\langle n \rangle - N = N(p - q). For p = q = 1/2, \langle m \rangle = 0. For large N, however, the probability of landing exactly at \langle m \rangle is actually quite small. To find the spread, calculate the variance.

18
Hence the variance is
\sigma_n^2 = \langle n^2 \rangle - \langle n \rangle^2 = Npq

To find the spread in position, we use m = 2n - N:
\sigma_m^2 = 4\sigma_n^2 = 4Npq, \quad \sigma_m = 2\sqrt{Npq} = \sqrt{N} \text{ for } p = q = 1/2
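A simulation makes the sqrt(N) spread concrete (a sketch; the number of walks, step count, and seed are arbitrary choices of mine, so the results are statistical estimates rather than exact):

```python
import math
import random

random.seed(1)

def walk(N):
    """Position after N unbiased +/-1 steps (p = q = 1/2)."""
    return sum(random.choice((-1, 1)) for _ in range(N))

N, trials = 400, 4000
positions = [walk(N) for _ in range(trials)]
mean = sum(positions) / trials
rms = math.sqrt(sum(x * x for x in positions) / trials)
print(mean)                 # should be near 0
print(rms, math.sqrt(N))    # spread should be near sqrt(N) = 20
```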

19
Large N limit of the binomial distribution. Mean collision times of molecules in liquids are of the order of picoseconds, so in macroscopic observations N is a very large number. To find the limiting form of P(n), Taylor expand its logarithm around the mean:
\ln P(n) = \ln P(\bar{n}) + \frac{1}{2} \left.\frac{d^2 \ln P}{dn^2}\right|_{\bar{n}} (n - \bar{n})^2 + \cdots
(the first derivative vanishes at the maximum). Use Stirling's formula for \ln(n!) at large n: \ln n! \approx n \ln n - n.

20
Substituting the derivatives in the expansion of \ln P(n) gives
\left.\frac{d^2 \ln P}{dn^2}\right|_{\bar{n}} = -\frac{1}{Npq}, \quad\text{so}\quad P(n) \approx P(\bar{n})\, e^{-(n - \bar{n})^2 / 2Npq}

21
Thus the large N limit of the binomial distribution is the Gaussian distribution
P(n) = \frac{1}{\sqrt{2\pi Npq}}\, e^{-(n - \bar{n})^2 / 2Npq}
Here \bar{n} = Np is the mean value, and the width and normalization constant are \sigma = \sqrt{Npq} and 1/\sqrt{2\pi Npq}. For the position variable m = 2n - N we have \bar{m} = N(p - q) and \sigma_m = 2\sqrt{Npq}.

22
How good is the Gaussian approximation? (Figure: bars show the binomial distribution with p = q for N = 4 and N = 14; solid lines show the corresponding Gaussian distribution.)
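The comparison in the figure can be reproduced numerically; the sketch below (helper names are mine) measures the worst-case pointwise difference between the binomial distribution with p = q = 1/2 and its Gaussian limit, which shrinks as N grows:

```python
import math

def binom_pmf(n, N):
    """Binomial distribution with p = q = 1/2."""
    return math.comb(N, n) * 0.5 ** N

def gauss_pmf(n, N):
    """Large-N Gaussian limit: mean N/2, variance Npq = N/4."""
    var = N / 4
    return math.exp(-((n - N / 2) ** 2) / (2 * var)) / math.sqrt(2 * math.pi * var)

errs = {N: max(abs(binom_pmf(n, N) - gauss_pmf(n, N)) for n in range(N + 1))
        for N in (4, 14, 100)}
print(errs)  # maximum discrepancy decreases with N
```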

23
2. Thermal motion

Ideal gas law: PV = NkT
Macroscopic observables: P: pressure, V: volume, T: temperature, N: number of molecules
k = 1.38 \times 10^{-23} J/K (Boltzmann constant)
At room temperature (T_r = 298 K), kT_r = 4.1 \times 10^{-21} J = 4.1 pN nm (kT_r provides a convenient energy scale for biomolecular systems).
The combination NkT suggests that the kinetic energy of individual molecules is about kT. To link the macroscopic properties to molecular ones, we need an estimate of pressure at the molecular level.

24
Derivation of the average kinetic energy from the ideal gas law. Consider a cubic box of side L filled with N gas molecules. The pressure on the walls arises from the collisions of the molecules.

Momentum transfer to the y-z wall: \Delta p = 2mv_x
Average time between collisions with that wall: \Delta t = 2L/v_x
Force on the wall due to a single collision: F = \Delta p / \Delta t = mv_x^2 / L

25
In general, velocities have a distribution, so we take an average.

Average force due to one molecule: \langle F \rangle = m\langle v_x^2 \rangle / L
Average force due to N molecules: Nm\langle v_x^2 \rangle / L
Pressure on the wall: P = Nm\langle v_x^2 \rangle / L^3 = Nm\langle v_x^2 \rangle / V
Generalizing to all walls: \langle v_x^2 \rangle = \langle v^2 \rangle / 3, so PV = \frac{1}{3} Nm\langle v^2 \rangle = NkT
Average kinetic energy: \frac{1}{2} m\langle v^2 \rangle = \frac{3}{2} kT
Equipartition theorem: the mean energy associated with each degree of freedom is kT/2.

26
Distribution of speeds. (Figure: experimental set-up with a velocity filter; experimental results for Tl atoms at T = 944 K (open circles) and T = 870 K (filled circles), compared with the 3D Gaussian distribution (solid line); the speed axis is in reduced units.)

27
Velocities in a gas have a Gaussian distribution (Maxwell):
P(v_x) = \sqrt{\frac{m}{2\pi kT}}\, e^{-mv_x^2/2kT}
The rms value per component is \sqrt{\langle v_x^2 \rangle} = \sqrt{kT/m}.

Distribution of speeds (3D Gaussian):
P(v) = 4\pi v^2 \left(\frac{m}{2\pi kT}\right)^{3/2} e^{-mv^2/2kT}
This is the probability of a molecule having speed v regardless of direction.

28
Example: the most common gas molecule, N_2, has v_{rms} = \sqrt{3kT/m} \approx 515 m/s at room temperature. Oxygen is 32/28 = 16/14 times heavier, so for the O_2 molecule, scale the above result by \sqrt{14/16}. Hydrogen is 14 times lighter, so for H_2, scale it by \sqrt{14}.
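These speeds follow directly from \frac{1}{2}m\langle v^2 \rangle = \frac{3}{2}kT; a minimal sketch (molecular masses in atomic mass units, room temperature taken as 298 K):

```python
import math

K_B = 1.380649e-23       # Boltzmann constant, J/K
AMU = 1.66053906660e-27  # atomic mass unit, kg
T = 298.0                # room temperature, K

def v_rms(mass_amu):
    """v_rms = sqrt(3kT/m), from <m v^2 / 2> = 3kT/2."""
    return math.sqrt(3 * K_B * T / (mass_amu * AMU))

for name, m in (("N2", 28), ("O2", 32), ("H2", 2)):
    print(name, round(v_rms(m)), "m/s")
```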

29
Generalizing the Maxwell distribution to N molecules (using the multiplication rule, assuming they move independently):
P(v_1, \ldots, v_N) \propto e^{-\sum_i m v_i^2 / 2kT} = e^{-K/kT}
This is simply the Boltzmann distribution for N non-interacting particles. In general the particles interact, so there is also position dependence:
P \propto e^{-E/kT}, \quad E = K + U(r_1, \ldots, r_N)
The universality of the Gaussian distribution arises from the quadratic nature of E in the velocities.

30
Activation barriers and relaxation to equilibrium

When water boils, molecules with sufficient kinetic energy evaporate: those with K.E. > E_barrier can escape. Removing the most energetic molecules creates a non-equilibrium state. (Figure: speed distributions for boiling water at two different temperatures.)

Arrhenius rate law: rate \propto e^{-E_{barrier}/kT}, where E_barrier is the activation barrier.

31
The equilibrium state (i.e. the Gaussian distribution) is restored via molecular collisions. Injecting very fast molecules into a box of molecules produces an initial spike on top of the Gaussian distribution. Gas molecules collide like billiard balls (energy and momentum are conserved), so in each collision the fast molecules lose energy to the slower ones (friction).

32
3. Entropy, Temperature and Free Energy

Entropy is a measure of disorder in a closed system. When a system goes from an ordered to a disordered state, entropy increases and information is lost. The two quantities are intimately linked, and it is sometimes easier to think in terms of information loss or gain.

Consider any of the following 2-state systems:
- Tossing of N coins
- Random walk in 1D (N steps)
- Box of N gas molecules divided into 2 parts
- N spin-1/2 particles with magnetic moment \mu in a magnetic field B

Each of these systems can be described by a binomial distribution.

33
There are 2^N states in total, but only N + 1 are distinct. Introduce the number of states with a given (N_1, N_2), N_1 + N_2 = N:
\Omega(N_1, N_2) = \frac{N!}{N_1!\, N_2!}
We define the disorder (or information content) per event as I = \frac{1}{N} \log_2 \Omega. For the 2-state system, assuming large N and using Stirling's formula, we obtain Shannon's formula:
I = -p_1 \log_2 p_1 - p_2 \log_2 p_2, \quad p_i = N_i / N

34
Thus the amount of disorder per event is I = -p_1 \log_2 p_1 - p_2 \log_2 p_2. I vanishes for either p_1 = 1 or p_2 = 1 (zero disorder, maximum information), and is maximum for p_1 = p_2 = 1/2 (maximum disorder, minimum information).

Generalization to m levels: I = -\sum_{i=1}^{m} p_i \log_2 p_i. Disorder is maximum when all p_i are equal, and zero when one of them is 1.
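Shannon's formula is a one-liner in code; the sketch below (function name mine) checks the limiting cases stated above:

```python
import math

def disorder(ps):
    """Shannon disorder per event, I = -sum p_i log2 p_i (with 0 log 0 := 0)."""
    return -sum(p * math.log2(p) for p in ps if p > 0)

print(disorder([1.0, 0.0]))   # 0 bits: zero disorder, maximum information
print(disorder([0.5, 0.5]))   # 1 bit: maximum disorder for 2 states
print(disorder([0.25] * 4))   # 2 bits: maximum disorder for 4 states
```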

35
Entropy

Statistical postulate: an isolated system evolves to thermal equilibrium. Equilibrium is attained when the probability distribution of microstates has the maximum disorder (i.e. entropy).

The entropy of a physical system is defined as S = k \ln \Omega.

Entropy of an ideal gas: the total energy is E = \sum_{i=1}^{N} \frac{p_i^2}{2m}, which defines a sphere of radius \sqrt{2mE} in the 3N-dimensional momentum space.

36
The area of such a sphere is proportional to r^{3N-1} \approx r^{3N}, hence the accessible area in momentum space scales as (2mE)^{3N/2}. The number of allowed states is given by the phase-space integral
\Omega \propto \frac{V^N (2mE)^{3N/2}}{h^{3N}}
(up to the area of a unit sphere in 3N dimensions; h is Planck's constant). Taking the logarithm leads to the Sackur-Tetrode formula for the entropy of an ideal gas.

37
Temperature. Consider an isolated system divided into two parts A and B; the total energy E = E_A + E_B is conserved. If the energies are not equal and we allow exchange of energy via a small membrane, how will the energy evolve? It evolves toward the partition of energy that maximizes the total disorder \Omega_A \Omega_B.

38
Example: N_A = N_B = 10. Fluctuations in energy about the most probable partition are proportional to 1/\sqrt{N}.

39
Definition of temperature: \frac{1}{T} = \frac{\partial S}{\partial E}. At equilibrium, T_A = T_B (zeroth law of thermodynamics).

Free energy of a microscopic system "a" in a thermal bath:
Partition function: Z = \sum_i e^{-E_i/kT}
Average energy: \langle E \rangle = \frac{1}{Z} \sum_i E_i\, e^{-E_i/kT} = -\frac{\partial \ln Z}{\partial \beta}, \quad \beta = 1/kT
Free energy: F_a = -kT \ln Z
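The partition-function machinery is easy to exercise on a toy example. The sketch below (a hypothetical two-level system with gap 1, in units where k = 1, both my choices) computes Z, \langle E \rangle, and F at several temperatures:

```python
import math

K_B = 1.0  # work in units where Boltzmann's constant is 1

def thermo(levels, T):
    """Partition function Z, average energy <E>, and free energy F = -kT ln Z
    for a microscopic system with the given energy levels in a bath at T."""
    z = sum(math.exp(-e / (K_B * T)) for e in levels)
    e_avg = sum(e * math.exp(-e / (K_B * T)) for e in levels) / z
    f = -K_B * T * math.log(z)
    return z, e_avg, f

# Two-level system with gap 1: <E> -> 0 as T -> 0, <E> -> 1/2 as T -> infinity
for T in (0.1, 1.0, 100.0):
    print(T, thermo([0.0, 1.0], T))
```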

40
Example: 1D harmonic oscillator in a heat bath.
E = \frac{p^2}{2m} + \frac{1}{2}\kappa x^2, \quad \langle E \rangle = \frac{kT}{2} + \frac{kT}{2} = kT
In 3D: \langle E \rangle = 3kT.
Equipartition of energy: each degree of freedom has kT/2 of energy on average.

41
Free energy of the classical harmonic oscillator:
Z = \int \frac{dx\,dp}{h}\, e^{-E/kT} = \frac{kT}{\hbar\omega}
Free energy: F = -kT \ln \frac{kT}{\hbar\omega}
Entropy: S = -\frac{\partial F}{\partial T} = k\left(\ln \frac{kT}{\hbar\omega} + 1\right)

42
Harmonic oscillator in quantum mechanics. Energy levels: E_n = (n + 1/2)\hbar\omega, n = 0, 1, 2, \ldots
Z = \sum_n e^{-\beta E_n} = \frac{e^{-\beta\hbar\omega/2}}{1 - e^{-\beta\hbar\omega}}
Free energy: F = -kT \ln Z = \frac{\hbar\omega}{2} + kT \ln(1 - e^{-\hbar\omega/kT})
Entropy: S = -\frac{\partial F}{\partial T} = -k \ln(1 - e^{-\hbar\omega/kT}) + \frac{\hbar\omega}{T}\,\frac{1}{e^{\hbar\omega/kT} - 1}

43
To calculate the average energy, let \beta = 1/kT and use \langle E \rangle = -\partial \ln Z / \partial \beta:
\langle E \rangle = \frac{\hbar\omega}{2} + \frac{\hbar\omega}{e^{\hbar\omega/kT} - 1}
Using S = (\langle E \rangle - F)/T yields the same entropy expression. Classical limit kT \gg \hbar\omega: \langle E \rangle \to kT, recovering equipartition.
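The classical limit can be checked numerically; a minimal sketch in units where \hbar\omega = 1 (my choice), showing \langle E \rangle approaching the ground-state energy at low kT and kT at high kT:

```python
import math

HBAR_OMEGA = 1.0  # work in units where the level spacing is 1

def avg_energy(kT):
    """<E> = hbar*w/2 + hbar*w / (exp(hbar*w/kT) - 1),
    from <E> = -d(ln Z)/d(beta) with Z = e^(-x/2) / (1 - e^(-x))."""
    x = HBAR_OMEGA / kT
    return HBAR_OMEGA / 2 + HBAR_OMEGA / math.expm1(x)

for kT in (0.1, 1.0, 10.0, 100.0):
    print(kT, avg_energy(kT))  # high kT: <E> approaches kT (equipartition)
```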
