Dr Roger Bennett Rm. 23 Xtn. 8559 Lecture 15.

Dr Roger Bennett Rm. 23 Xtn. 8559 Lecture 15

Free expansion

No temperature change means no change in the kinetic energy distribution. The only physical difference is that the atoms have more space in which to move: there are more ways in which the atoms may be arranged in the larger volume. Statistical mechanics takes this viewpoint and analyses how many different microscopic states can give rise to the same macroscopic properties.

Statistical View

The constraints on the system (U, V and n) define its macroscopic state (macrostate). We need to know how many microscopic states (microstates, or quantum states) satisfy the macrostate. A microstate is a state in which everything that can in principle be known about the system is known. The number of microstates that give rise to a macrostate is called the thermodynamic probability, Ω, of that macrostate (alternatively the statistical weight, W). The macrostate with the largest thermodynamic probability dominates. The essential assumption of statistical mechanics is that each microstate is equally likely.

Statistical View

Boltzmann's hypothesis: the entropy is a function of the statistical weight (thermodynamic probability), S = φ(W). Suppose we have two systems A and B with entropies S_A and S_B respectively. We expect the total entropy of the two systems to be S_AB = S_A + S_B (entropy is extensive). Now think about the probabilities: W_AB = W_A × W_B. So S_AB = φ(W_A) + φ(W_B) = φ(W_AB) = φ(W_A W_B).

Statistical View

Boltzmann's hypothesis: S_AB = φ(W_A) + φ(W_B) = φ(W_AB) = φ(W_A W_B). The only functions that behave like this are logarithms, so

S = k ln(W)   (the Boltzmann relation).

The microscopic viewpoint thus interprets the increase in entropy of an isolated system as a consequence of the natural tendency of the system to move from a less probable to a more probable macrostate.

Expansion of an ideal gas - microscopic

Consider the expansion of an ideal gas initially contained in volume V. U and T are unchanged, no work is done, and no heat flows. Yet the entropy increases: what is the physical basis?

Expansion of an ideal gas - microscopic

Split the volume into elemental cells of size ΔV. The number of ways of placing one atom in the volume is V/ΔV, so the number of ways of placing n atoms is W = (V/ΔV)^n, giving S = nk ln(V/ΔV). Is this right? It appears to depend on the size of ΔV.

Expansion of an ideal gas - microscopic

Is this right? It seems to depend on the size of ΔV, but we only ever measure changes in entropy, and ΔV cancels:

S_f - S_i = nk(ln(V_f/ΔV) - ln(V_i/ΔV)) = nk ln(V_f/V_i).

Doubling the volume gives ΔS = nk ln(2) for n atoms, i.e. R ln(2) per mole.
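
As a quick numerical check (a Python sketch; the one-mole example and variable names are mine, not from the lecture), the unknown cell size ΔV drops out and only the volume ratio matters:

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
N_A = 6.02214076e23  # Avogadro's number, 1/mol
R = k_B * N_A        # gas constant, J/(mol K)

n_atoms = N_A        # take one mole of atoms as an illustration
V_i, V_f = 1.0, 2.0  # only the ratio V_f/V_i matters; the cell size dV cancels

# S = n k ln(V/dV) at each volume; the unknown dV drops out of the difference.
dS = n_atoms * k_B * math.log(V_f / V_i)
print(f"dS     = {dS:.4f} J/K")                 # ~5.7632 J/K
print(f"R ln 2 = {R * math.log(2):.4f} J/K")    # identical, per mole
```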

Statistical mechanics

We have seen that the entropy of a system is related to the probability of its state: entropy is a statistical phenomenon. To calculate thermal properties we must combine our knowledge of the particles that make up a system with these statistical ideas. Statistical mechanics starts from conceptually simple ideas but evolves into a powerful and general tool. The first and cornerstone concept is a clear understanding of probability.

Probability

Two common versions:
- Classical probability: the power to predict the likely outcome of an experiment from first principles.
- Statistical probability: by repeated measurement we can determine the probability of an experimental outcome from its frequency of occurrence. This relies on the system being in equilibrium and on the existence of well defined frequencies.

Classical Probability

- We must determine all the possible outcomes and assign equal probabilities to each.
- Why equal probabilities? Surely not all outcomes are equally likely? We ensure that they are by examining the system in the finest possible detail, so that each outcome corresponds to a single elementary event.
- By definition, no further refinement would let us specify the state of the system in any finer detail. This is the microstate (or quantum state) of the system.
- We have already done this in the example above, by boxing the atoms of a gas into small volumes ΔV.

Example of Classical Probability

Heads and tails: coin toss, best of 3. The possible outcomes are HHH, THH, HTH, HHT, TTH, THT, HTT, TTT; each of these is a microstate. What is the probability of exactly two heads? 3 microstates out of a total of 8 possible outcomes have exactly two heads, so the probability is 3/8. This is easy: we can count the microstates directly.

Example of Classical Probability

Heads and tails: coin toss, best of 30. What is the probability of 30 heads? Only one microstate has 30 H, but how many microstates are possible in total? 2^30, about 10^9. The probability of H on each toss is ½, so 30 tosses give P(30 H) = (½)^30. What is the probability of 5 heads and 25 tails? How many ways can this occur; that is, how many microstates are there?

Example of Classical Probability

Heads and tails: coin toss, best of 30. What is the probability of 5 heads and 25 tails? The number of ways this can occur, i.e. the number of microstates, is C(30,5) = 142,506. Each microstate is equally likely, so the probability is 142,506 × (½)^30 = 1.3×10^-4. Similarly, P(15 H and 15 T) = 155,117,520 × (½)^30 ≈ 0.14.
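
These counts are easy to reproduce (a sketch using Python's math.comb):

```python
from math import comb

tosses = 30
total = 2 ** tosses          # number of equally likely microstates, ~1.07e9

for heads in (30, 5, 15):
    ways = comb(tosses, heads)   # microstates with exactly this many heads
    print(f"{heads:2d} heads: {ways:>12,} microstates, P = {ways / total:.3g}")
```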

Microstates in a configuration

The number of microstates in a configuration of distinguishable particles is

W = N! / Π_i n_i!

where N is the total number of particles (or events, options, etc.), n_i is the number of particles in the i-th distinct state, and Π denotes a product (cf. Σ for a sum). E.g. how many distinct anagrams are there of STATISTICS? N = 10 letters, with repeats S:3, T:3, I:2, A:1, C:1, so W = 10!/(3! 3! 2! 1! 1!) = 50,400.
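
A short sketch of the same count (the helper name microstates is my choice, not from the lecture):

```python
from collections import Counter
from functools import reduce
from math import factorial

def microstates(counts):
    """W = N! / prod(n_i!) for N distinguishable objects in groups of size n_i."""
    N = sum(counts)
    return factorial(N) // reduce(lambda acc, n: acc * factorial(n), counts, 1)

letters = Counter("STATISTICS")        # S:3, T:3, I:2, A:1, C:1
print(microstates(letters.values()))   # 10!/(3! 3! 2! 1! 1!) = 50400
```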

Dr Roger Bennett Rm. 23 Xtn. 8559 Lecture 16

Microstates in a configuration (recap)

As at the end of Lecture 15: the number of microstates of N distinguishable particles with n_i in the i-th distinct state is W = N!/Π_i n_i!; e.g. STATISTICS has 10!/(3! 3! 2!) = 50,400 distinct anagrams.

Equilibrium

Take an isolated system partitioned into two subsystems, with

U = U_1 + U_2, V = V_1 + V_2, N = N_1 + N_2.

The statistical weight W of the entire system (the total number of microstates) is the product of the weights of the subsystems:

W(U,V,N; U_1,V_1,N_1) = W_1(U_1,V_1,N_1) × W_2(U_2,V_2,N_2)
S(U,V,N; U_1,V_1,N_1) = S_1(U_1,V_1,N_1) + S_2(U_2,V_2,N_2)

Equilibrium

S(U,V,N; U_1,V_1,N_1) = S_1(U_1,V_1,N_1) + S_2(U_2,V_2,N_2). Now let the subsystems exchange energy through the wall (the same methodology works for exchange of particles or volume). Use the Clausius entropy principle (at equilibrium the entropy is a maximum) with U_1 as the independent variable, holding all other terms fixed. Since dU_2 = -dU_1,

∂S/∂U_1 = ∂S_1/∂U_1 - ∂S_2/∂U_2 = 0, so ∂S_1/∂U_1 = ∂S_2/∂U_2 at equilibrium.

Equilibrium

∂S_1/∂U_1 = ∂S_2/∂U_2 is the condition for thermal equilibrium: the two subsystems must be at the same temperature. We can now define an absolute temperature for each subsystem i:

1/T_i = (∂S_i/∂U_i) at constant V_i, N_i.

At equilibrium all subsystems are at the same temperature.
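
To illustrate (a toy numerical sketch, not from the lecture; it assumes an ideal-gas-like entropy S = (3/2)Nk ln U in natural units), maximising the total entropy over the energy split between two subsystems lands exactly where their temperatures are equal:

```python
import numpy as np

k_B = 1.0  # natural units

def S(U, N):
    # Toy entropy S = (3/2) N k ln U; constants independent of U are
    # irrelevant for locating the maximum.
    return 1.5 * N * k_B * np.log(U)

N1, N2, U_tot = 2.0, 6.0, 8.0
U1 = np.linspace(0.01, U_tot - 0.01, 200001)
S_tot = S(U1, N1) + S(U_tot - U1, N2)   # total entropy as a function of U_1

U1_eq = U1[np.argmax(S_tot)]
T1 = U1_eq / (1.5 * N1 * k_B)           # from 1/T = dS/dU = (3/2) N k / U
T2 = (U_tot - U1_eq) / (1.5 * N2 * k_B)
print(f"U1 at the entropy maximum: {U1_eq:.3f} (expect {U_tot * N1 / (N1 + N2):.3f})")
print(f"T1 = {T1:.3f}, T2 = {T2:.3f}   (equal at equilibrium)")
```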

Example: The Schottky Defect

At absolute zero all atoms in a crystal are perfectly ordered on a crystal lattice. Raising the temperature introduces point defects. Schottky defects are atoms displaced from the lattice that end up on the surface, leaving vacancies behind.

Schottky Defect

What is the concentration of defects in a crystal at thermal equilibrium? Creating a defect costs energy ε, so the energy U associated with n such defects is U = nε. Assumptions? The defects are dilute and so do not interact. We can now use our understanding of probability to investigate the configurational entropy.

Schottky Defects

Configurational entropy: how many ways are there to distribute n defects among a crystal of N atoms? We can count the number of microstates:

W = N! / (n! (N - n)!)

Schottky Defects

At equilibrium, and remembering U = nε,

1/T = ∂S/∂U = (1/ε) ∂S/∂n,

which leaves us with just the derivative of the entropy to calculate. As crystals contain very large numbers of atoms we can approximate the factorials with Stirling's formula, ln N! ≈ N ln N - N.
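
How good is Stirling's formula? A quick check (a sketch; math.lgamma(N+1) gives ln N! exactly):

```python
import math

# Compare ln N! (exact, via lgamma) with Stirling's approximation N ln N - N.
for N in (10, 100, 10_000, 1_000_000):
    exact = math.lgamma(N + 1)        # lgamma(N+1) = ln(N!)
    approx = N * math.log(N) - N
    rel_err = abs(exact - approx) / exact
    print(f"N = {N:>9,}: ln N! = {exact:14.4f}, N ln N - N = {approx:14.4f}, "
          f"relative error = {rel_err:.1e}")
```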

Schottky Defects

With Stirling's formula,

S = k ln W ≈ k [N ln N - n ln n - (N - n) ln(N - n)],

so

1/T = (k/ε) ∂/∂n [N ln N - n ln n - (N - n) ln(N - n)] = (k/ε) ln((N - n)/n).

Rearranging gives n/(N - n) = e^(-ε/kT), and for dilute defects (n << N) the concentration is n/N ≈ e^(-ε/kT).

For typical values of ε = 1 eV, and kT at room temperature ~ 1/40 eV, we find the density of Schottky defects to be n/N = e^(-40) ≈ 4×10^-18. At 1000 K, kT ≈ 1/11.6 eV and n/N ≈ e^(-11.6) ≈ 9×10^-6.
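
A sketch reproducing these estimates (using the exact kT at 300 K rather than the rounded 1/40 eV, hence a slightly larger value than e^-40):

```python
import math

eps = 1.0        # defect formation energy, eV (typical value from the lecture)
k_B = 8.617e-5   # Boltzmann constant, eV/K

for T in (300.0, 1000.0):
    ratio = math.exp(-eps / (k_B * T))   # n/N = exp(-eps/kT), dilute limit
    print(f"T = {T:6.0f} K: n/N = {ratio:.1e}")
```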

Dr Roger Bennett Rm. 23 Xtn. 8559 Lecture 17

Equilibrium for an isolated system

Take an isolated system partitioned into two subsystems, with U = U_1 + U_2, V = V_1 + V_2, N = N_1 + N_2. The statistical weight W of the entire system (the total number of microstates) is the product of the weights of the subsystems:

W(U,V,N; U_1,V_1,N_1) = W_1(U_1,V_1,N_1) × W_2(U_2,V_2,N_2)
S(U,V,N; U_1,V_1,N_1) = S_1(U_1,V_1,N_1) + S_2(U_2,V_2,N_2)

Equilibrium for an isolated system

S = S_1 + S_2 as before. Now let the wall move, changing the volumes so as to maximise the entropy (the same methodology as before). Use the Clausius entropy principle (at equilibrium the entropy is a maximum) with V_1 as the independent variable, holding all other terms fixed:

∂S/∂V_1 = ∂S_1/∂V_1 - ∂S_2/∂V_2 = 0, so ∂S_1/∂V_1 = ∂S_2/∂V_2.

Equilibrium for an isolated system

∂S_1/∂V_1 = ∂S_2/∂V_2 is the condition for mechanical equilibrium: the two subsystems must be at the same pressure once the wall has moved to maximise the entropy. We can now define pressure:

P/T = (∂S/∂V) at constant U, N.

Equilibrium for an isolated system

S = S_1 + S_2 as before. Now let particles pass through the wall (the same methodology as before). Use the Clausius entropy principle (at equilibrium the entropy is a maximum) with N_1 as the independent variable, holding all other terms fixed:

∂S/∂N_1 = ∂S_1/∂N_1 - ∂S_2/∂N_2 = 0, so ∂S_1/∂N_1 = ∂S_2/∂N_2.

Equilibrium for an isolated system

This is the condition for particle equilibrium: the two subsystems have no net driving force to exchange particles. We can now define the driving force for particle exchange, the chemical potential:

μ = -T (∂S/∂N) at constant U, V.

Equilibrium for a system in a heat bath

Take a combined system partitioned into two subsystems, the system of interest (U, V, N) and a reservoir (T_R, V_R, N_R): the same setup as before, with U_0 = U + U_R. The combined system is again totally isolated, and we assume T, V, N describe the macrostate of the system of interest. The system will possess a discrete set of microstates, however, which we can group and label according to the energy of each microstate.

Equilibrium for a system in a heat bath

By grouping the microstates by energy we can associate a statistical weight with each energy level, U_1 < U_2 < U_3 < … < U_r < …. The total energy of the composite system is conserved, so U_0 = U + U_R. The probability of finding our system with U = U_r must be proportional to the number of microstates available to the reservoir when it has energy U_R = U_0 - U_r:

p_r = const × W(U_0 - U_r)

(all volumes and particle numbers held constant).

Equilibrium for a system in a heat bath

p_r = const × W(U_0 - U_r). The constant of proportionality depends only on the total number of available microstates, so the distribution can be properly normalised:

p_r = W(U_0 - U_r) / Σ_r W(U_0 - U_r).

We can also write W(U_0 - U_r) in terms of entropy:

W(U_0 - U_r) = exp[S(U_0 - U_r)/k].

Equilibrium for a system in a heat bath

So far we haven't used the fact that the reservoir is a heat bath: its energy U_0 >> U_r, the energy of our system. This is not true for all states r, but it is true for all overwhelmingly likely states. We therefore expand S(U_0 - U_r) as a Taylor series:

S(U_0 - U_r) = S(U_0) - U_r (∂S/∂U)|_{U_0} + (U_r²/2) (∂²S/∂U²)|_{U_0} - …

Equilibrium for a system in a heat bath

The first term, S(U_0), is a constant. The second term is related to the temperature as before, through (∂S/∂U)|_{U_0} = 1/T. The third term describes changes in the temperature of the heat bath due to energy exchange with the system; by the definition of a heat bath, this and all higher terms must be negligible. We keep terms up to linear in U_r.

Equilibrium for a system in a heat bath

Keeping terms to linear order, S(U_0 - U_r) ≈ S(U_0) - U_r/T, so

p_r ∝ exp[S(U_0)/k] e^(-U_r/kT) ∝ e^(-U_r/kT).

Normalising,

p_r = e^(-U_r/kT) / Z, where Z = Σ_r e^(-U_r/kT).

The Boltzmann Distribution

p_r = e^(-U_r/kT)/Z is the Boltzmann distribution: it gives the probability that a system in contact with a heat bath at temperature T is in a particular state r. The only property of the heat bath on which it depends is the temperature. The function Z is called the partition function of the system; it is fundamental to the study of systems at fixed temperature.

Dr Roger Bennett Rm. 23 Xtn. 8559 Lecture 18

The Boltzmann Distribution

p_r = e^(-U_r/kT)/Z, where r labels the states of the system. At low temperature only the lowest states have any chance of being occupied. As the temperature is raised, higher-lying states become more and more likely to be occupied. In contact with a heat bath, therefore, the microstates are not all equally likely to be populated.

The Boltzmann Distribution - Example

Take a very simple system that has only three energy levels, each corresponding to one microstate (non-degenerate). The energies of these states are U_1 = 0 J, U_2 = 1.4×10^-23 J, U_3 = 2.8×10^-23 J. If the heat bath has a temperature of 2 K, then kT ≈ 2.8×10^-23 J and

Z = e^0 + e^(-1/2) + e^(-1) = 1.974.

The probabilities of being in each state are p_1 = 0.506, p_2 = 0.307 and p_3 = 0.186.
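
Reproducing this example numerically (a sketch; the level energies are written as 0, kT/2 and kT to match the round exponents above):

```python
import math

k_B = 1.380649e-23                       # Boltzmann constant, J/K
T = 2.0                                  # heat-bath temperature, K
U = [0.0, 0.5 * k_B * T, 1.0 * k_B * T]  # 0, ~1.4e-23 J, ~2.8e-23 J

weights = [math.exp(-u / (k_B * T)) for u in U]   # Boltzmann factors e^(-U_r/kT)
Z = sum(weights)                                  # partition function
probs = [w / Z for w in weights]

print(f"Z = {Z:.3f}")                    # 1.974
for r, p in enumerate(probs, start=1):
    print(f"p_{r} = {p:.3f}")            # 0.506, 0.307, 0.186
```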

The Boltzmann Distribution

Usually there are huge numbers of microstates that all have the same energy; this is called degeneracy. In this case we can carry out the summations above over the individual energy levels rather than over the individual microstates:

p(U_r) = g(U_r) e^(-U_r/kT) / Z, with Z = Σ_{U_r} g(U_r) e^(-U_r/kT),

where the summation now runs over all the distinct energies U_r, and g(U_r) is the number of states possessing the energy U_r. The probability p(U_r) is that of finding the system with energy U_r.
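
A minimal sketch with hypothetical degeneracies (the two-level numbers are illustrative, not from the lecture) showing how g(U_r) enters:

```python
import math

def partition_function(levels, kT):
    """Z = sum over energy levels of g(U) * exp(-U/kT)."""
    return sum(g * math.exp(-U / kT) for g, U in levels)

# Hypothetical system: non-degenerate ground state plus a 3-fold degenerate
# excited level one unit of kT above it (illustrative numbers only).
levels = [(1, 0.0), (3, 1.0)]
kT = 1.0
Z = partition_function(levels, kT)
for g, U in levels:
    p = g * math.exp(-U / kT) / Z
    print(f"U = {U}: g = {g}, p(U) = {p:.3f}")
# Degeneracy can make a higher level more probable overall: here p(1) > p(0).
```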

Entropy in ensembles

A system embedded in a heat bath is called a canonical ensemble (an isolated system on its own, as in Lecture 16, is termed a microcanonical ensemble). When isolated, the microcanonical ensemble has a well defined internal energy, so the probability of finding the system in any particular microstate is the same for all microstates. In a heat bath the energy of the system fluctuates and the probabilities of the individual microstates are not equal. Can we still calculate the entropy of such a system, and hence derive thermodynamic variables from statistical properties?

Entropy in the canonical ensemble

Embed our system in a heat bath made up of (M - 1) replicas of the subsystem we're interested in. Each subsystem may be in one of many microstates; let the number of subsystems in the i-th microstate be n_i. The number of ways of arranging n_1 systems in microstate 1, n_2 systems in microstate 2, n_3 … is

W_M = M! / Π_i n_i!

Entropy in the canonical ensemble

If we make M huge, so that all the n_i are also large, we can use Stirling's approximation to calculate the entropy S_M of the entire ensemble of M systems:

S_M = k ln W_M = k ln M! - k Σ_i ln n_i!

Entropy in the canonical ensemble

Applying Stirling's formula, ln N! ≈ N ln N - N, and using Σ_i n_i = M:

S_M ≈ k (M ln M - M) - k Σ_i (n_i ln n_i - n_i) = -k M Σ_i (n_i/M) ln(n_i/M).

Entropy in the canonical ensemble

As M becomes very large, the ratio n_i/M tends to the probability p_i of finding the subsystem in state i. S_M is the entropy of the ensemble of all the subsystems; but entropy is extensive and scales with the size of the system, so the entropy per system is

S = S_M / M = -k Σ_i p_i ln p_i.

Entropy in the canonical ensemble

S = -k Σ_i p_i ln p_i is the general definition of entropy and holds even when the probabilities of the individual microstates differ. If all W microstates are equally probable, p_i = 1/W (the microcanonical ensemble), then

S = -k Σ_i (1/W) ln(1/W) = k ln W,

which brings us nicely back to the Boltzmann relation.
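
A sketch of the general formula, checking that the uniform case recovers k ln W and that a non-uniform distribution over the same states gives less entropy:

```python
import math

def gibbs_entropy(probs, k=1.0):
    """S = -k sum_i p_i ln p_i; states with p_i = 0 contribute nothing."""
    return -k * sum(p * math.log(p) for p in probs if p > 0)

W = 8
print(gibbs_entropy([1.0 / W] * W))   # uniform: k ln W = ln 8 = 2.079...
print(math.log(W))                    # same value

# A non-uniform distribution over the same 8 states has lower entropy:
print(gibbs_entropy([0.5, 0.3, 0.1, 0.05, 0.03, 0.01, 0.005, 0.005]))  # ~1.29
```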

Entropy in the canonical ensemble

The general definition of entropy, in combination with the Boltzmann distribution, allows us to calculate real properties of the system. Substituting p_i = e^(-U_i/kT)/Z into S = -k Σ_i p_i ln p_i gives

S = -k Σ_i p_i (-U_i/kT - ln Z) = Ū/T + k ln Z,

so that Ū - TS = -kT ln Z.

Helmholtz Free Energy

Ū is the average internal energy of the system, so Ū - TS is the average Helmholtz free energy F, a function of state we met briefly in earlier lectures: F = -kT ln Z. This result is central to statistical mechanics. The partition function Z has appeared naturally in it, and is clearly much more than a mere normalising factor: Z acts as a bridge linking the microscopic world of microstates (quantum states) to the free energy, and hence to all the large-scale properties of a system.
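
A closing sketch (arbitrary illustrative energies, natural units) verifying the identity Ū - TS = -kT ln Z numerically:

```python
import math

k_B, T = 1.0, 2.0                 # natural units
U_levels = [0.0, 1.0, 3.0]        # arbitrary microstate energies, illustrative

weights = [math.exp(-u / (k_B * T)) for u in U_levels]
Z = sum(weights)
probs = [w / Z for w in weights]

U_avg = sum(p * u for p, u in zip(probs, U_levels))    # mean energy, U-bar
S = -k_B * sum(p * math.log(p) for p in probs)         # Gibbs entropy
print(f"U - TS   = {U_avg - T * S:.6f}")
print(f"-kT ln Z = {-k_B * T * math.log(Z):.6f}")      # identical, as derived above
```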