
1 Dr Roger Bennett R.A.Bennett@Reading.ac.uk Rm. 23 Xtn. 8559 Lecture 15

2 Free expansion No temperature change means no change in kinetic energy distribution. The only physical difference is that the atoms have more space in which to move. We may imagine that there are more ways in which the atoms may be arranged in the larger volume. Statistical mechanics takes this viewpoint and analyses how many different states are possible that give rise to the same macroscopic properties.

3 Statistical View The constraints on the system (U, V and n) define the macroscopic state of the system (macrostate). We need to know how many microscopic states (microstates or quantum states) satisfy the macrostate. A microstate for a system is one for which everything that can in principle be known is known. The number of microstates that give rise to a macrostate is called the thermodynamic probability, Ω, of that macrostate (alternatively the statistical weight W). The largest thermodynamic probability dominates. The essential assumption of statistical mechanics is that each microstate is equally likely.

4 Statistical View Boltzmann's Hypothesis: the entropy is a function of the statistical weight or thermodynamic probability, S = ø(W). If we have two systems A and B with entropies S_A and S_B respectively, then we expect the total entropy of the two systems to be S_AB = S_A + S_B (entropy is extensive). Think about the probabilities: W_AB = W_A × W_B. So S_AB = ø(W_A) + ø(W_B) = ø(W_AB) = ø(W_A W_B).

5 Statistical View Boltzmann's Hypothesis: S_AB = ø(W_A) + ø(W_B) = ø(W_AB) = ø(W_A W_B). The only functions that behave like this are logarithms, so S = k ln(W) – the Boltzmann relation. The microscopic viewpoint thus interprets the increase in entropy for an isolated system as a consequence of the natural tendency of the system to move from a less probable to a more probable state.
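
The slide asserts, without proof, that only logarithms have this property. A one-line justification (my addition, assuming ø is differentiable):

```latex
% Differentiate \phi(W_A W_B) = \phi(W_A) + \phi(W_B) with respect to W_A, then set W_A = 1:
W_B\,\phi'(W_B) = \phi'(1) \equiv k
\;\Rightarrow\; \phi(W) = k\ln W + c,
\qquad
\phi(1\cdot 1) = 2\phi(1) \Rightarrow \phi(1) = 0 \Rightarrow c = 0 .
```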

6 Expansion of an ideal gas - microscopic Expansion of an ideal gas contained in volume V. U and T are unchanged; no work is done and no heat flows. Entropy increases – what is the physical basis?

7 Expansion of an ideal gas - microscopic Split the volume into elemental cells of size ΔV. Number of ways of placing one atom in the volume is V/ΔV. Number of ways of placing n atoms is –W = (V/ΔV)^n, so S = k ln W = nk ln(V/ΔV) –Is this right? It depends on the size of ΔV.

8 Expansion of an ideal gas - microscopic Is this right? It depends on the size of ΔV. Yes – we only ever measure changes in entropy, and ΔV cancels: S_f − S_i = nk(ln(V_f/ΔV) − ln(V_i/ΔV)) = nk ln(V_f/V_i). Doubling the volume gives ΔS = nk ln(2) = NR ln(2) (for n atoms making up N moles, nk = NR).

9 Statistical mechanics We have seen that the entropy of a system is related to the probability of its state – entropy is a statistical phenomenon. To calculate thermal properties we must combine our knowledge of the particles that make up a system with statistical methods. Statistical mechanics starts from conceptually simple ideas but evolves into a powerful and general tool. The first and cornerstone concept is a clear understanding of probability.

10 Probability Two common versions –Classical Probability: the power to predict the likely outcome of an experiment. –Statistical Probability: by repeated measurement we can determine the probability of an experimental outcome by measuring its frequency of occurrence. Relies on the system being in equilibrium and the existence of well-defined frequencies.

11 Classical Probability –We must determine all the possible outcomes and assign equal probabilities to each. –Why equal probabilities? Surely not all outcomes are equally likely? –We ensure this is the case by looking at the system in the finest possible detail, such that each outcome is a simple, elementary event. –By definition, no further refinement would enable us to define the properties of the state in any finer detail. This is the microstate or quantum state of the system. –We have already done this for the example of a gas by boxing atoms in small volumes ΔV.

12 Example of Classical Probability Heads and Tails Coin Toss – best of 3. Possible outcomes –HHH, THH, HTH, HHT, TTH, THT, HTT, TTT –Each one of these is a microstate. Probability of exactly two Heads? 3 microstates have two heads in a total of 8 possible outcomes, so the probability is 3/8. This is easy – we can count the number of microstates directly, as the sketch below does.
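
A brute-force enumeration of the 8 microstates (an illustrative sketch, not part of the original lecture):

```python
from itertools import product

# Enumerate all 2^3 microstates of three coin tosses.
microstates = list(product("HT", repeat=3))
two_heads = [m for m in microstates if m.count("H") == 2]

print(len(microstates))                   # 8 microstates in total
print(two_heads)                          # [('H','H','T'), ('H','T','H'), ('T','H','H')]
print(len(two_heads) / len(microstates))  # 0.375 = 3/8
```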

13 Example of Classical Probability Heads and Tails Coin Toss – best of 30. –What is the probability of 30 heads? –Only one microstate has 30 H, but how many microstates are possible in total? 2^30 ≈ 10^9. –The probability of a H on each toss is ½, so 30 tosses give P(30 H) = (½)^30 ≈ 9.3×10^-10. –What is the probability of 5 Heads and 25 Tails? –How many ways can this occur – how many microstates?

14 Example of Classical Probability Heads and Tails Coin Toss – best of 30. –What is the probability of 5 Heads and 25 Tails? –How many ways can this occur – how many microstates? There are 30!/(5! 25!) = 142,506 of them. –Each microstate is equally likely. –Probability = 142,506 × (½)^30 ≈ 1.3×10^-4. Prob. of 15 H and 15 T = 155,117,520 × (½)^30 ≈ 0.14.
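
A short check of these counts (illustrative, not from the lecture) using Python's math.comb:

```python
from math import comb

# Binomial counting for 30 tosses: comb(30, k) microstates have exactly k heads.
total = 2 ** 30
for k in (30, 5, 15):
    print(k, comb(30, k), comb(30, k) / total)

# 30 -> 1            9.3e-10   (one microstate: all heads)
# 5  -> 142506       1.3e-4
# 15 -> 155117520    0.144
```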

15 Microstates in a configuration No. of microstates in a configuration where particles are distinguishable: W = N!/(n_1! n_2! n_3! …) = N!/∏_i n_i!, where N is the total number of particles/events/options etc. and n_i is the no. of particles in the i-th distinct state. ∏ means product (cf. Σ for sum). E.g. how many distinct anagrams of STATISTICS are there? (See the worked example below.)
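
Working the anagram example through (my arithmetic; the slide leaves it as an exercise): STATISTICS has 10 letters, with S appearing 3 times, T 3 times, I twice, and A and C once each, so

```latex
W = \frac{10!}{3!\,3!\,2!\,1!\,1!}
  = \frac{3\,628\,800}{6 \cdot 6 \cdot 2}
  = 50\,400
```

distinct arrangements.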

16 Dr Roger Bennett R.A.Bennett@Reading.ac.uk Rm. 23 Xtn. 8559 Lecture 16

17 Microstates in a configuration No. of microstates in a configuration where particles are distinguishable: W = N!/(n_1! n_2! n_3! …) = N!/∏_i n_i!, where N is the total number of particles/events/options etc. and n_i is the no. of particles in the i-th distinct state. ∏ means product (cf. Σ for sum). E.g. how many distinct anagrams of STATISTICS? (Worked above: 50,400.)

18 Equilibrium Take an isolated system which is partitioned into two subsystems, with U = U_1 + U_2, V = V_1 + V_2, N = N_1 + N_2. The statistical weight W of the entire system (total number of microstates) is the product of the weights of the subsystems: W(U,V,N; U_1,V_1,N_1) = W_1(U_1,V_1,N_1) × W_2(U_2,V_2,N_2), and so S(U,V,N; U_1,V_1,N_1) = S_1(U_1,V_1,N_1) + S_2(U_2,V_2,N_2). [Figure: two boxes, subsystem 1 (U_1, V_1, N_1) and subsystem 2 (U_2, V_2, N_2), separated by a partition.]

19 Equilibrium S(U,V,N; U_1,V_1,N_1) = S_1(U_1,V_1,N_1) + S_2(U_2,V_2,N_2). Now let us exchange energy through the wall (the same methodology works for exchange of particles or volume). Use the Clausius entropy principle (at equilibrium the entropy is a maximum) – the independent variable is U_1, holding all other terms fixed.

20 Equilibrium This is the condition for thermal equilibrium – the two subsystems must be at the same temperature. We can now define an absolute temperature for each subsystem i. At equilibrium all subsystems are at the same temperature.
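
The equations on this slide were images and did not survive the transcript; the standard steps they presumably showed (with U_2 = U − U_1, so ∂U_2/∂U_1 = −1) are:

```latex
\frac{\partial S}{\partial U_1}
  = \frac{\partial S_1}{\partial U_1} - \frac{\partial S_2}{\partial U_2} = 0
\quad\Rightarrow\quad
\frac{\partial S_1}{\partial U_1} = \frac{\partial S_2}{\partial U_2},
\qquad
\frac{1}{T_i} \equiv \left(\frac{\partial S_i}{\partial U_i}\right)_{V_i,\,N_i}
\quad\Rightarrow\quad T_1 = T_2 .
```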

21 Example – The Schottky Defect At absolute zero all atoms in a crystal are perfectly ordered on a crystal lattice. Raising the temperature introduces point defects. Schottky defects are atoms displaced from the lattice that end up on the surface, leaving vacancies behind.

22 Schottky Defect What is the concentration of defects in a crystal at thermal equilibrium? Creation of a defect costs energy ε, so the energy U associated with n of these defects is U = nε. Assumptions? Defects are dilute and so do not interact. We can now use our understanding of probability to investigate the configurational entropy.

23 Schottky Defects Configurational entropy – how many ways are there to distribute n defects in a crystal of N atoms? We can calculate the number of microstates:
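
The formula itself was an image; for n vacancies chosen from N lattice sites the usual counting (consistent with the result quoted on slide 26) is:

```latex
W = \binom{N}{n} = \frac{N!}{n!\,(N-n)!},
\qquad
S = k \ln W = k \ln \frac{N!}{n!\,(N-n)!} .
```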

24 Schottky Defects At equilibrium, and remembering that U = nε, the condition 1/T = ∂S/∂U = (1/ε)(∂S/∂n) leaves us with just the differential of the entropy to calculate. As crystals have large numbers of atoms we can approximate the factorial functions with Stirling's formula, ln N! ≈ N ln N − N.

25 Schottky Defects
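
This slide's working was entirely images; a sketch of the standard derivation it presumably carried:

```latex
\frac{S}{k} \approx N\ln N - n\ln n - (N-n)\ln(N-n)
\quad\text{(Stirling)},
\qquad
\frac{1}{T} = \frac{1}{\epsilon}\frac{\partial S}{\partial n}
            = \frac{k}{\epsilon}\ln\frac{N-n}{n}
\quad\Rightarrow\quad
\frac{n}{N} = \frac{1}{e^{\epsilon/kT}+1} \approx e^{-\epsilon/kT}
\;\;(\epsilon \gg kT).
```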

26 For typical values of ε = 1 eV, and kT at room temperature ~ 1/40 eV, we find the density of Schottky defects to be n/N = e^-40 ≈ 10^-17. At 1000 K, n/N ~ 10^-5.
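
A quick numerical check of these figures (my sketch, not from the slides; it uses the exact Boltzmann constant rather than the round kT = 1/40 eV):

```python
from math import exp

k_eV = 8.617e-5          # Boltzmann constant in eV/K
eps = 1.0                # defect formation energy in eV (slide's typical value)

for T in (300.0, 1000.0):
    ratio = exp(-eps / (k_eV * T))   # n/N ~ exp(-eps/kT) for dilute defects
    print(f"T = {T:6.0f} K  ->  n/N = {ratio:.1e}")

# T =    300 K  ->  n/N = 1.6e-17
# T =   1000 K  ->  n/N = 9.1e-06
```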

27 Dr Roger Bennett R.A.Bennett@Reading.ac.uk Rm. 23 Xtn. 8559 Lecture 17

28 Equilibrium for an isolated system Take an isolated system which is partitioned into two subsystems, with U = U_1 + U_2, V = V_1 + V_2, N = N_1 + N_2. The statistical weight W of the entire system (total number of microstates) is the product of the weights of the subsystems: W(U,V,N; U_1,V_1,N_1) = W_1(U_1,V_1,N_1) × W_2(U_2,V_2,N_2), and so S(U,V,N; U_1,V_1,N_1) = S_1(U_1,V_1,N_1) + S_2(U_2,V_2,N_2). [Figure: two boxes, subsystem 1 (U_1, V_1, N_1) and subsystem 2 (U_2, V_2, N_2), separated by a partition.]

29 Equilibrium for an isolated system S(U,V,N; U_1,V_1,N_1) = S_1(U_1,V_1,N_1) + S_2(U_2,V_2,N_2). Now let us allow the wall to move, thus changing the volumes so as to maximise the entropy (same methodology as before). Use the Clausius entropy principle (at equilibrium the entropy is a maximum) – the independent variable is V_1, holding all other terms fixed.

30 Equilibrium for an isolated system This is the condition for mechanical equilibrium – the two subsystems must be at the same pressure, as the wall has moved to maximise the entropy. We can now define pressure:
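
Again the equations were images; the standard definition being referred to is presumably:

```latex
\frac{\partial S_1}{\partial V_1} = \frac{\partial S_2}{\partial V_2}
\quad\Rightarrow\quad
\frac{p}{T} \equiv \left(\frac{\partial S}{\partial V}\right)_{U,N},
\qquad p_1 = p_2 \ \ (\text{with } T_1 = T_2).
```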

31 Equilibrium for an isolated system S(U,V,N; U_1,V_1,N_1) = S_1(U_1,V_1,N_1) + S_2(U_2,V_2,N_2). Now let us exchange particles through the wall (same methodology as before). Use the Clausius entropy principle (at equilibrium the entropy is a maximum) – the independent variable is N_1, holding all other terms fixed.

32 Equilibrium for an isolated system This is the condition for particle equilibrium – at equilibrium the two subsystems must have no net driving force for particle exchange. We can now define the driving force to exchange particles as the chemical potential:
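
The lost equation is presumably the standard definition (the sign convention makes particles flow from high to low μ):

```latex
\frac{\partial S_1}{\partial N_1} = \frac{\partial S_2}{\partial N_2}
\quad\Rightarrow\quad
\frac{\mu}{T} \equiv -\left(\frac{\partial S}{\partial N}\right)_{U,V},
\qquad \mu_1 = \mu_2 .
```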

33 Equilibrium for system in a heat bath Take a system which is partitioned into two subsystems – the same problem as before so far, with U_0 = U + U_R. The combined system is again totally isolated, and we assume T, V, N describe the macrostate of the system. The system will possess a discrete set of microstates, however, which we could group and label according to the energy of that microstate. [Figure: small system (U, V, N) in contact with a large reservoir (T_R, V_R, N_R).]

34 Equilibrium for system in a heat bath By grouping the microstates by energy we can associate a statistical weight with each energy level, i.e. U_1 < U_2 < U_3 < U_4 … < U_r. The total energy of the composite system is conserved, so U_0 = U + U_R. The probability of finding our system with U = U_r must be proportional to the number of microstates associated with the reservoir having energy U_R = U_0 − U_r: p_r = const × W(U_0 − U_r) (all volumes and particle numbers constant).

35 Equilibrium for system in a heat bath p_r = const × W(U_0 − U_r). The constant of proportionality just depends on all the available microstates, so the distribution can be properly normalised. We can also write W(U_0 − U_r) in terms of entropy:
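
The two lost equations are presumably:

```latex
p_r = \frac{W(U_0 - U_r)}{\sum_r W(U_0 - U_r)},
\qquad
W(U_0 - U_r) = e^{\,S(U_0 - U_r)/k} .
```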

36 Equilibrium for system in a heat bath So far we haven't used the fact that the reservoir is a heat bath: its energy U_0 is much greater than our system energy U_r. This is not true for all states r, but it is true for all overwhelmingly likely states! We expand S(U_0 − U_r) as a Taylor series:
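
The expansion itself is lost from the transcript; it reads:

```latex
S(U_0 - U_r) = S(U_0)
  - U_r \left(\frac{\partial S}{\partial U}\right)_{U_0}
  + \frac{U_r^2}{2} \left(\frac{\partial^2 S}{\partial U^2}\right)_{U_0}
  - \cdots
```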

37 Equilibrium for system in a heat bath The first term is simply a constant. The second term is related to the temperature as before, through 1/T = (∂S/∂U) evaluated at U_0. The third term therefore describes changes in the temperature of the heat bath due to energy exchange with the system; by the definition of a heat bath, this and higher terms must be negligible. We keep terms up to linear in U_r.

38 Equilibrium for system in a heat bath
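
Slide 38 was all equations and none survive; combining the truncated expansion with the normalisation presumably gave:

```latex
p_r \propto e^{\,S(U_0 - U_r)/k}
    = e^{\,S(U_0)/k}\, e^{-U_r/kT}
\quad\Rightarrow\quad
p_r = \frac{e^{-U_r/kT}}{Z},
\qquad
Z = \sum_r e^{-U_r/kT} .
```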

39 The Boltzmann Distribution This is the Boltzmann distribution and gives “the probability that a system in contact with a heat bath at temperature T should be in a particular state”. The only property of the heat bath on which it depends is the temperature. The function Z is called the partition function of the system. It is fundamental to the study of systems at fixed temperature.

40 Dr Roger Bennett R.A.Bennett@Reading.ac.uk Rm. 23 Xtn. 8559 Lecture 18

41 The Boltzmann Distribution This is the Boltzmann distribution and gives “the probability that a system in contact with a heat bath at temperature T should be in a particular state”. r labels all the states of the system. At low temperature only the lowest states have any chance of being occupied. As the temperature is raised, higher-lying states become more and more likely to be occupied. In this case, in contact with the heat bath, the microstates are therefore not all equally likely to be populated.

42 The Boltzmann Distribution - Example Take a very simple system that has only three energy levels, each corresponding to one microstate (non-degenerate). The energies of these states are: –U_1 = 0 J, U_2 = 1.4×10^-23 J, U_3 = 2.8×10^-23 J If the heat bath has a temperature of 2 K, then kT ≈ 2.76×10^-23 J, so U_2/kT ≈ ½ and U_3/kT ≈ 1: Z = e^0 + e^-1/2 + e^-1 = 1.9744 The probabilities of being in each state are p_1 = 0.506, p_2 = 0.307 and p_3 = 0.186.
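
A short numerical check (illustrative; with exact constants the exponents are 0, −0.507 and −1.014, so the numbers differ slightly from the slide's rounded ones):

```python
from math import exp

k = 1.381e-23                     # Boltzmann constant, J/K
T = 2.0                           # heat-bath temperature, K
U = [0.0, 1.4e-23, 2.8e-23]       # energies of the three microstates, J

boltzmann = [exp(-u / (k * T)) for u in U]
Z = sum(boltzmann)                # partition function
p = [b / Z for b in boltzmann]

print(Z)   # ~1.965 (the slide rounds the exponents to 0, -1/2, -1, giving 1.9744)
print(p)   # ~[0.509, 0.307, 0.185]
```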

43 The Boltzmann Distribution Usually there are huge numbers of microstates that can all have the same energy: this is called degeneracy. In this case we can do our summations above over each individual energy level rather than over each individual microstate. The summation is now over all the different energies U_r, and g(U_r) is the number of states possessing the energy U_r. The probability is then that of finding the system with energy U_r.
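
In symbols (a reconstruction; the slide's own equations are lost):

```latex
p(U_r) = \frac{g(U_r)\, e^{-U_r/kT}}{Z},
\qquad
Z = \sum_{U_r} g(U_r)\, e^{-U_r/kT} .
```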

44 Entropy in ensembles Our system embedded in a heat bath is called a canonical ensemble (our isolated system on its own, from Lecture 16, is termed a microcanonical ensemble). When isolated, the microcanonical ensemble has a defined internal energy, so the probability of finding the system in a particular microstate is the same as for any other microstate. In a heat bath the energy of the system fluctuates, and the probabilities of finding particular microstates are not equal. Can we now calculate the entropy for such a system, and hence derive thermodynamic variables from statistical properties?

45 Entropy in the canonical ensemble Embed our system in a heat bath made up of (M−1) replica subsystems to the one we're interested in. Each subsystem may be in one of many microstates. The number of subsystems in the i-th microstate is n_i. The number of ways of arranging n_1 systems in microstate 1, n_2 systems in microstate 2, n_3 …:
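
The counting formula was an image; by the same argument as the distinguishable-particle result of Lecture 16 it is presumably:

```latex
W_M = \frac{M!}{\prod_i n_i!} .
```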

46 Entropy in the canonical ensemble If we make M huge, so that all the n_i are also large, then we can (eventually) use Stirling's approximation in calculating the entropy S_M for the entire ensemble of M systems.

47 Entropy in the canonical ensemble
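
Slide 47's algebra did not survive; the standard steps (using ln N! ≈ N ln N − N and Σ_i n_i = M) are presumably:

```latex
\frac{S_M}{k} = \ln\frac{M!}{\prod_i n_i!}
  \approx (M\ln M - M) - \sum_i \left(n_i\ln n_i - n_i\right)
  = M\ln M - \sum_i n_i\ln n_i
  = -M\sum_i \frac{n_i}{M}\ln\frac{n_i}{M} .
```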

48 As M becomes very large the ratios n_i/M tend to a probability p_i of finding the subsystem in state i. S_M is the entropy for the ensemble of all the subsystems. But we know that entropy is extensive and scales with the size of the system. So the entropy per system is:
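
That is (reconstructing the lost equation):

```latex
S = \frac{S_M}{M} = -k \sum_i p_i \ln p_i .
```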

49 Entropy in the canonical ensemble This is the general definition of entropy, and it holds even if the probabilities of the individual microstates are different. If all microstates are equally probable, p_i = 1/W (the microcanonical ensemble), which brings us nicely back to the Boltzmann relation.
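
Substituting p_i = 1/W for each of the W microstates:

```latex
S = -k \sum_{i=1}^{W} \frac{1}{W} \ln\frac{1}{W} = k \ln W .
```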

50 Entropy in the canonical ensemble The general definition of entropy, in combination with the Boltzmann distribution, allows us to calculate real properties of the system.
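
The substitution shown on this slide is lost; inserting p_r = e^{−U_r/kT}/Z into S = −k Σ_r p_r ln p_r presumably gave:

```latex
S = -k \sum_r p_r \left(-\frac{U_r}{kT} - \ln Z\right)
  = \frac{\bar U}{T} + k \ln Z
\quad\Rightarrow\quad
\bar U - TS = -kT \ln Z \equiv F .
```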

51 Helmholtz Free Energy Ū is the average value of the internal energy of the system, and (Ū − TS) is the average value of the Helmholtz free energy, F. This is a function of state that we briefly mentioned in earlier lectures; it is central to statistical mechanics. The partition function Z has appeared in our result – it seems to be much more than a mere normalising factor. Z acts as a bridge, linking the microscopic world of microstates (quantum states) to the free energy and hence to all the large-scale properties of a system.

