
Slide 1: Lecture 3. Entropy in statistical mechanics. Thermodynamic contacts: (i) mechanical contact, (ii) heat contact, (iii) diffusion contact. Equilibrium. Chemical potential. The main distributions in statistical mechanics. A system in the canonical ensemble. Thermostat.

"Ice melting" is a classic example of entropy increase, described in 1862 by Rudolf Clausius as an increase in the disgregation of the molecules of the body of ice.

Slide 2: Entropy in Statistical Mechanics

From the principles of thermodynamics we learn that the thermodynamic entropy S has the following important properties:

- dS is an exact differential and is equal to DQ/T for a reversible process, where DQ is the quantity of heat added to the system.
- Entropy is additive: S = S1 + S2. The entropy of the combined system is the sum of the entropies of the two separate parts.
- ΔS ≥ 0: if the state of a closed system is given macroscopically at any instant, the most probable state at any other instant is one of equal or greater entropy.

Slide 3: State Function

This statement means that entropy is a state function: the value of the entropy does not depend on the past history of the system but only on its actual state. One of the great accomplishments of statistical mechanics is to give us a physical picture of entropy. The entropy σ of a system (in classical statistical physics) in statistical equilibrium can be defined as

\sigma = \ln \Gamma ,    (3.1)

where Γ is the volume of phase space accessible to the system, i.e., the volume corresponding to energies between E − Δ and E + Δ.
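As a concrete illustration of (3.1), here is a standard textbook example, not part of the original slides: for a classical monatomic ideal gas of N particles of mass m in a volume V, the accessible phase-space volume factorizes into a coordinate part and a momentum part.

```latex
% Standard ideal-gas illustration of (3.1); not from the slides.
% The coordinate integral contributes V^N, and the momentum-space
% shell at energy E is essentially the surface of a 3N-dimensional
% sphere of radius sqrt(2mE), so up to constants:
\Gamma \;\propto\; V^{N}\,(2\pi m E)^{3N/2}
\qquad\Longrightarrow\qquad
\sigma = \ln\Gamma \;\approx\; N\ln V + \tfrac{3N}{2}\ln E + \mathrm{const}.
```

This entropy form is reused below to check the definitions of temperature and pressure.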

Slide 4

Let us show first that changes in the entropy are independent of the system of units used to measure Γ. As Γ is a volume in the phase space of N point particles, it has dimensions

(\text{momentum} \times \text{length})^{3N} = (\text{action})^{3N}.    (3.2)

Let ω denote the unit of action; then Γ/ω^{3N} is dimensionless. If we were to define

\sigma = \ln\frac{\Gamma}{\omega^{3N}} = \ln\Gamma - 3N\ln\omega ,    (3.3)

we see that for changes

\Delta\sigma = \Delta\ln\Gamma ,    (3.4)

Slide 5

independent of the system of units. ω = Planck's constant is a natural unit of action in phase space.

It is obvious that the entropy σ, as defined by (3.1), has a definite value for an ensemble in statistical equilibrium; thus the change in entropy is an exact differential. Once the ensemble is specified in terms of its spread in phase space, the entropy is known.

We see that if Γ is interpreted as a measure of the imprecision of our knowledge of a system, or as a measure of the "randomness" of a system, then the entropy is also to be interpreted as a measure of imprecision or randomness.
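A minimal numeric sketch of (3.3)-(3.4); all values below (N, the two phase-space volumes, the units of action) are made-up assumptions. Changing ω shifts σ by the constant 3N ln ω, so differences Δσ are unit-independent.

```python
import numpy as np

# Minimal numeric sketch of (3.3)-(3.4); all values are made up.
# Changing the unit of action omega shifts sigma by the constant
# 3*N*ln(omega), so *differences* in sigma are unit-independent.

N = 5                               # number of particles (assumed)
Gamma_a, Gamma_b = 1e40, 1e42       # two phase-space volumes (assumed)

for omega in (1.0, 6.626e-34):      # two different units of action
    sigma_a = np.log(Gamma_a) - 3 * N * np.log(omega)
    sigma_b = np.log(Gamma_b) - 3 * N * np.log(omega)
    print(f"omega = {omega:.3e}: delta sigma = {sigma_b - sigma_a:.4f}")
# Both lines print ln(Gamma_b/Gamma_a) = ln(100), about 4.6052.
```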

Slide 6: Entropy is Additive

It can easily be shown that σ is additive. Let us consider a system made up of two parts, one with N1 particles and the other with N2 particles. Then

N = N_1 + N_2 ,    (3.5)

and the phase space of the combined system is the product space of the phase spaces of the individual parts:

\Gamma = \Gamma_1 \Gamma_2 .    (3.6)

The additive property of the entropy follows directly:

\sigma = \ln\Gamma = \ln\Gamma_1 + \ln\Gamma_2 = \sigma_1 + \sigma_2 .    (3.7)

Slide 7: Thermodynamic Contacts Between Two Systems

We have supposed that the condition of statistical equilibrium is given by the most probable condition of a closed system, and therefore we may also say that the entropy σ is a maximum when a closed system is in an equilibrium condition. The value of σ for a system in equilibrium will depend on the energy E of the system, on the number N_i of each molecular species i in the system, and on external variables such as volume, strain, magnetization, etc.

Slide 8

Let us consider the condition for equilibrium in a system made up of two interconnected subsystems, as in Fig. 3.1. Initially a rigid, insulating, non-permeable barrier separates the subsystems from each other.

[Fig. 3.1: two subsystems, 1 and 2, separated by a rigid, insulating, non-permeable barrier.]

Slide 9: Thermal Contact

Thermal contact: the systems can exchange energy. In equilibrium there is no net flow of energy between the systems.

Let us suppose that the barrier is allowed to transmit energy, the other inhibitions remaining in effect. If the conditions of the two subsystems 1 and 2 do not change, we say they are in thermal equilibrium. In thermal equilibrium the entropy σ of the total system must be a maximum with respect to small transfers of energy from one subsystem to the other. Writing, by the additive property of the entropy,

Slide 10

\sigma = \sigma_1 + \sigma_2 ,    (3.8)

we have in equilibrium δσ = δσ1 + δσ2 = 0, that is,

\delta\sigma = \frac{\partial\sigma_1}{\partial E_1}\,\delta E_1 + \frac{\partial\sigma_2}{\partial E_2}\,\delta E_2 = 0 .    (3.9)

We know, however, that

E_1 + E_2 = E = \text{const} ,    (3.10)

so that

\delta E_1 = -\delta E_2 ,    (3.11)

as the total system is thermally closed, the energy in a microcanonical ensemble being constant. Thus (3.9) becomes

\delta\sigma = \left(\frac{\partial\sigma_1}{\partial E_1} - \frac{\partial\sigma_2}{\partial E_2}\right)\delta E_1 = 0 .

Slide 11

As δE1 was an arbitrary variation, we must have

\frac{\partial\sigma_1}{\partial E_1} = \frac{\partial\sigma_2}{\partial E_2}    (3.12)

in thermal equilibrium. If we define a quantity τ by

\frac{1}{\tau} = \frac{\partial\sigma}{\partial E} ,    (3.13)

then in thermal equilibrium

\tau_1 = \tau_2 .    (3.14)

Here τ is known as the temperature; it will be shown later to be related to the absolute temperature T by τ = kT, where k is the Boltzmann constant, 1.380 × 10⁻²³ J/K.
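The maximum-entropy argument behind (3.12)-(3.14) can be checked numerically. The sketch below assumes two ideal-gas-like subsystems with σ_i = (3N_i/2) ln E_i (the entropy form from the illustration after slide 3; the particle numbers and total energy are made up) and verifies that the entropy-maximizing energy split equalizes the temperatures defined by (3.13).

```python
import numpy as np

# Numeric check of (3.12)-(3.14), with made-up numbers: two
# ideal-gas-like subsystems with sigma_i(E_i) = (3*N_i/2)*ln(E_i)
# share a fixed total energy. The entropy-maximizing split should
# equalize the temperatures 1/tau_i = d(sigma_i)/dE_i = 3*N_i/(2*E_i).

N1, N2 = 100, 300            # particle numbers (assumed)
E_total = 1000.0             # total energy, arbitrary units

E1 = np.linspace(1.0, E_total - 1.0, 100_000)   # candidate splits
E2 = E_total - E1
sigma = 1.5 * N1 * np.log(E1) + 1.5 * N2 * np.log(E2)

i = np.argmax(sigma)          # most probable (equilibrium) split
tau1 = 2 * E1[i] / (3 * N1)   # tau = (d sigma/dE)^(-1)
tau2 = 2 * E2[i] / (3 * N2)
print(f"E1 = {E1[i]:.1f}, E2 = {E2[i]:.1f}")    # -> 250, 750
print(f"tau1 = {tau1:.4f}, tau2 = {tau2:.4f}")  # equal at the maximum
```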

Slide 12: Mechanical Contact

Mechanical contact: the systems are separated by a movable barrier; equilibrium is reached in this case by equalization of the pressure on the two sides of the barrier.

We now imagine that the wall is allowed to move, and that it still transmits energy but does not pass particles. The volumes V1, V2 of the two systems can readjust to maximize the entropy. In mechanical equilibrium

\delta\sigma = \frac{\partial\sigma_1}{\partial V_1}\,\delta V_1 + \frac{\partial\sigma_2}{\partial V_2}\,\delta V_2 + \frac{\partial\sigma_1}{\partial E_1}\,\delta E_1 + \frac{\partial\sigma_2}{\partial E_2}\,\delta E_2 = 0 .    (3.15)

Slide 13

After thermal equilibrium has been established, the last two terms on the right add up to zero, so we must have

\frac{\partial\sigma_1}{\partial V_1}\,\delta V_1 + \frac{\partial\sigma_2}{\partial V_2}\,\delta V_2 = 0 .    (3.16)

Now the total volume V = V1 + V2 is constant, so that

\delta V_1 = -\delta V_2 .    (3.17)

We have then

\delta\sigma = \left(\frac{\partial\sigma_1}{\partial V_1} - \frac{\partial\sigma_2}{\partial V_2}\right)\delta V_1 = 0 .    (3.18)

Slide 14

As δV1 was an arbitrary variation, we must have

\frac{\partial\sigma_1}{\partial V_1} = \frac{\partial\sigma_2}{\partial V_2}    (3.19)

in mechanical equilibrium. If we define a quantity Π by

\Pi = \tau\,\frac{\partial\sigma}{\partial V} ,    (3.20)

we see that for a system in thermal equilibrium the condition for mechanical equilibrium is

\Pi_1 = \Pi_2 .    (3.21)

We show now that Π has the essential characteristics of the usual pressure p.
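As a quick consistency check, using the ideal-gas entropy sketched after slide 3 (an illustrative assumption rather than part of the slides), the definition (3.20) reproduces the ideal-gas law, which supports the claim that Π behaves as the usual pressure:

```latex
% With sigma ≈ N ln V + (3N/2) ln E + const (assumed ideal-gas form),
% the definition (3.20) gives
\Pi \;=\; \tau\,\frac{\partial\sigma}{\partial V} \;=\; \frac{N\tau}{V}
\qquad\Longrightarrow\qquad
\Pi V = N\tau = NkT ,
% i.e. the ideal-gas law, with Pi playing the role of the pressure p.
```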

Slide 15: Material-Transferring Contact

The systems can exchange particles. Let us suppose that the wall allows diffusion through it of molecules of the i-th chemical species. We have

N_{i,1} + N_{i,2} = \text{const} .    (3.22)

For equilibrium,

\delta\sigma = \frac{\partial\sigma_1}{\partial N_{i,1}}\,\delta N_{i,1} + \frac{\partial\sigma_2}{\partial N_{i,2}}\,\delta N_{i,2} = 0 ,    (3.23)

or

\frac{\partial\sigma_1}{\partial N_{i,1}} = \frac{\partial\sigma_2}{\partial N_{i,2}} .    (3.24)

Slide 16

We define a quantity μ_i by the relation

\mu_i = -\tau\,\frac{\partial\sigma}{\partial N_i} .    (3.25)

The quantity μ_i is called the chemical potential of the i-th species. For equilibrium at constant temperature,

\mu_{i,1} = \mu_{i,2} .    (3.26)
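One short consequence of (3.25), added here to make the equilibrium condition concrete: for a transfer of particles between the two systems at a common temperature τ, the total entropy increases only if particles flow from the higher chemical potential toward the lower one.

```latex
% With delta N_{i,1} = -delta N_{i,2} and a common temperature tau,
% the definition (3.25) gives for the total entropy change
\delta\sigma
  = \frac{\partial\sigma_1}{\partial N_{i,1}}\,\delta N_{i,1}
  + \frac{\partial\sigma_2}{\partial N_{i,2}}\,\delta N_{i,2}
  = \frac{\mu_{i,2}-\mu_{i,1}}{\tau}\,\delta N_{i,1} \;\geq\; 0 ,
% so sigma grows while particles move from the high-mu system to the
% low-mu system, and equilibrium (3.26) holds when the mu's are equal.
```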

Slide 17: The Canonical Ensemble

The microcanonical ensemble is a general statistical tool, but it is often very difficult to use in practice because of the difficulty of evaluating the volume of phase space or the number of states accessible to the system. The canonical ensemble, invented by Gibbs, avoids some of these difficulties and leads us easily to the familiar Boltzmann factor exp(−ΔE/kT) for the ratio of the populations of two states differing in energy by ΔE. We shall see that the canonical ensemble describes systems in thermal contact with a heat reservoir, whereas the microcanonical ensemble describes systems that are perfectly insulated. In 1901, at the age of 62, Gibbs (1839-1903) published a book called Elementary Principles in Statistical Mechanics (Dover, New York).
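To make the Boltzmann factor concrete, here is a small numeric sketch of the population ratio exp(−ΔE/kT) at several temperatures; the 0.1 eV splitting is an arbitrary assumption, not from the slides.

```python
import numpy as np

# Boltzmann factor exp(-dE/kT) for the ratio of populations of two
# states split by dE; the splitting below is an assumed example value.

k = 1.380649e-23        # Boltzmann constant, J/K
eV = 1.602176634e-19    # joules per electron-volt

dE = 0.1 * eV           # energy splitting of the two states (assumed)
for T in (100.0, 300.0, 1000.0):
    ratio = np.exp(-dE / (k * T))
    print(f"T = {T:6.0f} K:  N_upper/N_lower = {ratio:.3e}")
# At 300 K a 0.1 eV splitting gives a ratio of about 2e-2; raising T
# pushes the ratio toward 1.
```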

Slide 18

We imagine that each system of the ensemble is divided up into a large number of subsystems that are in mutual thermal contact and can exchange energy with each other. We direct our attention (Fig. 3.2) to one subsystem, denoted by s; the rest of the system will be denoted by r and is sometimes referred to as a heat reservoir.

[Fig. 3.2: the subsystem (s) inside the total system (t); the rest (r) acts as the heat reservoir.]

The total system is denoted by t and has the constant energy E_t, as it is a member of a microcanonical ensemble. For each value of the energy we think of an ensemble of systems (and subsystems).

Slide 19

The subsystems will usually, but not necessarily, be themselves of macroscopic dimensions. The subsystem may be a single molecule if, as in a gas, the interactions between molecules are very weak, thereby permitting us to specify accurately the energy of a molecule. In a solid, a single atom would not be a satisfactory subsystem, as the bond energy is shared with its neighbors.

Letting dw_t denote the probability that the total system is in an element of volume dΓ_t of the appropriate phase space, we have for a microcanonical ensemble

Slide 20

dw_t = C\,d\Gamma_t ,    (3.27)

where C is a constant. Then we can write

dw_t = C\,d\Gamma_s\,d\Gamma_r .    (3.28)

We ask now for the probability dw_s that the subsystem is in dΓ_s, without specifying the condition of the reservoir, but still requiring that the total system be in ΔE_t at E_t. Then

dw_s = C\,\Gamma_r\,d\Gamma_s .    (3.29)

Slide 21

The entropy of the reservoir is

\sigma_r = \ln\Gamma_r ,    (3.30)

where Γ_r is the volume of phase space of the reservoir which corresponds to the energy of the total system being in ΔE_t at E_t. Our task is to evaluate Γ_r; that is, if we know that the subsystem is in dΓ_s, how much phase space is accessible to the heat reservoir? Inverting (3.30),

\Gamma_r = e^{\sigma_r(E_r)} .    (3.31)

Note that

E_r = E_t - E_s .    (3.32)

Slide 22

We expand

\sigma_r(E_t - E_s) \cong \sigma_r(E_t) - E_s\,\frac{\partial\sigma_r(E_t)}{\partial E_t} ,    (3.33)

where we may take E_s ≪ E_t because the subsystem is assumed to be small in comparison with the total system. As E_t is necessarily close to E_r, we can write, using (3.13),

\frac{\partial\sigma_r(E_t)}{\partial E_t} \cong \frac{1}{\tau} .    (3.34)

Here τ is the temperature characterizing every part of the system, as thermal contact is assumed. Thus

\sigma_r(E_t - E_s) \cong \sigma_r(E_t) - \frac{E_s}{\tau} .    (3.35)

Finally, from (3.29), (3.34) and (3.35),

Slide 23

dw_s = C\,e^{\sigma_r(E_t)}\,e^{-E_s/\tau}\,d\Gamma_s = A\,e^{-E_s/\tau}\,d\Gamma_s ,    (3.36)

where

A = C\,e^{\sigma_r(E_t)}    (3.37)

may be viewed as a quantity which takes care of the normalization:

\int A\,e^{-E_s/\tau}\,d\Gamma_s = 1 .    (3.38)

Thus for the subsystem the probability density (distribution function) is given by the canonical ensemble

\rho_s = A\,e^{-E_s/\tau} .    (3.39)
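A discrete analogue of (3.36)-(3.39) may help: with a handful of made-up energy levels, the normalization condition (3.38) fixes A as the reciprocal of the partition function.

```python
import numpy as np

# Discrete analogue of (3.36)-(3.39), with made-up energy levels:
# the normalization sum(A * exp(-E/tau)) = 1 fixes A = 1/Z, where Z
# is the partition function.

tau = 1.0                                   # temperature (assumed units)
E_levels = np.array([0.0, 0.5, 1.3, 2.0])   # illustrative level energies

boltzmann = np.exp(-E_levels / tau)
Z = boltzmann.sum()            # partition function
A = 1.0 / Z                    # normalization constant of (3.38)
probs = A * boltzmann          # canonical probabilities, cf. (3.39)

print("Z =", Z)
print("probabilities:", probs, "sum =", probs.sum())  # sums to 1
```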

Slide 24

We note that ln ρ is additive for two subsystems in thermal contact:

\ln\rho_1 = \ln A_1 - E_1/\tau , \qquad
\ln\rho_2 = \ln A_2 - E_2/\tau , \qquad
\ln\rho_1\rho_2 = \ln A_1 A_2 - (E_1 + E_2)/\tau ,

so that, with ρ = ρ1ρ2, A = A1A2, E = E1 + E2, we have

\ln\rho = \ln A - E/\tau ,    (3.40)

where here and henceforth the subscript s is dropped. We emphasize that E is the energy of the entire subsystem.

Slide 25

We note that for a subsystem consisting of a large number of particles, the subsystem energy in a canonical ensemble is very well defined. This is because the density of energy levels, or the volume in phase space, is a strongly varying function of energy, as is also the distribution function (3.39).

The average value of any physical quantity f(p, q) over the canonical distribution is given by

\langle f \rangle = \int f(p,q)\,\rho(p,q)\,d\Gamma = A \int f(p,q)\,e^{-E(p,q)/\tau}\,d\Gamma .

The additive property (3.40) is central to the use of the canonical ensemble for combined systems.
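As a closing sketch, the canonical average can be evaluated numerically. The example below takes a one-dimensional harmonic oscillator with E(p, q) = p²/2 + q²/2 (mass and frequency set to 1, an illustrative assumption) and recovers the equipartition value ⟨E⟩ = τ.

```python
import numpy as np

# Numerical canonical average <E> for a 1D harmonic oscillator with
# E(p, q) = p**2/2 + q**2/2 (mass and frequency set to 1; assumed).
# Equipartition predicts <E> = tau (two quadratic terms, tau/2 each).

tau = 1.5
p = np.linspace(-20.0, 20.0, 2001)
q = np.linspace(-20.0, 20.0, 2001)
dp, dq = p[1] - p[0], q[1] - q[0]

P, Q = np.meshgrid(p, q)
E = 0.5 * P**2 + 0.5 * Q**2          # energy over the (p, q) grid
w = np.exp(-E / tau)                 # unnormalized canonical weight

Z = w.sum() * dp * dq                # normalization integral, A = 1/Z
E_avg = (E * w).sum() * dp * dq / Z  # <E> = A * integral of E*exp(-E/tau)

print(f"<E> = {E_avg:.4f}   (equipartition predicts tau = {tau})")
```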

