Lecture 3. Entropy in statistical mechanics. Thermodynamic contacts: i. mechanical contact, ii. heat contact, iii. diffusion contact. Equilibrium. Chemical potential.


1 Lecture 3. Entropy in statistical mechanics. Thermodynamic contacts: i. mechanical contact, ii. heat contact, iii. diffusion contact. Equilibrium. Chemical potential. The main distributions in statistical mechanics. A system in the canonical ensemble. Thermostat. "Ice melting" is the classic example of entropy increase, described in 1862 by Rudolf Clausius as an increase in the disgregation of the molecules of the body of ice.

2 Entropy in Statistical Mechanics. From the principles of thermodynamics we learn that the thermodynamic entropy S has the following important properties:

- dS is an exact differential and is equal to DQ/T for a reversible process, where DQ is the quantity of heat added to the system.
- Entropy is additive: S = S1 + S2. The entropy of a combined system is the sum of the entropies of its two separate parts.
- ΔS ≥ 0. If the state of a closed system is given macroscopically at any instant, the most probable state at any other instant is one of equal or greater entropy.

3 State Function. This statement means that entropy is a state function: the value of the entropy does not depend on the past history of the system but only on the actual state of the system. One of the great accomplishments of statistical mechanics is to give us a physical picture of entropy. The entropy σ of a system (in classical statistical physics) in statistical equilibrium can be defined as

σ = ln Γ,   (3.1)

where Γ is the volume of phase space accessible to the system, i.e., the volume corresponding to the energies between E − Δ and E + Δ.

4 Let us show first that changes in the entropy are independent of the system of units used to measure Γ. As Γ is a volume in the phase space of N point particles, it has the dimensions

(Momentum × Length)^(3N) = (Action)^(3N).   (3.2)

Let D denote the unit of action; then Γ/D^(3N) is dimensionless. If we were to define

σ' = ln(Γ/D^(3N)) = ln Γ − 3N ln D,   (3.3)

we see that for changes

Δσ' = Δσ,   (3.4)

5 independent of the system of units. D = h (Planck's constant) is the natural unit of action in phase space. It is obvious that the entropy σ, as defined by (3.1), has a definite value for an ensemble in statistical equilibrium; thus the change in entropy is an exact differential. Once the ensemble is specified in terms of its spread in phase space, the entropy is known. We see that if Γ is interpreted as a measure of the imprecision of our knowledge of a system, or as a measure of the "randomness" of a system, then the entropy is likewise to be interpreted as a measure of imprecision or randomness.
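The unit-independence claimed in (3.4) is easy to check numerically. A minimal sketch, where the phase-space volumes are made-up illustrative numbers rather than values computed for any real system:

```python
import math

N = 10                  # number of point particles
D = 6.626e-34           # unit of action (here Planck's constant)

# Arbitrary illustrative phase-space volumes, in units of (action)^(3N)
gamma_a = 2.5e10
gamma_b = 1.0e11

def sigma(g):           # eq. (3.1), dimensional Gamma
    return math.log(g)

def sigma_prime(g):     # eq. (3.3), Gamma measured in units of D^(3N)
    return math.log(g) - 3 * N * math.log(D)

# Changes agree, eq. (3.4): the 3N*ln(D) offset cancels in differences
d_sigma = sigma(gamma_b) - sigma(gamma_a)
d_sigma_prime = sigma_prime(gamma_b) - sigma_prime(gamma_a)
print(abs(d_sigma - d_sigma_prime) < 1e-9)   # True; both equal ln 4
```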

6 Entropy is additive. It can easily be shown that σ is additive. Let us consider a system made up of two parts, one with N1 particles and the other with N2 particles. Then

σ1 = ln Γ1,   σ2 = ln Γ2,   (3.5)

and the phase space of the combined system is the product space of the phase spaces of the individual parts:

Γ = Γ1 Γ2.   (3.6)

The additive property of the entropy follows directly:

σ = ln Γ = ln Γ1 + ln Γ2 = σ1 + σ2.   (3.7)

7 Thermodynamic contacts between two systems. We have supposed that the condition of statistical equilibrium is given by the most probable condition of a closed system, and therefore we may also say that the entropy σ is a maximum when a closed system is in its equilibrium condition. The value of σ for a system in equilibrium will depend on the energy E of the system; on the number N_i of each molecular species i in the system; and on external variables, such as volume, strain, magnetization, etc.

8 Let us consider the condition for equilibrium in a system made up of two interconnected subsystems, as in Fig. 3.1. Initially a rigid, insulating, non-permeable barrier separates the subsystems from each other.

9 Thermal contact. Thermal contact: the systems can exchange energy; in thermal equilibrium there is no net flow of energy between the systems. Let us suppose that the barrier is allowed to transmit energy, the other inhibitions remaining in effect. If the conditions of the two subsystems 1 and 2 do not change, we say they are in thermal equilibrium. In thermal equilibrium the entropy σ of the total system must be a maximum with respect to small transfers of energy from one subsystem to the other. Writing, by the additive property of the entropy,

10

σ = σ1 + σ2,   (3.8)

we have in equilibrium

δσ = δσ1 + δσ2 = (∂σ1/∂E1) δE1 + (∂σ2/∂E2) δE2 = 0.   (3.9)

We know, however, that

δE1 + δE2 = 0,   (3.11)

as the total system is thermally closed, the energy in a microcanonical ensemble being constant. Thus (3.9) becomes

(∂σ1/∂E1 − ∂σ2/∂E2) δE1 = 0.

11 As δE1 was an arbitrary variation, we must have

∂σ1/∂E1 = ∂σ2/∂E2   (3.12)

in thermal equilibrium. If we define a quantity τ by

1/τ = ∂σ/∂E,   (3.13)

then in thermal equilibrium

τ1 = τ2.   (3.14)

Here τ is known as the temperature and will later be shown to be related to the absolute temperature T by τ = kT, where k is the Boltzmann constant, k ≈ 1.38×10⁻²³ J/K.
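The definition (3.13) can be given a numerical illustration. For an ideal gas the accessible phase-space volume grows as Γ ∝ E^(3N/2) (a standard result, assumed here rather than derived), so σ(E) = (3N/2) ln E + const, and (3.13) gives 1/τ = 3N/2E, i.e. E = (3/2)Nτ, the equipartition value. A sketch with illustrative numbers:

```python
import math

N = 1000                    # particles
E = 750.0                   # energy, in units where tau comes out dimensionless

def sigma(E):               # sigma = (3N/2) ln E + const; the constant drops out
    return 1.5 * N * math.log(E)

# Finite-difference estimate of 1/tau = d(sigma)/dE, eq. (3.13)
h = 1e-6
inv_tau = (sigma(E + h) - sigma(E - h)) / (2 * h)
tau = 1.0 / inv_tau

print(tau)                  # ~0.5, consistent with E = (3/2) N tau
```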

12 Mechanical contact. Mechanical contact: the systems are separated by a mobile barrier; equilibrium is reached in this case by equalization of the pressure on both sides of the barrier. We now imagine that the wall is allowed to move and also passes energy, but does not pass particles. The volumes V1, V2 of the two systems can readjust to maximize the entropy. In mechanical equilibrium

δσ = (∂σ1/∂V1) δV1 + (∂σ2/∂V2) δV2 + (∂σ1/∂E1) δE1 + (∂σ2/∂E2) δE2 = 0.   (3.15)

13 After thermal equilibrium has been established, the last two terms on the right add up to zero, so we must have

(∂σ1/∂V1) δV1 + (∂σ2/∂V2) δV2 = 0.   (3.16)

Now the total volume V = V1 + V2 is constant, so that

δV1 + δV2 = 0.   (3.17)

We have then

(∂σ1/∂V1 − ∂σ2/∂V2) δV1 = 0.   (3.18)

14 As δV1 was an arbitrary variation, we must have

∂σ1/∂V1 = ∂σ2/∂V2   (3.19)

in mechanical equilibrium. If we define a quantity π by

π = τ (∂σ/∂V),   (3.20)

we see that for a system in thermal equilibrium the condition for mechanical equilibrium is

π1 = π2.   (3.21)

We show now that π has the essential characteristics of the usual pressure p.
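Definition (3.20) reproduces the ideal gas law in the same spirit as the temperature example. For an ideal gas Γ ∝ V^N (again a standard result, assumed here), so σ = N ln V + const, and (3.20) gives π = τN/V, i.e. πV = Nτ. A sketch with illustrative numbers:

```python
import math

N = 1000
V = 2.0
tau = 0.5

def sigma(V):               # sigma = N ln V + const for an ideal gas
    return N * math.log(V)

# pi = tau * d(sigma)/dV, eq. (3.20), by central finite difference
h = 1e-8
pi = tau * (sigma(V + h) - sigma(V - h)) / (2 * h)

print(pi)                   # ~250.0 = N * tau / V, the ideal gas law
```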

15 Material-transferring contact. The systems can exchange particles. Let us suppose that the wall allows diffusion through it of molecules of the i-th chemical species. We have

δN_i1 + δN_i2 = 0.   (3.22)

For equilibrium

(∂σ1/∂N_i1) δN_i1 + (∂σ2/∂N_i2) δN_i2 = 0,   (3.23)

or

∂σ1/∂N_i1 = ∂σ2/∂N_i2.   (3.24)

16 We define a quantity μ_i by the relation

μ_i = −τ (∂σ/∂N_i).   (3.25)

The quantity μ_i is called the chemical potential of the i-th species. For equilibrium at constant temperature

μ_i1 = μ_i2.   (3.26)
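Condition (3.26) can be seen in a toy computation: two ideal-gas volumes at the same temperature exchange particles, and the split of particles that maximizes the total entropy is exactly the one that equalizes the densities, and hence the chemical potentials. The entropy form below, σ(N, V) = N ln(V/N) + N, is a schematic fixed-temperature ideal-gas expression assumed purely for illustration:

```python
import math

V1, V2 = 1.0, 3.0
N_total = 4000

def sigma(N, V):            # schematic entropy; mu/tau = -d(sigma)/dN = ln(N/V)
    return N * math.log(V / N) + N

# Scan all integer splits and keep the one maximizing the total entropy
best_N1 = max(range(1, N_total),
              key=lambda n1: sigma(n1, V1) + sigma(N_total - n1, V2))

print(best_N1)              # 1000: densities match, N1/V1 == (N_total - N1)/V2
```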

17 The Canonical Ensemble. The microcanonical ensemble is a general statistical tool, but it is often very difficult to use in practice because of the difficulty of evaluating the volume of phase space or the number of states accessible to the system. The canonical ensemble, invented by Gibbs, avoids some of the difficulties and leads us easily to the familiar Boltzmann factor exp(−ΔE/kT) for the ratio of the populations of two states differing by ΔE in energy. We shall see that the canonical ensemble describes systems in thermal contact with a heat reservoir; the microcanonical ensemble describes systems that are perfectly insulated. In 1901, at the age of 62, Gibbs (1839-1903) published a book called Elementary Principles in Statistical Mechanics (Dover, New York).
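To give the Boltzmann factor a concrete number: for two states 0.1 eV apart at room temperature, using the round illustrative value kT ≈ 0.0259 eV:

```python
import math

kT = 0.0259                 # eV, roughly 300 K
dE = 0.1                    # eV, energy gap between the two states

ratio = math.exp(-dE / kT)  # population of upper state / population of lower state
print(ratio)                # ~0.02: the upper state is ~50 times less populated
```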

18 We imagine that each system of the ensemble is divided up into a large number of subsystems that are in mutual thermal contact and can exchange energy with each other. We direct our attention (Fig. 3.2) to one subsystem, denoted by s; the rest of the system will be denoted by r and is sometimes referred to as a heat reservoir. [Fig. 3.2: the subsystem (s) inside the total system (t); the rest (r) is the heat reservoir.] The total system is denoted by t and has the constant energy E_t, as it is a member of a microcanonical ensemble. For each value of the energy we think of an ensemble of systems (and subsystems).

19 The subsystems will usually, but not necessarily, themselves be of macroscopic dimensions. The subsystem may be a single molecule if, as in a gas, the interactions between molecules are very weak, thereby permitting us to specify accurately the energy of a molecule. In a solid, a single atom will not be a satisfactory subsystem, as the bond energy is shared with its neighbors. Letting dw_t denote the probability that the total system is in an element of volume dΓ_t of the appropriate phase space, we have for a microcanonical ensemble

20

dw_t = C dΓ_t,   (3.27)

where C is a constant. Then we can write

dw_t = C dΓ_s dΓ_r.   (3.28)

We ask now for the probability dw_s that the subsystem is in dΓ_s, without specifying the condition of the reservoir, but still requiring that the total system be in ΔE_t at E_t. Then

dw_s = C Γ_r dΓ_s.   (3.29)

21 The entropy of the reservoir is

σ_r = ln Γ_r,   (3.30)

where Γ_r is the volume of phase space of the reservoir which corresponds to the energy of the total system being in ΔE_t at E_t. Our task is to evaluate Γ_r; that is, if we know that the subsystem is in dΓ_s, how much phase space is accessible to the heat reservoir? From (3.30),

Γ_r = e^(σ_r).   (3.31)

Note that

E_r = E_t − E_s.   (3.32)

22 We expand

σ_r(E_t − E_s) ≈ σ_r(E_t) − E_s (∂σ_r/∂E_t),   (3.33)

where we may take E_s << E_t because the subsystem is assumed to be small in comparison with the total system. As E_t is necessarily close to E_r, we can write, using (3.13),

∂σ_r/∂E_t ≈ ∂σ_r/∂E_r = 1/τ.   (3.34)

Thus

σ_r(E_t − E_s) ≈ σ_r(E_t) − E_s/τ.   (3.35)

Here τ is the temperature characterizing every part of the system, as thermal contact is assumed. Finally, from (3.29), (3.34) and (3.35),

23

dw_s = C e^(σ_r(E_t)) e^(−E_s/τ) dΓ_s = A e^(−E_s/τ) dΓ_s,   (3.36)

where

A = C e^(σ_r(E_t))   (3.37)

may be viewed as a quantity which takes care of the normalization:

A ∫ e^(−E_s/τ) dΓ_s = 1.   (3.38)

Thus for the subsystem the probability density (distribution function) is given by the canonical ensemble:

ρ(p, q) = A e^(−E(p,q)/τ).   (3.39)
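For a system with discrete energy levels the integral in (3.38) becomes a sum, and the normalization A becomes 1/Z, where Z is the partition function. A sketch with made-up level energies:

```python
import math

tau = 1.0
energies = [0.0, 1.0, 2.0]       # illustrative three-level system

Z = sum(math.exp(-E / tau) for E in energies)       # discrete analogue of (3.38)
probs = [math.exp(-E / tau) / Z for E in energies]  # canonical distribution (3.39)

print(sum(probs))                # ~1.0: A = 1/Z "takes care of the normalization"
```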

24 We note that ln ρ is additive for two subsystems in thermal contact:

ln ρ1 = ln A1 − E1/τ,   ln ρ2 = ln A2 − E2/τ,
ln ρ1ρ2 = ln A1A2 − (E1 + E2)/τ,

so that, with ρ = ρ1ρ2, A = A1A2 and E = E1 + E2, we have

ln ρ = ln A − E/τ   (3.40)

for the combined system. This additive property is central to the use of the canonical ensemble. Here and henceforth the subscript s is dropped; we emphasize that E is the energy of the entire subsystem.

25 We have to note that for a subsystem consisting of a large number of particles, the subsystem energy in a canonical ensemble is very well defined. This is because the density of energy levels, or the volume in phase space, is a strongly varying function of energy, as is also the distribution function (3.39). The average value of any physical quantity f(p, q) over the canonical distribution is given by

⟨f⟩ = ∫ f(p, q) ρ(p, q) dΓ = A ∫ f(p, q) e^(−E(p,q)/τ) dΓ.
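The same discrete analogue applies to averages: ⟨f⟩ = Σ_n f(n) e^(−E_n/τ) / Z. A sketch for f = E in a two-level system (illustrative numbers), checked against the closed form ⟨E⟩ = ΔE / (e^(ΔE/τ) + 1) for a two-level system:

```python
import math

tau = 0.5
levels = [0.0, 1.0]              # two-level system with gap dE = 1.0

Z = sum(math.exp(-E / tau) for E in levels)
avg_E = sum(E * math.exp(-E / tau) for E in levels) / Z

closed_form = 1.0 / (math.exp(1.0 / tau) + 1.0)
print(abs(avg_E - closed_form) < 1e-12)   # True
```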