Statistical Thermodynamics


Chapter 12: Statistical Thermodynamics {Can delete discussion of Maxwell's Demon.}

Introduction to statistical mechanics

Statistical mechanics was developed alongside macroscopic thermodynamics. Macroscopic thermodynamics has great generality, but it does not explain, in any fundamental way, why certain processes occur. As our understanding of the molecular nature of matter developed, this knowledge was used to obtain a deeper understanding of thermal processes. Some applications:
1) ideal gas: very successful
2) real gases: more difficult, but some success
3) liquids: very difficult, not much success
4) crystalline solids: highly organized, so they can be treated successfully
5) electron gas: electrical properties of solids
6) photon gas: radiation
7) plasmas: very important
Since the results of kinetic theory can be obtained from statistical mechanics, we will not discuss kinetic theory.

Statistical mechanics adds something very useful to thermodynamics, but does not replace it. Can we use our knowledge of the microscopic nature of a gas to, say, violate the 2nd law? Maxwell investigated this possibility and invented an intelligent being, now called a Maxwell demon, who does just that. As an example, imagine a container with adiabatic walls, divided by a partition at the center which has a small trapdoor. The demon momentarily opens the trapdoor when a fast molecule approaches it from the right, and also opens it when a slow molecule approaches from the left. As a result the gas on the left becomes hotter and the gas on the right becomes cooler. One can then consider operating a heat engine between the two sides to produce work, violating the second law.

The demon, clever lady, can keep the energy content of the cold reservoir constant (for a time). The net result is that energy is removed from a single reservoir (H) and is used to do work. This violates the 2nd law. [Figure: hot reservoir H and cold reservoir C separated by an adiabatic wall; the demon operates the trapdoor while an engine E produces work.] Of course(?) no demon exists, but could some clever mechanical device be used? The demon must have information about the molecules if she is to operate successfully. Is there a connection between information and entropy? Yes!

The subject of information theory uses the concept of entropy. Let us consider another example: free expansion. The demon removes the partition between the gas and the vacuum, free expansion occurs, and the entropy of the system increases. Because of the random motion of the molecules, there is some probability that, at some instant, they will all be back in the region initially occupied by the gas. For this to occur, however, you would have to wait an extraordinarily long time. The demon could, at this instant, slide in the partition and we would have a decrease in the entropy of the universe. Again the demon must have some information about the location of the molecules. No such demon has been sighted.

Before starting Ch. 12 I should warn you that there are two different types of statistics that have some similarities and similar names. These two types of statistics are easily confused.
(1) Maxwell-Boltzmann statistics: the "classical limit"; applies to dilute gases. The particles are indistinguishable.
(2) Boltzmann statistics: the particles are distinguishable.

Some jargon:
assembly (or system): N identical submicroscopic entities, such as molecules.
macrostate (or configuration): the number of particles in each of the energy levels.
microstate: the number of particles in each energy state.
thermodynamic probability w: the number of different microstates leading to a given macrostate.
Basic postulate: All possible microstates are equally probable.

RECALL: In statistics, independent probabilities are multiplicative. As an example, consider a true die. The probability of throwing a one is 1/6. Now if there are two dice, the probability of a one coming up on both dice is (1/6)(1/6) = 1/36.

Elementary Statistics

We begin by considering 3 distinguishable coins: a nickel N, a dime D and a quarter Q. The possible macrostates, labeled by the pattern of heads, are HHH, HHT, HTT and TTT. Let us consider the microstates for the macrostate HHT. Choosing which two coins show heads gives six ordered selections (ND, DN, NQ, QN, DQ, QD). However these pairs are not all different microstates, because the order of selection does not matter. Hence we have 3 microstates (heads on ND, NQ or DQ).
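A quick way to check this counting is to enumerate the microstates directly. Here is a minimal sketch in Python (the course materials use Maple, but the idea is the same); the grouping of microstates by number of heads follows the slide.

```python
from collections import Counter
from itertools import product

# All microstates of three distinguishable coins (N, D, Q): each is H or T.
microstates = ["".join(s) for s in product("HT", repeat=3)]
print(len(microstates))            # 8 microstates in total

# A macrostate is fixed by the number of heads; count microstates in each.
w = Counter(s.count("H") for s in microstates)
print(w)                           # Counter({2: 3, 1: 3, 3: 1, 0: 1})
print(w[2])                        # 3 microstates for the HHT macrostate
```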

More generally, suppose that we have N distinct coins and we wish to select N1 heads (a particular macrostate). There are N choices for the first head, (N - 1) choices for the second head, and so on, down to (N - N1 + 1) choices for the N1th head. The thermodynamic probability w is the number of microstates for a given macrostate. We are then tempted to write
w = N(N - 1)(N - 2)···(N - N1 + 1) = N!/(N - N1)!
However, permuting the N1 heads among themselves results in the same microstate, so
w = N!/[N1!(N - N1)!]
For the above simple example with 3 coins and 2 heads, w = 3!/(2!1!) = 3. This is the number of microstates for the HHT macrostate.

Suppose we plot w as a function of N1 for a given N, where N1 is the number of heads. We plot a number of cases (Thermocoin.mws).

Notice that the peak occurs at N1 = N/2. For large N we can use Stirling's formula, ln N! ≈ N ln N - N. For N = 1000,
w_max = 1000!/(500! 500!) ≈ 2.7 × 10^299
This is the number of distinct microstates for the most probable macrostate (N1 = 500). Note that it is a very large number!
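As a check, one can compare Stirling's formula with the exact value of ln N! and compute w_max directly. A small sketch in Python (math.lgamma gives ln N! without overflow):

```python
import math

N = 1000
# Exact ln(N!) via the log-gamma function, and Stirling's approximation.
exact = math.lgamma(N + 1)               # ln(1000!) ~ 5912.13
stirling = N * math.log(N) - N           # N ln N - N ~ 5907.76
print(exact, stirling)

# Microstates of the most probable macrostate (N1 = 500 heads):
w_max = math.comb(N, N // 2)
print(f"{w_max:.3e}")                    # about 2.703e+299
```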

For large N the plot of w versus N1 is very sharp (see next slide). The total thermodynamic probability Ω is obtained by summing over all macrostates. Let k indicate a particular macrostate:
Ω = Σk wk
Since, for large N, the peak is very sharp, the sum is dominated by its largest term: ln Ω ≈ ln w_max.

Now we consider N distinguishable particles placed in n boxes, with N1 in the first box, N2 in the second box, etc. We wish to calculate w(N1, N2, N3, …) for a particular macrostate. Before doing a general calculation, we consider the case of 4 particles (A, B, C, D) with 3 boxes and N1 = 2, N2 = 1, N3 = 1. We begin by indicating the possibilities for the first box: AB, AC, AD, BC, BD, CD. Since the order within a box is irrelevant, there are 6 possibilities. Now suppose A and B were selected for the first box. This leaves C and D when we consider filling the second box, so we obviously have only two possibilities, C or D. Suppose that C was selected. That leaves only one possibility (D) for the third box. The total number of possibilities for this macrostate is (6)(2)(1) = 12.

Now we consider the general problem (macrostate (N1, N2, N3, …)). Consider placing N1 of the N distinguishable particles in the first box: there are N!/[N1!(N - N1)!] ways. For the second box there are (N - N1)!/[N2!(N - N1 - N2)!] ways, and so on. Multiplying these factors, the intermediate factorials cancel, and the thermodynamic probability for this macrostate is:
w = N!/(N1! N2! ··· Nn!)
We have been considering distinguishable particles, such as atoms rigidly set in the lattice of a solid. For a gas, the statistics will be different.
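This formula is easy to turn into a small helper and check against the 4-particle example on the previous slide. A sketch in Python (the function name is ours):

```python
import math

def thermodynamic_probability(occupations):
    """w = N! / (N1! N2! ... Nn!) for distinguishable particles in boxes."""
    N = sum(occupations)
    w = math.factorial(N)
    for Nj in occupations:
        w //= math.factorial(Nj)
    return w

# The 4-particle example: N1 = 2, N2 = 1, N3 = 1 gives (6)(2)(1) = 12.
print(thermodynamic_probability([2, 1, 1]))   # 12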

Example (Problem 12.6)

We will do an example illustrating the use of the formula on the previous slide. We have 4 distinguishable particles (A, B, C, D). We wish to place them in 4 energy levels ("boxes") subject to the constraint that the total energy is fixed. A macrostate is labeled by k, and wk is the thermodynamic probability for the kth macrostate. [The slide table listing the macrostates k with their wk values did not survive transcription.] The most probable state, k = 2, with w = 24, is the most random distribution. {Students should explicitly display one of the macrostates.}
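The level energies and the total energy for this problem were lost in transcription. Purely as an illustration, the sketch below assumes levels 0, ε, 2ε, 3ε and a total energy of 6ε (measured in units of ε); this assumption is chosen because it reproduces a most probable macrostate with w = 24, but the actual problem values may differ.

```python
from collections import Counter
from itertools import product

EPS_LEVELS = (0, 1, 2, 3)   # assumed level energies, in units of epsilon
U_TOTAL = 6                 # assumed total energy, in units of epsilon
N = 4                       # four distinguishable particles A, B, C, D

# Assign each particle a level; keep assignments with the right total energy.
macrostates = Counter()
for assignment in product(EPS_LEVELS, repeat=N):
    if sum(assignment) == U_TOTAL:
        occupations = tuple(assignment.count(e) for e in EPS_LEVELS)
        macrostates[occupations] += 1   # each assignment is one microstate

for k, (occ, w) in enumerate(macrostates.most_common(), start=1):
    print(k, occ, w)
# Under these assumptions the most probable macrostate, w = 24 = 4!,
# puts one particle in each level.
```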

Now consider an isolated system of volume V containing N distinguishable particles. The internal energy U is then fixed, and the macrostate will be characterized by (N, V, U). There are n energy levels (like boxes) available, and we wish to know the set of occupation numbers {Nj} at equilibrium. There are the following restrictions:
Conservation of particles: Σj Nj = N
Conservation of energy: Σj Nj εj = U
The central problem is then to determine the most probable distribution. Since the system is isolated, the total entropy must be a maximum with respect to all possible variations within the ensemble. The actual distribution of particles amongst the energy levels will be the one that maximizes the entropy of the system. Can we make a connection between the entropy and some specification of the macrostate?

A study of simple systems suggests that there is a connection between entropy and disorder. For example, if one considers the free expansion of a gas, the entropy of the gas increases and so does the disorder: we know less about the distribution of the molecules after the expansion. The thermodynamic probability is also a measure of disorder: the larger the value of w, the greater the disorder. A simple example is as follows. Suppose we distribute 5 distinguishable particles among 4 boxes. We can use the equation developed above to determine w for each macrostate:
(5,0,0,0): w = 1
(4,1,0,0): w = 5
(3,2,0,0): w = 10
(3,1,1,0): w = 20
(2,2,1,0): w = 30
(2,1,1,1): w = 60

The most ordered state, with all the particles in a single box, has the lowest w. The most disordered state, with the particles distributed amongst all the boxes, has the largest thermodynamic probability. As a system approaches equilibrium, not only does the entropy approach a maximum, but the thermodynamic probability also approaches a maximum. Is there a relationship between entropy and thermodynamic probability? If so, we would expect S to be a monotonically increasing function of w: as the thermodynamic probability increases, so does S.

Thermodynamic Probability and Entropy

Ludwig Boltzmann made many important contributions to thermodynamics. His most important contribution to physics is the relationship between w and the classical concept of entropy. His argument was as follows. Consider an isolated assembly which undergoes a spontaneous, irreversible process. At equilibrium, S has its maximum value consistent with U and V. But w also increases and approaches a maximum when equilibrium is achieved. Boltzmann therefore assumed that there must be some connection between w and S, and wrote S = f(w), where S and w are state variables. To be physically meaningful, f(w) must be a single-valued, monotonically increasing function. Now consider two systems, A and B, in thermal contact. (Such a system of two or more assemblies is called a canonical ensemble.) Entropy is an extensive property, so S for the composite system is the sum of the individual entropies:
S = S_A + S_B, hence f(w) = f(w_A) + f(w_B) …… (1)

On the other hand, independent probabilities are multiplicative, so
w = w_A w_B …… (2)
From (1) and (2) we obtain f(w_A w_B) = f(w_A) + f(w_B). The only single-valued, monotonically increasing function for which this relationship is true is the logarithm. Hence Boltzmann wrote
S = k ln w
The constant k has the units of entropy and is, in fact, the Boltzmann constant that we have previously introduced. This celebrated equation provides the connecting link between statistical and classical thermodynamics. (One can begin with statistical mechanics and define S by the above equation.)
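For completeness, here is a short sketch (not on the original slide) of why the logarithm is forced, assuming f is differentiable:

```latex
\begin{align*}
  f(w_A w_B) &= f(w_A) + f(w_B) \\
  \text{(take } \partial/\partial w_A\text{):}\quad
    w_B\, f'(w_A w_B) &= f'(w_A) \\
  \text{(then } \partial/\partial w_B\text{):}\quad
    f'(x) + x\, f''(x) &= 0, \qquad x \equiv w_A w_B \\
  \Longrightarrow\quad x\, f'(x) &= \text{const} \equiv k
    \quad\Longrightarrow\quad f(x) = k \ln x + C,
\end{align*}
% with C = 0, since a system with a single microstate (w = 1) should have S = 0.
```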

Quantum States and Energy Levels

We consider a closed system containing a monatomic ideal gas of N particles in some macroscopic volume V. According to quantum mechanics, only certain discrete energy levels are permitted for the particles. For a particle in a cubical box of volume V, these allowed energies are given by
ε = (h²/8mV^(2/3))(nx² + ny² + nz²)
where the nj are integers commencing with 1, the symbol h represents Planck's constant (a fundamental constant), and m is the mass of a molecule. The integers nx, ny, nz are called quantum numbers.

At ordinary temperatures the ε's of the particles are such that the n-values are extremely large (10^9 is a typical value). When n changes by 1, the change in ε is so small that ε may be treated as a continuous variable. This will later permit us to replace sums by integrals.

Example: A Hg atom moves in a cubical box whose edges are 1 m long. Its kinetic energy is equal to the average kinetic energy of an atom of an ideal gas at 1000 K, ε = (3/2)kT. If the quantum numbers in the three directions are all equal to n, so that ε = 3n²h²/(8mL²), calculate n. For a Hg atom (m ≈ 3.33 × 10⁻²⁵ kg), the result is n ≈ 2 × 10^11.
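The arithmetic of this example can be checked in a few lines. A sketch in Python, using standard constants rounded to four figures:

```python
import math

# Quantum number n for a Hg atom in a 1 m cubical box whose kinetic energy
# equals the ideal-gas average at 1000 K:
#   eps = (3/2) k T,   eps = 3 n^2 h^2 / (8 m L^2)
#   =>  n = sqrt(8 m L^2 eps / (3 h^2))
k_B = 1.381e-23            # J/K
h = 6.626e-34              # J s
m = 200.6 * 1.661e-27      # kg, mass of a Hg atom
L = 1.0                    # m
T = 1000.0                 # K

eps = 1.5 * k_B * T
n = math.sqrt(8 * m * L**2 * eps / (3 * h**2))
print(f"n = {n:.2e}")      # about 2.05e+11: eps is effectively continuous
```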

Each different ε represents a quantum level. Each specification of (nx, ny, nz) represents a quantum state. The energy levels are degenerate, in that a number of different states can have the same energy. The degree of degeneracy of level i will be specified by gi. There is only one way to form the lowest level (nx = ny = nz = 1), so g1 = 1; that is, the ground state is not degenerate. The next level occurs when one of the n's assumes the value 2, which can happen in three ways, so g2 = 3, and so forth. As one goes to higher energy levels, gi increases very rapidly.
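The degeneracies gi can be verified by brute force, counting how many triples (nx, ny, nz) share each value of nx² + ny² + nz². A quick sketch:

```python
from collections import Counter
from itertools import product

# Degeneracy g_i of each level: how many (nx, ny, nz) triples share the same
# value of nx^2 + ny^2 + nz^2 (energy in units of h^2 / (8 m V^(2/3))).
nmax = 10
levels = Counter(nx*nx + ny*ny + nz*nz
                 for nx, ny, nz in product(range(1, nmax + 1), repeat=3))

for energy in sorted(levels)[:5]:
    print(energy, levels[energy])
# 3  1   <- ground state (1,1,1): g1 = 1
# 6  3   <- one of the n's equal to 2: g2 = 3
# 9  3
# 11 3
# 12 1
```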

In the terminology of statistical mechanics, a number N of identical particles is called an assembly or a system. Let us now consider an assembly of N indistinguishable particles. A macrostate is a given distribution of particles in the various energy levels. A microstate is a given distribution of particles in the energy states.

Basic postulate of statistical mechanics: All accessible microstates of an isolated system are equally probable.

We are interested in the macrostates. In particular, what is the macrostate when the system is in equilibrium? We address this problem in succeeding chapters.

Density of Quantum States

A concept that is important for later work is that of the density of states. Under conditions in which the n's are large and the energy levels close together, we regard n and ε as continuous variables. From
ε = (h²/8mV^(2/3)) n², with n² = nx² + ny² + nz²,
we consider a quantum-number space with axes nx, ny, nz. Each point in this space with positive integer coordinates represents an energy state, and each unit volume contains one state. All the states lie in the first octant, since the n's are positive. We then consider a radius R (which is n) in this space and a second radius R + dR. The volume between these two surfaces, restricted to the first octant, is
(1/8)(4πR² dR) = (π/2) n² dn
This gives the number of states between ε and ε + dε, which we represent by g(ε)dε. But n = (8mV^(2/3)ε)^(1/2)/h, so dn = (8mV^(2/3))^(1/2) ε^(-1/2) dε/(2h). Substituting into (π/2) n² dn gives
g(ε)dε = (4√2 π V m^(3/2)/h³) ε^(1/2) dε
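The octant-counting step can be checked numerically: the number of lattice points with nx² + ny² + nz² ≤ R² should approach one octant of a sphere's volume as R grows. A quick sketch:

```python
import math

# Number of states (nx, ny, nz >= 1) with nx^2 + ny^2 + nz^2 <= R^2,
# compared with the octant volume (1/8)(4/3) pi R^3.
R = 100
count = sum(1
            for nx in range(1, R + 1)
            for ny in range(1, R + 1)
            for nz in range(1, R + 1)
            if nx*nx + ny*ny + nz*nz <= R*R)
print(count, round(math.pi * R**3 / 6))
# The two agree to within a few percent at R = 100; the surface correction
# falls off like 1/R, so the agreement improves as R grows.
```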

This result is correct for only certain particles. We have assumed that a state is uniquely specified by the quantum numbers (nx, ny, nz). In many cases other quantum numbers, such as spin, play a role in the unique specification of a state. Particles fall into two radically different categories:
Bosons: have integral spin quantum number. Examples: photons, gravitons, pi mesons.
Fermions: have odd half-integral spin quantum number (1/2, 3/2, …). Examples: electrons, muons, nucleons, quarks.

For electrons, two spin states are possible for each translational state. Thus each point in quantum-number space represents two distinctly different states. This leads to a multiplicative factor of 2 in the density of states formula. To be completely general we write
g(ε)dε = γs (4√2 π V m^(3/2)/h³) ε^(1/2) dε
where γs is the spin factor. For s = 1/2 fermions, γs = 2. The density of states replaces the degeneracy when we go from discrete energy levels to a continuum of energy levels. Notice that g depends on V, but not on N. {Students: Show that the unit of g is J⁻¹.}

Problem 12.1

Consider N "honest" coins. (a) How many microstates are possible? Consider the coins lined up in a row. Each coin has two possibilities (H or T), so for the N coins Ω = 2^N. As an example consider 3 coins, so Ω = 2³ = 8. We will show these microstates explicitly by considering the possibilities for the 2nd and 3rd coins and then adding H or T for the first coin. The possibilities are displayed in the next slide.

COIN 1  COIN 2  COIN 3
H       H       H
H       H       T
H       T       H
H       T       T
T       H       H
T       H       T
T       T       H
T       T       T
We use MAPLE to calculate the factorials. (b) How many microstates are there for the most probable macrostate? The most probable macrostate has the same number of heads and tails (slide 9), so w_max = N!/[(N/2)!(N/2)!]. (c) The true probability of the most probable macrostate is p = w_max/Ω = w_max/2^N.

Problem 12.2

This is the same problem as 12.1, except that N = 1000. The results are:
Ω = 2^1000 ≈ 1.07 × 10^301
w_max = 1000!/(500! 500!) ≈ 2.70 × 10^299
p = w_max/Ω ≈ 0.025
{Students: Consider 4 identical coins in a row. Display all the possible microstates and indicate the various macrostates.}
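These numbers are easy to reproduce. A quick sketch (Python's integer arithmetic handles 2^1000 exactly):

```python
import math

N = 1000
omega = 2 ** N                          # total number of microstates
w_max = math.comb(N, N // 2)            # microstates of the 500-heads macrostate
print(f"omega = {omega:.3e}")           # ~1.072e+301
print(f"w_max = {w_max:.3e}")           # ~2.703e+299
print(f"p_max = {w_max / omega:.4f}")   # ~0.0252
```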

Problem 12.5

We have N distinguishable coins. The thermodynamic probability for a particular macrostate with N1 heads is (slide 9)
w = N!/[N1!(N - N1)!]
(a) Using Stirling's formula, ln N! ≈ N ln N - N, and setting d(ln w)/dN1 = 0 shows that the maximum occurs at N1 = N/2.
(b) For the number of microstates at the maximum, the more accurate form of Stirling's formula, ln N! ≈ N ln N - N + (1/2) ln(2πN), gives w_max ≈ 2^N √(2/(πN)).
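Written out, part (a) runs as follows (a sketch using the two-term Stirling formula):

```latex
% Maximize ln w = ln N! - ln N_1! - ln (N - N_1)! with ln x! ~ x ln x - x:
\begin{align*}
  \ln w &\approx N \ln N - N_1 \ln N_1 - (N - N_1)\ln(N - N_1), \\
  \frac{d \ln w}{d N_1} &= -\ln N_1 + \ln(N - N_1) = 0
    \quad\Longrightarrow\quad N_1 = \frac{N}{2}.
\end{align*}
```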

Problem 12.8

In this problem we show explicitly the microstates associated with each macrostate. There are two distinguishable particles (A and B) and three energy levels (0, ε and 2ε, say), with a total energy of 2ε.
(a) A macrostate is labeled k. Macrostate k = 1 puts one particle in the top level and one in the ground level, which can be done in 2 ways (A up or B up); macrostate k = 2 puts both particles in the middle level, which can be done in only 1 way. In all, w = 3 microstates, so S = k ln(w) = k ln(3).
(b) Now we have 3 particles, with the restriction that at least one particle is in the ground state. (This is obviously necessary.)

[The slide table listed each macrostate k, the levels occupied by particles A, B and C, and the corresponding w.] With three particles there are two macrostates, each with w = 3, giving 6 microstates in all. For this case, S = k ln(6).
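Both parts can be verified by enumeration. The sketch below assumes levels 0, ε, 2ε and a total energy of 2ε (in units of ε); this assumption is chosen because it reproduces the quoted entropies S = k ln(3) and S = k ln(6).

```python
from collections import Counter
from itertools import product

def macrostates(n_particles, levels=(0, 1, 2), U=2):
    """Microstate count per macrostate for distinguishable particles.
    Assumed levels 0, eps, 2*eps and total energy 2*eps (units of eps)."""
    counts = Counter()
    for assignment in product(levels, repeat=n_particles):
        if sum(assignment) == U:
            occ = tuple(assignment.count(e) for e in levels)
            counts[occ] += 1
    return counts

print(macrostates(2))  # {(1,0,1): 2, (0,2,0): 1} -> 3 microstates, S = k ln 3
print(macrostates(3))  # {(2,0,1): 3, (1,2,0): 3} -> 6 microstates, S = k ln 6
```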

What have we accomplished in this chapter? We have started to consider the statistics of the microscopic particles (atoms, molecules, …) of a system. The thermodynamic probability w was introduced: for a given macrostate k, wk is the number of different microstates that give rise to that particular macrostate. A larger value of w for a macrostate means that the macrostate is more likely to occur. We also saw the link between macroscopic thermodynamics (S) and statistical mechanics (w): S = k ln w. The basic postulate of statistical mechanics was also introduced: all accessible microstates of an isolated system are equally probable.

We will be considering situations for which the energy levels are so closely spaced that they may be considered to form a continuum. The degeneracy of discrete states is then replaced by the density of states:
g(ε)dε = γs (4√2 π V m^(3/2)/h³) ε^(1/2) dε
We now apply what we have developed in this chapter to different situations.