Lecture 6. Entropy of an Ideal Gas (Ch. 3)

Lecture 6. Entropy of an Ideal Gas (Ch. 3)

Today we will achieve an important goal: we'll derive the equation(s) of state for an ideal gas from the principles of statistical mechanics. We will follow the path outlined in the previous lecture:
1. Find Ω(U,V,N,...) – the most challenging step
2. S(U,V,N,...) = k_B ln Ω(U,V,N,...)
3. Solve for U = f(T,V,N,...)
So far we have treated quantum systems whose states in the configuration (phase) space may be enumerated. When dealing with classical systems with translational degrees of freedom, we need to learn how to calculate the multiplicity.

Multiplicity for a Single Particle

The multiplicity of a gas is more complicated than that of an Einstein solid, because it depends on three rather than two macro parameters (e.g., N, U, V). Example: a particle in a one-dimensional "box" of length 2L (−L ≤ x ≤ L), with momentum confined between −p_x and +p_x. Quantum mechanics (the uncertainty principle) helps us enumerate all the different states in the configuration (phase) space: the phase space is divided into cells of "area" Δx·Δp_x ~ h. The total number of microstates is then the product of the number of ways the "space" cells can be filled and the number of ways the "momentum" cells can be filled, as sketched below.
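A minimal reconstruction of this counting (Δx and Δp_x denote the phase-space cell sizes and p_max the maximum momentum allowed by the particle's energy; these symbols are introduced here only for illustration):

\[
\Omega_1 \;\approx\; \frac{2L}{\Delta x}\cdot\frac{2p_{\max}}{\Delta p_x},
\qquad \Delta x\,\Delta p_x \approx h
\quad\Longrightarrow\quad
\Omega_1 \;\approx\; \frac{(2L)(2p_{\max})}{h}.
\]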

Multiplicity of a Monatomic Ideal Gas (simplified)

For a molecule in a three-dimensional box, the state of the molecule is a point in the 6D phase space – its position (x, y, z) and its momentum (p_x, p_y, p_z). The number of "space" microstates is proportional to the volume V; for N molecules, it is proportional to V^N. There is some momentum distribution of molecules in an ideal gas (Maxwell), with a long "tail" that goes all the way up to p = (2mU)^{1/2} (U is the total energy of the gas). However, the momentum vector of an "average" molecule is confined within a sphere of radius p ~ (2mU/N)^{1/2} (U/N is the average energy per molecule). This gives the multiplicity for a single "average" molecule, and from it the total number of microstates for N molecules (see the sketch below). However, we have over-counted the multiplicity, because we have assumed that the atoms are distinguishable. For indistinguishable quantum particles, the result should be divided by N! (the number of ways of arranging N identical atoms in a given set of "boxes").
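An order-of-magnitude sketch of this simplified counting, using the quantities defined in the text above:

\[
\Omega_1 \;\sim\; \frac{V}{h^3}\cdot\frac{4}{3}\pi\left(\frac{2mU}{N}\right)^{3/2},
\qquad
\Omega_N \;\sim\; \frac{1}{N!}\,\Omega_1^{\,N}.
\]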

More Accurate Calculation of Ω_p (the momentum-space multiplicity)

Momentum constraints: for 1 particle, p_x² + p_y² + p_z² = 2mU; for 2 particles, p_1x² + p_1y² + p_1z² + p_2x² + p_2y² + p_2z² = 2mU. The accessible momentum "volume" for N particles is therefore the "area" of a 3N-dimensional hyper-sphere of radius (2mU)^{1/2}. The reason why m matters: for a given U, a molecule with a larger mass has a larger momentum, thus a larger "volume" accessible in the momentum space. Monatomic ideal gas: 3N degrees of freedom; in general, fN is the total number of "quadratic" degrees of freedom.
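Carrying out this counting gives the standard large-N result for the multiplicity of a monatomic ideal gas (a sketch; the factorial of 3N/2 is understood via Stirling's approximation):

\[
\Omega(U,V,N)\;\approx\;\frac{1}{N!}\,\frac{V^N}{h^{3N}}\,
\frac{\pi^{3N/2}}{\left(\tfrac{3N}{2}\right)!}\,\left(2mU\right)^{3N/2}.
\]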

Entropy of an Ideal Gas

Monatomic ideal gas: the Sackur-Tetrode equation, which expresses S in terms of V/N (the average volume per molecule) and U/N (the average energy per molecule). In general, for a gas of polyatomic molecules, f = 3 (monatomic), 5 (diatomic), 6 (polyatomic).
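Explicitly, taking S = k_B ln Ω of the multiplicity above and using Stirling's approximation gives the Sackur-Tetrode equation for a monatomic ideal gas:

\[
S \;=\; N k_B\left[\ln\!\left(\frac{V}{N}\left(\frac{4\pi m U}{3 N h^2}\right)^{3/2}\right)+\frac{5}{2}\right].
\]

Written this way, S depends on the extensive variables only through the intensive combinations V/N and U/N, as noted above.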

Problem. Two cylinders (V = 1 liter each) are connected by a valve. In one of the cylinders there is hydrogen (H2) at P = 10^5 Pa, T = 20°C; in the other, helium (He) at P = 3·10^5 Pa, T = 100°C. Find the entropy change after mixing and equilibrating. For each gas, apply the ideal-gas entropy; the temperature after mixing follows from energy conservation for H2 and He together (a numerical sketch is given below).
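A numerical sketch of this problem (the variable names are mine; both gases are assumed ideal, with f = 5 quadratic degrees of freedom for H2 and f = 3 for He):

```python
import math

k_B = 1.381e-23  # Boltzmann constant, J/K
V = 1e-3         # volume of each cylinder, m^3

# Initial states
P_H2, T_H2, f_H2 = 1e5, 293.15, 5   # hydrogen: 10^5 Pa, 20 C, diatomic
P_He, T_He, f_He = 3e5, 373.15, 3   # helium: 3*10^5 Pa, 100 C, monatomic

# Number of molecules from the ideal-gas law, N = PV / (k_B T)
N_H2 = P_H2 * V / (k_B * T_H2)
N_He = P_He * V / (k_B * T_He)

# Final temperature from energy conservation: the total of (f/2) N k_B T is fixed
T_f = (f_H2 * N_H2 * T_H2 + f_He * N_He * T_He) / (f_H2 * N_H2 + f_He * N_He)

# Entropy change of each gas: dS = N k_B [ln(V_f/V_i) + (f/2) ln(T_f/T_i)]
# (each gas expands from V to 2V and changes temperature from T_i to T_f)
dS_H2 = N_H2 * k_B * (math.log(2.0) + (f_H2 / 2) * math.log(T_f / T_H2))
dS_He = N_He * k_B * (math.log(2.0) + (f_He / 2) * math.log(T_f / T_He))

print(f"T_final = {T_f:.1f} K")
print(f"dS(H2) = {dS_H2:.3f} J/K, dS(He) = {dS_He:.3f} J/K")
print(f"dS_total = {dS_H2 + dS_He:.3f} J/K")
```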

Entropy of Mixing

Consider two different ideal gases (N_1, N_2) kept in two separate volumes (V_1, V_2) at the same temperature. To calculate the increase of entropy in the mixing process, we can treat each gas as a separate system. In the mixing process, U/N remains the same (T will be the same after mixing). The parameter that changes is V/N. For N_1 = N_2 = N/2 and V_1 = V_2 = V/2, the result is given below. The total entropy of the system is greater after mixing – thus, mixing is irreversible.
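The mixing entropy follows from the change of V/N alone (a sketch):

\[
\Delta S_{\text{total}}
= N_1 k_B \ln\frac{V_1+V_2}{V_1} + N_2 k_B \ln\frac{V_1+V_2}{V_2}
\;\;\xrightarrow{\;N_1=N_2=N/2,\;\;V_1=V_2=V/2\;}\;\; N k_B \ln 2 .
\]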

Gibbs Paradox

The mixing-entropy result applies only if the two gases are different! If two mixing gases are of the same kind (indistinguishable molecules), ΔS_total = 0, because U/N and V/N available for each molecule remain the same after mixing. Quantum-mechanical indistinguishability is important! (This is so even though the Sackur-Tetrode equation applies only in the low-density limit, which is "classical" in the sense that the distinction between fermions and bosons disappears.)

Problem. Two identical perfect gases with the same pressure P and the same number of particles N, but with different temperatures T_1 and T_2, are confined in two vessels, of volume V_1 and V_2, which are then connected. Find the change in entropy after the system has reached equilibrium – prove the result sketched below! At T_1 = T_2, ΔS = 0, as it should be (Gibbs paradox).
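One way to obtain the expected result (a sketch; f is the number of quadratic degrees of freedom per molecule, and since P is unchanged after equilibration, each half of the gas can be treated as undergoing a constant-pressure temperature change):

\[
T_f=\frac{T_1+T_2}{2},\qquad
\Delta S = N k_B\left(\frac{f}{2}+1\right)\ln\frac{(T_1+T_2)^2}{4\,T_1 T_2}\;\ge\;0 ,
\]

which vanishes when T_1 = T_2, since by the AM-GM inequality the argument of the logarithm is ≥ 1.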

An Ideal Gas: from S(N,V,U) to U(N,V,T)

(fN degrees of freedom.) Inverting 1/T = (∂S/∂U)_{V,N} gives the "energy" equation of state, in agreement with the equipartition theorem: the total energy should be ½ k_B T times the number of degrees of freedom. From it follows the heat capacity of a monatomic ideal gas (see below).
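Explicitly:

\[
\frac{1}{T}=\left(\frac{\partial S}{\partial U}\right)_{V,N}=\frac{f}{2}\,\frac{N k_B}{U}
\;\;\Longrightarrow\;\;
U=\frac{f}{2}\,N k_B T,
\qquad
C_V=\left(\frac{\partial U}{\partial T}\right)_{V,N}=\frac{f}{2}\,N k_B
=\frac{3}{2}\,N k_B \ \ (\text{monatomic}).
\]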

Partial Derivatives of the Entropy

We have been considering the entropy changes in processes where two interacting systems exchanged thermal energy but the volume and the number of particles in these systems were fixed. In general, however, we need more than just one parameter to specify a macrostate; e.g., for an ideal gas, S = S(U, V, N). When all macroscopic quantities S, V, N, U are allowed to vary, we are familiar with the physical meaning of only one partial derivative of the entropy: (∂S/∂U)_{V,N} = 1/T. Today we will explore what happens if we let the other two parameters (V and N) vary, and analyze the physical meaning of the other two partial derivatives of the entropy, ∂S/∂V and ∂S/∂N.

Thermodynamic Identity for dU(S,V,N)

S(U,V,N), if monotonic as a function of U ("quadratic" degrees of freedom!), may be inverted to give U(S,V,N). Its differential – the so-called thermodynamic identity for U – involves the temperature, the pressure, and the chemical potential μ; compare with the differential of S on the next slide. The chemical potential shows how much the system's energy changes when one particle is added to the system at fixed S and V; its units are J. The identity holds for quasi-static processes (T, P, μ are well-defined throughout the system).
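In standard notation:

\[
dU = T\,dS - P\,dV + \mu\,dN,
\qquad
T=\left(\frac{\partial U}{\partial S}\right)_{V,N},\quad
P=-\left(\frac{\partial U}{\partial V}\right)_{S,N},\quad
\mu=\left(\frac{\partial U}{\partial N}\right)_{S,V}.
\]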

The Exact Differential of S(U,V,N)

Solving the thermodynamic identity for dS, the coefficients may be identified as follows. Again, this holds for quasi-static processes (T and P are well defined).

Type of interaction | Exchanged quantity | Governing variable  | Formula
thermal             | energy             | temperature         | 1/T = (∂S/∂U)_{V,N}
mechanical          | volume             | pressure            | P/T = (∂S/∂V)_{U,N}
diffusive           | particles          | chemical potential  | μ/T = −(∂S/∂N)_{U,V}

These relations are the connection between thermodynamics and statistical mechanics.
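The exact differential itself, which the table summarizes:

\[
dS=\frac{1}{T}\,dU+\frac{P}{T}\,dV-\frac{\mu}{T}\,dN .
\]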

Mechanical Equilibrium and Pressure

Let's fix U_A, N_A and U_B, N_B, but allow V to vary (the membrane separating sub-systems A (U_A, V_A, N_A) and B (U_B, V_B, N_B) is insulating and impermeable for gas molecules, but its position is not fixed). Following the same logic as for thermal equilibrium, spontaneous "exchange of volume" between the sub-systems will drive the system towards mechanical equilibrium (the membrane at rest). The equilibrium macropartition should have the largest (by far) multiplicity Ω(U, V) and entropy S(U, V): the sub-system with a smaller volume-per-molecule (larger P at the same T) will have a larger ∂S/∂V, and it will expand at the expense of the other sub-system. In mechanical equilibrium, S_AB is maximized with respect to V_A (at V_A = V_A^eq): the volume-per-molecule should be the same for both sub-systems, or, if T is the same, P must be the same on both sides of the membrane. The stat. phys. definition of pressure: P = T(∂S/∂V)_{U,N}.

The "Pressure" Equation of State for an Ideal Gas

(fN degrees of freedom.) The "energy" equation of state (U versus T): U = (f/2) N k_B T. The "pressure" equation of state (P versus T): PV = N k_B T. We have finally derived the equation of state of an ideal gas from first principles!
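A sketch of the last step: the Sackur-Tetrode entropy depends on V only through the term N k_B ln V, so

\[
P = T\left(\frac{\partial S}{\partial V}\right)_{U,N} = \frac{N k_B T}{V}
\;\;\Longrightarrow\;\; PV = N k_B T .
\]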

Quasi-Static Processes

For all processes, dU = δQ + δW; for quasi-static processes with fixed N, dU = T dS − P dV. Thus, for quasi-static processes, δQ = T dS and δW = −P dV. Comment on state functions: dS = δQ/T is an exact differential (S is a state function); thus, the factor 1/T converts δQ into an exact differential for quasi-static processes. Quasistatic adiabatic (δQ = 0) processes are therefore isentropic processes. For the quasi-static adiabatic process with an ideal gas, we've previously derived the adiabat equations from the 1st Law and the ideal-gas law; on the other hand, they also follow from the Sackur-Tetrode equation for an isentropic process (see below).
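For reference, the isentropic relations for an ideal gas with f degrees of freedom per molecule:

\[
S=\text{const}
\;\;\Longrightarrow\;\;
V\,T^{f/2}=\text{const}
\;\;\Longleftrightarrow\;\;
P\,V^{\gamma}=\text{const},
\qquad \gamma=\frac{f+2}{f}.
\]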

Problem (all the processes are quasi-static):
(a) Calculate the entropy increase of an ideal gas in an isothermal process.
(b) Calculate the entropy increase of an ideal gas in an isochoric process.
You should be able to do this using (a) the Sackur-Tetrode equation and (b) dS = δQ/T. Let's verify that we get the same result with approaches (a) and (b) (e.g., for T = const): since ΔU = 0, δQ = −δW = P dV (Pr. 2.34).
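Both approaches give (a sketch for quasi-static processes in an ideal gas):

\[
\Delta S_{T=\text{const}} = N k_B \ln\frac{V_f}{V_i},
\qquad
\Delta S_{V=\text{const}} = \frac{f}{2}\,N k_B \ln\frac{T_f}{T_i}=C_V \ln\frac{T_f}{T_i}.
\]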

Problem: A body of mass M with heat capacity (per unit mass) C, initially at temperature T_0 + ΔT, is brought into thermal contact with a heat bath at temperature T_0.
(a) Show that if ΔT << T_0, the increase ΔS in the entropy of the entire system (body + heat bath) when equilibrium is reached is proportional to (ΔT)².
(b) Find ΔS if the body is a bacterium of mass 10^-15 kg with C = 4 kJ/(kg·K), T_0 = 300 K, ΔT = 0.03 K. What is the probability of finding the bacterium at its initial temperature T_0 + ΔT for Δt = 10^-12 s at some point over the lifetime of the Universe (~10^18 s)?

Problem (cont.)
(b) Ω for the equilibrium state with T_bacterium = 300 K is greater than Ω for the (non-equilibrium) state with T_bacterium = 300.03 K by a factor of exp(ΔS/k_B). The number of "1 ps" trials over the lifetime of the Universe is 10^18 s / 10^-12 s = 10^30. Thus, the probability of the event happening in 10^30 trials is as estimated below.
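A numerical sketch of part (b) (the variable names are mine; the calculation works with base-10 logarithms, since exp(ΔS/k_B) overflows ordinary floating point):

```python
import math

k_B = 1.381e-23      # J/K
M, C = 1e-15, 4e3    # bacterium mass (kg) and specific heat (J/(kg*K))
T0, dT = 300.0, 0.03 # bath temperature and initial temperature excess (K)

# Entropy change of the body (cools from T0+dT to T0) plus the bath (absorbs Q = M*C*dT)
dS_body = M * C * math.log(T0 / (T0 + dT))
dS_bath = M * C * dT / T0
dS_total = dS_body + dS_bath          # ~ M*C*(dT)**2 / (2*T0**2)

# Probability (per ~1 ps observation) of finding the bacterium back at T0+dT:
# P_1 ~ Omega(non-eq)/Omega(eq) = exp(-dS_total/k_B); work in log10 to avoid overflow
log10_P1 = -dS_total / k_B / math.log(10.0)

n_trials = 1e18 / 1e-12               # number of 1-ps trials over the Universe's lifetime
log10_P_total = log10_P1 + math.log10(n_trials)

print(f"dS_total = {dS_total:.2e} J/K")
print(f"log10(P per trial)        = {log10_P1:.0f}")
print(f"log10(P over all trials)  = {log10_P_total:.0f}")
```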

An example of a non-quasistatic adiabatic process

Caution: for non-quasistatic adiabatic processes, ΔS might be non-zero!!!

Pr. 3.32. A non-quasistatic compression. A cylinder with air (V = 10^-3 m³, T = 300 K, P = 10^5 Pa) is compressed (very fast, non-quasistatically) by a piston (area S = 0.01 m², F = 2000 N, Δx = 10^-3 m). Calculate W, Q, ΔU, and ΔS.

On the P-V diagram (volume going from V_i to V_f), state 1 is the initial state, 2 is the actual final state, and 2* is the final state of a quasistatic adiabatic compression to the same V_f; S = const along the isentropic line 1→2*. Energy conservation holds for all processes, but only for quasistatic processes are T and P well-defined for any intermediate state. Q = 0 both for the quasistatic adiabatic (isentropic) process and for the non-quasistatic adiabatic one. The non-quasistatic process results in a higher T and a greater entropy of the final state.

Direct approach: W = FΔx = 2000 N × 10^-3 m = 2 J of work is done on the gas; Q = 0 (adiabatic); ΔU = Q + W = 2 J. The adiabatic quasistatic process is isentropic (ΔS = 0), while the adiabatic non-quasistatic process produces ΔS > 0.

To calculate ΔS, we can consider any quasistatic process that would bring the gas into the final state 2 (S is a state function). For example, take the path that coincides with the adiabat and then shoots "straight up" (at constant V) on the P-V diagram. Let's neglect small variations of T along this path (ΔU << U, so it won't be a big mistake to assume T ≈ const): for the constant-volume leg, ΔU = Q = 1 J. The entropy is created because this is an irreversible, non-quasistatic compression.

For any quasi-static path from 1 to 2, we must have the same ΔS. Let's take another path – along the isotherm and then straight up: for the constant-volume leg, ΔU = Q = 2 J. Adding the entropy change along the isotherm to that of the "straight up" leg gives the same total gain of entropy (the numerical sketch below checks the numbers).
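A numerical sketch of this check (my own variable names; air is treated as a diatomic ideal gas with f = 5, an assumption that barely affects the answer):

```python
import math

# Initial state and piston parameters (Pr. 3.32)
V1, T1, P1 = 1e-3, 300.0, 1e5     # m^3, K, Pa
A, F, dx = 0.01, 2000.0, 1e-3     # piston area (m^2), force (N), displacement (m)

NkB = P1 * V1 / T1                # N*k_B from the ideal-gas law, J/K
f = 5                             # air treated as a diatomic ideal gas (assumption)
C_V = (f / 2) * NkB

# Direct approach: work, heat, change of internal energy
W = F * dx                        # 2 J of work done on the gas
Q = 0.0                           # adiabatic (no heat exchange)
dU = Q + W
V2 = V1 - A * dx                  # final volume after the piston moves in by dx
T2 = T1 + dU / C_V                # final temperature

# Approximate path estimates (neglecting small variations of T, as in the text)
dS_pathA = 1.0 / T1                             # adiabat, then "straight up": Q ~ 1 J
dS_pathB = NkB * math.log(V2 / V1) + 2.0 / T1   # isotherm, then "straight up": Q ~ 2 J

# Exact state-function result (the same for any quasistatic path between the end states)
dS_exact = NkB * math.log(V2 / V1) + C_V * math.log(T2 / T1)

print(f"W = {W:.1f} J, Q = {Q:.1f} J, dU = {dU:.1f} J, T2 = {T2:.2f} K")
print(f"dS (adiabat + isochore)  ~ {dS_pathA:.2e} J/K")
print(f"dS (isotherm + isochore) ~ {dS_pathB:.2e} J/K")
print(f"dS (exact)               = {dS_exact:.2e} J/K")
```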

The inverse process, sudden expansion of an ideal gas (2 → 3), also generates entropy (adiabatic but not quasistatic). Neither heat nor work is transferred: W = Q = 0 (we assume the whole process occurs rapidly enough so that no heat flows in through the walls). Because U is unchanged, T of the ideal gas is unchanged. The final state is identical with the state that results from a reversible isothermal expansion with the gas in thermal equilibrium with a reservoir. In that reversible expansion from volume V_f to V_i, the work done on the gas is negative: the gas does positive work on the piston in an amount equal to the heat transferred into the system. Thus, by going 1 → 2 → 3, we will increase the gas entropy (see the estimate below).
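A sketch of the resulting entropy increase, in the notation of this problem:

\[
\Delta S_{2\to 3} = N k_B \ln\frac{V_i}{V_f} > 0,
\qquad
\Delta S_{1\to 2\to 3} = \Delta S_{1\to 2} + N k_B \ln\frac{V_i}{V_f}.
\]

With the numbers of Pr. 3.32, N k_B ln(V_i/V_f) ≈ 3.3×10^-3 J/K, comparable to ΔS_{1→2}, so the round trip 1 → 2 → 3 roughly doubles the entropy gain of the compression alone.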