Thermodynamic Potentials

1 Thermodynamic Potentials
We have been using the total internal energy U and, sometimes, the enthalpy H to characterize various macroscopic systems. These functions are called thermodynamic potentials: all the thermodynamic properties of the system can be found by taking partial derivatives of the TP. For each TP, a set of so-called "natural variables" exists. The other two thermodynamic potentials are the Helmholtz free energy F and the Gibbs free energy G. Depending on the type of process, one of these four thermodynamic potentials provides the most convenient description (and is tabulated). All four functions have units of energy. When considering different types of processes, we will be interested in two main issues: what determines the stability of a system and how the system evolves towards equilibrium; and how much work can be extracted from a system.

Potential     Natural variables
U (S,V,N)     S, V, N
H (S,P,N)     S, P, N
F (T,V,N)     T, V, N
G (T,P,N)     T, P, N
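The definitions behind this table, written out in their standard form (the slide's own equation images are not reproduced here): each potential is obtained from U(S,V,N) by a Legendre transformation that swaps S → T and/or V → P.

```latex
\[
H \equiv U + PV, \qquad
F \equiv U - TS, \qquad
G \equiv U - TS + PV = H - TS .
\]
```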

2 Diffusive Equilibrium and Chemical Potential
For completeness, let's recall what we've learned about the chemical potential. The meaning of the partial derivative (∂S/∂N)U,V: let's fix VA and VB (the membrane's position is fixed), but assume that the membrane becomes permeable to gas molecules (exchange of both U and N between the sub-systems; the molecules in A and B are the same). (Sketch: two sub-systems with UA, VA, SA and UB, VB, SB separated by the membrane.) This derivative, multiplied by −T, defines the chemical potential μ; for sub-systems in diffusive equilibrium, μA = μB. The sign "−" means that, out of equilibrium, the sub-system with the larger ∂S/∂N will get more particles. In other words, particles will flow from a high μ/T to a low μ/T.
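The defining relation and the equilibrium condition referenced above, in standard form:

```latex
\[
\mu \equiv -T \left( \frac{\partial S}{\partial N} \right)_{U,V},
\qquad
\text{diffusive equilibrium:}\quad
\frac{\mu_A}{T_A} = \frac{\mu_B}{T_B}
\;\;\;(\text{so } \mu_A = \mu_B \text{ when } T_A = T_B).
\]
```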

3 Chemical Potential of an Ideal gas
μ has units of energy: it is the amount of energy we need to (usually) remove from the system after adding one particle in order to keep its total energy fixed. Monatomic ideal gas: see the expression below. At normal T and P, μ for an ideal gas is negative (e.g., for He, μ ~ −5·10⁻²⁰ J ≈ −0.3 eV). The sign "−" arises because, by adding particles to this system, we increase its entropy; to keep dS = 0, we need to subtract some energy, so ΔU (and hence μ) is negative. The chemical potential of an ideal gas increases with its pressure. Thus, molecules will flow from regions of high density to regions of lower density, or from regions of high pressure to those of low pressure. Note that μ in this case is negative because S increases with N. This is not always the case. For example, for a system of fermions at T → 0, the entropy is zero (all the lowest states are occupied), but adding one fermion to the system costs some energy (the Fermi energy). Thus, for such a system μ = EF > 0.
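The monatomic ideal-gas expression referred to above is the standard result obtained from the Sackur–Tetrode entropy:

```latex
\[
\mu = -k_B T \,
\ln\!\left[ \frac{V}{N}
\left( \frac{2\pi m k_B T}{h^2} \right)^{3/2} \right]
= k_B T \, \ln\!\frac{n}{n_Q},
\qquad n = \frac{N}{V}.
\]
```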

4 The Quantum Concentration
Here n = N/V is the concentration of particles and nQ = (2πmkBT/h²)^3/2 is the so-called quantum concentration (one particle per cube whose side equals the thermal de Broglie wavelength). When n << nQ (the limit of low densities), the gas is in the classical regime and μ < 0; when n → nQ, μ → 0. At T = 300 K and P = 10⁵ Pa, n << nQ. When n ≈ nQ, quantum statistics comes into play.
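A quick numerical check of the n << nQ claim, sketched here in Python for helium at room conditions (the constants are standard values; the script is an illustration, not part of the original slides):

```python
import math

# Physical constants (SI)
kB = 1.380649e-23      # J/K
h  = 6.62607e-34       # J*s
m_He = 6.646e-27       # kg, mass of a helium atom

T = 300.0              # K
P = 1.0e5              # Pa

# Quantum concentration n_Q = (2*pi*m*kB*T / h^2)^(3/2)
nQ = (2 * math.pi * m_He * kB * T / h**2) ** 1.5

# Actual concentration of an ideal gas, n = P / (kB*T)
n = P / (kB * T)

# Chemical potential mu = kB*T * ln(n / nQ)
mu_J  = kB * T * math.log(n / nQ)
mu_eV = mu_J / 1.602e-19

print(f"n   = {n:.2e} m^-3")      # ~ 2.4e25 m^-3
print(f"n_Q = {nQ:.2e} m^-3")     # ~ 7.8e30 m^-3  ->  n << n_Q
print(f"mu  = {mu_J:.2e} J = {mu_eV:.2f} eV")   # ~ -5e-20 J ~ -0.3 eV
```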

5 Isolated Systems, independent variables S and V
Advantages of U: it is conserved for an isolated system (it also has a simple physical meaning – the sum of all the kinetic and potential energies of all the particles). In particular, for an isolated system Q = 0 and dU = W. Earlier, by considering the total differential of S as a function of the variables U, V, and N, we arrived at the thermodynamic identity for quasistatic processes. The combination of parameters on the right-hand side is equal to the exact differential of U. This implies that the natural variables of U are S, V, N. Considering S, V, and N as independent variables, and noting that the two expressions for dU must yield the same result for any dS and dV, the corresponding coefficients must be the same. Again, this shows that among the several macroscopic variables that characterize the system (P, V, T, μ, N, etc.), only three are independent; the other variables can be found by taking partial derivatives of the TP with respect to its natural variables.
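The identity and the coefficient matching referred to above, in standard form:

```latex
\[
dU = T\,dS - P\,dV + \mu\,dN
\quad\Longrightarrow\quad
T = \left(\frac{\partial U}{\partial S}\right)_{V,N},\quad
P = -\left(\frac{\partial U}{\partial V}\right)_{S,N},\quad
\mu = \left(\frac{\partial U}{\partial N}\right)_{S,V}.
\]
```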

6 Isolated Systems, independent variables S and V (cont.)
Work is the transfer of energy to a system by a change in the external parameters such as volume, magnetic and electric fields, gravitational potential, etc. We can represent W as the sum of two terms: the mechanical work done in changing the volume of a system (an "expansion" work), −PdV, and all other kinds of work, Wother (electrical work, work on creating surface area, etc.). If the system comprises only solids and liquids, we can usually assume dV ≈ 0, and the difference between W and Wother vanishes. For gases, the difference may be very significant. The energy balance for an isolated system (for fixed N; initially, the system is not necessarily in equilibrium) is dU = W. If we consider a quasi-static process (the system evolves from one equilibrium state to another), then, since for an isolated system Q = TdS = 0, the energy balance becomes dU = W = −PdV + Wother.

7 Equilibrium in Isolated Systems
For a thermally isolated system Q = 0. If the volume is fixed, then no work gets done (W = 0) and the internal energy is conserved. While this constraint is always in place, the system might be out of equilibrium (e.g., we move a piston that separates two sub-systems A and B with UA, VA, SA and UB, VB, SB, see the figure). If the system is initially out of equilibrium, then some spontaneous processes will drive the system towards equilibrium. In a state of stable equilibrium no further spontaneous processes (other than ever-present random fluctuations) can take place. The equilibrium state corresponds to the maximum multiplicity and maximum entropy. All microstates in equilibrium are equally accessible (the system is in one of these microstates with equal probability). This implies that in any of these spontaneous processes the entropy tends to increase, and the change of entropy satisfies the condition ΔS ≥ 0. Suppose that the system is characterized by a parameter x which is free to vary (e.g., the system might consist of ice and water, and x is the relative concentration of ice). By spontaneous processes, the system will approach the stable equilibrium (x = xeq) where S(x) attains its absolute maximum.

8 Systems in Contact with a Thermal Reservoir
When we consider systems in contact with a large thermal reservoir (a "thermal bath"), there are two complications: (a) the energy of the system is no longer fixed (it may flow between the system and the reservoir), and (b) in order to investigate the stability of an equilibrium, we need to consider the entropy of the combined system (= the system of interest + the reservoir) – according to the 2nd Law, this total entropy should be maximized. What should the system's behavior be in order to maximize the total entropy? For systems in contact with a heat bath, we need to invent a better, more useful approach. The entropy, along with V and N, determines the system's energy U = U(S,V,N). Among the three variables, the entropy is the most difficult to control (entropy-meters do not exist!). For an isolated system, we have to work with the entropy – it cannot be replaced with some other function, and so far we have not wanted to replace it – after all, our approach to thermodynamics was based on this concept. However, for systems in thermal contact with a reservoir, we can replace the entropy with another, more-convenient-to-work-with function. This, of course, does not mean that we can get rid of entropy. We will simply be able to work with a different "energy-like" thermodynamic potential for which entropy is not one of the natural variables.

9 Review: Entropy of a system in a given macrostate (N,U,V...):
S = kB ln Ω. Units: J/K. The entropy is a state function, i.e., it depends on the macrostate alone and not on the path by which the system reached this macrostate. Entropy is just another (more convenient) way of talking about multiplicity. Convenience: it reduces ridiculously large numbers to manageable ones. Example: for N ~ 10²³, ln Ω ~ 10²³; multiplied by kB ~ 10⁻²³ J/K, this gives S ~ 1 J/K. The "inverse" procedure: the entropy of a certain macropartition is 4600 kB. What is the multiplicity of the macropartition? (See the estimate below.) If a system contains two or more interacting sub-systems, each in its own distinct macrostate, the total entropy of the combined system in a given macropartition is the sum of the entropies the subsystems have in that macropartition: ΩAB = ΩA × ΩB × ΩC × ... ⇒ SAB = SA + SB + SC + ...
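A sketch of the "inverse" procedure asked about above, assuming S = kB ln Ω (the multiplicity itself is far too large for a floating-point number, so we work with its base-10 logarithm):

```python
import math

S_over_kB = 4600.0                 # entropy in units of kB: S = 4600 kB

ln_Omega = S_over_kB               # S = kB * ln(Omega)  =>  ln(Omega) = S / kB
log10_Omega = ln_Omega / math.log(10)

print(f"log10(Omega) = {log10_Omega:.0f}")   # ~ 1998, i.e. Omega ~ 10^1998
```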

10 Problem: Imagine that one macropartition of a combined system of two Einstein solids has an entropy of 1 J/K, while another (where the energy is more evenly divided) has an entropy of J/K. How many times more likely are you to find the system in the second macropartition compared to the first?
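The relation needed for this problem, stated in general form (the numerical value of the second entropy is not given above, so only the general expression is shown): the ratio of probabilities equals the ratio of multiplicities,

```latex
\[
\frac{P_2}{P_1} = \frac{\Omega_2}{\Omega_1}
= \exp\!\left[ \frac{S_2 - S_1}{k_B} \right].
\]
```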

11 Microcanonical → Canonical
Our description of the microcanonical and canonical ensembles was based on counting the number of accessible microstates. Let's compare the two cases.

Microcanonical ensemble: for an isolated system, the multiplicity Ω provides the number of accessible microstates. The constraint in counting the states: U, V, N = const. For a fixed U, the mean temperature T is specified, but T can fluctuate. The probability of finding the system in any one of the accessible states is 1/Ω. In equilibrium, S reaches a maximum.

Canonical ensemble: for a system in thermal contact with a reservoir, the partition function Z provides the number of accessible microstates. The constraint: T, V, N = const. For a fixed T, the mean energy U is specified, but U can fluctuate. The probability of finding the system in a state of energy εk is exp(−εk/kBT)/Z. In equilibrium, F reaches a minimum.

For the canonical ensemble, the role of Z is similar to that of the multiplicity Ω for the microcanonical ensemble. The equation F = −kBT ln Z gives the fundamental relation between statistical mechanics and thermodynamics for given values of T, V, and N, just as S = kB ln Ω gives the fundamental relation between statistical mechanics and thermodynamics for given values of U, V, and N.

12 Boltzmann Statistics. Now we want to learn how to "statistically" treat a system in contact with a heat bath. The fundamental assumption states that a closed (isolated) system visits every one of its microstates with equal frequency: all allowed states of the system are equally probable. This statement applies to the combined system (the system of interest + the reservoir). We wish to translate this statement into a statement that applies to the system of interest only. Thus, the question: how often does the system visit each of its microstates while in thermal equilibrium with the reservoir? The only information we have about the reservoir is that it is at the temperature T. (Figure: a combined isolated system with U0 = const, consisting of a heat reservoir R with energy U0 − ε and a system S in thermal contact with it.)

13 The Fundamental Assumption for an Isolated System
Isolated ⇒ the energy is conserved. The ensemble of all equi-energetic states is a microcanonical ensemble. The ergodic hypothesis: an isolated system in thermal equilibrium, evolving in time, will pass through all the accessible microstates at the same recurrence rate, i.e., all accessible microstates are equally probable. The average over long times will equal the average over the ensemble of all equi-energetic microstates: if we take a snapshot of a system with N microstates, we will find the system in any of these microstates with the same probability. Probability of a particular microstate of a microcanonical ensemble = 1 / (# of all accessible microstates).

14 Probability of a particular macrostate
The probability of a certain macrostate is determined by how many microstates correspond to this macrostate – the multiplicity Ω of the given macrostate. Probability of a particular macrostate = (Ω of that macrostate) / (# of all accessible microstates). Note that the assumption that the system is isolated is important. If a system is coupled to a heat reservoir and is able to exchange energy, then, in order to replace the system's trajectory by an ensemble, we must determine the relative occurrence of states with different energies.

15 Systems in Contact with the Reservoir
The system – any small macroscopic or microscopic object. If the interactions between the system and the reservoir are weak, we can assume that the spectrum of energy levels of the weakly-interacting system is the same as that of an isolated system. Example: a two-level system in thermal contact with a heat bath. We ask the following question: under conditions of equilibrium between the system and the reservoir, what is the probability P(k) of finding the system S in a particular quantum state k of energy εk? We assume weak interaction between R and S so that their energies are additive. Energy conservation in the isolated system "system + reservoir": U0 = UR + US = const. According to the fundamental assumption of thermodynamics, all states of the combined (isolated) system "R+S" are equally probable. By specifying the microstate k of the system, we have reduced ΩS to 1 and SS to 0. Thus, the probability of the system being in state k is proportional to the number of states accessible to the reservoir R. The total multiplicity: Ω = ΩS × ΩR(U0 − εk) = ΩR(U0 − εk).

16 Systems in Contact with the Reservoir (cont.)
The ratio of the probability that the system is in quantum state 1 at energy ε1 to the probability that it is in quantum state 2 at energy ε2 is given below. (Sketch: SR(UR) vs. UR, showing SR(U0), SR(U0 − ε1), and SR(U0 − ε2).) Let's now use the fact that S is much smaller than R (US = εk << UR). Also, we'll consider the case of fixed volume and number of particles (the latter limitation will be removed later, when we allow the system to exchange particles with the heat bath).
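The derivation sketched on this slide, reconstructed in its standard form (a Taylor expansion of the reservoir entropy):

```latex
\[
\frac{P(\varepsilon_1)}{P(\varepsilon_2)}
= \frac{\Omega_R(U_0-\varepsilon_1)}{\Omega_R(U_0-\varepsilon_2)}
= \exp\!\left[ \frac{S_R(U_0-\varepsilon_1)-S_R(U_0-\varepsilon_2)}{k_B} \right],
\qquad
S_R(U_0-\varepsilon) \approx S_R(U_0)
- \varepsilon \left(\frac{\partial S_R}{\partial U_R}\right)_{V,N}
= S_R(U_0) - \frac{\varepsilon}{T},
\]
```

and therefore

```latex
\[
\frac{P(\varepsilon_1)}{P(\varepsilon_2)}
= \exp\!\left[-\frac{\varepsilon_1-\varepsilon_2}{k_B T}\right].
\]
```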

17 Boltzmann Factor

T is the characteristic of the heat reservoir; exp(−εk/kBT) is called the Boltzmann factor. This result shows that we do not have to know anything about the reservoir except that it maintains a constant temperature T! The corresponding probability distribution is known as the canonical distribution. An ensemble of identical systems, all of which are in contact with the same heat reservoir and distributed over states in accordance with the Boltzmann distribution, is called a canonical ensemble. The fundamental assumption for an isolated system has been transformed into the following statement for the system of interest in thermal equilibrium with the thermal reservoir: the system visits each microstate with a frequency proportional to the Boltzmann factor. Apparently, this is what the system actually does, but from the macroscopic point of view of thermodynamics, it appears as if the system is trying to minimize its free energy. Or, conversely, this is what the system has to do in order to minimize its free energy.

18 One of the most useful equations in Thermodynamics + Statistical Physics
Firstly, notice that only the energy difference Δε = εi − εj enters the result, so that, provided both energies are measured from the same origin, it does not matter what that origin is. Secondly, what matters in determining the ratio of the occupation numbers is the ratio of the energy difference Δε to kBT. Suppose that εi = kBT and εj = 10 kBT. Then (εi − εj)/kBT = −9, and the ratio of the occupation numbers follows from the relation below. The lowest energy level ε0 available to a system (e.g., a molecule) is referred to as the "ground state". If we measure all energies relative to ε0 and n0 is the number of molecules in this state, then the number of molecules in a state with energy ε > ε0 is n(ε) = n0 exp(−ε/kBT).
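The worked ratio for the example above (reconstructing the number implied by the slide):

```latex
\[
\frac{n_i}{n_j}
= \exp\!\left[-\frac{\varepsilon_i-\varepsilon_j}{k_B T}\right]
= e^{9} \approx 8.1\times 10^{3}.
\]
```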

19 Helmholtz Free Energy (independ. variables T and V)
Let's do the trick (a Legendre transformation) again, now to exclude S. The resulting thermodynamic potential is the Helmholtz free energy. The natural variables for F are T, V, N, and comparison of the two expressions for dF yields the relations below. The expression P = −(∂F/∂V)T,N can be rewritten as P = −(∂U/∂V)T,N + T(∂S/∂V)T,N: the first term – the "energy" pressure – is dominant in most solids; the second term – the "entropy" pressure – is dominant in gases. (For an ideal gas, U does not depend on V, and only the second term survives.) F is the total energy needed to create the system, minus the heat we can get "for free" from the environment at temperature T. If we annihilate the system, we can't recover all of its U as work, because we have to dispose of its entropy at a non-zero T by dumping some heat into the environment.
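The definition and the relations referred to above, in standard form:

```latex
\[
F \equiv U - TS, \qquad
dF = -S\,dT - P\,dV + \mu\,dN
\;\;\Longrightarrow\;\;
S = -\left(\frac{\partial F}{\partial T}\right)_{V,N},\quad
P = -\left(\frac{\partial F}{\partial V}\right)_{T,N},\quad
\mu = \left(\frac{\partial F}{\partial N}\right)_{T,V}.
\]
```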

20 The Minimum Free Energy Principle (V,T = const)
The total energy of the combined system (= the system of interest + the reservoir) is U = UR + Us; this energy is to be shared between the reservoir and the system (we assume that V and N for all the systems are fixed). The sharing is controlled by the maximum entropy principle. Since U ≈ UR >> Us, the loss in SR due to transferring Us to the system competes with the gain in Ss, and the total entropy SR+s of "reservoir + system" reaches its maximum exactly where the Helmholtz free energy Fs of the system (which depends on the system's parameters only) reaches its minimum. Thus, we can enforce the maximum entropy principle by simply minimizing the Helmholtz free energy of the system, without having to know anything about the reservoir except that it maintains a fixed T! Under these conditions (fixed T, V, and N), the maximum entropy principle for an isolated system is transformed into a minimum Helmholtz free energy principle for a system in thermal contact with the thermal bath.

21 Processes at T = const. In general, we consider processes that may include "other" work. For processes at T = const (in thermal equilibrium with a large reservoir), the total work performed on a system in a reversible process is equal to the change in the Helmholtz free energy of the system. In other words, for T = const processes the Helmholtz free energy gives all the reversible work. Problem: Consider a cylinder separated into two parts by an adiabatic piston. Compartments a and b each contain one mole of a monatomic ideal gas, and their initial volumes are Vai = 10 l and Vbi = 1 l, respectively. The cylinder, whose walls allow heat transfer only, is immersed in a large bath at 0°C. The piston is now moved reversibly so that the final volumes are Vaf = 6 l and Vbf = 5 l. How much work is delivered by (or to) the system? The process is isothermal, so the work delivered by the system is −ΔF = −(ΔFa + ΔFb); for one mole of monatomic ideal gas at constant T, ΔF = −RT ln(Vf/Vi) (see the numerical estimate below).
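A numerical sketch of the two-compartment problem above (assuming ideal-gas behavior and the values quoted on the slide; W_on denotes the work done on the system, so a negative value means work is delivered by the system):

```python
import math

R = 8.314          # J/(mol K)
T = 273.15         # K (0 degrees C)

# Initial and final volumes (litres; only the ratios matter)
Va_i, Va_f = 10.0, 6.0
Vb_i, Vb_f = 1.0, 5.0

# For one mole of ideal gas at constant T:  dF = -R*T*ln(Vf/Vi)
dF_a = -R * T * math.log(Va_f / Va_i)   # compartment a is compressed -> dF_a > 0
dF_b = -R * T * math.log(Vb_f / Vb_i)   # compartment b expands       -> dF_b < 0

W_on = dF_a + dF_b                      # reversible isothermal work done ON the system
print(f"dF_a = {dF_a:+.0f} J, dF_b = {dF_b:+.0f} J")
print(f"W_on = {W_on:+.0f} J  ->  the system delivers about {-W_on/1000:.1f} kJ")
```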

22 Gibbs Free Energy (independent variables T and P)
Let’s do the trick of Legendre transformation again, now to exclude both S and V : - the thermodynamic potential G is called the Gibbs free energy. Let’s rewrite dU in terms of independent variables T and P : Considering T, P, and N as independent variables: Comparison yields the relations:
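The relations referred to above, written out in standard form:

```latex
\[
G \equiv U - TS + PV, \qquad
dG = -S\,dT + V\,dP + \mu\,dN
\;\;\Longrightarrow\;\;
S = -\left(\frac{\partial G}{\partial T}\right)_{P,N},\quad
V = \left(\frac{\partial G}{\partial P}\right)_{T,N},\quad
\mu = \left(\frac{\partial G}{\partial N}\right)_{T,P}.
\]
```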

23 Gibbs Free Energy and Chemical Potential
Combining dG = −S dT + V dP + μ dN with the extensivity of G gives us a new interpretation of the chemical potential: at least for systems with only one type of particle, the chemical potential is just the Gibbs free energy per particle, μ = G/N. If we add one particle to a system, holding T and P fixed, the Gibbs free energy of the system will increase by μ. By adding more particles, we do not change the value of μ, since we do not change the density: μ ≠ μ(N). Note that U, H, and F, whose differentials also contain the term μ dN, depend on N non-linearly, because in processes with the independent variables (S,V,N), (S,P,N), and (V,T,N), μ = μ(N) might vary with N.

24 Example: Sketch a qualitatively accurate graph of G vs. T for a pure substance as it changes from solid to liquid to gas at fixed pressure. The slope of the graph G(T) at fixed P is −S. Thus the slope is always negative, and it becomes steeper as T and S increase. When a substance undergoes a phase transformation, its entropy increases abruptly, so the slope of G(T) is discontinuous at the transition. (Sketch: G vs. T showing solid, liquid, and gas branches with successively steeper negative slopes, joined with kinks at the transitions.) These relations allow computing Gibbs free energies at "non-standard" T (if G is tabulated at a "standard" T).

25 The Minimum Free Energy Principle (P,T = const)
The total energy of the combined system (= the system of interest + the reservoir) is U = UR + Us; this energy is to be shared between the reservoir and the system (we assume that P and N for all the systems are fixed). The sharing is controlled by the maximum entropy principle. Thus, we can enforce the maximum entropy principle by simply minimizing the Gibbs free energy of the system, without having to know anything about the reservoir except that it maintains fixed T and P! Under these conditions (fixed T, P, and N), the maximum entropy principle for an isolated system is transformed into a minimum Gibbs free energy principle for a system in thermal contact + mechanical equilibrium with the reservoir. Thus, if a system whose parameters T, P, and N are fixed is in thermal contact with a heat reservoir, the stable equilibrium is characterized by the condition that G is a minimum. G/T is the net entropy cost that the reservoir pays for allowing the system to have volume V and energy U, which is why minimizing it maximizes the total entropy of the whole combined system.

26 Processes at P = const and T = const
Let's consider processes at P = const and T = const in general, including processes with "other" work. Then the "other" work performed on a system at T = const and P = const in a reversible process is equal to the change in the Gibbs free energy of the system. In other words, the Gibbs free energy gives all the reversible work except the PV work. If the mechanical (PV) work is the only kind of work performed by the system, the Gibbs free energy is conserved: dG = 0. Gibbs Free Energy and the Spontaneity of Chemical Reactions: the Gibbs free energy is particularly useful when we consider chemical reactions at constant P and T, where the volume changes as the reaction proceeds. The ΔG associated with a chemical reaction is a useful indicator of whether the reaction will proceed spontaneously. Since the change in G is equal to the maximum "useful" work which can be accomplished by the reaction, a negative ΔG indicates that the reaction can happen spontaneously. On the other hand, if ΔG is positive, we need to supply the minimum "other" work ΔWother = ΔG to make the reaction go.

27 Helmholtz and Gibbs free energy
Potential     Natural variables
U (S,V,N)     S, V, N
H (S,P,N)     S, P, N
F (T,V,N)     T, V, N
G (T,P,N)     T, P, N

28 The Partition Function
For the absolute values of the probability (rather than a ratio of probabilities), we need an explicit formula for the normalizing factor 1/Z. (We will often use the notation β ≡ 1/kBT.) The quantity Z, the partition function, can be found from the normalization condition: the total probability of finding the system in all allowed quantum states is 1. (In German, Z stands for Zustandssumme, the "sum over states".) The partition function Z is called a "function" because it depends on T, the spectrum (and thus on V), etc. Example: a single particle with a continuous spectrum; sketch the probability distribution for two temperatures T1 < T2. The areas under these curves must be the same (= 1). Thus, with increasing T, 1/Z decreases and Z increases. At T = 0, the system is in its ground state (ε = 0) with probability 1.
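The definitions referred to above, in standard form:

```latex
\[
P(\varepsilon_k) = \frac{1}{Z}\, e^{-\varepsilon_k/k_B T},
\qquad
Z(T,V,N) = \sum_k e^{-\varepsilon_k/k_B T} = \sum_k e^{-\beta \varepsilon_k},
\qquad
\beta \equiv \frac{1}{k_B T}.
\]
```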

29 Partition Function and Helmholtz Free Energy
Now we can relate F to Z: Comparing this with the expression for the average energy: This equation provides the connection between the microscopic world which we specified with microstates and the macroscopic world which we described with F. If we know Z=Z(T,V,N), we know everything we want to know about the thermal behavior of a system. We can compute all the thermodynamic properties:
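The connection between Z and the thermodynamic quantities stated on this slide, written out in standard form:

```latex
\[
F = -k_B T \ln Z, \qquad
U = -\frac{\partial \ln Z}{\partial \beta}
  = k_B T^{2}\,\frac{\partial \ln Z}{\partial T}, \qquad
S = -\left(\frac{\partial F}{\partial T}\right)_{V,N}, \qquad
P = -\left(\frac{\partial F}{\partial V}\right)_{T,N}.
\]
```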

30 Partition Function for a Hydrogen Atom
Any reference energy can be chosen. Let's choose ε = 0 in the ground state: ε1 = 0, ε2 = 10.2 eV, ε3 = 12.1 eV, etc. (Measured from the ionization threshold, the levels are ε1 = −13.6 eV, ε2 = −3.4 eV, ε3 = −1.5 eV.) (a) Estimate the partition function for a hydrogen atom at T = 5800 K (kBT ≈ 0.5 eV) by taking into account only the three lowest energy levels. We can forget about the spin degeneracy – it is the same for all levels – so the only degeneracy factor that matters is n². However, if we take into account all discrete levels, the full partition function diverges (the degeneracy grows as n² without bound, while the Boltzmann factor approaches a constant).
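A numerical sketch of estimate (a), using the level spacings quoted on the slide and degeneracies n² (spin ignored, as noted above):

```python
import math

kT = 0.5                      # eV, roughly kB*T at T = 5800 K
levels = [(1, 0.0), (2, 10.2), (3, 12.1)]   # (n, energy above the ground state in eV)

Z = sum(n**2 * math.exp(-E / kT) for n, E in levels)
print(f"Z = {Z:.10f}")        # = 1 + 4*exp(-20.4) + 9*exp(-24.2) ~ 1 + 5.5e-9

# The excited states contribute almost nothing: at kT = 0.5 eV the atom is
# essentially always found in its ground state.
```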

31 Partition Function for a Hydrogen Atom (cont.)
Intuitively, only the lowest levels should matter at Δε >> kBT. To resolve this paradox, let's go back to our assumptions: in writing the Boltzmann factor we neglected the PδV term (the work done against the external pressure when the volume occupied by the atom changes). If we keep this term, the weight of a state becomes exp[−(ε + PV)/kBT]. For a H atom in its ground state, V ~ (0.1 nm)³, and at atmospheric pressure PV ~ 10⁻⁶ eV (a negligible correction). However, this volume increases as n³ (the Bohr radius grows as n), and for n = 100, PV is already ~1 eV. The PV terms cause the Boltzmann factors to decrease exponentially, and this rehabilitates our physical intuition: the correct partition function will be dominated by just a few of the lowest energy levels.

32 Average Values in a Canonical Ensemble
We have developed the tools that permit us to calculate the average value of different physical quantities for a canonical ensemble of identical systems. In general, if the systems in an ensemble are distributed over their accessible states in accordance with the distribution P(i), the average value of some quantity x (i) can be found as: In particular, for a canonical ensemble: Let’s apply this result to the average (mean) energy of the systems in a canonical ensemble (the average of the energies of the visited microstates according to the frequency of visits): The average values are additive. The average total energy Utot of N identical systems is: Another useful representation for the average energy: thus, if we know Z=Z(T,V,N), we know the average energy!

33 Degenerate Energy Levels
If several quantum states of the system (different sets of quantum numbers) correspond to the same energy level, this level is called degenerate. The probability to find the system in one of these degenerate states is the same for all the degenerate states. Thus, the total probability to find the system in a state with energy i is where di is the degree of degeneracy. Taking the degeneracy of energy levels into account, the partition function should be modified:

34 Degenerate Energy Levels
Example: the energy levels of an electron in the hydrogen atom are En = −13.6 eV/ni², where ni = 1, 2, ... is the principal quantum number (these levels are obtained by solving the Schrödinger equation for the Coulomb potential). In addition to ni, the states of the electron in the H atom are characterized by three other quantum numbers: the orbital quantum number li = 0, 1, ..., ni − 1, the projection of the orbital momentum mli = −li, −li+1, ..., 0, ..., li−1, li, and the projection of spin si = ±1/2. In the absence of an external magnetic field, all electron states with the same ni are degenerate (a property of the Coulomb potential). The degree of degeneracy in this case is di = 2ni² (d1 = 2, d2 = 8, ...). (For a continuous spectrum, we need another approach.)

35 Boltzmann Statistics: classical (low-density) limit
We have developed the formalism for calculating the thermodynamic properties of systems whose particles can occupy particular quantum states regardless of the other particles ("classical" systems). In other words, we ignored all kinds of interactions between the particles. However, the occupation numbers are not free from the rules of quantum mechanics. In quantum mechanics, even if the particles do not interact through forces, they still might participate in the so-called exchange interaction, which depends on the spin of the interacting particles (this is related to the principle of indistinguishability of elementary particles; we'll consider bosons and fermions later). This type of interaction becomes important if the particles are in the same quantum state (the same set of quantum numbers) and their wave functions overlap in space: when the de Broglie wavelength is comparable to the average distance between particles the exchange interaction is strong; when it is much smaller, the exchange interaction is weak and Boltzmann statistics applies (for an N2 molecule at room temperature, the thermal de Broglie wavelength is far smaller than the intermolecular spacing – see the estimate below). Violations of Boltzmann statistics are observed if the density of particles is very large (neutron stars), or the particles are very light (electrons in metals, photons), or they are at very low temperatures (liquid helium). However, in the limit of small density of particles, the distinctions between Boltzmann, Fermi, and Bose–Einstein statistics vanish.
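A rough numerical check of the claim for N2 at room temperature, comparing the thermal de Broglie wavelength λ = h/√(2πmkBT) with the mean intermolecular spacing n^(−1/3) at 1 bar (the constants are standard values; the script is an illustration):

```python
import math

kB = 1.380649e-23       # J/K
h  = 6.62607e-34        # J*s
m_N2 = 28 * 1.6605e-27  # kg, mass of an N2 molecule

T = 300.0               # K
P = 1.0e5               # Pa

lam = h / math.sqrt(2 * math.pi * m_N2 * kB * T)   # thermal de Broglie wavelength
n = P / (kB * T)                                    # number density of an ideal gas
d = n ** (-1 / 3)                                   # mean intermolecular spacing

print(f"lambda ~ {lam:.1e} m")   # ~ 2e-11 m
print(f"d      ~ {d:.1e} m")     # ~ 3e-9  m  ->  lambda << d, Boltzmann statistics OK
```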

36 Problem (partition function)
Consider a system of distinguishable particles with five microstates with energies 0, ε, ε, ε, and 2ε (ε = 1 eV) in equilibrium with a reservoir at temperature T = 0.5 eV (i.e., kBT = 0.5 eV). Find the partition function of the system. Find the average energy of a particle. What is the average energy of 10 such particles? The average energy of a single particle follows from summing εi P(εi) over the five microstates (the same result follows from U = −∂ln Z/∂β); the average energy of N = 10 such particles is 10 times larger (see the numerical sketch below).
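A numerical sketch of the problem above (five microstates at 0, ε, ε, ε, 2ε with ε = 1 eV and kBT = 0.5 eV):

```python
import math

eps = 1.0                      # eV
kT = 0.5                       # eV
energies = [0.0, eps, eps, eps, 2 * eps]   # the five microstates

weights = [math.exp(-E / kT) for E in energies]
Z = sum(weights)                                            # partition function
E_avg = sum(E * w for E, w in zip(energies, weights)) / Z   # average energy per particle

print(f"Z      = {Z:.3f}")          # ~ 1.42
print(f"<E>    = {E_avg:.2f} eV")   # ~ 0.31 eV per particle
print(f"<E_10> = {10 * E_avg:.1f} eV")   # ~ 3.1 eV for 10 particles
```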

37 Problem 1 (partition function, average energy)
Consider a system of N particles with only 3 possible energy levels separated by ε (let the ground state energy be 0). The system occupies a fixed volume V and is in thermal equilibrium with a reservoir at temperature T. Ignore interactions between particles and assume that Boltzmann statistics applies. (a) (2) What is the partition function for a single particle in the system? (b) (5) What is the average energy per particle? (c) (5) What is the probability that the 2ε level is occupied in the high-temperature limit, kBT >> ε? Explain your answer on physical grounds. (d) (5) What is the average energy per particle in the high-temperature limit, kBT >> ε? (e) (3) At what temperature is the ground state 1.1 times as likely to be occupied as the 2ε level? (f) (25) Find the heat capacity of the system, CV, analyze the low-T (kBT << ε) and high-T (kBT >> ε) limits, and sketch CV as a function of T. Explain your answer on physical grounds. Solution (c): in the high-temperature limit all 3 levels are populated with the same probability, so the probability is 1/3; (d): the average energy per particle then approaches ε.

38 Problem 1 (cont.) (e) (f) CV Low T (>>): high T (<<):

39 Problem 1 (partition function, average energy)
The neutral carbon atom has a 9-fold degenerate ground level and a 5-fold degenerate excited level at an energy 0.82 eV above the ground level. Spectroscopic measurements of a certain star show that 10% of the neutral carbon atoms are in the excited level, and that the population of higher levels is negligible. Assuming thermal equilibrium, find the temperature.
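A numerical sketch of the carbon-atom problem above (reading "10% of the atoms are in the excited level" as a fraction of the total, and using the degeneracies and splitting quoted on the slide):

```python
import math

kB_eV = 8.617e-5          # eV/K
dE = 0.82                 # eV, excited level above the ground level
g0, g1 = 9, 5             # degeneracies of the ground and excited levels
f1 = 0.10                 # 10% of the atoms are in the excited level

# N1/N0 = (g1/g0) * exp(-dE/(kB*T)) = f1/(1 - f1)
ratio = f1 / (1 - f1)                               # = 1/9
T = dE / (kB_eV * math.log(g1 / (g0 * ratio)))

print(f"T ~ {T:.0f} K")   # ~ 5900 K
```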

40 Problem (the average values)
A gas is placed in a very tall container at temperature T. The container is in a uniform gravitational field; the acceleration of free fall, g, is given. Find the average potential energy of the molecules. The number of molecules within a layer dh at height h is proportional to exp(−mgh/kBT) dh. For a very tall container (mgH >> kBT), the average potential energy works out to kBT (see the calculation below).
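The average referred to above, carried out explicitly for an infinitely tall column (a standard calculation):

```latex
\[
\langle u \rangle
= \frac{\int_0^\infty mgh \, e^{-mgh/k_BT}\,dh}
       {\int_0^\infty e^{-mgh/k_BT}\,dh}
= \frac{(k_BT)^2/(mg)}{k_BT/(mg)}
= k_B T .
\]
```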

41 Example: energy and heat capacity of a two-level system
Consider a two-level system with energies ε1 = 0 and ε2 = ε. The partition function, the average energy, and the heat capacity at constant volume are given below (check that the same result follows from U = −∂ln Z/∂β). (The accompanying sketches show the level occupations – on a plot of Ei versus −ln ni the slope is proportional to T – the average energy saturating at ε/2 at high temperature, and CV as a function of T.)
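The two-level results referred to above, in standard form (ε1 = 0, ε2 = ε):

```latex
\[
Z = 1 + e^{-\varepsilon/k_BT},
\qquad
\langle E \rangle = \frac{\varepsilon}{e^{\varepsilon/k_BT} + 1},
\qquad
C_V = \frac{\partial \langle E \rangle}{\partial T}
    = k_B \left(\frac{\varepsilon}{k_BT}\right)^{\!2}
      \frac{e^{\varepsilon/k_BT}}{\left(e^{\varepsilon/k_BT}+1\right)^{2}} .
\]
```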

Problem: Use Boltzmann factors to derive the exponential formula for the density of an isothermal atmosphere. The system is a single air molecule, with two states: 1 at sea level (z = 0) and 2 at a height z. The energies of these two states differ only by the potential energy mgz (the temperature T does not vary with z), so n(z) = n(0) exp(−mgz/kBT). Homework: A system of particles is placed in a uniform field at T = 280 K. The particle concentrations measured at two points along the field line 3 cm apart differ by a factor of 2. Find the force F acting on the particles. Answer: F = 0.9·10⁻¹⁹ N. A mixture of two gases with molecular masses m1 and m2 (m2 > m1) is placed in a very tall container. The container is in a uniform gravitational field; the acceleration of free fall, g, is given. Near the bottom of the container, the concentrations of these two types of molecules are n1 and n2 respectively (n2 > n1). Find the height above the bottom where the two concentrations become equal. Answer: h = kBT ln(n2/n1) / [(m2 − m1)g].

43 Problem At very high temperature (as in the very early universe), the proton and the neutron can be thought of as two different states of the same particle, called the “nucleon”. Since the neutron’s mass is higher than the proton’s by m = 2.3·10-30 kg, its energy is higher by mc2. What was the ratio of the number of protons to the number of neutrons at T=1011 K?

44 Problem (Boltzmann distribution)
A solid is placed in an external magnetic field B = 3 T. The solid contains weakly interacting paramagnetic atoms of spin ½, so that the energy of each atom is ±μB, with μ = 9.3·10⁻²³ J/T. (a) Below what temperature must one cool the solid so that more than 75 percent of the atoms are polarized with their spins parallel to the external magnetic field? (b) Absorption of radio-frequency electromagnetic waves can induce transitions between these two energy levels if the frequency f satisfies the condition hf = 2μB. The power absorbed is proportional to the difference in the number of atoms in these two energy states. Assume that the solid is in thermal equilibrium at μB << kBT. How does the absorbed power depend on the temperature? Solution: (a) see the estimate below; (b) the absorbed power is proportional to the difference in the number of atoms in the two energy states, and for μB << kBT this difference is proportional to μB/kBT, so the absorbed power is inversely proportional to the temperature.
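A numerical sketch of part (a): 75% polarization means N↑/N↓ = 3, and for a spin-½ in a field B the Boltzmann ratio is N↑/N↓ = exp(2μB/kBT).

```python
import math

kB = 1.380649e-23      # J/K
mu = 9.3e-23           # J/T
B  = 3.0               # T

# N_up / N_down = exp(2*mu*B / (kB*T)) = 0.75/0.25 = 3
T_max = 2 * mu * B / (kB * math.log(3))
print(f"T must be below ~ {T_max:.0f} K")   # ~ 37 K
```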

45 Problem (Boltzmann distribution)
Consider an isothermal atmosphere at T = 300 K in a uniform gravitational field. Find the ratio of the number of molecules in two layers: one 10 cm thick at the earth's surface, and another 1 km thick at a height of 100 km. The mass of an air molecule is m = 5·10⁻²⁶ kg, and the acceleration of free fall is g = 10 m/s². The result: there is more air in the 10-cm-thick layer at the earth's surface.

46 Next: Maxwell–Boltzmann distribution, Bose–Einstein distribution, blackbody (BB) radiation

