# Statistical Mechanics: Statistical Distributions, Maxwell-Boltzmann Statistics, Molecular Energies in an Ideal Gas, Quantum Statistics


Statistical Mechanics: statistical distributions, Maxwell-Boltzmann statistics, molecular energies in an ideal gas, quantum statistics. Since I’ve been unjustly picking on chemists: “All science is either Physics or stamp collecting.” —Ernest Rutherford, physicist and 1908 Nobel Prize winner in Chemistry.

Chapter 9: Statistical Mechanics. (Spinach: “good stuff!” Spinach: “get it outta here!”) As a grade schooler, I went to a Catholic school. They served lots of stewed spinach. Some of us got sick just from the fumes wafting up from the cafeteria two floors down.

The nuns made us “clean our plate” at lunch. It had something to do with the starving children in China. They would inspect our trays as we passed through the “dump the trash” line. What to do on spinach day? Stuff it in your empty milk carton and hope the nuns didn’t inspect it? Sit next to the one kid in class who liked stewed spinach,* and see how much you could pass off to him? *The most valuable kid in school that day.

What does this have to do with statistical mechanics? Physics faculty tend to think of thermodynamics (including statistical mechanics) as the stewed spinach of college physics courses. The wacko faculty member who actually likes teaching that stuff is a treasured friend whenever it comes time to give out teaching assignments. Just thought you might want to know* that before we start this chapter on statistical mechanics (thermodynamics). Fair warnings and all that. There are more cautionary tales, but not for now. I don’t want to scare you all off at once.

Before we get into chapter 9, let’s think about this course a bit. We started with relativity. A logical starting point, because relativity heralded the beginning of modern physics. Relativity forced us to start accepting previously unthinkable ideas. Relativity is more “fun” than most of the rest of the material, so it won’t drive prospective students away from the class. Relativity shows us that photons have momentum—a particle property, and gets us thinking about particle properties of waves.

Waves having particle properties leads us (e.g., de Broglie) to ask if particles have wave properties. After chapter 3, we backtracked to try to explain the properties of hydrogen, the simplest atom. This is backtracking, because we had to move backwards in the time sequence of discoveries, from de Broglie back to Rutherford. It doesn’t break the logical chain because we find we can’t explain hydrogen without invoking wave properties.

The puzzle of hydrogen forces us to completely re-think the fundamental ideas of physics. If physics can’t explain hydrogen, the simplest of all atoms, it is in dire shape. Something drastic must be done. “Something drastic” = “quantum mechanics.” Once quantum mechanics is discovered, we rush off to find applications and confirmations. A logical place to start testing quantum mechanics (Schrödinger’s equation) is with simple model systems (particle in a box), moving up from there.

Once we’ve practiced with model systems, we come full circle and apply quantum mechanics to the system that started this trouble in the first place—hydrogen. The next step up from hydrogen is atoms (chapter 7). The next step up from atoms is a few atoms bonded together (chapter 8). What’s the next step up from a few atoms bonded together? Good! Lots of atoms. Let’s start with them interacting but not bonded together in large masses (chapter 9). Then we’ll be able to tackle lots of atoms bonded together (chapter 10). There’s a logic to this, isn’t there? No wonder most modern physics books follow the same sequence.

9.1 Statistical Distributions. Statistical mechanics deals with the behavior of systems of a large number of particles. We give up trying to keep track of individual particles. If we can’t solve Schrödinger’s equation in closed form for helium (4 particles), what hope do we have of solving it for the gas molecules in this room (10^(really big number) particles)? Statistical mechanics handles many particles by calculating the most probable behavior of the system as a whole, rather than by being concerned with the behavior of individual particles.

In statistical mechanics, we assume that the more ways there are to arrange the particles to give a particular distribution of energies, the more probable is that distribution. (Seems reasonable?) Example: 6 units of energy, 3 particles to give it to. The split (3, 2, 1) can be handed out in 6 ways (3-2-1, 3-1-2, 2-1-3, 2-3-1, 1-2-3, 1-3-2), while the split (4, 1, 1) can be handed out in only 3 ways (1-1-4, 4-1-1, 1-4-1), so the (3, 2, 1) distribution is more likely.
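The counting in this example is easy to verify by brute force. Here is a small illustrative sketch of my own (not from the text) that enumerates the distinct assignments of each energy split to three distinguishable particles:

```python
from itertools import permutations

def microstates(split):
    """Number of distinct ways to assign the energy values in `split`
    to three distinguishable particles (order matters, duplicates don't)."""
    return len(set(permutations(split)))

# 6 units of energy shared by 3 distinguishable particles:
print(microstates((3, 2, 1)))  # 6 distinct orderings
print(microstates((4, 1, 1)))  # 3 distinct orderings (two particles tie)
```

So (3, 2, 1) really is twice as probable as (4, 1, 1), as claimed.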

We begin with an assumption that we believe describes nature. We see if the consequences of the assumption correspond in any way with reality. It is not “bad” to begin with an assumption, as long as we realize what we have done, and discard (or modify) the assumption when it fails to describe things we measure and observe. (repeating) In statistical mechanics, we assume that the more ways there are to arrange the particles to give a particular distribution of energies, the more probable is that distribution. (Seems reasonable.)

A brief note: Beiser mentions W, the number of ways to arrange particles to give a particular distribution of energies. The idea is to calculate W and find when W is maximized; that gives us the most probable state of the system. But W doesn’t appear anywhere else in this chapter. In previous editions it was calculated in an appendix, where Beiser derived all the distribution functions we will use. So you don’t need to worry about W. (http://www.rozies.com/Zzzz/920/alpha.html)

Here, in words, is the equation we will be working with in this chapter: (# of particles with energy E) = (# of states having energy E) × (probability that a particle occupies a state of energy E). If we know the distribution function, the probability that a particle occupies a state of energy E, we can make a number of useful calculations. Mathematically, the equation is written n(ε) = g(ε) f(ε). It is common to use epsilon to represent energy; I will call it "E" when I say it.

In systems such as atoms, only discrete energy levels are occupied, and the distribution g(ε) of energies is not continuous. On the other hand, it may be that the distribution of energies is continuous, or at least can be approximated as being continuous. In that case, we replace g(ε) by g(ε)dε, the number of states with energy between ε and ε+dε. We will find that there are several possible distribution functions f(ε), which depend on whether the particles are distinguishable and on what their spins are. Beiser mentions them (Maxwell-Boltzmann, Bose-Einstein, Fermi-Dirac) in this section. Let’s wait and introduce them one at a time.

9.2 Maxwell-Boltzmann Statistics. Classical particles which are identical but far enough apart to be distinguishable obey Maxwell-Boltzmann statistics. Classical → “slow”; the wave functions don’t overlap. Distinguishable → you would know if two particles changed places (you could put your finger on one and follow it as it moves about). Two particles can be considered distinguishable if their separation is large compared to their de Broglie wavelength. Example: ideal gas molecules. We take another step back in time, from quantum mechanics (1930s) to statistical mechanics (late 1800s).

The Maxwell-Boltzmann distribution function is f(ε) = A e^(−ε/kT). I’ll explain the various symbols in a minute. Boltzmann discovered statistical mechanics and was a pioneer of quantum mechanics. His work contained elements of relativity and quantum mechanics, including discrete atomic energy levels.

“In his statistical interpretation of the second law of thermodynamics he introduced the theory of probability into a fundamental law of physics and thus broke with the classical prejudice, that fundamental laws have to be strictly deterministic.” (Flamm, 1997.) “With Boltzmann's pioneering work the probabilistic interpretation of quantum mechanics had already a precedent.” Boltzmann constantly battled for acceptance of his work. He also struggled with depression and poor health. He committed suicide in 1906. Most of us believe thermodynamics was the cause. See a biography here.

Paul Ehrenfest, who wrote Boltzmann’s eulogy, carried on (among other things) the development of statistical thermodynamics for nearly three decades. Ehrenfest was filled with self-doubt and deeply troubled by the disagreements between his friends (Bohr, Einstein, etc.) which arose during the development of quantum mechanics. Ehrenfest shot himself in 1933.

Back to physics… US physicist Percy Bridgman (the man on the right, winner of the 1946 Nobel Prize) took up the banner of thermodynamics, and studied the physics of matter under high pressure. Bridgman committed suicide in 1961. There’s no need for you to worry; I’ve never lost a student as a result of chapter 9 yet… The facts above are accurate, but rather selectively presented for dramatic effect.

The number of particles having energy ε at temperature T is n(ε) = g(ε) f(ε) = A g(ε) e^(−ε/kT), where f(ε) = A e^(−ε/kT) is the Maxwell-Boltzmann distribution function. A is like a normalization constant; we integrate n(ε) over all energies to get N, the total number of particles. A is fixed to give the “right” answer for the number of particles. For some calculations, we may not care exactly what the value of A is. ε is the particle energy, k is Boltzmann's constant (k = 1.38×10⁻²³ J/K), and T is the temperature in kelvin. Often k is written k_B. When k and T appear together, you can be sure that k is Boltzmann's constant.

We still need g(ε), the number of states having energy ε. We will find that g(ε) depends on the problem under study. Beiser justifies this distribution in chapter 9, but doesn't derive it in the current text. I won't go through all this justification; you can read it for yourself. Before we do an example… monatomic hydrogen is less stable than H₂, so are you more likely to find H₂ or H in nature? H₂, of course! Nevertheless, suppose we could “make” a cubic meter of H atoms. How many atoms would we have?

Example 9.1: A cubic meter of atomic H at 0 ºC and atmospheric pressure contains about 2.7×10²⁷ H atoms. Find how many are in their first excited state, n = 2. Gas atoms at atmospheric pressure and temperature behave like ideal gases. Furthermore, they are far enough apart that Maxwell-Boltzmann statistics can be used. For the hydrogen atoms in the ground state, n(ε₁) = A g(ε₁) e^(−ε₁/kT). For the hydrogen atoms in the first excited state, n(ε₂) = A g(ε₂) e^(−ε₂/kT).

We can divide the equation for n(ε₂) by the equation for n(ε₁) to get n(ε₂)/n(ε₁) = [g(ε₂)/g(ε₁)] e^(−(ε₂−ε₁)/kT). We know ε₁, ε₂, and T; we don’t need to know A, because it divides out. Important: T here is the temperature in K, not in ºC! We get g(ε) for hydrogen by counting up the number of allowed electron states corresponding to each ε. Or we can simply recall that there are 2n² states corresponding to each n, so that g(ε₁) = 2(1)² = 2 and g(ε₂) = 2(2)² = 8. Plugging all of our numbers into the above equation gives n(ε₂)/n(ε₁) = 1.3×10⁻¹⁸⁸. In other words, essentially none of the atoms are in the n = 2 state.
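Example 9.1 reduces to a few lines of arithmetic. Here is a sketch using rounded constants (because of the rounding, the result agrees with the quoted 1.3×10⁻¹⁸⁸ only in order of magnitude):

```python
import math

# Assumed rounded constants: Boltzmann constant in eV/K, and the
# hydrogen level energies from E_n = -13.6 eV / n^2.
k = 8.617e-5          # eV/K
T = 273.0             # 0 deg C in kelvin
e1, e2 = -13.6, -3.4  # eV, for n = 1 and n = 2
g1, g2 = 2 * 1**2, 2 * 2**2  # 2n^2 states per level

# n(e2)/n(e1) = [g(e2)/g(e1)] * exp(-(e2 - e1)/kT); A divides out.
ratio = (g2 / g1) * math.exp(-(e2 - e1) / (k * T))
print(ratio)  # on the order of 10^-188: essentially no atoms in n = 2
```

Note that a 10.2 eV gap against kT ≈ 0.024 eV is what produces the absurdly small exponent.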

Caution: the solution to example 9.1, with g(εₙ) = 2n², works only for the energy levels of atomic H, not for other assigned problems! For example, to do the same calculation for H₂ would require knowledge of the H₂ molecular energy levels. Skip example 9.2; I won’t test you on it. 9.3 Molecular Energies in an Ideal Gas. The example in section 9.2 dealt with electronic energy levels in atomic hydrogen. In this section, we apply Maxwell-Boltzmann statistics to ideal gas molecules in general. We use the Maxwell-Boltzmann distribution to learn about the energies and speeds of molecules in an ideal gas.

We already have f(  ). We assume a continuous distribution of energies (why?), so that We need to calculate g(ε), the number states having an energy ε in the range ε to ε+dε. It turns out to be easier to find the number of momentum states corresponding to a momentum p, and transform back to energy states. g(ε) is called the “density of states.” Why? Every classical particle has a position and momentum given by the usual equations of classical mechanics.

Corresponding to every value of momentum is a value of energy. Momentum is a 3-dimensional vector quantity; every (px, py, pz) corresponds to some energy. Think of the (px, py, pz) values as forming a 3D grid in momentum space. We count how many momentum states there are in a region of that space (the density of momentum states) and then transform to the density of energy states. In particular, we need to find how many momentum states are in a thin spherical shell between p and p+dp.

The Maxwell-Boltzmann distribution is for classical particles, so we write ε = p²/2m. The number of momentum states in a spherical shell from p to p+dp is proportional to 4πp²dp (the volume of the shell). Thus, we can write the number of states having momentum between p and p+dp as g(p) dp = 4πBp² dp, where B is a proportionality constant, which we will worry about later.

Because each p corresponds to a single ε, g(ε) dε = g(p) dp = 4πBp² dp. Now, ε = p²/2m, so that p = √(2mε) and dp = (m/√(2mε)) dε, and therefore g(ε) dε = 4πB(2mε)(m/√(2mε)) dε = C √ε dε. The constant C contains B and all the other proportionality constants lumped together.
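The key claim, that the number of momentum states in a thin shell tracks the shell volume 4πp²dp, can be sanity-checked by brute force. This is an illustrative sketch of my own (not from the text): it counts points of a unit (px, py, pz) grid whose magnitude falls in a shell of width 1, and compares with the shell volume:

```python
import math

def shell_count(p):
    """Count integer grid points (px, py, pz) with p <= |p_vec| < p + 1."""
    R = p + 1
    n = 0
    for x in range(-R, R + 1):
        for y in range(-R, R + 1):
            for z in range(-R, R + 1):
                if p <= math.sqrt(x * x + y * y + z * z) < p + 1:
                    n += 1
    return n

count = shell_count(20)
expected = 4 * math.pi * 20.5 ** 2  # shell volume 4*pi*p^2*dp, dp = 1
print(count, expected)  # the two agree to within a few percent
```

Since ε ∝ p², the √ε in g(ε) is just this p² shell count rewritten in energy variables.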

If the derivation on the previous few slides went by rather fast and seems quite confusing, don’t worry: that’s quite normal. It’s only the final result (which we haven’t gotten to yet) that I want you to be able to use. Here are a couple of links presenting the same (or similar) derivation: hyperphysics, and Britney Spears' Guide to Semiconductor Physics: Density of States.

To find the constant C, we evaluate N = ∫₀^∞ n(ε) dε = C ∫₀^∞ √ε e^(−ε/kT) dε, where N is the total number of particles in the system. Could you do the integral? Could I do the integral? No, not any more. Could I look the integral up in a table? Absolutely! The result is C = 2πN/(πkT)^(3/2).

so that n(ε) dε = [2πN/(πkT)^(3/2)] √ε e^(−ε/kT) dε. This is the number of molecules having energy between ε and ε+dε in a sample containing N molecules at temperature T. Wikipedia says: “The Maxwell-Boltzmann distribution is an important relationship that finds many applications in physics and chemistry.” “It forms the basis of the kinetic theory of gases, which accurately explains many fundamental gas properties, including pressure and diffusion.” “The Maxwell-Boltzmann distribution also finds important applications in electron transport and other phenomena.”
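As a sanity check of my own (not in the text), numerically integrating this n(ε) over all energies should give back N. A sketch in units where kT = 1 and N = 1:

```python
import math

# n(eps) = [2*pi*N/(pi*k*T)**1.5] * sqrt(eps) * exp(-eps/kT),
# in units where kT = 1 and N = 1; integrating it should return N.
kT, N = 1.0, 1.0
pref = 2 * math.pi * N / (math.pi * kT) ** 1.5

h, total = 1e-3, 0.0
for i in range(50000):            # integrate eps from 0 to 50*kT
    e = (i + 0.5) * h             # midpoint rule; tail beyond 50*kT ~ 0
    total += pref * math.sqrt(e) * math.exp(-e / kT) * h

print(total)  # ~1.0, i.e. we recover the N we put in
```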

Webchem.net shows how the Maxwell-Boltzmann distribution is important for its influence on reaction rates and catalytic reactions. Here’s a plot of the distribution: k has units of [energy]/[temperature], so kT has units of energy. Notice how “no” molecules have E = 0, few molecules have high energy (a few kT or greater), and there is no maximum molecular energy.

Here’s how the distribution changes with temperature (each vertical grid line corresponds to 1 kT). Notice how the distribution for higher temperature is skewed towards higher energies (makes sense!) but all three curves have the same total area (also makes sense). Notice how the probability of a particle having energy greater than 3kT (in this example) increases as T increases.

If you aren’t interested enough in the derivation of g(ε) to visit Britney Spears' Guide to Semiconductor Physics: Density of States, you might miss this graphic. (Can you tell which one has the Ph.D.?)

Continuing with the physics, the total energy of the system is E = ∫₀^∞ ε n(ε) dε = [2πN/(πkT)^(3/2)] ∫₀^∞ ε^(3/2) e^(−ε/kT) dε. Evaluation of the integral gives E = (3/2)NkT. This is the total energy for the N molecules, so the average energy per molecule is ε̄ = (3/2)kT, exactly the result you get from elementary kinetic theory of gases.
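This average is easy to check with the standard gamma-function integral ∫₀^∞ ε^(3/2) e^(−ε/kT) dε = Γ(5/2)(kT)^(5/2). An illustrative sketch of my own, in units where kT = 1 and N = 1:

```python
import math

# E = C * Gamma(5/2) * (kT)^(5/2) with C = 2*pi*N/(pi*kT)**1.5;
# dividing by N should give the average energy (3/2)*kT per molecule.
kT, N = 1.0, 1.0
C = 2 * math.pi * N / (math.pi * kT) ** 1.5
E = C * math.gamma(2.5) * kT ** 2.5

print(E / N)  # 1.5, i.e. (3/2) kT per molecule
```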

Things to note about our ideal gas energy: The energy is independent of the molecular mass. Which gas molecules will move faster at a given temperature: lighter or heavier ones? Why? The average energy (3/2)kT at room temperature is about 40 meV, or (1/25) eV. This is not a large amount of energy. kT/2 of energy "goes with" each degree of freedom.

Because ε = mv²/2, we can also calculate the number of molecules having speeds between v and v + dv. The result is n(v) dv = 4πN (m/2πkT)^(3/2) v² e^(−mv²/2kT) dv. Here’s a plot (number having a given speed vs. speed): it “looks like” the n(ε) plot, with nothing at speed = 0 and a long exponential tail. “We” (Beiser) call this n(v); the hyperphysics web page calls it f(v).

The speed of a molecule having the average energy comes from solving (1/2)mv² = (3/2)kT for v. The result is v_rms = √(3kT/m). v_rms is the speed of a molecule having the average energy. It is an rms (root-mean-square) speed because we took the square root of the mean of the squared speed.

The average speed can be calculated from v̄ = (1/N) ∫₀^∞ v n(v) dv. The result is v̄ = √(8kT/πm). Comparing this with v_rms, we find that v_rms ≈ 1.09 v̄. Because the speed distribution curve is skewed towards high speeds, this result makes sense (why?).

You can also set dn(v)/dv = 0 to find the most probable speed. The result is v_p = √(2kT/m). The subscript “p” means “most probable.” Summarizing the different speed results: v_p = √(2kT/m) < v̄ = √(8kT/πm) < v_rms = √(3kT/m).
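The closed-form speeds can be checked against numerical moments of the speed distribution itself. A sketch of my own, in units where m = kT = 1 (so n(v)/N = √(2/π) v² e^(−v²/2)):

```python
import math

def f(v):
    # Maxwell-Boltzmann speed distribution per molecule, m = kT = 1
    return math.sqrt(2.0 / math.pi) * v * v * math.exp(-v * v / 2.0)

h = 1e-3
norm = v_mean = v_sq = 0.0
for i in range(20000):        # integrate v from 0 to 20; tail negligible
    v = (i + 0.5) * h         # midpoint rule
    w = f(v) * h
    norm += w                 # zeroth moment: normalization
    v_mean += v * w           # first moment: average speed
    v_sq += v * v * w         # second moment: mean squared speed

print(norm)                             # ~1: distribution is normalized
print(v_mean, math.sqrt(8 / math.pi))   # numerical vs sqrt(8kT/pi m)
print(math.sqrt(v_sq), math.sqrt(3.0))  # numerical vs sqrt(3kT/m)
print(math.sqrt(v_sq) / v_mean)         # ~1.09, as claimed above
```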

Plot of the velocity distribution again: this plot comes from the hyperphysics web site. The R’s and M’s in its equations are a result of a different scaling than we used (per mole rather than per molecule). See the hyperphysics page for how it works (not testable material).

Example 9.4: Find the rms speed of oxygen molecules at 0 ºC. You need to know that an oxygen molecule is O₂. The atomic mass of O is 16 u (1 u = 1 atomic mass unit = 1.66×10⁻²⁷ kg). Would anybody (who hasn’t done the calculation yet) care to guess the rms speed before we do the calculation? 0 m/s? 10 m/s? 100 m/s? 1,000 m/s? 10,000 m/s?
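Working the numbers of Example 9.4 (a sketch using the rounded constants quoted above):

```python
import math

# rms speed of O2 at 0 deg C: v_rms = sqrt(3kT/m)
k = 1.38e-23        # Boltzmann's constant, J/K
T = 273.0           # 0 deg C in kelvin
m = 32 * 1.66e-27   # mass of O2 (32 u), in kg

v_rms = math.sqrt(3 * k * T / m)
print(v_rms)  # ~460 m/s, so "1,000 m/s" was the best guess above
```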

Holy cow! You’d think you’d feel all these zillions of O₂ molecules constantly crashing into your skin at more than 1000 mph! And why no sonic booms? (No, this is not a question I expect you to answer.)

Now, we've gone through a lot of math without thinking much about the physics of our system of gas molecules. We should step back and consider what we've done. A statistical approach has let us calculate the properties of an ideal gas. We've looked at a few of these properties (energies and speeds). This is "modern" physics, but not quantum physics. Click here and scroll down for a handy molecular speed calculator. Who cares about ideal gases? Anybody interested in the atmosphere, or things moving through it, or machines moving gases around. Chemists. Biologists. Engineers. Physicists. Etc.

9.4 Quantum Statistics. Here we deal with ideal particles whose wave functions overlap. We introduce quantum physics because of this overlap. Remember: n(ε) = g(ε) f(ε). The function f(ε) for quantum statistics depends on whether or not the particles obey the Pauli exclusion principle. “The weird thing about the half-integral spin particles (also known as fermions) is that when you rotate one of them by 360 degrees, its wavefunction changes sign. For integral spin particles (also known as bosons), the wavefunction is unchanged.” –Phil Fraundorf of UMSL, discussing why Balinese candle dancers have understood quantum mechanics for centuries.

Links:
http://hyperphysics.phy-astr.gsu.edu/hbase/math/statcon.html#c1
http://www.chem.uidaho.edu/~honors/boltz.html
http://www.wikipedia.org/wiki/Boltzmann_distribution
http://www.webchem.net/notes/how_far/kinetics/maxwell_boltzmann.htm
http://www.physics.nwu.edu/classes/2002Spring/Phyx103taylor/mbdist.html
http://mats.gmd.de/~skaley/pwc/boltzmann/Boltzmann.html
http://britneyspears.ac/physics/dos/dos.htm
