Conservation of Entropy ???


1 Conservation of Entropy ??? The question arose: is entropy conserved? After all, energy is. But a great deal of experimental experience indicated that: ΔS(system) + ΔS(surroundings) ≥ 0. This is the Second Law of Thermodynamics. Heat never spontaneously flows from a cold body to a hot one. A cyclic process can never remove heat from a hot body and achieve complete conversion into work.

2 Entropy is Not Conserved [Figure: gas at 2 atm, T, V1 expands to 1 atm, T, V2 = 2V1] Two cases of expansion of an ideal gas: 1) Expansion into a vacuum. 2) Reversible isothermal expansion. For (1): w = 0, ΔE = 0, q_irreversible = 0, so ΔS(surroundings) = 0. To calculate ΔS(system) we look to the second process (the initial and final states, and hence ΔS(system), must be the same).

3 Entropy is Not Conserved In the second process we follow an isothermal, reversible path. We know that ΔT = 0 and thus ΔE = 0. Now q_rev = ΔE - w = nRT ln(2), so ΔS(system) = q_rev/T = nR ln(2).
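As a sanity check on the numbers above, here is a small Python sketch (the function name is mine, not the slides') computing ΔS = nR ln(V2/V1) for one mole doubling its volume:

```python
import math

R = 8.314  # gas constant, J/(mol K)

def dS_isothermal(n, V1, V2):
    """Entropy change of n moles of ideal gas in an isothermal expansion V1 -> V2."""
    return n * R * math.log(V2 / V1)

print(dS_isothermal(1.0, 1.0, 2.0))  # nR ln 2 ≈ 5.76 J/K
```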

4 Entropy is Not Conserved For the reversible process we've already calculated q_rev = nRT ln(V2/V1) = nRT ln(2), so ΔS(system) = q_rev/T = nR ln(2). One way to make sure this is reversible is to make sure the outside temperature is only differentially hotter. In this case, ΔS(surroundings) = -q_rev/T, so ΔS(total) = 0. Back to case (1), expansion into vacuum: for the total irreversible process, ΔS(surroundings) = 0 (nothing exchanged), so ΔS(total) = 0 + nR ln(2) > 0. Question: if S is a state variable, how can ΔS(total) be path dependent? (ΔS of the system is indeed path independent; it is the surroundings' entropy change that depends on how the process is carried out.)
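The bookkeeping for the two paths can be sketched the same way (a minimal illustration, assuming one mole and a volume doubling):

```python
import math

R = 8.314  # J/(mol K)

# ΔS(system) is the same for both paths: S is a state function
dS_system = R * math.log(2)  # one mole, volume doubled

# Reversible path: the surroundings give up q_rev at essentially the same T
dS_surr_reversible = -dS_system
# Irreversible path (expansion into vacuum): nothing is exchanged
dS_surr_irreversible = 0.0

print(dS_system + dS_surr_reversible)    # ΔS(total) = 0
print(dS_system + dS_surr_irreversible)  # ΔS(total) = nR ln 2 > 0
```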

5 Disorder and Entropy (Ludwig Boltzmann, c. 1875) It turns out that disorder and entropy are intimately related. We start by considering the spontaneity of this process: why doesn't the gas spontaneously reappear back in the box? After all, w = 0, ΔE = 0, q = 0, ΔT = 0, but ΔS = nR ln(V2/V1) > 0.

6 Disorder and Entropy Let's break the box into N cells and consider the gas to be an ideal gas composed of M molecules. We ask: what is the probability that all M molecules will be in the cell labeled '*'? This obviously depends on both M and N. We assume N > M for this problem. The number of ways of distributing M indistinguishable molecules among N cells is approximately Ω ≈ N^M/M! (for N and M large). Boltzmann noted that an entropy could be defined as S = k ln(Ω) = R ln(Ω)/N_A. There are a number of reasons this is a good definition; one is that it connects to thermodynamics.
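The approximation Ω ≈ N^M/M! can be checked numerically. Assuming, for this check, at most one molecule per cell, the exact count is the binomial coefficient C(N, M), which N^M/M! approximates when N is much larger than M:

```python
from math import comb, factorial

# Exact count: M indistinguishable molecules in N cells, at most one per cell
# (an assumption for this check); the slide's approximation is N**M / M!.
N, M = 10_000, 10
exact = comb(N, M)
approx = N**M / factorial(M)
ratio = approx / exact
print(ratio)  # close to 1 when N >> M
```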

7 Disorder and Entropy So for a given state we have S = k ln(Ω) = R ln(Ω)/N_A = R ln(N^M/M!)/N_A. Let's say we change state by increasing the volume. For the same-sized cells, N increases to N′. Then S′ - S = (R/N_A)(ln(N′^M/M!) - ln(N^M/M!)) = (R/N_A) ln(N′^M/N^M). So ΔS = (R/N_A) ln(N′^M/N^M) = M(R/N_A) ln(N′/N). And since N is proportional to volume: ΔS = M(R/N_A) ln(V2/V1) = nR ln(V2/V1).
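Since k = R/N_A, the statistical and thermodynamic expressions agree numerically; a quick check for one mole and a volume doubling (N′ = 2N):

```python
from math import log

k_B = 1.380649e-23   # Boltzmann constant, J/K
N_A = 6.02214076e23  # Avogadro's number, 1/mol

# One mole (M = N_A molecules); doubling the volume doubles the cell count, N' = 2N
M = N_A
dS_boltzmann = M * k_B * log(2)   # M (R/N_A) ln(N'/N), using k = R/N_A
dS_thermo = (k_B * N_A) * log(2)  # nR ln(V2/V1) with n = 1
print(dS_boltzmann, dS_thermo)    # both ≈ 5.76 J/K
```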

8 Feynman's Description of Entropy The number of ways the inside can be arranged so that, from the outside, the system looks the same. From a probabilistic point of view, it is then natural that entropy should tend to go up. Imagine an ideal gas, whose molecules do not interact with each other: how do they "know where to go"? Answer: They don't.

9 Fluctuations All the previous arguments relate to processes involving large numbers of molecules and averages over long periods of time. When the system is small and the observation time is short, fluctuations around the "maximum-entropy" solution can be found.

10 Entropy of Materials Diamond: S°298 = 2.4 J/(K mol). Graphite: S°298 = 5.7 J/(K mol). How about water in its different phases? S°298 (J/K mol): H2O(s, ice) 44.3; H2O(l) 69.91; H2O(g) 188.72. Why does graphite have more entropy than diamond?

11 Entropy of Mixing What happens when you mix two ideal gases? What happens when you solvate a molecule?
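For ideal gases the first question can be answered with the expansion result from the earlier slides: each gas expands into the combined volume, giving ΔS_mix = -nR Σ xᵢ ln xᵢ. A sketch (the function name is mine):

```python
from math import log

R = 8.314  # J/(mol K)

def dS_mix(n1, n2):
    """Ideal entropy of mixing of two gases: each expands into the total volume."""
    n = n1 + n2
    x1, x2 = n1 / n, n2 / n
    return -n * R * (x1 * log(x1) + x2 * log(x2))

print(dS_mix(1.0, 1.0))  # 2R ln 2 ≈ 11.5 J/K for a 50:50 mixture
```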

12 Entropy Calculations Another example: dropping an ice cube into boiling-hot water. [Figure: ice cube at 0 °C dropped into water at 100 °C] What happens? Is it reversible? Let's take the ice cube as the system and the hot water as the surroundings (i.e., there's enough hot water that it stays at 100 °C).

13 Entropy Calculations [Diagram: the irreversible process takes ice at 0 °C directly to water at 100 °C; the reversible path goes in two steps: (1) ice at 0 °C → water at 0 °C, (2) water at 0 °C → water at 100 °C] Let's calculate for the reversible path...

14 Entropy Calculations Step 1: entropy of the phase transition (melting ice at 0 °C, for water): ΔS(1) = ΔH_fus/T_fus ≈ 6010 J mol–1 / 273 K ≈ 22.0 J K–1 mol–1.

15 Entropy Calculations Step 2: entropy of heating (water from 0 °C to 100 °C): ΔS(2) = Cp ln(T2/T1) ≈ 75.3 J K–1 mol–1 × ln(373/273) ≈ 23.5 J K–1 mol–1.

16 Entropy Calculations Altogether, ΔS = ΔS(1) + ΔS(2) = 22.0 + 23.5 = 45.5 J K–1 mol–1. Is this process spontaneous? It had better be... How can we tell? Remember, this is only for the system (the ice cube).
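The two reversible steps can be checked numerically, assuming the standard values ΔH_fus ≈ 6010 J mol–1 and Cp(liquid) ≈ 75.3 J mol–1 K–1 (values not given on the slides):

```python
from math import log

# Assumed standard values (not from the slides):
dH_fus = 6010.0  # heat of fusion of ice, J/mol
Cp = 75.3        # heat capacity of liquid water, J/(mol K)
T1, T2 = 273.15, 373.15  # K

dS_melt = dH_fus / T1        # step 1: reversible melting at 0 °C
dS_heat = Cp * log(T2 / T1)  # step 2: reversible heating to 100 °C
dS_sys = dS_melt + dS_heat
print(dS_sys)  # ≈ 45.5 J/(K mol)
```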

17 Entropy Calculations Entropy change in the surroundings: how much total heat was exchanged with the surroundings? For melting, -6.0 kJ mol–1. For heating, -7.5 kJ mol–1 (this is just -Cp(T2 - T1) ≈ -75.3 J K–1 mol–1 × 100 K). Total ≈ -13.5 kJ mol–1. ΔS = -13 540 J mol–1 / 373 K ≈ -36.3 J mol–1 K–1. Total entropy change for the universe: ≈ +9.2 J mol–1 K–1 > 0. Later, we will combine S and H into another state variable, G, which directly relates to spontaneity.
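The surroundings' side of the bookkeeping can be sketched the same way, again assuming ΔH_fus ≈ 6010 J mol–1 and Cp(liquid) ≈ 75.3 J mol–1 K–1; the surroundings supply both the melting and the heating heat while staying at 373 K:

```python
from math import log

# Assumed standard values (not from the slides):
dH_fus = 6010.0  # J/mol
Cp = 75.3        # J/(mol K)
T1, T2 = 273.15, 373.15  # K

dS_sys = dH_fus / T1 + Cp * log(T2 / T1)  # system (ice cube), ≈ 45.5 J/(K mol)
q_surr = -(dH_fus + Cp * (T2 - T1))       # heat given up by the surroundings, J/mol
dS_surr = q_surr / T2                     # surroundings stay at ~373 K
dS_univ = dS_sys + dS_surr
print(dS_univ)  # > 0, so the process is spontaneous
```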

18 Entropy and Chemical Reactions Gas-phase reaction: more product molecules than reactant molecules implies entropy goes up. There are no restrictions on the positions of unbound atoms, so there are more possible configurations (Ω). S = k ln(Ω) tells us entropy increases. There is some tendency for a reaction to occur even if ΔH is positive.

19 Entropy and Chemical Reactions ΔS (J/K mol): CO₃²⁻(aq) + H⁺(aq) → HCO₃⁻(aq): +148.1; HC₂O₄⁻(aq) + OH⁻(aq) → C₂O₄²⁻(aq) + H₂O(l): -23.1. It's hard to predict the change in entropy without considering solvent effects. Anything that imposes structure or organization on a molecular arrangement lowers the entropy (fewer available configurations). Solvation involves arranging solvent molecules around a solute, generally driven by ΔH at a cost with respect to ΔS.

20 The Third Law There is an absolute reference point for entropy. The Third Law states: “The entropy of a perfect crystal at 0 K is zero.” More precisely, it states that the entropy of all perfect crystals at absolute 0 is the same, so we might as well call it zero. Molecular (stat mech) considerations show that zero is a good thing to call it.

21 The Third Law A couple of Third-Law points: Just cooling something to 0 K is not enough. Elements, compounds, it's all zero! Example: carbon monoxide freezes with its molecules' head-to-tail orientations (CO vs. OC) locked in at random, so it is not a perfect crystal and retains residual entropy. Unlike E, the reference state for S is a perfect crystal of a pure substance, not necessarily an element.

22 The Third Law The Third Law makes sense from the molecular point of view: a perfect crystal has only one possible arrangement of atoms, and at 0 K there is only one possible distribution of energy. Ω = 1, so ln Ω = 0 and S = k ln Ω = 0.

