Thermodynamics and the Gibbs Paradox
Presented by: Chua Hui Ying, Grace Goh Ying Ying, Ng Gek Puey Yvonne
Overview
- The three laws of thermodynamics
- The Gibbs Paradox
- The resolution of the paradox: Gibbs/Jaynes, von Neumann, Lin Shu Kun’s revolutionary idea
- Conclusion
The Three Laws of Thermodynamics
1st Law: energy is always conserved.
2nd Law: the entropy of the universe always increases.
3rd Law: the entropy of a perfect crystalline substance is taken as zero at the absolute temperature of 0 K.
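In symbols (standard formulations, not spelled out on the original slide):
dU = δQ − δW (1st law: energy is conserved)
ΔS_universe ≥ 0 (2nd law: entropy never decreases)
S → 0 as T → 0 K (3rd law, for a perfect crystal)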
Unravel the mystery of the Gibbs Paradox
The mixing of non-identical gases
Before: the gases are unmixed.
After: shows an obvious increase in entropy (disorder).
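For reference, the standard textbook result (not derived on the slides): mixing equal amounts of two different ideal gases at the same temperature and pressure gives
ΔS_mix = −nR (x₁ ln x₁ + x₂ ln x₂) = nR ln 2 for x₁ = x₂ = ½,
where n is the total number of moles and x₁, x₂ are the mole fractions. Notably, the result contains no property of the gases other than the fact that they differ.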
The mixing of identical gases
Before: the gases are unmixed.
After: shows zero increase in entropy, as the action is reversible.
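As background (a standard statistical-mechanics fact, not stated on the slides): ΔS = 0 here is recovered in statistical mechanics only after dividing the microstate count by N! for indistinguishable particles, the so-called Gibbs correction, S = k ln(W/N!), which also makes the entropy properly extensive.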
There is a CONTRADICTION!!!
Compare the two scenarios of mixing and we realise that there is a contradiction: mixing non-identical gases increases entropy, yet mixing identical gases does not.
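Stated quantitatively (our summary, in symbols the slides do not use): ΔS_mix = nR ln 2 for any two distinguishable gases, no matter how alike they are, yet ΔS_mix = 0 the moment they are identical. The entropy of mixing does not fall to zero continuously as the gases are made ever more similar; it jumps. This discontinuity is the Gibbs Paradox.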
To resolve the Contradiction
We look at how three camps resolve it: Gibbs/Jaynes, von Neumann, and Lin Shu Kun.
Gibbs’ opinion
When two non-identical gases mix and entropy increases, we imply that the gases can be separated and returned to their original states. When two identical gases mix, it is impossible to separate them into their original states, as there is no recognisable difference between the gases.
Gibbs’ opinion (2)
Thus, these two cases stand on a different footing and should not be compared with each other. The entropy change from mixing gases of different kinds is independent of the nature of the gases, and hence independent of the degree of similarity between them.
[Graph: entropy S against similarity Z, from Z = 0 to Z = 1. On the Gibbs/Jaynes view, S stays at Smax for all Z < 1 and drops discontinuously to S = 0 at Z = 1.]
Jaynes’ explanation
The entropy of a macrostate is given as
S(X) = k log W(C),
where S(X) is the entropy associated with a chosen set X of macroscopic quantities, and W(C) is the phase volume occupied by all the microstates in a chosen reference class C.
Jaynes’ explanation (2)
This thermodynamic entropy S(X) is not a property of a microstate, but of a reference class C(X) of microstates. For entropy to always increase, we must specify which variables we control and which we allow to change; manipulating variables outside this chosen set may make us see an apparent violation of the second law.
Von Neumann’s Resolution
He makes use of the quantum mechanical approach to the problem and derives an equation in which a parameter Z measures the degree of overlap (non-orthogonality) of the gases’ quantum states, that is, the degree of similarity between the gases.
Von Neumann’s Resolution (2)
Hence when Z = 0 the entropy of mixing is at its highest, and when Z = 1 it is at its lowest. Entropy therefore decreases continuously with increasing similarity.
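The equation itself appeared as an image on the original slide; as background, the standard definition underlying his quantum approach (not necessarily the exact formula shown) is the von Neumann entropy
S = −k Tr(ρ ln ρ)
of a density matrix ρ. For gases whose internal quantum states partially overlap, the resulting entropy of mixing interpolates continuously between the fully distinguishable case (Z = 0) and the identical case (Z = 1).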
[Graph: entropy S against similarity Z. On von Neumann’s view, S decreases continuously from Smax at Z = 0 to S = 0 at Z = 1.]
Resolving the Gibbs Paradox
Lin Shu Kun proposes a revised relation between entropy and similarity. He draws a connection between information theory and entropy, and proposes that entropy increases continuously with the similarity of the gases.
Why “entropy increases with similarity”?
This follows from Lin’s proposition that entropy is the degree of symmetry and information is the degree of non-symmetry. We analyse three concepts: (1) high symmetry = high similarity; (2) entropy = information loss; (3) similarity = information loss.
(1) High symmetry = high similarity
Symmetry is a measure of indistinguishability, and high symmetry contributes to high indistinguishability; similarity can be described as a continuous measure of imperfect symmetry. High symmetry → indistinguishability → high similarity. Hence high symmetry can be described as high similarity.
(2) entropy = information loss
An increase in entropy means an increase in disorder, while a decrease in entropy reflects an increase in order. A more ordered system is more highly organised and thus possesses greater information content.
Greater entropy results in less information being registered: higher entropy, higher information loss. Conversely, a more ordered system has lower entropy and thus less information loss. Hence entropy = information loss.
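One standard way to make “entropy = information loss” precise (background, not from the slides) is Shannon’s measure
H = −Σᵢ pᵢ log pᵢ,
the entropy of a probability distribution, which quantifies the information missing before the outcome is known: the more disordered (uniform) the distribution, the more information is missing.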
(3) similarity = information loss.
For a system of N distinguishable particles: information on the N particles = different information for each particle = N pieces of information. For a system of N indistinguishable particles: information on the N particles = information on one particle = one piece of information. Hence with high similarity (high symmetry) there is greater information loss, as the sketch below illustrates.
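A minimal sketch of this counting in Python, under our own toy assumption (not Lin’s actual formula) that the information lost when N labelled particles become indistinguishable is ln N!, the logarithm of the number of relabellings that collapse into a single state:

```python
from math import lgamma

k_B = 1.380649e-23  # Boltzmann constant, J/K


def information_loss_nats(n: int) -> float:
    """Information (in nats) lost when n distinguishable particles become
    indistinguishable: the n! distinct labellings collapse into one state,
    so the lost information is ln(n!). lgamma(n + 1) equals ln(n!) and
    avoids overflow for large n."""
    return lgamma(n + 1)


def entropy_gain(n: int) -> float:
    """Toy mapping from information loss to entropy, S = k_B * ln(n!),
    illustrating 'more similarity -> more information loss -> more entropy'.
    This mapping is our illustration, not Lin's published relation."""
    return k_B * information_loss_nats(n)


if __name__ == "__main__":
    for n in (2, 10, 100):
        print(f"N = {n:>3}: lost information = {information_loss_nats(n):8.2f} nats, "
              f"entropy gain = {entropy_gain(n):.3e} J/K")
```

The lgamma form keeps the computation numerically stable even for thermodynamically large N.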
Concepts explained: (1) high symmetry = high similarity; (2) entropy = information loss; (3) similarity = information loss.
Having established the links between these concepts: a highly symmetrical system → high similarity → greater information loss → higher entropy.
The mixing of identical gases (revisited)
Before: the gases are unmixed.
After: the gases have mixed.
Lin’s Resolution of the Gibbs Paradox
Compared to the non-identical gases, we have less information about the identical gases. According to Lin’s theory, less information = higher entropy. Therefore the mixing of identical gases should also result in an increase in entropy. No paradox!
Comparing the three graphs
[Three panels of entropy S against similarity Z, each from Z = 0 to Z = 1. Gibbs: S stays at Smax for Z < 1 and drops discontinuously to S = 0 at Z = 1. Von Neumann: S decreases continuously from Smax at Z = 0 to S = 0 at Z = 1. Lin: S increases continuously from S = 0 at Z = 0 to Smax at Z = 1.]
Why are there different ways of resolving the paradox?
They consider entropy differently. Lin (static entropy): the configurations of fixed particles in a system. Gibbs and von Neumann (dynamic entropy): dependent on changes in the dispersal of energy among the microstates of atoms and molecules.
We cannot compare the two ways of resolving the paradox!
Since Lin’s definition of entropy is essentially different from that of Gibbs and von Neumann, it is unjustified to compare the two ways of resolving the paradox.
Conclusion
The Gibbs Paradox poses a problem for the second law due to an inadequate understanding of the system involved. Lin’s novel idea sheds new light on entropy and information theory, but also leaves conflicting grey areas for further exploration.
Acknowledgements
We would like to thank Dr Chin Wee Shong for her support and guidance throughout the semester, Dr Kuldip Singh for his kind support, and all who have helped in one way or another.