Introduction to Thermodynamics and Statistical Physics - part 1

Eyal Buks
Introduction to Thermodynamics and Statistical Physics (114016) - Lecture Notes
Technion
April 13, 2011

Preface

to be written

Contents

1. The Principle of Largest Uncertainty
   1.1 Entropy in Information Theory
      1.1.1 Example - Two States System
      1.1.2 Smallest and Largest Entropy
      1.1.3 The composition property
      1.1.4 Alternative Definition of Entropy
   1.2 Largest Uncertainty Estimator
      1.2.1 Useful Relations
      1.2.2 The Free Entropy
   1.3 The Principle of Largest Uncertainty in Statistical Mechanics
      1.3.1 Microcanonical Distribution
      1.3.2 Canonical Distribution
      1.3.3 Grandcanonical Distribution
      1.3.4 Temperature and Chemical Potential
   1.4 Time Evolution of Entropy of an Isolated System
   1.5 Thermal Equilibrium
      1.5.1 Externally Applied Potential Energy
   1.6 Free Entropy and Free Energies
   1.7 Problems Set 1
   1.8 Solutions Set 1
2. Ideal Gas
   2.1 A Particle in a Box
   2.2 Gibbs Paradox
   2.3 Fermions and Bosons
      2.3.1 Fermi-Dirac Distribution
      2.3.2 Bose-Einstein Distribution
      2.3.3 Classical Limit
   2.4 Ideal Gas in the Classical Limit
      2.4.1 Pressure
      2.4.2 Useful Relations
      2.4.3 Heat Capacity
      2.4.4 Internal Degrees of Freedom
   2.5 Processes in Ideal Gas
      2.5.1 Isothermal Process
      2.5.2 Isobaric Process
      2.5.3 Isochoric Process
      2.5.4 Isentropic Process
   2.6 Carnot Heat Engine
   2.7 Limits Imposed Upon the Efficiency
   2.8 Problems Set 2
   2.9 Solutions Set 2
3. Bosonic and Fermionic Systems
   3.1 Electromagnetic Radiation
      3.1.1 Electromagnetic Cavity
      3.1.2 Partition Function
      3.1.3 Cube Cavity
      3.1.4 Average Energy
      3.1.5 Stefan-Boltzmann Radiation Law
   3.2 Phonons in Solids
      3.2.1 One Dimensional Example
      3.2.2 The 3D Case
   3.3 Fermi Gas
      3.3.1 Orbital Partition Function
      3.3.2 Partition Function of the Gas
      3.3.3 Energy and Number of Particles
      3.3.4 Example: Electrons in Metal
   3.4 Semiconductor Statistics
   3.5 Problems Set 3
   3.6 Solutions Set 3
4. Classical Limit of Statistical Mechanics
   4.1 Classical Hamiltonian
      4.1.1 Hamilton-Jacobi Equations
      4.1.2 Example
      4.1.3 Example
   4.2 Density Function
      4.2.1 Equipartition Theorem
      4.2.2 Example
   4.3 Nyquist Noise
   4.4 Problems Set 4
   4.5 Solutions Set 4
5. Exam Winter 2010 A
   5.1 Problems
   5.2 Solutions
6. Exam Winter 2010 B
   6.1 Problems
   6.2 Solutions
References
Index

1. The Principle of Largest Uncertainty

In this chapter we discuss relations between information theory and statistical mechanics. We show that the canonical and grand canonical distributions can be obtained from Shannon's principle of maximum uncertainty [1, 2, 3]. Moreover, the time evolution of the entropy of an isolated system and the H theorem are discussed.

1.1 Entropy in Information Theory

The possible states of a given system are denoted as $e_m$, where $m = 1, 2, 3, \dots$, and the probability that state $e_m$ is occupied is denoted by $p_m$. The normalization condition reads

$$\sum_m p_m = 1 . \tag{1.1}$$

For a given probability distribution $\{p_m\}$ the entropy is defined as

$$\sigma = -\sum_m p_m \log p_m . \tag{1.2}$$

Below we show that this quantity characterizes the uncertainty in the knowledge of the state of the system.
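As a quick numerical illustration of the definition (1.2), the following minimal Python sketch (the helper name `entropy` is arbitrary; the natural logarithm is used, as in the text) evaluates $\sigma$ for a few normalized distributions.

```python
import numpy as np

def entropy(p):
    """Entropy sigma = -sum_m p_m log p_m, Eq. (1.2); natural log, with 0*log(0) taken as 0."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]              # terms with p_m = 0 contribute nothing (see Eq. (1.4))
    return -np.sum(p * np.log(p))

# normalization check, Eq. (1.1), and a few sample distributions
for dist in ([1.0, 0.0], [0.5, 0.5], [0.25, 0.25, 0.25, 0.25], [0.7, 0.2, 0.1]):
    assert abs(sum(dist) - 1.0) < 1e-12
    print(dist, "->", entropy(dist))
# [1.0, 0.0] gives 0, [0.5, 0.5] gives log 2 ~ 0.693, four equal states give log 4
```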
1.1.1 Example - Two States System

Consider a system which can occupy either state $e_1$ with probability $p$, or state $e_2$ with probability $1 - p$, where $0 \le p \le 1$. The entropy is given by

$$\sigma = -p \log p - (1 - p) \log (1 - p) . \tag{1.3}$$

[Figure: the entropy $-p \log p - (1 - p) \log (1 - p)$ as a function of $p$ in the range $0 \le p \le 1$.]

As expected, the entropy vanishes at $p = 0$ and $p = 1$, since in both cases there is no uncertainty in what is the state which is occupied by the system. The largest uncertainty is obtained at $p = 0.5$, for which $\sigma = \log 2 = 0.69$.

1.1.2 Smallest and Largest Entropy

Smallest value. The term $-p \log p$ in the range $0 \le p \le 1$ is plotted in the figure below. Note that the value of $-p \log p$ in the limit $p \to 0$ can be calculated using L'Hospital's rule:

$$\lim_{p \to 0} (-p \log p) = \lim_{p \to 0} \left( \frac{-\frac{\mathrm{d} \log p}{\mathrm{d} p}}{\frac{\mathrm{d}}{\mathrm{d} p} \frac{1}{p}} \right) = 0 . \tag{1.4}$$

[Figure: the term $-p \log p$ as a function of $p$ in the range $0 \le p \le 1$.]

From this figure, which shows that $-p \log p \ge 0$ in the range $0 \le p \le 1$, it is easy to infer that the smallest possible value of the entropy is zero. Moreover, since $-p \log p = 0$ iff $p = 0$ or $p = 1$, it is evident that $\sigma = 0$ iff the system occupies one of the states with probability one and all the other states with probability zero. In this case there is no uncertainty in what is the state which is occupied by the system.

Largest value. [...] Decomposing the gradient of the entropy and the variation of the probabilities into components parallel and orthogonal to $\nabla g_0$,

$$\nabla \sigma = (\nabla \sigma)_{\parallel} + (\nabla \sigma)_{\perp} , \tag{1.9}$$

$$\delta \bar{p} = (\delta \bar{p})_{\parallel} + (\delta \bar{p})_{\perp} , \tag{1.11}$$

where $(\nabla \sigma)_{\parallel}$ and $(\delta \bar{p})_{\parallel}$ are parallel to $\nabla g_0$, and where $(\nabla \sigma)_{\perp}$ and $(\delta \bar{p})_{\perp}$ are orthogonal to $\nabla g_0$. Using this notation Eq. (1.8) can be expressed as

$$\delta \sigma = (\nabla \sigma)_{\parallel} \cdot (\delta \bar{p})_{\parallel} + (\nabla \sigma)_{\perp} \cdot (\delta \bar{p})_{\perp} . \tag{1.12}$$

Given that the constraint is satisfied, the allowed variations obey $(\delta \bar{p})_{\parallel} = 0$; thus $\delta \sigma$ vanishes for an arbitrary $(\delta \bar{p})_{\perp}$ only when the vectors $\nabla \sigma$ and $\nabla g_0$ are parallel to each other. In other words, only when

$$\nabla \sigma = \xi_0 \nabla g_0 , \tag{1.13}$$

where $\xi_0$ is a constant. This constant is called a Lagrange multiplier. Using Eqs. (1.2) and (1.5) the condition (1.13) is expressed as

$$\log p_m + 1 = \xi_0 . \tag{1.14}$$

Let $M$ be the number of available states. From Eq. (1.14) we find that all probabilities are equal. Thus using Eq. (1.5), one finds that

$$p_1 = p_2 = \dots = \frac{1}{M} .$$
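The result that, with only the normalization constraint (1.1), the entropy is maximized by the equal-probability distribution $p_m = 1/M$ can be checked numerically. The sketch below (again with an arbitrary `entropy` helper) draws random normalized distributions over $M$ states and verifies that none of them exceeds the entropy $\log M$ of the uniform distribution.

```python
import numpy as np

def entropy(p):
    # sigma = -sum_m p_m log p_m, natural log
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

rng = np.random.default_rng(0)
M = 5
uniform = np.full(M, 1.0 / M)
sigma_max = entropy(uniform)            # expected to equal log M
print("uniform:", sigma_max, "log M:", np.log(M))

# every randomly drawn normalized distribution should satisfy sigma <= log M
for _ in range(10000):
    p = rng.random(M)
    p /= p.sum()                        # enforce the constraint sum_m p_m = 1
    assert entropy(p) <= sigma_max + 1e-12
print("no random distribution exceeded the uniform-distribution entropy")
```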
[...]

1.1.3 The composition property

Consider a system that can occupy one of three states $e_1$, $e_2$, $e_3$ with probabilities $p_1$, $p_2$ and $p_3$, respectively. The uncertainty can be evaluated in two steps: (a) the system can either occupy state $e_1$ with probability $p_1$, or not occupy state $e_1$ with probability $1 - p_1$; (b) given that the system does not occupy state $e_1$, it can either occupy state $e_2$ with probability $p_2 / (1 - p_1)$ or occupy state $e_3$ with probability $p_3 / (1 - p_1)$. Assuming that uncertainty (entropy) is additive, the total uncertainty (entropy) is given by

$$\sigma_i = \sigma (p_1, 1 - p_1) + (1 - p_1)\, \sigma \left( \frac{p_2}{1 - p_1}, \frac{p_3}{1 - p_1} \right) . \tag{1.20}$$

The factor $(1 - p_1)$ in the second term is included since the uncertainty associated with states $e_2$ and $e_3$ contributes only when state $e_1$ is not occupied, an event which occurs with probability $1 - p_1$. Using the definition (1.2) and the normalization condition

$$p_1 + p_2 + p_3 = 1 , \tag{1.21}$$

one finds

$$\sigma_i = -p_1 \log p_1 - (1 - p_1) \log (1 - p_1) + (1 - p_1) \left[ - \frac{p_2}{1 - p_1} \log \frac{p_2}{1 - p_1} - \frac{p_3}{1 - p_1} \log \frac{p_3}{1 - p_1} \right]$$
$$= -p_1 \log p_1 - p_2 \log p_2 - p_3 \log p_3 - (1 - p_1 - p_2 - p_3) \log (1 - p_1) = \sigma (p_1, p_2, p_3) .$$

In the general case the states $e_1, e_2, \dots, e_{M_0}$, having probabilities $q_1, q_2, \dots, q_{M_0}$, are grouped as shown in Fig. 1.2, with $p_1 = q_1 + q_2 + \dots + q_{M_1}$,

$$p_2 = q_{M_1+1} + q_{M_1+2} + \dots + q_{M_1+M_2} ,$$

etc., where

$$p_1 + p_2 + \dots = 1 . \tag{1.23}$$

The composition property requires that the following holds [see Fig. 1.2]:

$$\sigma (q_1, q_2, \dots, q_{M_0}) = \sigma (p_1, p_2, \dots) + p_1 \sigma \left( \frac{q_1}{p_1}, \frac{q_2}{p_1}, \dots, \frac{q_{M_1}}{p_1} \right) + p_2 \sigma \left( \frac{q_{M_1+1}}{p_2}, \frac{q_{M_1+2}}{p_2}, \dots, \frac{q_{M_1+M_2}}{p_2} \right) + \dots \tag{1.24}$$

[Fig. 1.2. The composition property - the general case: the states $e_1, \dots, e_{M_0}$ with probabilities $q_1, \dots, q_{M_0}$ are grouped into sets, the first containing $e_1, \dots, e_{M_1}$ with total probability $p_1 = q_1 + q_2 + \dots + q_{M_1}$, the second containing $e_{M_1+1}, \dots, e_{M_1+M_2}$ with total probability $p_2 = q_{M_1+1} + \dots + q_{M_1+M_2}$, etc.]

Using the definition (1.2) the following holds

$$\sigma (p_1, p_2, \dots) = -p_1 \log p_1 - p_2 \log p_2 - \dots , \tag{1.25}$$

$$p_1 \sigma \left( \frac{q_1}{p_1}, \frac{q_2}{p_1}, \dots, \frac{q_{M_1}}{p_1} \right) = p_1 \left( - \frac{q_1}{p_1} \log \frac{q_1}{p_1} - \frac{q_2}{p_1} \log \frac{q_2}{p_1} - \dots - \frac{q_{M_1}}{p_1} \log \frac{q_{M_1}}{p_1} \right) = -q_1 \log q_1 - q_2 \log q_2 - \dots - q_{M_1} \log q_{M_1} + p_1 \log p_1 , \tag{1.26}$$

$$p_2 \sigma \left( \frac{q_{M_1+1}}{p_2}, \frac{q_{M_1+2}}{p_2}, \dots, \frac{q_{M_1+M_2}}{p_2} \right) = -q_{M_1+1} \log q_{M_1+1} - q_{M_1+2} \log q_{M_1+2} - \dots - q_{M_1+M_2} \log q_{M_1+M_2} + p_2 \log p_2 , \tag{1.27}$$

etc., thus it is evident that condition (1.24) is indeed satisfied.

1.1.4 Alternative Definition of Entropy

[...] the composition property given by Eq. (1.24).

Exercise 1.1.1. Show that the above definition leads to the entropy given by Eq. (1.2) up to multiplication by a positive constant.

Solution 1.1.1. The 1st property allows approximating the probabilities $p_1, p_2, \dots, p_N$ using rational numbers, namely $p_1 = M_1 / M_0$, $p_2 = M_2 / M_0$, etc., where $M_1, M_2, \dots$ are integers and $M_0 = M_1 + M_2 + \dots + M_N$. Using the composition property (1.24) one finds

$$\Lambda (M_0) = \sigma (p_1, p_2, \dots, p_N) + p_1 \Lambda (M_1) + p_2 \Lambda (M_2) + \dots , \tag{1.28}$$

where $\Lambda (M) \equiv \sigma (1/M, 1/M, \dots, 1/M)$ is the entropy of $M$ equally likely states. In particular, consider the case where $M_1 = M_2 = \dots = M_N = K$. For this case one finds

$$\Lambda (NK) = \Lambda (N) + \Lambda (K) . \tag{1.29}$$

Taking $K = N = 1$ yields

$$\Lambda (1) = 0 . \tag{1.30}$$

Taking $N = 1 + x$ yields

$$\frac{\Lambda (K + Kx) - \Lambda (K)}{Kx} = \frac{1}{K} \frac{\Lambda (1 + x)}{x} . \tag{1.31}$$

Taking the limit $x \to 0$ yields

$$\frac{\mathrm{d} \Lambda}{\mathrm{d} K} = \frac{C}{K} , \tag{1.32}$$

where

$$C = \lim_{x \to 0} \frac{\Lambda (1 + x)}{x} . \tag{1.33}$$

Integrating Eq. (1.32) and using the initial condition (1.30) yields

$$\Lambda (K) = C \log K . \tag{1.34}$$

Moreover, the second property requires that $C > 0$. Choosing $C = 1$ and using Eq. (1.28) yields

$$\sigma (p_1, p_2, \dots, p_N) = \Lambda (M_0) - p_1 \Lambda (M_1) - p_2 \Lambda (M_2) - \dots$$
$$= -p_1 \log \frac{M_1}{M_0} - p_2 \log \frac{M_2}{M_0} - \dots - p_N \log \frac{M_N}{M_0}$$
$$= -p_1 \log p_1 - p_2 \log p_2 - \dots - p_N \log p_N , \tag{1.35}$$

in agreement with the definition (1.2).
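The composition property (1.24) and the result $\Lambda (K) = \log K$ (for $C = 1$) can likewise be cross-checked numerically. The sketch below (helper names are arbitrary) groups a randomly drawn distribution into two sets, as in Fig. 1.2 with two groups, and compares both sides of Eq. (1.24).

```python
import numpy as np

def entropy(p):
    # sigma = -sum_m p_m log p_m, natural log
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

rng = np.random.default_rng(1)

# composition property, Eq. (1.24): split q_1..q_M0 into two groups of sizes M1 and M2
M1, M2 = 3, 4
q = rng.random(M1 + M2)
q /= q.sum()
p1, p2 = q[:M1].sum(), q[M1:].sum()

lhs = entropy(q)
rhs = entropy([p1, p2]) + p1 * entropy(q[:M1] / p1) + p2 * entropy(q[M1:] / p2)
print(lhs, rhs)                      # the two sides agree up to rounding error
assert np.isclose(lhs, rhs)

# Lambda(K), the entropy of K equally likely states, equals log K (Eq. (1.34) with C = 1)
for K in (1, 2, 5, 100):
    assert np.isclose(entropy(np.full(K, 1.0 / K)), np.log(K))
```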
