In this talk, we introduce the notions of information entropy (Kullback-Leibler divergence) and a statistical internal-energy function as purely statistical concepts associated with a large, recurrent set of data under a probabilistic model. We show that a statistical thermodynamic structure emerges, à la P. W. Anderson, in the limit of data ad infinitum. Through the Legendre transform and its duality, we discuss a novel insight into the Second Law that is independent of time, as forcefully suggested by Lieb and Yngvason. We suggest a return to the Boltzmann-Ehrenfest ergodic hypothesis, applied not to mathematical models but to empirical big data.
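For orientation, a minimal sketch of the standard definitions alluded to above (not taken from refs. [1,2]; the symbols $\nu$, $p$, $S$, $\Phi$, and $\beta$ are illustrative choices): the Kullback-Leibler divergence of an empirical distribution $\nu$ from a probabilistic model $p$, and the Legendre-Fenchel duality that links an entropy-like function of energy to its conjugate free-energy-like function.
% Illustrative definitions only; notation is not that of refs. [1,2].
\[
  D_{\mathrm{KL}}(\nu \,\|\, p) \;=\; \sum_{x} \nu(x)\,\ln\frac{\nu(x)}{p(x)} \;\ge\; 0 ,
\]
\[
  \Phi(\beta) \;=\; \inf_{E}\bigl[\beta E - S(E)\bigr],
  \qquad
  S(E) \;=\; \inf_{\beta}\bigl[\beta E - \Phi(\beta)\bigr],
\]
where $\Phi(\beta)$ plays the role of a (dimensionless) free-energy-like function conjugate to the entropy-like $S(E)$; the second pair of equations expresses the Legendre-transform duality mentioned in the abstract.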
[1] Lu, Z. and Qian, H. (2022) Phys. Rev. Lett. 128, 150603.
[2] Qian, H. (2022) J. Chem. Theory Comput. 18, 6421.
Off-campus faculty and students attending the talk in person are asked to provide their personal information (name in Chinese, ID card number, and mobile phone number) at registration; this will be used to arrange entry to the School of Physics.
Prof. Yuxin Liu